Google is in the business of providing customers with the most appropriate answers to their queries. The company’s bottom line depends on delivering the most accurate results that match the searcher’s intent. That’s why we’re witnessing frequent changes to the search algorithm — it’s constantly getting better at providing us with exactly what we’re searching for.

However, Google’s search algorithms aren’t perfect. For one, they’re remarkably bad at detecting plagiarism. Not only that, but they have been known to rank a plagiarised web page higher than the page with the original content. For lack of a better word, search algorithms don’t have the “motivation” to detect and penalize plagiarism; they’re programmed to rank high-quality pages with excellent content above all others.

That isn’t to say that Google doesn’t care about plagiarism. They’ve stated time and again that unique content matters and that it’s the foundation of a successful website. If you provide them with enough proof that someone has been copying your content, and worse yet, outranking you with it, they’ll take action against the offender. They just don’t do so proactively.

Finding yourself on the receiving end of content theft is not only unpleasant; it can have serious consequences for your website. Read on to learn why Google sometimes ranks plagiarised pages higher and what you can do to stop it.

Why Google ranks plagiarised content higher

Since Google is in the business of delivering high-quality results to users, it makes sense for them to rely on search algorithms to decide what qualifies as an appropriate result. The problem is that algorithms neither recognize nor care about plagiarised content. If they find the same content on two different pages, they simply rank the page they judge to be better higher.

As you can see, the quality of the website is the deciding factor in why plagiarised content sometimes ranks higher. Search algorithms simply prioritize the higher-quality website over the lower-quality one that holds the original content.

What is it, though, that makes a website seem worse in the eyes of Google’s bots?

According to Google’s own analyst John Mueller, a website that’s consistently ranking lower than the copycat is probably suffering from poor overall quality. One of the most common causes is lackluster content, but we can safely rule that factor out here: both pages have identical content since we’re discussing plagiarism, so it can’t be the differentiator.

The other equally common reason bots see your website as low quality is a lack of inbound links. While both external and internal linking are important signals, it’s the former that builds trustworthiness and authority. If your web page lacks inbound links from other domains, Google’s bots may have trouble assessing its quality. If few or no other sites link to your page, you don’t have their vote of confidence.

If that is the case, it might be time to look into link building strategies to increase brand awareness and show Google that your website is a reputable one.

It makes a lot of sense for inbound links to be the main culprit here. When Google launched the Panda update in 2011, a large portion of the web was thrown into disarray; the update’s goal was to weed out low-quality websites and content farms by prioritizing inbound links and branded search queries. In practice, this meant that a site with more inbound links and a more recognizable brand could steal content and still rank much higher.

After Panda rolled out, Google’s forums were flooded with complaints of content scrapers and copyright infringement. It took years to patch out all the shortcomings of Panda, but Google stuck by it.

Aside from weak (or missing) inbound links, another cause of persistent site-quality issues may be a misused domain. If the domain you now own was previously involved in spammy tactics and schemes, your site might be suffering from a legacy domain penalty. It’s rare, but it could be why you’re having trouble with site quality. If that’s the case, it’s best to contact Google directly.

How to report stolen content to Google

Before you decide to do anything about stolen content, document everything: take screenshots of your page and of the stolen content, and record all the relevant URLs.
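As part of that documentation, it can help to quantify the overlap yourself before filing anything. Below is a minimal sketch that flags near-verbatim sentence matches between your page’s text and a suspect page’s text using Python’s standard difflib. The function names, the crude sentence splitter, and the 0.9 similarity threshold are illustrative choices of mine, not anything Google uses or requires.

```python
import difflib
import re


def sentences(text):
    """Split plain text into rough sentences for comparison."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]


def copied_sentences(original, suspect, threshold=0.9):
    """Return (original sentence, suspect sentence) pairs that are
    near-verbatim matches, judged by difflib's similarity ratio."""
    suspect_sents = sentences(suspect)
    matches = []
    for sent in sentences(original):
        # get_close_matches returns candidates whose ratio >= threshold
        best = difflib.get_close_matches(sent, suspect_sents, n=1, cutoff=threshold)
        if best:
            matches.append((sent, best[0]))
    return matches


# Example with hypothetical page text: the exact-copy sentence is flagged,
# the unrelated one is not.
hits = copied_sentences(
    "Search engines reward unique content. We publish original guides every week.",
    "We publish original guides every week. Some filler text.",
)
```

A list of matched sentence pairs like this, saved alongside your screenshots and URLs, makes the extent of the copying easy to demonstrate.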

First, you might want to reach out to the owner of the website that’s plagiarizing your content. Give them the opportunity to do things the easy way, and if they’re reasonable, they will honor your request. You would be surprised at how many times reaching out directly saves everyone a heap of trouble.

If they won’t be reasonable, it’s time for more drastic measures. Your original content is protected by the Digital Millennium Copyright Act (DMCA). You can initiate a DMCA takedown of the offending page through Google’s legal help page.

Provide Google with all the information they ask for: state why you’re reporting the content for removal and attach the evidence you collected. You’ll then fill out a form listing the URLs of the pages that plagiarized your content.

It can take anywhere from a couple of days to two weeks to get the copied content removed.

Stay vigilant

Be on the lookout for plagiarized content. If you let it sit online for too long, you might find yourself on the receiving end of a penalty, especially if the page with the stolen content ranks higher than yours.

Stay vigilant, and keep the quality of your website high. Earning high-authority links from other websites seems to be the best way to keep the copycats at bay.