Google on Reported Correlation of Bad Links and 50% Traffic Loss

A publisher told Google’s John Mueller that Google Search Console (GSC) reported over five hundred referring pages from two domains. Mueller commented on the correlation between those bad links and a drop in traffic.

The publisher related that GSC showed referrals from two domains, totaling up to four links to each page of their site. When visiting those pages, the publisher saw that they were empty; there was no content on those pages.

They stated that the appearance of those referring links correlated with a 50% drop in traffic.

The publisher asked:

“…is this a scenario where the disavow tool makes sense or does Google detect them as unnatural and will ignore them as a ranking signal?”

Google’s John Mueller commented on the mystery of the “empty” pages and what those might be:

“It’s really hard to say what you’re seeing here. It’s certainly possible there are pages out there that show an empty page to users and then they show a full page to Googlebot.”

That is a reference to a site showing one version of a page to Google and another version to everyone else. This practice is called cloaking.
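To make the concept concrete, here is a minimal sketch of how user-agent cloaking typically works. This is generic illustrative code, not anything from the pages Mueller was asked about: the server branches on the User-Agent header and serves different HTML to Googlebot than to everyone else.

```python
# Illustrative sketch only -- not code from any real site. A cloaking
# server inspects the User-Agent header and serves full content to
# Googlebot while serving an empty page to regular visitors.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        user_agent = self.headers.get("User-Agent", "")
        if "Googlebot" in user_agent:
            # Full page, shown only to the crawler
            body = b"<html><body><p>Keyword-rich article content...</p></body></html>"
        else:
            # Empty page, shown to human visitors
            body = b"<html><body></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), CloakingHandler).serve_forever()
```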

Mueller is raising the possibility that the pages might be cloaked. This explains what the publisher might be seeing; it does not address the second issue, the drop in rankings.

Mueller goes on to dismiss the referring pages as technical mistakes rather than a malicious attempt to sabotage the publisher’s rankings.

He said:

“From that point of view, I would just ignore those pages.”

He then suggested inspecting the pages with Google’s Mobile-Friendly Test to see what they look like when Googlebot renders them. That’s a test for cloaking: a way to see whether a site is showing one page to Googlebot and another page to non-Googlebot visitors.
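Anyone can run a rough version of that comparison themselves by requesting the same URL with a normal browser user agent and again with a Googlebot user agent, then diffing the responses. The sketch below uses a placeholder URL; note that cloakers who verify Googlebot by IP address will not be fooled by a spoofed user-agent string, which is why Google’s own test is the more reliable check.

```python
# Rough do-it-yourself cloaking check (a sketch, not Google's tool):
# fetch the same URL with two different User-Agent strings and compare.
import urllib.request

URL = "https://example.com/suspect-page"  # placeholder URL

def fetch(url: str, user_agent: str) -> bytes:
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()

browser_html = fetch(URL, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
googlebot_html = fetch(
    URL, "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

if browser_html != googlebot_html:
    print("Responses differ -- possible user-agent cloaking.")
else:
    print("Responses match for both user agents.")
```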

Mueller then commented on the correlation of bad links and the 50% drop in traffic:

“I don’t think this is something that you need to disavow.

It probably looks weird in the links report but I really wouldn’t worry about this.

With regards to the drop in traffic that you’re seeing, from my point of view that would probably be unrelated to these links.

There’s no real situation… where I could imagine that essentially empty pages would be causing an issue with regards to links.

So I would just ignore that.

If you decide to put them in the disavow file anyway… just keep in mind that this would not affect how we show the data in search console. So the links report would continue to show those.

I don’t think there’s any reason to use a disavow file in this particular case. So I would just leave them be.”
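For context, a disavow file is just a plain text file uploaded through Search Console’s disavow tool: one URL or one domain: rule per line, with lines starting with # treated as comments. A minimal hypothetical example, with placeholder domains:

```
# Hypothetical disavow file -- the domains are placeholders.
# Disavow a single spammy URL:
https://spam-site.example/empty-page.html
# Disavow every link from an entire domain:
domain:spam-site.example
```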

What Did the Publisher See?

What the publisher saw is an old and common phenomenon called referral spam. Referral spam originated in the early 2000s, when certain free analytics programs published lists of referrers in the form of links.

That created the opportunity to spam a site with fake referrals so that the spammer’s URL would appear as a link on the public analytics page.

This analytics page was not linked from any other page of a site. It simply existed at an automatically generated URL.
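To illustrate the mechanism, referrer spam is typically manufactured by sending ordinary HTTP requests with a forged Referer header, so the spammer’s URL lands in the target site’s referrer logs even though no real link or visitor exists. The sketch below uses placeholder domains:

```python
# Sketch of how referrer spam is manufactured (placeholder domains).
# The request carries a forged Referer header, so the spam URL shows
# up in the target site's analytics without any real link existing.
import urllib.request

TARGET_PAGE = "https://example.com/"          # site being spammed
FAKE_REFERRER = "https://spam-site.example/"  # URL the spammer wants logged

request = urllib.request.Request(
    TARGET_PAGE,
    headers={
        # "Referer" is the HTTP header's historical (misspelled) name
        "Referer": FAKE_REFERRER,
        "User-Agent": "Mozilla/5.0",
    },
)
with urllib.request.urlopen(request, timeout=10) as response:
    response.read()  # the visit itself is the spam; the body is discarded
```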

Most sites no longer have those analytics pages. But the practice continues, perhaps in the hope that if enough publishers click the links, Google will see it as a form of popularity that helps the spammers’ rankings.

What the publisher in this hangout was probably looking at was a manufactured referral.

Does Referrer Spam Hurt Rankings?

In twenty years of creating websites, I have never published a site that did not attract referrer spam. Referrer spam and the ghost links to my sites have had no effect on my rankings.

Referrer spam is very common. It’s really a non-issue.

The explanation for why the publisher’s site lost fifty percent of its traffic lies elsewhere.

This news was originally posted on searchenginejournal.com