TRAFFIC
Avoiding Google’s Filters
How to avoid link anchor text filters and other flightless birds, by forensic SEO specialist Paul Reilly.
Just the other day, my eldest
daughter, India, was telling me she has to
make a decision as to which subject she will
choose for her GCSEs. She was considering
history, something which I encouraged;
while I was never a fan of history at school,
I’ve come to realise in later life that history
is such an important tool. In any evolving
ecosystem, understanding the past enables
us to accurately predict the future.
If there is one crucially important skill in SEO, it is this: you must instinctively and accurately anticipate Google’s next move.
So, as well as a brief delve into the recent algorithm archives, this article will give you insight into how I approach link strategy, along with fresh (at the time of printing) information from the MediaSkunkWorks data set to help your link building efforts.
How to avoid triggering an
excessive anchor text filter
To help you understand the method used, I’ll cover the first step. To make sense of large quantities of data, a simplified link profile is required. This means grouping all page-specific anchor text data into the following categories:
● Brand – e.g. www.examplecasinobrand.com, example casino brand
● Exact Match – the exact keyword/term, e.g. ‘online casino’
● Phrase Match – the keyword as part of a larger phrase, e.g. ‘the best online casino’
● Other – none of the above (often a call-to-action or non-transactional query, e.g. ‘click here’ or ‘visit site’)
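To make that first step concrete, here is a minimal sketch of the grouping. The function name, brand terms and sample anchors are my own illustrative assumptions, not part of any MediaSkunkWorks tooling; it simply buckets each anchor into one of the four groups and reports the proportions.

```python
from collections import Counter

def classify_anchor(anchor, brand_terms, keyword):
    """Bucket one anchor text into the simplified link-profile groups."""
    a = anchor.strip().lower()
    if any(term in a for term in brand_terms):
        return "Brand"
    if a == keyword:
        return "Exact Match"
    if keyword in a:
        return "Phrase Match"
    return "Other"

# Illustrative data only; a real profile comes from your backlink export.
anchors = [
    "online casino", "www.examplecasinobrand.com", "the best online casino",
    "click here", "online casino", "visit site",
]
brand_terms = ["examplecasinobrand", "example casino brand"]

counts = Counter(classify_anchor(a, brand_terms, "online casino") for a in anchors)
total = sum(counts.values())
for group in ("Brand", "Exact Match", "Phrase Match", "Other"):
    print(f"{group}: {counts[group] / total:.0%}")
```

Once every anchor is bucketed like this, an unusually high Exact Match proportion on a single page is exactly the kind of abnormal repetition the filter discussed below appeared to target.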
Before we get into more of the analytical
method, here’s a little piece of history.
The first time I came across any solid
evidence of an unnatural anchor text
filter was around April/May 2010.
This filter appeared (based on my own
observations, which I must also caveat
citing Heisenberg’s Uncertainty Principle)
to punish specific pages, for specific
keywords, where anchor text repetition
appeared in abnormal proportions.
This coincided with the release of an optimised crawling infrastructure, code-named Caffeine, and, just to make the algorithm changes even trickier to isolate, Google also introduced something called the ‘May Day’ algorithm on May 1.
Google messing with our minds
We know this May Day algorithm wasn’t related to the filter I had been observing, because of the timing. Rather than the impact appearing on or around May 1, there had clearly been two whole months of increased SERP volatility, starting in early April 2010 and finishing late in May 2010.
Within a large SERP dataset, I had
observed numerous single keyword,
page-specific ranking drops; these drops
varied in severity and appeared to take
into account the commercial impact of
the positional change in order to limit the
volatility of the results.
If the observations were true, this was a significant change in policy for Google’s spam team. Previously, there had been fixed penalties such as 30-position, 60-position and brand-based penalties.
Before this, Google had a big problem when
it came to punishing spammers. In highly
monetisable SERPs such as ‘Online…
Casino/Poker/Bingo’ as well as the usual
tricky suspects, ‘Cheap Flights’, ‘Holidays’,
‘Car Insurance’ etc, everyone was cheating
the system.
Some agencies would even dominate entire SERPs, yet Google still wasn’t in any position to punish all the spammers.
Hidden in the crowd of spammers
Given that every major brand continued
to buy links, punishing them all would be
harmful to Google users. Not seeing the
big brands on page one would lead to
more users moving to alternative
engines, like Bing.
It was the presence of those big brands which indicated to users that Google was returning the best results. We know this to be true based on another history lesson. Only a year earlier, Google had implemented the ‘Vince’ update, an artificial boost applied to big brands.
According to Google, this ‘update’ only impacted a small number of queries, and it was never referred to as an algorithm change.
It turned out to be the introduction of a ‘white list’ to boost some big brands who, by the very fact they were playing by Google’s rules, had been displaced by the aggressive brands and an increasing number of thin affiliates.
Google had previously denied the use
of such lists, but it was only when Eric
Schmidt admitted under oath to the DoJ
that white lists were in fact used, that it was
widely accepted by SEOs. Previously, it had
been a point of contention.
Now, everything had changed… Google had finally solved the problem by punishing sites at a granular level (page-specific, keyword-specific), and the severity of the punishment appeared to vary based on commercial impact. More recently, Google has referred to this type of penalty as a ‘granular penalty’.
The first iteration of the
granular penalty
The filter appeared to trigger a granular
penalty in cases where abnormal exact
match anchor text had been used. I
also made a second observation. It now appeared that keywords or phrases, either in isolation or in broad-match groups – e.g. ‘casino online’, ‘online casino’ – could also trigger the filter.
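As an aside, one simple way to approximate that broad-match grouping is to ignore word order entirely. This is my own illustration of the idea, not Google’s actual matching logic:

```python
def broad_match_key(anchor):
    # Word-order variants collapse to the same key, so
    # 'casino online' and 'online casino' form one group.
    return " ".join(sorted(anchor.lower().split()))

print(broad_match_key("casino online") == broad_match_key("online casino"))  # True
```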