May 2015: The Silent Google Update

Earlier this month, a lot of chatter went around about some sort of Google update, with the usual losses and gains being discussed. Many claimed this to be part of the Google mobile roll-out (these would be the people who wouldn’t know an analytics chart from a horse’s backside, of course), others claimed it to be Panda, while Google themselves claimed there was no algorithm update.

However, within the last week or so Google told Search Engine Land’s Barry Schwartz that there WAS a reason behind the speculation, but that it was a CORE algorithm update, which is quite a biggie to go unnoticed. Google claimed the change to the core algorithm was in “how it (the Google algorithm) processes quality signals”.

So think about that: it was essentially a core change to how the Google algorithm grades the quality of a site and/or its links.

So this is the ‘quality update’, and if you want to know what Google thinks quality is about, then take a look here.

In short, Google says if you are creating content it should be:-

  • Useful
  • Informative
  • More valuable than other sites
  • Credible
  • High Quality (unique)
  • Engaging

It also tells you what it doesn’t like, which is:-

  • Errors such as broken links or wrong information
  • Grammar or spelling mistakes
  • Excessive amounts of ads
  • Spam such as comment or forum spam
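
Broken links, the first item on that list, are easy to audit yourself. Below is a minimal sketch in Python (the function names and the injected status checker are my own illustration, not anything Google provides): it extracts every href from a page and flags the ones that don’t return a healthy status code.

```python
# Hedged sketch: find broken links on a page.
# check_status is injected so the logic can be tested without a network;
# in practice you would wrap urllib.request and return the HTTP status code.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html, check_status):
    """Return the links whose status (via check_status) is not 2xx/3xx."""
    parser = LinkExtractor()
    parser.feed(html)
    return [url for url in parser.links
            if not 200 <= check_status(url) < 400]
```

Run something like this over your key pages periodically and you remove one of the easiest quality complaints Google lists.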

That of course is the light version; with over 250 ranking signals within the algorithm, it is a lot more complex. But as the saying goes, Rome wasn’t built in a day: get the foundations right, as on-page SEO becomes more and more important in Google’s war on link spamming.

Guest Blogging And The New Google Guidelines

Some interesting stuff has been posted in the updated Webmaster Tools guidelines on link schemes. See the passage quoted below (the emphasis is my own):

Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.

So there are some key words in there. ANY LINKS: not some, or maybe, but any links. INTENDED: now this is the issue, because how does an algorithm work out intent? For me this is where the whole system falls down, because let’s be honest, when your site gets scraped, or you are hit with a negative SEO campaign, you are not even aware it has happened, let alone aware of the intent. CONSIDERED: again, how does an algorithm consider anything? It simply isn’t that clever! ANY BEHAVIOUR: so now we are back to the door being opened to negative SEO, big time. Finally we have TO your site or OUTGOING LINKS from your site, which is clearly telling people that if you have crap, spammy ‘guest posts’ on your site, then your site is guilty of spamming, and EQUALLY as guilty of it, so we slap you also.

In short, here are the referenced breaches, and my personal take on them:-

Buying or selling links – this has always been the case; if you buy or sell links that pass PageRank, you are heading for a fall if caught.

Excessive link exchanging – again, this has been the case for a long time, but ‘excessive’ in my mind refers to link-farming-type activities, and NOT to trading links with sites that are relevant to your own, where the links are editorially justifiable. You shouldn’t be afraid to reach out to other sites and work to drive traffic to each other.

Large-scale article marketing OR GUEST POSTING CAMPAIGNS with keyword-rich anchor text links – this is the change, in that ‘large scale’ article marketing and guest blogging are now classed as in breach of the acceptable practice guidelines, and as such I expect to see penalties applied retrospectively to this practice. I have written many times that formulaic guest blogging, where there is practically no benefit to the reader, would eventually be pulled in, and that it needs to be considered grey at best. What would the reader want to see? Write for the reader; that is the difference: editorial integrity.

Using automated programs to create links to your site – anyone who thinks that the spam team don’t have copies of SENukeX etc. is in cloud cuckoo land, because they do, and they gobble up copies of pretty much any software that raises its head above the parapet. So think on when using bog-standard automated software to build links to your site.

The article goes on to cover other items, which I guess are somehow seen as crossovers or pre-existing in the guidelines:-

Text ads that pass PageRank – we all know about that, right?

Optimised anchor text links in press releases distributed on other sites – this is fairly new, and they are now saying that you can get penalised for using a press release service. For ME this is completely unacceptable, because even if you nofollow your press releases, when they get picked up by spammers, scrapers etc., they often get posted without the nofollow. Google have dropped the ball here, and IMO need to balance this out with a far, FAR better system of link reporting and disavowing, because the current system isn’t acceptable, and potentially under EU law this could be seen as an illegal practice.

Low quality directory or Bookmark links – Now oddly enough no mention of PageRank passing here, so are they saying that ANY link is bad?

Links embedded in widgets that are distributed across various sites – we all knew about that again.

Widely distributed links in the footers of various sites – what about those poor web designers? The answer now is to NOFOLLOW your footer links, but ALSO consider the statement above that the HOST site can get penalised. So if you are using a WordPress theme with an anchor text footer link, then you need to check out the volume of links the target site has from that theme, because if it has been used wholesale and the target site gets hit, then YOU may get hit also.
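
If you want to audit a theme for followed links, a short script can list every anchor that lacks rel="nofollow". This is a minimal sketch using Python’s built-in HTML parser; the class and attribute names are my own, not part of any official tool.

```python
# Hedged sketch: list links that do NOT carry rel="nofollow",
# e.g. to audit a theme's footer credits before using it.
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Collect (href, anchor_text) for every link missing rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []    # results: links that pass PageRank
        self._current = None  # href of the followed <a> we are inside, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            d = dict(attrs)
            rel = (d.get("rel") or "").lower()
            if "nofollow" not in rel:
                self._current = d.get("href", "")
                self._text = []

    def handle_data(self, data):
        if self._current is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.followed.append((self._current, "".join(self._text).strip()))
            self._current = None
```

Feed it the footer markup of a theme you are considering, and anything it returns is a link you should either remove or nofollow.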

Forum comments with optimised links – again we all knew about comment spamming.

Finally, the jewel in the crown of statements:

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Keep reading the above, and remember, if it waddles and quacks, it is probably a duck :)

Penguin 2.0 Rolled out by Google

According to Matt Cutts, the long-awaited and feared Penguin 2.0 update has been released by Google. How have you fared?

Penguin 2.0 is a full-on update rather than a tweak, and Google have stated that 2.3% of English-language queries will be affected, which is a huge number by anyone’s standards.

In recent times Cutts has stated that they are going upstream in this update, that they are targeting spam more heavily, and that this update will rock a few boats; the video below explains this.

Time will tell.

Google and Doorway Pages: A Page for Each Area

Call them what you like, Doorway pages, Gateway Pages, even Zebra Pages, Google has never liked them. In the past Google has treated these as pages that are built, but have sneaky redirects in them, then they moved the goalposts of definition slightly, and now we have a clear statement from Google not to use them, and more importantly, exactly what Google themselves believe that definition to be.

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase.

So that is clear and simple: don’t put up LARGE SETS of POOR QUALITY pages on your site, each optimised for a single term. Does that mean you can’t have a page for your important terms? Absolutely not; the key there is large sets, poor quality. Google wants, desires and yearns for large amounts of HIGH QUALITY pages on a topic.

In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.

Here they give us another indicator of an element within their algorithm, set out to detect these sites. This would be typical of lead generation sites, or sites that have a single action page with many of these doorway pages built in. A clear example would be a site that sells a single product, directly or via affiliate CPA, with many pages surrounding a single action page.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users.

This is actually debatable, as searching for something, landing on a page about that term, and finding what you want could be argued to deliver a GOOD user experience. Unfortunately, while the point may be debatable, Google are not up for debate.

Therefore, Google frowns on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the one they selected, and that provide content solely for the benefit of search engines.

Interestingly, Google have switched back here to doorway pages with redirects to a main site; they appear to be talking about satellite sites, or micro-sites optimised for a term and then pointing to a main site. Is this a legitimate method of advertising, however? IF the links are nofollowed, then it could be argued yes: the site is an advert rather than a doorway site. Google appear here to be attempting to prevent businesses from advertising their wares legitimately.

I worked with a well-known company a few years ago, and they had a website, ‘quote me happy’, which supported their offline advertising campaigns. It would be wrong of any search engine to prevent a company from doing this. My advice would be that if you are using such sites, then nofollow the links to the main site, and ensure the capture site has relevant content.

Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from Google’s index.

This is a scary prospect, as they appear here to be saying that they will take action against a doorway site AND other sites making use of these deceptive practices. So what is to stop a competitor setting them up and reporting you?

Finally in the piece, they give some examples:

  • Having multiple domain names targeted at specific regions or cities that funnel users to one page
  • Templated pages made solely for affiliate linking
  • Multiple pages on your site with similar content designed to rank for specific queries like city or state names
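
A crude way to spot that last pattern on your own site is to compare page texts pairwise and flag near-identical copies. This sketch uses Python’s difflib; the page names, data and the 0.9 threshold are my own illustration, not figures from Google.

```python
# Hedged sketch: flag pairs of near-duplicate "templated" pages,
# e.g. city pages that differ only by the place name.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.9):
    """Return pairs of page names whose body text is almost identical.

    pages: dict mapping page name -> body text.
    threshold: similarity ratio (0..1) above which a pair is flagged.
    """
    flagged = []
    for (name_a, text_a), (name_b, text_b) in combinations(pages.items(), 2):
        if SequenceMatcher(None, text_a, text_b).ratio() >= threshold:
            flagged.append((name_a, name_b))
    return flagged
```

Anything this flags in bulk is exactly the “similar content designed to rank for specific queries” Google describes, and is worth rewriting or consolidating.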

It is fair to say that anyone who is using a strategy of hundreds of THIN pages, geographically based or topic based, should review that strategy; and if you have suffered a loss of traffic, or a change to the terms you receive traffic for, then this could be the answer.

Linking From Your Homepage To Your Homepage, is it Bad?

An interesting post on Search Engine Roundtable references the possibility that Google is using self-page linking as a spam signal where keyword-rich anchor text is used. In case you are not sure what this means, let me explain.

Let’s say you have a site about blue widgets, and in your homepage content you have a link (or more than one) pointing to the same page (the homepage) with your main keyword set, e.g. ‘blue widgets’.

So why use it as a signal, and what is this being based on? Well, put simply: why would you link to the same page you are already on, if not to gain the benefit that anchor text links bring? Keep in mind that the anchor page also gets given more weight for its anchor text, as well as the target page.
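
Checking your own pages for this is straightforward: resolve each href against the page’s own URL and collect the anchor text of any link that points back to itself. A minimal sketch, assuming Python’s standard library (the class name is my own):

```python
# Hedged sketch: collect anchor text of links that resolve back
# to the page they appear on (the "self-linking" pattern above).
from html.parser import HTMLParser
from urllib.parse import urljoin

class SelfLinkFinder(HTMLParser):
    """Collect the anchor text of links pointing at the page itself."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.self_links = []
        self._in_self_link = False
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Resolve relative hrefs, ignore a trailing slash difference.
            target = urljoin(self.page_url, href).rstrip("/")
            if target == self.page_url.rstrip("/"):
                self._in_self_link = True
                self._text = []

    def handle_data(self, data):
        if self._in_self_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_self_link:
            self.self_links.append("".join(self._text).strip())
            self._in_self_link = False
```

If the result is a list of your money keywords, you have exactly the pattern the thread describes.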

This has been brought up as a result of a thread on the WebmasterWorld forum, where a member states that out of their 150 sites, 3 got hit, and the only common element was this homepage self-linking. That seems weak if you ask me, but a simple signal can be an effective one: it could be enough to drop the trust on the page, which in turn could impact other elements.