Archive for January, 2012
Announced a few hours ago on the Google Inside Search blog (link to PDF) was this little beauty: page layout analysis is now part of the quality score within the ranking algorithm. What does this mean in reality?
Many years ago, back in 2004, Microsoft created an algorithm that scored links based on their location, type and size within a page. This was called Block-Level Link Analysis, and as a result all links were no longer equal (which is a good thing). It has been believed that Google has used this for some time, but they have now taken it a step further.
Google claim to be interested in user experience beyond all else (other than making money, of course), and this change appears to be a quality control change, but this old bald guy can't help thinking there is huge potential for babies to go out with the bathwater.
So let's grab some snippets of that blog post by Matt Cutts:
As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change.
So they are acting on complaints (from their testers) that they want to see content above the fold.
Define Above the Fold?
Two HUGE things there: how do you define 'content', and how do you define 'above the fold'? For those that don't know, 'above the fold' is a term from the newspaper days of broadsheets, which were traditionally folded in half. On the news stands, the papers would be placed with the masthead and headline showing (the top half of the front page); anything in the lower printed half is 'below the fold'. On the web, it means the part of the page that requires scrolling to read. The difficulty is this: in a world where monitor sizes vary dramatically, and screen resolutions likewise, how on earth do you define 'above the fold'? Sorry Google, but that is pretty much an impossible task.
On a photography site the content may be imagery; on another, the content may be delivered by video. So HOW THEN are Google going to define the term 'content'? Flash, Ajax... there are so many technologies that can render a page apparently devoid of content, yet still deliver an enriching, entertaining, satisfying user experience. But is this 'content' under the definition which Google apply?
and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
That right there tells me they are going after excessive advertising, which was a large factor in the Panda update(s). It is also clearly saying that, going forward, such sites will not rank as highly.
This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.
Now read this and think, "or that make it hard to find the actual ORIGINAL content". So again this appears to be targeting content scrapers (like Google :p ) sorry, I couldn't resist! Back on topic, it looks like they are trying to encourage original content with a balance between ads and content. I was involved in traditional publishing, and a 60/40 mix (with content being 60%) is about as far as you want to go without ruining the user experience and causing ad blindness, which results in poor advertising ROI. Which is fine, but a single video could carry 1000 words' worth of material, all original, all relevant, all content. How will Google handle this?
OK Google, Define 'Above The Fold' Please
If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly. You can use our Browser Size tool, among many others, to see how your website would look under different screen resolutions.
In short, they fail to tell you what THEY class as above the fold. Admittedly, they recommend you check this using browser simulators and screen resolution simulators, BUT they give absolutely no reference point with which to gauge where your content sits.
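Since Google give no reference point, about the best you can do is reason about it yourself. Here is a crude, entirely hypothetical sketch of that reasoning: given the vertical position and height of each page element (the sort of data you could pull from a browser's layout engine or a screen-size simulator), work out what fraction of the area above a candidate fold line is content versus ads, for several common viewport heights. The page layout below is made up purely for illustration; this is not how Google's algorithm works, just a way to sanity-check your own layout.

```python
# Hypothetical model: estimate the content-vs-ads split above a fold line.
# Element data is illustrative only, not taken from any real site.

def above_fold_ratio(elements, fold_px):
    """Return (content_px, ad_px) of vertical space visible above fold_px."""
    content = ads = 0
    for kind, top, height in elements:
        # How much of this element sits above the fold line?
        visible = max(0, min(top + height, fold_px) - top)
        if kind == "content":
            content += visible
        else:
            ads += visible
    return content, ads

# (kind, top_y, height) in pixels -- a made-up page layout
page = [
    ("ad",      0,   200),  # leaderboard banner
    ("content", 200, 400),  # opening of the article
    ("ad",      600, 250),  # mid-page ad block
    ("content", 850, 900),  # rest of the article
]

# There is no single fold, so check several common viewport heights
for fold in (600, 768, 1080):
    c, a = above_fold_ratio(page, fold)
    total = c + a
    print(f"fold at {fold}px: {c/total:.0%} content, {a/total:.0%} ads")
```

At a 600px fold this made-up page is 67% content; at 768px it drops to 52%, because the mid-page ad block comes into view before the rest of the article does. Which neatly illustrates the problem: the very same page can look fine on one screen and ad-heavy on another.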
I appreciate what Google are trying to achieve here, but I can't help thinking there will be a lot of good, honest websites disappearing from page 1, losing pretty much all their traffic and revenue, PURELY because Google has again introduced an element that is vague. It doesn't really stop the spammers, who right now are placing layers of content above the fold to combat this and so protect themselves from the algorithm change. Meanwhile the good old boy watches his really popular image-based website go down the pan (along with his earnings), killing not only his website, but his dreams of earning a living from the web, and his faith in Google's 'don't be evil' mantra.
Just noticed this when I did a search for analytics. Looks like Google are rolling this out over the next couple of days.
I can't help thinking that Google are playing one heck of a high-risk game with all these SERP changes. It really is getting more and more congested, and it is like, "I wonder what kind of SERP I am gonna get TODAY?" Elements seem to come and go: sitelinks one day, blocks of links the next, AdWords intertwined with Places listings. I think I have seen maybe three different formats today alone.
It is now getting a little off-putting, as finding ANYTHING on a Google SERP is becoming a chore.
Just a rocket-quick post to say that as of now, Google are #1 again in the SERPs.
YES! They are of course #1 for the term 'browser'... via Google AdWords :p
Buying or selling links that pass PageRank is in violation of Google’s Webmaster Guidelines and can negatively impact a site’s ranking in search results.
Too bad Google didn't follow their own advice, as Matt Cutts has just acted swiftly, in a complete public relations stunt... sorry, I meant in a wonderful act of fairness... to issue a manual 60-day penalty against Google Chrome for the term 'browser', zeroing its PageRank (which, interestingly enough, is an odd statement to make, seeing as Google have repeatedly claimed not to allow Google pages to pass PageRank; an investigation for another day).
I should really cite a couple of pages here: first Aaron Wall (way to go, Aaron), who originally posted about this Google buying paid links story; also Danny Sullivan, who then broke it further on Search Engine Land; and Matt Cutts (who has only paid for one link in this post, so doesn't get a BOGOF offer this time around) (joke).
Some considerations on the statement made by Matt:
We double-checked, and the video players weren’t flowing PageRank to Google either.
Indicating that some players CAN and DO pass PageRank.
Even though the intent of the campaign was to get people to watch videos–not link to Google–and even though we only found a single sponsored post that actually linked to Google’s Chrome page and passed PageRank, that’s still a violation of our quality guidelines
Crazy that even though they knew the intent, and knew it was an error, a penalty was issued. I read this as saying that, under normal circumstances of a manual review, a single backlink among so many would not have resulted in such a penalty.
Here is something a little different. On Matt's page he states:
In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days.
But in the correspondence sent to Danny at Search Engine Land he states:
We’ve investigated and are taking manual action to demote www.google.com/chrome and lower the site’s PageRank for a period of at least 60 days.
The key difference here is "and lower the site's PageRank", which kinda kills the claim of many that PageRank doesn't matter with regard to ranking.
I did some rank checking, and currently the google.com/chrome page is ranking 46 in the US and 44 in the UK, but the download page is still ranking #1 in the US and UK for the terms 'Chrome' and 'Chrome browser', which again tells me that it is a manual PHRASE-BASED penalty (for the term 'browser'), as well as a PR-zero penalty.
As I posted on Matt’s page.. That’ll teach the Chrome Team to Send Matt Socks for Christmas
Finally, in his correspondence with SEL, Matt stated
While Google did not authorize this campaign, and we can find no remaining violations of our webmaster guidelines, we believe Google should be held to a higher standard, so we have taken stricter action than we would against a typical site.
(Bolding added by me.) Which raises the questions: WHO did authorise the campaign (if not Google), and more importantly, WHO actually carried out this spamming?
AFTER ALL (taken from the Google Webmaster Tools help):
Can competitors harm ranking?
There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.