Archive for the ‘Industry News’ Category
According to Matt Cutts, the long-awaited and feared Penguin 2.0 update has been released by Google. How have you fared?
Penguin 2.0 is a full-on update rather than a tweak, and Google have stated that 2.3% of English-language sites will be affected, which is a huge number by anyone’s standards.
In recent times Cutts has stated that they are going upstream with this update, that they are targeting spam more heavily, and that this update will rock a few boats; the video below explains this.
Time will tell.
An interesting post on Search Engine Roundtable references the possibility that Google is using self-page linking as a spam signal where keyword-rich anchor text is used. In case you are not sure what this means, let me explain.
Let’s say you have a site about blue widgets, and in your homepage content you have a link (or more than one link) pointing back to that same page (the homepage) with your main keyword as the anchor text, i.e. ‘blue widgets’.
So why use it as a signal, and what is this based on? Well, put simply, why would you link to the same page you are already on, if not to gain the benefit that anchor-text links bring? Keep in mind that the linking page is also given more weight for its anchor text, as well as the target page.
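To make the pattern concrete, here is a minimal sketch (in TypeScript, runnable in a browser console) of how you might spot keyword-rich self-links on one of your own pages. The keyword list and the detection logic are my own illustration, not anything Google has published:

```typescript
// Hypothetical check: find links on the current page that point back
// to the page itself AND use keyword-rich anchor text.
const pageUrl = new URL(window.location.href);
const keywords = ["blue widgets"]; // your main keyword set (assumed)

const selfLinks = Array.from(
  document.querySelectorAll<HTMLAnchorElement>("a[href]"),
).filter((a) => {
  const target = new URL(a.href, pageUrl);
  const isSelfLink =
    target.hostname === pageUrl.hostname &&
    target.pathname === pageUrl.pathname;
  const hasKeywordAnchor = keywords.some((kw) =>
    (a.textContent ?? "").toLowerCase().includes(kw),
  );
  return isSelfLink && hasKeywordAnchor;
});

console.log(`Found ${selfLinks.length} keyword-rich self-link(s).`);
```

If that count is greater than zero on your homepage, you have the exact pattern the thread below describes.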
This has been brought up as a result of a thread on the WebmasterWorld forum, where a member states that out of their 150 sites, 3 got hit, and the only common element was this homepage self-linking. The evidence seems weak if you ask me, but it would be a simple yet effective signal: it could be enough to drop the trust on the page, which in turn could impact other elements.
For the first time in the last 5 years, Google’s market share of search in the UK dropped below 90%. We have to ask ourselves: is the honeymoon over, or, worse than that, has the marriage hit the rocks? Consider the multiple antitrust investigations around the globe; the massive fines for alleged illegality, reaching as far as assisting in dealing prescription drugs into the US, which Google settled for $500m; and the UK/EU privacy fiasco where Google grabbed personal details from home computers with their Street View cars, firstly claiming it to be an accident, only for it later to be claimed as premeditated, while it also transpired that Google had not deleted the very information it was originally investigated over. The UK government re-opened their investigation.
OK, so with that sort of concern in mind, some have tried to see if it is possible to live a life outside of Google.
At long last, after months of promising, Google have finally released their ‘don’t count these links’ tool, or to give it its proper name, the ‘Disavow tool’. So, is this what we all need? The answer is a resounding yes, but is it enough of a tool to ease the pain of the many webmasters hit by the Penguin algorithm?
Sadly, I believe there is a fatal flaw in this, and that flaw is that Google simply don’t report (in Webmaster Tools) ALL the links they are using in the evaluation of your site within their algorithms.
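For anyone preparing a submission, the disavow file itself is just a plain text file: one URL or one domain per line, with lines starting with # treated as comments. A minimal example (the domains here are placeholders, not real sites):

```
# Spammy blogroll links we could not get removed
# (contacted the webmaster twice, no reply)
domain:spammy-blog-network.example
http://another-spam-site.example/links/page1.html
```

The domain: prefix disavows every link from that domain, while a bare URL disavows only links from that specific page.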
Matt Cutts, head of webspam at Google, introduced the tool in the above video. It is quite a long video, but there are a few key things he mentions, chief among them that you should look at RECENT links. Now this is a big thing, as only a few months ago John Mu (another senior Google guy) stated that ALL links should be looked at, as all links are evaluated. So this statement by Matt tells me that ‘maybe’ Google is using recent links as the trigger, possibly to combat negative SEO (something that, thanks to the changes Google made, is now a real-world practice; I have certainly managed to get some of my test sites blown out of the water. Sorry, test sites, but that is what you are there for: taking one for the team).
Next, Matt says that Google are working on a system so that 2 or 3 example links will be given in the warning message in Webmaster Tools. Again, this is a real help, and I for one think that some ‘guest bloggers’ will be stunned to see that what they have is a rose by any other name, and the name of that rose is unnatural links. Understand that I am not for one second stating that all guest blogging is bad, but it is certainly likely to be the next thing on the spam radar for Google in my opinion, especially the stuff that isn’t really guest blogging, but is in fact blog link networking.
Next we have a negative. So you have loaded the tool up, you have spent large portions of your life tracking down these links, you have done all in your power to get them removed, and you have uploaded your URL list and asked Google to disavow the links. So all will be well in the world, because Google will disavow those links, right? WRONG! And just how wrong? Well, Matt says: “we treat it as a strong suggestion, but we don’t treat it as something we absolutely have to abide by!” Now, this old SEO thinks it is disgusting that Google not only allow the actions of others to affect your business, but NOW also refuse to accept your instruction to remove from their algorithmic process the links you have told them to. WHY?
This tells me that there is a secondary effect from pages reported: not only do they disavow links to YOU, they disavow links to everyone from that site. OR they may be feeding disavowed sites into a reduced-trust-factor sub-algorithm. Who knows.
A bit about the process: it is going to take weeks, because they apply what is essentially a ‘nofollow’ equivalent to each link, but only at the point of the next crawl/index.
Then we have the bombshell, where he says that during a re-inclusion request, don’t think that they are going to look only at the links not disavowed; they are going to look at the links that existed before the disavow. He actually states that the disavow tool is not the answer to your ills.
SO! Is this tool really useful, is it more public relations for Google, or is it going to not help you at all, but instead help GOOGLE to identify spammy sites? The jury is out, but I will be working with some test sites, and possibly some of the people who contacted me after being hit by Penguin, to test it, as we have already done all we can.
One last thing. A quick story. A client launched a new site 10 weeks ago. The site was built superbly well, optimised, with 100% unique content; it was spidered, and Google liked what they saw. A bit of press coverage was gained, and all looked well. Then it crashed. On investigation, one of his competitors had carried out negative SEO by buying a load of blogroll links from spam sites. New sites are very vulnerable to negative SEO, so the algorithm is still in a mess, and Google are making it VERY easy for unscrupulous business people to sabotage any new competitor that enters the marketplace.
I am seeing a large increase in cases where a business has been hit through no fault of its own, i.e. negative SEO sabotage, and for me that is heartbreaking.
What are your thoughts on this?
HALLELUJAH! At last there has been some logic in the ways of Google.
Matt McGee recently covered an up-close-and-personal session at SMX with Matt Cutts and blogged about it at Search Engine Land. For me, the biggest thing to come out of it is the following quote from Cutts:
The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.
Some have suggested that Google could disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three.
I have been shouting for this for a LONG time, and many others have been saying the same thing. The only issue we have to face now is that Google are still not showing us all the links they know about.
In some cases... no, scratch that, in MANY cases this could mean that the very links that are hurting a site, the very links Google is basing its negative view on, are simply not being shown to us, and as a result we (the affected site owners) will still be none the wiser.
It is a great move by Google to bring this in, and truth be told, they have probably realised just how exposed they are to a lawsuit for damages. BUT (and this is a big but) they absolutely must give the verified webmaster access to every single link they know about. Otherwise, once again, we will have been given a brush with no head and a shovel with no handle with which to clean up the mess.
I would say this is pretty much an admission of failure.
Look at this beauty from Matt Cutts.
People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.
Did you read that? They now admit that there have been cases where negative SEO has happened.
So to all those who said it was rubbish... dream on, and admit you were wrong.
Google Sued for Invasion of Privacy
It seems like not a day goes by without the once clean-as-a-whistle start-up Google being accused of breaching some law or other. The story broke a day or two ago, when it was announced that those bad boys at Google had allegedly written code that bypassed the privacy settings in the Safari browser, allowing the giant to track users, and it got worse as it was announced that one disgruntled iPhone user has actually filed a lawsuit against Google, claiming:
Google’s “willful and knowing actions” violated federal wiretapping laws, among other statutes.
Now THAT is some serious stuff right there, and if I were Google I would be drawing straws to see who will carry the can and do the jail time that this sort of thing can result in.
NOW HANG ON, I hear people saying. This is just little old Safari, around 10% of the market, so not that big a deal, huh? But THEN we hear from Microsoft and their Internet Explorer browser, THE most important browser on the planet, and they had the following to say about Google breaching Internet Explorer privacy settings:
When the IE team heard that Google had bypassed user privacy settings on Safari, we asked ourselves a simple question: is Google circumventing the privacy preferences of Internet Explorer users too? We’ve discovered the answer is yes: Google is employing similar methods to get around the default privacy protections in IE and track IE users with cookies. Below we spell out in more detail what we’ve discovered,
So where does that leave us?
In the European Union, this has HUGE potential consequences for Google, as it breaches the Human Rights Act 1998. You know, that tiny little piece of legislation that forces the UK etc. to release paedophiles and convicted terrorists into the community, but prevents us from sending them home? The LAW states that every EU citizen has the right to privacy in their home and family life. Google have breached what is a fundamental human right, and they have done so deliberately, by bypassing a mechanism that was there for this very reason.
People have a legal right to privacy, and if a company can’t or won’t abide by that right, then it should not be allowed to trade within that community (in this case, the EU). Google are currently being investigated on multiple fronts within the EU, and an antitrust case is being finalised as we speak, with the outcome due sometime in March. That particular case concerns the alleged illegal promotion of Google-owned properties in the Google search results.
That antitrust case pales into insignificance, however, compared to the alleged breach of human rights legislation: legislation so powerful that it has recently prevented the deportation of a convicted Al Qaeda terrorist. Such is the might of the law Google have allegedly breached.
I would hope that this is not swept under the carpet, like so many other instances of illegal behaviour before it. Google has ENORMOUS power, and there are two sayings ringing in my ears right now:
1. With power comes responsibility
2. Power corrupts, and absolute power corrupts absolutely.
As someone who witnessed the birth of Google within search, it pains me to see what they have become, and are becoming: just another power-hungry, ride-roughshod-over-anyone-to-get-what-they-want type of corporation.
Announced a few hours ago on the Google Inside Search blog (link to PDF) was this little beauty: page layout analysis is now part of the quality score within the ranking algorithm. What does this mean in reality?
Back in 2004, Microsoft created an algorithm that scored links based on their location, type, and size within a page. This was called Block-Level Link Analysis, and as a result all links were no longer equal (which is a good thing). It has long been believed that Google have used something similar for some time, but they have now taken it a step further.
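As a rough illustration of the block-level idea (my own sketch, not Microsoft’s or Google’s actual implementation; the block names and weights are invented for the example), the page is segmented into regions, and each link is valued according to the block it sits in:

```typescript
// Hypothetical block-level link weighting.
// Block names and weights are illustrative assumptions only.
type Block = "main-content" | "sidebar" | "header" | "footer";

const blockWeights: Record<Block, number> = {
  "main-content": 1.0, // links in the body copy count fully
  sidebar: 0.4,        // blogroll/sidebar links count less
  header: 0.3,
  footer: 0.1,         // boilerplate links count least
};

interface Link {
  href: string;
  block: Block; // which region of the page the link sits in
}

// A link's effective value is its base value scaled by its block weight.
function linkScore(link: Link, baseValue = 1): number {
  return baseValue * blockWeights[link.block];
}

console.log(linkScore({ href: "https://example.com/", block: "footer" })); // 0.1
```

Under a scheme like this, a footer or blogroll link is worth a fraction of an in-content editorial link, which is exactly why all links are no longer equal.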
Google claim to be interested in user experience above all else (other than making money, of course), and this change appears to be a quality-control change, but this old bald guy can’t help thinking there is huge potential for babies to go out with the bathwater.
So let’s grab some snippets of that blog post by Matt Cutts.
As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change.
So they are acting on complaints (from their testers) saying users want to see content above the fold.
Define Above the Fold?
Two HUGE questions there: how do you define ‘content’, and how do you define ‘above the fold’? For those that don’t know, ‘above the fold’ is a term from the newspaper days of broadsheets, which were traditionally folded in half. On the news stands, the papers would be placed with the masthead and headline showing (the top half of the front page); anything in the lower printed part is ‘below the fold’. On a web page, ‘below the fold’ means the part that requires scrolling to read. The difficulty here is that in a world where monitor sizes and screen resolutions both vary dramatically, how on earth do you define ‘above the fold’? Sorry Google, but that is pretty much an impossible task.
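To see just how slippery this is, here is a minimal sketch (my own, in TypeScript for the browser; the #main-content selector and the list of viewport heights are assumptions) that asks whether an element starts above the fold at a handful of common screen heights. Notice that the answer changes depending on which resolution you pick:

```typescript
// Hypothetical check: does an element begin above the fold?
// "The fold" here is simply an assumed viewport height in pixels.
const viewportHeights = [600, 768, 900, 1080]; // common screen heights (assumed)

function isAboveFold(el: HTMLElement, foldPx: number): boolean {
  // Distance from the top of the document to the top of the element.
  const topOfElement = el.getBoundingClientRect().top + window.scrollY;
  return topOfElement < foldPx;
}

const content = document.querySelector<HTMLElement>("#main-content");
if (content) {
  for (const h of viewportHeights) {
    console.log(`Fold at ${h}px: above the fold = ${isAboveFold(content, h)}`);
  }
}
```

The same element can be ‘above the fold’ on a 1080px display and ‘below the fold’ on a 600px netbook, which is precisely the problem.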
On a photography site the content may be imagery; elsewhere the content may be delivered by video. So HOW THEN are Google going to define the term ‘content’? Flash, Ajax: there are so many technologies that can render a page apparently devoid of content, yet still deliver an enriching, entertaining, satisfying user experience. But is this ‘content’ under the definition which Google apply?
and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
That right there tells me they are going after excessive advertising, which was a large factor in the Panda update(s); it is also clearly saying that, going forward, such sites will not rank as highly.
This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.
Now read this and note, “or that make it hard to find the actual original content”, so again this appears to be targeting content scrapers (like Google :p sorry, I couldn’t resist!). Back on topic, it looks like they are trying to reward original content with a balance between ads and content. I was involved in traditional publishing, and a 60/40 mix (with content being 60%) is about as far as you want to go without ruining the user experience and causing ad blindness, which results in poor advertising ROI. Which is fine, but a page with a single video could carry the equivalent of 1,000 words: all original, all relevant, all content. How will Google handle this?
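As a rough way to sanity-check a page against that 60/40 rule of thumb, here is a sketch (my own; the .ad and .content selectors and the 768px fold are assumptions you would adjust for your own markup) that compares the screen area given to ads versus content above the fold:

```typescript
// Hypothetical above-the-fold ad/content area check.
const FOLD_PX = 768; // assumed fold height in pixels

// Sum the area (px^2) that matching elements occupy above the fold.
function areaAboveFold(selector: string): number {
  let total = 0;
  document.querySelectorAll<HTMLElement>(selector).forEach((el) => {
    const rect = el.getBoundingClientRect();
    const top = rect.top + window.scrollY;
    // Only count the portion of the element that sits above the fold.
    const visibleHeight = Math.max(0, Math.min(top + rect.height, FOLD_PX) - top);
    total += visibleHeight * rect.width;
  });
  return total;
}

const adArea = areaAboveFold(".ad");
const contentArea = areaAboveFold(".content");
const contentShare = contentArea / (contentArea + adArea || 1);
console.log(`Content share above the fold: ${(contentShare * 100).toFixed(0)}%`);
// Under the 60/40 rule of thumb, you would want this to read 60% or more.
```

It is crude (it knows nothing about video or Flash, the very problem raised above), but it makes the trade-off visible.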
OK Google, Define ‘Above The Fold’, Please
If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly. You can use our Browser Size tool, among many others, to see how your website would look under different screen resolutions.
In short, they fail to tell you what THEY are classing as above the fold. Admittedly, they recommend you check your pages using browser and screen resolution simulators, BUT they give absolutely no reference point with which to gauge where your content sits.
I appreciate what Google are trying to achieve here, but I can’t help thinking there will be a lot of good, honest websites disappearing from page 1, losing pretty much all their traffic and revenue, PURELY because Google has again introduced an element that is vague and doesn’t really stop the spammers (who right now are placing layers of content above the fold to combat this change, thereby protecting themselves from it). Meanwhile, the good old boy watches his really popular image-based website go down the pan (along with his earnings), killing not only his website, but his dream of earning a living from the web, and his faith in Google’s ‘don’t be evil’ mantra.
Just noticed this when I did a search for analytics. Looks like Google are rolling this out over the next couple of days.
I can’t help thinking that Google are playing one heck of a high-risk game with all these SERP changes. It really is getting more and more congested, and it is a case of ‘I wonder what kind of SERP I am gonna get TODAY?’ Elements seem to come and go: sitelinks one day, blocks of links the next, AdWords intertwined with Places listings. I think I have seen maybe 3 different formats today alone.
It is now getting a little off-putting, as finding ANYTHING on a Google SERP is becoming a chore.
Just a rocket-quick post to say that, as of now, Google are #1 again in the SERPs.
YES! They are of course top for the term ‘browser’... via Google AdWords :p
Buying or selling links that pass PageRank is in violation of Google’s Webmaster Guidelines and can negatively impact a site’s ranking in search results.
Too bad Google didn’t follow their own advice, as Matt Cutts has just acted swiftly, in a complete public relations stunt... sorry, I meant in a wonderful act of fairness... to issue a manual 60-day penalty against Google Chrome for the term ‘browser’, zeroing its PageRank (which, interestingly enough, is an odd statement to make, seeing as Google have repeatedly claimed not to allow Google pages to pass PageRank; an investigation for another day).
I should really cite a couple of pages here: first, Aaron Wall (way to go, Aaron), who originally posted this Google-buying-paid-links story; also Danny Sullivan, who then broke it further on Search Engine Land; and Matt Cutts (who has only paid for one link in this post, so doesn’t get a BOGOF offer this time around... joke).
Some considerations in the statement made by Matt:
We double-checked, and the video players weren’t flowing PageRank to Google either.
This indicates that some players CAN and DO pass PageRank.
Even though the intent of the campaign was to get people to watch videos–not link to Google–and even though we only found a single sponsored post that actually linked to Google’s Chrome page and passed PageRank, that’s still a violation of our quality guidelines
Crazy that even though they knew the intent, and knew it was an error, a penalty was issued. My reading is that under the normal circumstances of a manual review, a single backlink among so many would not have resulted in such a penalty.
Here is something a little different. On Matt’s page he states:
In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days.
But in the correspondence sent to Danny at Search Engine Land he states:
We’ve investigated and are taking manual action to demote www.google.com/chrome and lower the site’s PageRank for a period of at least 60 days.
The key difference here is “and lower the site’s PageRank”, which kinda kills the claim of many that PageRank doesn’t matter with regard to ranking.
I did some rank checking, and currently the google.com/chrome page is ranking 46 in the US and 44 in the UK, but the download page is still ranking #1 in the US and UK for the terms ‘Chrome’ and ‘Chrome browser’, which again tells me that it is a manual PHRASE-BASED penalty (for the term ‘browser’) as well as a PR-zero penalty.
As I posted on Matt’s page: that’ll teach the Chrome team to send Matt socks for Christmas!
Finally, in his correspondence with SEL, Matt stated:
While Google did not authorize this campaign, and we can find no remaining violations of our webmaster guidelines, we believe Google should be held to a higher standard, so we have taken stricter action than we would against a typical site.
(Bolding added by me.) This raises the questions: who DID authorise it, and more importantly, if not Google, WHO commissioned the work, and WHO actually carried out this spamming?
AFTER ALL (taken from the Google Webmaster Tools help):
Can competitors harm ranking?
There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.