Archive for the ‘Technical Issues’ Category
Call them what you like: doorway pages, gateway pages, even zebra pages. Google has never liked them. In the past Google treated these as pages that were built with sneaky redirects in them; then they moved the goalposts of the definition slightly, and now we have a clear statement from Google not to use them, and, more importantly, exactly what Google themselves believe that definition to be.
Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase.
So that is clear and simple: don’t put up LARGE SETS of POOR QUALITY pages on your site, each optimised for a single term. Does that mean you can’t have a page for your important terms? Absolutely not. The key there is large sets of poor quality pages. Google wants, desires and yearns for large amounts of HIGH QUALITY pages on a topic.
In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.
Here they give us another indicator of an element within their algorithm set out to detect these sites. This would be typical of lead generation sites, or sites that have a single action page with many of these doorway pages built around it. A clear example would be a site that sells a single product, directly or via an affiliate CPA offer, with many pages surrounding a single action page.
Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users.
This is actually debatable: searching for something, landing on a page about that term, and finding what you want could be argued to deliver a GOOD user experience. Unfortunately, while the point may be debatable, Google are not up for debate.
Therefore, Google frowns on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the one they selected, and that provide content solely for the benefit of search engines.
Interestingly, Google have switched back to talking about doorway pages with redirects to a main site; they appear here to be talking about satellite sites, or micro-sites optimised for a term and then pointing to a main site. Is this a legitimate method of advertising, however? If the links are nofollowed, then it could be argued yes: the site is an advert rather than a doorway site. Google appear here to be attempting to prevent businesses from advertising their wares legitimately.
I worked with a well known company a few years ago, and they had a website, ‘quote me happy’, which supported their offline advertising campaigns. It would be wrong of any search engine to prevent a company from doing this. My advice would be that if you are using such sites, then nofollow the links to the main site, and ensure the capture site has relevant content.
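As a sketch, a nofollowed link from a campaign micro-site back to the main site looks like this (the domain name is a placeholder):

```html
<!-- On the campaign micro-site: rel="nofollow" asks search engines
     not to pass PageRank through this link, so the micro-site reads
     as an advert rather than a doorway feeding link value -->
<a href="https://www.example-main-site.co.uk/" rel="nofollow">Visit our main site</a>
```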
Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from Google’s index.
This is a scary prospect, as they appear here to be saying that they will take action against a doorway site AND other sites making use of these deceptive practices. So what is to stop a competitor setting them up and reporting you?
Finally, in the piece, they give some examples:
Having multiple domain names targeted at specific regions or cities that funnel users to one page
Templated pages made solely for affiliate linking
Multiple pages on your site with similar content designed to rank for specific queries like city or state names
It is fair to say that anyone using a strategy of having hundreds of THIN pages, geographically based or topic based, should review that strategy. If you have suffered a loss of traffic, or a change in the terms you receive traffic for, then this could be the answer.
OK, I know that most SEOs have known and been saying this for a long time, but now a Google employee, John Mu, has posted the following:
We do Toolbar PageRank updates 3-4x/year but to be honest, it’s not something that you need to wait for. The PageRank shown in the Toolbar is an older snapshot of the PageRank that we use internally (which is continuously updated). Changes in Toolbar PageRank will not change anything with your site’s crawling, indexing, or ranking, so as a webmaster, I’d strongly recommend focusing on something else.

If you are using PageRank as a means to sell PageRank-passing links, keep in mind that this is against our Webmaster Guidelines, and can negatively impact your site’s ranking in search results.

Cheers
John
I’ve found that Google typically does a major scan of the web every 28-30 days (once a month) and updates PR every 3-4 months. I’m just waiting for the next update so they can pick up my new site.
It appears that something is up with Google and their cache reporting. A lot of pages seem to have cache dates of August 12 or thereabouts, and sites that were previously cached regularly are reporting that they have not been cached.
So the questions now are:
Is it the cache date reporting that is wrong or
Is it that Google has suddenly ramped down their caching?
Are they ‘full’ again as they were a couple of years back?
Is something BIG, HUGE, MONUMENTAL, about to happen, are we in for another update of ‘Florida’ proportions?
Much has been said of late about the apparent worsening of web spam handling within the Google index. Could we be standing on the precipice, blissfully unaware?
Today I thought I would mention the case of the curiously disappearing pages in Google. I am now getting regular enquiries from people who are concerned about their site losing page saturation in the mighty G. No indexed page = no traffic, of course, which in turn = no money.
Having worked with a few sites that have suffered this, I have found that the main issues contributing to it were:
Near-duplicate content
By this I mean that the pages were pretty much duplicated, with only a small element changing on each page. This is especially a problem with e-commerce sites, as the header, navigation and footer all stay the same. Visible content might look a little different, but when you look at the code, you see that of hundreds of lines, maybe only 4 or 5 are unique.
Poor navigation/site structure
In this case I noticed that the actual site hierarchy was poorly designed. There was no clear structure for the search engines to apply weight to.
Poor linking structure
Here I saw poor linking, inasmuch as most pages were linked to from most other pages. This only served to water down PageRank (link juice) to the point where the pages were all seen as unimportant (again related to poor structure/architecture).
Too many links
This was a common theme, as many carts have pop-out or drop-down option selections, which look innocent enough, but on further investigation can be seen to be causing problems. It is possible to have hundreds of links per page, and this isn’t good.
While a flat file structure is OK for a small site, a clear linking structure and hierarchy MUST be evident in a large site. This allows Google to apply its weight to each page, its trust to each page, pro-rata.
Poor linking leads to poor PR spread, and that is not good in the eyes of Google. Despite what many say, actual PR matters to Google, it matters for many things, and gauging the value of a page is one of them.
All the above serve only to confuse the search engines as to the importance of pages within your site. Each site has a page saturation level, worked out by the two main elements in the Google algorithm (yes, there are really only 2 when it is all boiled down):
1. Importance – this is a measure of value in the eyes of Google, and is pretty much PageRank
2. Relevance – this is a textual value.
The above are further split into the 250 or more sub-elements that make up the algorithm, but when all is said and done, it is those 2 that matter.
With most of a shopping cart’s pages being near-duplicate content, and the page cross-linking structure being higgledy-piggledy at best, how is Google supposed to know what is important or relevant? Put simply, they can’t.
The result of this is that while a site may be showing some 1500 or so pages indexed, when you get to between 200 and 300 pages, the cached versions stop showing. So the reality is even worse: no cache in the main index, yet oftentimes a cache when you visit the individual page.
While Google announced the scrapping of the infamous supplemental index, it appears that, just as in the case of Mark Twain (Samuel Langhorne Clemens), rumours of its death were greatly exaggerated.
Finally, I will say this. If you read the Google webmaster technical advice pages, it tells you not to make the errors above that contribute to page dropping.
I recently changed hosts, and at the same time I moved the store from domain.com/store to the root, domain.com/. Sales have disappeared and I am getting loads of 404 errors. What can I do? This is an absolute nightmare for me and the business.
Sadly you have made an all too common error in assuming that it is OK to simply move your site around without it having an effect. The search engines know all the URLs on your site, and expect them to be there. People will have bookmarked pages to come back later and buy from you. Now they are all getting 404 errors. The spiders feel unloved, and worse, potential buyers think you have gone bust.
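The usual fix is a permanent (301) redirect from every old URL to its new location, so that both spiders and bookmarked visitors land on the right page. A minimal Apache .htaccess sketch, assuming the old store lived under /store/ and the paths beneath it are otherwise unchanged:

```apache
# In the .htaccess file at the site root (requires mod_rewrite).
# Permanently redirect any request for /store/... to the same path
# at the root, so old bookmarks and search listings resolve
# instead of returning 404.
RewriteEngine On
RewriteRule ^store/(.*)$ /$1 [R=301,L]
```

The 301 status tells the search engines the move is permanent, so over time they transfer the old URLs’ standing to the new ones rather than dropping them.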
Q: (privacy requested for obvious reasons)
Hi OWG, for some odd reason, Google has suddenly started spidering our site under the wrong domain name. The home page and a couple of others have been spidered under XXX.co.uk, while the rest are under the original domain of yyy.co.uk. We really are at a loss as to why this has happened. Can you help please, as yyy.co.uk is just an old domain name that we forward to our real site.
A: No problem! This happens often, and in many cases it is a timebomb waiting to go off: poor redirection of domains sets you up to be knocked down.
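A common cause is that the old domain serves the same content instead of redirecting to the real one, so the spiders index it as a separate site. A minimal .htaccess sketch (the domain names are placeholders) that 301-redirects every request on the old domain to the canonical one:

```apache
# Redirect all traffic from the old domain to the real site,
# preserving the requested path, so only one domain gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.real-site.co.uk/$1 [R=301,L]
```

With the 301 in place, the engines treat the old domain as a pointer rather than a site in its own right, and the duplicate listings fall away.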