You Can Now Force a Page Cache Update in the Google Index

Something that has always been a bit of a contradiction is that Google want the freshest content in their index, yet stretch the interval between crawls of static pages out to an ever-growing length of time. For instance, my consultancy site currently has a cache date of 3 days ago, but prior to that it was a good few weeks, well over a month. The reason, of course, is to make best use of resources within Google's crawling and indexing infrastructure. A site that doesn't change often doesn't need crawling often.

But what about when my site is overhauled? There could be a six-week-out-of-date version in the Google index, and THAT is not good for them or for me; freshness is king in the world of search.

Google have combated this by introducing a new option to submit a URL to the index after fetching it as Googlebot. It is a natural progression, isn't it, and the beauty of this is that Google can instantly and algorithmically detect major change on your page; at that point, submitting it to the index will as likely as not result in your linked target pages getting a crawl, and if THEY have changed, then probably the entire site.

Some key points, from the official announcement:

“This new functionality may help you in several situations: if you’ve just launched a new site, or added some key new pages, you can ask Googlebot to find and crawl them immediately rather than waiting for us to discover them naturally. You can also submit URLs that are already indexed in order to refresh them, say if you’ve updated some key content for the event you’re hosting this weekend and want to make sure we see it in time. It could also help if you’ve accidentally published information that you didn’t mean to, and want to update our cached version after you’ve removed the information from your site.”
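The submit-a-URL option itself lives in the Webmaster Tools interface, so there is no public API behind it to show you; but if you want something scriptable in the same spirit, the long-standing sitemap ping is the closest thing. Here is a minimal sketch in Python, assuming a sitemap at a placeholder URL; note a 200 response only means Google received the hint, not that a crawl is promised.

```python
import urllib.parse
import urllib.request

# Placeholder sitemap URL -- swap in your own site's sitemap.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

# Google's documented sitemap ping endpoint.
PING_ENDPOINT = "http://www.google.com/ping?sitemap="

def ping_google(sitemap_url):
    """Ask Google to re-fetch the sitemap, hinting that its URLs have changed."""
    # Percent-encode the whole sitemap URL so it survives as a query parameter.
    url = PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(url) as response:
        # HTTP 200 means the ping was accepted, not that a recrawl is guaranteed.
        return response.status == 200

if __name__ == "__main__":
    if ping_google(SITEMAP_URL):
        print("Ping accepted -- Google may now recrawl the listed pages.")
```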

Now while the service is limited in volume (for obvious reasons), it is a great boon and a major step forward, and in fact it reminds me of the old days when we would sit up all night submitting to Altavista (‘Alta who?’, I hear anyone under 30 saying), working round the loop of ‘optimise > submit > wait > check ranking change > optimise again > submit > wait for ranking change’. In those days, doing that, it was possible to launch a site and have it at #1 within hours.

How trusting the search engines were back then. But then again, is giving them what they want wrong?

Whatever way you look at it, as Jonathan Simon and Susan Moskwa state in that snippet, it can get fresh content on irregularly crawled pages into the index quickly, and it can correct errors. In short, everyone wins!

 
