Web site developers and search engine marketing practitioners typically follow every known rule, guideline, myth, and rumor when it comes to Google. This latest Google Webmaster Central Blog article illustrates the point.
In a recently released post called Dynamic URLs vs. Static URLs, the writers’ attempt to shatter myths about how Google handles dynamic URLs only served to freak everyone out.
It’s much safer to serve us the original dynamic URL and let us handle the problem of detecting and avoiding problematic parameters.
While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the URLs, the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.
and this one, for those who don’t wish for ANY other search engine to index their web pages:
You may be able to remove some parameters which aren’t essential for Googlebot and offer your users a nice looking dynamic URL. If you are not able to figure out which parameters to remove, we’d advise you to serve us all the parameters in your dynamic URL and our system will figure out which ones do not matter.
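Google’s advice boils down to dropping only the parameters that don’t change the content. As a minimal sketch of what that might look like, here is a Python snippet that strips a hypothetical list of session and tracking parameters (the parameter names and example URL are my own assumptions, not from Google’s post — which parameters are safe to drop depends entirely on your application):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameters assumed not to affect page content.
NON_ESSENTIAL = {"sessionid", "sid", "utm_source", "utm_medium", "ref"}

def strip_non_essential(url):
    """Remove query parameters that don't change what the URL serves."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in NON_ESSENTIAL]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_non_essential(
    "http://example.com/forums/index.php?sid=abc123&showtopic=66939&ref=home"))
# → http://example.com/forums/index.php?showtopic=66939
```

The point of doing this in the application rather than guessing which parameters Googlebot will ignore is that you, not the crawler, know which parameters actually drive content.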
and, hand over the Prozac now,
If you transform your dynamic URL to make it look static you should be aware that we might not be able to interpret the information correctly in all cases. If you want to serve a static equivalent of your site, you might want to consider transforming the underlying content by serving a replacement which is truly static. One example would be to generate files for all the paths and make them accessible somewhere on your site.
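Google’s suggestion of “generating files for all the paths” means a true static export, not a rewrite rule. A hedged sketch of the idea in Python — the page data here is a stand-in for whatever database backs the dynamic URLs, and the paths are invented for illustration:

```python
import pathlib

# Hypothetical content source; a real site would pull this from the
# database that backs its dynamic URLs.
PAGES = {
    "widgets/red": "<h1>Red widgets</h1>",
    "widgets/blue": "<h1>Blue widgets</h1>",
}

def generate_static_site(out_dir="static"):
    """Write one truly static HTML file per path and return the file list."""
    for path, html in PAGES.items():
        target = pathlib.Path(out_dir, path, "index.html")
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text("<html><body>%s</body></html>" % html)
    return sorted(str(p) for p in pathlib.Path(out_dir).rglob("index.html"))

print(generate_static_site())
```

Each generated `index.html` really is static — no parameters for a crawler to misinterpret — which is exactly the distinction Google is drawing against URLs that merely look static.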
Grab your favorite brand of popcorn and beer, read the comments left by readers who spotted this article, and then join in the further discussion at Cre8asiteforums in Google Does 180 And Says Dont Use Pretty Urls. I believe this is Google’s jump-the-shark moment.
A comment from the discussion at Cre8asiteforums:
Here’s what Google didn’t say, however.
Handling transposed or unnecessary parameters comes with a cost; the page has to be spidered multiple times, with tests to determine what is transposed or unnecessary, ultimately deciding what returns unique content and what doesn’t. Those tests cost you bandwidth and Google time, both of which are finite quantities not to be squandered. I believe Google decides when those tests should be run — as well as how many parameters it will accept — based on its perceived importance of the page. Yep, the dreaded PageRank qualifier.
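The “transposed parameters” problem the comment describes is easy to see concretely: two URLs with the same parameters in a different order serve identical content, so a crawler must either fetch both and compare, or canonicalize them. A small sketch of parameter-order canonicalization (the example URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonicalize(url):
    """Sort query parameters so transposed variants collapse to one URL."""
    parts = urlparse(url)
    params = sorted(parse_qsl(parts.query))
    return urlunparse(parts._replace(query=urlencode(params)))

a = canonicalize("http://example.com/index.php?showtopic=66939&view=findpost")
b = canonicalize("http://example.com/index.php?view=findpost&showtopic=66939")
print(a == b)  # → True
```

Every duplicate variant a crawler doesn’t have to fetch and diff is bandwidth and crawl time saved — which is exactly the cost the comment says Google rations by perceived page importance.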
Does that sound like I’m advocating rewriting problematic URLs?
I’m not, because in my opinion a rewrite at the server level is almost always a band-aid on an open wound at the application level.
(Oops! Watch out Google! The forums’ URL may break your servers – http://www.cre8asiteforums.com/forums/index.php?s=f0c4aadcf6382cac13c9a03b09e372f9&showtopic=66939&view=findpost&p=278878)