Dynamic Website SEO Terror Level Downgraded to Yellow
Dynamic content used to be a red flag for search-engine-friendly design, but times have changed. Search engines now include dynamically generated pages in their indexes, though some particulars of dynamic pages can still be obstacles to getting indexed.
Whether it’s keeping in sync with inventory or updating a blog, if you’re a website owner you more than likely have some level of dynamic or CMS-managed content on your site (and if not, you should really look into it for your next redesign). Follow the guidelines here to avoid major pitfalls and ensure that your dynamic body of work is search engine friendly from head to toe.
Rule #1: Be sure that search engines can follow regular HTML links to all pages on your site.
Any website needs individually linkable URLs for all unique pages on the site. This way every page can be bookmarked and deep linked by users, and indexed by search engines. But dynamic websites have an additional concern: making sure the search engine robots can reach all of these pages.
For example, suppose you have a form on your website: you ask people to select their location from a pull-down, and then when people submit the form your website generates a page with content that is specifically written for that geographical area. Search engine robots don’t fill out forms or select from pull-down menus, so there will be no way for them to get to that page.
This problem can be easily remedied by providing standard <a href> HTML links that point to all of your dynamic pages. The easiest way to do this is to add these links to your site map.
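For instance, a bare-bones HTML site map page might contain nothing more than a plain list of links (the location pages below are hypothetical placeholders):
<ul>
  <!-- Plain <a href> links that any crawler can follow; no forms or scripts required -->
  <li><a href="/locations/boston/">Boston service area</a></li>
  <li><a href="/locations/chicago/">Chicago service area</a></li>
  <li><a href="/locations/seattle/">Seattle service area</a></li>
</ul>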
Rule #2: Set up an XML site map if you can’t create regular HTML links to all of your pages, or if it appears that search engines are having trouble indexing your pages.
If you have a large (10K pages or more) dynamic site, or you don’t think that providing static HTML links is an option, you can use an XML site map to tell search engines the locations of all your pages.
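As a sketch, a minimal site map file following the sitemaps.org protocol looks like this (the URL and date are placeholders; save the file as something like sitemap.xml at your site root):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want the engines to know about -->
    <loc>http://www.yoursite.com/church-bells/discount/</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>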
Most website owners tell Google and Yahoo! about their site maps through the search engines’ respective webmaster tools. But if you’re an early adopter, you should look into the new system whereby a site map can be designated right in the robots.txt file using sitemap autodiscovery. Ask.com, Google and Yahoo! currently support this feature. Cool!
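Autodiscovery itself takes exactly one line in your robots.txt file (assuming your site map is named sitemap.xml and lives at the site root):
Sitemap: http://www.yoursite.com/sitemap.xml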
Rule #3: If you must use dynamic URLs, keep them short and tidy
Another potential problem – and this is one that is subject to some debate – is with dynamic pages that have too many parameters in the URL. Google itself in its webmaster guidelines states the following: “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”
Here are a few guidelines you should follow for your website parameters:
- Limit the number of parameters in the URL to a maximum of 2
- Use the parameter “?id=” only when it refers to a session ID [editor’s note: this is no longer a problem]
- Be sure the URL still functions if all dynamic parameters are removed
- Be sure your internal links are consistent – always link with parameters in the same order and format (see the example below)
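To illustrate that last point: suppose, hypothetically, your product pages take a category and an item parameter. These two URLs return the same page, but a crawler can treat them as two different pages, splitting your link value between them:
http://www.yoursite.com/prod.php?cat=5&item=12
http://www.yoursite.com/prod.php?item=12&cat=5
Pick one order and use it in every internal link.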
Rule #4: Avoid dynamic-looking URLs if possible
Besides being second-class citizens of search, dynamic-looking URLs are also less attractive to your human visitors. Most people prefer to see URLs that clearly communicate the content on the page. Since reading the URL is one of the ways that people decide whether to click on a listing in search engines, you are much better off having a URL that looks like this:
http://www.yoursite.com/church-bells/discount/
rather than this:
http://www.yoursite.com/prod.php?id=23485&blt=234
We also think that static-looking, “human-readable” URLs are more likely to receive inbound links, because some people will be less inclined to link to pages with very long or complicated URLs.
Furthermore, keywords in a URL are a factor, admittedly not a huge one, in search engine ranking algorithms. Notice how, in the above example, the static URL contains the keywords “discount” and “church bells” while the dynamic URL does not.
There are many tools available that will re-create a dynamic site in static form. There are also tools that will rewrite your URLs, if you have too many parameters, to “look” like regular non-dynamic URLs. We think these are both good options for dynamic sites; Intrapromote has a helpful post on dynamic URL rewriting.
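As a rough sketch, if your server runs Apache with mod_rewrite enabled, a rule like this in your .htaccess file (reusing the hypothetical URLs from the example above) serves the dynamic page whenever the static-looking URL is requested:
RewriteEngine On
# Quietly map the clean URL to the real dynamic script
RewriteRule ^church-bells/discount/?$ /prod.php?id=23485&blt=234 [L]
Once the rule is in place, make sure your internal links point only to the clean URL, so that is the version the engines see and index.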
Rule #5: De-index stubs and search results
Have you heard of “website stubs?” These are pages that are generated by dynamic sites but really have no independent content on them. For example, if your website is a shopping cart for toys, there may be a page generated for the category “Age 7-12 Toys” but you may not actually have any products in this category. Stub pages are very annoying to searchers, and search engines, by extension, would like to prevent them from displaying in their results. So do us all a favor and either figure out a way to get rid of these pages, or exclude them from indexing using the robots.txt file or robots meta tag.
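For example, if your shopping cart software can tell when a category is empty, you could have it emit a robots meta tag in the <head> of those stub pages:
<!-- Keep this stub page out of the index, but let robots follow its links -->
<meta name="robots" content="noindex,follow">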
Search results from within your own website are another type of page for which Google has stated a dislike: “Typically, web search results don’t add value to users, and since our core goal is to provide the best search results possible, we generally exclude search results from our web search index.” Here’s our advice: either make sure your search results pages add value for the searcher (perhaps by containing some unique content related to the searched term), or exclude them from indexing using the robots.txt file or robots meta tag.
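If your internal search results all live under one path (the /search/ directory here is just an assumption; substitute your own), the robots.txt exclusion is a two-liner:
User-agent: *
# Keep internal search results pages out of search engine indexes
Disallow: /search/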
Bonus Points: Handling duplicate content
While it’s not a problem that’s specific to dynamic sites, this rule is one that dynamic sites are more likely to break than static ones.
If multiple pages on your site display materials that are identical or nearly identical, the duplicates should be excluded from indexing using the robots.txt file or a robots meta tag. Think of it this way: you don’t want all your duplicate pages competing with each other on the search engines. Choose a favorite, and exclude the rest. [Editor’s note: we no longer (2009) recommend de-indexing duplicate content. A better approach is either to redirect your duplicate pages to the primary page using a server-side 301 redirect, or to set up a <link rel="canonical"> tag for any page that has been duplicated. A good explanation of best practices for handling duplicate content in 2009 can be found at Matt Cutts’ blog.]
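To sketch the 2009 recommendations using the hypothetical URLs from Rule #4: on Apache, a server-side 301 redirect from a duplicate page (the old path below is made up) takes one line in .htaccess, and the canonical tag is a single line in the duplicate page’s <head>:
Redirect 301 /church-bells/discount-old/ http://www.yoursite.com/church-bells/discount/
<!-- Or, on the duplicate page itself, point engines at the preferred version -->
<link rel="canonical" href="http://www.yoursite.com/church-bells/discount/">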
Dynamic content is usually timely and useful, which is why users love it, and the search engines want to list it. And now you know how to help your dynamic website reach its full search engine potential.