
3.2 Dynamic pages

A dynamic website is one built with server-side technologies. Technologies like PHP, CGI, JSP and the like allow you to connect to a database, authenticate users, store user sessions and, in general, manage many dynamic pages with a single file. For instance, with the file "product.php", depending on the parameters used, it is possible to present all of your company's products to the users. You will have distinct dynamic pages, for example "www.mydomain.com/product.php?id=1" and "www.mydomain.com/product.php?id=2", managed by the same PHP file. With static files, on the contrary, you would need as many HTML files as there are pages on the website: since no connection to a database is possible with plain HTML, all the information must be hard-coded in the files instead of stored in a database. This is the reason dynamic websites are increasingly popular: they are flexible, fast to develop and efficient.
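As a sketch of the idea, a single "product.php" file serving every product page might look like the following. The database name, credentials, table and column names are hypothetical, chosen only to illustrate the pattern:

<?php
// One file serves every product page: product.php?id=1, product.php?id=2, ...
// Database name, credentials and table layout are illustrative only.
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

$db = new PDO('mysql:host=localhost;dbname=shop', 'user', 'password');
$stmt = $db->prepare('SELECT name, description FROM products WHERE id = ?');
$stmt->execute(array($id));
$product = $stmt->fetch(PDO::FETCH_ASSOC);

if ($product === false) {
    die('Product not found');
}

// The same script renders a different page for each id parameter.
echo '<h1>' . htmlspecialchars($product['name']) . '</h1>';
echo '<p>' . htmlspecialchars($product['description']) . '</p>';
?>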
Unfortunately, most search engines do not like dynamic content. They usually will not crawl deeply into links with parameters, they give such pages much less PageRank, or they even refuse to crawl dynamic pages at all. Static URLs typically rank better and are indexed more quickly than dynamic URLs. Usually a spider truncates a URL after a certain number of variable characters (e.g. "?", "&", "="); if, for instance, it cuts everything after a question mark, then instead of indexing the two different URLs in the example above, it will index just one. This can be a serious drain on the overall ranking of the website.
Another issue is that dynamic pages generally do not have any keywords in the URL. You have already seen that this is one of the key places where search engines look for the search words. A study of the top ten results for very competitive keywords showed that Google has 40-50% of those top ten with the keyword either in the URL or the domain, Yahoo shows 60%, and MSN 85%.

However, there are solutions to this problem.
The easiest, and least powerful, solution is the creation of traditional static pages. The correct way to use these newly created static pages is to place links to the dynamic pages on them and then submit the static pages to the major search engines according to each engine's recommended guidelines, as in the site map sketched below. This technique is easily implemented with a site map that lists all the links to the dynamic pages across the website. While the crawlers may not index every dynamic page in full, they will index most of the content.
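For instance, such a site map could be a plain static HTML file; the file name and the product links below are only illustrative:

<html>
<head><title>Site map</title></head>
<body>
<h1>Site map</h1>
<!-- Static links a crawler can follow to reach the dynamic pages -->
<a href="http://www.mydomain.com/product.php?id=1">Product 1</a><br>
<a href="http://www.mydomain.com/product.php?id=2">Product 2</a><br>
</body>
</html>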
A trickier, but more powerful, solution, if your site is hosted on a Linux server, is to use Apache's mod_rewrite module, which converts dynamic URLs into static-looking ones transparently to the crawler. Every time the static address arrives at the server, the module converts it back to the dynamic one. For example, the static address:
http://www.mydomain.com/product/dishwasher_blue
is converted back every time to the original dynamic version:
http://www.mydomain.com/product.php?id=2345
To implement this solution, you need to enable the module on your server and edit the .htaccess file, adding a rewrite command for each dynamic URL you want Apache to convert; this is the rule the module uses to translate each static link back into the original dynamic one. Since those rules are written in a particular syntax that the module recognizes, you can avoid learning it by using the URL rewriting tool (http://www.webconfs.com/url-rewriting-tool.php). With this tool, for each dynamic page you just enter the URL into the box, press submit, and copy and paste the generated code into the .htaccess file in the root of your website.
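For the example above, the rule added to the .htaccess file would look something like this (one such line per page; the exact pattern a generator tool emits may differ):

RewriteEngine On
# The static path is rewritten back to the original dynamic URL
# before the request is handled, invisibly to visitors and crawlers.
RewriteRule ^product/dishwasher_blue$ product.php?id=2345 [L]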
However, this solution too has a drawback: you have to change all the links on your website to the static address version in order to avoid penalties for duplicate URLs. In fact, the old dynamic address still works; the only difference is that it does not require translation by the server, and there is obviously no difference in content between the two versions. This means that if a crawler is smart enough to read a dynamic page, or the page is particularly simple, and the website contains both the dynamic and the static link to the same page, you can be penalised for duplicate URLs. The solution is to hide the dynamic page with the robots.txt file.
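With the example above, a robots.txt file in the root of the site would only need to block the dynamic script, so that crawlers see the static version alone; since the Disallow rule matches URL prefixes, it also covers every "?id=..." variant:

User-agent: *
Disallow: /product.php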

If your website is not hosted on a Linux server, or if you want a completely automatic solution, there are commercial URL rewriting tools. For instance, LinkFreeze is a fast and easy tool that supports the main scripting languages, while XQASP from Exception Digital Enterprise Solutions is specifically designed for ASP websites. There are other good solutions to the problem as well; the main differences are flexibility, price and the assistance provided.