SEO Tips for Dynamic Websites
The Many Pitfalls of Dynamic Sites and How to Optimize for Them
One might wonder, given the prevalence of dynamic websites and the sheer impracticality of static ones, why this issue hasn't been solved yet. The truth is that many search engines have already tried to improve support for dynamic websites, but the problem is hard to tackle. In the meantime, those who find a way to optimize their dynamic site have an advantage over those who don't. In practice, this means that an optimized dynamic website could make you more profitable in the end, and is worth the effort.
Now that we know the benefits of having an optimized website, it's time to consider how that is to be done. That, however, requires explaining a few things beforehand, so here comes the theory.
What exactly is the problem with dynamic websites?
What defines a website as dynamic is that each page is created, or "synthesized," on the fly, instead of being written once and merely reproduced many times. The user makes a request, and based on that request the site generates results. The easiest way to explain this is with an example.
Imagine we have a real estate website that deals with property in England and France, and say we want only houses in England. We enter that on the site, and it produces a list of what we were looking for, on the spot, by searching its database, compiling the list, and building the final page for us to see. Technically speaking, this information is passed through the site's URL, as you've probably seen. In our case the query string would look something like: ?country=England&type=house
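To make the idea concrete, here is a minimal sketch of how such a page might be synthesized from a query string. The listings, titles, and the function name render_results are hypothetical, purely for illustration:

```python
from urllib.parse import parse_qs

# Hypothetical in-memory "database" of property listings.
LISTINGS = [
    {"country": "England", "type": "house", "title": "Cottage in Kent"},
    {"country": "England", "type": "flat",  "title": "Flat in London"},
    {"country": "France",  "type": "house", "title": "Villa in Provence"},
]

def render_results(query_string):
    """Synthesize a results page on the fly from the URL's query string."""
    params = parse_qs(query_string)
    country = params.get("country", [None])[0]
    prop_type = params.get("type", [None])[0]
    matches = [item["title"] for item in LISTINGS
               if (country is None or item["country"] == country)
               and (prop_type is None or item["type"] == prop_type)]
    return "\n".join(matches)

print(render_results("country=England&type=house"))  # prints "Cottage in Kent"
```

The point is that the page does not exist anywhere until the request arrives; a different query string produces a different page from the same code.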
However, search engines like Google have difficulty handling all the possible combinations of these parameters. The spider bots (which crawl through the pages) often don't follow such parameterized URLs, so the content remains unindexed.
So how do we remedy that?
Dealing with the actual optimization
There are several approaches that help with the indexing of dynamic websites. One of them is the all-popular, old-time classic: sitemaps. Through the use of regular sitemaps, or XML sitemaps that can be submitted to search engines (Google offers that option, for example), one can give the search engine the exact information it needs to do its business properly. But aside from sitemaps, probably the most important and most frequently used technique is URL rewriting.
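As an illustration, a minimal XML sitemap following the sitemaps.org protocol might look like the fragment below. The domain and paths are placeholders, not part of the original example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/england/house/</loc>
  </url>
  <url>
    <loc>http://www.example.com/france/house/</loc>
  </url>
</urlset>
```

Listing each dynamic page explicitly this way tells the crawler exactly which URLs exist, without it having to discover every parameter combination on its own.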
URL rewriting is handled by special modules of the web server. For Linux-based Apache servers, this is done by mod_rewrite; for Microsoft IIS servers, it can be done by ISAPI_Rewrite. In essence, the technique works like a hash table: a lookup that takes a certain URL and translates it to another, more SEO-friendly one.
Remember the example from our previous step? With a URL rewriting module in place, our former query string:

?country=England&type=house

could instead be presented as a clean path such as:

/england/house/
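As a sketch of how such a mapping might be configured on an Apache server with mod_rewrite (the path scheme and the script name search.php are assumptions for illustration, not part of the original example):

```apache
# Enable the rewriting engine for this directory (.htaccess).
RewriteEngine On

# Map a clean path like /england/house/ to the underlying dynamic URL,
# e.g. /england/house/ -> search.php?country=england&type=house.
# $1 and $2 are the captured path segments; [L] stops after this rule.
RewriteRule ^([a-z-]+)/([a-z-]+)/?$ search.php?country=$1&type=$2 [L]
```

The visitor and the crawler only ever see the clean path; the server performs the translation internally, so the dynamic script would simply treat the captured values case-insensitively when querying the database.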
This is far easier for search engines to crawl and index, and thus lets you present your dynamic site to crawler bots almost as well as a static one.
Much more can be learned by exploring the subject in depth, but even these two techniques alone can improve your results considerably. Keep them in mind when deciding to optimize a dynamic website.