SEO Tips for Dynamic Websites

The Many Pitfalls of Dynamic Sites and Optimizing for Them

One might wonder, with all those dynamic websites and the sheer impracticality of static ones, why this issue still hasn’t been solved. The truth is that many search engines have already tried to improve their support for dynamic websites, but the problem is hard to tackle. In the meantime, those who find a way to optimize their dynamic site have an advantage over those who don’t. In practice, this means that an optimized dynamic website can make you more profitable in the end, and is worth the effort.

Now that we know the benefits of having an optimized website, it’s time to consider how that is done. This requires explaining a few things first, so here comes the theory.

What exactly is the problem with dynamic websites?

What defines a website as dynamic is the fact that each page is created or “synthesized” on the fly, instead of being written once and merely reproduced many times. The user makes a request, and based on that request, the site generates results. The easiest way to explain this is with an example.

Imagine we have a real estate website that deals with property in England and France. Say we want only houses in England. We enter that on the site, and it produces a list of what we were looking for, on the spot, by searching its database, compiling the list, and serving the final page for us to see. Technically speaking, this information is passed through the site’s URL, as you’ve probably seen. In our case it would probably look like: ?country=England&type=house
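To make this concrete, here is a minimal PHP sketch of how such a results page might be put together on the fly. The parameter names match the example URL above; the database name, table and column names (realestate, properties, title, price) are hypothetical and only for illustration.

<?php
// Read the query-string parameters from the URL (?country=England&type=house).
$country = isset($_GET['country']) ? $_GET['country'] : '';
$type    = isset($_GET['type'])    ? $_GET['type']    : '';

// Look up matching properties in a hypothetical "properties" table.
$pdo  = new PDO('mysql:host=localhost;dbname=realestate', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT title, price FROM properties WHERE country = ? AND type = ?');
$stmt->execute(array($country, $type));

// Build the page on the spot; it never exists as a static file on disk.
echo '<ul>';
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo '<li>' . htmlspecialchars($row['title']) . ': ' . htmlspecialchars($row['price']) . '</li>';
}
echo '</ul>';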

However, search engines such as Google have difficulty dealing with all the possible combinations of these parameters, and their spider bots (which crawl through the pages) often don’t follow such parameterized URLs, so the content behind them remains unindexed.

So how do we remedy that?

Dealing with the actual optimization

There are several approaches that help with the indexing of dynamic websites. One of them is the all-popular, old-time classic: sitemaps. Through regular sitemaps, or XML sitemaps that can be submitted to search engines (Google offers that option, for example), one can give the search engine exactly the information it needs to do its job properly. But aside from sitemaps, probably the most important and most frequently used technique is URL rewriting.
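As an illustration, here is a minimal XML sitemap in the standard sitemaps.org format; the example.com addresses, change frequencies and priorities are hypothetical. Note that an ampersand in a query string has to be escaped as &amp; inside the XML.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/?country=England&amp;type=house</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>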

URL rewriting is handled by special modules of the web server. On Linux-based Apache servers this is done by mod_rewrite, and on Microsoft servers it can be done by ISAPI Rewrite. What the technique basically does is maintain a mapping, much like a hash table, between one form of a URL and another: the site can expose clean, SEO-friendly addresses while the server translates them back into the parameterized URLs its scripts actually understand.

Remember the example from before? If we use a URL rewriting module, our former line:

?country=England&type=house

will turn into:

/countries/England/type/house

This is far easier for search engines to crawl and index, and it lets you present your dynamic site to crawler bots almost as well as a static one.
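For completeness, here is a minimal .htaccess sketch of how mod_rewrite could be configured to do this, assuming an Apache server with mod_rewrite enabled and a hypothetical front controller called index.php that handles the original query-string parameters.

# Enable the rewriting engine for this directory.
RewriteEngine On

# Map /countries/England/type/house to index.php?country=England&type=house.
# $1 and $2 capture the country and the property type from the clean URL.
RewriteRule ^countries/([^/]+)/type/([^/]+)/?$ index.php?country=$1&type=$2 [L,QSA]

Visitors and crawler bots only ever see the clean address; the translation happens internally on the server.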

Much more can be learned if one is willing to explore this in depth, but even with just these two techniques, results can improve dramatically. Keep this in mind when deciding to optimize a dynamic website.

 


Comments

  1. Hi Smith,

    I am Mark from Info CheckPoint. I am doing research on how to make dynamic pages search-engine friendly and came across this blog; it is a great article, and it prompted me to ask for some suggestions on solving our site’s issues. I am sure I will get the solution from Smith Grel.

    We have a website with 70 static pages and millions of profile pages, all dynamic, driven by a database but served under static URLs. The problem we are facing is that when we open a dynamic profile page and view the source code in the browser, no content from the actual page is visible, so search engines are not detecting the pages we want them to crawl and index.

    Please give us some valuable suggestions on how to make the content visible in the page source and how to generate sitemaps for all those millions of dynamic pages.

  2. Excellent one. I wanted to know in more depth how exactly the search results pages can be optimized. Suppose we have 10 pages of results for a search phrase; how can we optimize them? Do we need to allow the search engine to crawl all 10 pages, or is the first page enough? Please explain so we can understand this better.
