A "search engine friendly website" is one of those often-heard phrases, used both by website development companies and by their clients. Everyone knows it is important to have, and yet it is something that is often overlooked.
Search engine optimisation companies spend a great deal of their time analysing websites and removing barriers that stop the search engines ranking a site highly. At the web development level, it is perfectly possible to build a site that is search engine friendly from the start. One of the hardest types of site to get right, however, is the database-driven website. Listed below are ten of the most common issues that are created, often unknowingly, during the development of a dynamically generated website.
1. Pages with duplicate content - too little of each page is unique, so only small areas change from page to page. It is essential that enough of the page text varies for the search engines to see an appreciable difference between one page and the next.
2. Pages with duplicate page titles - the page title is a strong indicator to the search engines of the primary content of the page. Whilst titles are often unique on sites such as e-commerce websites, they are frequently overlooked elsewhere, particularly where small areas of the site are generated from a database, such as news pages.
3. Pages with duplicate meta descriptions - again, it is easy to overlook this and set a single global or category-level meta description. Duplicate descriptions give the search engines little to distinguish one page from another, and creating a unique meta description for every page is an essential SEO task. A sketch showing one way to generate unique titles and descriptions, covering this point and the previous one, appears after this list.
4. Using auto-generation of pages as a shortcut instead of creating good content. This is closely linked to point 1, where it is possible to create pages with only a tiny percentage of difference between them. Databases are a fantastic way of storing information, but you still need to put the work in to fill them with content. Unique information about the subject of each page will immensely help both long-tail search traffic and the search engines' ability to determine that a page is valuable.
5. Creating pages that are hidden behind form submissions or JavaScript postbacks and so cannot be reached by a search engine crawler. This is far more common than is generally realised. For instance, ASP.NET creates JavaScript postback links by default instead of ordinary hyperlinks, potentially making huge sections of a site unreachable. Likewise, it is easy to hide content-rich areas of your site behind a drop-down selector in a form, meaning certain areas of the site are never seen by crawlers. A sketch of crawlable pagination links appears after this list.
6. Too many query strings - a common bugbear of the professional SEO, where complicated database selections create deep levels of pages with seven or eight &id= style parameters. In addition, poor development practice can leave empty query-string parameters that appear in every URL but do nothing. The answer is generally URL rewriting, creating much more search engine friendly and user-friendly URLs; a sketch of this also appears after the list.
7. Putting query-string parameters in a different order depending on where a page is linked from - the same page then appears under several URLs, which can create duplicate content issues and, in turn, lost rankings. Normalising the parameter order wherever links are generated, as sketched below, avoids this.
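To illustrate points 2 and 3, here is a minimal sketch of one way to build a unique page title and meta description for every database-driven page. It assumes a small Flask application and an in-memory stand-in for a product table; the route, field names and site name are all illustrative, not taken from any particular site.

```python
from flask import Flask

app = Flask(__name__)

# Stand-in for a real database table (illustrative data only).
PRODUCTS = {
    "blue-widget": {"name": "Blue Widget", "category": "Widgets",
                    "summary": "A durable blue widget for everyday use."},
    "red-gadget": {"name": "Red Gadget", "category": "Gadgets",
                   "summary": "A compact red gadget with a long battery life."},
}

@app.route("/products/<slug>")
def product_page(slug):
    product = PRODUCTS[slug]
    # Build the title and meta description from the record itself, so that
    # every page carries unique values (points 2 and 3 above).
    title = f"{product['name']} | {product['category']} | Example Store"
    description = f"{product['name']}: {product['summary']}"[:155]
    return (
        f"<html><head><title>{title}</title>"
        f'<meta name="description" content="{description}">'
        f"</head><body><h1>{product['name']}</h1>"
        f"<p>{product['summary']}</p></body></html>"
    )
```

Because the title and description are derived from each record, adding a row to the database automatically produces a page with its own unique title and description.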
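For point 5, the fix is to render navigation as ordinary hyperlinks that a crawler can follow, rather than JavaScript postbacks or form submissions. This is a language-neutral sketch in Python of the idea; the /articles/page/ URL pattern is a hypothetical example.

```python
def pagination_links(current_page: int, total_pages: int) -> str:
    """Emit real <a href> links that a crawler can follow."""
    links = []
    for page in range(1, total_pages + 1):
        if page == current_page:
            links.append(f"<strong>{page}</strong>")
        else:
            # A crawler cannot trigger javascript:__doPostBack(...), but it
            # can follow an ordinary hyperlink like this one.
            links.append(f'<a href="/articles/page/{page}">{page}</a>')
    return " ".join(links)

print(pagination_links(2, 5))
```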
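For point 6, a common approach is a rewritten, readable URL scheme with a permanent redirect from the old query-string form, so existing links and indexed pages are not lost. This sketch again assumes Flask; the parameter names (cat, col, size) are hypothetical.

```python
from flask import Flask, redirect, request, url_for

app = Flask(__name__)

# New, search engine friendly URL: /widgets/blue/small
@app.route("/<category>/<colour>/<size>")
def listing(category, colour, size):
    return f"Products in {category}, colour {colour}, size {size}"

# An old URL such as /products?cat=widgets&col=blue&size=small is
# permanently redirected (301) to its clean equivalent.
@app.route("/products")
def legacy_listing():
    return redirect(url_for("listing",
                            category=request.args.get("cat", "all"),
                            colour=request.args.get("col", "any"),
                            size=request.args.get("size", "any")), code=301)
```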
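Finally, for point 7, the duplicate URLs can be avoided by normalising the query-string order wherever links are generated. A self-contained sketch using only the Python standard library:

```python
from urllib.parse import urlencode, urlsplit, parse_qsl, urlunsplit

def canonicalise(url: str) -> str:
    """Return the URL with its query parameters sorted alphabetically."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

# Two links that reach the same page through different routes...
a = canonicalise("http://example.com/list?size=small&cat=widgets")
b = canonicalise("http://example.com/list?cat=widgets&size=small")
assert a == b  # ...now share one canonical URL, removing the duplication.
```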