Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" or unpaid ("organic" or "algorithmic") search results, as opposed to search engine marketing (SEM), which deals with paid inclusion. Typically, the earlier (or higher) a site appears in the search results list, the more visitors it will receive from the search engine.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.
Here is an overview of a few basics to consider for a good search engine optimization strategy:
Proper Title Tags
Well-constructed title tags contain the main keyword for the page, followed by a brief description of the page content. They should be less than 65 characters long and avoid stop words such as: a, if, the, then, and, an, to, etc. Your title tag should also be limited to alphanumeric characters, hyphens, and commas.
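A title tag following these guidelines might look like this (the keyword and wording are placeholders for your own page):

```html
<head>
  <!-- Main keyword first, then a brief description; under 65 characters -->
  <title>Search Engine Optimization - Basic SEO Strategy Tips</title>
</head>
```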
Proper Description Tags
Good description tags contain information about the page’s content and persuade search engine users to visit your web site. They should be between 25 and 35 words in length.
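As an illustration, a description tag in the 25 to 35 word range might look like this (the wording is an example, not a template to copy):

```html
<!-- Roughly 25-35 words describing the page and inviting the click -->
<meta name="description" content="Learn the basics of search engine optimization: how to write title tags, description tags, keyword tags, and heading tags that help your pages rank higher in organic search results.">
```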
Proper Keywords Tags
Your keywords meta tag should contain between 5 and 10 keywords or keyword phrases that also appear in the page content.
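For example, a keywords meta tag with seven phrases drawn from the page's content could look like this (the phrases shown are placeholders):

```html
<!-- 5-10 keywords or phrases that also appear in the page content -->
<meta name="keywords" content="search engine optimization, SEO, title tags, meta tags, heading tags, organic search, keyword optimization">
```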
Proper Heading Tags
Each page of your site should use at least an H1 heading tag so that search engines can identify the page's main topic when crawling your site.
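In practice, that means each page opens with a single H1 naming its topic, followed by the content, for example:

```html
<h1>Search Engine Optimization Basics</h1>
<p>Page content that expands on the topic named in the heading,
   using the same keywords specified for the page...</p>
```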
Proper Page Content
Pages should have between 300 and 700 words of descriptive content that contains the keywords specified for the page.
Proper Internal Linking
Each page of your site should contain links to every other page so search engine spiders can find every page. This is a critical step for the proper indexing and page rank distribution of your site.
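A simple way to do this is a footer or sidebar navigation block repeated on every page; the file names below are hypothetical examples:

```html
<!-- Site-wide navigation that links every page, repeated on each page -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/seo-basics.html">SEO Basics</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```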
Proper Site Maps
It's important to use two site maps for your website: an XML version and a static version. The XML version can be created with Search Engine Visibility's site map tool. The static version should sit on a static HTML page and contain links to every other page.
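An XML site map follows the standard sitemaps.org format; a minimal one listing two pages looks like this (example.com and the page paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/seo-basics.html</loc>
  </url>
</urlset>
```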
Proper Robots.txt File
Your site should include a robots.txt file that guides spiders to the pages and directories you want crawled and denies entry to protected areas of your site.
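A basic robots.txt sits at the root of your domain and might look like this; the blocked directories and sitemap URL here are hypothetical:

```
# Applies to all crawlers
User-agent: *
# Keep spiders out of protected areas
Disallow: /admin/
Disallow: /private/
# Point crawlers at the XML site map
Sitemap: http://www.example.com/sitemap.xml
```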
Unique Page Content
Because search engines treat web sites as a grouping of pages and not a single entity, each page on your site should be unique, with tags and content that differ from every other page. Doing so increases the number of pages that can rank. Each page's meta tag keywords should also be those that occur most frequently in its content.
There is no hard and fast rule for search engine optimization. The above strategy has yielded good results for me to date, and I believe it will do the same for whoever adopts it. If I have missed anything in this post, or if you have search engine optimization tips of your own, I would be glad to see them in the comments.