S.E.O :: Search engine optimization · 12 September 2005, 08:10 by Admin
Search engine optimization (SEO) is a set of methodologies aimed at improving the ranking of a website in search engine listings. The term also refers to an industry of consultants that carry out optimization projects on behalf of client sites.
To obtain maximum search engine visibility, a website must be tailored so that its target audience can find it through internet searches. If a site is to be found, it must contain keyword phrases that match the phrases the target audience is typing into search queries. Search engine spiders gather each page’s content, and the engine’s ranking algorithm then judges how relevant that page is to a given keyword. Search engine optimization is the process of configuring a website to be more visible to its target audience.
SEO began in the mid-1990s, as the first search engines were cataloging the early Web. Many site owners quickly learned to appreciate the value of a new listing in a search engine, as they observed sharp spikes in traffic to their sites.
Site owners soon began submitting their sites’ URLs to the engines on a regular basis, and began modifying their sites to accommodate the needs of search engine spiders, the software programs sent out to explore the Web. Special markup such as meta tags became a common feature of sites that sought high-ranking listings in search engine result pages (the so-called “SERPs”).
Consultant firms arose to serve the needs of these site owners, and attempted to develop an understanding of the search engines’ internal logic, or algorithms. The goal was to develop a set of practices for copywriting, site coding, and submissions that would ensure maximum exposure for a website.
As the industry developed, search engines quickly became wary of unscrupulous SEO firms that attempted to generate traffic for their customers at any cost, the most common problem being the decreasing relevance of search results. See the unethical methods described below.
The search engines responded with a continuous series of countermeasures, designed to filter out the “noise” generated by these artificial techniques. In turn, several SEO companies developed ever-more-subtle techniques to influence rankings.
In the early 2000s, search engines and SEO firms attempted to establish an unofficial ‘truce’. There are several tiers of SEO firms, and the most reputable companies employ content-based optimizations which meet with the search engines’ (reluctant) approval. These techniques include improvements to site navigation and copywriting, designed to make websites more intelligible to search engine algorithms.
Search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences and seminars. In fact, with the advent of paid inclusion, search engines now have a vested interest in the health of the optimization community.
Ethical methods require more effort and time than unethical methods. Search engines attempt to provide relevant results, but relevancy depends on the user’s perception. Even so, ethical methods are the most likely to produce quality, long-term results, as search engines consistently combat spammers. Arguably, the most ethical method is to have worthwhile content to which many other web sites will voluntarily link. Another important ethical method is to improve the quality of coding by eliminating validation errors and unnecessary complexity that may interfere with search engine indexing.
Notify search engines
The first step is to notify the search engines of the site’s existence. This can be accomplished by various means, the easiest of which is a link from a different site. Search engine spiders will follow that link and note the site’s existence in the search engine’s index. This method is also preferred because it results in an inbound link. Another option is to submit the site’s URL manually to search engines and to directories such as DMOZ. Thirdly, the major search engines offer paid business directories (paid inclusion) that result in more timely inclusion.
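The discovery step above can be illustrated with a short sketch: once a spider has fetched a page, it extracts every hyperlink so the linked sites can be noted in the index. This uses only Python’s standard library; the sample page markup is invented for illustration.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, as a spider would when
    discovering new URLs to add to its index."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page containing one inbound link to a newly launched site.
page = '<p>Check out <a href="http://example.org/">this new site</a>.</p>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # → ['http://example.org/']
```

A real spider would then fetch each discovered URL in turn; this sketch only shows the extraction step that makes a single inbound link enough to get a site noticed.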
A site may contain many commonly searched keywords, but these will not show up on the first pages of a search if the site is not deemed ‘important’. For example, if a new webpage contains the word ‘Wikipedia’ and ‘Wikipedia’ is subsequently typed into Google, the chances of the newly created site appearing even in the first million results are slim.
The content of each webpage is evaluated by analyzing its keywords. While earlier analysis relied on the now-outdated keyword density method, analysis now also weighs factors such as keyword proximity, distribution, and occurrence, and whether the page stays on topic.
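As a rough illustration, the now-outdated keyword density measure mentioned above can be computed in a few lines. The sample text is invented, and real engines weigh far more than this single ratio.

```python
import re

def keyword_density(text, keyword):
    """Fraction of the page's words accounted for by the keyword.
    Engines have long since moved past this simple measure, but it
    shows the kind of quantity early keyword analyses computed."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

sample = "SEO tips: good SEO starts with content, not with SEO tricks."
print(round(keyword_density(sample, "seo"), 2))  # → 0.27
```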
Methods commonly used to optimize content include:
* Choosing a domain name that accurately represents the content
* Including keywords in the site’s directory and file names
* Using concise page titles that include keywords
* Writing META descriptions that summarize each page’s content
* Emphasizing each page’s main topic with H1 heading tags, and sub-topics with H2, H3, and lesser heading tags
* Centering each web page on a small number of keywords
* Providing alternative methods for viewing content that is not spidered well (such as Flash and frames)
* Including keywords in the alt attribute of graphics
* Emphasizing quality and originality (copied or duplicated content is penalized by most search engines)
* Including fresh and frequently updated content
* Adding a “site map” page to guarantee effective spidering of the entire site
* Avoiding excessive numbers of links on any page. Google recommends no more than 100.
* Organizing site navigation so that pages are only a few clicks away from the home page
* Dividing large pages into smaller, more easily focused pages
* Increasing the amount of content on each page to increase the chance of keyword phrase matches.
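Several of the on-page items above can be checked mechanically. The following sketch, using only Python’s standard library and an invented sample page, parses a page and reports whether it has a title, a META description, and an H1 heading, and counts images lacking a non-empty alt attribute:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Records the on-page elements the checklist above cares about."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.has_meta_description = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not a.get("alt"):
            self.images_missing_alt += 1

page = """<html><head><title>Widget Guide</title>
<meta name="description" content="A short guide to widgets."></head>
<body><h1>Widgets</h1><img src="w.png" alt="a widget"></body></html>"""
audit = OnPageAudit()
audit.feed(page)
print(audit.has_title, audit.has_meta_description,
      audit.h1_count, audit.images_missing_alt)  # → True True 1 0
```

A checker like this only verifies that the elements exist; whether the title and description actually contain well-chosen keywords is still an editorial judgment.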
Link popularity is an important factor for high importance rankings. Google introduced the concept of PageRank as an indicator of an individual page’s value based on the quantity and quality of links pointing to it. Inbound links are weighted by the popularity of the linking site. For example, a link from en.wikipedia.org would be a greater contributor to a page’s rank than a link from an unpopular blog or unknown site.
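The idea can be sketched with the classic iterative PageRank computation (a simplified version of the published formula; the four-page link graph below is invented). Each page repeatedly distributes its rank across its outbound links, so pages with many or well-ranked inbound links end up ranked higher:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Returns an approximate rank per page after iterating the
    simplified PageRank update."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum of rank flowing in from every page that links to p.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inbound
        rank = new
    return rank

# Toy graph: B, C, and D all link to A; A links back to B.
links = {"A": ["B"], "B": ["A"], "C": ["A"], "D": ["A"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # → A (the most-linked-to page)
```

Note how B, which receives a single link from the highly ranked A, still outranks C and D: the quality of an inbound link matters as well as the quantity, which is why the wikipedia.org example above carries so much weight.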
Blogging can be a cheap but effective method of obtaining links. However, avoid spamming techniques such as link dropping: forum moderators and blog owners find them annoying and will remove them quickly. Blog-tracking services such as Blogpulse reflect what people are writing about in their blogs and are a good starting point for search engine optimization-oriented blogging.
Controversy can also bring in a lot of inbound links as both fans and haters link to the controversial site.
Conventional off-line advertising, such as flyers, T-shirts, hats, stickers, billboards, etc., can also be used to attract visitors to the site which will result in more inbound links. This method, however, is not as easy to track as online methods.
Paid inclusion is a fee-based model for submitting website listings to search engines. The fee structure serves search engines both as a filter against superfluous submissions and as a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be re-cataloged on a regular basis. A per-click fee may also apply. Each search engine is different. Some engines list only paid submissions, although these have had little success. More commonly, search engines like Yahoo mix paid inclusion (per-page and per-click fees) with results from web crawling. Others, like Google, do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
The line between pay-per-click advertising and paid inclusion is often debatable. Some have lobbied for any paid listing to be labeled as an advertisement, while defenders insist the listings are not actually ads, since webmasters do not control their content, their ranking, or even whether they are shown to any users.
SEOs often use paid inclusion because it lets them get pages into the web index quickly, test different approaches to improving ranking, and see the results, often within a couple of days instead of weeks or months. Knowledge gained this way is then sometimes used to optimize other web pages without payment.
Keyword spamming (or keyword stuffing) involves the insertion of hidden, random text at the bottom of a webpage. The inserted text usually includes words that are frequently searched (such as “sex”), with the goal of increasing rankings and access to large streams of traffic.
Spamdexing is the promotion of irrelevant, chiefly commercial, pages through abuse of the search algorithms. Many search engine administrators consider any form of search engine optimization used to improve a website’s page rank as spamdexing. However, over time a widespread consensus has developed in the industry as to what are and are not acceptable means of boosting one’s search engine placement and resultant traffic.
Cloaking refers to any of several means to serve up a different page to the search-engine spider than will be seen by human users. It can be an illegitimate attempt to mislead search engines regarding the content on a particular web site. It should be noted, however, that cloaking can also be used to ethically increase accessibility of a site to users with disabilities, or to provide human users with more or less equivalent content that a search engine would not be able to process or parse. A good benchmark on whether a given act of cloaking is ethical is precisely whether it enhances accessibility.
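One crude way to spot cloaking is to compare the content a server returns for a spider’s User-Agent against what it returns for a browser’s. The sketch below performs only the comparison step, on invented response bodies; in practice the two bodies would come from real HTTP requests sent with different User-Agent headers. Markup and whitespace are stripped first, so purely cosmetic differences do not trigger a false alarm.

```python
import hashlib
import re

def normalized_fingerprint(html):
    """Hash of the page text with tags and whitespace stripped."""
    text = re.sub(r"<[^>]+>", " ", html)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode()).hexdigest()

def looks_cloaked(body_for_spider, body_for_browser):
    """True if the two responses differ in substance, not just markup."""
    return (normalized_fingerprint(body_for_spider)
            != normalized_fingerprint(body_for_browser))

# Invented responses: the spider is fed keyword-stuffed copy.
spider_body = "<html><body>cheap widgets cheap widgets cheap widgets</body></html>"
browser_body = "<html><body>Welcome to our widget shop!</body></html>"
print(looks_cloaked(spider_body, browser_body))  # → True
```

By the accessibility benchmark above, a difference flagged here is not automatically unethical; the test only shows that the two audiences are being served different content.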
Link spam is the placing or solicitation of links randomly on other sites, placing a desired keyword into the hyperlinked text of the inbound link. Commonly called “Googlebombing”, it can be a prank (type “miserable failure” into Google to demonstrate), or a deliberate attempt to influence ranking for commercial gain.
The following techniques are also widely acknowledged as being spam, or “black hat”:
* Mirror sites
* Doorway pages
* Link farms
See also
* Nigritude ultramarine
* Seraphim proudleduck
* Google consultant
* Search engine