Driving web traffic to a site
Search Engine Optimization (SEO) is a systematic process of using search engines to draw or increase traffic to a website. SEO involves attaining higher rankings within search engines and directories by making changes to site code and content.
More than 2.1 billion documents compete for a user's time
A recent U.S. study found that the World Wide Web contains more than 2.1 billion documents and is growing at the rate of 7 million pages per day (source: http://www.cyveillance.com).
Most users begin with a Search Engine
Major search engines attract more distinct visitors than almost any other website. Yahoo! alone boasts 65 million registered users! Additional research shows that 81% of Internet users rely on search engines and directories to find the information they need (source: Forrester Research Inc.). A website's search engine ranking determines its share of this traffic.
Search engine users are more likely to buy
Users finding a site via search engines are more qualified targets for its products and services because they have actively sought out the site by typing a search phrase directly related to the site content. Being listed by a search engine is not enough to drive traffic. Actual search engine positioning is crucial to site success. If a site is not listed within the first 20 or 30 search engine rankings, it will lose traffic, no matter how many engines list the site.
Achieving a high ranking takes a combination of positioning techniques, patience and quality site content. This paper is intended to provide an overview of the types of search engines and their search processes, in order to achieve listings and optimal rankings.
The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in different ways and it's important to distinguish between them and their data-gathering techniques.
Crawler-Based Search Engines
Crawler-based search engines, such as HotBot, create their listings automatically. They routinely "crawl" or "spider" the web, and users then search through what the crawlers have found. This means that, unlike with directories, a site is likely to have several if not many pages listed with them.
When web pages are correctly structured, crawler-based search engines find the pages and determine how the site is listed. Page titles, body copy and other elements are all part of the criteria.
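To make this concrete, here is a minimal sketch (in Python, using only the standard library) of how a crawler-style parser might pull out the elements named above: the page title and the opening body copy. The sample page and its wording are hypothetical, and real crawlers extract far more.

    # A sketch, not any engine's actual code: extract the title and
    # visible body text that ranking criteria are applied to.
    from html.parser import HTMLParser

    class PageExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.body_text = []

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data
            elif data.strip():
                self.body_text.append(data.strip())

    page = """<html><head><title>Hand-Made Oak Furniture</title></head>
    <body><h1>Hand-Made Oak Furniture</h1>
    <p>We build custom oak tables and chairs to order.</p></body></html>"""

    extractor = PageExtractor()
    extractor.feed(page)
    print("Title:", extractor.title)
    print("Opening copy:", " ".join(extractor.body_text))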
Human-Powered Directories
A human-powered directory, such as Yahoo, depends on humans for its listings. A short description of the entire site is submitted to the directory, or editors write a description for sites they review. A search looks for matches only in the descriptions submitted.
Therefore, criteria that may be useful for improving a listing with a search engine could have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.
However, sites that are listed with directories are more likely to be found by crawler-based search engines and thus be added to their listings for free.
"Hybrid Search Engines" Or Mixed Results
In the web's early days, a search engine presented either crawler-based results or human-powered listings. Today, it is extremely common for both types of results to be presented. Usually, a hybrid search engine will favor one type of listing over the other. For example, Yahoo is more likely to present human-powered listings. However, it does also present crawler-based results (as provided by Google), especially for more obscure queries.
Components of a Crawler-Based Search Engine
Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes.
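The visit-read-follow cycle just described can be sketched as a toy spider. The version below makes simplifying assumptions: it stays on one host, stops after a fixed number of pages, and omits the politeness delays, robots.txt checks and revisit scheduling a real crawler needs. The start URL is a placeholder.

    # A toy spider: visit a page, read it, follow links within the site.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def spider(start_url, max_pages=10):
        host = urlparse(start_url).netloc
        to_visit, seen = [start_url], set()
        while to_visit and len(seen) < max_pages:
            url = to_visit.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)
                if urlparse(absolute).netloc == host:  # stay within the site
                    to_visit.append(absolute)
        return seen

    # spider("http://www.example.com/")  # returns the set of pages visited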
Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with new information.
Sometimes it can take a while for new pages or changes that the spider has found to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine.
Some search engines index more web pages than others, and some index web pages more often than others. The result is that no two search engines have exactly the same collection of web pages to search through, which naturally produces differences when comparing search results.
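As a rough illustration of the "giant book," here is a toy inverted index: a mapping from each word to the set of pages that contain it. The page URLs and text are invented, and a real index records much more (word positions, formatting, link data).

    # A sketch of the index: word -> set of pages containing that word.
    from collections import defaultdict

    index = defaultdict(set)

    def add_to_index(url, text):
        # Until this runs, a page is "spidered" but not yet "indexed,"
        # and cannot be found by searchers.
        for word in text.lower().split():
            index[word].add(url)

    add_to_index("http://example.com/tables", "custom oak tables built to order")
    add_to_index("http://example.com/chairs", "oak chairs and benches")

    print(index["oak"])     # both pages
    print(index["tables"])  # only the first page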
The third element is the search engine software: the program that sifts through the pages recorded in the index to find matches for a search and rank them by relevance. Search engines check for search keywords that appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the beginning.
Search engines will analyze how often keywords appear in relation to other words in a web page. Pages where the keywords appear with higher frequency are often deemed more relevant than other web pages.
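A toy scoring function can illustrate this location/frequency idea. The cutoff for "near the top" and the bonus weight below are invented for the sketch; each real engine uses its own formula.

    # Location/frequency sketch: keyword frequency relative to total
    # words, with a bonus when the keyword appears near the top.
    def location_frequency_score(words, keyword, top_n=25, top_bonus=2.0):
        words = [w.lower() for w in words]
        if not words:
            return 0.0
        frequency = words.count(keyword) / len(words)
        near_top = keyword in words[:top_n]  # headline / first paragraphs
        return frequency * (top_bonus if near_top else 1.0)

    page = "oak furniture hand made oak tables and chairs built to order".split()
    print(location_frequency_score(page, "oak"))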
Changing page titles and adding meta tags is not necessarily going to help a page do well for target keywords if the content on the page has nothing to do with the topic. The keywords need to be reflected in the page's content.
Not all search engines read meta tags, and those that do weight them differently.
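That differential weighting might be pictured as follows, with invented engine names and weights; an engine that ignores meta tags simply contributes a weight of zero.

    # Per-engine meta-tag weights (hypothetical values for the sketch).
    META_WEIGHTS = {"engine_a": 0.5, "engine_b": 0.1, "engine_c": 0.0}

    def ranked_score(body_score, meta_score, engine):
        # Combine on-page relevance with a per-engine meta-tag contribution.
        return body_score + META_WEIGHTS.get(engine, 0.0) * meta_score

    for engine in META_WEIGHTS:
        print(engine, ranked_score(body_score=1.0, meta_score=0.8, engine=engine))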
Search engines may also penalize pages or exclude them from the index, if they detect search engine "spamming." An example is when a word is repeated hundreds of times on a page, to increase the frequency in an attempt to propel the page higher in the listings. Search engines watch for common spamming methods in a variety of ways, including following up on complaints from their users.
Crawler-based search engines have plenty of experience now with webmasters who constantly rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters may even go to great lengths to "reverse engineer" the location/frequency systems used by a particular search engine. Because of this, all major search engines now also make use of "off the page" ranking criteria.
Link Analysis. By analyzing how pages link to each other, a search engine can determine what a page is about and whether that page is deemed to be "important" and thus deserving of a ranking boost. Link analysis is about "popularity," not volume. A website should be linked with quality web pages covering related topics.
Search engines use sophisticated techniques to screen out attempts by webmasters to build "artificial" links created solely to boost their rankings.
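The best-known form of link analysis is Google's PageRank. The sketch below is a heavily simplified PageRank-style iteration over an invented four-page link graph: each page passes a share of its importance to the pages it links to, so links from "important" pages count for more.

    # Toy link-popularity scores via power iteration (simplified PageRank).
    def link_scores(links, iterations=20, damping=0.85):
        pages = list(links)
        score = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new = {p: (1 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                share = score[page] / max(len(outgoing), 1)
                for target in outgoing:
                    new[target] += damping * share
            score = new
        return score

    graph = {"a": ["b"], "b": ["c"], "c": ["a", "b"], "d": ["c"]}
    for page, s in sorted(link_scores(graph).items(), key=lambda x: -x[1]):
        print(page, round(s, 3))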
Clickthrough Measurement. A search engine may watch the results a user selects for a particular search. Based on what sites are being selected from the list, the search engine may drop high-ranking pages that aren't attracting clicks, while promoting lower-ranking pages that do pull in visitors. As with link analysis, systems are used to compensate for artificial links generated by eager webmasters.
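A clickthrough adjustment might be sketched as blending a page's base rank score with its observed click rate, so a high-ranking page that draws no clicks drifts down. The figures and blend weight below are invented.

    # Re-rank results by blending base scores with observed click rates.
    def rerank(results, click_rate, blend=0.5):
        # results: {url: base_score}; click_rate: {url: clicks/impressions}
        adjusted = {url: (1 - blend) * s + blend * click_rate.get(url, 0.0)
                    for url, s in results.items()}
        return sorted(adjusted, key=adjusted.get, reverse=True)

    results = {"/a": 0.9, "/b": 0.7, "/c": 0.5}
    clicks = {"/a": 0.01, "/b": 0.40, "/c": 0.25}  # /a ranks high but is ignored
    print(rerank(results, clicks))                 # /b overtakes /a on clicks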
Sometimes 1,000 words are worth more than a picture. HTML text should appear on each page. Sites that present large sections of copy via graphics may be visually appealing, but search engines can't read those graphics. That means they miss out on text that might make the site more relevant. Some of the search engines will index ALT text and comment information, along with meta tags.
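Continuing the earlier parser sketch, a collector for exactly that extra text (ALT attributes and HTML comments) might look like this; the markup fed to it is hypothetical.

    # Collect the ALT text and comments that some engines index.
    from html.parser import HTMLParser

    class AltTextCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.extra_text = []

        def handle_startendtag(self, tag, attrs):
            self.handle_starttag(tag, attrs)

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                for name, value in attrs:
                    if name == "alt" and value:
                        self.extra_text.append(value)

        def handle_comment(self, data):
            self.extra_text.append(data.strip())

    collector = AltTextCollector()
    collector.feed('<img src="logo.gif" alt="Hand-made oak furniture">'
                   '<!-- custom oak tables and chairs -->')
    print(collector.extra_text)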
Submitting a Listing to a Directory
Prior to attempting to submit a site, a written description (of 25 words or less) of the entire website should be developed. That description should make use of the two or three key terms under which the site should be found.
The target keywords should always be phrases of two or more words. Usually, too many sites will be relevant for a single word, and the odds of success are lower. Phrases of two or more words face less competition and achieve higher rankings within a search engine.
Important: In order to achieve high rankings, follow the search engine rules. Keep the content useful, improve the link popularity and monitor the search engine positioning for improvement opportunities.