Joe Donnelly Design
Affordable Freelance Graphic, Logo, Print and Web Design Services

Browse SEO - Search Engine Optimization Glossary


Click popularity
Click popularity measures the relevance of search results by monitoring user behaviour from the search results. If a user clicks on a result and returns to the SERPs within a short period, the site is viewed as less relevant and downgraded in the rankings. Similarly, if click-through rates on the first page of results are low and users have to click through to the second or third page to find relevant results, this is taken into consideration when re-ordering results.

Click popularity algorithms are one of the most effective ways of presenting relevant search results. However, they are vulnerable to manipulation by click-bots which attempt to artificially boost click-through rates.
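As a purely illustrative sketch of the idea (the 50% down-weighting and the `(clicks, quick_returns)` bookkeeping are invented for this example, not any engine's actual formula), a click-popularity adjustment might look like this:

```python
def adjust_by_click_popularity(results, click_stats):
    """Re-order search results using observed click behaviour.

    `results` is a list of (url, base_score) pairs; `click_stats` maps
    url -> (clicks, quick_returns), where a 'quick return' is a user
    bouncing back to the SERP within a short period.
    """
    adjusted = []
    for url, score in results:
        clicks, quick_returns = click_stats.get(url, (0, 0))
        if clicks:
            # A high quick-return rate suggests the page was not relevant,
            # so its score is downgraded (here by up to half).
            bounce_rate = quick_returns / clicks
            score *= 1.0 - 0.5 * bounce_rate
        adjusted.append((url, score))
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)
```

In this sketch a page that every visitor bounces straight back from loses half its base score, letting a slightly lower-scored but 'stickier' page overtake it.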

A pioneer in click popularity was Direct Hit. Elements of the Direct Hit algorithm are still used by Ask Jeeves.

Cloaking (IP delivery)
Cloaking is the method of using a script on the web server to serve highly optimised pages to search engine spiders and different pages to a normal user. This is done in order to present the search engine spiders with key-phrase-rich text that might be beneficial for search engine positioning purposes. It is also used to hide SEO work from the competition. Cloaking is mainly used in very competitive markets. Search engine spiders can be identified by their user-agent strings or by their IP addresses. Cloaking is considered an unacceptable practice by all major search engines because it can mislead their users. Webmasters employing cloaking are engaged in a constant game of cat-and-mouse as they attempt to keep up with the IP addresses employed by the search engines.
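The mechanism is simple user-agent branching, sketched below purely to clarify why search engines object to it (not as a recommendation - the spider-name list is an illustrative assumption):

```python
# Illustrative only: the kind of branching a cloaking script performs.
KNOWN_SPIDER_SUBSTRINGS = ("googlebot", "slurp", "scooter")  # assumed list

def is_search_engine_spider(user_agent):
    """Crude spider check on the User-Agent header. Real cloaking scripts
    also maintain lists of known spider IP addresses."""
    ua = user_agent.lower()
    return any(name in ua for name in KNOWN_SPIDER_SUBSTRINGS)

def page_for(user_agent):
    # Serve the key-phrase-rich page to spiders, the normal page to users.
    if is_search_engine_spider(user_agent):
        return "optimised.html"
    return "normal.html"
```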

Clustering
Clustering is the grouping of results from one domain under one listing in search engine results pages. This prevents multiple pages from one site appearing many times in the results. There is often a 'More results from this site' link to allow these to be viewed.

Not to be confused with a clustering engine.

Clustering engine
A clustering engine is a search engine which automatically groups search results into related themes. This can help focus results for ambiguous terms. For example, a search for 'blues' might present the themes 'music', 'Oxford rowing team' and 'depression'. Choosing a theme will present results only in this area. A well known clustering engine is Vivisimo, and AltaVista uses a clustering engine called Prisma.
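A real clustering engine uses sophisticated text analysis, but the grouping idea can be sketched naively - here each result is labelled by its first snippet word outside the query, an invented heuristic used only for illustration:

```python
from collections import defaultdict

def cluster_results(results, query):
    """Group (title, snippet) search results into crude themes.

    Each result is labelled with the first snippet word that is not part
    of the query; results sharing a label form one cluster.
    """
    stop = set(query.lower().split())
    clusters = defaultdict(list)
    for title, snippet in results:
        words = [w for w in snippet.lower().split() if w not in stop]
        label = words[0] if words else "other"
        clusters[label].append(title)
    return dict(clusters)
```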

Cross-linking
Cross-linking is the practice of over-using links between a network of sites to artificially inflate link popularity and thus rankings. Google has updated its algorithm to identify clusters of sites which are strongly linked together but not well linked to the rest of the web, and applies a penalty to these sites. Many innocent sites have been affected by this cross-linking penalty, as it is common practice to link to a corporate headquarters from every page of a subsidiary site.


Directories
Directories are subject guides, typically organised by major topics and sub-topics. The best-known directory is the one at Yahoo!. The biggest directory is the Open Directory Project, whose categories are edited by members of the public. Many other sites, including major portals, now use a Yahoo-like directory.

Directories use human editors to review a site's subject matter and quality, and for this reason are considered authorities by other search engines which use directory listings to categorise their own indices.






Gateway pages, doorway pages, hallway pages
Gateway Pages are special, highly optimised web pages that are created in order to rank well when a user enters a certain key phrase, or search term, into a search engine. This optimization is achieved by inserting the key phrases at strategic places in the HTML code and text on the page. Gateway pages are not designed to be integrated into the website, but to serve as a "gateway" into it. The use of gateway pages is no longer considered good SEO practice, as these pages are very often machine-generated in order to gain rankings and serve no purpose to the user.

Google dance
The 'Google dance' was a term used by search engine marketers to describe the several days of instability in Google's search results as it rolled its latest database update out across its many servers. It was a white-knuckle ride for SEO companies as they waited to see whether their sites had gone up or down. However, as of March 2003, Google moved to a rolling update, effectively putting an end to the dance.




Indexing
Indexing is the process by which search engines collect information and include it in their database of search results. The process involves extracting the machine-readable text from web pages and storing it in a format that can be efficiently searched. Indexing is carried out by search engine spiders.

Indices
Indices are search engines that automatically crawl the Internet for web page and newsgroup content. Due to their automated nature, indices can contain a huge amount of information, updated every time the search engine spiders new content. The major search engine indices include Google, Yahoo! and Teoma.




Key phrase, keyword
A key phrase is a group of keywords which appear in the content of a site page. In order for a search engine to return a page in its list of results, it is vital that the targeted search terms appear as key phrases in the website copy with the appropriate weighting, so that its algorithm will find the page a suitable match. A good SEO will have experience in ensuring the copy of the page is optimised for the targeted search terms while still providing useful and informative copy for the user.

"Key phrase" is often (incorrectly) used interchangeably with "search term".


Landing page
A landing page is the page within a website that a user arrives at when entering the site. The page a user lands on from a search engine’s sponsored listing can be controlled by the destination URL associated with each separate ad or search term.

The most effective landing pages are 'deep-linked' to the page on the site that is most relevant to the user's search. A deep-linked landing page will therefore make the user's experience as smooth as possible and should improve conversion rates.

Link farm
A link farm is a set of web pages specifically set up to increase the number of links between websites and hence their link popularity.

Recent studies on the link structure of the Internet suggest that natural linking results in 'communities' of websites, all sharing roughly the same theme. Link farms, on the other hand, do not consist of 'naturally' found links, nor do they reflect specific themes. Link farms are therefore considered unnatural by search engines, and as such have been actively targeted in anti-spam policies, resulting in penalties for those websites that are link farm members. Link farms should be avoided.

Link equity
Link equity is an important factor in search engine optimization, and Google's simple but brilliant contribution to web search. Popular or relevant pages are assumed to have more incoming links from other websites, a type of 'vote of confidence' in the website's integrity and usefulness. Websites that are well linked within their sector rank higher than sites that are not.

Additionally, link equity is an important factor in combating search spam as it is difficult to manipulate artificially, and is thus less open to abuse.

Link equity is calculated firstly by counting the number of websites that link to yours. However, more links do not necessarily mean better rankings, as search engines also measure the quality of these incoming links. If your site enjoys links from other well-linked sites then your link equity is further enhanced.

Additionally, some search engines, for example Google, are increasingly attempting to refine their linking calculations by paying attention to the 'theme' of a link. Some new search engines, like Teoma, are taking this a step further and only counting links within certain web 'communities' of similar themed websites.
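The link-counting described above can be sketched as a tiny PageRank-style iteration (the damping factor and iteration count below are conventional textbook choices, not any engine's real parameters):

```python
def link_equity(links, iterations=20, damping=0.85):
    """Tiny PageRank-style calculation. `links` maps each page to the
    pages it links to. Equity flows along links, so a page linked to by
    well-linked pages scores higher than one with only obscure backers."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base amount of equity...
        new = {p: (1 - damping) / len(pages) for p in pages}
        # ...and passes the rest on, split among the pages it links to.
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new[target] += share
        rank = new
    return rank
```

Note how in this model a page linked to by two others ('hub' below) ends up with more equity than the pages that merely give out links.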


Meta search engine
A Meta Search Engine compiles its results from many different search engines and returns them in one combined listing. An up-and-coming Meta search engine is Vivisimo. Other well-known meta search engines are IxQuick and Dogpile.

Many surfers use Meta search engines as a convenient way to collate the best results from many different sources. In fact, many surfers don't even realise they are using Meta search engines. A general trend has developed in the last couple of years, whereby some search engines will combine a number of different sources for their results. For example, Freeserve currently uses Overture and Yahoo! to enhance its listings.

Meta tags
Meta tags are pieces of information, invisible to the surfer, that are coded in the HTML of a page in order to describe the content of a page to a search engine spider or other bot.

It is a common misconception that the use of Meta Tags ensures good search engine positioning. Nowadays the 'Keywords' Meta Tag is a minor part of most search engine algorithms - some, such as Google, do not consider it at all.

Most search engines use the 'Meta Description' HTML tag as a short description of the page when it is presented in a search results page. It is, therefore, important to ensure that this Meta Tag contains a readable, punchy, and interesting description of the page's content - this can give a real boost to click-through rates from listings in search results.
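Using Python's standard html.parser module, extracting the 'Meta Description' a search engine might display could be sketched like this:

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pulls out the 'description' meta tag - the text a search engine
    may show beneath a listing in its results page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attributes = dict(attrs)
            if attributes.get("name", "").lower() == "description":
                self.description = attributes.get("content")

def meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description
```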




Off-page optimization
In addition to on-page factors, search engines are increasingly using off-page factors to calculate relevance. This is because off-page factors are more difficult to manipulate artificially.

The most important off-page factor is link popularity. Others include link text, link community and click popularity. Off-page optimization involves ensuring that these elements are in place to boost relevance for the targeted terms.

On-page optimization
Search engines use several factors in their ranking algorithms, one of which is on-page factors. These are elements which actually appear on the page (such as page title, headings and body text) and contribute to the engine's assessment of the subject matter and relevance of the page. On-page optimization involves ensuring that these factors are optimally included for the targeted search terms.

Organic search listing
A website’s position within the Organic Search results, also referred to as 'Natural Search' results, is determined by an automated calculation, sometimes referred to as the search engine algorithm. This is the process that determines how results are ranked when a user makes a search query. There are a variety of different search engine algorithms; factors that may influence your organic search position include:

  • The way a website's design is structured.
  • The content available to the search engine.
  • The link community or link popularity of the website.
  • The popularity of the website.
  • How long the website has been established.

Organic search results should form the core of a search marketing strategy, complemented by strategic paid search campaigns.


Pay for inclusion (PFI)
Pay For Inclusion is a method of ensuring that individual web pages are reliably and regularly spidered and included in a search engine's database. Search engines are increasingly offering this option as it provides them with a revenue stream.

PFI offers a solution to the problem of sites with dynamic URLs that search engines may not be able to index. And, as the pages are refreshed every 48 hours, it can help ensure content in the indices is up to date.

Paying for inclusion does NOT influence the ranking of a page once it is included in the database. You are ONLY paying for the search engine to spider your page. Examples of PFI services are those offered by search engines such as AltaVista, FAST and Teoma.

Pay per click (PPC)
Pay Per Click search engines offer a 'bid-based' service in which top positions are auctioned for specific keywords. The highest bidder for a chosen keyword normally ranks highest in the search engine results. The price of the bid is charged to the advertiser whenever a user clicks on their entry.
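The bid auction reduces to a simple sketch. This models a pure first-price auction (highest bidder pays their own bid), which is a simplification of how the engines actually price clicks:

```python
def rank_sponsored_listings(bids):
    """`bids` maps advertiser -> bid (in currency units) for one keyword;
    the highest bidder takes the top sponsored position."""
    return sorted(bids, key=bids.get, reverse=True)

def cost_of_click(bids, advertiser):
    # The advertiser is charged their bid each time a user clicks.
    return bids[advertiser]
```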

Positions are separated from the main natural search results and are normally designated as 'sponsored links' or 'sponsored sites'.

PPC advertising is a fast growing search marketing channel due to its highly responsive and measurable nature. As more advertisers enter the market and click costs continue to rise, managing your Paid Search campaign strategically will be crucial to your brand's success in this volatile but rewarding search channel.


Ranking algorithm
The methodology by which search engines calculate positioning results. Ranking algorithms can be influenced by a wide variety of factors including domain name, spiderable content, submission practices, HTML code and link popularity. Search engine ranking algorithms are closely guarded and constantly updated to attempt to filter out those sites which attempt to manipulate the results.
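Conceptually, a ranking algorithm reduces to a weighted combination of factor scores. The factor names and weights below are invented for illustration, since the real ones are closely guarded:

```python
def rank_score(factor_scores, weights):
    """Combine per-factor scores (each normalised to 0-1) using the
    engine's weights into a single ranking score."""
    return sum(weights.get(name, 0.0) * value
               for name, value in factor_scores.items())

def rank_pages(pages, weights):
    """Order pages (name -> factor scores) by their combined score."""
    return sorted(pages, key=lambda p: rank_score(pages[p], weights),
                  reverse=True)
```

Because the weights are secret and regularly changed, sites tuned to one snapshot of the weighting can drop sharply after an update.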


Search engine optimization (SEO)
Search engine optimization is the process of optimising a website or web page to increase its visibility within the search engine results.

Search engine optimization entails making sure that there is content relevant to the targeted key phrases on the web site, and that search engine spiders can find this content easily. Good search engine optimization will ensure that this content is also useful to the user. Without relevant content, SEO techniques can only be partially successful, and will probably stray into the wrong side of search engine Acceptable Use Policies. See also 'ranking algorithms'.

Search engine placement
SEP - Search Engine Placement (see also: Search Engine Positioning, Search Engine Optimization (SEO)) is the art of increasing a site's visibility within the search engine results.

Search engine placement is achieved by a combination of on-page SEO, link building and ensuring the site is included in the correct search engine databases to reach the target market. It may also involve other strategies such as Pay Per Click campaigns or Trusted Feeds. Simply increasing occurrences of key phrases will not, in itself, lead to higher rankings.

Search engine placement techniques are constantly evolving as new search engines emerge to target different markets and new inclusion methods are introduced.

Search engine positioning
Search Engine Positioning, also known as Search Engine Optimization (SEO), is a phrase used to describe the practice of positioning a website within the search engine results.

A multitude of techniques are involved in successful search engine positioning. Not only must a website be optimised, but its link popularity must also be built. Good visibility in directories must be gained and other search engine marketing techniques pursued, such as Pay Per Click campaigns.

Search engine positioning has evolved over the last few years. While initially involving only search engine optimization, the industry has grown to include a whole array of additional techniques.

Search engine submission
Search engine submission is the process by which one makes search engines aware that one's website is ready to be indexed by the search engine spiders. In general, search engines spider the web on a regular basis and will eventually find your website by following a link from a site already within their index. It is sometimes necessary, however, to manually submit a new site which has not yet been linked to, or to use a paid-for-inclusion process to ensure quick inclusion in the database.

The search engine submission process involves going to a specified section of the search engine web site (the "Add URL" page) and inputting details which can include those web pages that one requires to be spidered. The term 'submission' also covers the process of requesting a listing in the directories.

Search term
A search term is the word or phrase entered by a user into a search engine in order to perform a search. The search engine or directory then uses its algorithm to search its database of pages or sites to find a matching key phrase and return a list of results. Users may enter general search terms, such as "insurance", or they may enter more focused terms, such as "uk insurance brokers".

A properly focused search term set forms the core of a good search engine positioning strategy and it is important to ensure that these are reflected in the actual content on the website. A search engine promotion that targets popular but relevant search terms has the advantage of driving targeted traffic, which can result in high conversion rates.

Gaining knowledge of the general trends and habits of searchers, and having experience of the complexities of search term selection, can make the difference between search engine promotion failure or success.

SEO - Search Engine Optimization
SEO is the abbreviation for 'Search Engine Optimization'. SEO is normally used to describe the process of manipulating a website's pages in order for them to rank higher in search engine indices.

Successful SEO can result in a site featuring prominently in a major search engine such as Google or MSN, delivering a significant number of new visitors. SEO can contribute to the overall success of your website marketing.

The process of SEO can involve changing a web page's content and HTML code so that a search engine 'spider' can find specific information more easily. Additionally, SEO occasionally involves the re-coding of a website's linking architecture.

SEO, combined with a set of targeted 'key phrases' (search terms people are using in search engines), can result in your website gaining high positioning for your most popular products or services. Furthermore, SEO can help attract users focused on your area of expertise, enhancing the user experience and eliminating excess unwanted click-throughs.

SEO is a key element in the online marketing of a website as it can help potential visitors find the information they are searching for before discovering a competitor's site.

SEO can also be known as 'Search Engine Positioning' or SEOR, 'Search Engine Optimization & Registration'.

SERPs
SERPs is an acronym for 'Search Engine Results Pages'. A term adopted by the search engine promotion community, it refers to the list of search results returned from a query. These usually consist of 10-15 results by default. It is important for a search engine optimization campaign to gain listings in the first 3 SERPs, as 65% of click-throughs come from the first SERP.

Not to be confused with State Earnings Related Pension Scheme!

Spamming
Spamming, in general, is an attempt to feed misleading information to search engines in order to gain favourable positioning.

Search engines view spamming seriously, as it compromises the quality of their results. Unfortunately there is no exact definition of what is and is not spam - search engines disagree with each other, and often change their own definitions of spamming several times in one year!

Here are MSN Search's spam guidelines:

"The following items and techniques are not appropriate uses of the MSN Search index. Use of these items and techniques may affect how your site is ranked within MSN Search and may result in the removal of your site from the index.

  • Loading pages with irrelevant words in an attempt to increase a page's keyword density. This includes stuffing ALT tags that users are unlikely to view.
  • Using hidden text or links. You should use only text and links that are visible to users.
  • Using techniques to artificially increase the number of links to your page, such as link farms."


Spiders
Spiders are automatic programs that search engines use to catalogue the web and add pages to their databases. Spiders find web pages either by working through pages deliberately submitted to them, or by following links between web pages. They often have quirky names: for example, Google's spider is called 'Googlebot', Yahoo!'s is 'Slurp', and AltaVista's is 'Scooter'.

Spiders have difficulty following certain HTML code; in fact, they can be thought of as very primitive browsers. Spiders cannot follow JavaScript links or index content in images. Professional search engine optimization should remove these obstacles and use code that allows spiders to roam the website freely.
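The core of a spider - extracting plain HTML links and following them page to page - can be sketched with Python's standard html.parser. To stay self-contained, this sketch crawls an in-memory dict of pages rather than fetching over HTTP, which a real spider would do:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags - the plain HTML links a
    spider can follow (it cannot follow JavaScript links)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(pages, start):
    """Breadth-first crawl over `pages` (url -> html), following links
    the way a spider builds its index."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in pages:
            continue
        seen.add(url)
        extractor = LinkExtractor()
        extractor.feed(pages[url])
        queue.extend(extractor.links)
    return seen
```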

Sponsored listings
Sponsored listings are listings within search engine results pages which have been paid for by advertising sites. These generally have to be marked as such to differentiate them from normal search results after a class-action suit was brought against several major search engines for misleading their users. Sponsored listings are often supplied by pay per click engines such as Overture or Google AdWords.


Text optimization
Text optimization is the process of constructing a website page that will be seen by search engines specifically to promote the relevancy of a certain key phrase. Good text optimization should not utilise random key phrases, but should maximise the value of existing content, such as specific informational topics or product information.

Traffic site
Traffic sites are sites built solely for the purpose of obtaining rankings in search engines. They generally consist of nothing but doorway pages and are designed to snare search engine traffic and pass it on to the main website. Unscrupulous search engine optimisers often use many traffic sites which are extensively cross-linked to manipulate link popularity. This technique is also known as 'domain spamming' and is forbidden by all search engines' terms of use. The use of this technique is a sure way of earning a ban from the index.

Google is growing increasingly adept at identifying clusters of traffic sites which use this technique to hoard link popularity, and removing them from its index.

Trusted feed
A trusted feed is a method of supplying website content from a large dynamically generated web site (for instance an ecommerce site with a large product catalogue) to a search engine database. Many crawling engines have difficulty in indexing content from these types of sites because of their dynamic URLs and frequent updates, and hence offer a trusted feed as a way for companies to include their catalogues into the search database. The site pays for any subsequent referrals by the click.