Absolute Link –
An Absolute Link specifies a transfer protocol, a domain name and most often a file name. An example of an Absolute Link is: http://www.yourdomain.com/page.html
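In HTML, the contrast with a relative link looks like this (the domain and file names are placeholders):

```html
<!-- Absolute link: protocol, domain name, and file name -->
<a href="http://www.yourdomain.com/products.html">Our products</a>

<!-- Relative link, for comparison: resolved against the current page's location -->
<a href="products.html">Our products</a>
```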
Accessibility –
It is the practice of making websites usable for disabled people, especially blind individuals. Because search engines are essentially blind (they can't see pictures or use Flash), accessible websites tend to have better search engine rankings than inaccessible websites.
AdSense –
This is a fast and easy way for website publishers to display relevant Google ads on their website's content pages and earn money. Because the ads are related to the characteristics and interests of the site's visitors, it is a helpful way to both monetize and enhance content pages. It's also a way for website publishers to provide Google web and site search to their visitors, and to earn money by displaying Google ads on the search results pages.
AdWords –
This is Google's CPC (Cost Per Click) based text advertising. AdWords takes clickthrough rate into consideration, in addition to the advertiser's bid, to determine an ad's relative position within the paid search results. Google applies such a weighting factor in order to feature those paid search results that are more popular and thus presumably more relevant and useful. Google has also started taking into account the quality of the landing page, applying a quality score to landing pages.
Agent Name –
This is the name of the crawler/spider that is currently visiting a page. A spider is a robot sent out by search engines to catalogue websites on the Internet. When a spider indexes a particular website, this is known as "being spidered".
Algorithm –
It is an operational programming rule that determines how a search engine indexes content and displays the results to its users.
AlltheWeb –
Currently, it is a search engine owned by Yahoo and using its database.
Alt Attribute –
The ALT Attribute is designed to be an alternative text description and provide a text equivalent for images.
Alt Tags –
An alternate text associated with a web page graphic that gets displayed when the Internet user hovers the mouse over the graphic. It should convey what the graphic is all about and contain good relevant keywords. Alt Tags also make web pages more accessible to the disabled. For example, a blind user may have a web browser that reads aloud the text and alt tags on a page. For those familiar with HTML, "alt" isn't actually a tag by itself but an attribute of the "img" tag. Note that the value of Alt Tags for SEO has been discounted over time by the search engines to the point that it is of minimal value now.
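For those working in HTML, a minimal example (the file name and description are invented for illustration):

```html
<img src="running-shoes.jpg" alt="Blue trail running shoes, side view">
```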
Anchor Text –
The actual text part of a link, which is usually underlined and used by search engines as an important ranking factor. Google pays particular attention to the text used in a hyperlink and associates the keywords contained in the anchor text with the page being linked to. Large-scale manipulation of anchor text is known as "Google Bombing."
Animated Ad –
An ad with movement, often an interactive Java applet or Shockwave or GIF89a file.
Announce Site To Search Engines –
To "Announce" a website to the engines is to add a link to it from another site that's already indexed by the search engines. Note that search engine submission services promising higher visibility in the SERPs are often a total rip-off.
Announcing – See Submitting Definition
API –
Also known as Application Program Interface, an API is a set of routines, protocols and tools for building software applications. It determines how a service is invoked through the application.
ASP –
An acronym for Active Server Pages, a Microsoft-invented, proprietary programming language for building dynamic web sites. ASP is also an acronym for Application Service Provider, a hosted service available via the Internet.
Automated Submitting –
It is using automated software such as WebPosition Gold or an Application Service Provider (ASP) such as Microsoft b-central's Submit-It service to submit web pages to the search engines. This tactic is usually frowned upon by the search engines. Indeed, some search engines such as AltaVista have blocked automated submissions entirely by requiring the user to key in a one-time-use submission code that is displayed on the submission page as a graphic.
Back Links –
These are inbound links pointing to a web page. See Inbound Links Definition
Bait and Switch –
This is considered a spam technique when used in SEO. It provides one page to a search engine or directory and a different page to other user agents at the same URL. A common variant is to create an optimized page and submit it to a search engine or directory, then replace it with the regular page as soon as the optimized page has been indexed.
Banned –
This happens when a search engine blocks a particular site from appearing in its search results.
Banner Ad –
A graphic image, usually a GIF or JPEG, that can be placed anywhere on a web page, most frequently centered across the top. The tile ad is a smaller counterpart, typically grouped with other tile ads along a side margin. The standard banner ad is 468 x 60 pixels; the most common size for tile ads is 125 x 125 pixels. The Interactive Advertising Bureau regulates guidelines and standards for display advertising sizes.
Beacon –
A line of code placed in an ad or on a web page that helps track the visitor's actions, such as registrations or purchases, among others. Also known as a web bug, 1-by-1 GIF, invisible GIF or tracker GIF, a web beacon is often invisible because it's only 1 x 1 pixel in size and has no color.
Beyond The Banner –
Any advertisement that is not a banner, such as an interstitial or a pop-up ad.
Bid Management Tool –
Software or an ASP service used to manage bids on pay-per-click search engines such as Yahoo Search Marketing (formerly Overture) and Google AdWords.
Bidding –
It means setting the price an advertiser is willing to pay per click on a pay-per-click search engine. The highest bid for a given keyword achieves the top spot in the PPC search results. In Overture, the top three bids are "featured" on Overture's partners' sites, including AOL, AltaVista, Infospace, and others. The minimum bid amount on Overture is 5 cents per clickthrough. See Pay Per Click / PPC Definition
Black Hat SEO –
This is sometimes called spamdexing and is the opposite of White Hat SEO. Black Hat SEO can be any optimization tactic that causes a site to rank more highly than its content would otherwise justify, or any change made specifically for search engines that doesn't improve the user's experience of the site. In other words, Black Hat SEO is optimization that goes against search engine guidelines. A site may be penalized or even removed from the index if it steps too far over the mark. For example, adding product reviews to an e-commerce site is encouraged, because it adds useful content to the site. However, using bait-and-switch techniques to create a doorway page that hooks people querying for information on other topics, such as soccer, and then leads them to information about health products is unacceptable. The following Black Hat SEO tactics should be avoided to keep the site away from penalties:
Keyword, anchor text and domain name stuffing
Using hidden text or links
Using techniques to artificially increase the number of links to web pages, such as link farms
Excessively cross-linking sites to increase link popularity
Cloaking, delivering different pages depending on the IP address and/or agent who is requesting it
Doorway / Gateway / Jump Pages
Duplicate content taken from other sites
Auto-generated content of no value to the end user
Spamming forums or blogs
Excessive outbound links to websites that use high risk techniques or spam
Last but not least, staying close to search engine guidelines is always a good idea while optimizing the site. Some Search Engine Guidelines are:
Yahoo! Search Content Quality Guidelines
Google Information on Search Engine Optimizers
MSN Guidelines for successful indexing
Blacklists –
Lists of search engine spammers compiled by either search engines or vigilante users, which may be used to ban or boycott said spammers from search engines.
Blog –
Also known as a "weblog", it is an online diary with entries made on a regular or daily basis. Some Blogs are maintained by an anonymous author who uses a nickname or handle instead of his or her real name.
Body Copy –
The ‘meaty’ textual content of a web page, Body Copy refers to text visible to users. It doesn’t include graphical content, navigation, or information hidden in the HTML source code.
Bot –
Short for robot. See Spider Definition
Bridge Page –
See Doorway Page Definition
Broad Match –
It is a form of "keyword matching" which refers to the matching of a search listing or advertisement to selected keywords in any order. Broad Match terms are less targeted than exact or phrase matches. This means that if the selected keywords are "running shoes", then ads or a search listing may be displayed if users search on the following example keywords:
Any Order: “shoes running”
Synonym: “running sneakers”
Plural, Singular: “running shoe”
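The any-order part of Broad Match can be sketched in a few lines of Python; synonym and plural matching would require extra linguistic data, so this illustrative sketch covers word order only:

```python
def broad_match_any_order(keyword: str, query: str) -> bool:
    """Return True if every word of the advertiser's keyword
    appears somewhere in the user's query, regardless of order."""
    keyword_words = set(keyword.lower().split())
    query_words = set(query.lower().split())
    return keyword_words <= query_words  # subset test

# "running shoes" matches a query with the words in any order...
print(broad_match_any_order("running shoes", "shoes running"))        # True
# ...and a longer query containing both words...
print(broad_match_any_order("running shoes", "cheap running shoes"))  # True
# ...but not a query missing one of the words.
print(broad_match_any_order("running shoes", "running shoe"))         # False
```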
Bulk Submission Services –
An ASP that submits many URLs to the search engines, for example SubmitWolf. Search engines don't like these services. See Automated Submitting Definition
Button Ad –
A clickable graphic that takes the user to another page or executes a program, such as a software demo or a video player.
Cache –
These are copies of web pages stored locally on an Internet user's hard drive or within a search engine's database. A Cache is the reason why web pages load so quickly when a user hits the Back button in their web browser: the page is not being redownloaded off of the Internet. Google is unusual among search engines in that it allows Internet users to view the cached version of web pages in its index. Simply click on the word "Cache" next to the search result of interest and you will be taken to a copy of the page as Googlebot discovered and indexed it. This feature of Google makes it easy to spot cloaking.
Call To Action –
This refers to a copy used in advertising to encourage a person to complete an action as defined by the advertiser. Call To Action words are “doing words” such as “Click here”, “Buy Now”, “Enter Now” or “Click To Download”.
CGI-BIN –
A "virtual" directory contained in URLs that indicates a CGI (Common Gateway Interface) script is in use. This is a sure tip-off to the spider that a page is dynamic.
Click-down Ad or Click-within Ad – An ad that allows the user to stay on the same web page, while viewing requested advertising content. Click-downs display another file on the user’s screen, normally below or above the initial ad. Click-withins allow the user to drill down for more information within the ad.
Clickthrough –
The action of clicking an ad element, redirecting the user to another web page.
Clickthrough Rate –
The rate at which people click on a link such as a search engine listing or a banner ad. Studies show that Clickthrough Rates are six times higher for search engine listings than banner ads.
Cloaking –
Serving different content to search engine spiders than to human visitors. Cloaking is basically a "bait and switch" tactic, where the web server feeds visiting spiders keyword-rich content, thus fooling the search engine into placing that page higher in the search results, yet when visitors click on the link they are given different and totally unrelated content. Search engines frown upon this practice and penalize or ban sites that do it.
ColdFusion –
A web scripting language with limited capabilities, mostly centered around database access. ColdFusion program files are saved on the web server with a .CFM file extension.
Content Integration –
Advertising woven into editorial content or placed in a special context on the page, typically appearing on portals and large destination sites. Also known as “web advertorial” or “sponsored content”.
Conversion –
The act of converting a web site visitor into a customer, or taking that visitor a step closer to customer acquisition; for example, convincing them to sign up for an e-mail newsletter.
Conversion Rate –
The rate at which visitors get converted to customers or are moved a step closer to customer acquisition.
Cookie –
This is information placed on a visitor's computer by a web server. While the web site is being accessed, data in the visitor's cookie file can be stored or retrieved. Mostly, cookies are used as unique identifiers (e.g. user IDs or session IDs) to isolate a visitor's movements from others' during that visit and subsequent visits. Other types of data that may get stored in a cookie include an order number, email address, referring advertiser, etc.
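As a rough illustration, Python's standard http.cookies module can build the Set-Cookie header a web server would send; the cookie name and value here are made up:

```python
from http.cookies import SimpleCookie

# A made-up session identifier stored as a cookie.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["path"] = "/"

# The header line the web server sends so the browser stores the cookie.
print(cookie.output())  # Set-Cookie: session_id=abc123; Path=/
```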
Cost Per Action (CPA) –
The cost incurred or price paid for a specific action, such as signing up for an email newsletter, entering a contest, registering on the site, completing a survey, downloading trial software, printing a coupon, etc.
Cost Per Click (CPC) –
The cost incurred or price paid for a clickthrough to a landing page.
Cost Per Lead (CPL) –
Pricing based on the number of new leads generated. For example, a person who clicks from an ad and then completes an inquiry form is considered a lead. The advertiser pays based on the number of leads received.
Cost Per Order (CPO) –
This is also known as “cost-per-transaction” wherein pricing is based on the number of orders received as a result of an ad placement.
Cost Per Sale (CPS) –
This is also known as “cost-per-acquisition” or “pay-per-sale” wherein pricing is based on the number of sales transactions an ad generates. Since users may visit a site several times before making a purchase, cookies can be used to track their visits from the landing page to the actual online sale.
Cost Per Thousand (CPM) –
The cost incurred or price paid for a thousand impressions.
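The arithmetic is straightforward; a small sketch with illustrative figures:

```python
def campaign_cost(impressions: int, cpm: float) -> float:
    """Cost Per Thousand: total cost = (impressions / 1000) * CPM rate."""
    return impressions / 1000 * cpm

# 50,000 impressions at a $4.00 CPM cost $200.00.
print(campaign_cost(50_000, 4.00))  # 200.0
```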
Counter –
A simple program which tracks the total number of webpage impressions.
Crawler –
See Spider Definition
CSS –
This stands for Cascading Style Sheets, which are used to control the design of a website.
Click Through Rate (CTR) –
It is a measure of the number of clicks received from the number of ad impressions delivered. The formula to calculate CTR is: number of clicks divided by number of ad impressions multiplied by 100.
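The formula can be expressed directly in Python (the figures are illustrative):

```python
def clickthrough_rate(clicks: int, impressions: int) -> float:
    """CTR (%) = number of clicks / number of ad impressions * 100."""
    return clicks / impressions * 100

# 30 clicks from 1,500 impressions is a 2.0% CTR.
print(clickthrough_rate(30, 1500))  # 2.0
```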
Custom Error Page –
This is customizing the content as well as the look-and-feel of the default page that is displayed on your web server when a 404 File Not Found error occurs. A good 404 error page has a friendly message explaining that the requested page doesn't exist at that location, a site map to encourage the user to continue exploring the site, a search box so the user can conduct a search, and a look-and-feel that matches the rest of the site, including navigation. Creating a custom 404 error page not only helps keep visitors on your site, it is also an important part of the search engine optimization process. Inevitably, pages on your site will get moved and removed over time. When a search engine spider returns to your site to reindex those now non-existent pages, it will have a set of links to explore in the form of the site map on the custom 404 page. You can test whether a site has a custom 404 error page by trying to access a web page with a nonsense filename after the domain name in the web site address.
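How the custom page is wired up depends on the web server; on Apache, for example, a single directive in the server configuration or .htaccess file does it (the file name is a placeholder):

```apache
# Serve a custom page for 404 File Not Found errors
ErrorDocument 404 /custom-404.html
```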
Database-driven –
Also known as a "database-driven web site." It means that the website is connected to a database and the web page content is based in part on information extracted from that database.
Database-generated –
Also known as a "database-generated web page." It means that a web page is created dynamically "on-the-fly" from a database, in contrast with a static HTML page.
Daughter Window –
An ad that runs in a separate window associated with a concurrently displayed banner. In normal practice, the content and banner are rendered first and the daughter window appears a moment later.
Deep Submitting –
Submitting URLs of pages deep in a site to the search engines; for example, the webmaster of a 200-page website submitting each of those 200 pages. Some search engines do not like this tactic because it unnecessarily clogs up their submission database when the search engine spider could find those pages on its own by exploring links starting at the home page.
Directory –
Human editors group websites into categories and provide site descriptions or edit descriptions that are submitted to them. With a directory, picking the right category and composing a description rich in key phrases will ensure maximum visibility. Contrast this with a search engine, which is unedited and concerned primarily with the HTML of a site's constituent pages.
Doorway Page –
Also known as a “bridge page” or a “gateway page”, a Doorway Page is a web page full of keyword-rich copy that doesn’t deliver any useful information on it other than a link into the site, and whose sole purpose is to be fed to the search engines.
Dynamic –
Generated "on-the-fly" from a database. See Database-driven Definition
Dynamic Rotation –
Delivery of ads on a rotating and random basis. Dynamic rotation allows ads to be served on different pages of the site and exposes users to a variety of ads.
Error Page –
A web page stating an error message such as “File Not Found”.
Exact Match –
This is a form of keyword matching where the search query must be exactly the same as the advertisement keyword. For example, a search for "running shoes" will only match ads or search listings that contain the exact phrase "running shoes".
Exclusive Advertising –
A contract that allows advertisers to purchase all inventory on a given page or for chosen keywords.
Expandable Banner –
A banner ad that can expand to as large as 468 x 240 pixels after a user clicks on or moves the cursor over the banner.
Findability –
This is how easily a site can be found using search engines.
Firefox –
See Mozilla Firefox Definition
Flash –
A technology developed by Macromedia that allows a web designer to embed interactive multimedia into web pages. It is often used for Flash intros, games, and animated navigation.
Flash Intro –
A short animation created using Flash that Internet users are made to sit through upon entry to a home page. Flash Intros annoy users, and they typically take the place of text content on a home page. Since search engines can't "read" content embedded in Flash, the rankings of a home page with a Flash Intro will suffer.
Floating Ads –
An ad that appears within the main browser window on top of the page’s normal content, appearing to “float” over the top of the page.
Flux –
Shuffling of search engine positions in between major search engine updates.
Forums –
Also known as "discussion forums", these are virtual communities used by search engine optimizers and webmasters for information exchange. Users can post messages in different forums, either to the group at large or to certain users. However, all postings can be seen by anyone else who has access to that forum, so save sensitive materials for private email. Forums are also threaded, which means a reply to a particular posting becomes part of the "thread" of that posting, which can be followed to provide a cohesive progression through a particular topic.
Frames –
Frames combine separate web pages into one, each potentially with its own scrollbar; on a framed website, part of the page scrolls while the rest stays in place. Frames frustrate people because much of the time when a person tries to bookmark a specific page, it doesn't actually work; instead, the browser bookmarks the "frameset" page, which is typically the home page. Also, search engines don't like frames: a framed web site is at a severe disadvantage compared to non-framed sites in terms of search engine marketing. Most search engines support frames, but only, as Google says in its FAQ section, "to the extent that [we] can." Searchers clicking through to a framed page from search results sometimes end up on an orphaned page.
Frameset –
A web page that is made up of frames. A useful analogy: if the individual frames that make up the frameset are the "children", then the frameset is the "parent".
Frequency –
The number of times an ad is delivered to the same browser in a single session or time period.
The term that Google uses to refer to frequently changing home pages. When Googlebot ascertains that a given home page is changing frequently, Googlebot will revisit and reindex this page daily.
Google Advertising Professional –
This is a free program, offered by Google, for professionals wishing to manage multiple Google AdWords clients.
Gateway Page –
Also called a “doorway page” or a “bridge page”. A Gateway Page is a low quality web page that contains very little content and exists solely for the purpose of driving traffic to another page. This is done through spamdexing or spamming the index of a search engine. Gateway Pages are often easy to identify in that they have been designed primarily for search engines, not for human beings.
Geo-targeting –
Advertising that is distributed based on geographic location. In some markets, online advertising allows for targeting of countries, states, cities and suburbs.
Google –
Google is the world's number one search engine with a 50.8% market share, ahead of Yahoo! (23.6%) and Live Search (8.4%). Google was founded by Stanford University students Larry Page and Sergey Brin in 1998. The company's initial public offering in August 2004 raised $1.67 billion, valuing Google at $23 billion. Google's success is largely attributable to its unique algorithmic ranking system, PageRank, which assigns a score to a web page based on the number of links to that page. Based in Mountain View, California, the company now employs 13,748 people and has a relaxed corporate atmosphere illustrated in the company's philosophy "Don't be evil". Central to Google's profitability is Google AdWords, launched in 2000. Google AdWords are text-based contextual ads relevant to keyword searches. In 2006 the company earned $10.492 billion in total advertising revenues, about 90 times the revenue from other Google ventures. Google has acquired several start-up companies over the past few years, including:
Pyra Labs, creators of Blogger, in 2003
Upstartle, creators of Writely in 2006
Measure Map, a weblog statistics application in 2006
YouTube for a huge $1.65 billion in stock in 2006
JotSpot a developer of wiki technology in 2006
DoubleClick purchased for $3.1 billion in 2007
Postini an enterprise messaging security company in 2007
Current Google applications include: Web Search, Image Search, Google News, Google Product Search, Google Groups, Google Maps, Gmail, AdWords, Google Video, Google Checkout and Google Earth.
Google AdSense –
Paid ads webmasters may place on their websites.
Google Analytics –
Google Analytics is a free web analytics tool offering detailed visitor statistics. The tool can be used to track all the usual site activities: visits, page views, pages per visit, bounce rates and average time on site, among others. It can also be used to specifically track AdWords traffic, helping webmasters to optimize AdWords adverts based on where visitors come from, time on site, click path and geographic location. Google Analytics is modeled on Urchin's analytics tool, which Google obtained when it purchased Urchin Software in 2005, and was first rolled out in late 2005. The response was overwhelming and Google had to suspend sign-ups only a few days later. After a short period using a lottery-type invitation system, the tool was made generally available in August 2006. The features of this tool include:
Updates in less than one hour
Users can add up to 50 websites
Integration with Google Adwords
User friendly interface – Dashboard format
Google Bombing –
This is when a group of sites such as blogs join forces to link to an unflattering page about a company such that this page rises to the top of the search results in Google. Google Bombing takes advantage of the power of hyperlink text and of PageRank. For example, if a group of sites with high PageRank all link to a page about XYZ Company’s inappropriate behavior with hyperlink text of “XYZ Company sucks”, then the linked page can shoot to the top of Google’s search results for the term “XYZ Company.”
Google Bowling –
It is a black hat SEO technique used to knock competitors down or out of search engine results. It is a form of SEO sabotage conducted by pointing hundreds of questionable links from low quality sites at a competitor's site so that the competitor ends up banned or penalized by Google. Generally, newer sites are more susceptible to Google Bowling, as older sites are better established with a range of existing high quality links.
Google Cache –
See Cache Definition
Google Checkout –
Google’s online payment processing service, Google Checkout, was designed to simplify the online purchase/payment process. It works by allowing users to store their credit card and shipping details on their Google Account. Purchases can be made at the click of a button, thus minimizing the amount of information they need to input at the point of purchase. Features include:
Ability to track purchases
Google Dance –
The Google Dance refers to the period when Google's indexes are updated. This period often results in fluctuations in the index size and some noticeable changes in search engine result positions. The term was adopted because, while an update is being processed, the position of a website in Google seems to "dance" as it fluctuates. The fluctuation is due to each of Google's nine datacenters being updated out of sync, meaning that for a time the results differ.
Google Juice –
Internet slang referring to the substance which flows between web pages via their hyperlinks. Pages with lots of links pointing to them acquire much “Google Juice” and pages which link to highly “juicy” pages acquire some reflected “Google Juice”.
Google Labs –
This is the home to Google’s latest innovations and beta products. It is a testing ground for new services in development. A number of popular products are graduates of Google Labs including: Google Reader, Google Docs & Spreadsheets, Google Video, Personalized Search, Google Desktop and iGoogle.
Current Google Labs products include:
Google Code Search
Google Music Trends
Google Extensions for Firefox
Google Page Creator
Google Dashboard Widgets for Mac
Google Web Accelerator
Google Ride Finder
Product Search for Mobile
Google Pack –
Free software specifically selected by Google, with no trial versions or spyware, ready to use in just a few clicks. It currently includes:
Google Photos Screensaver
Norton Security Scan
Firefox with Google Toolbar
Google Supplemental Index –
It is a secondary database containing Supplemental Results, meaning pages which are deemed to be of less importance by Google’s algorithm or are less trusted. The primary measure of the pages’ importance is the number and quality of links pointing to that page. Pages in the Google Supplemental Index can still rank in search results, but this will depend on the number of pages in the main index relevant to the search. Some of the reasons pages may be in the Google Supplemental Index:
Lack of trust
A site with a large number of pages
Excessively long URLs
As of July 2007, Google discontinued the practice of placing a "Supplemental Result" tag on search results, making it nearly impossible to tell whether a result is in the supplemental index or the main one.
Google Toolbar for Internet Explorer –
It is an Internet browser add-on available for both Internet Explorer and Mozilla Firefox. Features of Google Toolbar for Internet Explorer include:
Search Settings Notifier
Address Bar Browse By Name
Google Search Box
AutoLink & AutoFill
Google Account Sign-In
Google Toolbar for Mozilla Firefox –
It is an Internet browser add-on available for both Mozilla Firefox and Internet Explorer. Features of Google Toolbar for Mozilla Firefox include:
Access to Google Docs & Spreadsheets
Customize the layout options
Handle mailto: links with Gmail
Google Search Box
AutoLink & AutoFill
Google Account Sign-In
Google Traffic Estimator –
This is a tool that indicates the number of clicks to expect on Google AdWords ads for particular keywords. The tool can be used to indicate search volume, average cost per click, estimated ad positions, estimated clicks per day and estimated cost per day. Google Traffic Estimator does not provide a numeric estimate of the number of search queries; instead, it offers only a visual estimation of search volume in a small graphic.
Google Trends –
Another tool from Google Labs, it allows users to see how Google search volumes for a particular keyword have changed over a period of time. It shows the popularity of search terms from the beginning of 2004 onwards. Google Trends data is presented in a line graph. The horizontal axis represents time and the vertical axis shows how often a term is searched for. Data can be broken down further by region, city and language and can also compare multiple search terms.
Google XML Sitemap –
These are XML files that list the URLs available on a site. The aim is to help site owners notify search engines about the URLs on a website that are available for indexing. Webmasters can include information about each URL, such as when it was last updated and its importance in the context of the site. Ideally, all pages on a site should be easily accessible to search engines without the use of Google Sitemaps, but there are situations where a site might benefit from the Sitemaps Protocol, such as when it is built in rich AJAX or Flash, or when it is a large database-driven site that isn't well linked.
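A minimal sitemap file following the Sitemaps Protocol looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2007-06-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yourdomain.com/products.html</loc>
    <lastmod>2007-05-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```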
Googlebot –
It is the search bot used by Google. It collects documents from the web to build a searchable index for the Google search engine. If webmasters wish to restrict the information on their site available to Googlebot, or another well-behaved spider, they can do so with the appropriate directives in a robots.txt file.
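For example, a robots.txt file placed at the root of a site might contain directives like these (the paths are placeholders):

```text
# Keep Googlebot out of a private area
User-agent: Googlebot
Disallow: /private/

# Keep all well-behaved spiders out of the script directory
User-agent: *
Disallow: /cgi-bin/
```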
Googleware –
The assortment of tools produced by Google that can be used to search, report, or play.
Googlewhack –
A Google search query consisting of two words that returns a single result.
Grey Hat SEO –
SEO using both Black Hat and White Hat techniques.
Hallway Page –
A page that serves as an index to a group of pages that you would like the search engine spiders to find. Once a search engine spider indexes the Hallway Page, it should also follow all the links on that page and in turn index those pages as well.
Heading Tag –
An HTML tag that is often used to denote a page or section heading on a web page. Search engines pay special attention to text that is marked with a heading tag, as such text is set off from the rest of the page content as being more important.
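In HTML the heading tags run from h1 (most important) to h6 (least important); a small illustrative fragment:

```html
<h1>Running Shoes</h1>            <!-- main page heading -->
<h2>Trail Running Shoes</h2>      <!-- section heading -->
<p>Body copy about the products goes here.</p>
```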
Hidden Keywords –
Keywords that are placed in the HTML source in such a way that these words are not viewable by human visitors looking at the rendered web page.
Hidden Text –
SEO Spam Tactic – This is an SEO spam tactic that hides contextual HTML text from human visitors to a webpage while making it available for search engines to spider. The theory is that placing more keyword-rich HTML text content on the page will help it gain ranking within search engine results. Some website owners don't want that text content visible on their page because they believe it negatively affects their brand and user web experience, so they hide the text in the hope that the page will still rank for targeted keywords. Hidden Text is a prohibited technique, as search engines consider it search engine spam, and the practice will eventually harm the natural search performance of a website. Google's Quality Guidelines specify to "avoid hidden text or hidden links". Yahoo!'s Search Content Quality Guidelines also consider "the use of text or links hidden from the users" unwanted.
Hijacking of Websites –
This is a practice that makes search engines believe that a specific website resides at another URL. It is a form of search engine spam and cloaking, undertaken by spammers to increase rankings in search engine result pages. Webpage Hijacking is an illegal spam tactic. When spiders crawl websites and discover two pages with the same content, the search engine will decide which is the main URL; the other is not indexed. Spammers use tactics to ensure that their page is the one chosen by the search engine. An example of website hijacking is where there are two pages with exactly the same content but at different addresses: company.com (the real site) and company.net (the rogue site). Spammers use tactics to ensure their site ranks above the real site.
A download of a file from a web server. Hits do not correlate with web page visits, because every graphic on a web page counts as a hit. Thus, a single access of a web page with twenty (20) unique graphics on it registers as twenty-one (21) hits – twenty (20) for the graphics and one (1) for the HTML page. Web metrics guru Jim Sterne says hits “stand for How Idiots Track Success.” People who talk in terms of Hits are usually either ignorant or are trying to fool their boss into thinking the website is doing better than it really is.
This is the main page of a website. Like the cover of a book or the front of a store, its function is to welcome people and to inform them of the overall purpose of the website. It’s interesting to note that in some countries such as Japan, Korea and Germany, the term “homepage” usually refers to the whole website, not just the first page. The Homepage offers an index of navigation that organizes content and leads to other parts of the website. The Homepage usually accumulates the most PageRank score since its URL is usually where other sites link to the most. The URL of a Homepage usually ends in a domain name extension such as .com, .org, .edu, etc. Other terms used to describe a Homepage are “front page”, “main web page” and “webserver directory index”. Even though the Homepage is designed to be the entry point of the website, people can go directly to other pages within the site without ever seeing the front page.
This stands for HyperText Markup Language, the markup language used to structure web content and display it in a formatted manner. It’s up to the web browser software, e.g. Microsoft Internet Explorer or Netscape, to render HTML source.
HTML Source –
The raw, unrendered programming code. It can be accessed in Internet Explorer by going to the “View” menu then selecting “Source”.
HTML stands for HyperText Markup Language and is the main markup language for the creation of web pages. It defines how data is structured and informs the web browser how the page is to be displayed through the formatting of text and images. Some of the page elements that can be coded with HTML include Page Titles, Text (paragraphs, lines and phrases), Lists (unordered, ordered and definition lists), Tables, Forms, and Basic HTML Data Types (character data, colors, lengths, content types, etc.), among others. The source HTML code of any webpage is available by simply clicking “Page Source” in a web browser such as Firefox or Internet Explorer. HTML is not a programming language and therefore is quite static in nature. It is considered to be a subset of SGML (Standard Generalized Markup Language). Tim Berners-Lee first described HTML, and it was made publicly available in 1991 via a document called “HTML Tags”.
HTML became an international standard (ISO/IEC 15445:2000) and its specifications are maintained by the World Wide Web Consortium (W3C), to which commercial software vendors offer input.
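A minimal sketch of an HTML document using a few of the elements mentioned above (page title, heading, paragraph, unordered list); the content is illustrative only:

```html
<!-- A bare-bones HTML page -->
<html>
<head>
  <title>Page Title</title>
</head>
<body>
  <h1>A Section Heading</h1>
  <p>A paragraph of body text.</p>
  <ul>
    <li>An unordered list item</li>
  </ul>
</body>
</html>
```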
HTTP 301 -Status Code Definition –
The 301 status code means the URL requested has “Moved Permanently” and has been assigned a new URL. Any future requests should use one of the returned URLs. It is best practice to use 301 Redirects when multiple copies of the same document reside on different URLs. This will ensure that duplicate content is removed from the site and that each unique page has only one URL.
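On the wire, a 301 response simply points the client at the new location via the Location header; the URLs below are hypothetical:

```
GET /old-page HTTP/1.1
Host: www.example.com

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/new-page
```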
HTTP 302 -Status Code Definition –
The 302 status code means that the document requested is “Found” however temporarily resides under a different URL. Since a permanent redirect has not been used, the client should continue to use the original requested URL for future requests.
HTTP 400 – Status Code Definition –
The 400 status code means a “Bad Request”, stating that the server is not able to understand the document request due to malformed syntax. The client must modify the request before repeating it.
HTTP 401 – Status Code Definition –
The 401 status code means “Unauthorized”. The server requests user authentication prior to fulfilling the document request.
HTTP 403 – Status Code Definition –
The 403 status code means “Forbidden”. The server understood the request but it is refusing to fulfill it. The webmaster may wish to alert the user why their request has been denied. If the organization does not wish to provide this reason then a 404 (Not Found) status code can be displayed instead.
HTTP 404 – Status Code Definition –
The response error message “404” represents a document that is “Not Found”. This means that the client was able to communicate with the server but could not find the requested document. Alternatively, the server could be configured to not fulfill the request without providing a reason why.
HTTP 410 – Status Code Definition –
Similar to a 404 Not Found error message, the 410 status code states that the requested document is “intentionally gone”, is no longer available and there is no forwarding address. The 410 status code is usually used for limited display documents such as promotional information. It is up to the discretion of the webmaster to determine at what point to remove the 410 status message.
HTTP 500 – Status Code Definition –
The 500 status code error message states that there was an internal server error which has prevented the document from being fulfilled.
HTTP 501 – Status Code Definition –
The 501 status code message is displayed when the server does not recognize the document request method. The server is not capable of fulfilling this request and states the request was “Not Implemented”.
These are a range of centralized websites linking to many related topical Authority websites. Characteristics of Hubs are:
Many outbound links to sites, typically Authority sites, that contain relevant content.
The content on the hub site is highly focused.
A site can either be a hub, an authority, both or neither. An authority or hub site will get preferential treatment by a search engine algorithm that incorporates “topic distillation”.
See Links Definition
The number of times a search ad is served to users by search engines.
Inbound links (IBL) –
Links that point to a certain site from sites other than your own. Inbound links are an important asset that will improve your site’s PageRank (PR).
A search engine’s database in which it stores textual content from every web page that its spider visits.
First introduced in September 1995, Inktomi Corporation of California was a key player in the search engine market, pioneering online search technologies. It initially provided software to ISPs (Internet Service Providers) and then went into powering other well-known web search tools such as HotBot, Looksmart, MSN, and other regional search engines. Instead of operating everything on one machine, it used a distributed network technology that enabled it to index more than 1.3 million documents on the web at that time, ultimately displacing AltaVista. Inktomi was the first to launch a paid inclusion service, meaning websites would receive regular and frequent re-indexing for a fee. It also invented a proxy cache for ISP web traffic called “Traffic Server”. During its short life, Inktomi acquired many businesses including Webspective, Infoseek, eScene Networks and FastForward Networks. Once the Internet bubble burst in 2000, many of its acquisitions were sold off due to the financial collapse of most of its customer base. Yahoo! purchased Inktomi in 2003, and it remains central to Yahoo!’s search engine database today.
Synonymous with backlinks; the term was popularized by Yahoo!.
Insertion Order (I/O) –
A contract that specifies the details of your search advertising campaign, including placements options, keywords, ad creative, landing page, pricing, geo-targeting, and language options.
Internal Links –
This is a hypertext link that points to another page within the same website. Internal links can be used as a form of navigation for people, directing them to pages within the website. Links assist with creating good information architecture within the site. Search engines also use internal text links to crawl pages within a website. The way Internal Links are structured will impact the way in which search engine bots spider and subsequently index pages.
Sometimes called “The Net”, the Internet is a publicly accessible worldwide system of computer networks that enable people to send and receive information from other computers. The Internet uses the TCP/IP network protocols to facilitate data transmission. There are three levels of hierarchy which includes backbone networks, mid-level networks and stub networks. These include commercial (.com or .co), university (.ac or .edu) and other research networks (.org or .net).
The origins of the Internet began in 1962, when a research organization called RAND was commissioned by the US Air Force to develop a military research network that could survive a nuclear attack. Packet Switching was invented as a way of sending data. The first email program was created in 1972. The TCP/IP Protocol was developed in 1973 and by 1983 it became the core Internet protocol. The same year, 1983, saw the development of the Domain Name System (DNS) by the University of Wisconsin. The domain name system made it easier for people to access other servers without having to remember the corresponding long IP numbers. In 1992 the World Wide Web was released by CERN, and the Internet Society, which helps coordinate the Internet, was chartered. The first graphical user interface to the WWW, called “Mosaic for X”, was released. By 1996, most Internet traffic was carried by independent ISPs. The Internet Society is building a new TCP/IP that will allow billions of addresses rather than the limited supply that we have today.
Internet Explorer –
Internet Explorer is a web browser produced by Microsoft and initially introduced in 1995. It is also known as Explorer, IE, Microsoft Internet Explorer or MSIE. Internet Explorer is the most widely used browser in the world to view information on the World Wide Web (WWW). This browser didn’t get popular until version 3 was released in 1996, which supported CSS, ActiveX controls, Java applets, inline multimedia and the PICS system for content metadata. Version 9 is currently the latest version, with further improvements to the look and security of the browser. Over the years Internet Explorer has been subject to harsh criticism, particularly for its early releases, including its security architecture and its lack of support for open standards.
Interstitial Ad –
An ad page that appears for a short period of time before the user-requested page is displayed. Also known as a “transition ad”, “splash page” or “Flash page”.
Also known as “ad avail”, this is an advertising space available for purchase on a website. Based on projections, inventory may be specified as number of impressions or as a share of voice.
Invisible Web –
A term that refers to the vast amount of information on the web that is not indexed by the search engines. Dr. Jill Ellsworth coined the term in 1994.
IP Address –
IP Address stands for “Internet Protocol Address” and is sometimes referred to as “IP” or “Internet Address”. It is expressed as a four-part series of numbers separated by periods that identifies every sender and receiver of network data. The numbers represent the domain, the network, the subnetwork and the host computer. For example: 127.0.0.10, with each number ranging from 0 through to 255. Each server or device connected to the Internet is assigned a unique permanent (static) or temporary (dynamic) IP Address. The IP Address sometimes translates into a specific domain name.
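The dotted-quad format described above can be checked programmatically; a minimal sketch using Python’s standard `ipaddress` module (the addresses are illustrative):

```python
import ipaddress

# Each of the four dotted-quad numbers must fall in the range 0-255.
addr = ipaddress.ip_address("127.0.0.10")
print(addr.version)      # 4
print(addr.is_loopback)  # True (127.0.0.0/8 is the loopback range)

# An out-of-range octet is rejected outright.
try:
    ipaddress.ip_address("256.1.1.1")
except ValueError:
    print("256.1.1.1 is not a valid IP address")
```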
An abbreviation for Internet Service Provider. An ISP provides a range of Internet-related services to customers including Internet connectivity, email, website hosting, domain name registration and hosting. Usually provided for a monthly fee, an ISP can be a commercial business, a university, a government organization, a school or any other entity that provides access to the Internet to members or subscribers.
Java Applets –
Small programs written in the Java programming language that can be embedded into web pages. Java Applet programs run on the Internet user’s computer rather than the web server’s computer. Search engines can not run Java Applets. Consequently, if navigation or content is embedded in a Java Applet, it will be invisible to the search engines and will not get indexed. Java source code gets compiled into executable code called “bytecode.”
Jump Page Ad –
A microsite reached by clicking a button or banner. The jump page itself can list several topics, which can be linked to your site.
Junk Pages –
Meaningless documents that serve no purpose other than to spam the search engines with keyword stuffed pages in hopes that a visitor might click on an Adsense Ad.
Key Performance Indicators (KPIs) –
KPIs help organizations achieve organizational goals through the definition and measurement of progress. The key indicators are agreed upon by an organization and are indicators which can be measured that will reflect success factors. The KPIs selected must reflect the organization’s goals, must be measurable and they must be key to its success. Key performance indicators usually are long-term considerations for an organization.
Key Phrase (or Keyword Phrase) –
A search phrase made up of keywords. See Keyword Definition
A word that a search engine user might use to find relevant web page(s). If a keyword doesn’t appear anywhere in the text of the web page, it is highly unlikely that the page will appear in the search results, unless that keyword has been bid on in a pay-per-click search engine.
Keyword Density –
The number of times a given keyword appears on a web page, usually considered relative to the total number of words on the page. The more times a given word appears on your page (within reason), the more weight that word is assigned by the search engine when that word matches a keyword search done by a search engine user.
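A minimal sketch of how keyword density is commonly computed as a percentage of all words on a page; the function name and sample text are hypothetical:

```python
def keyword_density(text, keyword):
    """Occurrences of `keyword` as a percentage of all words in `text`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words) * 100

page = "cheap running shoes and more running shoes for every runner"
print(round(keyword_density(page, "shoes"), 1))  # 20.0  (2 of 10 words)
```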
Keyword Matching –
It is the process of selecting and providing advertising or information that matches the user’s search query. There are four types of Keyword Matching: exact match, phrase match, broad match and negative match.
Keyword Popularity –
The number of occurrences of searches done by Internet users of a given keyword during a period of time. Both WordTracker.com and Overture’s Keyword Selector Tool provide keyword popularity numbers.
Keyword Prominence –
The location or placement of a given keyword in the HTML source code of a web page. The higher up in the page a particular word is, the more prominent it is and thus the more weight that word is assigned by the search engine when that word matches a keyword search done by a search engine user. Consequently, it’s best to have the first paragraph be chock full of important keywords rather than useless terms. This concept also applies to the location of important keywords within individual HTML tags, such as heading tags, title tags, or hyperlink text. So get in the habit of starting off your title tags with a good keyword rather than “Welcome to.”
Keyword Research –
Determining the words and phrases that people use to find something, then compiling them into a list for use on web pages, among other uses.
Keyword Stuffing –
Placing excessive amounts of keywords into the page copy and the HTML in such a way that it detracts from the readability and usability of a given page for the purpose of boosting the page’s rankings in the search engines. This includes hiding keywords on the page by making the text the same color as the background, hiding keywords in comment tags, overfilling alt tags with long strings of keywords, etc. Keyword Stuffing is just another shady way of playing games with the search engines and, as such, its use should be strongly discouraged.
When a given page or bit of text is full of good keywords rather than a bunch of meaningless or irrelevant words (e.g. “welcome”, “click here”, “solution”).
Landing Page –
The Landing Page is a web page where people are directed once they click on an online advertisement or natural search listing. Landing pages are designed to be highly relevant to the advertisement or search listing and encourage users to complete a “call to action”. The Landing Page is also known as the “click through URL” or “destination URL”. Example uses of Landing Pages are newsletter sign up forms, download demonstration trial software and purchasing of a product or service.
Link Bait –
Useful or entertaining web content which compels users to link to it.
Link Building –
Requesting links from webmasters of other sites for the purpose of increasing the “link popularity” and/or “PageRank.” Directory submissions and press release syndication can be considered forms of Link Building.
Link Farm –
A Link Farm is a group of highly interlinked websites created to inflate link popularity and PageRank (PR). A Link Farm is a form of spamdexing, which means spamming the index of a search engine.
Link Popularity –
When other web sites link to a site, that site will rank better in certain search engines. The more web pages that link to a particular site, the better its Link Popularity is.
Link Spam –
Links between pages that are specifically set up to take advantage of link-based ranking algorithms such as Google’s PageRank (PR).
These are text or graphics that, when clicked on, take the Internet user to another web page location. Links are expressed as URLs.
Log File –
All accesses to a web site can be logged by the web server. Data that is usually logged includes date and time, filename accessed, user’s IP address, referring web page, user’s browser software and version, and cookie data.
Long-Tail Keyword –
A more specific keyword term consisting of three or more words.
Don’t use software tools that claim to auto-generate doorway pages. These pages are usually devoid of meaningful content, and Google, in particular, is working on ways to identify and exclude machine-generated doorway pages.
Manual Submitting –
Means submitting by hand to an individual search engine, rather than using an automated submission tool or service. Manual Submitting is a more polite way to submit, and as such is less likely to land someone in trouble with the search engines. But the best approach is not to submit at all and let the search engine spiders find your site through links from other sites to your site.
Meta Description –
A meta tag hidden in the HTML that describes the page’s content. It should be relatively short; around 12 to 20 words is suggested. The Meta Description provides an opportunity to influence how the web page is described in the search results, but it will not improve search rankings. Make sure the Meta Description reflects the page content, or the page may be flagged as spam.
Meta Keywords –
A meta tag hidden in the HTML that lists keywords relevant to the page’s content. Because search engine spammers have abused this tag so much, this tag provides little to no benefit at all to the search rankings. Of the major search engines, only Yahoo! still pays any attention to the Meta Keywords tag.
Meta Search –
Search results derived from several sources and consolidated into a single SERP.
Meta Tag Stuffing – Repeating keywords in the meta tags and using meta keywords that are unrelated to the site’s content.
Meta Tags –
Meta-information (information about information) that is associated with a web page and placed in the HTML but not displayed on the page for the user to see. There are a range of Meta Tags, only a few of which are relevant to search engine spiders. Two of the most well-known Meta Tags are the meta description and meta keywords; unfortunately these are ignored by most major search engines, including Google.
See Replica Definition
Miserable Failure –
A well-known example of “Google bombing”, Miserable Failure poignantly illustrates how inbound text links can affect SERPs. In this instance, hyperlinks containing the keyword phrase Miserable Failure seem to reflect America’s and the world’s opinion of George W. Bush’s performance, while positioning his White House bio in the number one (1) position for that search term. So effective have the results been that today George W. Bush’s White House biography is positioned at number one (1) on Google for the single word term “failure”, forever soiling the reputation of America and The Office of the President.
A module or plugin for Apache web servers that can be used to rewrite requested URLs on the fly. It supports an unlimited number of rules, with attached rule conditions for each rule, to provide a flexible and powerful URL manipulation mechanism. This can be used to offer search engine friendly URLs, increasing indexing chances for a dynamic, database-driven website.
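A minimal sketch of a mod_rewrite rule that maps a static-looking, search engine friendly URL onto a dynamic script; the file and parameter names are hypothetical:

```apache
# In .htaccess or the server config:
# /products/42  ->  /product.php?id=42
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /product.php?id=$1 [L]
```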
It is where hovering the mouse over a text or graphic link without clicking displays something new on the page. For example, a horizontal navigation bar may display further sub-section choices underneath the section hovered over.
Mozilla Firefox –
Mozilla Firefox is a web browser that has steadily grown in popularity over the last few years and currently accounts for about 15% of the world’s web browsing. Mozilla Firefox is free and open source software developed by the Mozilla Corporation and a community of external contributors. This cross-platform browser provides support for Microsoft Windows, Mac OS X, and Linux.
Refers to Microsoft Network and their search engine.
Navigation bar (nav bar) –
A web site’s navigation icons, usually arranged in a row down the left hand side or along the top. It plays a crucial role in directing spiders to the site’s most important content and in getting site visitors to go deeper in the site.
Negative Keyword –
Negative Keyword is a term referenced by Google AdWords and is a form of keyword matching. This means that an advertiser can specify search terms that they do not want their ad to be associated with. For example, if the negative keyword “-nike” is added to the keyword “running shoes”, the ad will not be displayed if a person searches upon the term “nike running shoes”. Negative keyword matching ensures that only qualified traffic is clicking upon advertising.
Negative SEO –
The act of demoting a page or site from the SERPS. Most often used against a competitor that is above a certain site in the SERPS but can be used purely for fun.
Noframes Tag –
This is alternative non-framed HTML on a frameset page for very old, non-frames-capable web browsers and search engine spiders. Placing good keyword-rich text in Noframes Tags is a good idea if a site is framed, but a much better thing to do is to ditch frames altogether and rebuild the site properly. A framed web site is not search engine friendly even if it uses Noframes Tags.
Refers to the content specific to a particular topic.
Outbound Links –
This refers to links that direct “off-site” to another website.
Page Title –
See Title Tag Definition
This refers to stealing high-ranking web page content from one site and placing it on another site in the hopes of increasing its search engine rankings. Pagejacking is yet another shady way of gaming the search engines and its use should be strongly discouraged.
PageRank (PR) –
Google uses a weighted form of link popularity called PageRank. Not all links are created equal: Google treats a link from an important site like CNN.com as better than a link from Jim-Bob’s personal home page. The Google Toolbar has a PageRank meter built into it, to see which web pages are considered important by Google and which aren’t. PageRank scoring ranges from 0 to 10, 10 being the best, and scores get exponentially harder to achieve the closer to 10 they are. For example, increasing a homepage’s PageRank from a 2 to a 3 is easy without a lot of additional links, whereas jumping from a 7 to an 8 is very difficult to achieve. The higher the PageRank of the page that’s linking to the site, the more the site’s PageRank will benefit. The better the PageRank, the better the site will do in Google, all else being equal.
See Impression Definition
Paid Inclusion –
It means paying a search engine to have your web pages included in that search engine’s index.
Paid Placement –
This refers to paying a search engine to have the listings show up prominently. These listings are usually denoted as “sponsored listings.”
A pricing model based on delivering sales that can be directly attributed to the bottom line. Contrast this with traditional banner advertising which is based on impressions, a chunk of which come from people you have no desire or ability to do business with.
Pay-Per-Click (PPC) –
A pay-for-performance pricing model where advertising, such as banners or paid search engine listings, is priced based on number of clickthroughs rather than impressions or other criteria. Overture is an example of a search engine which charges advertisers on a Pay-Per-Click basis.
Pay-Per-Post (PPP) –
A website designed to help content creators such as bloggers find advertisers willing to sponsor specific content.
This stands for Adobe’s Portable Document Format, a file format that renders the page exactly as intended regardless of the computer used. Typically used for creating documents that will be printed, PDF is used instead of HTML when the content creator wants absolute control over the display of the document. In contrast, the display of an HTML document depends on the computer and web browser software used.
An “open source” programming language for building dynamic web sites, PHP can be used to write server-side programs that access databases. PHP is the most popular web programming language – more popular than Microsoft’s ASP (Active Server Pages), JSP (Java Server Pages), and Macromedia’s Cold Fusion. PHP is especially well-suited for Web development and can be embedded into HTML. PHP is secure, easy to learn, efficient, fast to code and fast to deploy. It is being used by over nine (9) million web sites (over 24% of the sites on the Internet), due largely to benefits that include quicker response time, improved security, and transparency to the end user.
Phrase Match –
This is a form of keyword matching where an ad will be displayed if the user’s search query includes the exact phrase, even if the query contains additional words. For example, if the term “running shoes” is associated with an ad and the user searches on the term “blue running shoes”, the ad will be displayed. However, the ad will not be displayed if the search query is “shoes for running”.
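The rule described above can be sketched as a contiguous, in-order word-sequence check; the function name and examples are hypothetical:

```python
def phrase_match(ad_phrase, query):
    # Phrase match: the query must contain the ad's words as an
    # exact, contiguous, in-order sequence.
    phrase = ad_phrase.lower().split()
    words = query.lower().split()
    n = len(phrase)
    return any(words[i:i + n] == phrase for i in range(len(words) - n + 1))

print(phrase_match("running shoes", "blue running shoes"))  # True
print(phrase_match("running shoes", "shoes for running"))   # False
```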
A pop-up that appears underneath the currently active web browser window. An annoying and shady tactic used by some web advertisers.
A web page that displays within a new, typically smaller, web browser window, rather than the currently active browser window. Search engine spiders don’t typically follow Pop-Up (or pop-under) links. Pop-Ups are often times used for promotions, ads, email newsletter invitations, survey invitations, and the like.
A site that functions as a point of access to information on the web. Portals are either authoritative hubs for a given subject or popular content driven sites.
Pull-Down List –
Is typically on a web form, where the user chooses from a list of items. For example, names of countries will typically be done using a Pull-Down List. It is usually displayed with the first item within a box and a down arrow immediately to the right. Clicking on the down arrow will display the full list to choose from. Search engine spiders can’t fill out forms or pull down on lists, so content that is only accessible through Pull-Down Lists will not be indexed and will be part of the “Invisible Web.”
A keyword, or phrase inquiry entered into a search engine or database. A person types in words and the search engine database returns results that matches the user’s Query.
Sometimes expressed as a percentage of the universe of a target audience; it is measured by the total number of unique users who will see the ad over a specific period of time.
Reciprocal Linking –
The practice of trading links between websites.
It is where the Internet user is automatically taken to another web page address without him/her clicking on anything. Redirects are generally not good for search engine rankings, as they dilute PageRank. There is also the risk that the search engine spider will not follow your Redirect.
Referral Fees –
Fees paid in exchange for delivering a qualified sales lead or purchase inquiry. For example, an affiliate drives traffic to other companies’ sites, typically in exchange for a percentage of sales or a flat referral fee.
A web page, containing a link to your web page, that delivered a visitor to your page. For example, if Google’s search results for “britney spears” contained a link to a page on your site and the user clicked on that link, the Google results page would be the Referrer.
The likelihood that a given web page will be of interest or useful to a search engine user for a keyword search.
Remnant Inventory –
Low-cost advertising space that is relatively undesirable or otherwise unsold.
To format and stylize HTML source code into the final form shown on the visitor’s screen. For example, text within bold (<b>) tags will be made bold.
Repeat Visitor –
It is a single individual or browser who accesses a website or webpage more than once over a specified period of time.
A copy of a dynamic web site or a group of web pages from a dynamic site, saved as static HTML files.
Resubmitting – Submitting a web page address to search engines after it has already been submitted previously or after the search engine has already included your site in its index. Search engines don’t like it when Resubmitting is done as it simply clutters their queue with duplicate requests.
This can refer to SERPs.
As in “URL rewriting”.
See Spider Definition
A text file placed in a website’s root directory, where search engine spiders look for it by convention. It allows SEOs to control the actions of search engine spiders on the site, or even deny them access.
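A sketch of how such a file is interpreted, using Python’s standard `urllib.robotparser`; the rules and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block every spider from /private/
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://www.example.com/index.html"))         # True
```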
Return On Investment – The benefit gained in return for the cost of investing budget in advertising or a project. ROI can be measured by the following calculation: Total Revenues (generated from the campaign or project) minus Total Costs, often expressed as a percentage of Total Costs.
Run of Site (ROS) –
The scheduling of ads across an entire site, often at a lower cost than the purchase of specific pages or sub-sections of the site. A Run-of-Site ad campaign is rotated on all general, non-featured ad spaces on a site.
Scraper Sites –
Designed to “scrape” or extract information from search-engine results pages or other sources of content, often without permission, to create content for a website. Scraper Sites usually redirect the user to other sites and are generally full of advertising.
Search Engine –
A web site that offers its visitors the ability to search the content of numerous web pages on the Internet. Search Engines periodically explore all the pages of a website and add the text on those pages into a large database that users can then search. With a Search Engine, publishing web pages that incorporate relevant key phrases, prominently positioned in particular ways, is critical. Contrast this with directories, which don’t siphon content out of the HTML of a site’s constituent pages, but instead are comprised solely of site names and descriptions written or edited by human reviewers.
Search Engine Marketing (SEM) –
Strategies and tactics undertaken to increase the amount and quality of leads generated by the search engines.
Search Engine Optimization (SEO) –
Tactics and strategies undertaken to influence the rankings of web pages in the search engines. Search Engine Optimization involves three (3) steps: technical optimization, content optimization and link building.
Search Engine Results Page (SERP) –
A page of search results delivered by a search engine.
Search Term –
A keyword or phrase used to conduct a search engine query.
Search Term Popularity –
See Keyword Popularity Definition
Select List –
See Pull-Down List Definition
SEM –
Acronym for Search Engine Marketing.
SEO –
Acronym for Search Engine Optimization and/or Search Engine Optimizer.
SERP –
An acronym for Search Engine Results Page.
SERPs –
Plural of SERP: Search Engine Results Pages.
Session –
See User Session Definition
Share of Voice –
Refers to the relative portion of exposure of an advertiser within a defined market sector over a period of time. Share of Voice can refer to the portion of exposure in advertising or the blogosphere.
An animated ad that is only long enough to play a message before settling into a stationary ad on the page. It moves across the browser, usually with sound effects.
Skyscraper –
A tall, thin ad unit that runs down the side of a web page. A Skyscraper can be 120 x 600 or 160 x 600 pixels.
Sniffer Script –
A small program or script that detects which web browser software an Internet user is using and then serves up the particular browser-specific cascading style sheet to match. Sniffer Scripts are also used to detect whether a user has the Macromedia Flash plug-in installed, and if so, a Flash version of the page is displayed.
Spam –
Manipulation techniques that violate search engine guidelines.
See Spamming Definition
Spamglish –
Keyword-rich gibberish used as search engine fodder instead of thoughtfully written, interesting content. Spamglish often includes meaningless sentences and keyword repetition.
Spamming –
This is most commonly associated with the act of sending unsolicited commercial email. But in the context of search engine optimization, Spamming refers to using disreputable tactics to achieve high search engine rankings. Such tactics include bulk submitting spamglish-containing doorway pages. This is also referred to as "spamming the search engines".
Spider –
Spiders are programs used by a search engine to explore the World Wide Web in an automated manner: they download the HTML content from web sites, strip out whatever they consider superfluous or redundant, and store the rest in a database or index. A Spider is also known as a bot, robot, or crawler. Spiders are mainly used to create a copy of all visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. They can also be used to automate maintenance tasks on a web site, such as checking links or validating HTML code, or to gather specific types of information from web pages, such as harvesting e-mail addresses (usually for spam). In general, a Spider starts with a list of URLs to visit; as it visits each URL, it identifies all the hyperlinks on the page and adds them to the list of URLs to visit, recursively browsing the Web according to a set of policies. When a Spider indexes a particular website, this is known as "being spidered".
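The crawl loop described above (start from a seed URL, collect hyperlinks, queue and visit them) can be sketched without touching the network by crawling a hypothetical in-memory "web"; a real spider would fetch each page over HTTP instead:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags, as a spider would."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

# Hypothetical site: URL -> HTML body (a real spider fetches these).
PAGES = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/blog/post-1">Post 1</a>',
    "/blog/post-1": "",
}

def crawl(seed):
    """Breadth-first crawl: visit each URL once, queueing new links."""
    seen, queue = {seed}, [seed]
    while queue:
        url = queue.pop(0)
        extractor = LinkExtractor()
        extractor.feed(PAGES.get(url, ""))
        for link in extractor.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl("/")))  # all four pages are discovered
```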
Spider Trap –
An infinite loop that a spider may get caught in if it explores a dynamic site where the URLs of pages keep changing. For example, a home page may have a different URL and the search engine may not be able to ascertain that it is the home page that it has already indexed but under another URL. If search engines were to completely index dynamic web sites, they would inevitably have large amounts of redundant content and download millions of pages.
Splash Page –
A home page that is most often devoid of content and created in Flash. Splash Pages usually present the visitor with something to the effect of "Enter Here" or "Choose our Flash-enabled site or the HTML version". They are an annoyance to Internet users, as they introduce an extra hoop the user has to jump through before reaching any meaningful content, and they are also damaging to search engine rankings: because a home page is typically considered by search engines to be the most important page of a web site, a content-less Splash Page is a wasted opportunity.
Standards Compliant –
Sites that use valid XHTML and CSS and separate the content layer from the presentation layer. Because Standards Compliant sites are accessible and usable to both humans and spiders, they tend to rank better in search engines than non-compliant sites.
Static Page –
This means that the web page was not created dynamically from a database, but was instead created in advance and saved as an HTML file. Also known as a "static web page."
Stemming –
Search engines such as Google use a process called Stemming to deliver results based on a word's root spelling. For example, a search for "bathe" returns results similar to those for "bathing".
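A deliberately naive suffix-stripping sketch of the idea; real engines use far more sophisticated stemmers (such as Porter's algorithm), and this toy rule set is an illustrative assumption only:

```python
def naive_stem(word: str) -> str:
    """Strip a few common English suffixes to approximate a root.
    A toy illustration; real stemmers apply many more rules."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(naive_stem("bathing"))  # bath
print(naive_stem("bathes"))   # bath
# "bathe" is left unchanged by this toy rule set, though a real
# stemmer would map it to the same root as "bathing".
print(naive_stem("bathe"))    # bathe
```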
Stop Character –
Certain characters, such as the ampersand (&), equals sign (=), and question mark (?), that, when present in a web page's URL, tip off a search engine that the page in question is dynamic. Search engines are cautious about indexing dynamic pages for fear of spider traps, so pages that contain Stop Characters in their URLs run the risk of not getting indexed and becoming part of the "Invisible Web." Google won't crawl more than one dynamic level deep, so dynamic pages with Stop Characters in their URLs should get indexed if a static page links to them. Eliminating these characters from all URLs on your site will go a long way toward ensuring that your entire site gets indexed by Google.
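Spotting the stop characters mentioned above in a URL is straightforward; the URLs below are hypothetical:

```python
STOP_CHARACTERS = set("&=?")

def looks_dynamic(url: str) -> bool:
    """True if the URL contains characters that signal a dynamic page."""
    return any(ch in STOP_CHARACTERS for ch in url)

print(looks_dynamic("http://example.com/products?id=42&cat=7"))  # True
print(looks_dynamic("http://example.com/products/42/"))          # False
```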
Stop Word –
Certain words, such as "the", "a", "an", "of", and "with", are so common and meaningless that a search engine won't bother including them in its index or database of web page content. In effect, the Stop Words on your web pages are ignored as if those words weren't on your pages in the first place. Including a lot of stop words in your title tag waters down the title tag's keyword density.
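A sketch of how an indexer might drop stop words before indexing; the stop-word list is the small illustrative subset from the entry above, not any engine's real list:

```python
STOP_WORDS = {"the", "a", "an", "of", "with"}

def remove_stop_words(text: str):
    """Tokenize naively on whitespace and drop stop words."""
    return [w for w in text.lower().split() if w not in STOP_WORDS]

print(remove_stop_words("The history of an ancient city"))
# ['history', 'ancient', 'city']
```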
Streaming Media –
This refers to audio-visual content that is played as it is being downloaded. Thus, an Internet user can begin watching a video clip as the footage downloads, rather than having to wait for the clip to download in its entirety.
Submitting –
Sending a web page address to a search engine in the hope that it will index the page. Submitting your pages using an automated tool (see "Automated Submitting"), submitting multiple pages of the same web site (see "Deep Submitting"), or submitting multiple times, particularly if those pages are already indexed (see "Resubmitting"), are techniques typically frowned upon by search engines. It is suspected that some search engines apply a penalty factor to pages that were submitted versus those that the search engine spiders found on their own. Indeed, Inktomi was engaging in this practice before it discontinued accepting free submissions altogether.
Supplemental Pages –
These are pages that are indexed in Google but no longer exist at their original location. They can still be shown in the search results pages for a particular query, where they provide additional information about that search.
An option that allows you to extend your reach by distributing ads to additional partner sites.
Tagging, Tags –
Word descriptions. See Bookmarks Definition
Target Audience –
This is the market to which advertisers wish to sell their product or service. A Target Audience is defined in terms of demographics, psychographics, purchase behavior, and media or product usage.
Taxonomy –
Classification system of controlled vocabulary used to organize topical subjects, usually hierarchical in nature.
Technorati –
Search engine for blogs.
Telnet –
This is a user command and an underlying TCP/IP protocol for accessing remote computers.
An internet search engine.
Term Frequency –
The number of times a term occurs in a document.
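Counting term frequency is a one-liner with the standard library; the document text here is hypothetical:

```python
from collections import Counter

def term_frequency(document: str) -> Counter:
    """Count how many times each term occurs in a document."""
    return Counter(document.lower().split())

tf = term_frequency("seo tips and more seo tips")
print(tf["seo"])   # 2
print(tf["tips"])  # 2
print(tf["and"])   # 1
```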
Term Vector –
A database of term vectors used to assess relevancy for different search phrases and pages.
Text Ad –
A Text Ad is a concise, action-oriented copy describing the product or service that is being advertised. This appears alongside natural search results and links to a specified web page.
Text Link Ads –
An advert in the format of a text link.
The Tragedy of the Commons –
A dilemma or conflict between individual interests and the common good.
Theme –
The main keyword focus of a web page.
Thesaurus –
Similar in format to a dictionary, but contains lists of synonyms rather than definitions.
Title Attribute –
This is intended to provide supplementary information about an element.
Title Tag –
The text displayed in the blue bar at the very top of the browser window, above “Back”, “Forward”, “Refresh”, “Print”, among others. Although inconspicuous to the user, the Title Tag is the most important bit of text on a web page as far as the search engines are concerned. Search engines not only assign the words in the Title Tag more weight, they also typically display the Title Tag in the search results, making it an important potential call-to-action as well. Thus, the wording of each page’s Title Tag should be thought through carefully. See Keyword Prominence Definition.
Token –
A tracer or tag attached by the receiving server to the address or URL of a page requested by a user. A Token lasts only through a continuous series of requests by a user, regardless of the length of the interval between requests. It can also be used to count unique users.
Toolbar –
This is a browser add-on, usually including a search box. See Google Toolbar for Internet Explorer and Google Toolbar with Mozilla Firefox Definition.
A context sensitive ranking algorithm for web search.
Trackback –
A notification that someone has linked to a document on your site. This enables authors to keep track of who is linking to or referring to their articles.
Tracking –
Online advertising opens the opportunity to track audience response throughout the life of a campaign. Tracking and reporting tools can help you learn as you go, so you can refine your ad creative, placement options, and spending levels if you're not seeing the results you expect. The publisher of your ads typically will provide reports on ad impressions and clickthrough. For additional analysis of your traffic and actual customer conversion rates, you'll need to build Tracking mechanisms into your website.
Trademark –
A Trademark is a word, phrase, logo or symbol that identifies and distinguishes a product or service from others in the marketplace. Multiple Trademark owners may claim the right to the same term, as long as each owner operates in a different industry. Trademark ownership is location-based, and therefore must be obtained on a country-by-country basis. In search advertising, it is an unethical practice to use Trademark terms that do not belong to the advertiser within a campaign.
Traffic –
The number of users who visit a site.
Traffic Estimator –
Google’s AdWords Traffic Estimator is a tool that provides search volume, average cost-per-click, and position estimates for search advertising in Google’s search results and content network. It can be used to predict advertising performance before starting a campaign.
TrustRank –
This refers to passing trust scores from well-trusted sources through to other sites throughout the web.
Typepad –
A blogging service that hosts blogs and small businesses' web sites.
Unethical SEO –
Also called Unethical Search Engine Optimization: techniques that are considered unscrupulous and can result in sites being banned from the search engines. Examples include Keyword Stuffing, where a page consists of a long list of keywords; Hidden Text, where text on the page is the same color as the background, often consisting of lists of keywords placed there in the hope of tricking search engine spiders; and Doorway Pages, which are designed for search engine spiders in an attempt to trick them into indexing the web site at a higher position.
Unique Visitors –
These are a count of individual users who have accessed your web site. It should be noted that the “user session” metric does not yield an accurate Unique Visitor count, as multiple user sessions can be generated by one Unique Visitor.
Universe –
In the world of advertising, this term means the total population of the target audience.
URL –
Acronym for Uniform Resource Locator, used interchangeably with "web address". URLs can specify the location of a web page, an email address, or a file on an FTP server, among other things.
URL Rewrite –
A technique used to help make web site URLs more user and search engine friendly.
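A sketch of the idea in application code: mapping a friendly URL back to the dynamic one a server-side script expects. The pattern and script name are hypothetical; on a web server this is usually configured with rewrite rules (e.g. Apache's mod_rewrite) rather than written by hand:

```python
import re

# Hypothetical rule: /products/42/ -> /products.php?id=42
RULE = re.compile(r"^/products/(\d+)/?$")

def rewrite(path: str) -> str:
    """Rewrite a friendly URL to its dynamic equivalent, if a rule matches."""
    match = RULE.match(path)
    if match:
        return f"/products.php?id={match.group(1)}"
    return path  # no rule matched; leave the path unchanged

print(rewrite("/products/42/"))  # /products.php?id=42
print(rewrite("/about.html"))    # /about.html
```

The visitor and the search engine only ever see the clean, keyword-friendly URL; the dynamic URL with its stop characters stays internal.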
Usability –
This refers to how user-friendly a web site is, including the ease with which a user can perform an action or task through the user interface.
Usenet –
Bulletin board network featuring thousands of newsgroups.
User Agent –
The name of the browser or spider that is currently visiting a page. For example, “Googlebot/2.1 (+http://www.google.com/bot.html)”.
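A server can inspect the User-Agent string to log or special-case spider visits. A minimal sketch follows; the spider list is illustrative, and a plain substring check is simplistic (genuine Googlebot visits are typically confirmed with a reverse DNS lookup, since any client can claim any User-Agent):

```python
KNOWN_SPIDERS = ("Googlebot", "Slurp", "msnbot")

def spider_name(user_agent):
    """Return the matching spider name, or None for an ordinary browser."""
    for name in KNOWN_SPIDERS:
        if name.lower() in user_agent.lower():
            return name
    return None

print(spider_name("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # Googlebot
print(spider_name("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))        # None
```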
User Generated Content (UGC) –
Content created and published online by end-users. It comprises videos, podcasts, and posts on discussion groups, blogs, wikis and social media sites. UGC allows for a wider content-provider base and gives all users the chance to share their opinions online. Credibility and quality are the usual criticisms of UGC.
User Session –
This refers to an instance of an Internet user accessing your web site for a length of time and then leaving. Any number of pages may be accessed during a User Session. A session is considered finished once an arbitrarily chosen period of inactivity – typically 30 minutes – is exceeded.
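The 30-minute inactivity rule can be sketched as follows; the timestamps are hypothetical, given in minutes since midnight for one visitor:

```python
SESSION_TIMEOUT = 30  # minutes of inactivity that end a session

def count_sessions(request_times):
    """Count user sessions in a sorted list of request timestamps."""
    if not request_times:
        return 0
    sessions = 1
    for previous, current in zip(request_times, request_times[1:]):
        if current - previous > SESSION_TIMEOUT:
            sessions += 1  # the gap exceeded the timeout: a new session
    return sessions

# One visitor: requests at 9:00, 9:10, 9:20, then again at 11:00.
print(count_sessions([540, 550, 560, 660]))  # 2
```

This is also why sessions overstate unique visitors: the single visitor above generates two sessions.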
Visibility –
This refers to how well-placed your web site is in the search engines for relevant keyword searches. See Invisible Web Definition.
See User Session Definition
Web browser –
Software installed on the Internet user’s computer that allows him or her to view web pages. Popular Web Browsers include Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Safari and Opera.
Web Crawler –
A program or automated script that browses the World Wide Web in a methodical, automated manner. Also known as a "web robot" or "web spider".
Web Standards –
Web Standards are widely adopted guidelines for CSS, XHTML and other web technologies. They help ensure that web sites are accessible on a wide variety of platforms and to a wide range of users, including users with disabilities.
Web 2.0 –
This refers to the new generation of web-based services and communities characterised by participation, collaboration and sharing of information among users online. Web 2.0 applications include wikis, folksonomies, blogs and social networking sites, which encourage User Generated Content (UGC) and social interaction online.
Weblog –
Also referred to as a Blog, this is an online journal. Weblogs are something of a phenomenon and have become increasingly mainstream; blog search engine Technorati listed 71 million Weblogs as of May 2007. Weblog authors choose whether to blog openly or anonymously. Weblog entries are made regularly and chronologically but are displayed in reverse chronological order. The range of topics covered is endless, with some blogs focusing on subjects like travel, fashion or astrology, and others serving as personal online diaries. Weblogs typically are made up of posts, images, videos, comments and links. Popular blogging platforms include Blogger, WordPress, Typepad, LiveJournal and Dreamhost.
White Hat SEO –
Ethical SEO approved by the search engines.
Wikipedia –
Billed as one of the world's ten most visited websites, Wikipedia is now the largest and fastest-growing encyclopedia online. It was launched in January 2001 by Jimmy Wales and Larry Sanger. As of September 2007 the English edition of Wikipedia had a massive 2 million articles and 609 million words, making it around 15 times the size of the Encyclopedia Britannica. The site has more than 100 servers set up to deal with the 10,000 to 35,000 page requests every second. Wikipedia is multilingual and is currently available in 253 languages. Operated by the Wikimedia Foundation, a non-profit organization, it is collaboratively written by volunteers around the world. Contributing to Wikipedia entails that:
It is a peer-reviewed publication.
It does not require contributors' legal names.
Its contributions are supported by published and verifiable sources.
Due to its open nature, Wikipedia has been criticized as an easy target for trolls, vandals, internet marketers, advertisers, and even those with a political agenda to push. However, studies have shown that vandalism is usually short-lived and that the site is generally as accurate as other encyclopedias. Wikipedia uses MediaWiki as its software platform; MediaWiki is free open-source software built on a MySQL database.
WordPress –
WordPress is an excellent open-source web publishing system or content management system. Created primarily as blogging software, WordPress is written in PHP and backed by a MySQL database, making it ideal for managing content that is frequently updated. It is distributed under the GNU General Public License; WordPress 2.2.3 was released on Sept 8, 2007. WordPress is described by its creators as a "state-of-the-art semantic personal publishing platform with a focus on aesthetics, web standards, and usability". Its features include:
Integrated link management
Search friendly permalink structure
Wordtracker –
Wordtracker is a popular keyword research tool established by Andy Mindel and Mike Mindel in 1997. It is designed to assist search marketing professionals and webmasters in identifying important keywords and phrases relevant to their website. It provides detailed information on the number of searches, predicted number of daily searches, competing pages and KEI data, and the information can be broken down by search engine. Search terms are collected from Dogpile and Metacrawler, which represent around 1% of searches online. Due to its relatively small sample size, Wordtracker is vulnerable to competitor spamming, and errors in the database can be magnified: some terms can appear to be more popular than they really are, and others will be omitted altogether.
A software tool to check broken links.
XML –
Acronym for Extensible Markup Language (filename.xml). It is a markup language that allows the author to define the structure and properties of a document.
Yahoo! –
One of the oldest and most established directories, and also one of the top three search engines. It was founded in 1994 by Stanford PhD candidates David Filo and Jerry Yang as a way for them to keep track of their personal interests on the Internet.