SEO Dictionary: 85 Search Engine Optimization Phrases Defined
From keyword analysis to backlinks and Google search engine algorithm updates, our search engine optimization glossary lists 85 SEO terms you need to know.
Search engine optimization (SEO) refers to the techniques and strategies used by website owners and content developers to increase the number of visitors to a website by obtaining a high-ranking placement on the search results pages of a search engine.
From keyword analysis to backlinks and Google search engine algorithm updates, this SEO glossary lists our many SEO-related terms and definitions in one convenient place. Simply click any link in the brief descriptions below to read Webopedia's full, in-depth tech term definition or related article.
SEO Dictionary Checklist:
- Getting Started: Basic Web Terms to Know (5)
- General Search Engine Optimization (13)
- Search Engines & Algorithms (16)
- Keywords and Keyphrases (11)
- Linking & Backlinks (9)
- SEO and Content (15)
- SEO Tools and Analytics (12)
- Related Webopedia Reference Articles (4)
5 Key Terms to Know
search engine optimization (SEO)
Search engine optimization is a methodology of strategies, techniques and tactics used to increase the number of visitors to a website by obtaining a high-ranking placement in the search results page of a search engine. SEO helps to ensure that a website is accessible to a search engine and is typically a set of white hat best practices that webmasters and web content producers follow to help them achieve a better ranking in search engine results.
black hat SEO
Black hat SEO refers to the use of aggressive SEO strategies, techniques and tactics that focus only on search engines and not a human audience. Black hat SEO typically does not obey most search engines' guidelines.
Facebook Graph Search optimization (GSO)
GSO is a type of search engine optimization that focuses on optimizing an individual's or company's content to improve search results on Facebook's Graph Search.
organic SEO
Organic SEO is the phrase used to describe processes to obtain a natural placement on organic search engine results pages.
rank
Rank refers to where a website or webpage is ranked within search engine results. For example, if your website is about microphones, when a person queries "microphones" in a search engine, your ranking indicates where in the search results your page is listed (e.g., within the top 5 results, on the first page, the 30th page and so on).
search engine marketing (SEM)
SEM is a type of Internet marketing associated with the researching, submitting and positioning of a website within search engines to achieve maximum visibility. SEM involves things such as search engine optimization (SEO), keyword research, competitive analysis, paid listings and other search engine services that will increase search traffic to your site.
Did You Know...? The Difference Between SEM and SEO: SEM is a broader term than SEO. Where SEO aims to provide better organic search results, SEM uses the search engines to advertise your website or business to internet customers and send more targeted traffic to your website.
search engine services
Search engine services describes a collection of services offered by a third-party vendor that are designed to help organizations and businesses obtain exposure and a better search engine ranking. Typical services include search engine optimization (SEO) and search engine marketing (SEM) services, as well as website promotion and optimization.
search engine results page (SERP)
The search engine results page (SERP) is the webpage that a search engine returns with the results of its search. The major search engines typically display three kinds of listings on their SERPs: listings that have been indexed by the search engine's spider, listings that have been indexed into the search engine's directory by a human, and listings that are paid to appear in the search engine's results.
SEO services (SEO provider, SEO agency)
An SEO service provider, typically an individual, group or an agency, uses search engine optimization techniques to obtain high-ranking placements in organic search results for clients.
social media optimization (SMO)
Social media optimization is the process of increasing the awareness of a product, brand or event by using a number of social media outlets and communities to generate viral publicity. Some SEO providers will offer SMO in addition to more traditional SEO services.
SEO PR is a combination of the words search engine optimization (SEO) and public relations (PR). It is a process of writing press releases and other marketing papers for search engine optimization purposes to generate leads and website traffic.
SEO spam generally means SEO manipulation techniques that are used to increase search engine ranking, but violate the search engine's Terms of Service.
white hat SEO
White hat SEO refers to the use of optimization strategies, techniques and tactics that focus on a human audience as opposed to search engines, and that completely follow search engine rules and policies.
An algorithm is a formula or set of steps for solving a particular problem. It requires a set of rules that must be unambiguous and have a clear stopping point. Algorithms can be expressed in any language. Search engine algorithms are typically used to elevate high-quality sites and web pages to the top of the organic search results while lowering, or penalizing, the rank of lower-quality sites and pages. The goal of algorithm changes is to improve the search engine user's experience. See Google Hummingbird, Panda, PayDay Loan and other Google algorithm notes below for additional information.
Search engines are programs that search documents for specified keywords and return a list of the documents where the keywords were found. A search engine is really a general class of programs; however, the term is often used specifically to describe systems like Google, Bing and Yahoo! Search that enable users to search for documents on the World Wide Web.
Google Hummingbird is a new algorithm project that seeks to improve the Google search engine experience for users by going beyond just keyword focus.
Google Panda is a series of on-going algorithm updates and data refreshes for the Google search engine to improve the value of search query results.
Google Panda 4.0 Update: The fourth major Google Panda algorithm update and data refresh for the internal search algorithm used in the Google search engine. The Google Panda 4.0 update was rolled out in late May 2014 and is believed to have affected more than 7% of all English-language search queries.
Google PayDay Loan
Google Payday Loan is an algorithm update that filters out spammy websites that use black hat techniques to boost rankings for heavily trafficked Google keyword queries.
Google Penguin refers to a set of algorithm updates for the Google search engine to help enhance the value of search query results for users.
Google Pigeon is a codename that refers to an algorithm update for Google's local search results that debuted on July 24, 2014.
Manual submission means adding your site URL and details to search engines individually by hand, rather than using a link submission service or software.
A natural search is one where results are returned based on the natural indexing of the website, as opposed to those that are returned based on paid advertising and editorial changes made by the search engine itself.
A spider is a program that automatically fetches webpages. Spiders are used to feed pages to search engines. Another term for these programs is web crawler.
Paid search is a type of contextual advertising in which site owners pay a fee to have their website displayed in top placements on search engine results pages.
Robots.txt is the common name of a text file that is uploaded to a website's root directory, where crawlers fetch it automatically. The robots.txt file is used to provide instructions about the website to crawler bots and spiders.
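A minimal robots.txt might look like the following sketch (the domain and directory name are illustrative, not taken from any real site):

```txt
# Apply to all crawlers: keep them out of the /private/ directory
# and point them at the site's XML sitemap.
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved spiders honor it, but it does not enforce access control.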
Sitemap is a hierarchical visual model of the pages of a website. Site maps help users navigate through a site that has more than one page by showing the user a diagram of the entire site's contents. A sitemap can make it easier for a search engine spider to find all a site's pages.
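In addition to the visual sitemap for human visitors, sites commonly publish an XML sitemap specifically for search engine spiders. A minimal sketch following the sitemaps.org protocol (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> is optional. -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-07-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
  </url>
</urlset>
```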
SERP - search engine results page
Search engine results page (SERP) is the page that a search engine returns with the results of a search query; see the full search engine results page definition above for the three kinds of listings that major search engines typically display.
Sitewide refers to the linking and navigation structure that is deployed across an entire website.
Recommended Reading: Webopedia's Search Engines Category includes hundreds of definitions related to search engine technology.
Hidden keywords are keywords or phrases that are placed in the HTML source code and are not seen by those visiting the webpage.
Keyword stuffing is a technique used to overload keywords onto a webpage so that search engines will read the page as being relevant in a web search. Search engines often penalize a site if keyword stuffing is discovered and offending pages may be removed from search results.
The Long Tail search
The Long Tail search expands on the theory that media and entertainment industries should serve niche markets in addition to pushing mainstream popular products. In search queries, it is important to tap the main and most common search terms (the head of the list), but also all the keywords and queries that fall along the tail of the search: the lesser-searched-for associated and related keywords.
A keyphrase is a phrase or search term made up of multiple keywords, or a specific combination of keywords, that a user would enter into a search engine.
Keyphrase movement is the number of positions a webpage moves up or down in search engine results pages (SERPs).
Keyphrase position is the position a webpage occupies on a search engine results page (SERP) when a user searches for a specific keyphrase. The first page of search results represents the top 10 keyphrase positions.
A keyword is a word used by a search engine in its search for relevant webpages.
Keyword density is the measurement in percentage of the number of times a particular keyword or phrase appears compared to the total number of words in a page.
Keyword frequency is the number of times a keyword or phrase appears on a page, or used throughout an entire site.
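As a rough sketch of how these two measurements relate, keyword density divides a keyword's occurrence count (its frequency) by the page's total word count. The Python snippet below uses simple whitespace tokenization purely for illustration; real SEO tools parse HTML and handle punctuation more carefully.

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()          # naive whitespace tokenization
    frequency = words.count(keyword.lower())  # keyword frequency on the page
    return 100 * frequency / len(words)

# Hypothetical page copy: "microphones" appears 2 times out of 9 words.
page = "microphones for podcasting: the best microphones under 100 dollars"
print(round(keyword_density(page, "microphones"), 1))  # → 22.2
```

A density far above a few percent is a common sign of keyword stuffing, which search engines may penalize.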
Keyword prominence is the prominent placement of keywords or phrases within a webpage. Prominent placement may be in the page header, meta tags, opening paragraph, or at the start of a sentence.
A keyword search is a type of search that looks for matching documents that contain one or more words specified by the user.
backlink (inbound link)
A backlink is a hyperlink that links from an external page back to your own site. Also called an inbound link, these links are important in determining the popularity (or importance) of your site.
The hyperlink is an element in an electronic document that links to another place in the same document or to an entirely different document. Typically, you click on the hyperlink to follow the link. Hyperlinks are the most essential ingredient of all hypertext systems, including the World Wide Web.
Link building is the process of exchanging links with other websites to increase the number and quality of your own site's backlinks.
Link checker is a software tool or online service that is used to verify and check for broken hyperlinks within your site.
Link farming is the process of exchanging reciprocal links with large numbers of websites. The idea behind link farming is to increase the number of sites that link to you.
Link popularity describes the value of a website, where the measurement is based on the quantity of quality inbound links to your pages.
A quality backlink is a backlink that links to your Web site using your keywords or keyphrase, and also appears on a site that has the same theme or similar content topic as your own site. It may also be a link from a website with higher page or domain authority than your page, or a link from a .gov or .edu site.
A reciprocal link is an agreement between two site owners to provide a hyperlink within their own website to each other's website.
A deep link is a hyperlink either on a Web page or in the results of a search engine query to a page on a website other than the site's home page. Typically, a site's home page is the top page in the site's hierarchy, and any page other than that is considered "deep."
DID YOU KNOW...? The phrase "link love" is a slang term used to describe reciprocal links.
Duplicate content is content (or text) that has been copied or reused from other Web pages. Duplicate content is often used to help boost keyword density; however, some search engines, including Google, filter duplicate text and may penalize your site with a lower keyphrase position when you use it. Duplicate content is not always intentional copying. For example, printer-friendly or local versions of pages may create duplicate content.
Fresh content is any content that is new or dynamic in nature and gives people a reason to visit your website. Many SEO experts believe that using fresh content on your Web site will help your page obtain better placement on search engine results page (SERP). Some search engines may crawl a site more frequently as content is changed and updated.
Invisible text, or hidden text, is text made the same color as the page's background, rendering it invisible to the user unless the user highlights it. This technique is also called invisible keyword stuffing.
The alt attribute is an HTML attribute specified in the IMG tag to provide alternate text when an image on a page cannot be displayed.
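For example (the file name and descriptive text below are illustrative):

```html
<!-- The alt text is shown, and read by screen readers and search
     engine spiders, when the image itself cannot be displayed. -->
<img src="studio-microphone.jpg" alt="Condenser studio microphone on a boom arm">
```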
Spamdexing is the practice of using improper SEO tactics in an attempt to manipulate or elevate the placement of a web site on a search engine's search results pages. Also referred to as search engine spam or Black Hat SEO, spamdexing can include activities such as keyword stuffing, link spamming, duplication of copyrighted content, page hijacking and more.
A doorway page, also referred to as a jump page, an entry page or a bridge page, is a page designed specifically for the purpose of gaining high placement in a search engine's rankings.
Web authoring is a category of software that enables the user to develop a Web site in a desktop publishing format. The software will generate the required HTML coding for the layout of the Web pages based on what the user designs.
Webspam (also referred to as search spam) is a phrase used to describe webpages that are designed to "spam Google search results" using SEO tactics that are against Google's publisher guidelines. In April 2012, Google announced its Penguin-codenamed algorithm update would identify sites using aggressive webspam tactics.
A special HTML tag that provides information about a Web page. Unlike other HTML tags, meta tags do not affect how the page is displayed. Instead, they provide information such as who created the page, how often it is updated and what the page is about. Many search engines use this information when building their indices.
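As a sketch, meta tags sit inside a page's <head> element; the names and values below are illustrative, not prescriptive:

```html
<head>
  <title>Microphone Buying Guide</title>
  <!-- Meta tags describe the page but do not change how it renders. -->
  <meta name="description" content="How to choose a microphone for podcasting.">
  <meta name="author" content="Jane Doe">
  <meta name="robots" content="index, follow">
</head>
```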
Premium content is Web site content that is available for a fee. Premium content can include e-books, articles or other content that is offered to readers on a subscription basis. Premium content is a monetization strategy.
Sticky refers to a site's ability to keep visitors on the site once they have navigated there or encourage the visitor to return frequently (i.e., the visitors "stick" to the site). A site's stickiness depends on the content of the site that encourages the visitors to remain there but is not necessarily what the visitors went to the site looking for. For example, in addition to original content that may be the main reason for visits, a site may add features (i.e. social interaction, forums or free downloads) to make the site more appealing.
The term video SEO is used to describe optimizing video content for search engine traffic. The goal when working with video SEO is to have your video content appear in video search engines as well as in the organic search results for major search engines, with traffic being directed to your site and not to your video hosting provider.
A web beacon is a transparent graphic image that is placed on a site or in an email and used to monitor the behavior of the user.
content management system (CMS) / web content management (WCM)
A content management system (CMS), also called web content management (WCM), is software or a group or suite of applications and tools that enable an organization to seamlessly create, edit, review and publish electronic text. Many content management systems offer a Web-based GUI, enabling publishers to access the CMS online using only a Web browser.
absolute unique visitor
Absolute unique visitor is a "visitor type" report that will count each visitor to your website only once during the date range you have selected.
average page depth
Average page depth (or depth of visit) is the measurement of the number of pages on your Web site that a visitor views during a single browser session.
average time on site
Average time on site is a type of visitor report that provides data on the amount of time (in minutes or seconds) visitors have spent on your website.
Basic metrics is the term used to refer to the most basic information needed to understand a site's consumption. Basic metrics consist of the following data: visits, bounces, page views, average time on site and new visits.
The conversion rate is the percentage of visitors who take a desired action (e.g., make a purchase, download an e-book or click an ad).
Google Analytics is a free service from Google that enables webmasters and site owners to freely access web analytics data. Google Analytics tracks visitors through your site and also tracks the performance of your marketing campaigns.
Google Trends is a free Google service that provides charts that show how often words, phrases and topics have been searched for over time. Using Google Trends you can compare up to five topics at one time and also see how often those topics have been mentioned in news stories and find out in which geographic regions the topics have been searched for the most.
Top content provides details on the most viewed pages (top performing content) on your website. These pages are responsible for driving the most page views on your website.
Traffic sources is a report that provides an overview of the different kinds of sources that send traffic to your site, for example direct traffic (clicks from bookmarks or visitors who know your URL) or search engines.
When tracking the amount of traffic on a site, a unique visitor refers to a distinct individual who visits a Web site within a specified period of time, counted only once no matter how many visits that individual makes.
User session is the session of activity that a user with a unique IP address spends on a Web site during a specified period of time. The number of user sessions on a site is used in measuring the amount of traffic a Web site gets.
Web analytics is a generic term meaning the study of the impact of a website on its users. Publishers often use Web analytics software to measure such concrete details as how many people visited their site, how many of those visitors were unique visitors, how they came to the site (i.e., if they followed a link to get to the site or came there directly), what keywords they searched with on the site's search engine, how long they stayed on a given page or on the entire site and what links they clicked on and when they left the site.
Recommended Reading: From A/B testing to banner ads and affiliate marketing, Webopedia's Online Advertising Dictionary offers hundreds of advertising specific definitions.
- Slideshow: 5 Easy Editorial SEO Tips to Boost Traffic
- Web Search Engines & Directories
- The Difference Between SEM and SEO
- Web Server Error Messages