Web Crawler

A web crawler is a bot that systematically browses web pages and indexes their content so that users can find it in later searches. The most prominent crawlers are operated by major search engines: Google runs multiple web crawling bots, and others include Yahoo's bot and Chinese tech corporation Baidu's bot. A web crawler travels between web pages primarily by following both external and internal links. Web crawlers are also referred to as spiders.
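The link-following traversal described above can be sketched in a few lines of Python. This is a simplified illustration, not any search engine's actual implementation; the function and variable names are assumptions for the example, and the `fetch` callable stands in for a real HTTP request so the logic can run offline.

```python
# Minimal sketch of a crawler's breadth-first link traversal (standard
# library only; names here are illustrative, not a real crawler's API).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url):
    """Return absolute URLs for every link found in a page's HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, link) for link in parser.links]


def crawl_order(fetch, seed_url, limit=100):
    """Visit pages breadth-first: record a page, then queue its links.

    `fetch` is a callable returning a page's HTML, so the traversal
    can be exercised without touching the network.
    """
    seen, queue, visited = {seed_url}, deque([seed_url]), []
    while queue and len(visited) < limit:
        url = queue.popleft()
        visited.append(url)  # a real crawler would index the page here
        for link in extract_links(fetch(url), url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return visited


# Toy "web" of three pages, including one external link.
pages = {
    "https://example.com/": '<a href="/a">A</a><a href="https://other.org/">O</a>',
    "https://example.com/a": '<a href="/">home</a>',
    "https://other.org/": "",
}
order = crawl_order(lambda url: pages[url], "https://example.com/")
```

Starting from the seed page, the crawler discovers the internal page `/a` and the external site `other.org`, and revisits nothing it has already seen.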

If a web domain owner wants their site to be found in searches, they must allow web crawling, because search engines will only present web pages that they have discovered through crawling. As a web crawler moves through a page, it indexes, or records, the relevant information on the page (often all of it) so that the search engine can surface the page when a user submits a query. Not all of the Internet is indexed, and researchers aren't sure how much of it is; web crawlers can only access public web pages, not private ones. A website owner can also place a robots.txt file at the site's root directory to tell bots which pages should not be crawled, or use "noindex" meta tags in a page's HTML.
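Python's standard library includes a parser for robots.txt rules, which shows how a well-behaved crawler decides whether a page is off-limits. The rules and URLs below are made up for the example.

```python
# Checking robots.txt rules with the standard library's urllib.robotparser.
# The Disallow rule and example.com URLs are illustrative.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

rp.can_fetch("*", "https://example.com/private/page.html")  # blocked -> False
rp.can_fetch("*", "https://example.com/index.html")         # allowed -> True
```

A polite crawler fetches `https://<site>/robots.txt` first and calls `can_fetch()` before requesting each page; note that robots.txt is advisory, and malicious bots are free to ignore it.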

Web crawlers and SEO

Web crawlers find content for search engines, and what they gather from a web page affects that page's search engine optimization (SEO) ranking. A page that contains many keywords and relevant links when it is indexed will display more prominently in search results. Placing keywords in important locations, such as headings and metadata, also gives a web page better SEO visibility. Web crawlers pay attention not only to the plain text on a web page but also to its metadata and to the way users respond to the page, so it's important for a website to choose accurate metadata – and to publish content that answers relevant search queries – in order to be displayed accurately in a search engine.
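A crawler's reading of a page's metadata can be sketched with the standard library's HTML parser. The example below pulls out the `<title>` and the description meta tag, two fields commonly used to build a search result snippet; the class name and sample HTML are assumptions for illustration.

```python
# Sketch of reading a page's title and meta description, as a crawler
# might when indexing a page (illustrative names and sample HTML).
from html.parser import HTMLParser


class MetaReader(HTMLParser):
    """Records the <title> text and <meta name="description"> content."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


page = """<html><head>
<title>Web Crawler Basics</title>
<meta name="description" content="How crawler bots index web pages.">
</head><body>...</body></html>"""

reader = MetaReader()
reader.feed(page)
```

If the description is missing or inaccurate, a search engine has less to work with when summarizing the page, which is one reason accurate metadata matters for SEO.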

Crawler bots have also been used for malicious purposes, such as spreading false content or harvesting user information, as well as gauging and influencing public opinion.






Jenna Phipps
Jenna Phipps is a writer for Webopedia.com, Enterprise Storage Forum, and CIO Insight. She covers data storage systems and data management, information technology security, and enterprise software solutions.
