

A program that automatically fetches Web pages. Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web. Another term for these programs is webcrawler.

Because most Web pages contain links to other pages, a spider can start almost anywhere. As soon as it sees a link to another page, it goes off and fetches it. Large search engines, such as AltaVista, run many spiders working in parallel.
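A spider's core loop is just that: fetch a page, pull out its links, and queue each link to be fetched in turn. The sketch below illustrates the idea in Python using only the standard library. It is a minimal, single-threaded illustration, not any particular search engine's crawler; the seed URL, the page limit, and the LinkParser helper are assumptions made for the example, and real spiders add robots.txt handling, politeness delays, deduplication at scale, and many parallel workers.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects the href value of every <a> tag on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, then queue every link it contains."""
    queue = deque([seed_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception as err:
            print("skipped", url, "-", err)
            continue
        parser = LinkParser()
        parser.feed(html)
        print("fetched", url, "-", len(parser.links), "links")
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links against the current page
            if urlparse(absolute).scheme in ("http", "https"):
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")  # hypothetical seed URL

Starting from a single seed page, this loop alone will wander from link to link across the Web, which is why the program is said to crawl.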

Also see How Web Search Engines Work in the Did You Know...? section of Webopedia.
