Bit Meaning & Definition
Short for binary digit, a bit is the basic unit of data in computing. The term was coined by John Tukey, a leading statistician and adviser to five U.S. presidents, in a 1947 memo for Bell Labs. Of the contractions proposed at the time, his was the most natural-sounding, so it gained popularity and was soon codified by mathematician Claude E. Shannon in his 1948 paper A Mathematical Theory of Communication.
A single bit is the smallest unit of information on a machine and can hold only one of two values: 0 or 1. These values can represent a variety of binary states, including yes/no, on/off, true/false, and positive/negative. More meaningful information is obtained by combining consecutive bits into larger units such as bytes, kilobytes, megabytes, and gigabytes. (More on that below.)
Computers are sometimes classified by the number of bits they can process at once or by the number of bits used to represent memory addresses. These two values are not always the same, which can cause confusion: a "32-bit" computer might have data registers that are 32 bits wide, use 32 bits to identify each address in its memory, or both.
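To see why address width matters, note that a machine using n bits per address can distinguish 2ⁿ locations. A minimal Python sketch (the one-byte-per-address assumption is the common convention, not something stated above):

```python
# Number of distinct addresses a 32-bit address can name.
address_bits = 32
locations = 2 ** address_bits

# Assuming one byte per address, this is the addressable memory in GiB.
gib = locations / 1024 ** 3

print(locations)  # 4294967296
print(gib)        # 4.0
```

This is why 32-bit systems top out at 4 GiB of directly addressable memory, regardless of how wide their data registers are.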
Graphics are often described by the number of bits used to represent each dot. A 1-bit image, for example, is monochrome; an 8-bit image supports 256 colors or shades of gray; and a 24- or 32-bit graphic supports true color.
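The color counts above all follow the same rule: n bits can represent 2ⁿ distinct values. A quick illustration in Python:

```python
# Colors representable at a given bit depth: 2 ** bits.
for bits in (1, 8, 24):
    print(f"{bits}-bit image: {2 ** bits:,} colors")

# 1-bit image: 2 colors
# 8-bit image: 256 colors
# 24-bit image: 16,777,216 colors
```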
Bits are also used to describe how quickly data is transferred across a network, usually as kilobits per second (Kbps), megabits per second (Mbps), or gigabits per second (Gbps). Network speeds can range from 1 Mbps to 1,000+ Mbps (1+ Gbps), though what counts as a "fast" network is entirely relative. To put this in context, a feature-length, high-definition film usually requires about 5 Mbps for uninterrupted streaming.
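Because network speeds are quoted in bits but file sizes in bytes, converting between the two means multiplying or dividing by 8. A hedged sketch (the 4,500 MB film size and 25 Mbps link are illustrative figures, not from the definition above):

```python
# Estimate how long a file takes to transfer over a link,
# converting megabytes (file size) to megabits (link speed).
def transfer_seconds(file_megabytes, link_mbps):
    megabits = file_megabytes * 8  # 8 bits per byte
    return megabits / link_mbps

# A hypothetical 4,500 MB HD film over a 25 Mbps connection:
print(transfer_seconds(4500, 25))  # 1440.0 seconds, i.e. 24 minutes
```

The factor of 8 is a frequent source of confusion: a "100 Mbps" connection moves at most 12.5 megabytes per second.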
Bit vs. byte
Although the two terms are sometimes confused, a byte is composed of 8 consecutive bits. Computer scientists rarely work with one bit at a time, so expressing data in bits is usually long-winded; quantities are much simpler as bytes, kilobytes (KB), megabytes (MB), or gigabytes (GB). Like other metric prefixes, kilo- uses base-10 math, so a kilobyte consists of 1,000 bytes.
Multiple bytes (and, by extension, bits) can also be expressed using binary prefixes based on powers of 2, although this is much less common. These include the kibibyte (KiB), which is 2¹⁰ or 1,024 bytes; the mebibyte (MiB), which is 2²⁰ or 1,024² bytes; and the gibibyte (GiB), which is 2³⁰ or 1,024³ bytes.
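The gap between decimal and binary prefixes can be made concrete with a short sketch:

```python
# Decimal (SI) prefixes use powers of 1,000; binary (IEC) prefixes
# use powers of 1,024.
decimal = {"KB": 1000 ** 1, "MB": 1000 ** 2, "GB": 1000 ** 3}
binary = {"KiB": 1024 ** 1, "MiB": 1024 ** 2, "GiB": 1024 ** 3}

print(decimal["GB"])   # 1000000000 bytes
print(binary["GiB"])   # 1073741824 bytes
```

The difference grows with size, which is why a drive marketed as "1 TB" (10¹² bytes) shows up as roughly 931 GiB in an operating system that reports sizes in binary units.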