Short for *binary digit*, a *bit* is a unit for measuring data. The term was first used by John Tukey, a leading statistician and adviser to five U.S. presidents, in a 1946 memo for Bell Labs. His coinage was the most natural-sounding of the portmanteaus proposed at the time, so it gained popularity and was shortly thereafter codified in *A Mathematical Theory of Communication* by mathematician Claude E. Shannon.

A single bit is the smallest unit of information on a machine and can hold only one of two values: 0 or 1. These values can represent a variety of binaries, including yes/no, on/off, true/false, and positive/negative. More meaningful information is obtained by combining consecutive bits into larger units such as bytes, kilobytes, megabytes, and gigabytes. (More on that below.)
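To make this concrete, here is a minimal Python sketch of how consecutive bits combine into a larger value; the bit pattern chosen is arbitrary:

```python
# Eight individual bits combined into one byte.
bits = [1, 0, 1, 1, 0, 0, 1, 0]  # most significant bit first (arbitrary example)

# Each position contributes a power of two, just as each decimal
# digit contributes a power of ten.
value = 0
for bit in bits:
    value = (value << 1) | bit

print(value)       # 178
print(bin(value))  # 0b10110010
```

Eight bits yield 2^8 = 256 distinct values, which is why a single byte can hold any number from 0 to 255.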

Computers are sometimes classified by the number of bits they can process simultaneously or by the number of bits used to represent addresses. These two values are not always the same, which leads to some confusion; for example, a 32-bit computer might have data registers that are 32 bits wide, or it might use 32 bits to identify each address in its memory.
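The address width directly limits how much memory a machine can identify. A quick sketch of the arithmetic for the 32-bit case mentioned above:

```python
# A 32-bit address can identify 2**32 distinct memory locations.
address_bits = 32
locations = 2 ** address_bits

print(locations)               # 4294967296 byte-addressable locations
print(locations // 1024 ** 3)  # 4 -> i.e. 4 GiB of addressable memory
```

This is why 32-bit systems famously top out at 4 GiB of directly addressable memory, while 64-bit addressing raises the ceiling astronomically.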

Graphics are often described by the number of bits used to represent each dot. A 1-bit image, for example, is monochrome, whereas an 8-bit image supports 256 colors or shades of gray, and a 24- or 32-bit graphic supports true color.
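The color counts above all follow the same rule: a pixel with *n* bits can take 2^n distinct values. A small illustration:

```python
# The number of distinct colors (or gray levels) a pixel can take
# is 2 raised to its bit depth.
def color_count(bit_depth):
    return 2 ** bit_depth

print(color_count(1))   # 2        (monochrome)
print(color_count(8))   # 256      (indexed color or grayscale)
print(color_count(24))  # 16777216 (true color)
```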

Bits are also used to describe how quickly data is transferred across a network, usually as kilobits per second (Kbps), megabits per second (Mbps), or in rare cases, gigabits per second (Gbps). Network speeds can range from 1 Mbps to 1,000+ Mbps (1+ Gbps), though what counts as a "fast" network is relative. To put this in context, a feature-length, high-definition film usually requires about 5 Mbps for uninterrupted streaming.
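A common stumbling block is that network rates are quoted in *bits* per second while file sizes are quoted in *bytes*, so a conversion factor of 8 is needed. A rough estimate, using a hypothetical 700 MB file and a 5 Mbps link:

```python
# Rough transfer-time estimate. Rates are in bits per second,
# file sizes in bytes, so multiply the size by 8 first.
file_size_mb = 700   # hypothetical file size, in megabytes
link_speed_mbps = 5  # link speed, in megabits per second

file_size_megabits = file_size_mb * 8
seconds = file_size_megabits / link_speed_mbps

print(seconds)  # 1120.0 -> roughly 19 minutes
```

Forgetting the factor of 8 makes a download look eight times faster than it really is, which is a frequent source of disappointment with advertised connection speeds.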

## Bit vs. byte

Although the two terms are sometimes confused, a byte is composed of 8 consecutive bits. Computer scientists rarely work with one bit at a time, so expressing data in bits is usually long-winded; it is simpler to use bytes, kilobytes (KB), megabytes (MB), or gigabytes (GB). Like other metric prefixes, the kilo- in kilobyte uses base 10 math: a kilobyte consists of 1,000 bytes.

Multiple bytes (and by extension, bits) can also be expressed using binary prefixes based on powers of 2, although this is much less common. These expressions include kibibytes (KiB), equal to 2^10 or 1,024 bytes; mebibytes (MiB), equal to 2^20 or 1,024^2 bytes; and gibibytes (GiB), equal to 2^30 or 1,024^3 bytes.
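The gap between the decimal and binary prefixes is small at the kilo scale but grows with each step, which is why a "1 TB" drive appears smaller once the operating system reports it in binary units. A quick comparison:

```python
# Decimal (SI) vs. binary (IEC) prefixes for byte counts.
KB, MB, GB = 10**3, 10**6, 10**9         # kilobyte, megabyte, gigabyte (base 10)
KiB, MiB, GiB = 2**10, 2**20, 2**30      # kibibyte, mebibyte, gibibyte (base 2)

print(KiB - KB)  # 24       -> 2.4% gap at the kilo scale
print(GiB - GB)  # 73741824 -> over 7% gap at the giga scale
```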