
Hamming code

(ham´ing kōd) (n.) In digital data transmission, a method of error detection and correction in which every string of four data bits is encoded as a string of seven bits. The three added bits are parity-checking bits (in the classic layout they occupy positions 1, 2 and 4 of the codeword) that the receiving device uses to check for and correct errors.

A Hamming code can detect any double-bit error or correct any single-bit error, but it cannot reliably do both at once for the same received word. This method of error correction is best suited to channels where errors occur randomly and independently, not to channels where errors come in bursts.
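As an illustration (not part of the original entry), the following Python sketch implements the (7,4) scheme described above, assuming the classic bit layout in which the parity bits sit at positions 1, 2 and 4 and each parity bit covers the positions whose 1-indexed binary representation includes that power of two:

```python
def hamming_encode(d1, d2, d3, d4):
    """Encode four data bits as a seven-bit Hamming codeword."""
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(code):
    """Correct at most one flipped bit and return the four data bits."""
    c = list(code)
    # Recompute each parity check; the failing checks, read as a
    # binary number, give the 1-indexed position of the flipped bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]  # data bits d1..d4
```

For example, flipping any single bit of `hamming_encode(1, 0, 1, 1)` and passing the result to `hamming_correct` still recovers `[1, 0, 1, 1]`, because the three parity checks pinpoint the damaged position.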

Richard Hamming, a researcher at Bell Telephone Laboratories, developed the Hamming code method of error correction in the late 1940s and published it in 1950.






