
Hamming code

(ham´ing kōd) (n.) In digital data transmissions, a method of error detection and correction in which every string of four data bits is encoded as a string of seven bits. The three added bits are parity-checking bits, conventionally placed at positions 1, 2 and 4 of the codeword, that the receiving device uses to check for and correct errors.

Hamming code can detect double-bit errors but correct only single-bit errors. This method of error correction is best suited to channels where errors occur randomly and independently, not in bursts.
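The scheme above can be sketched in Python as a minimal Hamming(7,4) encoder and decoder. This uses the standard layout with parity bits at positions 1, 2 and 4; the function names are illustrative, not part of any library:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    # Each syndrome bit re-checks one parity group.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    # Read together, the syndrome bits spell out the 1-indexed
    # position of the flipped bit (0 means no error detected).
    syndrome = 4 * s3 + 2 * s2 + s1
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# Flip one bit in transit; the decoder still recovers the data.
word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1
assert hamming74_decode(word) == [1, 0, 1, 1]
```

Because the three parity groups overlap in a fixed pattern, any single flipped bit produces a unique nonzero syndrome that points directly at its position; a double flip yields a nonzero syndrome too, but one that points at the wrong bit, which is why two errors can only be detected, not corrected.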

Richard Hamming, a mathematician at Bell Telephone Laboratories, developed the Hamming code method of error correction in the late 1940s and published it in 1950.






