
Hamming code

(ham´ing kōd) (n.) In digital data transmissions, a method of error detection and correction in which every string of four bits is replaced with a string of seven bits. Three of the seven bits are parity-checking bits (conventionally placed at positions 1, 2, and 4 of the codeword) that the receiving device uses to detect and correct single-bit errors.

Hamming code can correct any single-bit error; the extended form, which adds one more overall parity bit, can also detect (but not correct) double-bit errors. This method of error correction is best suited for situations in which randomly occurring errors are likely, not for errors that come in bursts.
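A minimal sketch of the (7,4) scheme described above, with parity bits at positions 1, 2, and 4 of the codeword (function names are illustrative, not from any standard library):

```python
def hamming74_encode(data):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(code):
    """Correct up to one flipped bit and return the 4 data bits."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4   # 1-indexed error position; 0 means no error
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

# Flip any single bit of the codeword; the decoder still recovers the data.
data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[5] ^= 1                          # inject a single-bit error
assert hamming74_decode(code) == data
```

The recomputed parity checks form a binary "syndrome" that points directly at the position of the flipped bit, which is why the parity bits sit at the power-of-two positions.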

Richard Hamming, a theorist at Bell Telephone Laboratories, developed the Hamming code method of error correction in the late 1940s and published it in 1950.
