Moore’s Law

Moore’s Law posits that the number of transistors that can fit on an integrated circuit doubles roughly every two years. As transistor counts double, computing power increases while the cost of that computing power is halved.
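The doubling rule can be expressed as count(t) = count(0) × 2^(t/2), with t in years. A minimal sketch of that projection, assuming a two-year doubling period (the 1971 Intel 4004 transistor count is used purely as an illustrative starting figure):

```python
def projected_transistors(initial_count, years, doubling_period=2):
    """Project a transistor count under Moore's Law-style doubling.

    initial_count   -- transistors at year 0
    years           -- elapsed years
    doubling_period -- years per doubling (2 for Moore's Law)
    """
    return initial_count * 2 ** (years / doubling_period)

# Starting from ~2,300 transistors (Intel 4004, 1971) and doubling
# every two years, 40 years gives 20 doublings:
print(round(projected_transistors(2300, 40)))  # 2411724800, i.e. ~2.4 billion
```

Real chips track this curve only approximately, but the exponential form is why even small changes in the doubling period compound into enormous differences over decades.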

History of Moore’s Law

In 1965, Gordon Moore, then director of Research and Development at Fairchild Semiconductor, published an article titled “Cramming more components onto integrated circuits,” in which he predicted that it would be possible to fit as many as 65,000 components on a single semiconductor within the next 10 years.

In the Electronics magazine article, Moore outlined several factors to support his prediction, such as the scaling capability of metal-oxide-semiconductors and the emerging trends in the manufacturing of chips.

In 1975, Carver Mead, a professor at the California Institute of Technology, popularized Moore’s prediction as “Moore’s Law,” and it became widely accepted across the semiconductor manufacturing industry. The miniaturization of semiconductors without a corresponding increase in power consumption is the core idea behind Moore’s Law.

[Image: Moore’s Law in action] Processors on an Intel 45nm Hafnium-based High-k Metal Gate “Penryn” wafer, photographed alongside an original Intel Pentium processor die. Using an entirely new transistor formula, the Penryn processors incorporate 410 million transistors per dual-core chip and 820 million per quad-core chip; the original Intel Pentium processor had only 3.1 million transistors.

Moore predicted that this trend would continue until the mid-2020s. The process of adding transistors involves shrinking each transistor to roughly half its previous size to create room, rather than enlarging the circuit itself. This is why computer engineers have been able to create devices over time that are simultaneously smaller and more powerful than their predecessors.
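The area arithmetic behind that shrinking can be illustrated with a toy model (the die and transistor sizes below are made-up illustrative numbers, and wiring and other overhead are ignored):

```python
def transistors_per_die(die_area_mm2, transistor_area_um2):
    """How many transistors of a given footprint fit on a die (toy model)."""
    um2_per_mm2 = 1_000_000  # 1 mm^2 = 10^6 um^2
    return die_area_mm2 * um2_per_mm2 / transistor_area_um2

base = transistors_per_die(100, 1.0)     # 100 mm^2 die, 1 um^2 transistors
shrunk = transistors_per_die(100, 0.5)   # same die, transistors half the area
print(shrunk / base)  # 2.0 -- halving transistor area doubles the count
```

The point of the model is simply that the die stays the same size: the gains come entirely from shrinking the transistors, which is why devices get smaller and more powerful at once.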

Moore’s Law in practice 

In the real world, every facet of society, from individuals to entire industries, benefits from Moore’s Law, as all digital devices, including computers, smartphones, tablets, and cameras, require semiconductors to function.

Thus, as transistors and chips become smaller, more efficient, and cheaper, computers and other devices automatically become smaller, faster, and cheaper as well. Likewise, consumer demand for faster devices with more features is largely built on the advances the industry has made toward smaller, faster, cheaper chips.

The end of Moore’s Law and what the future might hold

Though the transistor manufacturing industry has been driven by Moore’s Law for the last half-century, the golden rule is expected to end soon. As circuits shrink further, transistors run hotter, and eventually cooling them would require more energy than the amount of energy that passes through them.

In 2007, Moore himself admitted that there is a fundamental limit to making things smaller, because materials are ultimately built from atoms.


Similar observations and ideas

The idea that one quantity’s size, speed, or number rises as another falls isn’t unique to computer processing. Similar observations related to Moore’s Law, based on the speed, density, cost, and size of components, exist throughout the tech and science realms.

Dennard Scaling: Dennard scaling, or MOSFET scaling, formulated by Robert H. Dennard in 1974, underpins Moore’s Law and states that the power density of a transistor stays constant as the transistor becomes smaller.

Network Capacity: According to Butter’s Law of network capacity, the amount of data that can pass through an optical fiber doubles every nine months, increasing network capacity and thereby reducing its cost.

Wirth’s Law: Wirth’s Law, the contrary of Moore’s Law, states that software gets slower even as hardware gets faster.

Carlson Curve: The Carlson Curve in biotechnology observes that DNA sequencing productivity doubles at least as fast as transistor counts; it is therefore considered the biotech equivalent of Moore’s Law.

Eroom’s Law: In contrast to Moore’s Law, Eroom’s Law (Moore spelled backwards) states that the cost of developing a new drug roughly doubles every nine years.

Haitz’s Law: According to Haitz’s Law, the cost per lumen of LED lighting falls by a factor of 10 each decade, while the amount of light generated per LED package increases by a factor of 20.

Edholm’s Law: Proposed by Phil Edholm, Edholm’s Law states that the bandwidth of telecommunication networks doubles every 18 months while its cost diminishes.
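The doubling periods quoted in the list above compound very differently over the same stretch of time; a short sketch makes the contrast concrete:

```python
def growth_factor(years, doubling_period_years):
    """Total multiplicative growth after `years` under a given doubling period."""
    return 2 ** (years / doubling_period_years)

# Over the same six years:
print(growth_factor(6, 2.0))   # 24-month doubling (Moore's Law):  8x
print(growth_factor(6, 1.5))   # 18-month doubling (Edholm's Law): 16x
print(growth_factor(6, 0.75))  # 9-month doubling (Butter's Law):  256x
```

This is why a seemingly small difference in doubling period, nine months versus two years, translates into orders of magnitude over a decade.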

This article was reviewed and updated in February 2022 by Siji Roy.
Vangie Beal
Vangie Beal is a freelance business and technology writer who has covered Internet technologies and online business since the late ’90s.
