Moore’s Law

Moore’s Law is an observation about the growth of the number of transistors on integrated circuits. In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on an integrated circuit had doubled every year since the integrated circuit’s invention, and he predicted that the trend would continue for at least another decade. Adding transistors is achieved by shrinking each transistor to roughly half its previous size to create room, rather than by increasing the size of the circuit itself. This is why computer engineers have been able to create devices over time that are simultaneously smaller and more powerful than their predecessors.

Fellow leaders in the computer science community began to refer to Moore’s observation as a “law,” despite the limited data behind the original prediction. In 1975, Moore revised his prediction to state that the transistor count would double every two years, noting a slight slowdown in the rate of doubling. Moore’s insight has held true over many decades and has enabled nearly every subsequent innovation in the computer industry.
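
The doubling rule can be written as a simple exponential: a starting transistor count multiplied by 2 raised to the number of elapsed years divided by the doubling period. The short Python sketch below illustrates the idea; the starting figure of roughly 2,300 transistors (the 1971 Intel 4004), the 40-year horizon, and the function name are purely illustrative.

    def projected_transistors(start_count, years, doubling_period=2):
        # Project a transistor count forward under a fixed doubling period (in years).
        return start_count * 2 ** (years / doubling_period)

    # Illustrative: start from roughly 2,300 transistors (the 1971 Intel 4004)
    # and project 40 years ahead with a two-year doubling period.
    print(round(projected_transistors(2300, 40)))  # about 2.4 billion

A projection like this is only a trend line, not a physical guarantee, which is why the slowdown discussed below matters.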

Why is Moore’s Law important?

Moore’s Law has allowed the computer industry, and semiconductor manufacturers in particular, to forecast advances in circuit engineering and the relative profitability of each advancement. It created a roadmap for the spread of computer technology into consumer electronics, artificial intelligence, and supercomputers. In effect, Moore’s Law also predicted the increasing affordability and accessibility of computer technology, since the cost per transistor falls as more transistors fit on a single chip.
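
To see why the cost per transistor falls, assume, purely for illustration, that the cost of manufacturing a chip stays roughly constant while the number of transistors on it doubles each generation; the per-transistor cost then halves each generation. A minimal Python sketch under that simplifying assumption (the dollar figure and starting count are hypothetical):

    # Illustrative only: assumes a fixed per-chip cost while transistor density
    # doubles with each generation.
    chip_cost = 100.0          # hypothetical manufacturing cost of one chip, in dollars
    transistors = 1_000_000    # hypothetical starting transistor count

    for generation in range(5):
        cost_per_transistor = chip_cost / transistors
        print(f"Generation {generation}: ${cost_per_transistor:.8f} per transistor")
        transistors *= 2       # doubling each generation, per Moore's Law

In practice, per-chip costs do rise, but density gains have historically outpaced them, which is why the per-transistor trend has pointed downward.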

The end of Moore’s Law

In recent years, Moore’s Law has slowly lost its relevance. Moore’s predictions concerned the pace of innovation, and that pace has slowed, just as Moore expected. Most recently, semiconductor foundry TSMC announced that it plans to release a 3nm (nanometer) process sometime in 2022. By comparison, the diameter of a single atom measures somewhere between 0.1 and 0.5 nanometers, so there is a finite limit to how small a single transistor can become. Some industry experts have theorized that this trend will create a shift in the way chips are used: rather than a one-size-fits-all approach, chips will be built for highly specialized purposes so that computing power can be applied more efficiently.


Vangie Beal
Vangie Beal is a freelance business and technology writer who has covered Internet technologies and online business since the late '90s.
