ENIAC - First Computer in the United States
ENIAC, an acronym for Electronic Numerical Integrator And Computer, is considered the first operational electronic digital computer in the United States. It was developed by U.S. Army Ordnance to compute World War II ballistic firing tables. The machine was proposed by physicist John Mauchly in 1942 and completed in 1945.
ENIAC weighed 30 tons, used 200 kilowatts of electric power, and consisted of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. It used 20 single-number accumulators as its primary functional units and also contained special units for multiplication, division, and square roots.
How Was the First Computer Used?
In addition to ballistics, the ENIAC's field of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.
5 Fun Facts about The First Computer, ENIAC
- The ENIAC performed arithmetic and transfer operations simultaneously.
- It took weeks of set-up time to program new problems.
- The only mechanical elements of the ENIAC were external to the calculator: an IBM card reader for input, a card punch for output, and the 1,500 relays.
- The divider and square-root unit worked by repeated subtraction and addition.
- ENIAC was the prototype from which most other modern computers evolved.
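The repeated-subtraction idea behind ENIAC's divider and square-root unit can be sketched in a few lines of Python. This is only an illustration of the arithmetic principle, not a model of the actual hardware, which operated on decimal digits in accumulators; the function names here are our own. The square root uses the fact that the sum of the first k odd numbers equals k squared, so counting how many successive odd numbers can be subtracted yields the integer root.

```python
def divide(dividend, divisor):
    # Division by repeated subtraction: count how many times
    # the divisor can be removed from the dividend.
    quotient, remainder = 0, dividend
    while remainder >= divisor:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

def square_root(n):
    # Integer square root by subtracting successive odd numbers:
    # 1 + 3 + 5 + ... + (2k - 1) = k^2, so the number of
    # subtractions performed is the root.
    odd, count = 1, 0
    while n >= odd:
        n -= odd
        odd += 2
        count += 1
    return count
```

For example, divide(17, 5) returns a quotient of 3 with remainder 2, and square_root(49) returns 7 after subtracting the odd numbers 1 through 13.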
Recommended Reading: Webopedia Study Guide: The Five Generations of Computers describes each of the five generations of computers and major technology developments that have led to the current devices that we use today.