Voltage Definition & Meaning

Voltage is the difference in electric potential energy per unit of charge between two points; that difference is what pushes electric current to flow from one point to the other. Potential energy, in science and engineering, is stored energy: the capacity of a system to do work. The greater the difference in electrical potential energy between two points, the more strongly current is driven between them. The volt is named after the Italian scientist Alessandro Volta, who invented the electric battery.

Voltage exerts a force on the electrons in a circuit. Electrons are the negatively charged particles of an atom, and they move through a circuit according to the charges at its two ends: they are repelled by the negative end and attracted to the positive one. The two terminals of a battery are a standard example: one terminal is charged negatively, the other positively. Electrons flow from the negative terminal, through the circuit, to the positive terminal, always moving to balance out the difference in charge.

The standard unit of measurement for electrical charge is the coulomb. A single electron carries only a tiny fraction of a coulomb (about 1.602 × 10^-19 coulombs), so one coulomb corresponds to roughly 6.24 × 10^18 electrons. If one coulomb of charge passes a point in a circuit each second, the current (the rate of electron flow) is one ampere, or amp.
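
For illustration, this relationship between charge, time, and current can be written out in a few lines of Python. This is a minimal sketch; the function name and the electron-charge constant are ours, not part of the article:

```python
# Minimal sketch: current as charge per unit of time.
# One ampere is one coulomb of charge passing a point each second.

ELECTRON_CHARGE_C = 1.602e-19  # approximate charge of one electron, in coulombs


def current_amps(charge_coulombs: float, seconds: float) -> float:
    """Current (amperes) = charge (coulombs) / time (seconds)."""
    return charge_coulombs / seconds


# One coulomb per second is one ampere.
print(current_amps(1.0, 1.0))      # 1.0

# Roughly how many electrons carry one coulomb of charge?
print(1.0 / ELECTRON_CHARGE_C)     # about 6.24e18
```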

Back to voltage: voltage measures the amount of energy carried by each unit of charge (each coulomb). The unit of measurement for energy is the joule. If V stands for volt, J for joule, and C for coulomb:

1 V = 1 J / 1 C (one volt equals one joule per coulomb)

If a circuit is a “three-volt” circuit, that simply means that three joules of energy are delivered for each coulomb of charge that passes between two points in the circuit. Voltage can also be pictured as the electrical “pressure” that pushes charge through the circuit.
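
For illustration, the volt-equals-joules-per-coulomb relationship can be shown as a short Python sketch. The function names here are ours, chosen only for this example:

```python
# Minimal sketch: voltage as energy per unit of charge (V = J / C).

def voltage_volts(energy_joules: float, charge_coulombs: float) -> float:
    """Voltage (volts) = energy (joules) / charge (coulombs)."""
    return energy_joules / charge_coulombs


def energy_delivered_joules(volts: float, charge_coulombs: float) -> float:
    """Energy (joules) delivered when a charge moves across a potential difference."""
    return volts * charge_coulombs


# A "three-volt" circuit delivers three joules of energy per coulomb of charge.
print(voltage_volts(3.0, 1.0))            # 3.0
print(energy_delivered_joules(3.0, 2.0))  # 6.0 joules for two coulombs
```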

Voltage affects the electrical power of every device that runs on electricity. Each device (or each piece of hardware inside it) has an ideal voltage at which it functions. For example, different parts of a computer require different voltages. A power supply unit regulates the incoming voltage and manages the electrical current, converting it from alternating current (AC) to direct current (DC). Alternating current periodically reverses direction because the voltage driving it regularly changes polarity, while direct current always flows in the same direction. A computer requires direct current, so the current runs in the same direction throughout the hardware.
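
To make the AC/DC distinction concrete, here is a minimal Python sketch. The 170-volt peak and 60 Hz frequency are assumed example values (typical of North American household mains), not figures from the article:

```python
# Minimal sketch: AC periodically reverses direction, DC keeps one polarity.
# The 170 V peak and 60 Hz frequency are assumed example values.
import math


def ac_voltage(t: float, peak: float = 170.0, freq_hz: float = 60.0) -> float:
    """Instantaneous sinusoidal AC voltage; the sign flips every half-cycle."""
    return peak * math.sin(2 * math.pi * freq_hz * t)


DC_VOLTAGE = 12.0  # a DC supply holds a single, constant polarity

# Sample one 60 Hz cycle: the AC value swings positive and negative,
# while the DC value never changes sign.
for t in (0.0, 1 / 240, 1 / 120, 3 / 240):
    print(f"t={t:.5f} s  AC={ac_voltage(t):+8.1f} V  DC={DC_VOLTAGE:+.1f} V")
```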

