
    The ohm, abbreviated Ω, is the International System of Units (SI) measurement of electrical resistance. It is defined as the resistance in a circuit transmitting a current of one ampere when subjected to a potential difference of one volt. Various empirically derived units for electrical resistance, the ohm among them, were developed in connection with early telegraphy practice.

    Ohm’s law

    The ohm is named in honor of the 19th-century German physicist and mathematician Georg Simon Ohm. While working as a schoolteacher, he discovered what is now known as Ohm’s law: resistance equals the ratio of the potential difference (voltage) to the current, with the ohm, volt, and ampere as the respective units used universally to express those quantities. In simpler terms, electrical current is proportional to voltage and inversely proportional to resistance. Ohm’s law is used extensively in electronic formulas and calculations.

    Ohm expressed his discovery in the form of a simple equation that describes how voltage, current, and resistance interrelate:

    E=IR

    Voltage (E) is equal to current (I) multiplied by resistance (R). To solve for current or resistance, the equation can be manipulated into two variations:

    I = E/R

    R = E/I
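    The three variations above can be sketched as small helper functions. This is a minimal illustration, not part of the original article; the function names are chosen for clarity.

```python
# Ohm's law in its three forms. Units: volts, amperes, ohms.

def voltage(current, resistance):
    """E = I * R: voltage across a resistance carrying a current."""
    return current * resistance

def current(voltage, resistance):
    """I = E / R: current driven through a resistance by a voltage."""
    return voltage / resistance

def resistance(voltage, current):
    """R = E / I: resistance given voltage and current."""
    return voltage / current
```

    For example, a 3-ohm resistor carrying 2 amperes drops `voltage(2, 3)`, i.e. 6 volts.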

    Using Ohm’s law

    Ohm’s law can be used to solve simple circuit problems. For example, if a circuit has two resistors in series with resistances of 5 Ω and 10 Ω, and the voltage across the first resistor is 4 volts, you can find the current through the second resistor and the voltage across it by using the variations of Ohm’s law.
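    The series-circuit example above can be worked through step by step. This sketch assumes the values stated in the text (5 Ω and 10 Ω resistors in series, 4 volts across the first); variable names are illustrative.

```python
# Two resistors in series: R1 = 5 ohms, R2 = 10 ohms.
# The voltage measured across R1 is 4 volts.
R1, R2 = 5.0, 10.0  # ohms
V1 = 4.0            # volts across R1

# I = E / R gives the current through R1. In a series circuit,
# the same current flows through every component, so this is
# also the current through R2.
I = V1 / R1

# E = I * R then gives the voltage across R2.
V2 = I * R2

print(I, V2)  # 0.8 A through the circuit, 8.0 V across R2
```

    The key step is recognizing that series components share one current; once that current is known, the remaining unknowns fall out of E = IR directly.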