quantum computing

First proposed in the 1970s, quantum computing takes advantage of certain quantum-mechanical properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve as the computer's processor and memory. By interacting with one another while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits Explained

Qubits do not rely on the traditional binary nature of computing. Traditional computers encode information into bits as binary numbers, either 0 or 1, and can perform calculations on only one set of numbers at a time. Quantum computers instead encode information in quantum-mechanical states, such as the spin direction of an electron or the polarization orientation of a photon. Such a state might represent a 1 or a 0, a combination of the two, or a value somewhere between 1 and 0: a superposition of many different numbers at once.
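
As a rough illustration, the following Python sketch (a classical NumPy simulation, not code for any real quantum hardware or SDK) models a single qubit as a pair of complex amplitudes and shows how a superposition assigns probabilities to the 0 and 1 outcomes. The names ket0, ket1, and measure_probabilities are made up for this example.

```python
# A minimal sketch of a single qubit as a two-element complex state vector.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # classical-like state |0>
ket1 = np.array([0, 1], dtype=complex)   # classical-like state |1>

# An equal superposition: the qubit is "between" 0 and 1 until measured.
superposition = (ket0 + ket1) / np.sqrt(2)

def measure_probabilities(state):
    """Born rule: the probability of each outcome is the squared magnitude of its amplitude."""
    return np.abs(state) ** 2

print(measure_probabilities(ket0))           # [1. 0.]  -> always reads 0
print(measure_probabilities(superposition))  # [0.5 0.5] -> reads 0 or 1 with equal chance
```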

A quantum computer can perform an arbitrary reversible classical computation on all of these numbers simultaneously, which a binary system cannot do, and it can also produce interference between the different numbers. By computing on many numbers at once and then interfering the results to obtain a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. Using only a single processing unit, it can naturally perform myriad operations in parallel.
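
The interference idea can be sketched in the same toy NumPy representation used above. Applying a Hadamard gate to the |0> state produces an equal superposition, and applying it a second time makes the two computational paths interfere so that the amplitudes for the 1 outcome cancel. Again, this is an illustrative classical simulation under those assumptions, not a program for an actual quantum computer.

```python
# A minimal sketch of interference using the Hadamard gate H.
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)

after_one_H = H @ ket0         # amplitudes ~[0.707, 0.707]: both outcomes possible
after_two_H = H @ after_one_H  # amplitudes [1, 0]: the paths leading to 1 cancel out

print(np.abs(after_one_H) ** 2)  # [0.5 0.5]
print(np.abs(after_two_H) ** 2)  # [1. 0.] (up to floating-point rounding)
```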

Quantum computing is not well suited to tasks such as word processing and email, but it is ideal for tasks such as cryptography, modeling, and indexing very large databases.


Microsoft: Quantum Computing 101