Glossary

Quantum Bit (Qubit)

The basic unit of information in quantum computing, commonly called a "qubit."

What Is a Quantum Bit (Qubit)?

A qubit is a two-state quantum-mechanical system. It can store a 0, a 1, or a superposition of both states. Qubits are the elements through which quantum computers operate, playing the same role that bits play in classical computation.
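
As a concrete picture, a qubit's state can be written |ψ⟩ = α|0⟩ + β|1⟩ with |α|² + |β|² = 1. Below is a minimal sketch, assuming only NumPy (no quantum SDK), of how such a state can be represented and how the Born rule assigns outcome probabilities; the variable names are illustrative, not a standard API.

```python
import numpy as np

# A single-qubit state |psi> = alpha|0> + beta|1> stored as a complex
# 2-vector. The amplitudes must satisfy |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)  # an equal superposition
psi = np.array([alpha, beta], dtype=complex)

# Born rule: the probability of measuring 0 or 1 is the squared
# magnitude of the corresponding amplitude.
p0, p1 = np.abs(psi) ** 2
print(p0, p1)  # 0.5 0.5
```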

The two basic operations performed on qubits are quantum gates and measurements, and both must be applied with great care, because even slight interference from the environment can destroy the fragile superposition state. The state of n qubits is described by 2^n complex amplitudes, so the classical resources needed to simulate a quantum computer grow exponentially with the number of qubits. This is why some problems that appear intractable for classical computers may be solved efficiently on a quantum computer.
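
To make the gate operation concrete, the sketch below (again plain NumPy, an illustration rather than any library's API) applies a Hadamard gate, a standard single-qubit unitary, to |0⟩, and shows how quickly the amplitude vector grows with the number of qubits.

```python
import numpy as np

# The Hadamard gate, a common single-qubit unitary. Applied to the
# basis state |0>, it produces the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)  # the basis state |0>
psi = H @ ket0                          # a gate is a matrix-vector product
print(psi)                              # amplitudes 1/sqrt(2) each

# An n-qubit state needs 2**n complex amplitudes, which is why even a
# few dozen qubits exhaust classical memory when simulated directly.
n = 20
print(2 ** n)  # 1,048,576 amplitudes for just 20 qubits
```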

One physical realization of a qubit is the spin of an electron, which can be placed in a superposition of up and down at the same time. Measuring an equal superposition returns either up or down with equal probability; in general, the outcome probabilities are set by the amplitudes of the superposition. Qubits of this kind serve as the basis for a wide variety of quantum algorithms and have been realized in many platforms, including linear optical quantum computing.
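
The 50/50 statistics of an equal superposition are easy to check numerically. The following sketch, a simulation in plain NumPy under the same assumptions as above, samples repeated measurements of the state (|0⟩ + |1⟩)/√2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Equal superposition: a measurement returns 0 ("up") or 1 ("down")
# with probability 1/2 each.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(psi) ** 2
probs /= probs.sum()  # guard against floating-point rounding

outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())  # close to 0.5: roughly half up, half down
```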

For example, by exploiting superposition and entanglement, qubits make it possible to:

  • Place all candidate inputs, such as every possible password, into a single superposition and evaluate a function on all of them at once. A measurement reveals only one outcome, so algorithms such as Grover's search are needed to amplify the right answer (see the sketch after this list).

  • Perform long sequences of reversible (unitary) operations in which every step transforms the entire superposition at once.

  • Achieve massive parallelism by carrying out a computation on exponentially many basis states simultaneously.

  • Build communication channels whose security rests on physical properties such as photon polarization and entanglement, as in quantum key distribution, where remote eavesdropping disturbs the quantum states and can be detected.
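
As a worked example of the search point above, here is a minimal Grover iteration on 2 qubits in plain NumPy (an illustrative sketch, not any library's implementation), searching for the marked state |11⟩; with only 4 basis states, a single iteration already finds it with certainty.

```python
import numpy as np

# Grover search on 2 qubits (4 basis states) for the marked state |11>.
N = 4
s = np.full(N, 1 / np.sqrt(N), dtype=complex)  # uniform superposition

oracle = np.eye(N, dtype=complex)
oracle[3, 3] = -1  # the oracle flips the sign of the marked state |11>

diffusion = 2 * np.outer(s, s) - np.eye(N)  # inversion about the mean

psi = diffusion @ oracle @ s                # one Grover iteration
print(np.abs(psi) ** 2)  # [0, 0, 0, 1]: the marked state is measured
```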

Qubits also demonstrate the key properties of quantum computing: operations act on an entire superposition at once, measurement outcomes are inherently probabilistic, and qubits can become entangled with one another. These properties let quantum computers run algorithms that classical computers cannot match, for example heuristics for hard optimization problems where conventional solvers get stuck in local minima.
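
Entanglement, the last of those properties, can also be illustrated in a few lines. The sketch below (plain NumPy, illustrative only) prepares the Bell state (|00⟩ + |11⟩)/√2 by applying a Hadamard and then a CNOT to |00⟩; the two qubits' measurement outcomes are then perfectly correlated.

```python
import numpy as np

# Prepare the Bell state (|00> + |11>)/sqrt(2) from |00>:
# a Hadamard on the first qubit, then a CNOT.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)  # control = first qubit

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0

bell = CNOT @ np.kron(H, I) @ ket00
print(np.round(bell.real, 3))  # [0.707 0. 0. 0.707]: only 00 and 11 occur

# Measuring the two qubits always gives matching results, a correlation
# that no pair of independent single-qubit states can reproduce.
```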

The Difference Between Qubits and Bits

A bit is the basic unit of data in computing. A bit has only two possible values, usually denoted 0 and 1, representing off and on, low and high voltage levels, and so on. In classical computing, a bit can exist in only one state at a time. 

A qubit (quantum bit) is the unit of data used in quantum computing. In a quantum computer, qubits can exist in multiple states at once (superposition). Using superposition and entanglement to represent information lets certain quantum algorithms outperform the best known classical ones.
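
The contrast can be summarized in code: a classical n-bit register holds exactly one of 2^n values at a time, while an n-qubit register is described by 2^n amplitudes. A final NumPy sketch (illustrative, as before):

```python
import numpy as np

# A classical 3-bit register holds exactly one of 2**3 = 8 values.
classical_register = 0b101  # one definite value: 5

# A 3-qubit register is described by 2**3 complex amplitudes; a uniform
# superposition assigns equal weight to all eight basis states.
n = 3
quantum_register = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
print(np.abs(quantum_register) ** 2)  # each outcome has probability 1/8
```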