
SYLLABUS

Unit 1

Introduction to Quantum Theory and Technologies

The transition from classical to quantum physics
Fundamental principles explained conceptually: Superposition, Entanglement, Uncertainty Principle, Wave-particle duality
Classical vs Quantum mechanics – theoretical comparison
Quantum states and measurement: nature of observation
Overview of quantum systems: electrons, photons, atoms
The concept of quantization: discrete energy levels
Why quantum? Strategic, scientific, and technological significance
A snapshot of quantum technologies: Computing, Communication, and Sensing
National and global quantum missions: India’s Quantum Mission, EU, USA, China
1.0. Introduction

The course "Introduction to Quantum Theory and Technologies" offers a comprehensive foundation in one of the most exciting and transformative areas of modern science. With the
rapid emergence of quantum computing, communication, and sensing, understanding the
principles of quantum mechanics is no longer limited to theoretical physicists—it has become
essential for engineers, computer scientists, and technology professionals.

This course is designed to introduce students to the fundamental concepts of quantum theory
while showing how these ideas are being applied in cutting-edge quantum technologies. It begins
with the basics of quantum mechanics, covering wave-particle duality, the uncertainty principle,
quantum states, operators, and the Schrödinger equation, laying the groundwork for
understanding how nature behaves at microscopic scales. The course then transitions into
quantum computing, where students learn about qubits, quantum gates, quantum circuits, and
algorithms such as Grover’s and Shor’s that outperform classical solutions. Students are
introduced to quantum sensing and metrology, where quantum systems enable ultra-sensitive
measurements vital to healthcare, navigation, and defence applications. The final unit discusses
real-world industrial applications of quantum technologies, highlighting how companies like
IBM, Google, and Microsoft are deploying quantum systems to solve complex problems. It also
examines global initiatives and India’s strategic efforts, such as the National Mission on
Quantum Technologies & Applications.

The course addresses the challenges of scalability, cost, standardization, and workforce
readiness, giving learners a realistic view of the current landscape. Importantly, it opens doors to
emerging careers in quantum science by equipping students with interdisciplinary skills across
physics, math, and programming. By studying this course, learners gain a solid understanding of
the theory behind quantum phenomena and the practical knowledge needed to engage with this
frontier field. As quantum technology continues to evolve, those equipped with this knowledge
will be at the forefront of innovation in industries ranging from computing and cybersecurity to
materials science and precision medicine.

1.1. Transition from Classical Physics to Quantum Mechanics

At the end of the 19th century, classical physics was seen as nearly complete. Newtonian
mechanics explained planetary motion, Maxwell’s equations described electromagnetism,
thermodynamics clarified heat and work, and classical optics enabled scientific tools like
microscopes and telescopes. Chemistry had a basic periodic table, but the atomic structure and
electron behaviour in reactions were not understood.

The transition from classical physics to quantum mechanics marks one of the most profound
paradigm shifts in scientific history. Classical physics, which dominated for over two centuries,
was grounded in the belief that all natural phenomena could be described by deterministic laws.
Concepts such as Newton’s laws of motion, thermodynamic principles, and Maxwell’s equations
provided a complete and elegant framework for understanding the physical world.

Rutherford’s experiments revealed atoms had dense, positively charged nuclei with orbiting
electrons, raising the question of why electrons didn't collapse into the nucleus—something
classical physics couldn’t explain due to predicted energy loss. This, along with other unexplained
phenomena such as blackbody radiation, the photoelectric effect, hydrogen spectral lines, and
molecular spectroscopy, exposed the limitations of classical models.

However, as experiments grew more precise at the atomic and subatomic scales, cracks began to
appear in this classical edifice. Phenomena like blackbody radiation defied classical
predictions, as models such as the Rayleigh-Jeans law led to absurd results like the ultraviolet
catastrophe—where infinite energy emission at short wavelengths was expected. Similarly, the
photoelectric effect revealed that light could behave like discrete packets of energy, or photons,
as Einstein proposed—challenging the continuous wave view held by classical optics.

These challenges led to the development of quantum mechanics in the early 20th century, with
major contributions from scientists like Planck, Einstein, Bohr, Schrödinger, and others. Before
exploring the key quantum experiments, the following section introduces the classical wave model
of light, which dominated thinking before quantum theory.

The stability of atoms posed another riddle. According to classical electromagnetism, orbiting
electrons should continuously emit radiation and spiral into the nucleus, yet atoms remained
stable. Moreover, when hydrogen gas was excited, it emitted light in distinct spectral lines rather
than a continuous spectrum—something classical theory couldn’t explain. These inconsistencies
pointed to the need for a new theoretical framework. Enter quantum mechanics: a theory based
not on certainty, but probability, where particles like electrons exhibit both wave-like and
particle-like properties. Max Planck introduced the idea that energy is quantized, emitted in
discrete amounts called quanta. This concept became the seed for a revolution. Niels Bohr refined
atomic models using quantized orbits to explain hydrogen spectra. Louis de Broglie suggested
that matter had wave properties, while Werner Heisenberg introduced the uncertainty principle,
redefining how we measure physical systems. Schrödinger’s wave equation offered a new
mathematical tool to describe electron behaviour in atoms.

Quantum mechanics did not just revise physics—it redefined our understanding of reality itself.
Unlike classical physics, where outcomes were predictable, quantum theory embraced
uncertainty and probability. Observables like position and momentum could no longer be known
simultaneously with arbitrary precision. The deterministic worldview gave way to a statistical
one, yet this new approach proved incredibly accurate and predictive. Quantum theory provided
the foundation for semiconductors, lasers, nuclear energy, and much more. It also laid the
groundwork for modern quantum technologies—such as quantum computing, quantum
cryptography, and quantum sensing—which are now driving a new technological revolution. The
journey from classical to quantum physics reminds us that scientific knowledge evolves, often
through radical shifts, as we probe deeper into the fundamental nature of the universe.

1.1.1 Description of Light as an Electromagnetic Wave

As mentioned above, the description of electromagnetic radiation in terms of Maxwell’s equations
was published in the early 1860s. The solution of these differential equations described light as
a transverse wave of electric and magnetic fields. In the absence of charge and current, such a
wave, propagating in vacuum in the positive z-direction, can be described by the following
equations:

E(z, t) = E₀ cos(kz − 𝜔t) x̂ (1.1)
B(z, t) = B₀ cos(kz − 𝜔t) ŷ (1.2)

where the electric field and the magnetic field are perpendicular to each other, as shown in Figure
1.1, and oscillate in phase at the angular frequency

𝜔 = 2π𝜈 (1.3)

where 𝜈 is the frequency of the oscillation, measured in units of s⁻¹ = Hz. In Eqs. (1.1) and (1.2),
k is the wave vector (or momentum vector) of the electromagnetic wave, defined by Eq. (1.4):

k = 2π/𝜆 (1.4)

Here, 𝜆 is the wavelength of the radiation, measured in units of length, and is defined by the
distance between two consecutive peaks (or troughs) of the electric or magnetic fields. Vector
quantities, such as the electric and magnetic fields, are indicated by an arrow over the symbol or
by bold typeface. Since light is a wave, it exhibits properties such as constructive and destructive
interference. Thus, when light impinges on a narrow slit, it shows a diffraction pattern similar to
that of a plain water wave that falls on a barrier with a narrow aperture. These wave properties
of light were well known, and therefore, light was considered to exhibit wave properties only, as
predicted by Maxwell’s equations.

In general, any wave motion can be characterized by its wavelength 𝜆, its frequency ν, and its
propagation speed. For light in vacuum, this propagation speed is the velocity of light c (c = 2.998
× 10⁸ m/s). In the context of the discussion here, the interaction of light with matter will be described
as the force exerted by the electric field on the charged particles, atoms, and molecules. This
interaction causes a translation of charge. This description leads to the concept of the “electric
transition moment,” which will be used as the basic quantity to describe the likelihood (that is,
the intensity) of a spectral transition. In other forms of optical spectroscopy, the magnetic
transition moment must be considered as well. This interaction leads to a coupled translation and
rotation of charge, which imparts a helical motion of charge. This helical motion is the hallmark
of optical activity, since, by definition, a helix can be left- or right-handed.
Thus, light as an electromagnetic wave serves as a bridge between classical theory and the
quantum view. While Maxwell’s equations beautifully describe the propagation and wave
behaviour of light, they fall short when explaining phenomena that involve quantized energy
exchange, such as the photoelectric effect or atomic emission spectra. These limitations led to
the development of quantum theory. However, even in the quantum age, the classical wave model
remains foundational for understanding a wide range of light–matter interactions, especially in
spectroscopy, communications, and optical engineering.

1.1.2 Blackbody Radiation

From the viewpoint of a spectroscopist, electromagnetic radiation is produced by atoms or
molecules undergoing transitions between well-defined stationary states. This view obviously
does not include the creation of radio waves or other long-wave phenomena, for example, in
standard antennas in radio technology, but describes ultraviolet, visible, and infrared radiation,
which are the main subjects of this book. The atomic line spectra that are employed in analytical
chemistry, for example, in a hollow cathode lamp used in atomic absorption spectroscopy, are
due to transitions between electronic energy states of gaseous metal atoms. The light created by
the hot filament in a standard light bulb is another example of light emitted by (metal) atoms.
However, here, one needs to deal with a broad distribution of highly excited atoms, and the
description of this so-called blackbody radiation was one of the first steps in understanding the
quantization of light. Any material at a temperature T will radiate electromagnetic radiation
according to the blackbody equations. The term “blackbody” refers to an idealized emitter of
electromagnetic radiation with intensity I(𝜆, T) or radiation density ρ(T, 𝜈) as a function of
wavelength and temperature. At the beginning of the twentieth century, it was not possible to
describe the experimentally obtained blackbody emission profile by classical physical models.
This profile is shown in Figure 1.2 for several temperatures between
1000 and 5000 K as a function of wavelength. M. Planck attempted to reproduce the observed
emission profile using classical theory, based on atomic dipole oscillators (nuclei and electrons)
in motion. These efforts revealed that the radiation density ρ emitted by a classical blackbody
into a frequency band d𝜈 as function of 𝜈 and T would be given by Eq. (1.5):

𝜌(𝜈, T) d𝜈 = (8π𝜈²/c³) kT d𝜈 (1.5)

where the Boltzmann constant k = 1.381 × 10⁻²³ [J/K]. This result indicated that the total energy
radiated by a blackbody according to this “classical” model would increase with 𝜈² as shown by
the dashed curve in Figure 1.2b. If this equation were correct, any temperature of a material above
absolute zero would be impossible: any material above 0 K would emit radiation according to
Eq. (1.5), the total energy emitted would be unrestricted and approach infinity, with more and
more radiation emitted toward higher frequencies, and the blackbody would cool instantaneously
to 0 K. This is, of course, in contradiction with experimental results and was addressed by
M. Planck (1901), who solved this conundrum by introducing the term 1/(e^(h𝜈/kT) − 1) into the
blackbody equation, where h is Planck’s constant:

𝜌(𝜈, T) d𝜈 = (8πh𝜈³/c³) · 1/(e^(h𝜈/kT) − 1) d𝜈 (1.6)

The shape of the modified blackbody emission profile given by Eq. (1.6) is in agreement with
experimental results. The new term introduced by Planck is basically an exponential decay
function, which forces the overall response profile to approach zero at high frequency. The
numerator of the exponential expression contains the quantity h𝜈, where h is Planck’s constant
(h = 6.626 × 10⁻³⁴ J s). This numerator implies that light exists as “quanta” of light, or light
particles (photons) with energy E:

E_photon = h𝜈 (1.7)
This, in itself, was a revolutionary thought since the wave properties of light had been established
more than two centuries earlier and had been described in the late 1800s by Maxwell’s equations
in terms of electric and magnetic field contributions. Here arose for the first time the realization
that two different descriptions of light, in terms of waves and particles, were appropriate
depending on what questions were asked. A similar “particle–wave duality” was later postulated
and confirmed for matter as well. Thus, the work by Planck very early in the twentieth century
is truly the birth of the ideas resulting in the formulation of quantum mechanics.
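
To make the contrast between Eqs. (1.5) and (1.6) concrete, the short Python sketch below (not part of the original text; the 3000 K temperature and the sample frequencies are assumed example values) evaluates both radiation densities: the classical Rayleigh-Jeans expression keeps growing with frequency, while Planck's extra factor 1/(e^(h𝜈/kT) − 1) drives the emission back toward zero at high frequency.

import numpy as np

h = 6.626e-34   # Planck's constant, J s
k = 1.381e-23   # Boltzmann constant, J/K
c = 2.998e8     # speed of light, m/s

def rayleigh_jeans(nu, T):
    """Classical radiation density of Eq. (1.5): grows without bound as nu**2."""
    return 8 * np.pi * nu**2 * k * T / c**3

def planck(nu, T):
    """Planck radiation density of Eq. (1.6): the 1/(exp(h*nu/kT) - 1) factor
    suppresses the emission at high frequency."""
    return (8 * np.pi * h * nu**3 / c**3) / np.expm1(h * nu / (k * T))

T = 3000.0                            # assumed blackbody temperature, K
for nu in (1e13, 1e14, 1e15):         # infrared to ultraviolet frequencies, Hz
    print(f"nu = {nu:.0e} Hz: classical {rayleigh_jeans(nu, T):.3e}, "
          f"Planck {planck(nu, T):.3e} J s/m^3")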

1.1.3 The Photoelectric Effect

In 1905, Einstein reported experimental results that further demonstrated the energy quantization
of light. In the photoelectric experiment, light of variable color (frequency) illuminated a
photocathode contained in an evacuated tube. An anode in the same tube was connected
externally to the cathode through a current meter and a source of electric potential (such as a
battery). Since the cathode and anode were separated by vacuum, no current was observed, unless
light with a frequency above a threshold frequency was illuminating the photocathode. Einstein
correctly concluded that light particles, or photons, with a frequency above this threshold value
had sufficient kinetic energy to knock out electrons from the metal atoms of the photocathode.
These “photoelectrons” left the metal surface with a kinetic energy given by

E_kin = h𝜈 − 𝜙 (1.9)

where 𝜙 is the work function, or the energy required to remove an electron from metal atoms.
This energy basically is the atoms’ ionization energy multiplied by Avogadro’s number.
Furthermore, Einstein reported that the photocurrent produced by the irradiation of the
photocathode was proportional to the intensity of light, or the number of photons, but that
increasing the intensity of light that had a frequency below the threshold did not produce any
photocurrent. This provided further proof of Eq. (1.9). This experiment further demonstrated that
light has particle character with the kinetic energy of the photons given by Eq. (1.7), which led
to the concept of wave–particle duality of light. Later, de Broglie theorized that the momentum
p of a photon was given by
p = h∕λ (1.10)
Equation (1.10) is known as the de Broglie equation. The wave–particle duality was later (1927)
confirmed to be true for moving masses as well by the electron diffraction experiment of
Davisson and Germer [3]. In this experiment, a beam of electrons was diffracted by an atomic
lattice and produced a distinct interference pattern that suggested that the moving electrons
exhibited wave properties. The particle–wave duality of both photons and moving matter can be
summarized as follows. For photons, the wave properties are manifested by diffraction
experiments and summarized by Maxwell’s equation. As for all wave propagation, the velocity
of light, c, is related to wavelength 𝜆 and frequency 𝜈 by

c = 𝜆𝜈 (1.11)

with c = 2.998 × 10⁸ [m/s] and 𝜆 expressed in [m] and 𝜈 expressed in [Hz = s⁻¹]. The quantity 𝜈̃
is referred to as the wavenumber of radiation (in units of m⁻¹ or cm⁻¹) that indicates how many
wave cycles occur per unit length:

𝜈̃ = 1/𝜆 = 𝜈/c (1.12)

The (kinetic) energy of a photon is given by

E_photon = h𝜈 = ℏ𝜔 (1.13)

with ℏ = h/2π and 𝜔, the angular frequency, defined before as 𝜔 = 2π𝜈.


From the classical definition of the momentum of matter (p = mv) and of light (p = h/𝜆), respectively,
it follows that the photon mass is given by

m_photon = h/(c𝜆) = h𝜈/c²

Notice that a photon can only move at the velocity of light and the photon mass can only be
defined at the velocity c. Therefore, a photon has zero rest mass, m0. Particles of matter, on the
other hand, have a nonzero rest mass, commonly referred to as their mass. This mass, however,
is a function of velocity v and should be referred to as m_v, which is given by

m_v = m₀/√(1 − v²/c²) (1.16)

Equation (1.16) demonstrates that the mass of any matter particle will reach infinity when
accelerated to the velocity of light. Their kinetic energy at velocity v (far from the velocity of
light) is given by the classical expression

E_kin = ½ m v² (1.17)

The discussion of the last paragraphs demonstrates that at the beginning of the twentieth century,
experimental evidence was amassed that pointed to the necessity to redefine some aspects of
classical physics. The next of these experiments that led to the formulation of quantum mechanics
was the observation of “spectral lines” in the absorption and emission spectra of the hydrogen
atom.
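
Before moving on to the hydrogen spectrum, a small numerical sketch in Python ties together Eqs. (1.7), (1.9), (1.10), and (1.17); the work function of the cathode and the light frequency used here are assumed example values, not taken from the text.

import numpy as np

h = 6.626e-34        # Planck's constant, J s
m_e = 9.109e-31      # electron rest mass, kg
c = 2.998e8          # speed of light, m/s

# Photoelectric effect, Eq. (1.9): E_kin = h*nu - phi.
phi = 3.7e-19        # assumed work function of the cathode metal, J (about 2.3 eV)
nu = 7.5e14          # assumed frequency of the incident light, Hz (blue light)
E_kin = h * nu - phi
print(f"photoelectron kinetic energy: {E_kin:.2e} J")   # positive -> electrons are emitted

# de Broglie wavelength, Eq. (1.10): lambda = h / p, for the ejected electron.
p = np.sqrt(2 * m_e * E_kin)          # classical momentum from Eq. (1.17)
print(f"electron de Broglie wavelength: {h / p:.2e} m")

# Photon energy and momentum for comparison, Eqs. (1.7) and (1.10).
print(f"photon energy: {h * nu:.2e} J, photon momentum: {h * nu / c:.2e} kg m/s")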

1.1.4 Hydrogen Atom Absorption and Emission Spectra

Between the last decades of the nineteenth century and the first decade of the twentieth century,
several researchers discovered that hydrogen atoms, produced in gas discharge lamps, emit light
at discrete colors, rather than as a broad continuum of light as observed for a blackbody (Figure
1.2a). These emissions occur in the ultraviolet, visible, and near-infrared spectral regions, and a
portion of such an emission spectrum is shown schematically in Figure 1.3. These observations
predate the efforts discussed in the previous two sections and therefore may be considered the
most influential in the development of the connection between spectroscopy and quantum
mechanics.
These experiments demonstrated that the H atom can exist in certain “energy states” or
“stationary states.” These states can undergo a process that is referred to as a “transition.” When
the atom undergoes such a transition from a higher or more excited state to a lower or less excited
state, the energy difference between the states is emitted as a photon with an energy
corresponding to the energy difference between the states:

E_photon = h𝜈 = E_i − E_f (1.18)

where the subscripts f and i denote, respectively, the final and initial (energy) state of the atom (or
molecule). Such a process is referred to as an “emission” of a photon. Similarly, an absorption
process is one in which the atom undergoes a transition from a lower to a higher energy state, the
energy difference being provided by a photon that is annihilated in the process. Absorption and
emission processes are collectively referred to as “transitions” between stationary states and are
directly related to the annihilation and creation, respectively, of a photon. The wavelengths or
energies from the hydrogen emission or absorption experiments were fit by an empirical equation
known as the Rydberg equation, which gave the energy “states” of the hydrogen atom as

E(n) = −Ry/n² (1.19)

In this equation, n is an integer (>0) “quantum” number, and Ry is the Rydberg constant
(Ry = 2.179 × 10⁻¹⁸ J). This equation implies that the energy of the hydrogen atom cannot assume
arbitrary energy values, but only “quantized” levels, E(n). This observation led to the ideas of
electrons in stationary planetary orbits around the nucleus, which – however – was in
contradiction with existing knowledge of electrodynamics, as discussed in the beginning of this
chapter. The energy level diagram described by Eq. (1.19) is depicted in Figure 1.4. Here, the
sign convention is as follows. For n = ∞, the energy of interaction between nucleus and electron
is zero, since the electron is no longer associated with the nucleus. The lowest energy state is given
by n = 1, which corresponds to the H atom in its ground state, with an energy of
−2.179 × 10⁻¹⁸ J. Equation (1.19) provided a background framework to explain the
hydrogen atom emission spectrum. According to Eq. (1.19), the energy of a photon, or the energy
difference of the atomic energy levels, between any two states n_f and n_i can be written as

ΔE = E(n_i) − E(n_f) = Ry (1/n_f² − 1/n_i²) (1.20)

At this point, an example may be appropriate to demonstrate how this empirically derived
equation predicts the energy, wavelength, and wavenumber of light emitted by hydrogen atoms.
This example also introduces a common problem, namely, that of units. Although there is an
international agreement about what units (the system international, or SI units) are to be used to
describe spectral transitions, the problem is that few people are using them. All efforts will be
made to use SI units, or at least give the conversion to other units. The sign conventions used
here are similar to those in thermodynamics where a process with a final energy state lower than
that of the initial state is called an “exothermic” process, where heat or energy is lost. The energy
is lost as a photon, and the process is called an emission transition. When describing an absorption process,
the energy difference of the atom is negative, ΔE_atom < 0; that is, the atom has gained energy
(“endothermic” process in thermodynamics). Following the procedure outlined in Example 1.2
would lead to a negative wavelength of the photon, which of course is physically meaningless,
and one has to remember that the negative ΔE_atom implies the absorption of a photon.
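
A short Python sketch along the lines of the example mentioned above (the choice of the transitions n = 3, 4, 5 → 2 is an assumed illustration) shows how Eq. (1.19) predicts the energy, wavelength, and wavenumber of the photons emitted by hydrogen atoms.

h = 6.626e-34     # Planck's constant, J s
c = 2.998e8       # speed of light, m/s
Ry = 2.179e-18    # Rydberg constant, J

def transition(n_i, n_f):
    """Photon emitted when a hydrogen atom drops from level n_i to n_f (n_i > n_f),
    using the quantized energies E(n) = -Ry / n**2 of Eq. (1.19)."""
    dE = Ry * (1.0 / n_f**2 - 1.0 / n_i**2)   # energy carried away by the photon, J
    lam = h * c / dE                          # wavelength from E = h*nu = h*c/lambda
    return dE, lam

for n_i in (3, 4, 5):
    dE, lam = transition(n_i, 2)
    print(f"n = {n_i} -> 2: E_photon = {dE:.3e} J, "
          f"lambda = {lam * 1e9:.1f} nm, wavenumber = {1 / (lam * 100):.0f} cm^-1")

The first line of output reproduces the well-known red Balmer line of hydrogen near 656 nm.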
1.1.5 Molecular Spectroscopy

Molecular spectroscopy is a branch of science in which the interactions of electromagnetic
radiation and molecules are studied, where the molecules exist in quantized stationary energy
states similar to those discussed in the previous section. However, these energy states may or
may not be due to transitions of electrons into different energy levels, but due to vibrational,
rotational, or spin energy levels. Thus, molecular spectroscopy often is classified by the
wavelength ranges of the electromagnetic radiation (for example, microwave or infrared
spectroscopies) or changes in energy levels of the molecular systems. This is summarized in
Table 1.1, and the conversions of wavelengths and energies were discussed in Eqs. (1.11)–(1.15)
and are summarized in Appendix 1. In this table, NMR and EPR stand for nuclear magnetic and
electron paramagnetic resonance spectroscopy, respectively. In both these spectroscopic
techniques, the transition energy of a proton or electron spin depends on the applied magnetic
field strength. All techniques listed in this table can be described by absorption processes
although other descriptions, such as bulk magnetization in NMR, are possible as well. As seen
in Table 1.1, the photon energies are between 10⁻¹⁶ and 10⁻²⁵ J/photon, or about 10⁻⁴–10⁵
kJ/(mol photons). Considering that the bond energy of a typical chemical (single) bond is about
250–400 kJ/mol, this shows that ultraviolet photons have sufficient energy to break chemical bonds
or ionize molecules. Most of the spectroscopic processes discussed are absorption or emission
processes as defined by Eq. (1.18).

However, interactions between light and matter occur even when the light’s wavelength is
different from the specific wavelength at which a transition occurs. Thus, a classification of
spectroscopy, which is more general than that given by the wavelength range alone, would be a
resonance/off-resonance distinction. Many of the effects described and discussed in this book are
observed as resonance interactions where the incident light, indeed, possesses the exact energy
of the molecular transition in question. IR and UV/vis absorption spectroscopy, microwave
spectroscopy, and NMR are examples of such resonance interactions. The off-resonance
interactions between electromagnetic radiation and matter give rise to well-known phenomena
such as the refractive index of dielectric materials. These interactions arise since force is exerted
by the electromagnetic radiation on the charged particles of matter even at off-resonance
frequencies. This force causes an increase in the amplitude of the motion of these particles. When
the frequency of light reaches the transition energy between two states, an effect known as
anomalous dispersion of the refractive index takes place. This anomalous dispersion of the
refractive index always accompanies an absorption process. This phenomenon makes it possible
to observe the interaction of light either in an absorption or as a dispersion measurement, since
the two effects are related to each other by a mathematical relation known as the Kramers–Kronig
relation. This aspect will be discussed in more detail in Chapter 5. The normal (nonresonant)
Raman effect is a phenomenon that also is best described in terms of off-resonance models, since
Raman scattering can be excited by wavelengths that are not being absorbed by molecules. A
discussion of nonresonant effects ties together many well-known aspects of classical optics and
spectroscopy.
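
As a quick check of the energy scales quoted above from Table 1.1, the following Python sketch (the representative wavelengths are assumed, not taken from the table) converts a photon wavelength into energy per photon and per mole of photons using Eqs. (1.7) and (1.11).

h = 6.626e-34       # Planck's constant, J s
c = 2.998e8         # speed of light, m/s
N_A = 6.022e23      # Avogadro's number, 1/mol

def photon_energy(lam):
    """Photon energy in J for a given wavelength lam in m, Eqs. (1.7) and (1.11)."""
    return h * c / lam

# Representative wavelengths for a few spectral regions (assumed example values).
regions = {"ultraviolet": 250e-9, "visible": 500e-9, "infrared": 5e-6, "microwave": 3e-2}
for name, lam in regions.items():
    E = photon_energy(lam)
    print(f"{name:12s} lambda = {lam:.1e} m  E = {E:.2e} J/photon "
          f"= {E * N_A / 1000:.1f} kJ/mol")
# Only the ultraviolet value exceeds typical single-bond energies of 250-400 kJ/mol.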

1.2 Fundamental Principles

1.2.1 Superposition

Superposition is a fundamental concept in quantum mechanics, describing the condition in which
a quantum system can exist in multiple states or configurations simultaneously. Classical bits can
exist in two possible states, typically labelled as "0" and "1". In contrast, because a qubit is a
quantum mechanical system, it can exist in the “0” state, the “1” state, or any state that is a linear
combination of 0 and 1.
Mathematically, superposition is a linear combination of "0" and "1" and can be written as:

|ψ⟩ = α|0⟩ + β|1⟩

where |ψ⟩ is the state of the qubit, |0⟩ and |1⟩ are the basis states (or the computational basis
states), and α and β are complex numbers called probability amplitudes. The probability
amplitudes determine the probability of measuring the qubit in either state when a measurement
is made.
Importantly, the state of superposition can be maintained only while a quantum system is
unobserved. Once measured, the wave function of a quantum system in a state of superposition
"collapses" into one of the basis states.

For a concrete example of how this might work if superposition could exist in the everyday world,
imagine a coin that is flipped and lands on a table. In classical mechanics -- and in the
everyday world as we know it -- the coin ends up in a state of either heads or tails. In a quantum
mechanical system, the coin could be both heads and tails at the same time, but only until
someone or something observes it or measures it. In this analogy, once observed, the coin would
take on the state of either heads or tails.

Superposition is a fundamental attribute of quantum computing systems. One of its benefits or
uses is that it allows for the possibility of quantum parallelism. Because classical bits can be in
only one of two possible states, 0 or 1, classical computers can perform only one computation at
a time, e.g., “if the bit is a 1 do this, if not do that, next do this, etc.” In contrast, because a qubit
can be in a superposition of 0 and 1, the quantum computer can perform multiple computations
in parallel by processing all possible states of the qubits at once.
Superposition is central to quantum computing, offering a level of power and parallelism that
classical computers cannot match. With classical bits, each bit can represent only a single value
at a time, limiting operations to sequential logic. But with qubits in superposition, a quantum
computer can evaluate many possible combinations of inputs simultaneously. This effect is
known as quantum parallelism. For example, a system of just 3 qubits can represent 8 states at
once (2³), allowing it to process all those configurations in a single operation.
Despite its strangeness, superposition is not magic; it's a natural, observable aspect of the
microscopic world. Technologies like quantum sensors, quantum simulators, and quantum
cryptography also leverage this principle. As researchers continue to refine hardware and error
correction techniques, harnessing superposition will remain key to unlocking the full promise of
quantum technologies in the years ahead.

1.2.2 Entanglement

Entanglement is a fundamental concept of quantum mechanics that describes a non-classical
correlation, or shared quantum state, between two or more quantum systems (or quantum
particles) even if they are separated by a large distance. This phenomenon is also known as
quantum non-locality, and it is one of the key features of quantum mechanics that distinguishes
it from classical mechanics. Quantum systems are described by a mathematical object called a
wavefunction, which contains information about the possible outcomes of measurements that can
be performed on the systems. When two or more quantum systems are entangled, their
wavefunction cannot be expressed as a product of individual wavefunctions for each system.
Instead, the systems are described by a single wavefunction that captures the correlation between
them. The fact that entangled systems are described by a single wavefunction means that any
actions or measurements made on one of the systems affect the state of the other systems.
In quantum computing, entanglement is used to enable quantum parallelism, which is the ability
of quantum computers to perform multiple calculations simultaneously. Entanglement allows
quantum computers to manipulate many qubits in a single operation, instead of manipulating
each qubit individually, as in classical computing. For example, consider two qubits that are
initially prepared in the entangled state (|00⟩ + |11⟩)/√2. If a measurement is made on one of the
qubits, and it is found to be in the state |0⟩, then the state of the other qubit immediately collapses to the state |0⟩
as well. Similarly, if the first qubit is measured to be in the state |1⟩, then the state of the second
qubit collapses to the state |1⟩ as well.
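
A minimal NumPy sketch of this example (an assumed illustration, not from the text): the Bell state (|00⟩ + |11⟩)/√2 only ever yields the joint outcomes 00 or 11, and its amplitudes cannot be factored into a product of two single-qubit states.

import numpy as np

rng = np.random.default_rng(3)

# Two-qubit Bell state (|00> + |11>)/sqrt(2) in the basis order |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # only |00> and |11> have nonzero probability

# Sampling joint measurement outcomes shows the perfect correlation described above:
# whenever the first qubit is found in |0>, so is the second, and likewise for |1>.
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)

# The state cannot be written as a product of two single-qubit states: for a product
# state, the amplitude of |00> times that of |11> would equal that of |01> times |10>.
print(bell[0] * bell[3], "vs", bell[1] * bell[2])   # 0.5 vs 0.0 -> entangled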

In quantum computing, entanglement is a powerful enabler of quantum parallelism and
coordination. While classical computers manipulate individual bits, quantum algorithms take
advantage of entangled qubits to operate on a vast number of states simultaneously, allowing for
exponentially more complex computations. Entanglement also plays a vital role in quantum
teleportation, which does not transmit matter but instead transfers a quantum state from one
location to another using pre-shared entangled qubits and classical communication.

Entanglement enables quantum computers to implement various protocols and algorithms that
are not possible with classical systems. For example, it is used in quantum teleportation, which
allows for the transfer of quantum states between two distant systems. Entanglement is also a
key resource for quantum error correction, which is necessary to protect quantum information
from decoherence and other errors. By creating and manipulating entangled states, quantum
computers can detect and correct errors in a way that is not possible for classical computers.

Entanglement is also essential in quantum cryptography, particularly in protocols like Quantum
Key Distribution (QKD), where the security of the communication is guaranteed by the laws of
quantum physics rather than mathematical assumptions. If an eavesdropper tries to intercept the
entangled signal, the disturbance would be immediately evident to the legitimate users.

Another key use of entanglement is in quantum error correction. Quantum information is fragile
and susceptible to noise and decoherence. By entangling qubits cleverly, quantum error
correction schemes can detect and recover from errors without disturbing the original
information—something impossible in classical computing. Entangled states serve as the
building blocks for logical qubits, which are more stable and can be used for extended
computations.

Overall, quantum entanglement redefines the limits of communication, computation, and
information theory. It challenges our classical intuitions and offers a new paradigm where non-
local correlations become tools for advanced technologies. As research progresses, entanglement
will continue to be the backbone of innovations in secure communication, high-speed
computation, and precision measurement in the quantum era.

1.2.3 Uncertainty Principle

The Uncertainty Principle, also known as Heisenberg's Uncertainty Principle, is a
fundamental concept in quantum mechanics. It states that:

It is impossible to simultaneously know both the exact position and the exact
momentum of a particle.
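
Quantitatively, the principle is usually written as Δx·Δp ≥ ħ/2. The NumPy sketch below (an illustrative check, not from the text; the 0.1 nm width is an assumed value) builds a Gaussian wave packet, computes its position spread directly and its momentum spread via a Fourier transform, and confirms that the product comes out at the minimum allowed value ħ/2.

import numpy as np

hbar = 1.054e-34                       # reduced Planck constant, J s

# Gaussian wave packet psi(x) with position spread sigma (an assumed value).
sigma = 1e-10                          # 0.1 nm, roughly atomic scale
x = np.linspace(-20 * sigma, 20 * sigma, 2**14)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)        # normalize the wavefunction

# Position uncertainty from the probability density |psi(x)|^2 (<x> = 0 by symmetry).
prob_x = np.abs(psi)**2
dx_unc = np.sqrt(np.sum(x**2 * prob_x) * dx)

# Momentum-space wavefunction via FFT; the momentum is p = hbar * k.
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
phi = np.fft.fft(psi)
prob_p = np.abs(phi)**2
prob_p /= np.sum(prob_p)
dp_unc = hbar * np.sqrt(np.sum(k**2 * prob_p))

print(f"dx * dp = {dx_unc * dp_unc:.3e}  (hbar/2 = {hbar / 2:.3e})")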

The Uncertainty Principle also applies to other pairs of observables, such as energy and
time, and has deep implications for the behaviour of particles in confined systems, like
electrons in atoms. It helps explain phenomena like zero-point energy, where particles
have motion even at absolute zero temperature, and quantum tunneling, where particles
appear to pass through energy barriers.

In quantum computing, the Uncertainty Principle defines limits on how precisely
quantum states can be manipulated or measured. It is also crucial in quantum
cryptography, helping to ensure that any eavesdropping attempt on a quantum
communication channel can be detected, since measurement disturbs the system.

Ultimately, the Uncertainty Principle reshaped our understanding of the universe by
replacing the classical idea of determinism with a probabilistic framework. It teaches us
that the act of measurement in quantum mechanics is not passive—it fundamentally
changes what we observe. This uncertainty is not a weakness but a feature of the quantum
world, revealing its inherently probabilistic and non-deterministic nature.

1.2.4 Wave-particle duality

Wave-particle duality is a fundamental concept in quantum mechanics which reveals that all
quantum objects, including light and matter, exhibit both wave-like and particle-like properties.
This idea emerged from a series of experiments and theoretical developments in the early 20th
century. Light, which was classically understood as a wave due to its ability to interfere and
diffract, was shown by Einstein in 1905 to also behave like a stream of particles called photons
when explaining the photoelectric effect—where light knocks electrons out of a metal surface.
This demonstrated that light has a particle nature as well. Inspired by this, Louis de Broglie in
1924 proposed that matter, such as electrons, should also exhibit wave-like behavior. He
introduced the concept of the matter wave, assigning a wavelength to any particle based on its
momentum using the relation

𝜆 = h/p

where h is Planck’s constant.

This duality was dramatically confirmed by experiments such as the electron double-slit
experiment. When electrons pass through two slits, they produce an interference pattern typical
of waves—even when fired one at a time. However, each electron is detected as a single point-
like impact on the screen, showing its particle nature. The interference pattern only emerges after
many electrons have passed through, revealing the underlying wave-like behavior. This
paradoxical result means that quantum objects cannot be fully described as just particles or just
waves. Instead, their behaviour depends on how they are measured. Wave-particle duality
challenges our classical intuition and suggests that quantum entities exist in a superposition of
possibilities, governed by a probability wave, until a measurement collapses this wave into a
definite outcome.
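
A two-line numerical comparison (the speeds and masses below are assumed example values) shows why the de Broglie wavelength 𝜆 = h/p matters for electrons but is unobservable for everyday objects.

h = 6.626e-34        # Planck's constant, J s
m_e = 9.109e-31      # electron mass, kg

# de Broglie wavelength lambda = h / p for an electron and for a macroscopic object.
v_e = 2.0e6                       # assumed electron speed, m/s
print("electron:", h / (m_e * v_e), "m")        # ~3.6e-10 m, comparable to atomic spacings

m_ball, v_ball = 0.145, 40.0      # a thrown baseball (assumed mass and speed)
print("baseball:", h / (m_ball * v_ball), "m")  # ~1e-34 m, far too small to observe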
This strange duality means that quantum particles do not behave strictly as particles or waves,
but as a blend of both, determined by the measurement setup. If you measure their position, they
appear particle-like; if you observe their path indirectly, they exhibit wave-like interference. This
dual nature defies classical expectations and forces us to adopt a probabilistic interpretation of
nature.
In quantum theory, particles are described by wavefunctions, which represent the probability of
finding a particle in a certain state. Only when a measurement is made does this wavefunction
“collapse” to a single, definite value. Thus, the wave-particle duality reveals that quantum entities
exist in a superposition of possibilities until observed.
This concept not only underpins the foundations of quantum physics but also drives the
functioning of quantum technologies like electron microscopes, quantum computers, and even
lasers. Ultimately, wave-particle duality challenges our intuitive notions of reality and shows that
at a fundamental level, nature behaves in ways that are deeply counterintuitive, yet
experimentally proven.

1.3 Classical vs Quantum mechanics – theoretical comparison

Classical mechanics and quantum mechanics are two distinct frameworks for understanding
physical phenomena. Classical mechanics, formulated primarily by Newton, governs the motion
of macroscopic objects like planets, cars, and projectiles. Quantum mechanics, developed in the
early 20th century, is essential for accurately describing the behavior of microscopic particles
such as electrons, atoms, and photons. The two theories differ fundamentally in their
assumptions, mathematical formalisms, and interpretations of nature.

In classical mechanics, objects have definite positions and velocities at all times. The state of a
system can be precisely predicted using Newton's laws, and the evolution of that system is
deterministic: given initial conditions, the future behavior is uniquely determined. On the other
hand, quantum mechanics introduces inherent indeterminacy. A particle does not have a definite
position or momentum until it is measured. Instead, it is described by a wavefunction, which
encodes a probability distribution over all possible outcomes. The act of measurement collapses
this wavefunction, resulting in a specific observed value.

Classical mechanics relies on continuous variables and smooth trajectories in phase space. In
contrast, quantum mechanics uses discrete quantized energy levels and operates within a
probabilistic framework, governed by operators on Hilbert space and the Schrödinger equation.
While classical systems obey the principle of determinism and locality, quantum systems exhibit
phenomena like superposition, entanglement, and non-locality, which have no classical analogs.

Moreover, classical mechanics is intuitive and aligns with everyday experiences, whereas
quantum mechanics often defies intuition, requiring abstract mathematical tools and accepting
that some aspects of nature are fundamentally unknowable. Despite their differences, classical
mechanics is actually a limiting case of quantum mechanics—it emerges naturally when dealing
with large systems or high energies where quantum effects become negligible. Thus, quantum
mechanics is more fundamental and universal, with classical mechanics being an effective
approximation in the macroscopic world.

Classical mechanics treats motion and energy as continuous, and systems evolve along smooth
trajectories in space and time. Quantum mechanics, however, reveals that energy is quantized—
only specific, discrete energy levels are allowed. It also uses complex mathematical tools like
operators, matrices, and Hilbert spaces, along with the Schrödinger equation, to describe the
evolution of systems.

Another major difference is that classical physics adheres to local realism, assuming that objects
are only influenced by their immediate surroundings. Quantum systems defy this through
entanglement and non-local interactions, where particles can exhibit strong correlations even
across large distances. Additionally, quantum mechanics introduces superposition, where
particles exist in multiple states simultaneously, a concept with no classical counterpart.

Though quantum theory may seem abstract and counterintuitive, it is more fundamental—
classical mechanics turns out to be a special case of quantum mechanics, valid only when dealing
with large objects or systems where quantum effects are negligible. Thus, while classical physics
provides accurate predictions in everyday scenarios, it fails at microscopic scales, where only
quantum mechanics can accurately describe the behavior of matter and energy.
In essence, quantum mechanics reshaped our understanding of reality, replacing certainty with
probability, and introducing a new framework for describing the strange and fascinating world
that lies beneath our everyday experiences.

1.4 Quantum States and Measurement: Nature of Observation

In quantum mechanics, a quantum state represents the complete information about a system and
is typically described by a mathematical function called a wavefunction (denoted by Ψ). This
wavefunction encodes the probabilities of finding the system in various configurations. Unlike
classical systems, quantum systems can exist in a superposition of multiple states
simultaneously, meaning a particle can be in many possible states until a measurement is made.
The measurement process in quantum mechanics is fundamentally different from classical
observation—it is not passive. Instead, observing a quantum system collapses the wavefunction
to a single definite state, chosen probabilistically according to the squared magnitude of the
wavefunction. This collapse is instantaneous and unpredictable, highlighting the probabilistic
nature of quantum systems and the active role of the observer in defining the outcome. The
peculiar nature of measurement leads to non-intuitive phenomena such as wavefunction collapse
and quantum entanglement, where the act of observing one particle instantly affects the state of
another, even across large distances.
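
The following Python sketch (an assumed three-level example, not from the text) illustrates the measurement rule described above: outcome probabilities are the squared magnitudes of the wavefunction coefficients, and once an outcome is obtained the state collapses, so an immediate repeat measurement returns the same result.

import numpy as np

rng = np.random.default_rng(0)

# A three-level system in a superposition; the coefficients are assumed example values.
amps = np.array([0.6, 0.0 + 0.64j, 0.48])
amps = amps / np.linalg.norm(amps)         # ensure the state is normalized
probs = np.abs(amps) ** 2                  # Born rule: P(i) = |c_i|^2
print("probabilities:", probs.round(3))

# A measurement picks one outcome at random with these probabilities; the state then
# collapses onto that basis state, and an immediate second measurement of the same
# observable returns the same result.
outcome = rng.choice(len(amps), p=probs)
collapsed = np.zeros_like(amps)
collapsed[outcome] = 1.0
print("first result:", outcome, " state after collapse:", collapsed)
print("repeat measurement:", rng.choice(len(amps), p=np.abs(collapsed) ** 2))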

This phenomenon shows that the act of measurement is not simply revealing a pre-existing value
but is in fact defining the outcome itself. The system chooses one definite state from the spectrum
of probabilities, and all other possibilities vanish upon observation. This nature of quantum
measurement gives rise to deeply non-intuitive effects such as quantum entanglement, where two
or more particles share a linked state. If one entangled particle is measured, the state of its partner
is instantly determined, no matter the distance between them—a phenomenon that baffled even
Einstein, who referred to it as "spooky action at a distance."

Furthermore, this interaction between observer and system implies that objective reality, as
understood in classical terms, does not always exist independently of observation. Instead, the
observer plays an essential role in shaping the physical outcome. This shift from a deterministic
to a probabilistic and observer-dependent framework is what marks one of the most
fundamental departures of quantum mechanics from classical physics. The study of quantum
states and their measurement continues to influence modern fields such as quantum computing,
quantum cryptography, and quantum teleportation, where the principles of wavefunction
manipulation and collapse are harnessed to perform computations and transmit information in
revolutionary ways.

1.4.1 Overview of Quantum Systems: Electrons, Photons, Atoms

Quantum systems consist of microscopic entities such as electrons, photons, and atoms, which
all exhibit wave-particle duality and are governed by the laws of quantum mechanics. Electrons,
though traditionally thought of as point particles, also behave like waves. This wave nature is
responsible for phenomena like electron diffraction and atomic orbitals. Photons are the quantum
particles of light; they have no rest mass and always move at the speed of light, displaying both
energy quantization (in packets called quanta) and wave-like behavior such as interference.
Atoms are more complex quantum systems made of electrons orbiting a nucleus. In quantum
mechanics, these electrons occupy discrete energy levels or orbitals, and transitions between
levels involve absorption or emission of photons with specific energies. All these systems
demonstrate uniquely quantum effects such as superposition, entanglement, and tunneling—none
of which can be explained using classical physics. These systems form the foundation of modern
technologies such as lasers, semiconductors, quantum dots, and quantum computers.
Electrons, despite being considered point-like particles in classical physics, reveal a wave-like
character at small scales, a fact made evident by experiments such as electron diffraction. Their
dual nature allows them to form standing wave patterns around atomic nuclei, known as orbitals,
which determine the structure of atoms and molecules.
Photons, on the other hand, are massless quantum particles of electromagnetic radiation. They
always travel at the speed of light and carry energy proportional to their frequency, as described
by E = hν. Their wave-particle duality manifests in phenomena like interference and the
photoelectric effect. Photons can also become entangled, making them important in quantum
communication and cryptography. Meanwhile, atoms are composite systems made of nuclei
surrounded by electrons. In a quantum view, electrons do not orbit in classical trajectories but
instead occupy quantized energy states, transitioning between them by absorbing or emitting
photons of discrete energy.
Each of these systems—electrons, photons, and atoms—exhibits hallmark quantum behaviors
including superposition (being in multiple states at once), entanglement (non-local correlation
between particles), and quantum tunneling (the ability to cross classically forbidden barriers).
These phenomena cannot be explained using classical physics and require the probabilistic,
wave-based framework of quantum theory. Understanding these quantum systems is critical
because they are the foundation of modern quantum-enabled technologies. Innovations such as
semiconductors, quantum sensors, lasers, MRI machines, LEDs, and quantum computers all rely
on manipulating the quantum properties of these particles. As our ability to control these systems
improves, their role in computation, communication, and sensing will only grow more significant
in the future of science and technology.

1.5 The Concept of Quantization

Quantization lies at the heart of quantum mechanics, fundamentally changing our understanding
of nature. Unlike classical physics—where properties such as energy or momentum can vary
continuously—quantum mechanics shows that many physical quantities are restricted to discrete
values. This concept was introduced to resolve the inconsistencies observed in classical models,
such as the blackbody radiation problem and the photoelectric effect, where the observed results
could not be explained without assuming that energy comes in discrete packets called quanta.
Max Planck and Albert Einstein were among the pioneers who proposed that light and energy
must be quantized to align with experimental data, setting the stage for a new theory of matter
and energy.

Quantization is a core principle of quantum mechanics which states that certain physical quantities,
like energy, angular momentum, and charge, can only take on discrete values, rather than any
value within a continuous range. This idea is radically different from classical physics, where
such quantities can vary smoothly. The earliest evidence for quantization came from the
blackbody radiation problem and the photoelectric effect, which were explained by assuming
that energy is emitted or absorbed in discrete units called quanta. In atoms, electrons can only
exist in specific quantized energy levels, and transitions between these levels result in the
emission or absorption of photons with fixed frequencies. Quantization is also seen in systems
like the harmonic oscillator, where energy levels are separated by fixed intervals. This discrete
nature of quantum systems is mathematically expressed using operators with eigenvalues
corresponding to observable quantities. Quantization is what gives rise to atomic spectra, the
stability of atoms, and the structure of matter itself, making it a cornerstone of all quantum
theories.
In atomic systems, quantization becomes especially evident. Electrons in atoms cannot occupy
arbitrary energy levels; instead, they are found only in certain allowed states. When an electron
transitions between these levels, it absorbs or emits a photon with a specific frequency, giving
rise to the spectral lines seen in emission and absorption spectra. This phenomenon is responsible
for the stability of atoms and the unique identity of elements. Similar principles apply to
rotational and vibrational states of molecules, which are also quantized and form the basis of
various spectroscopic techniques.
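
As a concrete illustration of such quantized vibrational levels, the Python sketch below (the vibrational frequency is an assumed, CO-like value, not from the text) evaluates the equally spaced harmonic-oscillator energies E_n = (n + ½)ħω and the wavelength of the photon emitted in a transition between adjacent levels.

import numpy as np

h = 6.626e-34          # Planck's constant, J s
hbar = h / (2 * np.pi)
c = 2.998e8            # speed of light, m/s

# Quantized harmonic-oscillator levels E_n = (n + 1/2) * hbar * omega: the spacing
# between adjacent levels is constant, so every n -> n-1 transition emits a photon
# of the same energy.
omega = 2 * np.pi * 6.4e13          # assumed vibrational angular frequency, rad/s
for n in range(4):
    E_n = (n + 0.5) * hbar * omega
    print(f"n = {n}: E = {E_n:.3e} J")

dE = hbar * omega                   # spacing between adjacent levels
print(f"level spacing {dE:.3e} J -> photon wavelength {h * c / dE * 1e6:.2f} micrometres")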
Quantization is not limited to energy. Angular momentum, spin, and even electric charge can
also be quantized, leading to surprising consequences in both microscopic systems and
macroscopic quantum phenomena like superconductivity and quantum Hall effects. The
mathematics of quantization involves solving operator equations, where only certain values
(called eigenvalues) correspond to physical observables. This discrete nature of reality is not just
a mathematical curiosity—it underpins the structure and behavior of matter at the most
fundamental level.
Understanding why quantum mechanics is essential goes beyond explaining atomic structure—
it is about embracing a radically different view of reality. Classical physics fails to explain
phenomena like entanglement, superposition, and tunneling, all of which are routinely observed
in quantum systems. Quantum mechanics accounts for these behaviors through a probabilistic
and non-deterministic framework, where the act of measurement plays a crucial role in
determining outcomes.
Moreover, quantum theory is not just a theoretical success; it has practical, transformative
applications. Technologies such as semiconductors, lasers, magnetic resonance imaging (MRI),
LEDs, and atomic clocks are direct outcomes of quantum principles. Even more revolutionary
are the emerging fields of quantum computing, quantum cryptography, and quantum sensing,
which promise to outperform classical technologies in speed, security, and sensitivity.

In essence, quantum mechanics is not just a scientific necessity—it is a technological enabler. It
reshapes our understanding of the universe and opens new frontiers in computing,
communication, and materials science. Studying quantum concepts is therefore not only vital for
physicists, but also for engineers, computer scientists, and innovators of the future.

1.6 Why Quantum?

Quantum mechanics is not just a theoretical breakthrough in physics—it marks a radical shift in
how we understand and interact with the universe at the most fundamental level. Traditional
classical theories fail to explain the behavior of microscopic particles like electrons, photons, and
atoms. Quantum theory fills this gap by accurately describing the probabilistic and non-
deterministic nature of such particles. Its predictions have been experimentally verified with
extraordinary precision, making it a cornerstone of modern physics. More importantly, quantum
mechanics forms the foundation for transformative advancements in technology, computing, and
security, answering both scientific curiosity and real-world challenges.

1.6.1 Strategic Significance

The strategic value of quantum technologies is increasingly recognized by governments and
industries worldwide. Quantum computing, for instance, has the potential to break classical
encryption methods, posing both risks and opportunities for national security. Similarly, quantum
communication promises ultra-secure information transfer using Quantum Key Distribution
(QKD), which is theoretically unhackable. Nations are investing heavily in quantum research to
ensure leadership in this frontier domain, as it offers a significant edge in defence, surveillance,
intelligence, and cybersecurity. As a result, mastery of quantum technologies is becoming a
critical component of geopolitical power and technological sovereignty.

Governments worldwide are ramping up funding, launching national missions, and forming
alliances to stay competitive in this field. The ability to control and implement quantum systems
will influence power dynamics globally—shaping military capabilities, intelligence operations,
and secure digital infrastructure. As a result, expertise in quantum science is fast becoming a
determinant of geopolitical and economic strength.
1.6.2 Scientific Significance

Quantum mechanics stands as one of the most profound scientific achievements of the 20th
century, dramatically expanding our understanding of the physical world. Scientifically, quantum
mechanics has revolutionized our understanding of nature. It explains phenomena that classical
physics cannot, such as superconductivity, quantum tunneling, and the behavior of particles in
extreme conditions. Quantum theory has also laid the groundwork for fields like quantum
chemistry, condensed matter physics, and particle physics. It enables the modeling of complex
systems with high accuracy, leading to discoveries in materials science, nanotechnology, and
fundamental physics. Beyond practical uses, quantum mechanics continues to challenge our
philosophical notions of reality, causality, and measurement, making it a profoundly rich area of
ongoing scientific inquiry.
Quantum theory allows scientists to model atomic and subatomic systems with remarkable
precision, leading to the discovery of new materials and deeper insights into the behavior of
matter and energy.

1.6.3 Technological Significance

Quantum technologies are poised to bring transformative changes to the technological landscape,
redefining the future of computation, communication, sensing, and imaging. Quantum computers
can solve certain classes of problems exponentially faster than
classical computers, with potential applications in drug discovery, optimization, machine
learning, climate modelling, and logistics. Quantum sensors can measure time,
magnetic fields, and gravitational forces with unprecedented precision, useful in GPS systems,
medical diagnostics, and geological surveys. Meanwhile, quantum cryptography offers solutions
for secure digital infrastructure. These innovations are not distant dreams—they are already in
early stages of development, and their practical impact is beginning to emerge, setting the stage
for the next technological revolution.
In communication, quantum encryption could underpin a new era of ultra-secure digital
infrastructure. These applications, once theoretical, are now transitioning into real-world pilots
and commercial prototypes—marking the beginning of a new age where quantum mechanics
powers next-generation innovation.
1.7 A Snapshot of Quantum Technologies: Computing, Communication, and Sensing

Quantum technologies are at the forefront of a technological revolution, harnessing the unique
and counterintuitive principles of quantum mechanics—such as superposition, entanglement, and
quantization—to build revolutionary tools that far surpass the capabilities of their classical
counterparts. These technologies are being developed across three primary domains: quantum
computing, quantum communication, and quantum sensing, each offering transformative
potential for science, industry, and everyday life.

Quantum Computing is perhaps the most well-known application of quantum mechanics. Unlike classical computers that process information in binary bits (0 or 1), quantum computers
use quantum bits or qubits, which can exist in superpositions of 0 and 1. This allows quantum
computers to perform many calculations in parallel, enabling them to solve certain problems—
like factoring large numbers, simulating molecular behavior, or optimizing complex systems—
exponentially faster than classical machines. Although still in early stages, companies and
research institutions are racing to achieve "quantum advantage," where a quantum computer
outperforms the best classical supercomputers on useful tasks.

Quantum Communication focuses on the secure transmission of information using quantum phenomena. The most notable technique is Quantum Key Distribution (QKD), which allows
two parties to share a cryptographic key with security guaranteed by the laws of physics. If an
eavesdropper tries to intercept the key, the quantum state is disturbed, alerting the communicating
parties. Quantum communication can also involve quantum teleportation, where the state of a particle is transferred to a distant particle using shared entanglement together with a classical channel. As global digital
infrastructure becomes increasingly vulnerable, quantum communication promises
unprecedented levels of security for sensitive data.
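To make the key-sifting step of QKD more concrete, the following toy Python sketch (illustrative only, not a faithful BB84 implementation) simulates the basic idea: the sender encodes random bits in randomly chosen bases, the receiver measures in randomly chosen bases, and only the rounds where the two bases match are kept as the shared key.

import numpy as np

rng = np.random.default_rng(42)
n = 20
alice_bits = rng.integers(0, 2, n)    # Alice's secret random bits
alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal basis
bob_bases = rng.integers(0, 2, n)     # Bob picks his bases independently

# When the bases match, Bob reads Alice's bit correctly; when they differ,
# his measurement outcome is random (the measurement disturbs the state).
bob_bits = np.where(alice_bases == bob_bases,
                    alice_bits,
                    rng.integers(0, 2, n))

# Sifting: keep only the rounds where both parties used the same basis.
keep = alice_bases == bob_bases
sifted_key = alice_bits[keep]
print("Sifted key:", sifted_key)
print("Bob agrees on all kept bits:", np.array_equal(sifted_key, bob_bits[keep]))

In a real protocol, the two parties would additionally sacrifice a random subset of the sifted key to estimate the error rate; a raised error rate signals a possible eavesdropper, which is exactly the disturbance property described above.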

Quantum Sensing exploits the extreme sensitivity of quantum systems to environmental changes, enabling measurements with unprecedented precision. Quantum sensors can detect
minute variations in gravitational fields, magnetic fields, acceleration, and time. Applications
range from medical diagnostics (e.g., highly sensitive brain scans using quantum
magnetometers), to underground exploration, to navigation systems that do not rely on GPS.
Atomic clocks—among the most accurate devices ever built—are based on quantum transitions
and are crucial for global positioning and timekeeping systems.
Together, these quantum technologies are not isolated innovations—they are part of a rapidly
evolving ecosystem that is expected to reshape the technological landscape in the coming
decades.

As these technologies mature, they will not remain isolated solutions but will become deeply
integrated into a wide range of applications. Quantum computing could redefine how we solve
scientific and industrial problems. Quantum communication may establish new standards of
digital security. Quantum sensing is set to improve how we measure, observe, and navigate the
world. Together, these advances signal a shift toward a new era of quantum-enhanced innovation
that will shape the future of multiple sectors including finance, defense, transportation, health,
and information technology.

1.8 National and Global Quantum Missions

As quantum technologies emerge as a critical area of innovation and national interest, several
countries have launched ambitious quantum missions to secure strategic and technological
leadership. These initiatives aim to develop quantum computing, communication, and sensing
capabilities through coordinated investments in research, infrastructure, and talent development.
India, along with major global powers like the USA, China, and the European Union, is actively
building its presence in the quantum landscape.

Fig: Global commitment to accelerate quantum for society solutions


Fig: Announced government investments in quantum research and commercialization around
the world

China leads in quantum communications and roughly matches the United States in sensing, where it excels in market-ready technology; in quantum computing, however, the United States remains ahead and dominates the highest-impact research areas.

1.8.1 India’s National Quantum Mission (NQM)

India launched its National Quantum Mission in 2023, with a budget of ₹6,003 crore
(approximately $730 million) over eight years. The mission seeks to position India among the
top quantum nations by developing indigenous capabilities in quantum computing, quantum
communication, quantum sensing, and quantum materials. It aims to establish four Thematic
Hubs (T-Hubs) in leading academic and research institutions focusing on foundational
technologies. The NQM also plans to build intermediate-scale quantum computers (with 50–
1000 qubits), develop quantum key distribution networks, and promote workforce training and
international collaborations. The mission aligns with India’s larger vision of self-reliance in
strategic technologies and aims to boost national security, telecommunications, and advanced
research.
Fig: India Approves National Quantum Mission

1.8.2 European Union (EU) Quantum Flagship

The EU Quantum Flagship is a €1 billion, 10-year initiative launched in 2018 to unify Europe’s
fragmented quantum research landscape. It supports hundreds of research institutions, startups,
and industries across member states. The program focuses on four main areas: quantum
communication, quantum simulation, quantum computing, and quantum metrology. The
EU also promotes infrastructure projects like the European Quantum Communication
Infrastructure (EuroQCI), which aims to establish a secure pan-European quantum
communication network. This mission reflects Europe’s intent to compete globally while
fostering innovation, industrial adoption, and academic excellence in quantum science.
1.8.3 United States: National Quantum Initiative Act

The United States formalized its quantum strategy with the National Quantum Initiative Act
passed in 2018. This act coordinates efforts across government agencies, including the
Department of Energy (DOE), National Science Foundation (NSF), and National Institute of
Standards and Technology (NIST), with significant funding and collaboration with private sector
leaders like IBM, Google, and Microsoft. The National Quantum Coordination Office
oversees these efforts, focusing on quantum research, technology transfer, education, and the
creation of quantum research centers. The U.S. aims to maintain its technological leadership,
secure supply chains, and harness quantum advantages for national security, scientific progress,
and economic growth.

Fig: USA must dominate quantum technology

1.8.4 China: Quantum Strategic Advantage

China has emerged as a global leader in quantum technology through sustained state-led
investment and rapid deployment. It has achieved several milestones, including launching the
world’s first quantum communication satellite (Micius) and demonstrating satellite-based
quantum key distribution over thousands of kilometers. China also leads in building a
nationwide quantum communication backbone network, connecting major cities through
ultra-secure fiber links. The Chinese government has reportedly invested billions of dollars in
quantum R&D, and projects like the National Laboratory for Quantum Information Science
in Hefei aim to consolidate China’s dominance in this space. China views quantum technologies
as essential to future economic and military competitiveness.
Fig: How Innovative Is China in Quantum?

These missions reflect a global “quantum race”, where nations recognize that quantum
supremacy could redefine cybersecurity, artificial intelligence, defense, and economic structures.
International collaboration, balanced with strategic competition, will shape the trajectory of
quantum innovation in the coming decades, with each nation seeking to leverage quantum
breakthroughs for economic growth, defense strength, and scientific prestige.

Reference : https://tifac.org.in/images/nmqta/concept_note12.06.19.pdf
SYLLABUS

Unit 2

Theoretical Structure of Quantum Information Systems


What is a qubit? Conceptual understanding using spin and polarization, Comparison:
classical bits vs quantum bits, Quantum systems: trapped ions, superconducting circuits,
photons (non-engineering view), Quantum coherence and decoherence – intuitive
explanation, Theoretical concepts: Hilbert spaces, quantum states, operators – only
interpreted in abstract, The role of entanglement and non-locality in systems, Quantum
information vs classical information: principles and differences, Philosophical implications:
randomness, determinism, and observer role
Introduction

Quantum Information Systems represent a transformative approach to computation and communication, fundamentally leveraging the principles of quantum mechanics to process
and transmit information. At the heart of these systems lies quantum theory, which introduces
novel concepts such as superposition, entanglement, and quantum measurement, radically
differing from classical information theory.

In contrast to classical bits that exist in a definite state of 0 or 1, quantum bits or qubits can
exist in a superposition of both states simultaneously. This characteristic allows quantum
systems to perform parallel computations, offering exponential speedups for certain classes
of problems.
The theoretical foundation of Quantum Information Systems is built upon:

1. Quantum Mechanics: Core principles such as wavefunction, unitary evolution, and measurement theory form the basis for information processing in quantum systems.
2. Qubits and Quantum Gates: Analogous to classical logic gates, quantum gates
manipulate qubits using unitary operations, enabling the construction of quantum
circuits.
3. Quantum Entanglement: A uniquely quantum phenomenon where the states of two
or more qubits become interdependent, regardless of spatial separation, enabling
powerful communication and computation protocols.
4. Quantum Algorithms and Complexity: Algorithms such as Shor’s for factoring and
Grover’s for search illustrate the advantages of quantum computation over classical
approaches.
5. Quantum Error Correction: Due to the fragile nature of qubits, robust error
correction techniques are essential for practical and scalable quantum computing.
6. Quantum Communication: Protocols like quantum teleportation and quantum key
distribution exploit entanglement and superposition to enable secure information
transfer.

The study of these theoretical structures not only lays the groundwork for quantum computing
and quantum cryptography but also contributes to the understanding of information itself in
a fundamentally new light.
What is a qubit?
A qubit, or quantum bit, is the fundamental unit of information in a quantum computer.
Unlike a classical bit, which can be either 0 or 1, a qubit can exist in a superposition of both
states simultaneously, represented as |0⟩, |1⟩, or any complex linear combination α|0⟩ + β|1⟩,
where α and β are complex probability amplitudes. This superposition allows quantum
systems to process vast amounts of information in parallel, enabling certain computations to
be executed exponentially faster than their classical counterparts.

Qubits can also exhibit
entanglement, a uniquely quantum phenomenon where the state of one qubit is dependent on
the state of another, regardless of the distance between them. This allows for highly correlated
systems that are essential for quantum logic operations. Another key property is quantum
interference, which enables quantum algorithms to amplify correct computational paths while
canceling out incorrect ones. Qubits are extremely delicate and susceptible to noise, so
maintaining coherence—the time over which a qubit retains its quantum state—is a major
challenge.

Various physical systems can be used to realize qubits, including superconducting
circuits (used by IBM and Google), trapped ions (IonQ), photons (PsiQuantum), quantum
dots, and NV centers in diamond. Each technology comes with trade-offs in terms of gate
speed, error rates, scalability, and environmental requirements. A qubit must be initializable,
controllable via quantum gates, measurable, and able to participate in entangling operations.
Typically, multiple physical qubits are needed to form a logical qubit that is protected by
quantum error correction codes, due to the inherent instability of quantum states. These
logical qubits serve as the robust foundation for large-scale, fault-tolerant quantum
computation.

Qubit manipulation is performed using finely tuned pulses of microwave,
optical, or radio-frequency energy, depending on the implementation. The Bloch sphere is
often used to visually represent a qubit’s state, where the poles correspond to |0⟩ and |1⟩, and
any point on the sphere’s surface represents a superposition. Reading a qubit’s state involves
a measurement, which collapses the qubit into one of the basis states (0 or 1) probabilistically,
determined by |α|² and |β|². This collapse is irreversible, and thus, quantum information must
be processed carefully before measurement.

Qubits are the heart of all quantum algorithms,
including Shor’s factoring algorithm and Grover’s search algorithm. The power of a quantum
computer scales not linearly but exponentially with the number of coherent, entangled qubits,
making them uniquely powerful for problems involving massive state spaces. Developing
stable, high-fidelity, scalable qubit systems is one of the grand engineering challenges of our
time. Current quantum systems range from a few to several hundred qubits, but building a
fault-tolerant quantum computer will require millions of physical qubits operating in
synchrony. Despite their potential, qubits remain deeply complex and demand sophisticated
hardware, control electronics, cryogenics, and quantum software stacks. Ultimately, a qubit
is not just a data unit—it is a gateway to an entirely new computational paradigm governed
by the laws of quantum mechanics.
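As a small numerical illustration of the state α|0⟩ + β|1⟩ and the Born-rule probabilities |α|² and |β|² described above, the following Python sketch (with arbitrarily chosen amplitudes) normalizes a qubit state and simulates repeated measurements.

import numpy as np

# Pick arbitrary complex amplitudes and normalize so |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 + 1j, 2 - 0.5j
norm = np.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
alpha, beta = alpha / norm, beta / norm

state = np.array([alpha, beta])             # |psi> = alpha|0> + beta|1>
print("State vector:", np.round(state, 3))

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2    # Born-rule probabilities
print(f"P(0) = {p0:.3f}, P(1) = {p1:.3f}")  # the two probabilities sum to 1

# Each measurement collapses the qubit to 0 or 1 with these probabilities
rng = np.random.default_rng(seed=7)
outcomes = rng.choice([0, 1], size=1000, p=[p0, p1])
print("Observed frequency of outcome 1:", outcomes.mean())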

Conceptual understanding using spin and polarization


SYLLABUS

Unit 3

Building a Quantum Computer – Theoretical Challenges and Requirements
What is required to build a quantum computer (conceptual overview)?, Fragility of quantum systems: decoherence, noise, and control, Conditions for a functional quantum system: Isolation, Error management, Scalability, Stability, Theoretical barriers: Why maintaining entanglement is difficult, Error correction as a theoretical necessity, Quantum hardware platforms (brief conceptual comparison), Superconducting circuits, Trapped ions, Photonics, Vision vs reality: what's working and what remains elusive, The role of quantum software in managing theoretical complexities
3.0 Introduction

Quantum computing represents a revolutionary paradigm shift in the field of computation, promising exponential speed-ups for specific classes of problems that are infeasible for classical computers. Figure 3.1 shows the blueprint for a practical quantum computer. However, building a functional and scalable quantum computer remains a profound
scientific and engineering challenge. This challenge is rooted not only in technological
constraints but also in deep theoretical issues that must be addressed to harness the full power of
quantum mechanics.

At the heart of a quantum computer lies the qubit — a quantum bit capable of existing in a
superposition of states. While this property enables parallelism and quantum interference, it also
introduces extreme sensitivity to environmental noise and errors. Maintaining quantum
coherence and achieving fault-tolerant computation are among the primary theoretical obstacles.

Fig 3.1: Blueprint for a Practical Quantum Computer


Theoretical challenges in building a quantum computer include:

* Quantum Decoherence and Error Correction: Qubits are highly susceptible to decoherence due
to interactions with their environment. Designing error-correcting codes that preserve quantum
information without direct measurement is a critical requirement.

* Qubit Scalability and Connectivity: Developing scalable architectures that support a large
number of qubits, along with efficient inter-qubit connectivity, is essential for executing complex
quantum algorithms.

* Quantum Gate Fidelity: High-precision control of quantum gates and operations is necessary
to ensure reliable computation. Theoretical models must support the design of gates that meet
fault-tolerance thresholds.

* Measurement and Readout: Extracting information from quantum states without disturbing
them significantly poses both theoretical and practical difficulties.

* Universal Quantum Computation: Establishing the minimum set of gates and operations
necessary for universal computation is a key theoretical concern in quantum computer design.

* Physical Implementation Models: Each physical platform — including superconducting qubits, trapped ions, topological qubits, and photonic systems — has its own theoretical framework and
constraints, which must be rigorously analyzed and optimized.

This topic integrates quantum mechanics, computer science, and information theory to explore
the foundational principles required to construct a working quantum computer. Addressing these
theoretical challenges is vital to transforming quantum computing from a scientific curiosity into
a practical and transformative technology.

3.1 What is required to build a quantum computer (conceptual overview)?


Building a quantum computer is not just about assembling hardware—it's about engineering a
system that can faithfully implement the laws of quantum mechanics while overcoming
significant physical, technical, and theoretical barriers. Unlike classical computers, which
manipulate binary bits using transistors and electrical circuits, quantum computers manipulate
qubits—quantum bits that rely on phenomena like superposition, entanglement, and interference.
To make this possible, several core requirements must be met, often referred to as the DiVincenzo
Criteria, proposed by quantum physicist David DiVincenzo.

To build a quantum computer, one must design and integrate a highly complex system that
leverages quantum mechanical phenomena to process information. Unlike classical computers
that use binary bits (0 or 1), quantum computers use quantum bits or qubits, which can exist in
multiple states simultaneously due to superposition and entanglement. Below is a detailed
conceptual overview of what is required to build a quantum computer.

3.1.1. Qubits: The Fundamental Building Block


A quantum computer begins with qubits, which are the quantum analogs of classical bits. Qubits
can exist in a superposition of 0 and 1 and can be entangled with other qubits, enabling powerful
computational capabilities.
Types of physical implementations of qubits:
• Superconducting circuits (e.g., IBM, Google)
• Trapped ions (e.g., IonQ, Honeywell)
• Photons (optical qubits)
• Topological qubits (still theoretical)
• Quantum dots or spin-based qubits

3.1.2. Initialization and Control


Qubits must be initialized to a known state (typically |0⟩) before computation begins. Reliable
control over qubits using external signals (microwaves, lasers, or magnetic fields) is essential to
manipulate their quantum states for logic operations.
Key requirements:
• Precise quantum gate operations (single-qubit and two-qubit gates)
• Low noise and minimal external interference
• High-speed control and synchronization mechanisms

3.1.3. Quantum Gates and Circuits


Quantum logic gates operate on qubits and are the building blocks of quantum circuits. Common
quantum gates include:
• Pauli gates (X, Y, Z)
• Hadamard gate (H)
• Phase gates (S, T)
• Controlled-NOT (CNOT) gate

To perform algorithms, these gates are combined into quantum circuits following the rules of
unitary evolution governed by quantum mechanics.
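As a simple numerical sketch of how these gates compose into a circuit, the following Python example uses plain NumPy matrices (rather than any particular quantum framework) to apply a Hadamard followed by a CNOT to the state |00⟩, producing the entangled Bell state (|00⟩ + |11⟩)/√2.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)                                   # identity on one qubit
CNOT = np.array([[1, 0, 0, 0],                  # control = first qubit,
                 [0, 1, 0, 0],                  # target = second qubit,
                 [0, 0, 0, 1],                  # in the basis |00>,|01>,|10>,|11>
                 [0, 0, 1, 0]])

ket00 = np.array([1, 0, 0, 0])                  # two qubits initialized to |00>

# Circuit: Hadamard on qubit 0, then CNOT -- a sequence of unitary operations
bell = CNOT @ np.kron(H, I) @ ket00
print(np.round(bell, 3))   # [0.707 0. 0. 0.707] = (|00> + |11>)/sqrt(2)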

3.1.4. Quantum Coherence and Decoherence Management


Coherence refers to the ability of a quantum system to maintain its quantum state over time.
Decoherence is the loss of this property due to environmental noise and interactions.
Challenges:
• Qubits must have long coherence times.
• Isolation from the environment is crucial.
• Use of cryogenics (e.g., dilution refrigerators) to reduce thermal noise in some systems.

3.1.5. Quantum Error Correction


Because quantum systems are fragile and prone to errors, error correction is vital for practical
quantum computing. Quantum error-correcting codes (QECCs) protect information without
measuring it directly.
Common approaches:
• Shor code
• Surface codes
• Concatenated codes
Requirements:
• Redundant encoding of logical qubits into multiple physical qubits
• Frequent error syndrome measurement
• Fault-tolerant implementation of gates
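The redundancy idea behind these codes can be illustrated with a deliberately simplified analogue of the three-qubit bit-flip code. The short Python sketch below is illustrative only: real quantum codes such as the Shor or surface code also correct phase errors and extract error syndromes without reading the data qubits directly, which a plain majority vote cannot capture.

import numpy as np

def encode(bit):
    """Encode one logical bit redundantly: 0 -> 000, 1 -> 111."""
    return np.array([bit, bit, bit])

def apply_bit_flip(codeword, position):
    """Simulate a single bit-flip (X-type) error on one physical qubit."""
    corrupted = codeword.copy()
    corrupted[position] ^= 1
    return corrupted

def recover(codeword):
    """Majority vote restores the logical value if at most one qubit flipped."""
    return int(codeword.sum() >= 2)

corrupted = apply_bit_flip(encode(1), position=2)
print("Corrupted codeword:", corrupted)               # [1 1 0]
print("Recovered logical bit:", recover(corrupted))   # 1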

3.1.6. Scalability
A universal quantum computer must scale to hundreds or thousands of qubits. This involves:
• Modular design of qubit systems
• Inter-qubit connectivity (nearest-neighbor or all-to-all coupling)
• Integration with control electronics and hardware

3.1.7. Readout and Measurement


At the end of a quantum computation, the final quantum state must be measured to obtain
classical output. Measurement needs:
• High-fidelity and fast measurement techniques
• Minimal disturbance to unmeasured qubits (in mid-circuit measurement scenarios)
• Repeated measurements for probabilistic outputs

3.1.8. Quantum Software and Algorithms


Quantum algorithms exploit the unique features of quantum mechanics to solve problems more
efficiently than classical algorithms.
Examples:
• Shor’s algorithm for factoring
• Grover’s algorithm for search
• Quantum simulations for chemistry and materials science
A high-level software stack is required for:
• Programming (using languages like Qiskit, Cirq, or Q#)
• Compilation into low-level quantum gates
• Error-aware execution and scheduling

3.1.9. Quantum Hardware Infrastructure


To physically implement the above components, extensive infrastructure is needed:
• Cryogenic systems (especially for superconducting qubits)
• Vacuum systems (for ion trap qubits)
• Lasers, microwave generators, and optical systems
• High-speed electronics and classical co-processors
• Shielding from electromagnetic interference

3.1.10. Integration and Control Architecture


An orchestrated control system must:
• Coordinate qubit initialization, gate operations, and measurements
• Handle timing synchronization and feedback
• Interface classical and quantum components in hybrid architectures

3.1.11. Validation and Verification


Because of the probabilistic nature of quantum computation, verifying correctness is non-trivial.
Techniques include:
• Tomography (quantum state/process)
• Benchmarking (randomized or cross-entropy)
• Classical simulation of small quantum systems for comparison

3.1.12. Quantum Networking (for future scaling)


In distributed quantum computing or quantum internet settings, entanglement between qubits
across different machines will be necessary. This requires:
• Quantum repeaters
• Entanglement swapping and purification protocols
• Quantum communication interfaces (quantum teleportation)

To build a quantum computer, we must combine:
• A physical platform to store and process qubits
• Tools for initializing, controlling, and measuring them
• Methods for preserving quantum states long enough to compute
• Robust systems for error correction and fault tolerance

It’s a deeply interdisciplinary effort, involving physics, electrical engineering, computer science,
and materials science. Only by integrating all these elements can we move from small test
systems to powerful, large-scale quantum computers capable of solving the world’s hardest
problems.

1. Scalable Physical System with Qubits


The most basic requirement is a physical system that can represent qubits. These qubits can be
realized using trapped ions, superconducting circuits, quantum dots, photons, or atoms. The
system must be scalable, meaning that we can increase the number of qubits without losing
control or coherence. Scalability is crucial because real-world problems often require hundreds
to millions of qubits, and managing their states becomes exponentially more complex.

2. Initialization of Qubits to a Known State


Before computation begins, all qubits must be reliably initialized to a known reference state,
typically |0⟩. This is similar to resetting classical memory before use. In quantum systems,
initialization can be challenging due to thermal noise and environmental interactions, so the
system must be cooled or isolated to ensure clean starting states.

3. Long Coherence Time


Coherence refers to the ability of qubits to maintain their quantum state over time. The longer a
qubit remains coherent, the more complex computations it can perform. Unfortunately, quantum
states are extremely fragile and prone to decoherence—loss of quantum behavior due to
interaction with the environment. Building systems with long coherence times requires shielding,
cooling, and highly stable hardware components.

4. Universal Set of Quantum Gates


A quantum computer must be able to apply a universal set of quantum logic gates to manipulate
qubits. These gates are the building blocks of quantum circuits, just like AND/OR/NOT gates in
classical circuits. Essential gates include single-qubit gates (like the Hadamard and Pauli-X) and
multi-qubit gates (like the CNOT gate) that enable entanglement. Together, they must be able to
perform any quantum computation.

5. Qubit-Specific Measurement
After quantum computation, the result must be read out by measuring the qubits. This
measurement collapses the qubits into classical 0 or 1 values. It is vital that each qubit can be
measured individually, reliably, and without disturbing others. High-fidelity measurement is
essential for accurate output.

6. Quantum Error Correction


Due to the fragile nature of quantum states, errors are inevitable. However, unlike classical errors,
quantum errors involve not just flipping bits but also changing phases. Quantum error correction
(QEC) techniques are essential to protect against decoherence and operational errors.
Implementing QEC requires encoding logical qubits into groups of physical qubits and
detecting/correcting errors without collapsing the quantum state.

7. Control and Interconnects

Quantum computers require precise control systems to manipulate qubits with microwaves,
lasers, or magnetic fields. The architecture must include high-speed control electronics and
low-noise communication channels between qubits. As systems grow in size, creating efficient
interconnects and control networks becomes even more challenging.

8. Fault-Tolerance and Scalability


A practical quantum computer must be fault-tolerant, meaning it can perform long computations
even when some components fail or introduce errors. This involves building redundancy into
both the physical qubits and control logic. It also means the architecture should scale from small
labs to commercial-grade machines without degradation in performance or reliability.

3.2 Fragility of quantum systems:

Quantum systems, the foundation of quantum computing and quantum information science, are
governed by the principles of quantum mechanics—namely superposition, entanglement, and
coherence. These properties enable quantum computers to process information in powerful new
ways. However, quantum systems are inherently fragile, meaning they are highly sensitive to
external disturbances, environmental interactions, and imperfections in control mechanisms.

The fragility of quantum systems is one of the most critical challenges in realizing practical
quantum computing. Tiny interactions with the environment can destroy the delicate quantum
states—an effect known as decoherence. Moreover, even small inaccuracies in quantum gate
operations or fluctuations in temperature or electromagnetic fields can introduce errors. Because
quantum information cannot be cloned (as per the no-cloning theorem), standard redundancy and
error-handling techniques from classical computing do not apply directly.

This fragility necessitates stringent control over qubit environments, high-fidelity operations, and
the development of sophisticated quantum error correction strategies. Understanding and
mitigating the fragility of quantum systems is central to building stable, scalable, and fault-
tolerant quantum technologies.

3.2.1 Decoherence

Decoherence is one of the most fundamental and problematic challenges in quantum computing.
It refers to the process by which a quantum system loses its quantum mechanical properties—
particularly superposition and entanglement—due to interactions with the surrounding
environment. In theory, a qubit can exist in a coherent superposition of both 0 and 1, allowing
quantum computers to perform complex parallel calculations. However, in practice, qubits are
never completely isolated. They interact with stray electromagnetic fields, nearby particles,
thermal energy, and even cosmic radiation. These tiny interactions disturb the quantum state,
forcing it to "collapse" into a definite classical state, destroying the computation. Unlike classical
bits, which are stable under most conditions, qubits are fragile and highly sensitive. The
timeframe during which a qubit retains its coherence is known as the coherence time, and this is
often very short—ranging from microseconds to milliseconds depending on the hardware. The
shorter the coherence time, the fewer quantum operations (gates) can be performed reliably.
Extending coherence time is one of the central goals of quantum hardware design, and it requires
extreme isolation techniques, cryogenic temperatures, and highly pure materials. Until
decoherence is significantly minimized or managed with effective quantum error correction,
building large-scale, reliable quantum computers will remain a formidable task.
Decoherence is the process by which a quantum system loses its quantum behavior and begins
to behave classically due to interactions with its environment.

For example, if a qubit in superposition |0⟩ + |1⟩ interacts with a photon, it may end up in either
|0⟩ or |1⟩, destroying the computation. Mathematically, decoherence is modeled as the decay of
off-diagonal terms in the system’s density matrix.
Common types of decoherence:
• Dephasing (loss of relative phase between |0⟩ and |1⟩)
• Amplitude damping (loss of energy from excited to ground state)
Decoherence Time (T2): The characteristic time over which a qubit remains coherent. Longer T2 times are desirable for computation.
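The decay of these off-diagonal density-matrix terms can be visualized with a short Python sketch; the T2 value used here is an assumed, illustrative figure rather than a property of any specific hardware.

import numpy as np

# Start in the equal superposition (|0> + |1>)/sqrt(2)
psi = np.array([1, 1]) / np.sqrt(2)
rho0 = np.outer(psi, psi.conj())        # density matrix |psi><psi|

T2 = 50e-6                               # assumed coherence time: 50 microseconds
for t in [0, 25e-6, 50e-6, 100e-6]:
    decay = np.exp(-t / T2)
    rho_t = rho0.copy()
    rho_t[0, 1] *= decay                 # off-diagonal (coherence) terms decay,
    rho_t[1, 0] *= decay                 # while the diagonal populations stay fixed
    print(f"t = {t * 1e6:5.1f} us   coherence term = {rho_t[0, 1]:.3f}")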

Fig 3.2: Quantum decoherence as characterized by bit-flips and phase-flips

3.2.2 Quantum Noise: The Enemy of Accuracy


Closely related to decoherence is the concept of quantum noise, which refers to unwanted and
random variations in a quantum system that introduce errors during computations. In classical
systems, noise often manifests as minor fluctuations in voltage or current, which can be filtered
or tolerated due to the digital nature of classical bits. But quantum systems, operating at the level
of probability amplitudes, are far more susceptible. Noise can arise from imperfections in the
material, inconsistencies in control pulses, or environmental vibrations and electromagnetic
interference. In a quantum processor, even tiny noise levels can cause bit-flip errors (where |0⟩
becomes |1⟩ or vice versa) or phase-flip errors (which affect the relative phase between |0⟩ and
|1⟩). These errors accumulate rapidly and can destroy the accuracy of a computation.

Fig: Noise in Quantum Computing


What makes quantum noise particularly challenging is that it is often difficult to detect and
correct due to the no-cloning theorem, which prevents copying of an unknown quantum state.
Moreover, every physical implementation of a quantum computer—be it superconducting qubits,
trapped ions, or photonic systems—has its own unique noise characteristics. Understanding,
modeling, and mitigating noise is essential for increasing fidelity in quantum gates and improving
overall system reliability. Advanced error correction codes like the surface code aim to
counteract noise, but they require a large number of physical qubits to protect just a few logical
qubits, further emphasizing how deeply quantum noise constrains system design.

Noise refers to any unwanted disturbance that affects the state of a quantum system.
Types of quantum noise include:
a) Thermal Noise:
Caused by fluctuations in temperature.
Can cause qubits to flip randomly (bit-flip errors) or change phase (phase-flip errors).
b) Gate Noise:
Arises from imprecise control over quantum gates.
Imperfect calibration leads to small but accumulating errors during gate operations.
c) Measurement Noise:
Occurs when reading out the quantum state.
Detectors may misidentify the qubit state due to limitations in resolution or interference.
d) Crosstalk:
When operations on one qubit unintentionally affect another nearby qubit.
Noise models are often described using quantum channels such as:
• Bit-flip channel
• Phase-flip channel
• Depolarizing channel

Quantum systems are highly susceptible to noise due to their continuous, analog nature and the
lack of built-in error correction as in classical digital systems.
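One of the channels listed above, the bit-flip channel, can be written compactly as a map on the density matrix: with probability p the Pauli-X error is applied, and with probability 1 − p the state is left untouched. The Python sketch below (illustrative only, with an assumed error probability) applies it to a qubit prepared in |0⟩.

import numpy as np

X = np.array([[0, 1], [1, 0]])            # Pauli-X (bit-flip) operator

def bit_flip_channel(rho, p):
    """rho -> (1 - p) * rho + p * X rho X"""
    return (1 - p) * rho + p * (X @ rho @ X)

rho = np.array([[1.0, 0.0], [0.0, 0.0]])  # qubit prepared in |0>
noisy = bit_flip_channel(rho, p=0.1)
print(noisy)   # 10% of the population has leaked into |1>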

3.2.3 Control: The Precision Engineering of Quantum Gates

Operating a quantum computer demands extraordinary precision in control systems, far beyond
what is typically required for classical machines. Qubits are manipulated using finely tuned
electromagnetic pulses—such as microwave signals in superconducting qubits or laser beams in
trapped ions—to perform quantum gate operations. These gates must rotate the quantum state
precisely on the Bloch sphere, which geometrically represents the state of a qubit. Even the
slightest error in timing, amplitude, or phase of these control pulses can result in the qubit
deviating from the intended path, leading to computational errors. In classical systems, slight
inaccuracies may go unnoticed due to their binary nature; however, quantum systems demand
continuous, analog precision, where even a minor fluctuation can ruin a quantum operation.
Additionally, as the number of qubits increases, the complexity of their interactions also rises.

Control systems must not only address individual qubits but also coordinate entanglement
operations between multiple qubits—often requiring synchronization at the nanosecond scale.
Any crosstalk, unintended coupling, or thermal noise in control lines can introduce correlated
errors. Developing scalable and accurate quantum control hardware—such as low-noise signal
generators, error-resilient pulse sequences, and high-speed electronics—is one of the most active
areas in quantum engineering. Without ultra-precise control, even a perfect theoretical algorithm
cannot be reliably executed on real hardware.
Controlling quantum systems with high precision is extremely difficult and crucial for reliable
computation.

Control challenges include:

a) Precision Requirements:

• Quantum gates must operate at near-perfect fidelity.


• Even tiny inaccuracies can cause errors that propagate throughout a quantum circuit.

b) Timing and Synchronization:

• Operations must be perfectly timed to avoid decoherence or errors.


• Delays or jitter can desynchronize qubits and destroy quantum correlations.

c) Isolation vs. Accessibility:

• Qubits must be isolated from environmental noise but accessible for operations and
measurements.
• This duality is difficult to achieve and maintain.

d) Scalability:

• As the number of qubits increases, maintaining uniform control and minimizing cross-
qubit interference becomes exponentially more difficult.

3.3 Conditions for a functional quantum system:


To build a functional quantum system—especially for quantum computing—a number of stringent
conditions must be met. These are often summarized as the DiVincenzo Criteria, proposed by physicist
David DiVincenzo, which outline the fundamental requirements for a practical quantum computer.
Here’s a comprehensive breakdown of the key conditions for a functional quantum system:

1. Well-defined Qubits
A quantum system must have clearly defined two-level quantum states that act as qubits (quantum bits).
These states (e.g., |0⟩ and |1⟩) must be distinguishable and controllable.
Examples: Spin states of an electron, energy levels of an ion, superconducting loops.
2. Initialization of Qubits
The system must be able to reliably prepare all qubits in a known initial state, typically |0⟩.
Initialization is crucial for consistent quantum algorithm execution.
3. Long Coherence Time
Qubits must maintain their quantum state (coherence) long enough to perform computations.
Coherence time (T2) must be significantly longer than the time it takes to perform quantum gate
operations.
High coherence ensures the integrity of superposition and entanglement.
4. Universal Set of Quantum Gates
The system must support a set of quantum gates that can perform arbitrary operations on qubits.
This usually includes:
• Single-qubit gates (e.g., Hadamard, Pauli-X)
• At least one entangling two-qubit gate (e.g., CNOT)
Together, these gates must form a universal set, enabling the construction of any quantum algorithm.
5. Qubit-Specific Measurement Capability
It must be possible to measure the state of individual qubits without disturbing others.
Measurement should yield reliable classical outcomes corresponding to quantum basis states.
6. Scalable Architecture
The system must allow for the integration of many qubits (tens to thousands or more) without excessive
overhead or noise.
Scalability involves both hardware and control systems, requiring modularity and fault-tolerance.
7. Qubit Interconnectivity
Qubits must be able to interact with specific others (not necessarily all), enabling entanglement and two-
qubit gates.
Efficient connectivity is essential for implementing quantum algorithms and error correction.
8. Error Correction and Fault Tolerance
The system must support quantum error correction to counteract decoherence and noise.
Error correction requires additional qubits (logical qubits encoded in many physical ones) and complex
operations.
9. Reproducible and Controllable Quantum Dynamics
All quantum operations (initialization, gates, measurements) must be precisely reproducible and
controllable.
Gate fidelities must be extremely high (typically >99.9% for fault-tolerant thresholds).
10. Interface for Input and Output
The system should be able to take classical inputs, execute quantum instructions, and return classical
outputs after quantum measurements.
This involves control electronics, classical computers, and user interfaces.

3.3.1 Isolation: Shielding Qubits from the World


Isolation is one of the most fundamental prerequisites for a functional quantum system. Qubits
must be completely isolated from environmental disturbances in order to maintain their fragile
quantum states. Even the tiniest interaction with the outside world—such as stray
electromagnetic waves, temperature fluctuations, air molecules, or mechanical vibrations—can
cause the qubit to lose coherence, the key property that enables quantum superposition and
entanglement. This process, known as decoherence, is the primary threat to accurate quantum
computation. To combat this, quantum systems are built in highly controlled environments: ultra-
high vacuum chambers, cryogenic systems operating near absolute zero, and magnetically
shielded rooms. For example, superconducting qubits are kept at millikelvin temperatures using
dilution refrigerators to eliminate thermal energy, while trapped-ion systems are held in
electromagnetic fields within vacuum chambers to prevent collisions. Without such extreme
isolation, qubits would interact with external noise and collapse into classical states, making
quantum computation unreliable or impossible.

3.3.2 Error Management: Handling the Fragility of Quantum Information


Error management in quantum systems is significantly more complex than in classical systems
due to the nature of quantum information. In classical computing, errors like bit-flips can often
be corrected using redundancy and parity checks. In contrast, quantum errors involve more than
just flipping bits—they include phase errors, amplitude damping, and crosstalk, all of which must
be detected and corrected without measuring or collapsing the quantum state. This is where
Quantum Error Correction (QEC) comes into play. QEC encodes a single logical qubit across
multiple physical qubits, allowing the system to detect and correct errors by measuring ancillary
qubits without directly disturbing the encoded quantum information. One popular method is the
surface code, which provides robustness against local errors and is scalable for large systems.
However, implementing error correction requires a large overhead: to protect a single logical
qubit, dozens to hundreds of physical qubits may be needed. The goal of error management is to
reach the fault-tolerant threshold, where the rate of error correction exceeds the rate of error
occurrence, allowing quantum algorithms to run reliably for extended periods.
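The threshold idea can be made roughly quantitative with a toy calculation. The sketch below uses a commonly quoted surface-code-style scaling, p_logical ≈ 0.1 · (p / p_th)^((d + 1) / 2) for code distance d; the physical error rate and threshold used are assumed values, and the formula is a heuristic illustration rather than a statement about any particular device.

p_th = 1e-2     # assumed fault-tolerance threshold (~1% error per operation)
p = 1e-3        # assumed physical error rate, safely below the threshold

for d in [3, 5, 7, 11]:
    p_logical = 0.1 * (p / p_th) ** ((d + 1) // 2)
    print(f"code distance {d:2d}: logical error rate ~ {p_logical:.1e}")

Because p is below p_th, each increase in code distance suppresses the logical error rate further; above the threshold the same scaling would make things worse, which is why high-fidelity physical operations are a precondition for useful error correction.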

3.3.3 Scalability: From a Few Qubits to Millions


Scalability is the bridge between experimental quantum computers and useful, industry-grade
quantum machines. Current quantum computers can control a few dozen to a few hundred qubits,
but solving real-world problems—like breaking RSA encryption or simulating complex
molecules—may require thousands to millions of qubits. To scale quantum systems to this level,
the entire architecture must be designed to support modular, repeatable, and interconnected qubit
arrays. This means the hardware, control electronics, error correction protocols, and
communication interfaces must be extensible without exponential increases in complexity or
cost. One of the challenges in scaling is that as the number of qubits grows, so does the cross-
talk between them, making control more difficult. Moreover, physical space, cooling
infrastructure, and signal routing become bottlenecks. Technologies like quantum interconnects,
quantum buses, and distributed quantum computing (where multiple quantum processors are
networked) are being explored to overcome these limitations. A scalable quantum system must
not only add more qubits, but also maintain their fidelity, coherence, and manageability as the
system grows.

3.3.4 Stability: Ensuring Long-Term Reliability and Repeatability


Stability is the foundation upon which quantum computing must rest if it is to become
commercially viable and widely adopted. A functional quantum system must not just perform
one accurate computation—it must consistently deliver high-fidelity results across repeated
operations, over extended time periods, and under varying physical conditions. This requires both
physical stability of the hardware and logical stability of the quantum operations. Physical
stability involves minimizing thermal drift, vibrations, and electromagnetic fluctuations, all of
which can disturb the qubit environment. Logical stability, on the other hand, demands that
quantum gates behave predictably and reproducibly with minimal error, despite operating in a
probabilistic framework. Stabilizing a quantum system also includes managing long-term
degradation of materials, maintaining calibration of control systems, and implementing feedback
loops to self-correct errors or drifts. Without stability, quantum systems cannot scale up, remain
useful, or be trusted to run complex algorithms—making it a non-negotiable requirement in the
roadmap toward fault-tolerant, large-scale quantum computing.

3.4 Theoretical barriers

Building a functional, scalable, and reliable quantum computer involves not just engineering
challenges, but also profound theoretical barriers that stem from the fundamental nature of
quantum mechanics.

3.4.1 Why Maintaining Entanglement Is Difficult


Entanglement is a cornerstone of quantum computing—allowing qubits to be deeply correlated in ways
that classical bits can never be. However, maintaining entanglement between qubits is one of the most
fragile and technically demanding aspects of building a quantum computer. Entangled states are highly
sensitive to external disturbances, such as temperature fluctuations, magnetic fields, or even atomic
vibrations. Any slight interaction with the environment can cause decoherence, breaking the delicate
correlations and rendering the entangled state useless. Moreover, the more qubits you entangle, the harder
it becomes to keep them stable over time and across physical distance. Entanglement also requires precise
synchronization between qubits, often involving laser pulses, microwave signals, or magnetic fields that
must be coordinated to near perfection. This precision becomes increasingly difficult to maintain in large
systems, leading to a loss of fidelity in quantum operations. From a theoretical standpoint, entanglement
must persist long enough to be used in computation, communication, or measurement, which places a
massive burden on system design, shielding, error correction, and control mechanisms. Without reliably
maintaining entanglement, the very foundation of quantum computing collapses.

3.4.2 Error Correction as a Theoretical Necessity


Unlike classical systems where error rates are minimal and redundancy can be added with simple
checks, quantum systems suffer frequent and subtle errors that cannot be addressed through
traditional means. Qubits can experience not only bit-flip errors but also phase-flip and combined
errors, due to the probabilistic nature of quantum mechanics. Compounding the issue is the no-
cloning theorem, which states that unknown quantum states cannot be copied—so we cannot
simply replicate data to safeguard it. As a result, quantum error correction (QEC) is not a
luxury—it is a theoretical necessity. QEC codes such as the Shor code, Steane code, and surface
codes work by encoding a logical qubit into multiple physical qubits in a way that errors can be
detected and corrected indirectly, without collapsing the quantum state. However, implementing
QEC comes with massive overhead—sometimes requiring dozens or hundreds of physical qubits
for a single logical qubit. This introduces significant complexity and resource demands, pushing
the limits of hardware and control systems. From a theoretical standpoint, fault-tolerant quantum
computing—where computations can proceed indefinitely despite the presence of noise and
imperfections—is only achievable through robust and scalable error correction, making it a
foundational element of any future quantum architecture.

3.4.3 Quantum Hardware Platforms (Brief Conceptual Comparison)


There is no single way to build a quantum computer, and several hardware platforms have emerged, each
with distinct theoretical advantages and practical limitations. The three most prominent approaches are
superconducting circuits, trapped ions, and photonic systems. Superconducting qubits, used by companies
like Google and IBM, are built on electrical circuits that operate at extremely low temperatures to
eliminate resistance. They offer fast gate speeds and are compatible with existing semiconductor
technologies, but suffer from short coherence times and significant control complexity. Trapped ions,
used by IonQ and Honeywell, involve storing individual atoms in electromagnetic fields and manipulating
them with lasers. These systems have long coherence times and extremely high fidelity, but gate
operations are slower and the system is harder to scale due to the complexity of ion control. Photonic
systems, being explored by Xanadu and PsiQuantum, use particles of light (photons) as qubits. They are
naturally robust to environmental noise and excellent for quantum communication, but face challenges in
generating and interacting photons on demand. Each platform has theoretical implications regarding
scalability, coherence, speed, and connectivity, and ongoing research continues to refine which
approach—or combination—will lead to practical, universal quantum computing.

3.4.4 Superconducting Circuits


Superconducting circuits are perhaps the most commercially mature quantum hardware platform
to date. They use tiny loops of superconducting materials cooled to near absolute zero, where
they exhibit zero electrical resistance and allow quantum effects like superposition and
entanglement to emerge. Qubits in this system are known as transmons, and they are manipulated
using microwave pulses. These systems are attractive because they are relatively fast, can be
fabricated using existing chip-making technologies, and are easily integrated with classical
electronics. However, they have short coherence times (typically microseconds), meaning
operations must be performed quickly before the qubits lose their quantum behavior.
Furthermore, maintaining the cryogenic environment requires complex and costly infrastructure.
Superconducting systems are also susceptible to crosstalk and noise, which increases with the
number of qubits. Despite these challenges, they remain a leading contender in the race toward
scalable quantum processors, especially due to the rapid improvements being made in error
correction and qubit coherence.

3.4.5 Trapped Ions


Trapped ion quantum computers use charged atoms (ions) suspended in electromagnetic fields as qubits.
These ions are isolated in ultra-high vacuum chambers and manipulated using precisely tuned laser beams.
One of the biggest theoretical advantages of trapped ions is their exceptionally long coherence times,
sometimes exceeding seconds or even minutes, which is orders of magnitude longer than superconducting
qubits. Additionally, all qubits in a trapped ion system are naturally identical, reducing variability and
improving error correction. Gate operations are highly accurate, and entanglement between ions is
relatively straightforward to create. However, trapped ion systems are slower in operation—gates can take
microseconds to milliseconds—and become increasingly hard to control as the number of ions increases.
The complexity of laser control systems and the physical footprint of the apparatus make large-scale
deployment challenging. Still, their high fidelity and predictable behavior make them a favorite for small-
to medium-scale fault-tolerant quantum systems.

3.4.6 Photonics
Photonic quantum computing uses light particles (photons) as qubits, which makes them uniquely suited
for quantum communication and networking. Photons are naturally immune to many environmental
disturbances that affect matter-based qubits, giving them an inherent robustness to noise and decoherence.
Quantum information is typically encoded in properties like polarization, phase, or path of the photons.
Because photons travel at the speed of light, photonic systems promise extremely fast communication,
making them ideal for building the quantum internet. However, photonic quantum computing also faces
significant challenges. Generating single photons on demand, routing them precisely through optical
circuits, and making them interact to perform logic gates require highly advanced technologies. Unlike
ions or superconducting qubits, photons do not naturally interact, so nonlinear optical components or
measurement-based schemes are needed to perform two-qubit gates. Despite these hurdles, advances in
integrated photonics and optical chips are making photonic quantum systems increasingly viable. Their
ability to operate at room temperature and interface with fiber-optic networks gives them a distinct edge
for scalable communication-focused quantum applications.

3.5 Vision vs. Reality


3.5.1 What’s Working and What Remains Elusive
The vision of quantum computing promises breakthroughs in areas like cryptography, material science,
machine learning, optimization, and secure communication. In theory, quantum computers can solve
problems that are intractable for classical machines, such as factoring large integers in polynomial time
(via Shor’s algorithm) or searching unstructured databases in square-root time (via Grover’s algorithm).
The ultimate vision is the development of universal, fault-tolerant, scalable quantum computers capable
of transforming entire industries—achieving so-called quantum advantage or even quantum supremacy
in practical tasks.
However, the reality today is far more constrained. Although there has been significant progress—most
notably Google's demonstration of quantum supremacy in 2019, in which its 53-qubit Sycamore processor completed a sampling task in about 200 seconds that Google estimated would take a state-of-the-art classical supercomputer thousands of years, an estimate later disputed—these achievements are still
largely academic or proof-of-concept in nature. Current quantum devices are known as Noisy
Intermediate-Scale Quantum (NISQ) systems. They typically consist of tens to a few hundred qubits, are
error-prone, and lack the fault-tolerance required for large-scale applications. Problems like decoherence,
error rates, limited qubit connectivity, and short coherence times still limit their utility.
Furthermore, most real-world problems require high-fidelity qubits in the thousands, if not millions—
something today’s hardware is far from achieving. Scalability, reliability, and robust error correction
remain elusive. Additionally, while quantum algorithms theoretically outperform classical ones, they
often require thousands of perfect gate operations—currently impossible on today's hardware. Therefore,
although the foundational concepts have been validated, practical, industry-relevant quantum applications
are still largely out of reach. The field is progressing fast, but the gap between visionary expectations and
current technological maturity is still substantial.

3.5.2 The Role of Quantum Software in Managing Theoretical Complexities


While hardware development is essential, quantum software plays an equally critical role in
bridging the gap between theoretical quantum algorithms and practical implementation. Quantum
software addresses the inherent complexities of quantum computation—such as encoding
algorithms into hardware-specific instructions, managing noise, optimizing gate sequences, and
handling quantum-classical hybrid models. These complexities arise from the very nature of
quantum information: it is non-intuitive, probabilistic, and fragile, requiring entirely new
programming paradigms.
Quantum software platforms like Qiskit (IBM), Cirq (Google), Ocean (D-Wave), and PennyLane
(Xanadu) allow researchers and developers to write and simulate quantum algorithms in high-
level programming languages. These frameworks handle low-level tasks like gate
decomposition, qubit mapping, and error mitigation, making quantum computing more
accessible. They also support hybrid quantum-classical algorithms like the Variational Quantum
Eigensolver (VQE) and Quantum Approximate Optimization Algorithm (QAOA), which are
particularly suitable for NISQ-era devices.
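As a small illustration of what these high-level frameworks look like in practice, the sketch below uses Qiskit (one of the toolkits named above) to express a two-qubit Bell-state circuit; the framework then handles gate decomposition, qubit mapping, and transpilation onto real hardware. It assumes Qiskit is installed and is meant only as an indicative example, not a complete workflow.

from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)    # 2 qubits, 2 classical bits for the readout
qc.h(0)                      # put qubit 0 into an equal superposition
qc.cx(0, 1)                  # entangle qubit 0 with qubit 1 (CNOT)
qc.measure([0, 1], [0, 1])   # measure both qubits into the classical bits

print(qc.draw())             # text diagram of the resulting quantum circuit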
Moreover, quantum software plays a vital role in quantum error correction—designing codes that
detect and correct errors while preserving entanglement and superposition. It also assists in
compilation and transpilation, converting abstract algorithms into hardware-specific instructions
that account for connectivity constraints, coherence times, and gate fidelity. As quantum systems
scale up, software will be central to orchestrating parallel qubit operations, managing quantum
resources, and ensuring system stability.
In essence, quantum software is not just a support tool—it is a core enabler of quantum
computation, helping manage the complexities that come from both the theory and the limitations
of physical systems. It transforms quantum computers from abstract theoretical models into
usable, programmable machines and will continue to play a pivotal role as the technology
matures.
SYLLABUS

Unit 4

Quantum Communication and Computing – Theoretical Perspective

Quantum vs Classical Information, Basics of Quantum Communication, Quantum Key Distribution (QKD), Role of Entanglement in Communication, The Idea of the Quantum Internet – Secure Global Networking, Introduction to Quantum Computing, Quantum Parallelism (Many States at Once), Classical vs Quantum Gates, Challenges: Decoherence and Error Correction, Real-World Importance and Future Potential
4.0 Introduction

Quantum communication and computing represent a revolutionary shift in how information is processed, transmitted, and secured, based on the fundamental principles of quantum mechanics.
Unlike classical systems that rely on bits as the smallest unit of information (taking values 0 or
1), quantum systems use quantum bits or qubits, which can exist in superpositions of states and
exhibit entanglement—phenomena with no classical counterpart.

The theoretical foundations of quantum communication and computing provide the framework
to understand, design, and analyse the behaviour and capabilities of quantum systems. These
principles form the backbone for developing quantum algorithms, secure communication
protocols, and scalable quantum architectures.

In quantum communication, the theoretical perspective focuses on how quantum entanglement and no-cloning principles enable fundamentally secure methods of transmitting information,
such as Quantum Key Distribution (QKD). It also explores the limits of information transfer and
the impact of noise and decoherence on communication fidelity.

In quantum computing, the theoretical viewpoint addresses how quantum mechanics can be
harnessed to perform computations that are intractable for classical systems. It includes the study
of quantum gates, quantum circuits, algorithm complexity, and error correction models, as well
as the mathematical underpinnings of quantum logic and measurement.

This theoretical lens is essential to understand both the potential and limitations of quantum
technologies, guiding researchers in overcoming key challenges such as decoherence, scalability,
fault-tolerance, and algorithmic development.

4.1 Quantum vs Classical Information


Classical information refers to the type of information we deal with in everyday computing—
where data is encoded using binary digits, or bits, which can be in one of two states: 0 or 1. All
classical computations, from browsing the internet to storing videos, are ultimately performed by
manipulating these bits using logic gates. Classical information theory, introduced by Claude
Shannon, measures the amount of information using bits and is constrained by deterministic
rules. These systems can be copied, measured without disturbance, and transmitted reliably over
classical channels like fiber optics or radio waves.
Quantum information, on the other hand, operates in a radically different framework based on
the principles of quantum mechanics. It uses qubits (quantum bits), which can exist not only in
the states 0 or 1, but also in a superposition of both. This means a qubit can represent multiple
values at once, allowing quantum computers to perform complex computations more efficiently
than classical systems in specific tasks. Furthermore, qubits exhibit entanglement, a phenomenon
where the state of one qubit is dependent on the state of another, no matter how far apart they
are. This creates powerful correlations that classical bits cannot replicate. However, quantum
information is fragile—it cannot be cloned (due to the no-cloning theorem), is altered upon
measurement, and is highly susceptible to noise and decoherence.
In summary, classical information is stable, scalable, and well-understood, forming the backbone of
today’s digital world. Quantum information offers a leap in computational power and encryption
capabilities, but remains in a developmental stage due to the inherent challenges in controlling
and maintaining quantum states. Both forms of information are crucial, but quantum information
opens doors to solving problems that are unsolvable or intractable using classical approaches.
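The difference in representation can be made concrete with a short, purely classical NumPy simulation (illustrative only): an n-bit register holds exactly one of 2ⁿ values at a time, whereas an n-qubit state is described by 2ⁿ complex amplitudes, and a measurement returns a single outcome with probability given by the squared amplitude (the Born rule).

```python
import numpy as np

rng = np.random.default_rng(7)

# A classical 3-bit register holds exactly one of 2**3 = 8 values at any moment.
classical_register = 0b101

# A 3-qubit register is described by 2**3 = 8 complex amplitudes.
# Here: the equal superposition of all basis states (what a Hadamard on each qubit gives).
n = 3
state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)
assert np.isclose(np.sum(np.abs(state)**2), 1.0)  # amplitudes are normalized

# Measurement collapses the superposition: one basis state is observed,
# with probability |amplitude|**2 (the Born rule).
probabilities = np.abs(state)**2
outcome = rng.choice(2**n, p=probabilities)
print(f"classical register : {classical_register:03b}")
print(f"measured qubits    : {outcome:03b} (one of {2**n} possible outcomes)")
```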
4.1.1. Representation of Information
• Classical: Information is represented using bits, which take values of either 0 or 1. All
classical systems and digital devices operate using binary states and logic gates like AND,
OR, and NOT.
• Quantum: Information is represented using qubits, which can be in state 0, 1, or a
superposition of both. A qubit’s state is described by a complex probability amplitude,
allowing parallelism in computations.
4.1.2. Superposition and Parallelism
• Classical: A bit can only be in one state at a time—either 0 or 1. Computation must
evaluate each possibility sequentially (unless using parallel processors).
• Quantum: Due to superposition, qubits can represent multiple states simultaneously. A
quantum computer with n qubits can theoretically represent 2ⁿ states at once, offering
exponential computational power for specific problems.

4.1.3. Entanglement
• Classical: Bits operate independently. The state of one bit does not affect another unless
explicitly connected via logic operations.
• Quantum: Qubits can become entangled, meaning the state of one qubit directly affects
the state of another, even over long distances. This allows for powerful correlations used
in quantum algorithms and quantum teleportation.
4.1.4. Measurement and Observation
• Classical: Measuring a classical bit simply reveals its value (0 or 1), and the bit remains
unchanged by the observation.
• Quantum: Measuring a qubit collapses its superposition to a single classical state (0 or 1),
altering its original state. This makes observation destructive and requires careful design
of quantum algorithms.

4.1.5. Information Copying and Cloning


• Classical: Bits can be freely copied without altering the original data. Data backup,
replication, and transmission are straightforward.
• Quantum: The no-cloning theorem states that it is impossible to make an exact copy of
an arbitrary unknown quantum state. This protects data in quantum cryptography but
complicates quantum communication and computation.

4.1.6. Error Correction and Stability


• Classical: Error correction is mature and well-developed using redundancy, parity bits,
and error-correcting codes.
• Quantum: Qubits are fragile and prone to decoherence (loss of quantum behavior due to
environmental noise). Quantum error correction is an active area of research and requires
complex strategies like surface codes.

4.1.7. Computational Power


• Classical: Classical computers excel at general-purpose tasks and are extremely efficient
for most everyday computing needs.
• Quantum: Quantum computers outperform classical ones in specific tasks like factoring
large numbers (Shor’s algorithm), searching unsorted data (Grover’s algorithm), and
simulating quantum systems. However, they are not universally superior and are currently
limited by hardware constraints.

4.1.8. Communication and Security


• Classical: Classical communication channels are vulnerable to eavesdropping but are
protected using encryption schemes based on mathematical hardness assumptions.
• Quantum: Quantum communication enables quantum key distribution (QKD), which
ensures secure communication that is provably resistant to interception due to the laws of
quantum physics.

4.1.9. Physical Implementation


• Classical: Bits are implemented using voltage levels in transistors. Devices are stable,
mass-producible, and energy-efficient.
• Quantum: Qubits are realized using various physical systems—superconducting circuits,
trapped ions, photons, or spins. Each has trade-offs in terms of scalability, coherence
time, and ease of control.

4.1.10. Development and Maturity


• Classical: Classical computing is a mature field with decades of progress, large-scale
infrastructure, and global adoption.
• Quantum: Quantum computing is still emerging, with progress accelerating in both
academia and industry. While small-scale quantum systems exist, building fault-tolerant,
scalable machines is a major challenge.

4.2 Basics of Quantum Communication


Quantum communication is a cutting-edge field that leverages the principles of quantum
mechanics to transmit information securely and efficiently. Unlike classical communication,
which uses electrical signals or light pulses to represent bits (0s and 1s), quantum communication
uses qubits, often encoded in photons. These qubits can exist in superposition states, enabling
the encoding of more complex information. The core advantage of quantum communication lies
in its inherent security—thanks to principles like the Heisenberg Uncertainty Principle, any
attempt to measure or intercept a quantum state inevitably disturbs it, making eavesdropping
detectable. Quantum communication is particularly useful for applications such as secure
transmission of sensitive information, quantum internet, and distributed quantum computing.
However, long-distance transmission is still a challenge due to photon loss in optical fibers and
the fragility of quantum states, which is why technologies like quantum repeaters are under
development.
• Definition: Quantum communication is the process of transferring information using
quantum states such as qubits, often carried by photons.
• Security: Inherent security arises because quantum states cannot be measured or cloned
without altering them (Heisenberg Uncertainty Principle and No-Cloning Theorem).
• Medium: Photons are typically used for quantum communication because they travel at
the speed of light and are less prone to environmental noise.
• Applications: Includes secure data transmission, quantum internet, satellite
communication, and distributed quantum computing.
• Challenges: Quantum signals degrade over long distances due to photon loss and
decoherence. Solutions like quantum repeaters are under research.

4.3. Quantum Key Distribution (QKD)


Quantum Key Distribution (QKD) is one of the most practical and successful applications of
quantum communication. It allows two parties (commonly called Alice and Bob) to generate a
shared secret key over an insecure channel in such a way that any eavesdropper (Eve) attempting
to intercept the communication will inevitably be detected. The most famous QKD protocol is
BB84, introduced by Charles Bennett and Gilles Brassard in 1984. In QKD, quantum bits are
transmitted using properties such as polarization of photons. Because measuring a quantum state
disturbs it, any unauthorized observation changes the state of the qubits, thus alerting the
legitimate users. After transmission, Alice and Bob compare a subset of their bits to detect any
discrepancies. If the error rate is below a threshold, the key is considered secure. QKD is
unconditionally secure in theory, relying not on computational hardness but on the laws of
quantum physics. It is already being used in sectors like banking, defense, and government
communication in some countries.
Steps in QKD:
1. Quantum Transmission – Qubits are sent via a quantum channel.
2. Measurement and Sifting – Receiver measures qubits and compares part of the data.
3. Error Checking – Public comparison detects eavesdropping.
4. Key Extraction – A shared secret key is derived using only verified bits.
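The steps above can be illustrated with a simplified classical simulation of the BB84 flow. This sketch assumes an ideal, noiseless channel with no eavesdropper; a real protocol run would additionally estimate the error rate on a disclosed sample of bits and apply information reconciliation and privacy amplification.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 32  # number of transmitted qubits (kept small for illustration)

# 1. Quantum transmission: Alice chooses random bits and random bases
#    (0 = rectilinear, 1 = diagonal) and sends one polarized photon per bit.
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# 2. Measurement and sifting: Bob measures each photon in a randomly chosen basis.
#    If his basis matches Alice's he recovers her bit; otherwise his result is random.
bob_bases = rng.integers(0, 2, n)
bob_bits = np.where(bob_bases == alice_bases, alice_bits, rng.integers(0, 2, n))

#    Over a public channel they compare bases (never the bits themselves) and keep
#    only the positions where the bases agreed.
keep = alice_bases == bob_bases
sifted_alice, sifted_bob = alice_bits[keep], bob_bits[keep]

# 3. Error checking: a disclosed sample of sifted bits is compared publicly.
#    With no eavesdropper and no noise, the disagreement rate is zero.
disagreements = int(np.sum(sifted_alice != sifted_bob))

# 4. Key extraction: the remaining verified bits form the shared secret key.
print(f"sifted key length: {int(keep.sum())} of {n}, disagreements: {disagreements}")
print("shared key:", "".join(str(b) for b in sifted_alice))
```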

4.4. Role of Entanglement in Communication


Entanglement is one of the most intriguing and powerful phenomena in quantum mechanics, and
it plays a critical role in quantum communication. When two qubits are entangled, their states are
deeply correlated, such that the measurement of one instantly determines the state of the other,
regardless of the distance between them. This non-local correlation enables protocols like
Quantum Teleportation, where the state of a qubit can be transferred from one location to another
without physically moving the particle. Entanglement is also a fundamental resource in device-
independent QKD, where the security of the communication does not rely on trusting the
quantum devices themselves. Additionally, entanglement swapping allows the linking of distant
nodes in a quantum network, serving as the backbone of the quantum internet. Despite its
promise, maintaining entanglement over long distances is challenging due to decoherence and
noise, which is why creating stable, long-lasting entangled pairs is a major focus of current
research.
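The correlations that make entanglement useful can be seen in a small statevector simulation (classical, for illustration) of the Bell pair (|00⟩ + |11⟩)/√2: each party's individual result is random, yet the two results always agree when both qubits are measured in the same basis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bell state (|00> + |11>)/sqrt(2) over the basis |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probabilities = np.abs(bell)**2          # [0.5, 0.0, 0.0, 0.5]

# Sample joint measurement outcomes in the computational basis.
for outcome in rng.choice(4, size=8, p=probabilities):
    alice, bob = (outcome >> 1) & 1, outcome & 1
    print(f"Alice measures {alice}, Bob measures {bob}")
# Each result on its own looks like a fair coin flip, yet the two always agree;
# this is the correlation that QKD, teleportation, and entanglement swapping build on.
```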

4.5. The Idea of the Quantum Internet – Secure Global Networking


The Quantum Internet is a revolutionary concept that aims to extend the principles of quantum
communication across a global network, enabling fundamentally secure and powerful
communication channels. Unlike the classical internet, which transmits information in binary
form using electrical or optical signals, the quantum internet would transmit qubits—information
encoded in quantum states like the spin of an electron or the polarization of a photon. One of its
most powerful features is quantum entanglement, which allows instantaneous correlations
between distant qubits, enabling advanced functions such as quantum teleportation and device-
independent quantum key distribution (QKD).
The most compelling advantage of a quantum internet is unbreakable security. Since any attempt
to eavesdrop on quantum communication disturbs the quantum states being transmitted, such
intrusion can be immediately detected. This makes it ideal for sensitive communications in
defense, finance, diplomacy, and personal privacy. A fully functional quantum internet could
also connect quantum computers across the globe, creating a distributed quantum computing
network with immense collective processing power.
Building the quantum internet, however, is extremely challenging. Quantum signals degrade over
distance, and classical repeaters used in today’s internet cannot be used for qubits due to the no-
cloning theorem. As a result, researchers are developing quantum repeaters based on
entanglement swapping and quantum memory, which can extend the range of quantum
communication without destroying the quantum state. Some countries, like China, have already
taken early steps toward building quantum internet infrastructure, with successful satellite-based
QKD demonstrations.
In the future, the quantum internet could enable completely secure banking, tamper-proof voting
systems, cloud quantum computing, and next-generation encryption protocols. Although it may
take decades to be fully realized, the quantum internet represents a paradigm shift in the way
humanity communicates and processes information—merging the laws of physics with global
networking to create a new digital frontier.
4.5.1. What Is the Quantum Internet?
• A proposed global network that uses quantum communication protocols to transmit qubits
instead of classical bits.
• It connects quantum devices (like sensors, computers, and communication nodes) using
principles of quantum mechanics—mainly entanglement and superposition.

4.5.2. Core Technologies


• Qubits: Basic units of quantum information (photons, ions, electrons).
• Quantum Entanglement: Allows distant qubits to be correlated in a way that classical
systems can't replicate.
• Quantum Teleportation: Transfers quantum states across the network using entangled
particles.
• Quantum Repeaters: Special nodes that extend communication distances by performing
entanglement swapping and storing qubit states in quantum memory.

4.5.3. Unbreakable Security


• Quantum Key Distribution (QKD): Enables users to exchange encryption keys securely.
• Eavesdropping alters the quantum state, making intrusion detectable.
• Prevents cyber-attacks like man-in-the-middle or signal interception that are common on
the classical internet.

4.5.4. Applications of the Quantum Internet


• Secure Communication: Military, government, and corporate data can be transmitted
without risk of decryption.
• Quantum Cloud Computing: Remote users access quantum computing resources via
entangled connections.
• Quantum Sensor Networks: Synchronizing ultra-precise quantum sensors over large
distances for environmental monitoring or space exploration.
• Tamper-Proof Voting & Financial Transactions: Trustless systems using quantum
protocols to ensure integrity.

4.5.5. Global Developments and Initiatives


• China’s Micius Satellite: Demonstrated QKD between ground stations 1,200 km apart
via satellite.
• EU’s Quantum Flagship Program: Investing heavily in quantum network research.
• U.S. Quantum Internet Blueprint: A federal strategy to build a national quantum
communication backbone.

4.5.6. Challenges to Realization


• Quantum Signal Loss: Photons lose energy and coherence over long distances in fiber
optics.
• No-Cloning Theorem: Quantum data cannot be copied, so traditional amplifiers/repeaters
don't work.
• Scalability: Developing stable, affordable, and room-temperature quantum devices for
large-scale deployment.
• Standardization: Lack of unified protocols and architecture across global research and
industries.

4.5.7. The Future Vision


• A fully secure, tamper-proof internet with global reach.
• The merging of classical networks and quantum backbones, creating hybrid
communication systems.
• Connecting quantum computers, quantum sensors, and quantum users around the world
to form the foundation of a new digital age.

4.6. Introduction to Quantum Computing


Quantum computing is a revolutionary paradigm that harnesses the strange and powerful
principles of quantum mechanics to process information in fundamentally new ways. Unlike
classical computers, which use bits (0s and 1s) as the basic unit of data, quantum computers use
qubits—quantum bits that can exist in a superposition of both 0 and 1 at the same time. This
property allows quantum computers to perform many calculations in parallel. Furthermore,
qubits can be entangled, meaning the state of one qubit is linked to the state of another, no matter
the distance. These features enable quantum computers to solve certain problems much faster
than classical computers. For example, quantum algorithms like Shor’s algorithm can factor large
numbers exponentially faster than the best-known classical algorithms—posing a challenge to
existing encryption systems. Though the technology is still in early stages, quantum computing
holds promise in fields such as cryptography, optimization, drug discovery, artificial intelligence,
and materials science. However, building reliable quantum computers is challenging due to
issues like decoherence, error rates, and the need for extremely low temperatures.

4.7. Quantum Parallelism (Many States at Once)


One of the most powerful concepts in quantum computing is quantum parallelism, which refers
to a quantum system’s ability to evaluate multiple input states simultaneously. This is possible
because of superposition, where a qubit can exist in a combination of both |0⟩ and |1⟩ states at
once. When multiple qubits are in superposition, the system represents a vast number of
combinations at the same time. For example, a 3-qubit system in superposition can represent all
8 (2³) possible combinations of bits at once. This parallelism allows quantum algorithms to
explore a large solution space in a fraction of the time it would take a classical computer.
However, the real power of quantum parallelism lies not just in evaluating many states
simultaneously, but in using interference and entanglement to amplify correct answers and cancel
out incorrect ones. This principle is crucial in quantum algorithms like Grover’s search algorithm,
which finds an item in an unsorted database in √N steps instead of N steps. It’s important to note
that we can’t directly read out all the parallel states—measurement collapses the system, so the
trick lies in carefully designing algorithms to extract useful outcomes from the superposition.
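The amplify-and-cancel idea can be demonstrated with a toy Grover iteration over N = 4 items, simulated directly on the four-component state vector. The oracle and the choice of marked item are illustrative; for this size a single iteration already drives the marked item's probability to essentially 1.

```python
import numpy as np

# Toy Grover search over N = 4 items (2 qubits); the "marked" item is index 2.
N, marked = 4, 2

# Start in the uniform superposition (what a Hadamard on every qubit prepares).
state = np.ones(N, dtype=complex) / np.sqrt(N)

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(N, dtype=complex)
oracle[marked, marked] = -1

# Diffusion operator 2|s><s| - I: reflects every amplitude about the mean.
s = np.ones((N, 1), dtype=complex) / np.sqrt(N)
diffusion = 2 * (s @ s.conj().T) - np.eye(N, dtype=complex)

# One Grover iteration (about sqrt(N) are needed in general; for N = 4 one suffices).
state = diffusion @ (oracle @ state)
print("measurement probabilities:", np.round(np.abs(state)**2, 3))
# Interference has pushed nearly all probability onto the marked index.
```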

4.8. Classical vs Quantum Gates


In classical computing, logic gates are simple devices that perform operations on one or more
bits, such as AND, OR, and NOT gates. These gates are deterministic and irreversible in many
cases—once a bit is processed, its previous state may be lost. Classical gates manipulate bits
using electrical circuits and are limited to binary state changes. In contrast, quantum gates operate
on qubits and follow the rules of unitary transformations, which are linear and reversible
operations. Common quantum gates include the Hadamard gate (which puts a qubit into
superposition), the Pauli-X gate (quantum equivalent of NOT), and the CNOT gate (a two-qubit
gate used in entanglement). Unlike classical gates, quantum gates can perform operations that
involve rotating states on the Bloch sphere, enabling complex manipulations of quantum states.
Also, quantum gates must be reversible, which means the input can always be retrieved from the
output. This is essential because information loss would violate quantum mechanics. Quantum
circuits are composed of sequences of such gates, and their combined behavior enables quantum
algorithms that can outperform classical counterparts in specific tasks.
Feature                  | Classical Gates            | Quantum Gates
Operate On               | Bits (0 or 1)              | Qubits (superpositions)
Examples                 | AND, OR, NOT, NAND         | Hadamard, Pauli-X, CNOT, T-gate
Reversibility            | Often irreversible         | Always reversible (unitary operations)
State Representation     | Binary states              | Complex vectors on the Bloch sphere
Information Preservation | Not always preserved       | Always preserved (no information loss)
Entanglement Capability  | Not possible               | Possible with multi-qubit gates
Parallelism              | No (sequential processing) | Yes (superposition + interference)

• Hadamard Gate (H): Creates superposition.


• Pauli-X Gate: Equivalent to classical NOT.
• CNOT Gate: Conditional operation that can entangle qubits.
• Quantum Circuits: Built by combining quantum gates; analogous to classical logic
circuits but exponentially more powerful for certain tasks.
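The comparison becomes concrete once the gates are written as matrices. The NumPy sketch below (a classical simulation, not a hardware run) checks that H and CNOT are unitary, and therefore reversible, and shows H followed by CNOT turning |00⟩ into the entangled Bell state (|00⟩ + |11⟩)/√2.

```python
import numpy as np

# Single-qubit gates as unitary matrices.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                  # Pauli-X (quantum NOT)
I2 = np.eye(2, dtype=complex)

# Two-qubit CNOT (control = first qubit, target = second qubit).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Reversibility: gates are unitary, so U†U = I and the input is always recoverable.
assert np.allclose(H.conj().T @ H, I2)
assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

# Entanglement: apply H to the first qubit of |00>, then CNOT, giving (|00> + |11>)/sqrt(2).
ket00 = np.array([1, 0, 0, 0], dtype=complex)
bell = CNOT @ (np.kron(H, I2) @ ket00)
print("Bell-state amplitudes:", np.round(bell, 3))
```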

4.9. Challenges: Decoherence and Error Correction


One of the most critical challenges in quantum computing is decoherence, which refers to the
loss of quantum information due to the interaction of a qubit with its surrounding environment.
Qubits are extremely delicate—they must be isolated from vibrations, temperature fluctuations,
electromagnetic interference, and even cosmic rays. When a qubit decoheres, it loses its
superposition and entanglement, rendering the information unusable. This fragility limits the time
available for computation and increases the error rate, making large-scale quantum computing
extremely difficult. In addition to decoherence, quantum operations themselves are prone to
errors, both from imperfect gate operations and readout inaccuracies.
To address this, researchers are developing advanced quantum error correction (QEC)
techniques. Unlike classical error correction, which uses simple redundancy, quantum error
correction must protect quantum information without directly measuring or copying it—because
doing so collapses the quantum state. This is achieved using entangled logical qubits made from
multiple physical qubits. Popular codes like the Shor Code and Surface Code are designed to
detect and correct bit-flip and phase-flip errors without destroying the information. However,
implementing QEC requires many more physical qubits per logical qubit, often hundreds or
thousands, dramatically increasing the system size and complexity. Overcoming decoherence
and developing scalable, fault-tolerant error correction are essential for making practical, reliable
quantum computers a reality.

4.9.1. Challenge: Decoherence


Definition: Decoherence is the loss of quantum coherence when a qubit interacts with its
environment.
Causes: Environmental noise, temperature fluctuations, magnetic fields, radiation, material
imperfections.
Effect: Qubits lose their quantum behavior (superposition and entanglement), leading to errors.
Impact: Limits computation time and makes quantum results unreliable if not corrected.

4.9.2. Challenge: Quantum Error Correction (QEC)


Problem: Quantum states cannot be copied (no-cloning theorem), so classical error correction
methods don't work.
Solution: Use redundant encoding of quantum information in logical qubits built from multiple
physical qubits.
Popular Methods:
• Shor Code – Encodes 1 logical qubit into 9 physical qubits.
• Surface Code – Highly fault-tolerant, scalable architecture requiring fewer operations.
Complexity: Requires enormous overhead—hundreds or thousands of physical qubits for one
logical qubit.
Goal: Achieve fault-tolerant quantum computing that can operate reliably even with noise and
hardware imperfections.
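A minimal sketch of the underlying idea, using the 3-qubit bit-flip repetition code (a much-simplified relative of the Shor and surface codes): the logical amplitudes are spread over three physical qubits, the error syndrome is read from the parities Z₀Z₁ and Z₁Z₂ without ever measuring the encoded amplitudes, and a single bit-flip is then undone. The specific amplitudes and the choice of which qubit the error hits are illustrative, and the whole register is simulated classically in NumPy.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op(gate, position):
    """Embed a single-qubit gate at the given position of a 3-qubit register."""
    mats = [I2, I2, I2]
    mats[position] = gate
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

# Encode an arbitrary logical qubit a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8                      # any normalized amplitudes (illustrative values)
logical = np.zeros(8, dtype=complex)
logical[0b000], logical[0b111] = a, b

# A bit-flip error strikes one physical qubit (here qubit 1, chosen for illustration).
corrupted = op(X, 1) @ logical

# Syndrome extraction: the parities Z0Z1 and Z1Z2 identify which qubit flipped
# without revealing (or disturbing) the encoded amplitudes a and b.
s1 = round(float(np.real(np.conj(corrupted) @ (op(Z, 0) @ op(Z, 1) @ corrupted))))
s2 = round(float(np.real(np.conj(corrupted) @ (op(Z, 1) @ op(Z, 2) @ corrupted))))
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[(s1, s2)]

# Correction: apply X to the identified qubit and confirm the code word is restored.
recovered = corrupted if flipped is None else op(X, flipped) @ corrupted
print("syndrome:", (s1, s2), "-> flipped qubit:", flipped)
print("recovered equals original:", np.allclose(recovered, logical))
```

Real codes must also handle phase-flip errors and noisy syndrome measurements, which is where the overhead of hundreds or thousands of physical qubits per logical qubit comes from.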

4.10. Real-World Importance and Future Potential


Quantum computing is not just a theoretical marvel—it holds the potential to transform industries
and redefine computing as we know it. In pharmaceuticals, it could revolutionize drug discovery
by simulating molecular interactions at a level no classical computer can match, reducing years
of R&D into weeks. In finance, quantum algorithms can optimize portfolios, assess risks in real-
time, and detect fraud faster and more accurately. Logistics and supply chain systems could be
optimized on a global scale, saving billions through efficient resource allocation. Quantum-
enhanced AI and machine learning models could identify patterns and make predictions with far
greater speed and precision than current models allow.
Moreover, quantum communication can enable secure data transfer through quantum key
distribution, making eavesdropping impossible and redefining cybersecurity. In the energy
sector, quantum simulations could lead to breakthroughs in battery technology and materials for
clean energy. Climate modelling and natural disaster prediction could become more accurate by
processing vast datasets through quantum simulations. Long-term, the quantum internet could
securely connect quantum computers worldwide, allowing for distributed quantum computing.
Despite the hurdles, the future of quantum computing is bright. Governments, tech giants, and
startups alike are investing billions to make it a reality. The technology is still in its infancy, but
its disruptive potential is undeniable. Just as classical computing gave birth to the internet, social
media, and AI, quantum computing could be the cornerstone of the next technological revolution,
solving problems that today are beyond the reach of even our most powerful supercomputers.

4.10.1 Real-World Importance


Healthcare: Molecular modeling for drug discovery, protein folding, personalized medicine.
Finance: Portfolio optimization, market simulation, fraud detection, real-time decision making.
Logistics: Route optimization, supply chain modeling, dynamic scheduling.
Cybersecurity: Quantum-safe encryption and secure communication using quantum key
distribution (QKD).
AI and ML: Speeding up training of models, improving pattern recognition, enhancing data
analysis.

4.10.2. Future Potential


Quantum Internet: Enables secure, high-speed, global quantum communication and networking.
Materials Science: Simulating new materials for superconductors, batteries, solar cells.
Climate Science: Enhances simulation models for weather, climate, and environmental changes.
National Security: Protecting critical infrastructure with quantum encryption, predicting and
countering threats.
Economic Growth: Opens new industries, job roles, and research domains with high innovation
potential.
SYLLABUS

Unit 5

Applications, Use Cases, and the Quantum Future


Real-world application domains:
Healthcare (drug discovery), Material science, Logistics and optimization, Quantum sensing
and precision timing, Industrial case studies: IBM, Google, Microsoft, PsiQuantum, Ethical,
societal, and policy considerations, Challenges to adoption: cost, skills, standardization,
Emerging careers in quantum: roles, skillsets, and preparation pathways, Educational and
research landscape – India's opportunity in the global quantum race
Applications, Use Cases, and the Quantum Future

5.0 Introduction

Quantum computing is poised to revolutionize numerous fields by solving problems that are
practically impossible for classical computers. In medicine, it can simulate molecular
interactions at an atomic level, enabling the discovery of new drugs and personalized
treatments. In finance, quantum algorithms may drastically improve risk analysis, portfolio
optimization, and fraud detection by processing vast datasets in real time. In logistics and supply
chain management, companies like DHL and Volkswagen are already exploring quantum
algorithms to optimize delivery routes and reduce operational costs. Cybersecurity, too, is
expected to transform, as quantum computers may break current encryption methods,
prompting the development of quantum-safe cryptography.
In artificial intelligence, quantum computing can enhance machine learning models, enabling
faster training and better pattern recognition for applications like autonomous driving or
language translation. Climate modelling is another significant use case, where quantum
simulations can offer better predictions for global warming and natural disasters. Material
science can benefit as well, with the discovery of new materials for batteries, superconductors,
or solar panels. Moreover, quantum computing can simulate quantum systems themselves,
aiding the development of better quantum devices. As we look into the future, a quantum-
powered world could bring disruptive innovation, but it will also require entirely new
programming models, infrastructure, and ethical considerations to harness its full potential
responsibly.

5.1 Real-world application domains

Quantum technologies are increasingly moving from theory to real-world application. These
technologies exploit principles of quantum mechanics—such as superposition, entanglement,
and quantum tunneling—to perform tasks that classical systems struggle with or cannot do at
all.
Fig 5.0 Application of Quantum Computing

Here are some real-world applications of quantum technologies, categorized by field:

5.1.1 Healthcare
Drug Discovery
Drug discovery is one of the most promising real-world applications of quantum technologies.
The process of discovering new drugs involves simulating complex molecules and chemical
reactions—tasks that are extremely difficult and time-consuming for classical computers.
Quantum computers offer a revolutionary approach.

Fig.5.1: The Potential Role of Quantum Computing in Biomedicine and Healthcare


Quantum computing has the potential to transform drug discovery by simulating complex
molecular structures and chemical reactions with unprecedented accuracy. Classical
Classical computers struggle with these complex calculations due to the enormous number of
possible configurations in large molecules. Companies like IBM, Google, and D-Wave, as well
as biotech firms like Biogen and Roche, are exploring this for faster drug development.
Quantum systems, however, can process these combinations more efficiently by leveraging
quantum superposition and entanglement. This can significantly reduce the time and cost of
discovering new drugs. Pharmaceutical companies are exploring quantum algorithms to
identify promising compounds and predict how they bind to target proteins. Personalized
medicine also stands to gain, as quantum simulations can model individual genetic variations.
This leads to customized treatments that are more effective with fewer side effects.
Additionally, quantum computing can aid in optimizing clinical trials by selecting ideal patient
groups and predicting outcomes. In the future, quantum-enhanced drug discovery could
accelerate responses to pandemics and rare diseases alike.
Why Quantum Computing for Drug Discovery?

i. Molecular Simulation

Molecules follow the laws of quantum mechanics. Quantum computers can naturally model
these behaviors:

• Simulate interactions between atoms and molecules.

• Predict molecular properties and binding affinities.

• Understand reaction mechanisms at the quantum level.

ii. Speed and Accuracy

Traditional supercomputers use approximations for quantum behavior, which limits accuracy.
Quantum computers can perform these simulations exponentially faster and more accurately,
leading to:

• Faster screening of drug candidates.

• Better prediction of side effects and efficacy.


iii. Reduction in Cost & Time

Traditional drug development takes 10–15 years and billions of dollars. Quantum-enabled
simulations could significantly shorten R&D cycles.

Current Applications & Progress

1. Protein Folding & Target Interaction

Quantum computers help simulate how proteins fold and how drugs bind to them.

Understanding folding is critical for targeting diseases like Alzheimer’s, cancer, and viral
infections.
2. Chemical Reaction Simulation

Modeling how a candidate drug behaves in the human body.

Example: BASF and Zapata Computing work on reaction pathway predictions.

Table 5.1 Companies and Research Labs Involved

Organization                 | Contribution
IBM Quantum                  | Simulated small molecules like LiH and BeH₂; collaborating with biotech firms.
Google Quantum AI            | Simulated basic molecules using the Sycamore quantum processor.
D-Wave                       | Exploring quantum annealing for molecule optimization.
AstraZeneca                  | Collaborating with Quantinuum and Cambridge Quantum for drug design.
Roche & Boehringer Ingelheim | Partnering with quantum startups to simulate complex molecules.
ProteinQure                  | Uses quantum computers for protein-drug interactions and optimization.

5.1.2 Material Science

Quantum technologies are transforming material science by enabling scientists to discover and
design new materials with unprecedented accuracy and speed. Quantum computers and
quantum simulations help model complex atomic interactions that are too difficult for classical
computers to handle.

Why Use Quantum Technologies in Material Science?

1. Quantum systems obey quantum rules

Traditional materials modeling often relies on approximations. Quantum computers simulate
matter at the quantum level—electrons, bonds, energy states—without such approximations.

2. Designing from the atom up

Quantum technologies allow researchers to:

• Discover new superconductors.
• Design stronger, lighter alloys.
• Engineer better batteries, semiconductors, and catalysts.

Quantum computing enables the accurate simulation of material behavior at the atomic level,
which is difficult for traditional systems to achieve. This opens the door to discovering new
materials with tailored properties for use in industries such as energy, electronics, and
aerospace. For instance, researchers could design more efficient superconductors, lighter and
stronger metals, or advanced polymers for biodegradable packaging. Quantum simulations
allow scientists to test and tweak atomic structures before they are physically created, saving
time and resources.

Fig 5.2: Quantum Computing and Simulations for Energy Applications


The development of better batteries—like solid-state or lithium-air types—can also be
accelerated through quantum methods. Solar panel efficiency could be significantly improved
by finding materials that better convert sunlight into electricity. High-performance computing
already assists in these areas, but quantum systems bring the necessary scale and precision. The
ability to model quantum effects directly makes quantum computing an ideal tool for material
science. It may soon lead to breakthroughs in sustainability, electronics, and manufacturing.
For example, Volkswagen is using quantum computing to simulate battery materials.

5.1.3 Logistics and optimization

Quantum computing is set to revolutionize logistics and optimization problems that are
computationally intensive for classical systems. These include route optimization, supply chain
management, inventory forecasting, and delivery scheduling. Quantum algorithms like the
Quantum Approximate Optimization Algorithm (QAOA) are being explored to solve such
combinatorial problems more efficiently.

Fig 5.3: Quantum Computing Applications in Logistics and Supply Chain


Companies like DHL and FedEx are investigating quantum solutions to reduce delivery times
and costs, especially under variable constraints like traffic and weather. In manufacturing,
quantum systems can optimize production line workflows and resource allocation. Airlines
could use quantum methods to improve aircraft scheduling and crew assignments. As logistics
grow more complex with global trade, the ability to find near-optimal solutions rapidly becomes
a competitive advantage. Classical computers reach limitations quickly with these NP-hard
problems, whereas quantum systems scale better. In the near future, logistics powered by
quantum computing could redefine speed and precision in global commerce.
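The kind of problem being targeted can be illustrated with a tiny example. QAOA and quantum annealing address cost functions over binary variables, such as the Max-Cut instance below; the brute-force loop enumerates all 2ⁿ assignments, which is exactly what stops scaling classically as networks grow. The graph and its weights are invented purely for illustration and are not tied to any company's workload.

```python
from itertools import product

# A toy delivery-network graph: 4 nodes, weighted links between them.
# Max-Cut asks for a split of the nodes into two groups that maximizes the total
# weight of links crossing the split, a standard target for QAOA and annealing.
edges = {(0, 1): 3, (0, 2): 1, (1, 2): 2, (1, 3): 4, (2, 3): 5}
n = 4

def cut_value(assignment):
    """Total weight of edges whose endpoints land in different groups."""
    return sum(w for (i, j), w in edges.items() if assignment[i] != assignment[j])

# Brute force over all 2**n assignments. This is the part that stops scaling:
# for hundreds of nodes the search space is astronomically large, which is why
# heuristic quantum approaches like QAOA are being explored for such problems.
best = max(product([0, 1], repeat=n), key=cut_value)
print("best split:", best, "cut weight:", cut_value(best))
```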

5.1.4 Quantum sensing and precision timing

Quantum sensing harnesses the unique properties of quantum systems—such as superposition, entanglement, and quantum tunneling—to detect and measure physical quantities with extreme
precision. Unlike classical sensors, which are limited by thermal noise and other environmental
interferences, quantum sensors exploit the sensitivity of quantum states to detect incredibly
subtle changes in environmental parameters. These include magnetic fields, gravitational
anomalies, electric fields, acceleration, and rotation. As a result, quantum sensing is opening
up new frontiers in areas that demand ultra-high sensitivity and accuracy.
For example, quantum sensors can detect minute changes in magnetic and gravitational fields,
enabling earlier earthquake detection or underground resource mapping. In healthcare, they
may allow for more accurate brain scans or non-invasive diagnostics. Precision timing, powered
by quantum clocks, ensures ultra-accurate synchronization, essential for global positioning
systems (GPS), financial trading networks, and secure communications.
This technology promises breakthroughs in fields that require extremely sensitive instruments,
such as medical imaging, geological surveying, and navigation. Quantum clocks, in particular, are far more stable and precise than current atomic clocks. Defense and aerospace sectors are also
interested in quantum sensors for inertial navigation systems that don’t rely on GPS.
One of the most promising applications of quantum sensing is in geophysical and geological
surveying. For instance, quantum gravimeters can detect underground voids, mineral
deposits, and water sources by sensing tiny variations in gravitational fields. This has significant
implications for oil and gas exploration, mining, archaeology, and even tunnel detection for
military use. Similarly, quantum magnetometers can detect minute changes in magnetic
fields, which can be used to monitor volcanic activity or predict seismic disturbances—enabling
early earthquake warning systems that could save lives.
In the medical field, quantum sensors are expected to revolutionize diagnostic techniques.
Technologies such as quantum-enhanced magnetoencephalography (MEG) and
magnetocardiography (MCG) could allow for highly detailed and non-invasive monitoring
of brain and heart activity. These tools offer improved resolution compared to conventional
imaging systems and can help detect abnormalities at much earlier stages, contributing to early
diagnosis and treatment of neurological or cardiac disorders.
Another critical domain is precision timing, where quantum clocks—often based on optical
transitions in atoms like strontium or ytterbium—offer accuracy several orders of magnitude
higher than traditional atomic clocks. These clocks are capable of maintaining time so precisely
that they would lose less than a second over the age of the universe. This level of precision is
essential for a wide range of applications: GPS and satellite navigation, which require
synchronized clocks to triangulate location; high-frequency financial trading, where
nanosecond timing accuracy can determine the success of transactions; and quantum-secure
communication networks, which rely on exact timing to distribute quantum keys securely.
In defense and aerospace, quantum sensors play a pivotal role in developing inertial
navigation systems that do not depend on GPS signals. These systems can guide submarines,
aircraft, or spacecraft accurately, even when satellite signals are jammed or unavailable. As
geopolitical and space-based threats grow, the ability to operate independently of GPS is
becoming a strategic necessity.
As quantum sensor technologies mature and become more compact and energy-efficient, they
are likely to be integrated into consumer-grade electronics. Future smartphones, wearables,
and health devices could incorporate quantum-enhanced sensors for more accurate fitness
tracking, health diagnostics, and even environmental monitoring. Such capabilities would
quietly yet significantly change how individuals interact with and understand the world around
them.
In summary, quantum sensing and precision timing stand to redefine the limits of measurement and detection across multiple sectors. Their impact, though often behind the scenes, will be foundational in enabling next-generation technologies in science, security, health, navigation, and communication. As these tools move from the lab to real-world deployment, they will play a critical role in building a more precise, responsive, and interconnected future.
5.2 Industrial case studies:

5.2.1 IBM

IBM has been one of the earliest and most active players in the quantum computing industry.
Its flagship platform, IBM Quantum, provides cloud-based access to quantum processors,
allowing researchers, students, and developers to experiment with quantum algorithms. IBM
introduced the Qiskit open-source framework to encourage quantum programming and research
collaboration.
Their IBM Quantum System One, the world’s first integrated quantum system for commercial
use, has been deployed in multiple locations globally. IBM’s roadmap is transparent and
ambitious— they aim to scale quantum hardware from hundreds to thousands of qubits using
error-corrected quantum processors. IBM is also making progress in quantum error correction,
recently demonstrating the use of quantum LDPC (Low-Density Parity-Check) codes, which
are essential for building reliable, large-scale quantum systems. Their 433-qubit “Osprey” processor, unveiled in late 2022, showcased hardware scalability, and the 1,121-qubit “Condor” processor followed in 2023, further pushing the envelope. IBM is
pioneering modular quantum computing, where smaller quantum chips are interconnected to
function as a larger system. This approach mirrors classical multi-core processing and is crucial
for scalability.
In parallel, IBM continues to enhance Qiskit Runtime, an execution environment that optimizes
quantum circuit performance through advanced compilation and error mitigation. IBM also
publishes a transparent quantum roadmap, updated annually, which guides developers,
educators, and researchers globally. Their presence in quantum education is unmatched,
offering resources like Quantum Composer, hands-on labs, and hackathons through the IBM
Quantum Network. Collaborations with organizations such as CERN and MIT underline their
leadership in open science. IBM’s dual commitment to technological progress and community
development positions it as a central pillar in the global quantum ecosystem.
Notably, IBM is collaborating with industries such as healthcare, finance, and chemicals to
apply quantum computing to real-world challenges, including molecule simulation, portfolio
optimization, and materials discovery. They have also partnered with governments and
academic institutions to develop the quantum workforce, showcasing a commitment not only
to technology but also to ecosystem building.
5.2.2 Google

Google captured global attention in 2019 when it claimed quantum supremacy—demonstrating that its 53-qubit processor “Sycamore” could perform a specific computation in
200 seconds that would take the best classical supercomputer thousands of years. While the
practical value of the task was debated, the experiment marked a significant milestone in
quantum hardware progress.
Google’s quantum research division is focused on building a fault-tolerant quantum
computer with 1 million physical qubits. They are exploring use cases in AI, optimization, and
quantum chemistry. Google is also investing heavily in error correction techniques to make
large-scale quantum computing viable.
Google is also pursuing a quantum-AI hybrid model, where classical and quantum systems work in tandem to accelerate tasks like data clustering, pattern recognition, and neural network training.
Their Quantum AI Campus in Santa Barbara is home to cutting-edge labs where innovations
in cryogenics, qubit calibration, and quantum firmware are rapidly advancing. Google’s team
has made important strides in quantum error suppression through techniques like zero-noise
extrapolation and quantum benchmarking. Their recent work on surface codes and logical
qubits shows measurable progress toward fault tolerance.
Their bold long-term vision includes integrating quantum computing with classical systems and
cloud infrastructure. As a tech leader, Google’s work is influencing academic and industrial
agendas worldwide, accelerating innovation and competition in the quantum space.
Beyond hardware, Google actively contributes to the open-source quantum community through
Cirq and TensorFlow Quantum, allowing AI researchers to explore quantum-enhanced
machine learning models. Google’s Quantum Computing Service aims to eventually offer
practical applications through Google Cloud, bringing enterprise-level quantum access into
mainstream business. Their partnerships with institutions like UC Berkeley and ETH Zurich
are advancing research in quantum simulation and condensed matter physics. With a strong
emphasis on long-term scalability, Google's approach balances scientific rigor with practical engineering. Its bold goal—achieving a commercial-grade, error-corrected quantum computer this decade—drives much of the industry's pace.

5.2.3 Microsoft

Microsoft is approaching quantum computing from a full-stack perspective. Unlike IBM and
Google, Microsoft is working on topological qubits, a type of qubit expected to be more stable
and less error-prone than traditional ones. While topological qubits are still in early stages,
Microsoft is simultaneously providing tools and platforms through Azure Quantum, a
cloud-based ecosystem that offers access to quantum hardware and simulators from multiple
vendors.
Azure Quantum’s integration with Microsoft’s cloud ecosystem gives users access to quantum
solutions alongside tools like Azure AI and Azure HPC—bridging classical and quantum
workflows. Microsoft’s Quantum Innovator Series and technical documentation have been
influential in educating enterprises on how to prepare for the quantum era. They are also
focusing on quantum-resilient cryptography, developing protocols that can withstand both
classical and quantum attacks. By building an abstraction layer across diverse hardware,
Microsoft is enabling developers to write once and deploy across platforms, accelerating
application prototyping. Their end-to-end approach reflects a deep commitment to usability,
scalability, and enterprise adoption.
Their Quantum Development Kit (QDK) includes Q#, a programming language designed
specifically for quantum algorithms. By focusing on integration and developer accessibility,
Microsoft’s contribution lies not only in quantum research but also in making quantum
technologies available and practical for developers and organizations across various sectors.
Microsoft is also deeply invested in quantum error correction, actively exploring Majorana
fermions—exotic particles believed to make topological qubits naturally error-resistant. Their
StationQ lab, headquartered at UC Santa Barbara, focuses on this ambitious path, which, if
successful, could leap ahead of current noisy qubit approaches. Microsoft has also established
partnerships with academic institutions and quantum startups to develop hybrid
quantum-classical algorithms tailored for early business use cases. They emphasize the
importance of resource estimation tools, allowing developers to assess what kind of quantum
system is required to run a given algorithm.

5.2.4 PsiQuantum

PsiQuantum takes a unique and bold approach to quantum computing by building a photonic
quantum computer using conventional semiconductor fabrication techniques. Their goal is to
build a fault-tolerant, million-qubit quantum computer using photons as qubits instead of
superconducting circuits. PsiQuantum’s photonic approach benefits from the low decoherence
of photons, which can travel long distances without interacting with their environment—a
major advantage over fragile superconducting qubits. Their system uses linear optical
elements, such as beam splitters and phase shifters, along with single-photon sources and
detectors, which can be manufactured using standard CMOS fabrication techniques. This
positions PsiQuantum to benefit from existing semiconductor supply chains and reduce
hardware costs in the long run.
They are also investing in cryogenic electronics and quantum-classical control systems that
can scale with photonic architectures. PsiQuantum has filed numerous patents related to fault-
tolerant architecture design, photon routing, and quantum error correction, highlighting
the depth of their IP strategy. The firm collaborates with government agencies like DARPA
and national laboratories, and is exploring applications in energy optimization, quantum
networking, and climate modeling. Though still in stealth for some aspects of their
technology, PsiQuantum aims to build a utility-scale quantum computer that could run
meaningful applications with full error correction. Their combination of high ambition,
deep physics, and scalable engineering could allow them to emerge as a disruptive force in
the global quantum race.
This design choice aims to solve scalability and error correction challenges from the ground up.
Unlike other quantum startups, PsiQuantum emphasizes working with existing silicon foundries
to leverage mature infrastructure and reduce manufacturing risk. Although their systems are not
yet publicly available, the company has received significant investment and is partnering with
industry leaders and government bodies to advance its technology. If successful, PsiQuantum
could leapfrog traditional architectures by introducing a scalable and manufacturable approach
to quantum hardware.

5.3 Ethical, societal, and policy considerations

As quantum computing moves from theoretical promise to technological reality, it raises profound ethical, societal, and policy questions that demand proactive attention. One of the
foremost concerns is the potential to break existing encryption standards. Quantum
algorithms like Shor’s algorithm could render RSA and ECC encryption obsolete, risking
exposure of sensitive data, national security secrets, and private communications. This
necessitates the urgent development and global adoption of post-quantum cryptography to
future-proof digital systems.
Societal inequality is another major issue. If quantum computing remains accessible only to
wealthy corporations or powerful governments, it could widen the digital divide and reinforce
global disparities. Open-source tools, educational programs, and public-sector funding are
essential to democratize access and ensure that quantum benefits are shared across societies.
Job displacement and workforce transformation will also follow. While quantum
technology creates new opportunities, it may disrupt industries by automating tasks or shifting
required skillsets. Preparing a new generation of quantum-literate professionals will require
major reforms in education, including curriculum updates and reskilling initiatives for existing
workers.
Bias and fairness in quantum-enhanced AI systems pose risks as well. If data and algorithms
are biased at the classical level, quantum acceleration could magnify these biases at scale.
Ensuring transparency, explainability, and ethical use of quantum algorithms becomes crucial,
especially in high-stakes fields like finance, healthcare, and criminal justice.
From a policy perspective, governments must develop frameworks for international
cooperation, cybersecurity, export controls, and intellectual property related to quantum
technologies. Just as nuclear technology required treaties and safeguards, quantum computing
calls for regulatory foresight to prevent misuse and promote peaceful innovation.
Ultimately, responsible quantum development must balance scientific ambition with human
values. A collaborative approach—uniting governments, academia, industry, and civil
society—is key to ensuring that quantum advancements uplift humanity without compromising
security, privacy, or equality.

5.4 Challenges to adoption: cost, skills, standardization

The road to mainstream adoption of quantum computing is filled with significant challenges,
the most immediate being cost. Building and maintaining quantum systems—especially those
based on superconducting qubits— requires not only sophisticated technology but also
environments cooled to near absolute zero, typically using expensive dilution refrigerators. The
infrastructure needed to support such systems involves complex shielding from electromagnetic
interference, ultra-stable power sources, and precise control equipment. These requirements
drive up capital and operational expenses, making it nearly impossible for small startups,
educational institutions, or developing countries to participate meaningfully in quantum
research and development. As of now, only a handful of tech giants and government-backed
research labs possess the resources needed to invest in such large-scale quantum initiatives.
Beyond cost, the shortage of skilled professionals in the quantum ecosystem is a pressing
concern. Quantum computing is a multidisciplinary domain that spans quantum mechanics,
advanced mathematics, classical and quantum algorithms, and computer engineering. However,
academic programs offering dedicated training in quantum information science are still limited.
This creates a bottleneck in talent availability, with companies and universities struggling to
find individuals who can bridge the gap between theoretical research and practical system
development. The few who are highly skilled are in such high demand that they are often
absorbed into elite roles within top-tier tech companies or academic institutions, further limiting
broad-based industry access.
The skills gap also hampers innovation. Without a sufficiently large and well-trained
workforce, progress in algorithm design, hardware testing, and software integration slows
considerably. This shortage extends to educators and trainers as well, meaning that scaling up
learning programs is itself a challenge. Governments and educational institutions have started
investing in quantum literacy initiatives, but progress is slow compared to the pace of
technological advancement.
Quantum computing demands a rare combination of knowledge in quantum physics,
mathematics, computer science, and engineering. As a result, the number of trained
professionals capable of designing, building, and programming quantum systems is critically
low.
Another formidable barrier is the lack of standardization across the quantum computing
ecosystem. In classical computing, universal programming languages (like C, Java, or Python),
standardized chip architectures (like x86 or ARM), and defined protocols for data exchange
have created an ecosystem where hardware and software can evolve rapidly and cooperatively.
In contrast, the quantum world remains fragmented. Each hardware vendor—whether working
on superconducting qubits, trapped ions, photonic systems, or topological qubits—uses unique
control systems, programming environments, and error correction methods. As a result,
software written for one platform is rarely portable to another, making collaboration and system
integration difficult.
The absence of standardization also means there is no shared benchmarking system to measure
progress objectively across platforms. This makes it harder for organizations to make informed
decisions about which quantum technologies to invest in, and for researchers to compare results
and replicate studies. Without agreed-upon protocols, it’s also difficult to ensure compatibility
between different layers of the quantum computing stack—from hardware to middleware to
application software.
Until these core challenges—cost, workforce skills, and system standardization—are
addressed, quantum computing will remain largely in the domain of research and
experimentation. For the technology to achieve widespread adoption and commercial viability,
there must be concerted efforts by governments, academia, and industry to democratize access,
invest in education, and agree on shared frameworks and protocols. Only then can the true
transformative potential of quantum computing be fully realized across sectors such as
healthcare, finance, energy, logistics, and beyond.
5.5 Emerging careers in quantum: roles, skillsets, and preparation pathways

The rise of quantum computing is generating an exciting array of new career opportunities,
blending physics with computer science, mathematics, and engineering. As quantum
technologies move closer to practical application, the demand for skilled professionals is
growing rapidly. Among the most prominent emerging roles is the Quantum Software
Developer, responsible for writing algorithms tailored to quantum computers using specialized
frameworks such as IBM’s Qiskit, Google’s Cirq, Xanadu’s PennyLane, or Microsoft’s Q#. These developers work on
creating quantum programs for applications in cryptography, optimization, chemistry, and
machine learning. Another critical role is that of the Quantum Hardware Engineer, who
designs, tests, and maintains the delicate physical systems—such as superconducting circuits,
ion traps, or photonic chips—that serve as the backbone of quantum computation. These
engineers must understand cryogenics, quantum control systems, and the physics of qubit
interactions. Their work ensures the reliable operation of quantum processors under extreme
environmental conditions.
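
To make the quantum software developer's work concrete, the short sketch below builds a two-qubit Bell-state circuit in Qiskit, one of the frameworks named above. It is a minimal illustration rather than a production workflow, and it assumes the open-source qiskit package is installed; exact method names can vary slightly between Qiskit releases.

    # Minimal sketch: a two-qubit Bell-state circuit in Qiskit (assumes qiskit is installed)
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)      # two qubits, two classical bits
    qc.h(0)                        # put qubit 0 into an equal superposition
    qc.cx(0, 1)                    # entangle qubit 0 with qubit 1
    qc.measure([0, 1], [0, 1])     # read both qubits into the classical bits

    print(qc.draw())               # text diagram of the circuit

Writing, testing, and optimizing circuits like this one, at far larger scales and for domain-specific problems, is the day-to-day work of a quantum software developer.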
Quantum Researchers and Quantum Algorithm Scientists play a foundational role in
pushing the frontiers of the field. They focus on developing more stable and error-resistant
qubits, inventing novel quantum algorithms, and improving quantum error correction and fault-
tolerance mechanisms. Many of these professionals work in academia or research labs but are
increasingly being recruited into private-sector R&D roles.
In parallel, Quantum Information Scientists work on the theoretical aspects of how quantum
systems process, transmit, and secure information. Their insights underpin advances in areas
like quantum cryptography, quantum communications, and entanglement-based networks.
Meanwhile, the industry is seeing the emergence of roles such as Quantum Systems
Integrators, who bridge the gap between hardware, software, and applications—ensuring that
quantum components work together efficiently across the tech stack.
With the growing intersection of business and quantum, companies are also hiring Quantum
Product Managers, who guide the development and delivery of quantum solutions aligned
with customer needs and market trends. Similarly, Quantum Cybersecurity Analysts are
becoming vital in preparing organizations for a post-quantum world by analyzing encryption
vulnerabilities and implementing quantum-safe cryptographic protocols.
The required skillsets for these careers are diverse but generally include a strong foundation in
quantum mechanics, linear algebra, probability theory, and classical programming
languages like Python or C++. Knowledge of quantum programming platforms, familiarity
with quantum gates and circuits, and experience with simulation tools are increasingly
expected. In hardware-related roles, additional expertise in electrical engineering,
nanofabrication, cryogenics, or optics may be essential.
To prepare for a career in quantum technologies, students and professionals can pursue formal
degrees in physics, computer science, mathematics, or electrical engineering. Many
universities now offer specialized quantum computing master’s programs, interdisciplinary
PhDs, and research assistantships in quantum labs. For those seeking flexible learning paths,
numerous online platforms—including edX, Coursera, QuTech Academy, and MITx—offer
quantum computing courses. Additionally, companies like IBM, Microsoft, and Google
provide free tools and resources for self-learning and experimentation.
Hands-on training is increasingly vital. Platforms such as IBM Quantum Experience, Azure
Quantum, and Amazon Braket allow users to access real quantum hardware and simulators.
Industry certifications, hackathons, internships, and quantum developer bootcamps are also
emerging as effective ways to gain practical exposure and build credibility in the field.
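
As a small illustration of that hands-on workflow, the sketch below runs a Bell-state circuit on a local simulator using the qiskit-aer package; this assumes a local installation of qiskit and qiskit-aer, and cloud services such as IBM Quantum or Amazon Braket expose real hardware and simulators through a broadly similar submit-job, retrieve-counts pattern.

    # Minimal sketch: running a small circuit on a local simulator
    # (assumes the qiskit and qiskit-aer packages are installed)
    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)
    qc.measure([0, 1], [0, 1])

    sim = AerSimulator()
    job = sim.run(transpile(qc, sim), shots=1024)   # submit 1024 repetitions ("shots")
    counts = job.result().get_counts()
    print(counts)   # roughly half '00' and half '11', as expected for a Bell state

Moving from such simulator runs to queued jobs on real quantum hardware is exactly the kind of practical experience that employers increasingly look for.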
As quantum technology evolves, so too will the career landscape. Interdisciplinary
collaboration, combining physics, engineering, AI, and cybersecurity, will be essential.
Lifelong learning and adaptability will remain key traits for anyone aiming to build and sustain
a successful career in the quantum workforce of the future.

5.6 Educational and research landscape – India's opportunity in the global quantum race

India is uniquely positioned to play a pivotal role in the global quantum revolution, thanks to
its vast pool of scientific talent, growing technology infrastructure, and increased policy-level
attention to emerging technologies. Recognizing the transformative potential of quantum
computing, the Indian government launched the National Mission on Quantum Technologies
& Applications (NM-QTA) with a significant outlay of ₹8,000 crore (around $1 billion
USD). Premier institutes such as IISc Bangalore, the IITs at Bombay, Delhi, Madras, and Kharagpur,
and the Tata Institute of Fundamental Research (TIFR) are at the forefront of academic research
in quantum physics and quantum computing. These institutions are engaged in pioneering work
on quantum algorithms, quantum key distribution (QKD), quantum error correction,
quantum optics, and quantum materials. In parallel, specialized quantum research labs are
being established in collaboration with government agencies such as DRDO, ISRO, and DST,
further expanding India’s R&D footprint.
Educational initiatives are also gathering momentum. Universities are beginning to offer
elective and degree programs in quantum information science, and efforts are underway to
integrate quantum modules into engineering and physics curricula at both undergraduate and
postgraduate levels. The Quantum Computer Simulator Toolkit (QSim), launched by the
Ministry of Electronics and Information Technology (MeitY), is an important step toward
democratizing quantum education. QSim allows students and researchers to develop and test
quantum algorithms on simulated environments without needing access to real quantum
hardware.
Despite this promising start, India must address several systemic challenges to fully harness its
potential. A major bottleneck is the shortage of trained faculty and researchers who specialize
in quantum science. Additionally, infrastructure gaps—such as the lack of high-fidelity
quantum hardware, advanced fabrication labs, and dedicated quantum computing centers—
impede rapid progress. There is also a pressing need to foster deeper industry-academic
collaborations, which remain limited compared to global counterparts.
To bridge these gaps, public-private partnerships (PPP) must be scaled up. Tech companies
like TCS, Infosys, and HCL are beginning to explore quantum computing applications and
can play a vital role in commercializing academic research. India should also focus on
international collaborations with leading quantum research hubs in the US, EU, Canada, and
Japan to gain access to expertise, platforms, and funding. Encouraging student participation
through quantum hackathons, fellowships, and global internships will further energize the
ecosystem.
With its robust IT and software industry, deep mathematical and scientific base, and strong
policy direction, India has the potential not only to catch up with global quantum leaders but
also to lead in select areas such as quantum software development, theoretical quantum
research, quantum cryptography, and simulation technologies. For this vision to
materialize, a long-term commitment to curriculum reform, faculty development, infrastructure
investment, and ecosystem collaboration is essential.
If India leverages these strengths strategically, it can transform from a follower to a global
innovator in quantum technologies—contributing significantly to secure communications,
next-generation computing, precision medicine, and national defence.
