Qta Unit-1,3,4,5
Unit 1
This course is designed to introduce students to the fundamental concepts of quantum theory
while showing how these ideas are being applied in cutting-edge quantum technologies. It begins
with the basics of quantum mechanics, covering wave-particle duality, the uncertainty principle,
quantum states, operators, and the Schrödinger equation, laying the groundwork for
understanding how nature behaves at microscopic scales. The course then transitions into
quantum computing, where students learn about qubits, quantum gates, quantum circuits, and
algorithms such as Grover’s and Shor’s that outperform classical solutions. Students are
introduced to quantum sensing and metrology, where quantum systems enable ultra-sensitive
measurements vital to healthcare, navigation, and defence applications. The final unit discusses
real-world industrial applications of quantum technologies, highlighting how companies like
IBM, Google, and Microsoft are deploying quantum systems to solve complex problems. It also
examines global initiatives and India’s strategic efforts, such as the National Mission on
Quantum Technologies & Applications.
The course addresses the challenges of scalability, cost, standardization, and workforce
readiness, giving learners a realistic view of the current landscape. Importantly, it opens doors to
emerging careers in quantum science by equipping students with interdisciplinary skills across
physics, math, and programming. By studying this course, learners gain a solid understanding of
the theory behind quantum phenomena and the practical knowledge needed to engage with this
frontier field. As quantum technology continues to evolve, those equipped with this knowledge
will be at the forefront of innovation in industries ranging from computing and cybersecurity to
materials science and precision medicine.
At the end of the 19th century, classical physics was seen as nearly complete. Newtonian
mechanics explained planetary motion, Maxwell’s equations described electromagnetism,
thermodynamics clarified heat and work, and classical optics enabled scientific tools like
microscopes and telescopes. Chemistry had a basic periodic table, but the atomic structure and
electron behaviour in reactions were not understood.
The transition from classical physics to quantum mechanics marks one of the most profound
paradigm shifts in scientific history. Classical physics, which dominated for over two centuries,
was grounded in the belief that all natural phenomena could be described by deterministic laws.
Concepts such as Newton’s laws of motion, thermodynamic principles, and Maxwell’s equations
provided a complete and elegant framework for understanding the physical world.
Rutherford’s experiments revealed atoms had dense, positively charged nuclei with orbiting
electrons, raising the question of why electrons didn't collapse into the nucleus—something
classical physics couldn’t explain due to predicted energy loss. This, together with other unexplained phenomena such as blackbody radiation, the photoelectric effect, hydrogen spectral lines, and molecular spectroscopy, exposed the limitations of classical models.
However, as experiments grew more precise at the atomic and subatomic scales, cracks began to
appear in this classical edifice. Phenomena like blackbody radiation defied classical
predictions, as models such as the Rayleigh-Jeans law led to absurd results like the ultraviolet
catastrophe—where infinite energy emission at short wavelengths was expected. Similarly, the
photoelectric effect revealed that light could behave like discrete packets of energy, or photons,
as Einstein proposed—challenging the continuous wave view held by classical optics.
These challenges led to the development of quantum mechanics in the early 20th century, with
major contributions from scientists like Planck, Einstein, Bohr, Schrödinger, and others. Before
exploring the key quantum experiments, this chapter first reviews the classical wave model of light, which dominated thinking before quantum theory.
The stability of atoms posed another riddle. According to classical electromagnetism, orbiting
electrons should continuously emit radiation and spiral into the nucleus, yet atoms remained
stable. Moreover, when hydrogen gas was excited, it emitted light in distinct spectral lines rather
than a continuous spectrum—something classical theory couldn’t explain. These inconsistencies
pointed to the need for a new theoretical framework. Enter quantum mechanics: a theory based
not on certainty, but probability, where particles like electrons exhibit both wave-like and
particle-like properties. Max Planck introduced the idea that energy is quantized, emitted in
discrete amounts called quanta. This concept became the seed for a revolution. Niels Bohr refined
atomic models using quantized orbits to explain hydrogen spectra. Louis de Broglie suggested
that matter had wave properties, while Werner Heisenberg introduced the uncertainty principle,
redefining how we measure physical systems. Schrödinger’s wave equation offered a new
mathematical tool to describe electron behaviour in atoms.
Quantum mechanics did not just revise physics—it redefined our understanding of reality itself.
Unlike classical physics, where outcomes were predictable, quantum theory embraced
uncertainty and probability. Observables like position and momentum could no longer be known
simultaneously with arbitrary precision. The deterministic worldview gave way to a statistical
one, yet this new approach proved incredibly accurate and predictive. Quantum theory provided
the foundation for semiconductors, lasers, nuclear energy, and much more. It also laid the
groundwork for modern quantum technologies—such as quantum computing, quantum
cryptography, and quantum sensing—which are now driving a new technological revolution. The
journey from classical to quantum physics reminds us that scientific knowledge evolves, often
through radical shifts, as we probe deeper into the fundamental nature of the universe.
As described by Eqs. (1.1) and (1.2), the electric field E and the magnetic field B of an electromagnetic wave are perpendicular to each other, as shown in Figure 1.1, and oscillate in phase at the angular frequency

ω = 2πν    (1.3)

where ν is the frequency of the oscillation, measured in units of s⁻¹ = Hz. In Eqs. (1.1) and (1.2), k is the wave vector (or momentum vector) of the electromagnetic wave, whose magnitude is defined by Eq. (1.4):

k = 2π/λ    (1.4)
Here, 𝜆 is the wavelength of the radiation, measured in units of length, and is defined by the
distance between two consecutive peaks (or troughs) of the electric or magnetic fields. Vector
quantities, such as the electric and magnetic fields, are indicated by an arrow over the symbol or
by bold typeface. Since light is a wave, it exhibits properties such as constructive and destructive
interference. Thus, when light impinges on a narrow slit, it shows a diffraction pattern similar to
that of a plain water wave that falls on a barrier with a narrow aperture. These wave properties
of light were well known, and therefore, light was considered to exhibit wave properties only, as predicted by Maxwell’s equations.
In general, any wave motion can be characterized by its wavelength 𝜆, its frequency ν, and its
propagation speed. For light in vacuum, this propagation speed is the velocity of light c (c= 2.998
× 10⁸ m/s). In the context of this discussion, the interaction of light with matter will be described as the force exerted by the electric field on the charged particles, atoms, and molecules. This interaction causes a translation of charge. This description leads to the concept of the “electric transition moment,” which will be used as the basic quantity to describe the likelihood (that is, the intensity) of a spectral transition. In other forms of optical spectroscopy, the magnetic transition moment must be considered as well. This interaction leads to a coupled translation and
rotation of charge, which imparts a helical motion of charge. This helical motion is the hallmark
of optical activity, since, by definition, a helix can be left- or right-handed.
Thus, light as an electromagnetic wave serves as a bridge between classical theory and the
quantum view. While Maxwell’s equations beautifully describe the propagation and wave
behaviour of light, they fall short when explaining phenomena that involve quantized energy
exchange, such as the photoelectric effect or atomic emission spectra. These limitations led to
the development of quantum theory. However, even in the quantum age, the classical wave model
remains foundational for understanding a wide range of light–matter interactions, especially in
spectroscopy, communications, and optical engineering.
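These relations between wavelength, frequency, angular frequency, and wave vector can be checked with a short numerical sketch. The 633 nm wavelength below is an assumed example value (a familiar red laser line), not a number from the text:

```python
# Minimal sketch of the classical wave relations for light.
import math

c = 2.998e8          # speed of light in vacuum [m/s]
wavelength = 633e-9  # assumed example wavelength lambda [m]

nu = c / wavelength           # frequency [Hz], from c = lambda * nu
omega = 2 * math.pi * nu      # angular frequency [rad/s]
k = 2 * math.pi / wavelength  # magnitude of the wave vector [1/m]

print(f"nu    = {nu:.3e} Hz")
print(f"omega = {omega:.3e} rad/s")
print(f"k     = {k:.3e} 1/m")
```

Any one of the three quantities determines the other two once the propagation speed is fixed, which is why spectroscopists freely switch between wavelength, frequency, and wavenumber.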
It was not possible to describe the experimentally obtained blackbody emission profile by classical physical models. This profile is shown in Figure 1.2 for several temperatures between
1000 and 5000 K as a function of wavelength. M. Planck attempted to reproduce the observed
emission profile using classical theory, based on atomic dipole oscillators (nuclei and electrons)
in motion. These efforts revealed that the radiation density ρ emitted by a classical blackbody
into a frequency band dν as a function of ν and T would be given by Eq. (1.5):

ρ(ν, T) dν = (8πν²/c³) kT dν    (1.5)

where the Boltzmann constant k = 1.381 × 10⁻²³ [J/K]. This result indicated that the total energy radiated by a blackbody according to this “classical” model would increase with ν² as shown by
the dashed curve in Figure 1.2b. If this equation were correct, any material above absolute zero would emit radiation according to Eq. (1.5), and the total energy emitted would be unrestricted and approach infinity. In particular, toward higher frequencies more and more radiation would be emitted, and the blackbody would cool instantaneously to 0 K; thus, any temperature above 0 K would be impossible. This is, of course, in contradiction with experimental results and was addressed by
M. Planck (1901) who solved this conundrum by introducing the term 1/(e^(h𝜈/kT) −1) into the
blackbody equation, where h is Planck’s constant:

ρ(ν, T) dν = (8πhν³/c³) · 1/(e^(hν/kT) − 1) dν    (1.6)
The shape of the modified blackbody emission profile given by Eq. (1.6) is in agreement with
experimental results. The new term introduced by Planck is basically an exponential decay
function, which forces the overall response profile to approach zero at high frequency. The
numerator of the exponential expression contains the quantity hν, where h is Planck’s constant
(h =6.626 × 10−34 Js). This numerator implies that light exists as “quanta” of light, or light
particles (photons) with energy E:

E_photon = hν    (1.7)
This, in itself, was a revolutionary thought since the wave properties of light had been established
more than two centuries earlier and had been described in the late 1800s by Maxwell’s equations
in terms of electric and magnetic field contributions. Here arose for the first time the realization
that two different descriptions of light, in terms of waves and particles, were appropriate
depending on what questions were asked. A similar “particle–wave duality” was later postulated
and confirmed for matter as well. Thus, the work by Planck very early in the twentieth century
is truly the birth of the ideas resulting in the formulation of quantum mechanics.
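Planck’s resolution of the ultraviolet catastrophe can be illustrated numerically. The sketch below compares the classical (Rayleigh-Jeans) radiation density of Eq. (1.5) with Planck’s corrected profile of Eq. (1.6); the temperature of 5000 K is taken from the range shown in Figure 1.2:

```python
# Sketch: classical vs. Planck blackbody radiation density at T = 5000 K.
import math

h = 6.626e-34   # Planck's constant [J s]
k = 1.381e-23   # Boltzmann constant [J/K]
c = 2.998e8     # speed of light [m/s]
T = 5000.0      # temperature [K]

def rho_classical(nu):
    # Rayleigh-Jeans, Eq. (1.5): grows as nu**2 -> ultraviolet catastrophe
    return 8 * math.pi * nu**2 * k * T / c**3

def rho_planck(nu):
    # Planck, Eq. (1.6): the 1/(exp(h nu / kT) - 1) term forces the
    # profile toward zero at high frequency (expm1 is exp(x) - 1)
    return (8 * math.pi * h * nu**3 / c**3) / math.expm1(h * nu / (k * T))

for nu in (1e13, 1e14, 1e15):  # infrared -> ultraviolet
    print(f"nu = {nu:.0e} Hz: classical = {rho_classical(nu):.3e}, "
          f"Planck = {rho_planck(nu):.3e}")
```

At low frequency the two expressions nearly agree, while at high frequency the Planck profile is suppressed by orders of magnitude, exactly the behavior needed to match experiment.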
In 1905, Einstein reported experimental results that further demonstrated the energy quantization
of light. In the photoelectric experiment, light of variable color (frequency) illuminated a
photocathode contained in an evacuated tube. An anode in the same tube was connected
externally to the cathode through a current meter and a source of electric potential (such as a
battery). Since the cathode and anode were separated by vacuum, no current was observed, unless
light with a frequency above a threshold frequency was illuminating the photocathode. Einstein
correctly concluded that light particles, or photons, with a frequency above this threshold value
had sufficient kinetic energy to knock out electrons from the metal atoms of the photocathode.
These “photoelectrons” left the metal surface with a kinetic energy given by Eq. (1.9):

E_kin = hν − 𝜙    (1.9)

where 𝜙 is the work function, or the energy required to remove an electron from metal atoms.
This energy basically is the atoms’ ionization energy multiplied by Avogadro’s number.
Furthermore, Einstein reported that the photocurrent produced by the irradiation of the
photocathode was proportional to the intensity of light, or the number of photons, but that
increasing the intensity of light that had a frequency below the threshold did not produce any
photocurrent. This provided further proof of Eq. (1.9). This experiment further demonstrated that
light has particle character with the kinetic energy of the photons given by Eq. (1.7), which led
to the concept of wave–particle duality of light. Later, de Broglie theorized that the momentum
p of a photon was given by
p = h/λ    (1.10)
Equation (1.10) is known as the de Broglie equation. The wave–particle duality was later (1927)
confirmed to be true for moving masses as well by the electron diffraction experiment of
Davisson and Germer [3]. In this experiment, a beam of electrons was diffracted by an atomic
lattice and produced a distinct interference pattern that suggested that the moving electrons
exhibited wave properties. The particle–wave duality of both photons and moving matter can be
summarized as follows. For photons, the wave properties are manifested by diffraction
experiments and summarized by Maxwell’s equation. As for all wave propagation, the velocity
of light, c, is related to wavelength λ and frequency ν by Eq. (1.11):

c = λν    (1.11)

with c = 2.998 × 10⁸ [m/s], λ expressed in [m], and ν expressed in [Hz = s⁻¹]. The quantity ν̃ is referred to as the wavenumber of radiation (in units of m⁻¹ or cm⁻¹) and indicates how many wave cycles occur per unit length:

ν̃ = 1/λ = ν/c    (1.12)
Notice that a photon can only move at the velocity of light and the photon mass can only be
defined at the velocity c. Therefore, a photon has zero rest mass, m0. Particles of matter, on the
other hand, have a nonzero rest mass, commonly referred to as their mass. This mass, however,
is a function of velocity v and should be referred to as m_v, which is given by Eq. (1.16):

m_v = m₀/√(1 − v²/c²)    (1.16)
Equation (1.16) demonstrates that the mass of any matter particle will reach infinity when
accelerated to the velocity of light. Their kinetic energy at velocity v (far from the velocity of
light) is given by the classical expression

E_kin = ½mv²    (1.17)
The discussion of the last paragraphs demonstrates that at the beginning of the twentieth century,
experimental evidence was amassed that pointed to the necessity to redefine some aspects of
classical physics. The next of these experiments that led to the formulation of quantum mechanics
was the observation of “spectral lines” in the absorption and emission spectra of the hydrogen
atom.
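The photoelectric relation of Eq. (1.9) and the de Broglie relation of Eq. (1.10) can be checked with a minimal numerical sketch. The work function of roughly 2.28 eV (sodium) and the 400 nm illumination are assumed illustrative values, not numbers from the text:

```python
# Sketch: photoelectron kinetic energy and its de Broglie wavelength.
import math

h = 6.626e-34    # Planck's constant [J s]
c = 2.998e8      # speed of light [m/s]
m_e = 9.109e-31  # electron rest mass [kg]
eV = 1.602e-19   # joules per electronvolt

phi = 2.28 * eV          # assumed work function (sodium) [J]
nu_threshold = phi / h   # below this frequency no photoelectrons appear
wavelength = 400e-9      # assumed illumination: violet light [m]
nu = c / wavelength

E_kin = h * nu - phi     # Eq. (1.9): photoelectron kinetic energy
print(f"threshold frequency = {nu_threshold:.3e} Hz")
print(f"E_kin at 400 nm     = {E_kin / eV:.2f} eV")

# de Broglie wavelength, Eq. (1.10), of the ejected electron,
# using the classical momentum from E_kin = p**2 / (2 m)
p = math.sqrt(2 * m_e * E_kin)
print(f"de Broglie lambda   = {h / p:.3e} m")
```

Raising the light intensity at a frequency below nu_threshold would increase the photon count but not any single photon’s energy, which is why no photocurrent flows, in agreement with Einstein’s interpretation.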
Between the last decades of the nineteenth century and the first decade of the twentieth century,
several researchers discovered that hydrogen atoms, produced in gas discharge lamps, emit light
at discrete colors, rather than as a broad continuum of light as observed for a blackbody (Figure
1.2a). These emissions occur in the ultraviolet, visible, and near-infrared spectral regions, and a
portion of such an emission spectrum is shown schematically in Figure 1.3. These observations
predate the efforts discussed in the previous two sections and therefore may be considered the
most influential in the development of the connection between spectroscopy and quantum
mechanics.
These experiments demonstrated that the H atom can exist in certain “energy states” or
“stationary states.” These states can undergo a process that is referred to as a “transition.” When
the atom undergoes such a transition from a higher or more excited state to a lower or less excited
state, the energy difference between the states is emitted as a photon with an energy
corresponding to the energy difference between the states:

ΔE = E_f − E_i = hν    (1.18)

where the subscripts f and i denote, respectively, the final and initial (energy) state of the atom (or molecule). Such a process is referred to as an “emission” of a photon. Similarly, an absorption
process is one in which the atom undergoes a transition from a lower to a higher energy state, the
energy difference being provided by a photon that is annihilated in the process. Absorption and
emission processes are collectively referred to as “transitions” between stationary states and are
directly related to the annihilation and creation, respectively, of a photon. The wavelengths or
energies from the hydrogen emission or absorption experiments were fit by an empirical equation
known as the Rydberg equation, which gave the energy “states” of the hydrogen atom as Eq. (1.19):

E(n) = −Ry/n²    (1.19)

In this equation, n is an integer (>0) “quantum” number, and Ry is the Rydberg constant (Ry
=2.179 × 10−18 J). This equation implies that the energy of the hydrogen atom cannot assume
arbitrary energy values, but only “quantized” levels, E(n). This observation led to the ideas of
electrons in stationary planetary orbits around the nucleus, which – however – was in
contradiction with existing knowledge of electrodynamics, as discussed in the beginning of this
chapter. The energy level diagram described by Eq. (1.19) is depicted in Figure 1.4. Here, the
sign convention is as follows. For n= ∞, the energy of interaction between nucleus
and electron is zero, since the electron is no longer associated with the nucleus. The lowest energy
state is given by n= 1, which corresponds to the H atom in its ground state that has a negative
energy of 2.179 × 10−18 J. Equation (1.19) provided a background framework to explain the
hydrogen atom emission spectrum. According to Eq. (1.19), the energy of a photon, or the energy difference of the atomic energy levels, between any two states n_f and n_i can be written as Eq. (1.20):

ΔE = E(n_f) − E(n_i) = −Ry(1/n_f² − 1/n_i²)    (1.20)
At this point, an example may be appropriate to demonstrate how this empirically derived
equation predicts the energy, wavelength, and wavenumber of light emitted by hydrogen atoms.
This example also introduces a common problem, namely, that of units. Although there is an
international agreement about what units (the system international, or SI units) are to be used to
describe spectral transitions, the problem is that few people are using them. All efforts will be
made to use SI units, or at least give the conversion to other units. The sign conventions used here are similar to those in thermodynamics, where a process with a final energy state lower than that of the initial state is called an “exothermic” process, in which heat or energy is lost. Here the energy is lost as a photon, and the transition is called an emission. When describing an absorption process, the energy difference of the atom is positive, ΔE_atom > 0; that is, the atom has gained energy (an “endothermic” process in thermodynamics). Following the procedure outlined in Example 1.2, which treats an emission, would then lead to a negative wavelength for the photon, which of course is physically meaningless, and one has to remember that a positive ΔE_atom implies the absorption of a photon.
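In the spirit of the worked example described above, the following sketch applies Eqs. (1.19) and (1.20) to the n = 3 to n = 2 emission of hydrogen (a Balmer line); the transition choice is an illustrative assumption:

```python
# Sketch: photon emitted in the n = 3 -> n = 2 transition of hydrogen.
h = 6.626e-34    # Planck's constant [J s]
c = 2.998e8      # speed of light [m/s]
Ry = 2.179e-18   # Rydberg constant [J]

def E(n):
    # Eq. (1.19): quantized energy levels of the hydrogen atom
    return -Ry / n**2

n_i, n_f = 3, 2
dE_atom = E(n_f) - E(n_i)   # Eq. (1.20); negative: the atom loses energy
E_photon = -dE_atom         # the emitted photon carries that energy away

wavelength = h * c / E_photon        # [m]
wavenumber = 1 / (wavelength * 100)  # [cm^-1]
print(f"E_photon   = {E_photon:.3e} J")
print(f"wavelength = {wavelength * 1e9:.1f} nm")  # red Balmer line
print(f"wavenumber = {wavenumber:.0f} cm^-1")
```

Note the explicit sign handling: ΔE_atom is negative for emission, and the positive photon energy is its magnitude, which keeps the computed wavelength physically meaningful.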
1.1.5 Molecular Spectroscopy
Molecular spectroscopy studies the interaction of electromagnetic radiation with molecules, involving transitions between electronic, vibrational, rotational, or spin energy levels. Thus, molecular spectroscopy often is classified by the
wavelength ranges of the electromagnetic radiation (for example, microwave or infrared
spectroscopies) or changes in energy levels of the molecular systems. This is summarized in
Table 1.1, and the conversion of wavelengths and energies were discussed in Eqs. (1.11)–(1.15)
and are summarized in Appendix 1. In this table, NMR and EPR stand for nuclear magnetic and
electron paramagnetic resonance spectroscopy, respectively. In both these spectroscopic
techniques, the transition energy of a proton or electron spin depends on the applied magnetic
field strength. All techniques listed in this table can be described by absorption processes
although other descriptions, such as bulk magnetization in NMR, are possible as well. As seen
in Table 1.1, the photon energies are between 10⁻²⁵ and 10⁻¹⁶ J/photon, or about 10⁻⁴–10⁵ kJ/(mol photons). Considering that the bond energy of a typical chemical (single) bond is about 250–400 kJ/mol, it follows that ultraviolet photons have sufficient energy to break chemical bonds
or ionize molecules. Most of the spectroscopic processes discussed are absorption or emission processes as defined by Eq. (1.18):

ΔE = E_f − E_i = hν    (1.18)
However, interactions between light and matter occur even when the light’s wavelength is
different from the specific wavelength at which a transition occurs. Thus, a classification of
spectroscopy, which is more general than that given by the wavelength range alone, would be a
resonance/off-resonance distinction. Many of the effects described and discussed in this book are
observed as resonance interactions where the incident light, indeed, possesses the exact energy
of the molecular transition in question. IR and UV/vis absorption spectroscopy, microwave
spectroscopy, and NMR are examples of such resonance interactions. The off-resonance
interactions between electromagnetic radiation and matter give rise to well-known phenomena
such as the refractive index of dielectric materials. These interactions arise since force is exerted
by the electromagnetic radiation on the charged particles of matter even at off-resonance
frequencies. This force causes an increase in the amplitude of the motion of these particles. When
the frequency of light reaches the transition energy between two states, an effect known as
anomalous dispersion of the refractive index takes place. This anomalous dispersion of the
refractive index always accompanies an absorption process. This phenomenon makes it possible
to observe the interaction of light either in an absorption or as a dispersion measurement, since
the two effects are related to each other by a mathematical relation known as the Kramers–Kronig
relation. This aspect will be discussed in more detail in Chapter 5. The normal (nonresonant)
Raman effect is a phenomenon that also is best described in terms of off-resonance models, since
Raman scattering can be excited by wavelengths that are not being absorbed by molecules. A
discussion of nonresonant effects ties together many well-known aspects of classical optics and
spectroscopy.
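The comparison between photon energies and bond energies made above can be sketched numerically; the three wavelengths below are assumed representative values for the UV, visible, and infrared regions of Table 1.1:

```python
# Sketch: converting photon energy to molar energy, per Table 1.1.
h = 6.626e-34    # Planck's constant [J s]
c = 2.998e8      # speed of light [m/s]
N_A = 6.022e23   # Avogadro's number [1/mol]

def molar_energy(wavelength):
    # energy of one mole of photons of the given wavelength [kJ/mol]
    return h * c / wavelength * N_A / 1000

for name, wavelength in (("UV", 200e-9), ("visible", 500e-9), ("IR", 10e-6)):
    print(f"{name:8s} {wavelength:>7.1e} m -> "
          f"{molar_energy(wavelength):7.1f} kJ/mol")
```

A mole of 200 nm UV photons carries roughly 600 kJ, above the 250–400 kJ/mol range of typical single bonds, whereas infrared photons fall far below it, which is why UV light is photochemically destructive and IR light merely excites vibrations.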
1.2.1 Superposition
The state of a qubit in superposition is written as

|ψ⟩ = α|0⟩ + β|1⟩

where |ψ⟩ is the state of the qubit, |0⟩ and |1⟩ are the basis states (or the computational basis states), and α and β are complex numbers called probability amplitudes. The probability amplitudes determine the probability of measuring the qubit in either state when a measurement
is made.
Importantly, the state of superposition can be maintained only while a quantum system is
unobserved. Once measured, the wave function of a quantum system in a state of superposition
"collapses" into one of the basis states.
For a concrete example of how this might work if superposition could exist in the everyday world,
imagine a coin that is flipped and lands on a table. In classical mechanics -- and in the
everyday world as we know it -- the coin ends up in a state of either heads or tails. In a quantum
mechanical system, the coin could be both heads and tails at the same time, but only until
someone or something observes it or measures it. In this analogy, once observed, the coin would
take on the state of either heads or tails.
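A single-qubit superposition can be sketched directly in terms of its amplitudes. The equal-weight amplitudes below are an arbitrary illustrative choice (any pair with |α|² + |β|² = 1 is a valid state):

```python
# Sketch: a qubit state |psi> = alpha|0> + beta|1> and its
# measurement probabilities.
import math

alpha = 1 / math.sqrt(2)   # amplitude of |0>
beta = 1j / math.sqrt(2)   # amplitude of |1> (amplitudes may be complex)

# a valid state must be normalized: |alpha|^2 + |beta|^2 = 1
norm = abs(alpha) ** 2 + abs(beta) ** 2
print(f"normalization check: {norm:.3f}")

# measurement probabilities are the squared magnitudes of the amplitudes
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(f"P(measure 0) = {p0:.2f}, P(measure 1) = {p1:.2f}")
```

A measurement returns 0 or 1 with these probabilities and leaves the qubit in the corresponding basis state, mirroring the coin analogy above: the superposition survives only until it is observed.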
1.2.2 Entanglement
Entanglement enables quantum computers to implement various protocols and algorithms that
are not possible with classical systems. For example, it is used in quantum teleportation, which
allows for the transfer of quantum states between two distant systems. Entanglement is also a key resource for quantum error correction, which is necessary to protect quantum information from decoherence and other errors. Quantum information is fragile and susceptible to noise; by entangling qubits cleverly, error-correction schemes can detect and recover from errors without disturbing the original information, something impossible in classical computing. Entangled states serve as the building blocks for logical qubits, which are more stable and can be used for extended computations.
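The measurement correlations of an entangled pair can be sketched by sampling from a two-qubit state. The Bell state (|00⟩ + |11⟩)/√2 used here is a standard illustrative example, not one discussed explicitly in the text:

```python
# Sketch: sampling measurements of the Bell state (|00> + |11>)/sqrt(2).
import math
import random

amp = 1 / math.sqrt(2)
# statevector over the basis |00>, |01>, |10>, |11>
state = [amp, 0.0, 0.0, amp]
probs = [abs(a) ** 2 for a in state]   # Born rule: squared magnitudes

random.seed(0)  # fixed seed so the sketch is reproducible
outcomes = [random.choices(["00", "01", "10", "11"], weights=probs)[0]
            for _ in range(10)]
print(outcomes)  # only "00" and "11" occur: the qubits are perfectly correlated
```

Measuring one qubit as 0 guarantees the other is also 0, and likewise for 1, no matter how far apart the qubits are; this is the correlation that error-correction and teleportation protocols exploit.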
      It is impossible to simultaneously know both the exact position and the exact
      momentum of a particle.
       The Uncertainty Principle also applies to other pairs of observables, such as energy and
       time, and has deep implications for the behaviour of particles in confined systems, like
       electrons in atoms. It helps explain phenomena like zero-point energy, where particles
       have motion even at absolute zero temperature, and quantum tunneling, where particles
       appear to pass through energy barriers.
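The quantitative form of the position–momentum uncertainty relation, Δx · Δp ≥ ℏ/2, gives a feel for why confined electrons cannot sit still. The atomic-scale confinement length below is an assumed illustrative value:

```python
# Sketch: minimum momentum spread of an electron confined to ~1 Angstrom,
# from the uncertainty relation dx * dp >= hbar / 2.
hbar = 1.055e-34   # reduced Planck's constant [J s]
m_e = 9.109e-31    # electron mass [kg]

dx = 1e-10                 # assumed position uncertainty ~ atomic size [m]
dp_min = hbar / (2 * dx)   # minimum momentum uncertainty [kg m/s]
v_min = dp_min / m_e       # corresponding velocity spread [m/s]

print(f"Delta_p >= {dp_min:.2e} kg m/s")
print(f"velocity spread ~ {v_min:.2e} m/s")
```

The resulting velocity spread of several hundred kilometers per second illustrates zero-point motion: confinement itself forces the electron to move, even at absolute zero.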
Wave-particle duality is a fundamental concept in quantum mechanics which reveals that all
quantum objects, including light and matter, exhibit both wave-like and particle-like properties.
This idea emerged from a series of experiments and theoretical developments in the early 20th
century. Light, which was classically understood as a wave due to its ability to interfere and
diffract, was shown by Einstein in 1905 to also behave like a stream of particles called photons
when explaining the photoelectric effect—where light knocks electrons out of a metal surface.
This demonstrated that light has a particle nature as well. Inspired by this, Louis de Broglie in
1924 proposed that matter, such as electrons, should also exhibit wave-like behavior. He
introduced the concept of the matter wave, assigning a wavelength to any particle based on its
momentum using the relation λ = h/p.
This duality was dramatically confirmed by experiments such as the electron double-slit
experiment. When electrons pass through two slits, they produce an interference pattern typical
of waves—even when fired one at a time. However, each electron is detected as a single point-
like impact on the screen, showing its particle nature. The interference pattern only emerges after
many electrons have passed through, revealing the underlying wave-like behavior. This
paradoxical result means that quantum objects cannot be fully described as just particles or just
waves. Instead, their behaviour depends on how they are measured. Wave-particle duality
challenges our classical intuition and suggests that quantum entities exist in a superposition of
possibilities, governed by a probability wave, until a measurement collapses this wave into a
definite outcome.
This strange duality means that quantum particles do not behave strictly as particles or waves,
but as a blend of both, determined by the measurement setup. If you measure their position, they
appear particle-like; if you observe their path indirectly, they exhibit wave-like interference. This
dual nature defies classical expectations and forces us to adopt a probabilistic interpretation of
nature.
In quantum theory, particles are described by wavefunctions, which represent the probability of
finding a particle in a certain state. Only when a measurement is made does this wavefunction
“collapse” to a single, definite value. Thus, the wave-particle duality reveals that quantum entities
exist in a superposition of possibilities until observed.
This concept not only underpins the foundations of quantum physics but also drives the
functioning of quantum technologies like electron microscopes, quantum computers, and even
lasers. Ultimately, wave-particle duality challenges our intuitive notions of reality and shows that
at a fundamental level, nature behaves in ways that are deeply counterintuitive, yet
experimentally proven.
Classical mechanics and quantum mechanics are two distinct frameworks for understanding
physical phenomena. Classical mechanics, formulated primarily by Newton, governs the motion
of macroscopic objects like planets, cars, and projectiles. Quantum mechanics, developed in the
early 20th century, is essential for accurately describing the behavior of microscopic particles
such as electrons, atoms, and photons. The two theories differ fundamentally in their
assumptions, mathematical formalisms, and interpretations of nature.
In classical mechanics, objects have definite positions and velocities at all times. The state of a
system can be precisely predicted using Newton's laws, and the evolution of that system is
deterministic: given initial conditions, the future behavior is uniquely determined. On the other
hand, quantum mechanics introduces inherent indeterminacy. A particle does not have a definite
position or momentum until it is measured. Instead, it is described by a wavefunction, which
encodes a probability distribution over all possible outcomes. The act of measurement collapses
this wavefunction, resulting in a specific observed value.
Classical mechanics relies on continuous variables and smooth trajectories in phase space. In
contrast, quantum mechanics uses discrete quantized energy levels and operates within a
probabilistic framework, governed by operators on Hilbert space and the Schrödinger equation.
While classical systems obey the principle of determinism and locality, quantum systems exhibit
phenomena like superposition, entanglement, and non-locality, which have no classical analogs.
Moreover, classical mechanics is intuitive and aligns with everyday experiences, whereas
quantum mechanics often defies intuition, requiring abstract mathematical tools and accepting
that some aspects of nature are fundamentally unknowable. Despite their differences, classical
mechanics is actually a limiting case of quantum mechanics—it emerges naturally when dealing
with large systems or high energies where quantum effects become negligible. Thus, quantum
mechanics is more fundamental and universal, with classical mechanics being an effective
approximation in the macroscopic world.
Classical physics also adheres to local realism, assuming that objects are influenced only by their immediate surroundings; quantum systems defy this through entanglement and non-local correlations that persist even across large distances. Though quantum theory may seem abstract and counterintuitive, it is the more fundamental description: classical physics gives accurate predictions in everyday scenarios but fails at microscopic scales, where only quantum mechanics can describe the behavior of matter and energy.
In essence, quantum mechanics reshaped our understanding of reality, replacing certainty with
probability, and introducing a new framework for describing the strange and fascinating world
that lies beneath our everyday experiences.
In quantum mechanics, a quantum state represents the complete information about a system and
is typically described by a mathematical function called a wavefunction (denoted by Ψ). This
wavefunction encodes the probabilities of finding the system in various configurations. Unlike
classical systems, quantum systems can exist in a superposition of multiple states
simultaneously, meaning a particle can be in many possible states until a measurement is made.
The measurement process in quantum mechanics is fundamentally different from classical
observation—it is not passive. Instead, observing a quantum system collapses the wavefunction
to a single definite state, chosen probabilistically according to the squared magnitude of the
wavefunction. This collapse is instantaneous and unpredictable, highlighting the probabilistic
nature of quantum systems and the active role of the observer in defining the outcome. The
peculiar nature of measurement leads to non-intuitive phenomena such as wavefunction collapse
and quantum entanglement, where the act of observing one particle instantly affects the state of
another, even across large distances.
This phenomenon shows that the act of measurement is not simply revealing a pre-existing value
but is in fact defining the outcome itself. The system chooses one definite state from the spectrum
of probabilities, and all other possibilities vanish upon observation. This nature of quantum
measurement gives rise to deeply non-intuitive effects such as quantum entanglement, where two
or more particles share a linked state. If one entangled particle is measured, the state of its partner
is instantly determined, no matter the distance between them—a phenomenon that baffled even
Einstein, who referred to it as "spooky action at a distance."
Furthermore, this interaction between observer and system implies that objective reality, as
understood in classical terms, does not always exist independently of observation. Instead, the
observer plays an essential role in shaping the physical outcome. This shift from a deterministic
to a probabilistic and observer-dependent framework is what marks one of the most
fundamental departures of quantum mechanics from classical physics. The study of quantum
states and their measurement continues to influence modern fields such as quantum computing,
quantum cryptography, and quantum teleportation, where the principles of wavefunction
manipulation and collapse are harnessed to perform computations and transmit information in
revolutionary ways.
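The Born rule described above (outcome probabilities given by the squared magnitudes of the wavefunction's amplitudes) can be sketched in a few lines of pure Python. The amplitudes here are hypothetical, chosen only for illustration:

```python
import random

# A single-qubit state a|0> + b|1>; a and b are complex amplitudes.
# Hypothetical amplitudes satisfying the normalization |a|^2 + |b|^2 = 1.
a = complex(0.6, 0.0)   # amplitude of |0>
b = complex(0.0, 0.8)   # amplitude of |1>

p0 = abs(a) ** 2        # Born rule: probability of observing 0
p1 = abs(b) ** 2        # probability of observing 1
assert abs(p0 + p1 - 1.0) < 1e-12   # state must be normalized

def measure():
    """One measurement: the state collapses to 0 or 1 probabilistically."""
    return 0 if random.random() < p0 else 1

outcomes = [measure() for _ in range(10_000)]
print(round(p0, 2), round(p1, 2))   # 0.36 0.64
```

Repeated measurements of identically prepared states reproduce these probabilities statistically, even though each individual collapse is unpredictable.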
Quantum systems consist of microscopic entities such as electrons, photons, and atoms, which
all exhibit wave-particle duality and are governed by the laws of quantum mechanics. Electrons,
though traditionally thought of as point particles, also behave like waves. This wave nature is
responsible for phenomena like electron diffraction and atomic orbitals. Photons are the quantum
particles of light; they have no rest mass and always move at the speed of light, displaying both
energy quantization (in packets called quanta) and wave-like behavior such as interference.
Atoms are more complex quantum systems made of electrons orbiting a nucleus. In quantum
mechanics, these electrons occupy discrete energy levels or orbitals, and transitions between
levels involve absorption or emission of photons with specific energies. All these systems
demonstrate uniquely quantum effects such as superposition, entanglement, and tunneling—none
of which can be explained using classical physics. These systems form the foundation of modern
technologies such as lasers, semiconductors, quantum dots, and quantum computers.
Electrons, despite being considered point-like particles in classical physics, reveal a wave-like
character at small scales, a fact made evident by experiments such as electron diffraction. Their
dual nature allows them to form standing wave patterns around atomic nuclei, known as orbitals,
which determine the structure of atoms and molecules.
Photons, on the other hand, are massless quantum particles of electromagnetic radiation. They
always travel at the speed of light and carry energy proportional to their frequency, as described
by E = hν. Their wave-particle duality manifests in phenomena like interference and the
photoelectric effect. Photons can also become entangled, making them important in quantum
communication and cryptography. Meanwhile, atoms are composite systems made of nuclei
surrounded by electrons. In a quantum view, electrons do not orbit in classical trajectories but
instead occupy quantized energy states, transitioning between them by absorbing or emitting
photons of discrete energy.
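The relation E = hν (equivalently E = hc/λ) is easy to check numerically. A short Python sketch, using a green 532 nm photon as an illustrative input; the constants are the standard SI values:

```python
# Planck relation: photon energy E = h * nu = h * c / wavelength.
h = 6.62607015e-34    # Planck constant, J*s (exact by SI definition)
c = 2.99792458e8      # speed of light in vacuum, m/s (exact)
eV = 1.602176634e-19  # one electron-volt in joules

def photon_energy(wavelength_m):
    """Energy in joules of a photon with the given wavelength."""
    return h * c / wavelength_m

E = photon_energy(532e-9)   # green light, ~532 nm
print(round(E / eV, 2))     # ≈ 2.33 eV per photon
```

Halving the wavelength doubles the frequency and hence the photon energy, which is why ultraviolet photons can eject electrons (the photoelectric effect) while redder light cannot.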
Each of these systems—electrons, photons, and atoms—exhibits hallmark quantum behaviors
including superposition (being in multiple states at once), entanglement (non-local correlation
between particles), and quantum tunneling (the ability to cross classically forbidden barriers).
These phenomena cannot be explained using classical physics and require the probabilistic,
wave-based framework of quantum theory. Understanding these quantum systems is critical
because they are the foundation of modern quantum-enabled technologies. Innovations such as
semiconductors, quantum sensors, lasers, MRI machines, LEDs, and quantum computers all rely
on manipulating the quantum properties of these particles. As our ability to control these systems
improves, their role in computation, communication, and sensing will only grow more significant
in the future of science and technology.
Quantization lies at the heart of quantum mechanics, fundamentally changing our understanding
of nature. Unlike classical physics—where properties such as energy or momentum can vary
continuously—quantum mechanics shows that many physical quantities are restricted to discrete
values. This concept was introduced to resolve the inconsistencies observed in classical models,
such as the blackbody radiation problem and the photoelectric effect, where the observed results
could not be explained without assuming that energy comes in discrete packets called quanta.
Max Planck and Albert Einstein were among the pioneers who proposed that light and energy
must be quantized to align with experimental data, setting the stage for a new theory of matter
and energy.
Quantization is a core principle of quantum mechanics that states certain physical quantities,
like energy, angular momentum, and charge, can only take on discrete values, rather than any
value within a continuous range. This idea is radically different from classical physics, where
such quantities can vary smoothly. The earliest evidence for quantization came from the
blackbody radiation problem and the photoelectric effect, which were explained by assuming
that energy is emitted or absorbed in discrete units called quanta. In atoms, electrons can only
exist in specific quantized energy levels, and transitions between these levels result in the
emission or absorption of photons with fixed frequencies. Quantization is also seen in systems
like the harmonic oscillator, where energy levels are separated by fixed intervals. This discrete
nature of quantum systems is mathematically expressed using operators with eigenvalues
corresponding to observable quantities. Quantization is what gives rise to atomic spectra, the
stability of atoms, and the structure of matter itself, making it a cornerstone of all quantum
theories.
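The evenly spaced spectrum of the harmonic oscillator mentioned above follows from E_n = (n + 1/2)ħω. A small Python check; the frequency ω is an arbitrary illustrative value, not tied to any particular system:

```python
# Quantum harmonic oscillator: E_n = (n + 1/2) * hbar * omega.
# Adjacent levels differ by exactly hbar * omega, a fixed quantum of energy.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
omega = 1.0e14           # angular frequency, rad/s (illustrative)

levels = [(n + 0.5) * hbar * omega for n in range(6)]
gaps = [b - a for a, b in zip(levels, levels[1:])]

# Every gap is the same: energy can only change in fixed steps.
for g in gaps:
    assert abs(g / (hbar * omega) - 1.0) < 1e-9
print(len({round(g / (hbar * omega), 9) for g in gaps}))   # 1
```

The single distinct gap value is the point: unlike a classical oscillator, whose energy varies continuously, this system can only absorb or emit energy in multiples of ħω.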
In atomic systems, quantization becomes especially evident. Electrons in atoms cannot occupy
arbitrary energy levels; instead, they are found only in certain allowed states. When an electron
transitions between these levels, it absorbs or emits a photon with a specific frequency, giving
rise to the spectral lines seen in emission and absorption spectra. This phenomenon is responsible
for the stability of atoms and the unique identity of elements. Similar principles apply to
rotational and vibrational states of molecules, which are also quantized and form the basis of
various spectroscopic techniques.
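The fixed transition frequencies behind spectral lines can be illustrated with the familiar Bohr-model levels of hydrogen, E_n = −13.6 eV / n². A minimal sketch for the n = 2 → 1 transition:

```python
# Bohr-model hydrogen levels: E_n = -13.6 eV / n^2.
# A 2 -> 1 transition releases the level difference as a single photon.
h_eVs = 4.135667696e-15   # Planck constant in eV*s
c = 2.99792458e8          # speed of light, m/s

def E(n):
    return -13.6 / n ** 2   # energy of level n, in eV

delta = E(2) - E(1)                      # photon energy: 10.2 eV
wavelength_nm = h_eVs * c / delta * 1e9  # lambda = h*c / E
print(round(delta, 1), round(wavelength_nm, 1))   # 10.2 121.6
```

The computed 121.6 nm line is the Lyman-alpha line observed in hydrogen's ultraviolet spectrum; because the allowed levels are fixed, every hydrogen atom in the universe emits at exactly these wavelengths.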
Quantization is not limited to energy. Angular momentum, spin, and even electric charge can
also be quantized, leading to surprising consequences in both microscopic systems and
macroscopic quantum phenomena like superconductivity and quantum Hall effects. The
mathematics of quantization involves solving operator equations, where only certain values
(called eigenvalues) correspond to physical observables. This discrete nature of reality is not just
a mathematical curiosity—it underpins the structure and behavior of matter at the most
fundamental level.
Understanding why quantum mechanics is essential goes beyond explaining atomic structure—
it is about embracing a radically different view of reality. Classical physics fails to explain
phenomena like entanglement, superposition, and tunneling, all of which are routinely observed
in quantum systems. Quantum mechanics accounts for these behaviors through a probabilistic
and non-deterministic framework, where the act of measurement plays a crucial role in
determining outcomes.
Moreover, quantum theory is not just a theoretical success; it has practical, transformative
applications. Technologies such as semiconductors, lasers, magnetic resonance imaging (MRI),
LEDs, and atomic clocks are direct outcomes of quantum principles. Even more revolutionary
are the emerging fields of quantum computing, quantum cryptography, and quantum sensing,
which promise to outperform classical technologies in speed, security, and sensitivity.
Quantum mechanics is not just a theoretical breakthrough in physics—it marks a radical shift in
how we understand and interact with the universe at the most fundamental level. Traditional
classical theories fail to explain the behavior of microscopic particles like electrons, photons, and
atoms. Quantum theory fills this gap by accurately describing the probabilistic and non-
deterministic nature of such particles. Its predictions have been experimentally verified with
extraordinary precision, making it a cornerstone of modern physics. More importantly, quantum
mechanics forms the foundation for transformative advancements in technology, computing, and
security, answering both scientific curiosity and real-world challenges.
Governments worldwide are ramping up funding, launching national missions, and forming
alliances to stay competitive in this field. The ability to control and implement quantum systems
will influence power dynamics globally—shaping military capabilities, intelligence operations,
and secure digital infrastructure. As a result, expertise in quantum science is fast becoming a
determinant of geopolitical and economic strength.
1.6.2 Scientific Significance
Quantum mechanics stands as one of the most profound scientific achievements of the 20th
century, dramatically expanding our understanding of the physical world. Scientifically, quantum
mechanics has revolutionized our understanding of nature. It explains phenomena that classical
physics cannot, such as superconductivity, quantum tunneling, and the behavior of particles in
extreme conditions. Quantum theory has also laid the groundwork for fields like quantum
chemistry, condensed matter physics, and particle physics. It enables the modeling of complex
systems with high accuracy, leading to discoveries in materials science, nanotechnology, and
fundamental physics. Beyond practical uses, quantum mechanics continues to challenge our
philosophical notions of reality, causality, and measurement, making it a profoundly rich area of
ongoing scientific inquiry.
Quantum theory allows scientists to model atomic and subatomic systems with remarkable
precision, leading to the discovery of new materials and deeper insights into the behavior of
matter and energy.
Quantum technologies are poised to redefine the future of computation, communication, sensing,
and imaging. Quantum computers can solve certain classes of problems exponentially faster than
classical computers, with potential applications in drug discovery, optimization, machine
learning, climate modelling, and logistics optimization. Quantum sensors can measure time,
magnetic fields, and gravitational forces with unprecedented precision, useful in GPS systems,
medical diagnostics, and geological surveys. Meanwhile, quantum cryptography offers solutions
for secure digital infrastructure. These innovations are not distant dreams—they are already in
early stages of development, and their practical impact is beginning to emerge, setting the stage
for the next technological revolution.
In communication, quantum encryption could underpin a new era of ultra-secure digital
infrastructure. These applications, once theoretical, are now transitioning into real-world pilots
and commercial prototypes—marking the beginning of a new age where quantum mechanics
powers next-generation innovation.
1.7 A Snapshot of Quantum Technologies: Computing, Communication, and Sensing
Quantum technologies are at the forefront of a technological revolution, harnessing the unique
and counterintuitive principles of quantum mechanics—such as superposition, entanglement, and
quantization—to build revolutionary tools that far surpass the capabilities of their classical
counterparts. These technologies are being developed across three primary domains: quantum
computing, quantum communication, and quantum sensing, each offering transformative
potential for science, industry, society, and everyday life.
As these technologies mature, they will not remain isolated solutions but will become deeply
integrated into a wide range of applications. Quantum computing could redefine how we solve
scientific and industrial problems. Quantum communication may establish new standards of
digital security. Quantum sensing is set to improve how we measure, observe, and navigate the
world. Together, these advances signal a shift toward a new era of quantum-enhanced innovation
that will shape the future of multiple sectors including finance, defense, transportation, health,
and information technology.
As quantum technologies emerge as a critical area of innovation and national interest, several
countries have launched ambitious quantum missions to secure strategic and technological
leadership. These initiatives aim to develop quantum computing, communication, and sensing
capabilities through coordinated investments in research, infrastructure, and talent development.
India, along with major global powers like the USA, China, and the European Union, is actively
building its presence in the quantum landscape.
China leads in quantum communication and excels at bringing market-ready technology to
deployment; it lags behind the United States in quantum computing and roughly matches it in
sensing, while the United States dominates the highest-impact research areas.
India launched its National Quantum Mission in 2023, with a budget of ₹6,003 crore
(approximately $730 million) over eight years. The mission seeks to position India among the
top quantum nations by developing indigenous capabilities in quantum computing, quantum
communication, quantum sensing, and quantum materials. It aims to establish four Thematic
Hubs (T-Hubs) in leading academic and research institutions focusing on foundational
technologies. The NQM also plans to build intermediate-scale quantum computers (with 50–
1000 qubits), develop quantum key distribution networks, and promote workforce training and
international collaborations. The mission aligns with India’s larger vision of self-reliance in
strategic technologies and aims to boost national security, telecommunications, and advanced
research.
                       Fig: India Approves National Quantum Mission
The EU Quantum Flagship is a €1 billion, 10-year initiative launched in 2018 to unify Europe’s
fragmented quantum research landscape. It supports hundreds of research institutions, startups,
and industries across member states. The program focuses on four main areas: quantum
communication, quantum simulation, quantum computing, and quantum metrology. The
EU also promotes infrastructure projects like the European Quantum Communication
Infrastructure (EuroQCI), which aims to establish a secure pan-European quantum
communication network. This mission reflects Europe’s intent to compete globally while
fostering innovation, industrial adoption, and academic excellence in quantum science.
1.8.3 United States: National Quantum Initiative Act
The United States formalized its quantum strategy with the National Quantum Initiative Act
passed in 2018. This act coordinates efforts across government agencies, including the
Department of Energy (DOE), National Science Foundation (NSF), and National Institute of
Standards and Technology (NIST), with significant funding and collaboration with private sector
leaders like IBM, Google, and Microsoft. The National Quantum Coordination Office
oversees these efforts, focusing on quantum research, technology transfer, education, and the
creation of quantum research centers. The U.S. aims to maintain its technological leadership,
secure supply chains, and harness quantum advantages for national security, scientific progress,
and economic growth.
China has emerged as a global leader in quantum technology through sustained state-led
investment and rapid deployment. It has achieved several milestones, including launching the
world’s first quantum communication satellite (Micius) and demonstrating satellite-based
quantum key distribution over thousands of kilometers. China also leads in building a
nationwide quantum communication backbone network, connecting major cities through
ultra-secure fiber links. The Chinese government has reportedly invested billions of dollars in
quantum R&D, and projects like the National Laboratory for Quantum Information Science
in Hefei aim to consolidate China’s dominance in this space. China views quantum technologies
as essential to future economic and military competitiveness.
                          Fig: How Innovative Is China in Quantum?
These missions reflect a global “quantum race”, where nations recognize that quantum
supremacy could redefine cybersecurity, artificial intelligence, defense, and economic structures.
International collaboration, balanced with strategic competition, will shape the trajectory of
quantum innovation in the coming decades, with each nation seeking to leverage quantum
breakthroughs for economic growth, defense strength, and scientific prestige.
Reference: https://tifac.org.in/images/nmqta/concept_note12.06.19.pdf
Unit 2
  In contrast to classical bits that exist in a definite state of 0 or 1, quantum bits or qubits can
  exist in a superposition of both states simultaneously. This characteristic allows quantum
  systems to perform parallel computations, offering exponential speedups for certain classes
  of problems.
  The theoretical foundation of Quantum Information Systems is built upon the principles of
  superposition, entanglement, and quantum measurement.
  The study of these theoretical structures not only lays the groundwork for quantum computing
  and quantum cryptography but also contributes to the understanding of information itself in
  a fundamentally new light.
What is a qubit?
A qubit, or quantum bit, is the fundamental unit of information in a quantum computer. Unlike
a classical bit, which can be either 0 or 1, a qubit can exist in a superposition of both states
simultaneously, represented as |0⟩, |1⟩, or any complex linear combination α|0⟩ + β|1⟩, where α
and β are complex probability amplitudes. This superposition allows quantum systems to process
vast amounts of information in parallel, enabling certain computations to be executed
exponentially faster than their classical counterparts. Qubits can also exhibit entanglement, a
uniquely quantum phenomenon where the state of one qubit is dependent on the state of another,
regardless of the distance between them. This allows for highly correlated systems that are
essential for quantum logic operations. Another key property is quantum interference, which
enables quantum algorithms to amplify correct computational paths while canceling out incorrect
ones.
Qubits are extremely delicate and susceptible to noise, so maintaining coherence—the time over
which a qubit retains its quantum state—is a major challenge. Various physical systems can be
used to realize qubits, including superconducting circuits (used by IBM and Google), trapped
ions (IonQ), photons (PsiQuantum), quantum dots, and NV centers in diamond. Each technology
comes with trade-offs in terms of gate speed, error rates, scalability, and environmental
requirements.
A qubit must be initializable, controllable via quantum gates, measurable, and able to participate
in entangling operations. Typically, multiple physical qubits are needed to form a logical qubit
that is protected by quantum error correction codes, due to the inherent instability of quantum
states. These logical qubits serve as the robust foundation for large-scale, fault-tolerant quantum
computation.
Qubit manipulation is performed using finely tuned pulses of microwave, optical, or
radio-frequency energy, depending on the implementation. The Bloch sphere is often used to
visually represent a qubit’s state, where the poles correspond to |0⟩ and |1⟩, and any point on the
sphere’s surface represents a superposition. Reading a qubit’s state involves a measurement,
which collapses the qubit into one of the basis states (0 or 1) probabilistically, determined by
|α|² and |β|². This collapse is irreversible, and thus, quantum information must be processed
carefully before measurement.
Qubits are the heart of all quantum algorithms, including Shor’s factoring algorithm and
Grover’s search algorithm. The power of a quantum computer scales not linearly but
exponentially with the number of coherent, entangled qubits, making them uniquely powerful
for problems involving massive state spaces. Developing stable, high-fidelity, scalable qubit
systems is one of the grand engineering challenges of our time. Current quantum systems range
from a few to several hundred qubits, but building a fault-tolerant quantum computer will require
millions of physical qubits operating in synchrony. Despite their potential, qubits remain deeply
complex and demand sophisticated hardware, control electronics, cryogenics, and quantum
software stacks. Ultimately, a qubit is not just a data unit—it is a gateway to an entirely new
computational paradigm governed by the laws of quantum mechanics.
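The Bloch-sphere picture above maps directly onto amplitudes: |ψ⟩ = cos(θ/2)|0⟩ + e^{iφ} sin(θ/2)|1⟩. A short pure-Python sketch showing that the poles give definite measurement outcomes while the equator gives a 50/50 superposition:

```python
import cmath
import math

def bloch_state(theta, phi):
    """Amplitudes (alpha, beta) for Bloch-sphere angles (theta, phi)."""
    alpha = math.cos(theta / 2)
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

def probabilities(theta, phi):
    """Measurement probabilities |alpha|^2 and |beta|^2."""
    alpha, beta = bloch_state(theta, phi)
    return abs(alpha) ** 2, abs(beta) ** 2

print(probabilities(0.0, 0.0))            # north pole: (1.0, 0.0) -> always 0
print(probabilities(math.pi, 0.0))        # south pole: ~(0.0, 1.0) -> always 1
p0, p1 = probabilities(math.pi / 2, 0.0)  # equator: the |+> state
print(round(p0, 3), round(p1, 3))         # 0.5 0.5
```

Any point on the sphere is a valid qubit state, which is why a single qubit carries a continuum of possible states even though measuring it yields only one classical bit.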
Unit 3
At the heart of a quantum computer lies the qubit — a quantum bit capable of existing in a
superposition of states. While this property enables parallelism and quantum interference, it also
introduces extreme sensitivity to environmental noise and errors. Maintaining quantum
coherence and achieving fault-tolerant computation are among the primary theoretical obstacles.
* Quantum Decoherence and Error Correction: Qubits are highly susceptible to decoherence due
to interactions with their environment. Designing error-correcting codes that preserve quantum
information without direct measurement is a critical requirement.
* Qubit Scalability and Connectivity: Developing scalable architectures that support a large
number of qubits, along with efficient inter-qubit connectivity, is essential for executing complex
quantum algorithms.
* Quantum Gate Fidelity: High-precision control of quantum gates and operations is necessary
to ensure reliable computation. Theoretical models must support the design of gates that meet
fault-tolerance thresholds.
* Measurement and Readout: Extracting information from quantum states without disturbing
them significantly poses both theoretical and practical difficulties.
* Universal Quantum Computation: Establishing the minimum set of gates and operations
necessary for universal computation is a key theoretical concern in quantum computer design.
This topic integrates quantum mechanics, computer science, and information theory to explore
the foundational principles required to construct a working quantum computer. Addressing these
theoretical challenges is vital to transforming quantum computing from a scientific curiosity into
a practical and transformative technology.
To build a quantum computer, one must design and integrate a highly complex system that
leverages quantum mechanical phenomena to process information. Unlike classical computers
that use binary bits (0 or 1), quantum computers use quantum bits or qubits, which can exist in
multiple states simultaneously due to superposition and entanglement. Below is a detailed
conceptual overview of what is required to build a quantum computer.
To perform algorithms, quantum gates are combined into quantum circuits following the rules of
unitary evolution governed by quantum mechanics.
3.1.6. Scalability
A universal quantum computer must scale to hundreds or thousands of qubits. This involves:
   •   Modular design of qubit systems
   •   Inter-qubit connectivity (nearest-neighbor or all-to-all coupling)
   •   Integration with control electronics and hardware
It’s a deeply interdisciplinary effort, involving physics, electrical engineering, computer science,
and materials science. Only by integrating all these elements can we move from small test
systems to powerful, large-scale quantum computers capable of solving the world’s hardest
problems.
5. Qubit-Specific Measurement
After quantum computation, the result must be read out by measuring the qubits. This
measurement collapses the qubits into classical 0 or 1 values. It is vital that each qubit can be
measured individually, reliably, and without disturbing others. High-fidelity measurement is
essential for accurate output.
Quantum computers require precise control systems to manipulate qubits with microwaves,
lasers, or magnetic fields. The architecture must include high-speed control electronics and
low-noise communication channels between qubits. As systems grow in size, creating efficient
interconnects and control networks becomes even more challenging.
Quantum systems, the foundation of quantum computing and quantum information science, are
governed by the principles of quantum mechanics—namely superposition, entanglement, and
coherence. These properties enable quantum computers to process information in powerful new
ways. However, quantum systems are inherently fragile, meaning they are highly sensitive to
external disturbances, environmental interactions, and imperfections in control mechanisms.
The fragility of quantum systems is one of the most critical challenges in realizing practical
quantum computing. Tiny interactions with the environment can destroy the delicate quantum
states—an effect known as decoherence. Moreover, even small inaccuracies in quantum gate
operations or fluctuations in temperature or electromagnetic fields can introduce errors. Because
quantum information cannot be cloned (as per the no-cloning theorem), standard redundancy and
error-handling techniques from classical computing do not apply directly.
This fragility necessitates stringent control over qubit environments, high-fidelity operations, and
the development of sophisticated quantum error correction strategies. Understanding and
mitigating the fragility of quantum systems is central to building stable, scalable, and fault-
tolerant quantum technologies.
3.2.1 Decoherence
Decoherence is one of the most fundamental and problematic challenges in quantum computing.
It refers to the process by which a quantum system loses its quantum mechanical properties—
particularly superposition and entanglement—due to interactions with the surrounding
environment. In theory, a qubit can exist in a coherent superposition of both 0 and 1, allowing
quantum computers to perform complex parallel calculations. However, in practice, qubits are
never completely isolated. They interact with stray electromagnetic fields, nearby particles,
thermal energy, and even cosmic radiation. These tiny interactions disturb the quantum state,
forcing it to "collapse" into a definite classical state, destroying the computation. Unlike classical
bits, which are stable under most conditions, qubits are fragile and highly sensitive. The
timeframe during which a qubit retains its coherence is known as the coherence time, and this is
often very short—ranging from microseconds to milliseconds depending on the hardware. The
shorter the coherence time, the fewer quantum operations (gates) can be performed reliably.
Extending coherence time is one of the central goals of quantum hardware design, and it requires
extreme isolation techniques, cryogenic temperatures, and highly pure materials. Until
decoherence is significantly minimized or managed with effective quantum error correction,
building large-scale, reliable quantum computers will remain a formidable task.
Decoherence is the process by which a quantum system loses its quantum behavior and begins
to behave classically due to interactions with its environment.
For example, if a qubit in the superposition (|0⟩ + |1⟩)/√2 interacts with a photon, it may end up in either
|0⟩ or |1⟩, destroying the computation. Mathematically, decoherence is modeled as the decay of
off-diagonal terms in the system’s density matrix.
Common types of decoherence:
Dephasing (loss of relative phase between |0⟩ and |1⟩)
Amplitude damping (loss of energy from excited to ground state)
Decoherence Time (T2): The characteristic time over which a qubit remains coherent. Longer
T2 times are desirable for computation.
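The decay of the off-diagonal density-matrix terms can be sketched with a simple pure-dephasing model: the coherences shrink by exp(−t/T2) while the populations stay fixed. The T2 value below is an illustrative assumption, not a property of any specific hardware:

```python
import math

# The state (|0> + |1>)/sqrt(2) has density matrix [[0.5, 0.5], [0.5, 0.5]].
# Pure dephasing multiplies the off-diagonal (coherence) terms by exp(-t/T2);
# the diagonal populations are untouched.
T2 = 50e-6   # coherence time, 50 microseconds (illustrative)

def rho_at(t):
    """Density matrix of the dephasing qubit at time t."""
    coherence = 0.5 * math.exp(-t / T2)
    return [[0.5, coherence], [coherence, 0.5]]

rho_fresh = rho_at(0.0)    # off-diagonals 0.5: full superposition
rho_old = rho_at(10 * T2)  # off-diagonals ~0: a classical 50/50 mixture
print(rho_fresh[0][1], rho_old[0][1] < 1e-4)   # 0.5 True
```

After many T2 times the matrix is effectively diagonal: the measurement statistics are unchanged, but the phase information needed for interference is gone, which is exactly what "behaving classically" means here.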
Noise refers to any unwanted disturbance that affects the state of a quantum system.
Types of quantum noise include:
a) Thermal Noise:
Caused by fluctuations in temperature.
Can cause qubits to flip randomly (bit-flip errors) or change phase (phase-flip errors).
b) Gate Noise:
Arises from imprecise control over quantum gates.
Imperfect calibration leads to small but accumulating errors during gate operations.
c) Measurement Noise:
Occurs when reading out the quantum state.
Detectors may misidentify the qubit state due to limitations in resolution or interference.
d) Crosstalk:
When operations on one qubit unintentionally affect another nearby qubit.
Noise models are often described using quantum channels such as:
    •   Bit-flip channel
    •   Phase-flip channel
    •   Depolarizing channel
Quantum systems are highly susceptible to noise due to their continuous, analog nature and the
lack of built-in error correction as in classical digital systems.
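The bit-flip channel listed above has a compact mathematical form: ρ → (1 − p)ρ + p XρX, where X is the Pauli-X matrix. A self-contained sketch using plain 2×2 matrix arithmetic:

```python
# Bit-flip channel: with probability p the qubit is flipped by Pauli-X.
#   rho -> (1 - p) * rho + p * X rho X
X = [[0, 1], [1, 0]]   # Pauli-X (the quantum NOT gate)

def mat_mul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def bit_flip(rho, p):
    """Apply the bit-flip channel with flip probability p."""
    flipped = mat_mul(mat_mul(X, rho), X)
    return [[(1 - p) * rho[i][j] + p * flipped[i][j] for j in range(2)]
            for i in range(2)]

rho_0 = [[1, 0], [0, 0]]          # qubit prepared in |0>
rho_noisy = bit_flip(rho_0, 0.1)  # 10% flip probability
print(rho_noisy)                  # [[0.9, 0.0], [0.0, 0.1]]
```

The phase-flip and depolarizing channels have the same shape with Z (or a mix of X, Y, Z) in place of X; these channel models are the standard vocabulary for analyzing noise and designing error-correcting codes.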
Operating a quantum computer demands extraordinary precision in control systems, far beyond
what is typically required for classical machines. Qubits are manipulated using finely tuned
electromagnetic pulses—such as microwave signals in superconducting qubits or laser beams in
trapped ions—to perform quantum gate operations. These gates must rotate the quantum state
precisely on the Bloch sphere, which geometrically represents the state of a qubit. Even the
slightest error in timing, amplitude, or phase of these control pulses can result in the qubit
deviating from the intended path, leading to computational errors. In classical systems, slight
inaccuracies may go unnoticed due to their binary nature; however, quantum systems demand
continuous, analog precision, where even a minor fluctuation can ruin a quantum operation.
Additionally, as the number of qubits increases, the complexity of their interactions also rises.
Control systems must not only address individual qubits but also coordinate entanglement
operations between multiple qubits—often requiring synchronization at the nanosecond scale.
Any crosstalk, unintended coupling, or thermal noise in control lines can introduce correlated
errors. Developing scalable and accurate quantum control hardware—such as low-noise signal
generators, error-resilient pulse sequences, and high-speed electronics—is one of the most active
areas in quantum engineering. Without ultra-precise control, even a perfect theoretical algorithm
cannot be reliably executed on real hardware.
Controlling quantum systems with high precision is extremely difficult and crucial for reliable
computation.
a) Precision Requirements:
    •   Qubits must be isolated from environmental noise but accessible for operations and
        measurements.
    •   This duality is difficult to achieve and maintain.
b) Scalability:
    •   As the number of qubits increases, maintaining uniform control and minimizing cross-
        qubit interference becomes exponentially more difficult.
For a physical system to serve as a quantum computer, it must meet a set of basic requirements, closely mirroring the well-known DiVincenzo criteria:
1. Well-defined Qubits
A quantum system must have clearly defined two-level quantum states that act as qubits (quantum bits).
These states (e.g., |0⟩ and |1⟩) must be distinguishable and controllable.
Examples: Spin states of an electron, energy levels of an ion, superconducting loops.
2. Initialization of Qubits
The system must be able to reliably prepare all qubits in a known initial state, typically |0⟩.
Initialization is crucial for consistent quantum algorithm execution.
3. Long Coherence Time
Qubits must maintain their quantum state (coherence) long enough to perform computations.
Coherence time (T2) must be significantly longer than the time it takes to perform quantum gate
operations.
High coherence ensures the integrity of superposition and entanglement.
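A rough back-of-the-envelope model makes the coherence requirement concrete. Assuming illustrative (not platform-specific) values of T2 = 100 µs and a 50 ns gate time, and modeling dephasing as a simple exponential decay exp(-t/T2), the fraction of coherence surviving a circuit of a given depth is:

```python
import math

# Representative, assumed numbers for a superconducting-style qubit:
T2 = 100e-6        # coherence time: 100 microseconds
t_gate = 50e-9     # single gate duration: 50 nanoseconds

def coherence_left(depth):
    """Fraction of coherence surviving a circuit of `depth` sequential gates,
    modeling dephasing as exponential decay exp(-t / T2)."""
    return math.exp(-depth * t_gate / T2)

for depth in (10, 100, 1000, 10000):
    print(depth, round(coherence_left(depth), 3))
# 10    0.995
# 100   0.951
# 1000  0.607
# 10000 0.007
```

Under these assumptions a few hundred gates are comfortable, but a 10,000-gate circuit retains under 1% of its coherence, which is why T2 must far exceed the gate time.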
4. Universal Set of Quantum Gates
The system must support a set of quantum gates that can perform arbitrary operations on qubits.
This usually includes:
    •   Single-qubit gates (e.g., Hadamard, Pauli-X)
    •   At least one entangling two-qubit gate (e.g., CNOT)
Together, these gates must form a universal set, enabling the construction of any quantum algorithm.
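A minimal sketch of these gates as matrices, in plain Python: applying a Hadamard to one qubit of |00⟩ and then a CNOT produces the entangled Bell state (|00⟩ + |11⟩)/√2, illustrating why single-qubit gates plus one entangling gate are enough to build rich quantum circuits.

```python
import math

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard
X = [[0, 1], [1, 0]]                           # Pauli-X (bit flip)
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]                          # entangling two-qubit gate
I2 = [[1, 0], [0, 1]]                          # identity

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

def kron(a, b):
    """Tensor product, used to lift a single-qubit gate into the 2-qubit space."""
    n, m = len(a), len(b)
    return [[a[i // m][k // m] * b[i % m][k % m] for k in range(n * m)]
            for i in range(n * m)]

x_demo = matvec(X, [1, 0])            # Pauli-X flips |0> to |1>

state = [1, 0, 0, 0]                  # |00>
state = matvec(kron(H, I2), state)    # H on the first qubit: (|00>+|10>)/sqrt2
state = matvec(CNOT, state)           # entangle: (|00>+|11>)/sqrt2, a Bell state
print([round(abs(a) ** 2, 2) for a in state])   # → [0.5, 0.0, 0.0, 0.5]
```

The final probabilities show only the correlated outcomes 00 and 11 remain, the hallmark of entanglement.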
5. Qubit-Specific Measurement Capability
It must be possible to measure the state of individual qubits without disturbing others.
Measurement should yield reliable classical outcomes corresponding to quantum basis states.
6. Scalable Architecture
The system must allow for the integration of many qubits (tens to thousands or more) without excessive
overhead or noise.
Scalability involves both hardware and control systems, requiring modularity and fault-tolerance.
7. Qubit Interconnectivity
Qubits must be able to interact with specific others (not necessarily all), enabling entanglement and two-
qubit gates.
Efficient connectivity is essential for implementing quantum algorithms and error correction.
8. Error Correction and Fault Tolerance
The system must support quantum error correction to counteract decoherence and noise.
Error correction requires additional qubits (logical qubits encoded in many physical ones) and complex
operations.
9. Reproducible and Controllable Quantum Dynamics
All quantum operations (initialization, gates, measurements) must be precisely reproducible and
controllable.
Gate fidelities must be extremely high (typically >99.9% for fault-tolerant thresholds).
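A quick estimate shows why fidelities near 99.9% are needed. If gate errors were independent (a simplification that ignores error correction), a circuit of n gates would succeed with probability roughly f^n:

```python
# Rough model: each gate succeeds with fidelity f, so a circuit of n gates
# succeeds with probability about f**n (independent errors, no correction).
n = 1000   # a modest 1,000-gate circuit
for f in (0.99, 0.999, 0.9999):
    print(f, round(f ** n, 4))
# 0.99   0.0     (essentially always fails)
# 0.999  0.3677
# 0.9999 0.9048
```

At 99% fidelity a 1,000-gate circuit essentially never succeeds, while 99.99% keeps the success probability above 90%, which is why small fidelity gains matter so much.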
10. Interface for Input and Output
The system should be able to take classical inputs, execute quantum instructions, and return classical
outputs after quantum measurements.
This involves control electronics, classical computers, and user interfaces.
Building a functional, scalable, and reliable quantum computer involves not just engineering
challenges, but also profound theoretical barriers that stem from the fundamental nature of
quantum mechanics.
3.4.6 Photonics
Photonic quantum computing uses particles of light (photons) as qubits, making it uniquely suited
for quantum communication and networking. Photons are naturally immune to many environmental
disturbances that affect matter-based qubits, giving them an inherent robustness to noise and decoherence.
Quantum information is typically encoded in properties like polarization, phase, or path of the photons.
Because photons travel at the speed of light, photonic systems promise extremely fast communication,
making them ideal for building the quantum internet. However, photonic quantum computing also faces
significant challenges. Generating single photons on demand, routing them precisely through optical
circuits, and making them interact to perform logic gates require highly advanced technologies. Unlike
ions or superconducting qubits, photons do not naturally interact, so nonlinear optical components or
measurement-based schemes are needed to perform two-qubit gates. Despite these hurdles, advances in
integrated photonics and optical chips are making photonic quantum systems increasingly viable. Their
ability to operate at room temperature and interface with fiber-optic networks gives them a distinct edge
for scalable communication-focused quantum applications.
Unit 4
The theoretical foundations of quantum communication and computing provide the framework
to understand, design, and analyse the behaviour and capabilities of quantum systems. These
principles form the backbone for developing quantum algorithms, secure communication
protocols, and scalable quantum architectures.
In quantum computing, the theoretical viewpoint addresses how quantum mechanics can be
harnessed to perform computations that are intractable for classical systems. It includes the study
of quantum gates, quantum circuits, algorithm complexity, and error correction models, as well
as the mathematical underpinnings of quantum logic and measurement.
This theoretical lens is essential to understand both the potential and limitations of quantum
technologies, guiding researchers in overcoming key challenges such as decoherence, scalability,
fault-tolerance, and algorithmic development.
4.1.3. Entanglement
   •   Classical: Bits operate independently. The state of one bit does not affect another unless
       explicitly connected via logic operations.
   •   Quantum: Qubits can become entangled, meaning the state of one qubit directly affects
       the state of another, even over long distances. This allows for powerful correlations used
       in quantum algorithms and quantum teleportation.
4.1.4. Measurement and Observation
   •   Classical: Measuring a classical bit simply reveals its value (0 or 1), and the bit remains
       unchanged by the observation.
   •   Quantum: Measuring a qubit collapses its superposition to a single classical state (0 or 1),
       altering its original state. This makes observation destructive and requires careful design
       of quantum algorithms.
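The destructive nature of measurement can be mimicked in a few lines of Python: sampling outcomes from an equal superposition via the Born rule, then showing that once the state has collapsed, every further measurement repeats the same result. (This is a classical simulation for intuition, not real quantum behavior.)

```python
import random

random.seed(7)

def measure(state):
    """Born rule: return outcome 0 with probability |a0|^2, else 1,
    together with the collapsed basis state."""
    p0 = abs(state[0]) ** 2
    if random.random() < p0:
        return 0, [1, 0]
    return 1, [0, 1]

amp = 2 ** -0.5
qubit = [amp, amp]                 # equal superposition (|0>+|1>)/sqrt2

# Re-prepare and measure the superposition 10,000 times: roughly 50/50.
counts = {0: 0, 1: 0}
for _ in range(10000):
    result, _ = measure(qubit)
    counts[result] += 1
print(counts)

# After one measurement the state collapses; repeats always agree.
outcome, collapsed = measure(qubit)
again = [measure(collapsed)[0] for _ in range(5)]
print(outcome, again)
```

The statistics only emerge over many re-preparations; a single qubit yields one bit and loses its superposition, which is why quantum algorithms must be designed around measurement.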
Unit 5
5.0 Introduction
Quantum computing is poised to revolutionize numerous fields by solving problems that are
practically impossible for classical computers. In medicine, it can simulate molecular
interactions at an atomic level, enabling the discovery of new drugs and personalized
treatments. In finance, quantum algorithms may drastically improve risk analysis, portfolio
optimization, and fraud detection by processing vast datasets in real time. In logistics and supply
chain management, companies like DHL and Volkswagen are already exploring quantum
algorithms to optimize delivery routes and reduce operational costs. Cybersecurity, too, is
expected to transform, as quantum computers may break current encryption methods,
prompting the development of quantum-safe cryptography.
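To see why quantum computers threaten current encryption, recall that Shor's algorithm factors a number N by finding the period of a^x mod N; the rest is cheap classical arithmetic. The toy example below performs the period-finding step by brute force for N = 15 (the brute-force search is exactly the step a quantum computer would accelerate for large N):

```python
from math import gcd

def find_period(a, N):
    """Classically find the smallest r > 0 with a**r % N == 1.
    This exhaustive search is the step Shor's algorithm speeds up."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = find_period(a, N)               # r = 4, since 7^4 = 2401 = 1 (mod 15)
f1 = gcd(pow(a, r // 2) - 1, N)     # gcd(48, 15) = 3
f2 = gcd(pow(a, r // 2) + 1, N)     # gcd(50, 15) = 5
print(r, f1, f2)                    # → 4 3 5
```

For the 2048-bit numbers used in RSA, this classical search is hopeless, but a large fault-tolerant quantum computer could find the period efficiently, which is what drives the move to quantum-safe cryptography.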
In artificial intelligence, quantum computing can enhance machine learning models, enabling
faster training and better pattern recognition for applications like autonomous driving or
language translation. Climate modelling is another significant use case, where quantum
simulations can offer better predictions for global warming and natural disasters. Material
science can benefit as well, with the discovery of new materials for batteries, superconductors,
or solar panels. Moreover, quantum computing can simulate quantum systems themselves,
aiding the development of better quantum devices. As we look into the future, a quantum-
powered world could bring disruptive innovation, but it will also require entirely new
programming models, infrastructure, and ethical considerations to harness its full potential
responsibly.
Quantum technologies are increasingly moving from theory to real-world application. These
technologies exploit principles of quantum mechanics—such as superposition, entanglement,
and quantum tunneling—to perform tasks that classical systems struggle with or cannot do at
all.
                                  Fig 5.0 Application of Quantum Computing
5.1.1 Healthcare
Drug Discovery
Drug discovery is one of the most promising real-world applications of quantum technologies.
The process of discovering new drugs involves simulating complex molecules and chemical
reactions—tasks that are extremely difficult and time-consuming for classical computers.
Quantum computers offer a revolutionary approach.
i. Molecular Simulation
Molecules themselves obey the laws of quantum mechanics, so quantum computers can model their
behavior natively. Traditional supercomputers must rely on approximations of quantum behavior,
which limits accuracy; quantum computers could, for certain molecules, perform these simulations
dramatically faster and more accurately. Traditional drug development takes 10–15 years and
billions of dollars, and quantum-enabled simulations could significantly shorten R&D cycles.
Quantum computers can also help simulate how proteins fold and how drugs bind to them.
Understanding folding is critical for targeting diseases like Alzheimer’s, cancer, and viral
infections.
ii. Chemical Reaction Simulation
 Organization                  Contribution
 IBM Quantum                   Simulated small molecules like LiH and BeH₂; collaborating with biotech firms.
 Google Quantum AI             Simulated basic molecules using the Sycamore quantum processor.
 D-Wave                        Exploring quantum annealing for molecule optimization.
 AstraZeneca                   Collaborating with Quantinuum and Cambridge Quantum for drug design.
 Roche & Boehringer Ingelheim  Partnering with quantum startups to simulate complex molecules.
 ProteinQure                   Uses quantum computers for protein–drug interactions and optimization.
5.1.2 Materials Science
Quantum technologies are transforming materials science by enabling scientists to discover and
design new materials with unprecedented accuracy and speed. Quantum computers and
quantum simulations help model complex atomic interactions that are too difficult for classical
computers to handle.
Quantum computing enables the accurate simulation of material behavior at the atomic level,
which is difficult for traditional systems to achieve. This opens the door to discovering new
materials with tailored properties for use in industries such as energy, electronics, and
aerospace. For instance, researchers could design more efficient superconductors, lighter and
stronger metals, or advanced polymers for biodegradable packaging. Quantum simulations
allow scientists to test and tweak atomic structures before they are physically created, saving
time and resources.
5.1.3 Logistics and Optimization
Quantum computing is set to revolutionize logistics and optimization problems that are
computationally intensive for classical systems. These include route optimization, supply chain
management, inventory forecasting, and delivery scheduling. Quantum algorithms like the
Quantum Approximate Optimization Algorithm (QAOA) are being explored to solve such
combinatorial problems more efficiently.
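To make the scale of such combinatorial problems concrete, the sketch below solves a tiny route-optimization instance by brute force over all orderings, using a hypothetical 5-stop distance matrix. The (n−1)! blow-up of this exhaustive search is what motivates heuristics and quantum approaches such as QAOA; QAOA itself is not implemented here.

```python
from itertools import permutations

# Hypothetical symmetric distance matrix between 5 delivery stops (km).
D = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]

def tour_length(order):
    """Total length of a round trip over `order`, starting and ending at stop 0."""
    stops = (0,) + tuple(order) + (0,)
    return sum(D[stops[i]][stops[i + 1]] for i in range(len(stops) - 1))

# Exhaustive search over all (n-1)! orderings; tractable only for tiny n.
best = min(permutations(range(1, 5)), key=tour_length)
print(best, tour_length(best))   # → (1, 3, 2, 4) 26
```

With just 20 stops this search would already require checking about 1.2 × 10^17 routes, which is why even approximate quantum speedups for such problems attract logistics companies.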
Beyond computing, quantum sensing and precision timing stand to redefine the limits of measurement
and detection across multiple sectors. Their impact, though often behind the scenes, will be
foundational in enabling next-generation technologies in science, security, health, navigation,
and communication. As these tools move from the lab to real-world deployment, they will play
a critical role in building a more precise, responsive, and interconnected future.
5.2 Industrial case studies
5.2.1 IBM
IBM has been one of the earliest and most active players in the quantum computing industry.
Its flagship platform, IBM Quantum, provides cloud-based access to quantum processors,
allowing researchers, students, and developers to experiment with quantum algorithms. IBM
introduced the Qiskit open-source framework to encourage quantum programming and research
collaboration.
Their IBM Quantum System One, the world’s first integrated quantum system for commercial
use, has been deployed in multiple locations globally. IBM’s roadmap is transparent and
ambitious— they aim to scale quantum hardware from hundreds to thousands of qubits using
error-corrected quantum processors. IBM is also making progress in quantum error correction,
recently demonstrating the use of quantum LDPC (Low-Density Parity-Check) codes, which
are essential for building reliable, large-scale quantum systems. Their 433-qubit “Osprey”
processor, unveiled in 2022, showcased their hardware scalability, and the 1,121-qubit “Condor”
processor, announced in 2023, pushed the envelope further. IBM is
pioneering modular quantum computing, where smaller quantum chips are interconnected to
function as a larger system. This approach mirrors classical multi-core processing and is crucial
for scalability.
In parallel, IBM continues to enhance Qiskit Runtime, an execution environment that optimizes
quantum circuit performance through advanced compilation and error mitigation. IBM also
publishes a transparent quantum roadmap, updated annually, which guides developers,
educators, and researchers globally. Their presence in quantum education is unmatched,
offering resources like Quantum Composer, hands-on labs, and hackathons through the IBM
Quantum Network. Collaborations with organizations such as CERN and MIT underline their
leadership in open science. IBM’s dual commitment to technological progress and community
development positions it as a central pillar in the global quantum ecosystem.
Notably, IBM is collaborating with industries such as healthcare, finance, and chemicals to
apply quantum computing to real-world challenges, including molecule simulation, portfolio
optimization, and materials discovery. They have also partnered with governments and
academic institutions to develop the quantum workforce, showcasing a commitment not only
to technology but also to ecosystem building.
5.2.2 Google
Google’s Quantum AI team reached a widely publicized milestone in 2019, when its 53-qubit
Sycamore processor completed a random-circuit sampling task far faster than the leading classical
supercomputers of the time, a result described as a demonstration of “quantum supremacy.”
Google also develops Cirq, an open-source Python framework for building and simulating quantum
circuits, and continues to advance superconducting-qubit hardware and quantum error correction
on its roadmap toward a large-scale, error-corrected quantum computer.
5.2.3 Microsoft
Microsoft is approaching quantum computing from a full-stack perspective. Unlike IBM and
Google, Microsoft is working on topological qubits, a type of qubit expected to be more stable
and less error-prone than traditional ones. While topological qubits are still in early stages,
Microsoft is simultaneously providing tools and platforms through Azure Quantum, a
cloud-based ecosystem that offers access to quantum hardware and simulators from multiple
vendors.
Azure Quantum’s integration with Microsoft’s cloud ecosystem gives users access to quantum
solutions alongside tools like Azure AI and Azure HPC—bridging classical and quantum
workflows. Microsoft’s Quantum Innovator Series and technical documentation have been
influential in educating enterprises on how to prepare for the quantum era. They are also
focusing on quantum-resilient cryptography, developing protocols that can withstand both
classical and quantum attacks. By building an abstraction layer across diverse hardware,
Microsoft is enabling developers to write once and deploy across platforms, accelerating
application prototyping. Their end-to-end approach reflects a deep commitment to usability,
scalability, and enterprise adoption.
Their Quantum Development Kit (QDK) includes Q#, a programming language designed
specifically for quantum algorithms. By focusing on integration and developer accessibility,
Microsoft’s contribution lies not only in quantum research but also in making quantum
technologies available and practical for developers and organizations across various sectors.
Microsoft is also deeply invested in quantum error correction, actively exploring Majorana
modes, exotic quasiparticles believed to make topological qubits naturally error-resistant. Their
Station Q lab, based at UC Santa Barbara, focuses on this ambitious path, which, if
successful, could leap ahead of current noisy qubit approaches. Microsoft has also established
partnerships with academic institutions and quantum startups to develop hybrid
quantum-classical algorithms tailored for early business use cases. They emphasize the
importance of resource estimation tools, allowing developers to assess what kind of quantum
system is required to run a given algorithm.
5.2.4 PsiQuantum
PsiQuantum takes a unique and bold approach to quantum computing by building a photonic
quantum computer using conventional semiconductor fabrication techniques. Their goal is to
build a fault-tolerant, million-qubit quantum computer using photons as qubits instead of
superconducting circuits. PsiQuantum’s photonic approach benefits from the low decoherence
of photons, which can travel long distances without interacting with their environment—a
major advantage over fragile superconducting qubits. Their system uses linear optical
elements, such as beam splitters and phase shifters, along with single-photon sources and
detectors, which can be manufactured using standard CMOS fabrication techniques. This
positions PsiQuantum to benefit from existing semiconductor supply chains and reduce
hardware costs in the long run.
They are also investing in cryogenic electronics and quantum-classical control systems that
can scale with photonic architectures. PsiQuantum has filed numerous patents related to fault-
tolerant architecture design, photon routing, and quantum error correction, highlighting
the depth of their IP strategy. The firm collaborates with government agencies like DARPA
and national laboratories, and is exploring applications in energy optimization, quantum
networking, and climate modeling. Though still in stealth for some aspects of their
technology, PsiQuantum aims to build a utility-scale quantum computer that could run
meaningful applications with full error correction. Their combination of high ambition,
deep physics, and scalable engineering could allow them to emerge as a disruptive force in
the global quantum race.
This design choice aims to solve scalability and error correction challenges from the ground up.
Unlike other quantum startups, PsiQuantum emphasizes working with existing silicon foundries
to leverage mature infrastructure and reduce manufacturing risk. Although their systems are not
yet publicly available, the company has received significant investment and is partnering with
industry leaders and government bodies to advance its technology. If successful, PsiQuantum
could leapfrog traditional architectures by introducing a scalable and manufacturable approach
to quantum hardware.
5.3 Challenges in adoption: cost, talent, and standardization
The road to mainstream adoption of quantum computing is filled with significant challenges,
the most immediate being cost. Building and maintaining quantum systems—especially those
based on superconducting qubits— requires not only sophisticated technology but also
environments cooled to near absolute zero, typically using expensive dilution refrigerators. The
infrastructure needed to support such systems involves complex shielding from electromagnetic
interference, ultra-stable power sources, and precise control equipment. These requirements
drive up capital and operational expenses, making it nearly impossible for small startups,
educational institutions, or developing countries to participate meaningfully in quantum
research and development. As of now, only a handful of tech giants and government-backed
research labs possess the resources needed to invest in such large-scale quantum initiatives.
Beyond cost, the shortage of skilled professionals in the quantum ecosystem is a pressing
concern. Quantum computing is a multidisciplinary domain that spans quantum mechanics,
advanced mathematics, classical and quantum algorithms, and computer engineering. However,
academic programs offering dedicated training in quantum information science are still limited.
This creates a bottleneck in talent availability, with companies and universities struggling to
find individuals who can bridge the gap between theoretical research and practical system
development. The few who are highly skilled are in such high demand that they are often
absorbed into elite roles within top-tier tech companies or academic institutions, further limiting
broad-based industry access.
The skills gap also hampers innovation. Without a sufficiently large and well-trained
workforce, progress in algorithm design, hardware testing, and software integration slows
considerably. This shortage extends to educators and trainers as well, meaning that scaling up
learning programs is itself a challenge. Governments and educational institutions have started
investing in quantum literacy initiatives, but progress is slow compared to the pace of
technological advancement.
Quantum computing demands a rare combination of knowledge in quantum physics,
mathematics, computer science, and engineering. As a result, the number of trained
professionals capable of designing, building, and programming quantum systems is critically
low.
Another formidable barrier is the lack of standardization across the quantum computing
ecosystem. In classical computing, universal programming languages (like C, Java, or Python),
standardized chip architectures (like x86 or ARM), and defined protocols for data exchange
have created an ecosystem where hardware and software can evolve rapidly and cooperatively.
In contrast, the quantum world remains fragmented. Each hardware vendor—whether working
on superconducting qubits, trapped ions, photonic systems, or topological qubits—uses unique
control systems, programming environments, and error correction methods. As a result,
software written for one platform is rarely portable to another, making collaboration and system
integration difficult.
The absence of standardization also means there is no shared benchmarking system to measure
progress objectively across platforms. This makes it harder for organizations to make informed
decisions about which quantum technologies to invest in, and for researchers to compare results
and replicate studies. Without agreed-upon protocols, it’s also difficult to ensure compatibility
between different layers of the quantum computing stack—from hardware to middleware to
application software.
Until these core challenges—cost, workforce skills, and system standardization—are
addressed, quantum computing will continue to remain largely in the domain of research and
experimentation. For the technology to achieve widespread adoption and commercial viability,
there must be concerted efforts by governments, academia, and industry to democratize access,
invest in education, and agree on shared frameworks and protocols. Only then can the true
transformative potential of quantum computing be fully realized across sectors such as
healthcare, finance, energy, logistics, and beyond.
5.5 Emerging careers in quantum: roles, skillsets, and preparation pathways
The rise of quantum computing is generating an exciting array of new career opportunities,
blending physics with computer science, mathematics, and engineering. As quantum
technologies move closer to practical application, the demand for skilled professionals is
growing rapidly. Among the most prominent emerging roles is the Quantum Software
Developer, responsible for writing algorithms tailored to quantum computers using specialized
frameworks such as IBM’s Qiskit, Google’s Cirq, Xanadu’s PennyLane, or Microsoft’s Q#.
These developers work on
creating quantum programs for applications in cryptography, optimization, chemistry, and
machine learning. Another critical role is that of the Quantum Hardware Engineer, who
designs, tests, and maintains the delicate physical systems—such as superconducting circuits,
ion traps, or photonic chips—that serve as the backbone of quantum computation. These
engineers must understand cryogenics, quantum control systems, and the physics of qubit
interactions. Their work ensures the reliable operation of quantum processors under extreme
environmental conditions.
Quantum Researchers and Quantum Algorithm Scientists play a foundational role in
pushing the frontiers of the field. They focus on developing more stable and error-resistant
qubits, inventing novel quantum algorithms, and improving quantum error correction and fault-
tolerance mechanisms. Many of these professionals work in academia or research labs but are
increasingly being recruited into private-sector R&D roles.
In parallel, Quantum Information Scientists work on the theoretical aspects of how quantum
systems process, transmit, and secure information. Their insights underpin advances in areas
like quantum cryptography, quantum communications, and entanglement-based networks.
Meanwhile, the industry is seeing the emergence of roles such as Quantum Systems
Integrators, who bridge the gap between hardware, software, and applications—ensuring that
quantum components work together efficiently across the tech stack.
With the growing intersection of business and quantum, companies are also hiring Quantum
Product Managers, who guide the development and delivery of quantum solutions aligned
with customer needs and market trends. Similarly, Quantum Cybersecurity Analysts are
becoming vital in preparing organizations for a post-quantum world by analyzing encryption
vulnerabilities and implementing quantum-safe cryptographic protocols.
The required skillsets for these careers are diverse but generally include a strong foundation in
quantum mechanics, linear algebra, probability theory, and classical programming
languages like Python or C++. Knowledge of quantum programming platforms, familiarity
with quantum gates and circuits, and experience with simulation tools are increasingly
expected. In hardware-related roles, additional expertise in electrical engineering,
nanofabrication, cryogenics, or optics may be essential.
To prepare for a career in quantum technologies, students and professionals can pursue formal
degrees in physics, computer science, mathematics, or electrical engineering. Many
universities now offer specialized quantum computing master’s programs, interdisciplinary
PhDs, and research assistantships in quantum labs. For those seeking flexible learning paths,
numerous online platforms—including edX, Coursera, QuTech Academy, and MITx—offer
quantum computing courses. Additionally, companies like IBM, Microsoft, and Google
provide free tools and resources for self-learning and experimentation.
Hands-on training is increasingly vital. Platforms such as IBM Quantum Experience, Azure
Quantum, and Amazon Braket allow users to access real quantum hardware and simulators.
Industry certifications, hackathons, internships, and quantum developer bootcamps are also
emerging as effective ways to gain practical exposure and build credibility in the field.
As quantum technology evolves, so too will the career landscape. Interdisciplinary
collaboration—combining physics, engineering, AI, and cybersecurity—will be essential.
Lifelong learning and adaptability will remain key traits for anyone aiming to build and sustain
a successful career in the quantum workforce of the future.
5.6 Educational and research landscape – India's opportunity in the global quantum race
India is uniquely positioned to play a pivotal role in the global quantum revolution, thanks to
its vast pool of scientific talent, growing technology infrastructure, and increased policy-level
attention to emerging technologies. Recognizing the transformative potential of quantum
computing, the Indian government launched the National Mission on Quantum Technologies
& Applications (NM-QTA) with a significant outlay of ₹8,000 crore (around $1 billion
USD). Premier institutes such as IISc Bangalore, the IITs at Bombay, Delhi, Madras, and Kharagpur,
and Tata Institute of Fundamental Research (TIFR) are at the forefront of academic research
in quantum physics and quantum computing. These institutions are engaged in pioneering work
on quantum algorithms, quantum key distribution (QKD), quantum error correction,
quantum optics, and quantum materials. In parallel, specialized quantum research labs are
being established in collaboration with government agencies such as DRDO, ISRO, and DST,
further expanding India’s R&D footprint.
Educational initiatives are also gathering momentum. Universities are beginning to offer
elective and degree programs in quantum information science, and efforts are underway to
integrate quantum modules into engineering and physics curricula at both undergraduate and
postgraduate levels. The Quantum Computer Simulator Toolkit (QSim), launched by the
Ministry of Electronics and Information Technology (MeitY), is an important step toward
democratizing quantum education. QSim allows students and researchers to develop and test
quantum algorithms on simulated environments without needing access to real quantum
hardware.
Despite this promising start, India must address several systemic challenges to fully harness its
potential. A major bottleneck is the shortage of trained faculty and researchers who specialize
in quantum science. Additionally, infrastructure gaps—such as the lack of high-fidelity
quantum hardware, advanced fabrication labs, and dedicated quantum computing centers—
impede rapid progress. There is also a pressing need to foster deeper industry-academic
collaborations, which remain limited compared to global counterparts.
To bridge these gaps, public-private partnerships (PPP) must be scaled up. Tech companies
like TCS, Infosys, and HCL are beginning to explore quantum computing applications and
can play a vital role in commercializing academic research. India should also focus on
international collaborations with leading quantum research hubs in the US, EU, Canada, and
Japan to gain access to expertise, platforms, and funding. Encouraging student participation
through quantum hackathons, fellowships, and global internships will further energize the
ecosystem.
With its robust IT and software industry, deep mathematical and scientific base, and strong
policy direction, India has the potential not only to catch up with global quantum leaders but
also to lead in select areas such as quantum software development, theoretical quantum
research, quantum cryptography, and simulation technologies. For this vision to
materialize, a long-term commitment to curriculum reform, faculty development, infrastructure
investment, and ecosystem collaboration is essential.
If India leverages these strengths strategically, it can transform from a follower to a global
innovator in quantum technologies—contributing significantly to secure communications,
next-generation computing, precision medicine, and national defence.