NANOTECHNOLOGY
1.0 INTRODUCTION
Nanotechnology is technology based on very small things, mostly comprising nanostructures, atoms
and molecules. Its scale of measurement is the nanometre, one billionth of a metre, which is
smaller than the wavelength of visible light and about one hundred-thousandth the width of a
human hair. It is the engineering of functional systems at the molecular scale.
In its original sense, 'nanotechnology' refers to the projected ability to construct items from
the bottom up, using techniques and tools being developed today to make complete, high-
performance products. The idea of nanotechnology was first proposed by the physicist
Richard Feynman in 1959. Feynman never actually used the terms "nanotechnology" or "nanite",
but he gave a speech called "There's Plenty of Room at the Bottom" in which he talked about
how we would one day be able to manipulate atoms and molecules and craft them into
whatever we wanted them to be. He then discussed the possibility of our creating, in the distant
future, extremely small machines that would serve as tiny tools.
This idea was at first considered completely radical; we now see nanotechnology as a very
real and promising technology of the near future. Nanotechnology was not taken seriously as a
concept until the 1980s, when Eric Drexler began research into it, building on Feynman's
speech. Drexler spent many years refining the concept and getting many different scientists
involved in actually producing nanotechnology. Nanotechnology has impacts in various fields
such as computing and data storage, materials and manufacturing, health and medicine, energy,
and transportation. In the computing and data storage field, it helps in the development of
processors with higher speed, greater durability and lower energy consumption, and it also
helps improve display and quantum technologies.
In the materials and manufacturing field, it enables stronger and lighter materials as well as
thermal and wear-resistant coatings. In the energy field, it provides alternative sources of
energy that may one day rival even solar energy.
The basic concept linking nanotechnology to computer science, as to many other
applications, is that when materials are scaled down to the nano level they develop various
tunable and desirable properties (optical, electronic, mechanical, magnetic) that are otherwise
absent in bulk materials. This is also implied by Moore's law, the observation that, over the
history of computing hardware, the number of transistors in a dense integrated circuit has
doubled approximately every two years: computer device components are consequently being
scaled down further and further.
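As a rough numerical reading of the doubling rule, the sketch below projects transistor counts forward from the Intel 4004 (its count of roughly 2,300 transistors is used here as an assumed, approximate baseline):

```python
# Moore's law as stated above: transistor counts double roughly every
# two years. Project the count forward from an assumed 1971 baseline
# purely to illustrate the exponential trend.

START_YEAR = 1971
START_TRANSISTORS = 2_300        # approximate count for the Intel 4004
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year):
    """Idealized Moore's-law projection of the transistor count in `year`."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, round(projected_transistors(year)))
```

Twenty years is ten doublings, so the idealized count grows by a factor of 1,024 per two decades; real processors have tracked this trend only approximately.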
At the last turn of the century, the average person would have had a hard time understanding
how cars and airplanes worked, and computers and nuclear bombs existed only in theory.
By the next turn of the century, we may have submicroscopic, self-replicating robots; machine
people; the end of disease; even immortality.
Hard to imagine? Not for the new breed of scientists who say that the 21st century could
see all these science-fiction dreams come true, thanks to molecular nanotechnology, a hybrid
of chemistry and engineering that would let us manufacture anything with atomic precision.
In fact, scientists claim that within the next 50 years this new technology will change the
world in ways we can barely begin to imagine today. Just as computers break data down into its
most basic form, 1s and 0s, nanotechnology deals with matter in its most elemental form: atoms
and molecules.
With a computer, once data is broken down and organized into combinations of 1s and
0s, it can be easily reproduced and distributed. With matter, the basic building blocks are atoms
and the combinations of atoms that make up molecules. Nanotechnology lets you manipulate
those atoms and molecules, making it possible to manufacture, replicate, and distribute any
substance known to humans as easily and cheaply as you can replicate data on a computer.
The aim is to build machines on the scale of molecules. Basically, nanotechnology works with
materials, devices and other structures with at least one dimension sized from 1 to 100
nanometres. Examples are motors a few nanometres wide, robot arms, small electronic
components, novel semiconductor devices, and even whole computers far smaller than a cell.
Further examples are aerosols, colloids, coatings, nanoparticle-reinforced composites and
nanostructured metals.
Use of natural resources
Rather than clear-cutting forests to make paper, we would have assemblers synthesizing paper.
Rather than using oil for energy, we would have molecule-sized solar cells mixed into road
pavement. Famine would be obliterated, as food could be synthesized easily and cheaply with
a microwave-sized nanobox that pulls the raw materials (mostly carbon) from the air or the
soil. And by using nanobots as cleaning machines that break down pollutants, we would be able
to counteract the damage we have done to the earth since the industrial revolution.
Nanotechnology would render the traditional manufacturing process obsolete. For example,
we would no longer have a steel mill outfitted with enormous, expensive machinery, running on
fossil fuels and employing hundreds of human workers; instead we would have a nanofactory
with trillions of nanobots synthesizing steel, molecule by molecule.
Despite the many important applications of nanotechnology, the emphasis here is on the area of
information and communication, which deals with the manufacturing and development of the
micro devices and electronic components required to make a nanocomputer.
Nanotechnology has benefited computer science in many ways, for example by increasing the
efficiency of computer processors and by ensuring the continuity of Moore's law. A billion
(or trillion) tiny particles, whether complex molecules or miniature machines, must all cooperate
and collaborate in order to produce the desired end result.
None will individually have sufficient computing power to enable complex programming. Like
the growth of crystals, the development of embryos, or the intelligent behaviour of ants,
bottom-up nanotechnology must be achieved through collective, emergent behaviours arising
from simple interactions among the particles and with their environment. Computer science,
and especially fields of research such as swarm intelligence, will be critical for the future of
bottom-up nanotech.
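The idea that a useful global result can emerge from simple local rules can be illustrated with a toy simulation (a hypothetical majority-rule model, not a real nanobot controller): each agent holds one bit and repeatedly adopts the majority value among a few randomly sampled peers, and the whole population converges on an answer no single agent computed.

```python
import random

def consensus(bits, sample_size=3, rounds=200, seed=0):
    """Each round, every agent adopts the majority bit among a few
    randomly sampled peers. Returns the final population."""
    rng = random.Random(seed)
    bits = list(bits)
    for _ in range(rounds):
        nxt = []
        for _ in bits:
            sample = rng.sample(bits, sample_size)
            nxt.append(1 if sum(sample) * 2 > sample_size else 0)
        bits = nxt
        if sum(bits) in (0, len(bits)):  # global agreement reached
            break
    return bits

# 100 agents, 70% initially holding 1: the majority opinion spreads
# until the entire population agrees, with no central coordinator.
result = consensus([1] * 70 + [0] * 30)
print(sum(result), "of", len(result), "agents agree on 1")
```

No agent ever sees the whole population; agreement emerges from purely local sampling, which is the essence of the swarm-intelligence argument above.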
Nanotechnology is already in use in many computing, communications, and other electronics
applications to provide faster, smaller, and more portable systems that can manage and store
larger and larger amounts of information. These continuously evolving applications include:
• Nanoscale transistors that are faster, more powerful, and increasingly energy-efficient;
soon your computer's entire memory may be stored on a single tiny chip.
• Magnetic random-access memory (MRAM), enabled by nanometre-scale magnetic tunnel
junctions, that can quickly and effectively save even encrypted data during a system
shutdown or crash, enable resume-play features, and gather vehicle accident data.
• Displays for many new TVs, laptop computers, cell phones, digital cameras, and other
devices that incorporate nanostructured polymer films known as organic light-emitting diodes,
or OLEDs. OLED screens offer brighter images in a flat format, as well as wider viewing
angles, lighter weight, better picture density, lower power consumption, and longer
lifetimes.
A nanocomputer is a computer whose physical dimensions are microscopic. The field of
nanocomputing is part of the emerging field of nanotechnology. Several types of nanocomputers
have been suggested or proposed by researchers and futurists. Following are the four categories
of nanocomputing:
1.5.1 Electronic Nanocomputers
Electronic nanocomputers would operate in a manner similar to the way present-day
microcomputers work. The main difference is one of physical scale. More and more transistors
are squeezed into silicon chips with each passing year; witness the evolution of integrated
circuits (ICs) capable of ever-increasing storage capacity and processing power. The ultimate
limit to the number of transistors per unit volume is imposed by the atomic structure of matter.
Most engineers agree that technology has not yet come close to pushing this limit. In the
electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary
microprocessors might be called nanodevices.
1.5.2 Chemical Nanocomputers
Chemical nanocomputers would store and process information in terms of chemical structures
and interactions. Biochemical nanocomputers already exist in nature; they are manifest in all
living things. The development of a true chemical nanocomputer will likely proceed along lines
similar to genetic engineering. Engineers must figure out how to get individual atoms and
molecules to
perform controllable calculations and data storage tasks.
1.5.3 Quantum Nanocomputers
To explain what quantum computers are, we begin by taking a closer look at a basic chunk of
information, namely one bit. From a physical point of view, a bit is a physical system which can
be prepared in one of two different states representing two logical values — no or yes, false or
true, or simply 0 or 1. One bit of information can also be encoded using two different
polarisations of light or two different electronic states of an atom. However, if we choose an
atom as a physical bit, then quantum mechanics tells us that apart from the two distinct
electronic states the atom can also be prepared in a coherent superposition of the two states.
This means that the atom is in both state 0 and state 1 at once. Now we push the idea of
superposition a bit further. Consider a register composed of three physical bits. Any classical
register of that type can store, at a given moment of time, only one of eight different numbers;
that is, the register can be in only one of eight possible configurations such as 000, 001, 010,
..., 111. A quantum register composed of three qubits can store all eight numbers at once in a
quantum superposition.
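The three-qubit register described above can be sketched numerically: the register's state is a vector of eight complex amplitudes, and applying a Hadamard gate to each qubit spreads the state evenly over all eight configurations (a minimal simulation, not real quantum hardware):

```python
import numpy as np

# A classical 3-bit register holds exactly one of eight values at a time.
# A 3-qubit quantum register is described by eight complex amplitudes,
# one per basis state |000>, |001>, ..., |111>.

state = np.zeros(8, dtype=complex)
state[0] = 1.0  # start in the definite configuration |000>

# The Hadamard gate puts a single qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying H to each of the three qubits (tensor product of three gates)
# yields an equal superposition of all eight basis states.
U = np.kron(np.kron(H, H), H)
state = U @ state

# Each of the eight numbers 000..111 now carries probability 1/8.
print(np.round(np.abs(state) ** 2, 3))
```

A classical simulation must track all eight amplitudes explicitly, which is exactly why simulating n qubits costs 2^n numbers while the quantum register stores them "for free".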
Another benefit is that such computers would require less power to operate, meaning they would
not need the energy-hungry cooling systems required by conventional computers.
1.5.5 Nanorobots
One vision of a nanoassembler or nanorobot is a device with robotic arms, motors, sensors and a
computer to control the behaviour, all at the scale of nanometres. In 1992, Drexler's book
"Nanosystems" gave an analysis of the feasibility of machine components for such nanorobots
[24]. However, even to build a molecular motor, researchers have to consider the laws of
thermodynamics governing motors in actual operation [25]; just building a miniature version of
an ordinary motor is not adequate. Recently, a controversy arose surrounding Feynman's vision
of nanorobots. In 2003, an open debate through letters between K. E. Drexler and R. E. Smalley
(who was awarded a Nobel Prize for the discovery of fullerenes) was presented to the public
[22]. Smalley was not convinced that the molecular assemblers envisioned by Drexler are
physically possible, while Drexler insisted on his previous findings. Certainly, the study of
similarly sized biological machines (organic cells) suggests there may be more effective
alternatives to Drexler's nanorobots. Even if nanorobots can be realised, they will not be
available in the near future.
CHAPTER TWO
In 1959, Richard Feynman, a future Nobel laureate, gave a visionary talk entitled "There's
Plenty of Room at the Bottom" [1] on miniaturization to nanometre scales. Later, the work of
Drexler [1,2] also gave futuristic visions of nanotechnology. Feynman's and Drexler's visions
inspired many researchers in physics, materials science, chemistry and biology.
A microprocessor is the CPU (central processing unit) of the computer, which controls the
memory, input/output devices, and overall operation of the computer; in other words, it is a
central processing unit built on a single IC. As for the evolution of the microprocessor,
Fairchild Semiconductor (founded in 1957) invented the first IC in 1959. The Intel 4004,
introduced in 1971, was Intel's first microprocessor; it was a 4-bit device. Since then, the
number of transistors has increased according to Moore's law, and today we have far more
efficient processors such as the Core i5 and i7.
With the development of efficient computer technology in the 1940s, the solution of elaborate
wave equations for complex atomic systems began to be a realizable objective. During the
1970s, widely different methods began to be seen as part of a new emerging discipline of
computational chemistry. The Journal of Computational Chemistry was first published in 1980,
and computational chemistry has featured in a number of Nobel Prize awards, most notably in
1998 and 2013.
As yet the potential for nanomaterials to exert deleterious effects on humans or the environment is
poorly understood but data on their possible effects is needed so that expanded development and
use of nanotechnology can proceed. Promoters of nanotechnology are aware of the problems
encountered by genetic engineers when the public suddenly became aware of genetically modified
crops present in fields and in foods.
Already there has been one reported rapid withdrawal of a nanotechnology-based product, Magic
Nano, a spray-on ceramic sealant to repel dirt. Over 110 consumers in Europe reported respiratory
symptoms after using the product, and the product was pulled in March 2006. There is a need to
educate the public about these new technologies and to discuss their promise as well as
potential safety issues and how these are being addressed. Assessing the risk of using
nanomaterials presents
some unique challenges because there is little published research on which to base conclusions
and recommendations.
A preliminary framework has been developed to help determine what research is needed, how it
can be integrated, and how the resulting information can be incorporated into decisions about
safety. Thirteen experts with competence in a variety of relevant fields were interviewed to
establish a list of factors affecting the potential human health risks and ecological risks of
nanoparticles. Information was sorted into an influence diagram with relationships that the experts
hypothesized would affect safety assessment. This framework can help prioritize experiments
necessary to determine safety of nanoparticles. As research results are obtained, they can be
incorporated into the framework and used to estimate potential risks. A Forum Series of seven
articles on Research Strategies for Safety Evaluation of Nanomaterials was presented in
Toxicological Sciences in 2005–2006.
Toxicological risk assessment requires data on both exposure to and uptake of nanoparticles and
the toxic effects of these particles (if any) once they enter the body. Unfortunately, available data
on these topics is sparse although there are some recent reports and ongoing projects. Some data
are available on exposure routes but the unique physicochemical properties associated with
different nanoparticles complicate risk assessment. Techniques were described for basic
nanoparticle characterization within the body to aid in assessing how they will interact with
biological systems. Dissolution may be a critical factor determining biological fate and effects of
nanoparticles in the body. Numerous factors affect dissolution, including concentration, surface
area, surface energy, surface morphology, aggregation, dissolution layer properties, and adsorbing
species. Recent advances in nanotechnology are aiding development of sensors that determine
markers of
exposure, biological responses, and environmental remediation. Some consumer products
(cosmetics and sunscreens, sports equipment, textiles) containing nanomaterials were examined to
evaluate possible human exposures to nanoparticles from these products and any potential hazards
they pose.
2.1 EVOLUTION, TECHNOLOGY AND APPLICATION
2.1.1 Evolution of Nanotechnology
According to Mihail Roco of the U.S. National Nanotechnology Initiative, the evolution of
nanotechnology can be divided into four generations based on their products. Roco said that the
first generation of nanotechnology began in 2000. First-generation products, also known as
"passive nanostructures," are designed to perform one task; examples are colloids and aerosols.
The second generation, which came about in 2005, is called "active nanostructures." These have
a multitasking ability; examples are actuators and sensors. The third generation, the era of
systems of nanosystems, was expected to begin in 2010. These nanosystems will consist of
thousands of interacting components, with features such as 3D networking, new hierarchical
architectures, robotics, and guided assembling. The fourth generation of nanotechnology was
expected to begin around 2015, when the first integrated nanosystems would be developed.
2.1.2 Technology and Techniques
Nanotechnology uses the technique of nanofabrication, which makes it possible to manipulate
and integrate matter at the atomic level. It is of particular interest to computer engineers
because it opens the door to super-high-density microprocessors and memory chips.
Nanofabrication is the design and manufacture of devices with dimensions measured in
nanometres. One can broadly divide the various nanofabrication techniques into two categories:
Top-down approach: The top-down approach seeks to create smaller devices by using larger
ones to direct their assembly. It often uses traditional microfabrication methods in which
externally controlled tools cut, mill and shape materials into the desired shape and order.
The most common top-down fabrication technique is nanolithography. In this process the
required material is protected by a mask and the exposed material is etched away. Depending
upon the level of resolution required for features in the final product, the pattern can be
defined using ultraviolet light, X-rays or electron beams, and the base material etched away
chemically using acids. This is the technique applied to manufacture computer chips.
Bottom-up approach: The bottom-up approach seeks to build larger structures by assembling
smaller components, atom by atom or molecule by molecule, for example through chemical
synthesis and molecular self-assembly.
CHAPTER THREE
In the near future, the computer industry will use the above technology extensively to fabricate
microprocessor chips. These chips would yield smaller, faster, more reliable, more efficient and
lighter computers.
3.1.2 Quantum dots
Quantum dots are crystals that emit only one wavelength of light when their electrons are
excited. They are a new material made by the bottom-up method of nanofabrication. In future,
quantum dots could be used as quantum bits and so form the basis of quantum computers.
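The size-dependence of a quantum dot's emission can be illustrated with a toy particle-in-a-box estimate (a deliberately simplified assumption; real dots also involve the bulk band gap and carrier effective masses): confinement energy grows as 1/L², so smaller dots emit higher-energy, bluer light.

```python
# Toy particle-in-a-box model: the ground-state confinement energy of an
# electron in a box of width L scales as 1/L^2, so shrinking the dot
# raises the emitted photon energy. This shows only the trend, not
# quantitative emission wavelengths of real quantum dots.

H_PLANCK = 6.626e-34    # Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron rest mass, kg

def confinement_energy(box_width_m):
    """Ground-state energy (J) of a particle in a 1-D box of the given width."""
    return H_PLANCK**2 / (8 * M_ELECTRON * box_width_m**2)

# Halving the dot size quadruples the confinement energy.
for nm in (10, 5, 2):
    print(f"{nm} nm dot: E = {confinement_energy(nm * 1e-9):.2e} J")
```

The 1/L² scaling is why a single material can be tuned to emit different colours simply by growing dots of different sizes.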
i. Working of quantum computers
In quantum computers, the binary bits of conventional computers are replaced by quantum bits,
or qubits, which can be in the state 0, the state 1, or a superposition (simultaneously both 0
and 1). As a quantum computer can hold multiple states simultaneously, it is believed to have
the potential to perform millions of computations at the same time. This would make the
computer much faster than its predecessors. The development of quantum computers is still
under research.
ii. Limitations of quantum computers
Quantum computers are based on quantum mechanical phenomena, which are vulnerable to the
effects of noise, the disappearance of coherence, and the loss of quantum bits. These problems
are discussed below.
• Problem of coherence disappearance: A quantum computer can only function if the
information exists for long enough to be processed. Researchers have discovered that coherence
spontaneously disappears over the course of time. This could pose a considerable problem for
the development of a quantum computer.
• Simultaneous existence of two states: In a quantum computer, a superconducting quantum bit
can simultaneously exist in two states. Normally one of the two states disappears as soon as the
system comes into contact with the outside world. The coherence then disappears as a result of
the decoherence process, and the information in the quantum bit is lost.
iii. Solution to the above problems
More research is needed. In particular, there is a need to study, through molecular dynamics
simulations carried out at finite temperatures, machines of some degree of complexity in which
both the mechanism itself and its mounting are subject to thermal noise.
3.1.3 Carbon nanotubes
A carbon nanotube is a tube-shaped carbon material measured on the nanometre scale. With the
advancement of nanofabrication techniques, researchers have used this material to create
electronic components such as transistors, diodes, relays and logic gates. These components
can be applied directly in making advanced computers.
d) DNA computing
It is an approach to nanocomputers. DNA computing uses the bottom-up approach to make DNA
molecules and DNA logic gates.
i. Major Events
• In 1994, L. Adleman used the DNA computing technique to solve an instance of the complex
travelling salesman problem.
• In 1997, researchers at the University of Rochester built DNA logic gates. This development
is considered a step towards a DNA computer.
• Researchers have found that a DNA molecule can store more information than any conventional
memory chip, and that DNA can be used to perform parallel computations.
The above developments make the idea of DNA computing very appealing to the current
researchers and scientists of the world.
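Adleman's wet-lab procedure — generate an enormous soup of candidate answer strands, then filter out the invalid ones — can be mimicked in ordinary code. The sketch below uses a hypothetical 4-city graph and brute-force permutations standing in for the DNA strands, for the Hamiltonian-path flavour of the problem:

```python
import itertools

# Toy sketch of Adleman's 1994 approach: in the lab, DNA strands encode
# random city sequences and successive filtering steps keep only strands
# representing valid paths. Here ordinary permutations stand in for the
# strands. The 4-city directed graph below is purely illustrative.

EDGES = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
CITIES = range(4)

def hamiltonian_paths(start, end):
    """Generate every city ordering, then filter down to valid paths."""
    found = []
    for perm in itertools.permutations(CITIES):
        if perm[0] != start or perm[-1] != end:
            continue  # filter step 1: must begin and end at the right cities
        if all(hop in EDGES for hop in zip(perm, perm[1:])):
            found.append(perm)  # filter step 2: every hop is a real edge
    return found

print(hamiltonian_paths(0, 3))  # → [(0, 1, 2, 3)]
```

In the DNA version, the "generate" step happens massively in parallel because trillions of strands hybridize at once, which is exactly the parallelism the bullet points above refer to.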
Note: DNA stands for deoxyribonucleic acid, the biological molecule that carries the genetic
instructions for the development of life.
e) NVRAM (non-volatile RAM)
Researchers at Argonne have developed an NVRAM (non-volatile RAM) made up of tiny
nano-engineered ferroelectric crystals. Since these crystals do not revert spontaneously, RAM
made with them would not be erased should there be a power failure. Using NVRAM, laptop
computers would no longer need backup batteries, permitting them to be made still smaller and
lighter. This achievement of nanotechnology is considered a long-standing dream of the
computer industry.
f) NanoDesign (software system)
A research group at NASA has been developing a software system called NanoDesign for
investigating fullerene nanotechnology and designing molecular machines. The software
architecture of NanoDesign is designed to support and enable the group to develop complex
simulated molecular machines. The main purpose behind developing this software system is the
design and simulation of materials based on nanotechnology.
One principle that can be used to detect differences in masses uses properties of the harmonic
oscillator, where the frequency of the oscillation decreases as the mass of the oscillating body
increases. The detectable mass difference (Δm) via shifting of the frequency of the harmonic
oscillator (Δf) can be approximated as Δm/Δf ≈ 2·m_osc/f_0, with m_osc the oscillator's mass
and f_0 its resonant frequency. Recent developments in this field indicate that detection and
counting of single molecules appears to be of sufficient precision and accuracy for use in
clinical settings.
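Rearranged, the relation gives |Δm| ≈ 2·m_osc·|Δf|/f_0. A quick numeric sketch with assumed, purely illustrative oscillator parameters shows why single-molecule detection is plausible:

```python
# Mass sensing via frequency shift: for a harmonic oscillator with
# f = (1 / (2*pi)) * sqrt(k / m), differentiating gives df/dm = -f/(2m),
# hence |dm| ≈ 2 * m_osc * |df| / f0, as in the approximation above.

def detectable_mass(m_osc_kg, f0_hz, df_hz):
    """Smallest resolvable mass change for a given frequency resolution."""
    return 2 * m_osc_kg * df_hz / f0_hz

# Assumed, illustrative numbers: a 1 picogram cantilever resonating at
# 10 MHz, read out with 1 Hz frequency resolution.
dm = detectable_mass(1e-15, 10e6, 1.0)
print(f"detectable mass change ≈ {dm:.1e} kg")  # 2e-22 kg
```

Two times 10⁻²² kg is roughly the mass of a 120 kDa protein, i.e. a single large biomolecule, consistent with the clinical single-molecule claim above.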
More sensitive than the above-described oscillator is a principle by which adhesion forces
between atoms are used to measure forces or distances, as in atomic force microscopy (AFM)
(Fig 3.1). Here, a very small mass is attached to a cantilever, and the motion of this beam is
recorded to measure distances as small as 100 attometres (am; 1 am = 10⁻¹⁸ m). For comparison,
the C–H bond has a length of roughly 100 picometres (pm; 1 pm = 10⁻¹² m), placing the distance
resolution of AFM six orders of magnitude below the length of a C–H bond in a biomolecule.
Fig 3.1: Schematic illustration of the principle of atomic force microscopy.
As outlined in the original paper, the sensitivity of AFM critically depends on a low mass of
the cantilever, as well as on its high deflection for a given force and a resonance frequency
greater than 100 Hz to minimize background vibrational noise. Considering that the resonance
frequency f_0 decreases as the mass of the spring increases, one is led to the limiting
practical case in which a single atom is used as the spring.
In principle, a variety of cantilever-based detection applications can be derived from AFM, where
a pointed tip is used to trace the surface of a sample. For instance, a ferromagnetic tip could be
used to measure electromagnetic forces by deflecting the cantilever. Applications of this could be
used in electron transfer studies of chemical reactions. Another application of AFM could be the
creation of a bimetallic sensor where differences in thermal expansion coefficients of two materials
can be used to measure temperatures by registering cantilever bending. The small mass of the
cantilever results in a very small thermal capacity of the sensor, allowing for an almost real-time
measurement of thermal events. For example, phase transitions could be monitored using a
considerably small amount of sample that needs to be attached to the cantilever. Another scenario
could be to employ such a sensor in photo-thermal spectrometric assays. Alternatively,
measurements of shifts in resonance frequencies may be used to determine mass changes of a
probe attached to the tip of the cantilever, e.g., detection of the gain or loss of mass of a sample
due to varying hydration as a function of temperature to record environmental conditions, such
as humidity. By extension, an array of sensors could be assembled to measure an array of
parameters, e.g., temperature, humidity and magnetic force, while others serve as reference and
internal calibration to ensure reproducibility, accuracy and precision.
In this case, a cantilever consisting of a single atom as “sensor” is used to measure forces, such as
adhesion, magnetic momenta etc. in the sensitivity range of 10 pN, roughly equivalent to the force
necessary to rupture an individual hydrogen bond found in a biomolecule. Building on this
foundation, Sbaizero et al. demonstrated force spectroscopy on a single cell using the well-
established fact that the cytoskeleton of a cell is involved in the transduction and transmission of
mechanical force. Specifically, Sbaizero et al. report that the force needed to remove a covered
nanosphere from the surface of a cardiac fibroblast is decreased when the polymerization of the
actin filaments is disturbed by pretreating cells with Cytochalasin D. This example illustrates how
this method may be used for assessing biophysical properties of the interface between tissue and
cells on the one hand and implants or other biomedical devices on the other (see and references
therein).
Leaving the above-discussed applications aside, one can formulate the normalized mode
functions of the input and output waveguides (abbreviated as φi and φo) in a three-dimensional
Cartesian system with the dimensions x, y and z. The resulting transmission T from one
cantilever to the other is then expressed by T = T(x, y, z). Any out-of-plane movement of the
cantilevers (z + Δz) in turn alters ΔT = T(x, y, z + Δz), such that the linear displacement
response R_O as a function of Δz of the output cantilever can be calculated as the derivative
of T with respect to the displacement Δz:
R_O = ∂/∂z | [ ∫∫₋∞^∞ φi(x, y, z) φo(x, y, z + Δz) dx dy ]² |
Because both cantilevers can function as receivers for a light signal, the optimal linear
displacement for the two cantilevers in this apparatus relative to the point Δz = 0 and R_O = 0
needs to be determined. As shown in Fig 3.2, the R_O functions of the cantilevers (here named
"a" and "b") need not be congruent, and the optimal linear out-of-plane displacement relative
to the point Δz = 0 (no displacement) has to be determined for each cantilever to safeguard
proper functionality (indicated by the points a and b on the abscissa). This is especially
significant for a system in which more than one cantilever is used in signal transmission and
processing.
Fig 3.2: Linear displacement response of cantilevers in broad-band all-photonic transduction using
a uni-modular photonic waveguide.
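The displacement response can be evaluated numerically for an assumed mode shape. In the sketch below, Gaussian profiles stand in for φi and φo (the real modes depend on the waveguide geometry), the out-of-plane displacement is modelled as a transverse offset between the two profiles, and the derivative in R_O is taken by central finite differences — all parameters illustrative:

```python
import numpy as np

# Numerical sketch of the displacement response R_O: Gaussian profiles
# stand in for the mode functions, and the derivative of the squared
# overlap integral is taken by central differences.

W = 1.0  # assumed mode waist, arbitrary units

def transmission(dz, n=201, span=5.0):
    """T(dz): squared overlap integral of the two mode profiles, with the
    output mode displaced out of plane by dz."""
    x = np.linspace(-span, span, n)
    y = np.linspace(-span, span, n)
    X, Y = np.meshgrid(x, y)
    phi_i = np.exp(-(X**2 + Y**2) / W**2)
    phi_o = np.exp(-(X**2 + (Y + dz) ** 2) / W**2)
    step = x[1] - x[0]
    return (np.sum(phi_i * phi_o) * step * step) ** 2

def displacement_response(dz, h=1e-4):
    """R_O(dz): derivative of T with respect to the displacement dz."""
    return (transmission(dz + h) - transmission(dz - h)) / (2 * h)

# At dz = 0 the overlap is maximal and the linear response vanishes, so
# the operating point must sit on the slope away from zero displacement.
print(displacement_response(0.0), displacement_response(0.5))
```

The vanishing response at Δz = 0 is exactly why, as the text notes, the optimal operating point for each cantilever has to be chosen away from the zero-displacement position.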
Another direct application of the AFM principle in the search for bio-active compounds is
single-molecule force spectroscopy; the authors elected to screen for new inhibitors of the
2-C-methyl-D-erythritol-4-phosphate (MEP) pathway, intended to specifically affect organisms
that depend on this pathway, including bacteria and human pathogens such as malaria parasites,
but not members of the kingdom Animalia.
fig 3.3: Single-molecule force spectroscopy is an AFM application.
CHAPTER FOUR
Nanotechnology will greatly improve scientific exploration simply because nanites are so small.
Current trends and future developments will lead to huge contributions in the field of computer
science. Nanotechnology, now entering its fourth generation of evolution, is likely to show
outstanding innovations in the near future, which will increase the quality of life in our
society. The emerging fields of nanoscience and nanoengineering are leading to unprecedented
understanding of, and control over, the fundamental building blocks of all physical things.
Mechanical devices allowed us to reach beyond our physical strength and advance into technical
civilization.
Nanoscience and nanoscale manufacturing will allow us to reach beyond our natural size
limitation, and work directly at the building blocks of matter where properties are defined and can
be changed. Combining the present technology with the staggering potential of nanotechnology
to forge devices from the smallest building blocks, we are certainly on the verge of a
revolution in the way we sense and control the physical world around us.
This is likely to change the way almost everything, from vaccines to computers to automobile
tyres to objects not yet imagined, is designed and made.
REFERENCES
Drexler, K. E. (1986). Engines of Creation: The Coming Era of Nanotechnology. Anchor Press.
Drexler, K. E., Peterson, C. and Pergamit, G. (1991). Unbounding the Future: The
Nanotechnology Revolution.
Rueckes, T. et al. (2000). "Carbon nanotube-based nonvolatile random access memory for
molecular computing," Science, vol. 289, pp. 94–97.
Ullah, Z. (2012). "Nanotechnology and Its Impact on Modern Computer," Global Journal of
Researches in Engineering: General Engineering, Volume 12, Issue 4, Version 1.0.