Cloud computing
Unit-1
High-Performance Computing
A pool of processors (processor machines or central processing
units [CPUs]) is connected (networked) with other resources such
as memory, storage, and input/output devices, and the deployed
software is enabled to run on the entire system of connected
components.
The processor machines can be of homogeneous or heterogeneous
type.
HPC systems range from a small cluster of desktop or personal
computers (PCs) to the fastest supercomputers.
HPC applications solve large scientific problems, for example:
protein folding in molecular biology
studies on developing models and applications based on nuclear
fusion
Why Is HPC Important?
It is through data that groundbreaking scientific discoveries
are made, game-changing innovations are fueled, and quality
of life is improved for billions of people around the globe. HPC
is the foundation for scientific, industrial, and societal
advancements.
As technologies like the Internet of Things (IoT),
artificial intelligence (AI), and 3-D imaging evolve, the size
and amount of data that organizations have to work with are
growing exponentially. For many purposes, such as streaming
a live sporting event, tracking a developing storm, testing new
products, or analyzing stock trends, the ability to process data
in real time is crucial.
To keep a step ahead of the competition, organizations need
lightning-fast, highly reliable IT infrastructure to process,
store, and analyze massive amounts of data.
How Does HPC Work?
HPC solutions have three main components:
•Compute
•Network
•Storage
Parallel Computing
Parallel computing is also one of the facets of HPC. Here, a set of
processors works cooperatively to solve a computational problem. These
processor machines or CPUs are mostly of homogeneous type. In serial or
sequential computing, the following apply:
It runs on a single computer/processor machine having a single CPU.
A problem is broken down into a discrete series of instructions.
Instructions are executed one after another.
In parallel computing, since there is simultaneous use of multiple processor
machines, the following apply:
• It is run using multiple processors (multiple CPUs).
• A problem is broken down into discrete parts that can be solved
concurrently.
• Each part is further broken down into a series of instructions.
• Instructions from each part are executed simultaneously on different
processors.
• An overall control/coordination mechanism is employed.
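The steps above can be sketched in Python. This is a minimal illustration, not part of the original notes: the sum-of-squares problem, the chunk count of 4, and the function names are my own assumptions.

```python
from multiprocessing import Pool

# A minimal sketch of the parallel-computing steps listed above:
# the problem (summing the squares of 0..999) is broken into discrete
# parts, each part is solved concurrently on a different processor,
# and the Pool acts as the overall control/coordination mechanism.
def sum_squares(chunk):
    """Each worker executes this series of instructions on its part."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1000))
    # Break the problem into 4 discrete parts that can be solved concurrently.
    chunks = [data[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        partials = pool.map(sum_squares, chunks)  # parts run simultaneously
    print(sum(partials))  # -> 332833500
```

The same decomposition works with any associative reduction; the coordination step at the end combines the partial results.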
Uses of parallel computing:
Science and engineering
Industrial and commercial
Why use parallel computing?
Save time and money
Solve larger problems
Provide concurrency
Use of non-local resources
Distributed Computing
PROS:
Reliability, high fault tolerance
A system crash on one server does not affect other servers.
Scalability
In distributed computing systems you can add more machines as
needed.
Flexibility
It makes it easy to install, implement and debug new services.
Fast calculation speed
A distributed computer system can have the computing power of
multiple computers, making it faster than other systems.
Openness
Since it is an open system, it can be accessed both locally and
remotely.
High performance
Compared to centralized computer network clusters, it can provide
higher performance and a better cost-to-performance ratio.
CONS:
Difficult troubleshooting
Troubleshooting and diagnostics are more difficult
due to distribution across multiple servers.
Less software support
Less software support is a major drawback of
distributed computer systems.
High network infrastructure costs
Setting up the network brings basic infrastructure issues,
including transmission costs, high load, and loss of information.
Security issues
The open nature of distributed computer systems creates
data-security and data-sharing risks.
Difference between Parallel Computing and Distributed Computing:
S.NO  Parallel Computing                                Distributed Computing
1.    Many operations are performed simultaneously.     System components are located at different locations.
2.    A single computer is required.                    Uses multiple computers.
3.    Multiple processors perform multiple operations.  Multiple computers perform multiple operations.
4.    It may have shared or distributed memory.         It has only distributed memory.
5.    Processors communicate with each other            Computers communicate with each other
      through a bus.                                    through message passing.
6.    Improves the system performance.                  Improves system scalability, fault tolerance,
                                                        and resource-sharing capabilities.
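The communication contrast in row 5 of the table (a shared bus versus message passing) can be illustrated with a minimal Python sketch: two processes share no memory and cooperate only by exchanging messages over a pipe, which stands in for a real network link. The worker function and the task are hypothetical examples.

```python
from multiprocessing import Process, Pipe

# A toy illustration of message passing: the "worker computer" has no
# shared memory with the sender, so all cooperation happens through
# explicit messages over the connection.
def worker(conn):
    task = conn.recv()      # receive a message describing the task
    conn.send(sum(task))    # reply with the computed result
    conn.close()

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()
    parent.send([1, 2, 3, 4])  # message out to the worker...
    print(parent.recv())       # -> 10   ...and the result comes back
    p.join()
```

In a real distributed system the pipe would be a socket or a messaging library, but the pattern of request and reply is the same.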
Cluster Computing
Cluster computing means that many computers are connected on a
network and perform like a single entity. Each computer that is
connected to the network is called a node.
Cluster computing offers solutions to complicated problems by
providing faster computational speed and enhanced data integrity.
Clusters are categorized as open and closed clusters. In an open
cluster, all the nodes need IPs and are accessed only through the
Internet or web.
Types of Cluster Computing
Clusters are utilized according to the complexity of the
information they manage, the content involved, and the
anticipated operating speed. Applications that demand high
availability and minimal downtime employ cluster computing.
The types of cluster computing are:
• Cluster load balancing
• High–Availability clusters
• High-performance clusters
Grid Computing
Grid computing can be stated as the network of either
heterogeneous or homogeneous computer systems all
functioning together over far distances to achieve a task that
would rather be complicated for a single computer to achieve.
Differences between cluster computing & grid computing
Cluster Computing                                     Grid Computing
Connected computers have to be homogeneous,           Connected computers can have dissimilar OS and
i.e., they should have a similar kind of OS           hardware. They can be either heterogeneous or
and hardware.                                         homogeneous.
All the nodes are committed to performing the         The nodes allot their unused processing
same operation; no other operation is allowed.        resources to the grid computing network.
Every node is close to the next one.                  The nodes can be placed far from each other.
All the nodes are connected through a                 All the nodes are connected either through
high-speed LAN connection.                            low-speed buses or via the Internet.
The nodes are connected in a centralized              The nodes are connected in a distributed or
network topology.                                     decentralized network topology.
Scheduling is managed by a central server.            It can also have a server, but each computer
                                                      performs in its own way.
The entire system has a centralized                   Each computer handles its resources
resource administrator.                               independently.
The entire system works as one.                       Each computer is autonomous, and anyone can
                                                      join or leave at any time.
These are tightly coupled systems.                    These are loosely coupled systems.
It follows single-system functionality.               It follows diversity and dynamism.
1. Write a short note on Performance Metrics and Scalability Analysis of Distributed
Systems.
2. Explain the role of Fault Tolerance and System Availability in Distributed
Computing System.
3. Differences between cluster computing and grid computing.
4. Differences between HPC and grid computing
Cloud computing
Cloud computing is the on-demand availability of computer
system resources, especially data storage (cloud storage)
and computing power, without direct active management by
the user.
The term is generally used to describe data centers
available to many users over the Internet.
Large clouds, predominant today, often have functions
distributed over multiple locations from central servers.
If the connection to the user is relatively close, it may be
designated an edge server.
Clouds may be limited to a single organization (enterprise
clouds), or be available to many organizations (public cloud)
Early history
During the 1960s, the initial concepts of time-sharing became popularized via RJE
(Remote Job Entry).
Time-sharing continued through the 1970s on such platforms as Multics (on GE hardware).
In the 1990s, telecommunications companies, which previously offered primarily dedicated
point-to-point data circuits, began offering virtual private network (VPN) services.
In 2006, Amazon created subsidiary Amazon Web Services and introduced its
Elastic Compute Cloud (EC2).[13]
In 2008, Google released the beta version of Google App Engine.
By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship
among consumers of IT services."
In February 2010, Microsoft released Microsoft Azure, which was announced in October
2008.[33]
In July 2010, Rackspace Hosting and NASA jointly launched an open-source cloud-
software initiative known as OpenStack
On March 1, 2011, IBM announced the IBM SmartCloud framework to support
Smarter Planet.[41] Among the various components of the Smarter Computing foundation,
cloud computing is a critical part.
On June 7, 2012, Oracle announced the Oracle Cloud.[42] This cloud offering was poised to
be the first to provide users with access to an integrated set of IT solutions, including the
Applications (SaaS), Platform (PaaS), and Infrastructure (IaaS) layers.[43][44][45]
In May 2012, Google Compute Engine was released in preview, before being rolled out
into General Availability in December 2013.[46]
In 2019, it was revealed that Linux is the most used operating system on Microsoft Azure.[11]
Biocomputing
Biocomputers use systems of biologically derived
molecules, such as DNA and proteins, to perform
computational calculations involving storing,
retrieving, and processing data. Distinguishable
types of biocomputers include:
Biochemical computers
Biomechanical computers
Bioelectronic computers
Network-based biocomputers
Mobile Computing
Mobile Computing is a technology that allows
transmission of data, voice and video via a
computer or any other wireless enabled
device without having to be connected to a
fixed physical link. The main concept involves
• Mobile communication
• Mobile hardware
• Mobile software
Mobile communication
Mobile communication, in this case, refers to the
infrastructure put in place to ensure that seamless
and reliable communication goes on. This includes
the protocols, services, bandwidth, and portals
necessary to facilitate and support the stated
services.
Mobile Hardware
Mobile hardware includes mobile devices or
device components that receive or access the
service of mobility. They range from portable
laptops and smartphones to tablet PCs and
personal digital assistants (PDAs).
Mobile software
Mobile software is the actual program that runs
on the mobile hardware. It deals with the
characteristics and requirements of mobile
applications. This is the engine of the mobile
device. In other terms, it is the operating
system of the appliance.
Advantages
Location Flexibility
Saves Time
Enhanced Productivity
Entertainment
Quantum computing
Quantum computers perform calculations
based on the probability of an object's state
before it is measured - instead of just 1s or
0s - which means they have the potential to
process exponentially more data compared to
classical computers.
History
Quantum computing began in the early 1980s,
when physicist Paul Benioff proposed a
quantum mechanical model of the
Turing machine.
In 1994, Peter Shor developed a quantum
algorithm for factoring integers that had the
potential to decrypt RSA-encrypted
communications.
In recent years, investment in quantum
computing research has increased in both the
public and private sectors.
There are currently two main approaches to
physically implementing a quantum
computer: analog and digital.
Analog approaches are further divided into
quantum simulation, quantum annealing, and
adiabatic quantum computation.
Digital quantum computers use
quantum logic gates to do computation. Both
approaches use quantum bits or qubits
Optical computing
What is nano computing?
• Nano computing describes computing that
uses extremely small, or nanoscale, devices.
• Advances in nano computing come from two sources:
• It will be integrated into existing products and technology
(disk drives, for example).
• Fundamentally new products, software, and architectures will
be developed.
• Nano computing will change the computer industry in
many ways:
• Existing technologies, like memory and backing storage,
will become even more plentiful than they already are.
• New technologies will be created to replace
obsolete machines.
• New standards and architectures will be needed to make
use of the new systems when they are created.
• All of this requires enormous effort and resources.
So why bother?
Types of nano computers
• Nano electronic computers
• Nano mechanical computers
• Nano chemical and biochemical computers
• Nano Quantum computers
Basic terms
• Quantum computer: a machine that performs
calculations based on the behavior of particles
at the sub-atomic level.
• Qubit: engineers have coined the term qubit
(pronounced KYEW-bit) to denote the
fundamental data unit in a quantum computer.
• Teleportation: dematerializing an object at one
point and sending the details of that object's
precise atomic configuration to another location,
where it is reconstructed.
• Quantum mechanics: a quantum computer relies
on quantum mechanics; it uses subatomic
particles, such as electrons and protons, to
solve problems.
Devices of nano computing
iPad, iPhone, HP Slate, Android devices like the ICD Gemini, etc.
How nano computing works?
• A nano computer would work by storing data in
the form of atomic quantum states or spin (SEM
and quantum dots).
• Several methods of nanoelectronic data storage
are currently being researched. Among the most
promising are single-electron transistors and
quantum dots.
• All of these devices function based upon the
principles of quantum mechanics.
Risks in nano computing
• Scientists need to develop new circuits to cope
with nanocircuits built from carbon nanotubes.
• The transistors will be 100 times smaller than
the thickness of a human hair.
• The ultracapacitors produce high heat, and to
date there is no remedy to cool them.
Applications of nano technologies
• Washing machines that inhibit bacterial
growth as they wash clothes.
• Fuel lines using nanomaterials, on the market
today in around 60% of cars.
• Refrigerators built with nanomaterials.
• Contact lenses that let you check your blood
sugar level by looking in a mirror.
Advantages
• High computing performance
• Low power computing
• Easily portable flexible
• Faster processing
• Lighter and small computer devices
• Noise Immunity: isolating the circuits from noise
from both inside and outside the circuit.
Disadvantages
• Possible loss of jobs in the traditional farming and
manufacturing industries.
• It is very expensive, and developing it can cost a lot of
money. It is also difficult to manufacture, which is
probably why products made with nanotechnology are
more expensive.
• Because these particles are very small, problems can
arise from inhaling them.
Future of nano computing
• National science and technology council
(USA) claim that
“Nano computing is an enabling
technology that will change the nature of
almost every human-made object in the
next century.”
Quantum Computing
The Next Generation of Computing Devices?
by Heiko Frost, Seth Herve and Daniel Matthews
What is a Quantum Computer?
Quantum Computer
A computer that uses quantum mechanical
phenomena, such as superposition and entanglement
(a correlation between particles such that each must
be described in reference to the others), to perform
operations on data.
Classical Computer (Binary)
A computer that uses voltages flowing through
circuits and gates, which can be described entirely
by classical mechanics.
The Need For Speed...
Classical Digital Computer
Moore’s Law: transistors on chip doubles every 18 months—
microprocessor circuits will measure on atomic scale by
2020-2030
Downscaling of circuit-board layouts and components is leading
to discrepancies:
Copper traces are crystallizing.
Quantum phenomena are emerging, such as electrons tunneling
through the barriers between wires.
Serial processing – one operation at a time.
A 64-bit classical computer operates at speeds measured in
gigaflops (billions of floating-point operations per second).
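The Moore's-law figure quoted above (transistor count doubling every 18 months) is a simple exponential, sketched below. The starting count of 1000 transistors is an illustrative assumption, not historical data.

```python
# Moore's law as stated in the notes: the transistor count on a chip
# doubles every 18 months.  The initial count is a made-up example.
def transistors(initial, months):
    """Projected transistor count after the given number of months."""
    return initial * 2 ** (months / 18)

print(transistors(1000, 18))   # -> 2000.0    (one doubling period)
print(transistors(1000, 180))  # -> 1024000.0 (15 years = 10 doublings)
```

Ten doublings multiply the count by 2^10 = 1024, which is why circuit features approach the atomic scale within a couple of decades.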
Quantum Computer
Harnesses the power of atoms and molecules to perform
memory and processing tasks
Parallel processing – millions of operations at a time.
A 30-qubit quantum computer equals the processing power of a
conventional computer running at 10 teraflops (trillions of
floating-point operations per second).
Classical vs Quantum Bits
Classical Bit
2 basic states – off or on: 0, 1
The states are mutually exclusive and not continuous in nature.
Quantum Bit (Qubit)
2 basic states – ket 0, ket 1: |0⟩, |1⟩
Pure qubit state: a superposition of both states, |ψ⟩ = a|0⟩ + b|1⟩,
where the amplitudes a and b satisfy |a|² + |b|² = 1.
Quantum entanglement: two or more objects must be described in
reference to one another. Entanglement is a non-local property that
allows a set of qubits to express superpositions of different binary
strings (01010 and 11111, for example) simultaneously; three
entangled qubits, for instance, can superpose 8 possible states.
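The pure-qubit state and its normalization constraint can be simulated classically. A minimal NumPy sketch, with function names of my own choosing: it enforces |a|² + |b|² = 1 and computes measurement probabilities by the Born rule.

```python
import numpy as np

# A classical simulation of the pure-qubit state |psi> = a|0> + b|1>.
def make_qubit(a, b):
    """Return a normalized 2-element state vector for amplitudes a, b."""
    state = np.array([a, b], dtype=complex)
    return state / np.linalg.norm(state)  # enforce |a|^2 + |b|^2 = 1

def measure_probabilities(state):
    """Born rule: probability of observing |0> or |1> on measurement."""
    return np.abs(state) ** 2

# Equal superposition: a = b = 1/sqrt(2), so each outcome has probability 1/2.
plus = make_qubit(1, 1)
probs = measure_probabilities(plus)
print(probs)  # -> [0.5 0.5]
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical simulation of quantum systems becomes intractable.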
Quantum Computing Power
Integer Factorization
It is practically infeasible for classical digital computers to
factor large numbers that are the products of two primes of
nearly equal size.
A quantum computer with 2n qubits can factor numbers of
length n bits (binary).
Quantum Database Search
Example: To search the entire Library of Congress
for one’s name given an unsorted database...
Classical Computer – 100 years
Quantum Computer – ½ second
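The search speedup above rests on Grover's algorithm needing on the order of √N oracle queries where a classical unsorted search needs on the order of N. A rough sketch; the database size of 10¹² records is a hypothetical stand-in for "the entire Library of Congress", not a figure from the original text.

```python
import math

# Query counts for searching one item in an unsorted database of n records.
def classical_queries(n):
    """Worst case for a classical scan: inspect every record."""
    return n

def grover_queries(n):
    """Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries."""
    return math.ceil((math.pi / 4) * math.sqrt(n))

n = 10**12  # hypothetical database size
print(classical_queries(n))  # -> 1000000000000
print(grover_queries(n))     # -> 785399
```

A millionfold reduction in queries is what turns a years-long scan into a fraction-of-a-second search in the example above.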
Practical Quantum Computer
Applications
Quantum Mechanics Simulations
physics, chemistry, materials science,
nanotechnology, biology and medicine.
A quantum computer could compute millions of variables at
once.
All of these fields are limited today by the slow speed of
quantum mechanical simulations on classical hardware.
Cryptanalysis
Capable of cracking extremely complicated codes
RSA encryption
Typically uses numbers with over 200 digits
Quantum Computing History
1973 - Alexander Holevo publishes paper showing that n qubits cannot
carry more than n classical bits of information.
1976 - Polish mathematical physicist Roman Ingarden shows that Shannon
information theory cannot directly be generalized to the quantum case.
1981 - Richard Feynman determines that it is impossible to efficiently
simulate the evolution of a quantum system on a classical computer.
1985 - David Deutsch of the University of Oxford, describes the first
universal quantum computer.
1993 - Dan Simon, at Université de Montréal, invents an oracle problem
for which a quantum computer would be exponentially faster than a
conventional computer. This algorithm introduced the main ideas that
were then developed in Peter Shor's factoring algorithm.
1994 - Peter Shor, at AT&T's Bell Labs discovers algorithm to allow
quantum computers to factor large integers quickly. Shor's algorithm
could theoretically break many of the cryptosystems in use today.
1995 - Shor proposes the first scheme for quantum error correction.
1996 - Lov Grover, at Bell Labs, invents the quantum database search algorithm.
1997 - David Cory, A.F. Fahmy, Timothy Havel, Neil Gershenfeld and Isaac
Chuang publish the first papers on quantum computers based on bulk spin
resonance, or thermal ensembles. Computers are actually a single, small
molecule, storing qubits in the spin of protons and neutrons. Trillions of
trillions of these can float in a cup of water.
1998 - First working 2-qubit NMR computer demonstrated at University of
California, Berkeley.
1999 - First working 3-qubit NMR computer demonstrated at IBM's
Almaden Research Center. First execution of Grover's algorithm.
2000 - First working 5-qubit NMR computer demonstrated at IBM's
Almaden Research Center.
2001 - First working 7-qubit NMR computer demonstrated at IBM's
Almaden Research Center.
First execution of Shor's algorithm. The number 15 was factored using
10^18 identical molecules, each containing 7 atoms.
Candidates for Quantum Computers
Superconductor-based quantum computers
(including SQUID-based quantum computers)
Ion trap-based quantum computers
"Nuclear magnetic resonance on molecules in solution"-based
"Quantum dot on surface"-based
"Laser acting on floating ions (in vacuum)"-based (ion trapping)
"Cavity quantum electrodynamics" (CQED)-based
Molecular magnet-based
Fullerene-based ESR quantum computer
Solid state NMR Kane quantum computer
Quantum Computing Problems
Current technology
A machine operating on roughly 40 qubits would be needed to rival
current classical equivalents.
Errors
Decoherence - the tendency of a quantum computer to
decay from a given quantum state into an incoherent
state as it interacts with the environment.
Interactions are unavoidable and induce a breakdown of the information
stored in the quantum computer, resulting in computation errors.
Error rates are typically proportional to the ratio of operating time
to decoherence time, so operations must be completed much more
quickly than the decoherence time.
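The stated rule of thumb (error rate roughly proportional to operating time divided by decoherence time) can be illustrated with a toy calculation. The gate and coherence times below are made-up assumptions for illustration, not measured values.

```python
# Toy model of the decoherence rule of thumb quoted above:
# error rate ~ (operating time) / (decoherence time).
def error_rate_estimate(op_time_ns, decoherence_time_ns):
    """Rough per-operation error estimate from the time ratio."""
    return op_time_ns / decoherence_time_ns

# Assume a 50 ns gate on hardware with a 100 us (100,000 ns) coherence time:
print(error_rate_estimate(50, 100_000))  # -> 0.0005
```

The ratio makes the engineering trade-off concrete: halving the gate time or doubling the coherence time each halves the estimated error rate.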
1. Discuss quantum, optical, and bio computing.
2. Will mobile computing play a dominant role in the future?
Discuss.