CC 5

Main Issues in Cloud Computing:

1. Security Issues:

These are problems related to keeping the data safe on the cloud.

• Data Integrity: If cloud data is not protected well, anyone may be able to access it. The cloud often does not separate sensitive data from regular data, which makes misuse easier.

• Data Theft: Vendors often rent servers instead of owning them, increasing the risk of hacking
and stealing data.

• Vendor Security: The security depends on the cloud vendor. If the vendor doesn’t ensure
strong security, the data is at risk.

• User-level Security: Even if a customer is blocked from some actions, they might still find
ways to misuse data.

• Information Security: Data can be stolen while moving between servers or while being
processed.

2. Data Issues:

These problems are related to storing, accessing, and managing data in the cloud.

• Data Loss: Data can be lost due to technical failures or legal problems.

• Data Location: Users don’t always know where their data is stored, which raises trust issues.

• Data Lock-in: Difficult to move data from one cloud provider to another due to different
formats or rules.

• Data Segregation: Data from different users is stored together, which may lead to leakage or
mixing.

• Data Confidentiality & Auditing: Cloud data is often stored in public environments,
increasing the chance of attack.

• Data Deletion: Complete deletion is not always possible, since copies of the data may persist in backups and replicas.

• Data Integrity Check: Ensuring that data is accurate and untampered is challenging.
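A common way to check that stored data is untampered is to record a cryptographic hash before upload and recompare it after download. A minimal sketch using Python's standard `hashlib` (the object names here are illustrative):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest that acts as a tamper-evident fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Record the digest before uploading the object to the cloud...
original = b"quarterly-report-v1"
stored_digest = fingerprint(original)

# ...and recompute it after download to detect tampering or corruption.
assert fingerprint(b"quarterly-report-v1") == stored_digest
assert fingerprint(b"quarterly-report-v2") != stored_digest
```

Any single-bit change in the data produces a completely different digest, so a mismatch reliably signals corruption or tampering.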

3. Performance Issues:

Problems affecting speed and smooth working of cloud services.

• Slow Access: Due to poor network connections or heavy traffic.

• App Hang-ups: Occurs when the server runs out of memory or processing power.

• Scalability Problems:

o Vertical Scaling: Adding more power (CPU, RAM) to the same server.

o Horizontal Scaling: Adding more servers.

Both can be hard to manage during high demand.

4. Energy-Related Issues:

These refer to the large power usage of cloud systems.

• High Power Use: Running big servers needs a lot of electricity.

• Cooling Problems: Improper cooling can reduce server life.

• Carbon Emission: High energy use leads to more pollution.

• Rising Energy Bills: Electricity costs are a big challenge for cloud providers.

5. Fault Tolerance Issues:

Fault tolerance means the system continues to work even if something goes wrong.

• System Failures: If systems crash, data may be lost or services may stop.

• Software Bugs: Some bugs may only appear during real use, not during testing. Cloud
systems must be ready to handle such failures.

What are the Cloud Security Challenges?

Cloud security challenges are the risks or threats that can affect the confidentiality, integrity, and
availability of your data stored on cloud platforms. These include:

1. Deployment Model Risks

Depending on the type of cloud used:

• Public Cloud: Data is more exposed to the public, so it has higher risk.

• Private Cloud: Safer, but expensive.

• Hybrid Cloud: Managing security in mixed environments can be tough.

2. Service Model Risks


Each cloud service model has its own risks:

• SaaS (Software as a Service): Users depend on the provider for all security.

• PaaS (Platform as a Service): Developers must secure the apps they build.

• IaaS (Infrastructure as a Service): Users are responsible for securing OS, data, and apps.

3. Network Security Issues

• Browser Security: Weak browsers can leak data.

• SQL Injection: Attackers inject malicious code through forms or URLs.

• Flooding Attack: Too many requests are sent to crash the system.

• XML Attacks: Data formats can be misused to corrupt the system.

• Data Deletion & Lock-in: Data may be deleted accidentally or locked by vendors.
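The SQL injection risk listed above is usually mitigated with parameterized queries, which treat user input as data rather than as SQL. A minimal sketch using Python's built-in `sqlite3` (table and payload are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "' OR '1'='1"  # classic injection payload typed into a web form

# Unsafe: string concatenation lets the payload rewrite the query,
# turning it into "... WHERE name = '' OR '1'='1'" (matches every row).
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: the ? placeholder binds the payload as a plain value, not SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)
).fetchall()

print(len(unsafe), len(safe))  # 1 0 — the unsafe query leaks every row
```

The same placeholder pattern applies to any database driver; only the placeholder syntax (`?`, `%s`, `:name`) varies.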

4. Other Security Threats

• Data Leakage: Sensitive data may leak due to poor security.

• Hacking / Malicious Attacks: Hackers may steal or damage data.

• Shared Resources: In cloud, many users share the same server, increasing risk.

• Weak Authentication: Poor passwords or access control can lead to unauthorized access.
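Weak authentication often comes down to storing passwords poorly. The standard defense is a slow, salted key-derivation function; a sketch using the standard library's PBKDF2 (function names here are illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash so a stolen credential store resists cracking."""
    salt = salt or os.urandom(16)          # unique salt defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, expected):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, expected)

salt, digest = hash_password("S3cure!pass")
assert verify_password("S3cure!pass", salt, digest)
assert not verify_password("password123", salt, digest)
```

The high iteration count deliberately makes each guess expensive for an attacker while staying fast enough for a single login.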

How is Data Secured at Various Stages in Cloud?

Cloud security protects your data at three main stages:

1. Data at Rest (Stored Data)

Data that is stored in the cloud.

• Encryption is used to protect stored data.

• Access Control ensures only authorized users can access the data.

• Backup & Recovery solutions help restore data if lost or corrupted.

2. Data in Transit (Moving Data)

Data that is being transferred between user and cloud or between cloud servers.

• SSL/TLS Protocols encrypt the data during transfer.

• Secure APIs and VPNs (Virtual Private Networks) are used.

• Firewalls block unwanted access.
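In Python, the TLS protection described above is applied through an `ssl` context; the default client context already enforces certificate validation and hostname checking. A minimal sketch (the host name is illustrative):

```python
import ssl

# A default client context enforces certificate validation and hostname
# checking, protecting data in transit against man-in-the-middle attacks.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Reject legacy protocol versions; TLS 1.2 is the usual modern floor.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# The context would then be used to wrap a socket, e.g.:
#   with socket.create_connection(("example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.com") as tls:
#           ...  # all bytes sent over `tls` are encrypted in transit
```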


3. Data in Use (Processing Data)

Data being processed by cloud applications.

• Authentication & Identity Management ensures only verified users access data.

• Monitoring & Logging tracks who accessed the data and when.

• Trusted Execution Environments (TEEs) protect sensitive operations.

Hurdles in Cloud Computing

Cloud computing has many benefits, but there are some major challenges or hurdles that affect its
growth and usage. These are:

1. Security and Privacy Issues

• Sensitive data is stored online, so there is always a risk of data breaches, hacking, and
misuse.

• Users may not fully trust cloud providers with their confidential data.
2. Downtime or Unavailability

• Cloud services depend on the internet. If the internet or cloud server goes down, services
are not accessible.

• Even big companies like AWS or Google Cloud sometimes face outages.

3. Limited Control Over Infrastructure

• Users do not own the hardware or data centers.

• They have less control over how data is stored, managed, or backed up.

4. Data Transfer and Bandwidth Costs

• Uploading and downloading large data to/from the cloud can be slow and expensive.

• Users may face latency (delay) issues in data access.

5. Data Loss or Leakage

• If a cloud server crashes or is attacked, data can be lost.

• Poor security settings can lead to unauthorized access and data leaks.

6. Vendor Lock-In

• Moving data from one cloud provider to another can be difficult.

• Each provider has different platforms, which may not be compatible.

7. Compliance and Legal Issues

• Different countries have different rules about where and how data can be stored (like GDPR
in Europe).

• Cloud providers must meet all the legal and compliance requirements.

8. Performance Issues

• If the application is not optimized or if cloud servers are overloaded, users may experience
slow performance.

• Shared resources (multi-tenant environments) can also lead to performance drops.


9. Integration with Existing Systems

• Integrating cloud solutions with current on-premise systems can be complex.

• Old systems may not support modern cloud platforms easily.

10. Lack of Expertise

• Many organizations don’t have skilled cloud professionals.

• Managing cloud infrastructure needs special knowledge.

Challenges in Cloud Computing

Even though cloud computing offers many benefits like flexibility, cost-saving, and scalability, it also
faces several challenges. These include:

1. Security and Privacy

• Storing sensitive data on third-party servers raises concerns.

• Risks include data breaches, cyberattacks, and insider threats.

• Ensuring encryption, authentication, and access control is critical.

2. Internet Dependency / Downtime

• Cloud services require a stable internet connection.

• If the internet or cloud provider is down, services become unavailable.

3. Data Loss or Leakage

• Accidental deletion, server crashes, or cyberattacks can lead to data loss.

• Improper backups or weak disaster recovery plans increase this risk.

4. Cost Management

• Cloud may appear cheaper, but costs can increase unexpectedly due to:

o High data transfer

o Scaling resources without control

o Hidden service charges


5. Vendor Lock-In

• Switching cloud providers is difficult due to:

o Incompatible platforms

o Data transfer complexity

o Technical and financial costs

6. Compliance and Legal Challenges

• Different countries have different data protection laws (like GDPR, HIPAA).

• Cloud providers must ensure legal compliance.

• Organizations need to ensure data is stored in approved regions.

7. Limited Control Over Infrastructure

• The cloud provider manages the infrastructure, not the user.

• Less visibility and control can create dependency on the provider.

8. Performance Issues

• Latency and speed issues may arise, especially in high-traffic or global applications.

• Shared resources may impact response times.

9. Integration with Legacy Systems

• Older systems may not easily integrate with modern cloud platforms.

• This increases cost and complexity during migration.

10. Lack of Skilled Professionals

• Managing cloud environments needs expertise in:

o DevOps

o Cloud architecture

o Security and automation tools

• Shortage of skilled cloud engineers is a common issue.


11. Service Reliability and Support

• Cloud service issues (like outages or bugs) may take time to resolve.

• Limited support from providers in basic plans.

Software as a Service (SaaS)

Definition:

Software as a Service (SaaS) is a cloud computing model where software applications are delivered
over the internet. Users can access the software via a web browser without installing or maintaining
it on their local machines.
How SaaS Works:

1. The service provider hosts the application and infrastructure.

2. Users access the software through the internet (no need to install).

3. The provider manages all updates, security, and maintenance.

4. Users pay based on usage (monthly/yearly subscription).

Advantages of SaaS:

• No installation needed.

• Accessible from anywhere.

• Cost-effective (no infrastructure required).

• Scalable (add/remove users easily).

• Reduced IT workload.
Disadvantages of SaaS:

• Internet-dependent.

• Less control over software.

• Data security concerns.

• Limited customization compared to self-hosted software.

Real-Life Analogy:

Think of SaaS like renting a car:

• You use the car (software) when you need it.

• The company takes care of maintenance, insurance, etc.

• You just pay and drive — no ownership or repairs needed.


Short Note on Driving Forces of SaaS

Software as a Service (SaaS) has become popular due to several key factors that support its rapid
growth and adoption. These driving forces include:

1. Low Cost

• SaaS reduces the need for expensive hardware and software installations.

• It follows a subscription model (pay-as-you-go), making it affordable for businesses of all sizes.

2. Easy Accessibility

• SaaS applications are accessible via the internet from anywhere, using any device.

• This is especially useful for remote work and mobile users.

3. Automatic Updates

• The service provider handles updates, patches, and maintenance.


• Users always get the latest features without extra effort.

4. Scalability

• SaaS platforms can easily scale up or down based on the user’s needs.

• Ideal for startups and growing businesses.

5. Faster Deployment

• SaaS apps are ready to use quickly, with no lengthy installation or setup process.

6. Security and Backup

• Cloud providers offer built-in data security, backups, and disaster recovery, which are hard to
manage in local systems.

7. Increasing Internet Penetration

• With better internet access worldwide, more users can rely on online applications like SaaS.

Question 5.10:

Explain standards used for resource accessing in cloud computing


OR
Identify NIST cloud computing reference architecture with a neat schematic diagram

Answer:

The question has two options:

Option 1: Standards used for resource accessing in cloud computing

Cloud computing relies on a set of standards to ensure interoperability, security, and scalability.
Common standards include:

1. Common Standards: Ensure systems work together, especially across different cloud
providers.

2. Open Cloud Consortium: Promotes open frameworks and data standards for cloud
computing.

3. Distributed Management Task Force (DMTF): Provides specifications like CIMI (Cloud
Infrastructure Management Interface) for resource management.
4. Standards for Applications: Define how apps can be deployed and managed (e.g., SOA -
Service-Oriented Architecture).

5. Standards for Developers: APIs and tools that developers use for cross-platform support.

6. Standards for Messaging: Enable communication between distributed components (e.g., AMQP, XMPP).

Option 2: NIST Cloud Computing Reference Architecture

This model, developed by NIST (National Institute of Standards and Technology), includes the
following components:

• Cloud Consumer: The end user of the cloud services.

• Cloud Provider: Offers cloud services (IaaS, PaaS, SaaS).

• Cloud Auditor: Evaluates cloud services for performance, security, and compliance.

• Cloud Broker: Manages service delivery and relationships between provider and consumer.

• Cloud Carrier: Provides connectivity and transport of cloud services.

The NIST reference diagram shows these five entities and their interactions (e.g., communication between consumer and provider via the broker and carrier).

Question 5.11:

Discuss the scope between provider and consumer of NIST cloud computing reference
architecture.

Answer:

This question is asking you to describe how responsibilities and operations are shared between cloud
providers and cloud consumers.

Key Points:

1. Cloud Provider Responsibilities:

o Manages cloud infrastructure.

o Offers services (IaaS, PaaS, SaaS).

o Ensures security, availability, and scalability.

o Signs Service Level Agreements (SLAs) with consumers.

2. Cloud Consumer Responsibilities:

o Uses the services based on SLA terms.

o Responsible for managing their own data and applications hosted on the cloud.
o May need to configure settings or manage virtual machines depending on service
type.

3. Scope of Interaction:

o SaaS: Consumer interacts with the application only.

o PaaS: Consumer manages applications and data.

o IaaS: Consumer manages the full virtual machine, OS, applications, etc.

4. Other Entities:

o Cloud Broker helps negotiate or manage services.

o Cloud Carrier ensures service delivery and network connectivity.

o Cloud Auditor evaluates security, performance, and compliance.

Characteristics of NIST Standards

1. Clear and Complete: NIST guidelines aim to cover every important aspect of a topic.

2. Developed Collaboratively: NIST works with experts from industry, academia, and government to produce its standards.

3. Voluntary but Widely Adopted: NIST guidance is not legally mandatory for most organizations, yet many follow it because it is practical and well regarded.

4. Adaptable: The guidelines can be tailored to organizations of any size and to different industries.

5. Risk-Focused: NIST emphasizes identifying likely problems and addressing them step by step.

6. Measurable: Progress can be assessed against NIST's defined steps and controls.

7. Technology-Neutral: NIST does not mandate specific tools or software; you choose what fits best.

8. Continuous Improvement: NIST encourages ongoing improvement of security and processes rather than a one-time effort.

For messaging in cloud computing, the suitable standards and protocols often depend on the use
case (like asynchronous messaging, real-time messaging, or integration). But here are some widely
accepted standards and protocols commonly used for messaging in cloud environments:

Common Messaging Standards for Cloud Computing

1. AMQP (Advanced Message Queuing Protocol)

o An open standard protocol for message-oriented middleware.

o Supports reliable, secure, and interoperable messaging.

o Used in platforms like RabbitMQ, Apache Qpid.

2. MQTT (MQ Telemetry Transport)

o Lightweight messaging protocol designed for low-bandwidth or unreliable networks.

o Popular in IoT and cloud messaging for devices.

3. JMS (Java Message Service)

o A Java API standard for sending messages between two or more clients.

o Used in Java-based cloud apps.

4. REST/HTTP-based Messaging

o Many cloud messaging services (like AWS SNS/SQS, Azure Service Bus) support REST
APIs for messaging.

o Simple and widely compatible.

5. WebSockets

o Enables real-time, full-duplex communication channels over a single TCP connection.

o Useful for real-time cloud messaging.


Which One to Choose?

• For reliable enterprise messaging: Use AMQP or JMS (if using Java).

• For IoT or lightweight devices: Use MQTT.

• For easy cloud integration with web apps: Use REST/HTTP APIs or WebSockets.
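All of these protocols share the same decoupled, asynchronous pattern: producers publish messages to a broker, and consumers pull them independently. That pattern can be illustrated in-process with Python's standard `queue` module (this is a semantic sketch, not a network protocol implementation):

```python
import queue
import threading

broker = queue.Queue()  # stands in for a message broker (e.g., an AMQP queue)

def consumer(results):
    while True:
        msg = broker.get()       # blocks until a message arrives
        if msg is None:          # sentinel value: shut down the consumer
            break
        results.append(msg.upper())
        broker.task_done()

results = []
worker = threading.Thread(target=consumer, args=(results,))
worker.start()

# Producers publish without knowing who (or when) will consume.
for text in ["order placed", "order shipped"]:
    broker.put(text)

broker.put(None)  # tell the consumer to stop
worker.join()
print(results)    # ['ORDER PLACED', 'ORDER SHIPPED']
```

A real broker adds durability, acknowledgements, and network transport on top of this basic queue semantics.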

Security standards in cloud computing are rules or guidelines that keep data safe when using cloud services. The main ones are:

1. ISO/IEC 27001

• This is an international standard for managing information security.

• It helps organizations keep their data safe by setting up security rules, processes, and checks.

• Think of it like a checklist to make sure a company protects information properly.

2. NIST SP 800-53

• Created by NIST (a U.S. government group), this gives a list of security controls (rules) to
protect computer systems and data.

• It’s detailed and helps organizations manage risks and secure their cloud systems.

3. CSA CCM (Cloud Security Alliance - Cloud Controls Matrix)

• Specifically made for cloud computing security.

• It lists important security controls and best practices that cloud providers and users should
follow.

• Helps to check if a cloud provider is trustworthy.

4. PCI DSS (Payment Card Industry Data Security Standard)

• This standard protects payment card information like credit card data.

• Important for any cloud service handling online payments.

5. HIPAA (Health Insurance Portability and Accountability Act)

• U.S. law for protecting health information.

• If cloud services store or process medical data, they must follow HIPAA rules.

6. FedRAMP (Federal Risk and Authorization Management Program)

• A U.S. government standard for cloud security.

• It sets requirements for cloud providers to work with government agencies securely.
Why these standards matter?

• They help protect data from hackers and mistakes.

• They build trust between cloud users and providers.

• They guide companies on how to keep cloud systems safe and follow laws.

End User Computing (EUC)

End User Computing refers to systems and tools that let non-technical users (like employees or
customers) create and manage applications or data without needing help from IT experts.

Examples include:

• Using Excel to analyze data

• Creating simple websites or forms

• Customizing dashboards or reports

It helps users solve problems quickly and increases productivity, but may also lead to security or data
issues if not managed properly.

On Cloud (Cloud Computing)

Cloud Computing means using the internet to access and store data or run software instead of using
a local computer or server.

For example:

• Storing files on Google Drive

• Running software like Zoom or Microsoft Office online

Benefits include:

• Access from anywhere

• No need for expensive hardware

• Easy to scale up or down

Short Note on Hadoop (Expanded)

Hadoop is an open-source software framework created by Apache to deal with big data – which
means very large and complex sets of data that are hard to manage using traditional tools.

Hadoop helps in both storing and processing such data efficiently across many computers connected
in a network.
How Hadoop Works

1. HDFS (Hadoop Distributed File System):

o This part is used to store data.

o It splits large files into smaller parts and stores them on multiple computers.

o This makes storage fast, reliable, and safe (even if one computer fails, data is not
lost).

2. MapReduce:

o This is used to process data.

o It divides a task into smaller parts, processes them in parallel on many computers,
and then combines the results.

o This saves time and handles huge data quickly.

Key Features of Hadoop

• Cost-effective: Works on normal computers, not expensive servers

• Scalable: You can add more computers as your data grows

• Flexible: Can handle all kinds of data – text, images, videos, etc.

• Fault-tolerant: Even if some computers crash, Hadoop keeps working without data loss

Uses of Hadoop

1. Big Data Analysis – Used by companies to analyze huge data sets (e.g., customer behavior,
trends).

2. Search Engines – Used by Google, Yahoo, etc., to index and search data quickly.

3. Social Media – Platforms like Facebook use it to analyze user activity and ads performance.

4. Healthcare – Helps in analyzing medical records and predicting diseases.

5. Banking & Finance – Detects fraud, risk management, and customer analysis.

6. Retail – Tracks buying patterns, manages inventory, and personalizes recommendations.


Cloud computing provides scalability and fault tolerance as follows:

1. Scalability in Cloud Computing


Scalability means the ability to increase or decrease computing resources (like storage, memory, or
processing power) based on the current need.

How the Cloud Provides Scalability:

• Auto-scaling: Cloud platforms like AWS, Azure, or Google Cloud automatically add more
resources when demand increases (like more users on a website).

• On-demand resources: You can add more servers, storage, or services instantly, without
buying new hardware.

• Pay-as-you-go: You only pay for what you use, making it affordable to scale up or down
anytime.

Example: An e-commerce site gets high traffic during a sale. Cloud automatically adds more
servers to handle it and removes them later.
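The auto-scaling decision described above boils down to a simple rule: size the fleet to the current load, within configured bounds. A toy sketch (the thresholds and function name are invented for illustration; real platforms use richer metrics):

```python
import math

def desired_servers(current, requests_per_server, target_per_server,
                    min_servers=2, max_servers=20):
    """Toy auto-scaling rule: size the fleet to the current total load."""
    total_load = current * requests_per_server
    needed = math.ceil(total_load / target_per_server)
    # Clamp to the configured floor and ceiling.
    return max(min_servers, min(max_servers, needed))

# Normal traffic: 4 servers at 100 req/s each, target 200 req/s per server.
assert desired_servers(4, 100, 200) == 2   # scale in
# Sale-day spike: the same fleet is suddenly handling 900 req/s each.
assert desired_servers(4, 900, 200) == 18  # scale out
```

Cloud platforms evaluate a rule like this continuously, so capacity follows demand without manual intervention.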

2. Fault Tolerance in Cloud Computing

Fault tolerance means the system keeps working even if some parts fail.

How the Cloud Provides Fault Tolerance:

• Redundancy: Cloud providers store copies of your data in multiple locations (data centers). If
one fails, another takes over.

• Load balancing: Workloads are spread across several servers, so if one server crashes, others
handle the work.

• Backups and recovery: Regular backups ensure data is safe and can be restored if there's a
problem.

Example: If one server in a cloud data center fails, traffic is redirected to another healthy server
without downtime.
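The redirect-on-failure behavior in the example can be sketched as a simple failover loop over replicas (server callables and names here are hypothetical stand-ins for network calls):

```python
def fetch_with_failover(servers, request):
    """Try each replica in turn; tolerate individual server failures."""
    errors = []
    for server in servers:
        try:
            return server(request)
        except ConnectionError as exc:
            errors.append(exc)  # record the failure, fall through to the next replica
    raise RuntimeError(f"all {len(servers)} replicas failed: {errors}")

def healthy(req):
    return f"response to {req}"

def crashed(req):
    raise ConnectionError("server down")

# The first replica is down, so traffic is redirected transparently.
assert fetch_with_failover([crashed, healthy], "GET /") == "response to GET /"
```

Production load balancers add health checks and timeouts so failed replicas are skipped before a request is even attempted.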

Features of Hadoop

1. Open Source

o Hadoop is free to use and anyone can modify it.

o It’s developed and maintained by the Apache Software Foundation.

2. Scalable

o Hadoop can easily grow from a few computers to thousands.

o You can add more machines as your data increases.

3. Fault Tolerant

o If one computer (node) fails, Hadoop automatically shifts work to other working
nodes.
o It keeps multiple copies of data to prevent loss.

4. Cost-Effective

o Runs on regular, low-cost hardware (not expensive servers).

o Saves money for companies managing large data.

5. High Speed (Parallel Processing)

o Hadoop splits data and processes it at the same time on different machines.

o This makes it much faster for big data tasks.

6. Flexible

o Can handle any type of data: structured (tables), semi-structured (XML), or unstructured (videos, images, text).

Modules of Hadoop

Hadoop has 4 main modules, each with a special job:

1. HDFS (Hadoop Distributed File System)

• It stores data across many computers.

• Breaks big files into smaller blocks and stores them on different machines.

• Keeps 3 copies of each block by default to prevent data loss.
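HDFS's split-and-replicate behavior can be sketched in a few lines (the block size, node names, and round-robin placement here are simplifications; HDFS defaults to 128 MB blocks and rack-aware placement):

```python
def split_into_blocks(data: bytes, block_size: int):
    """Split a file into fixed-size blocks, as HDFS does."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def place_replicas(blocks, datanodes, replication=3):
    """Assign each block to `replication` distinct DataNodes (toy round-robin)."""
    placement = {}
    for b in range(len(blocks)):
        placement[b] = [datanodes[(b + r) % len(datanodes)]
                        for r in range(replication)]
    return placement

blocks = split_into_blocks(b"x" * 1000, block_size=256)
print(len(blocks))   # 4 blocks: 256 + 256 + 256 + 232 bytes
placement = place_replicas(blocks, ["dn1", "dn2", "dn3", "dn4"])
print(placement[0])  # ['dn1', 'dn2', 'dn3'] — three replicas on distinct nodes
```

Because each block lives on three different nodes, losing any single DataNode never loses data.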

2. MapReduce

• It processes data in parallel (at the same time) across multiple machines.

• Has two parts:

o Map: Breaks the task into smaller parts

o Reduce: Combines results to give final output

3. YARN (Yet Another Resource Negotiator)

• It manages system resources (like CPU and memory).

• Decides which task runs where and when across the cluster.

4. Hadoop Common

• This is a set of shared tools and libraries used by all other modules.

• Helps everything in Hadoop work together smoothly.

Architecture of Hadoop (Simple Explanation)

Hadoop follows a Master-Slave Architecture. It mainly has two layers:


1. Storage Layer – HDFS (Hadoop Distributed File System)

• NameNode (Master):

o Manages the file system.

o Keeps track of where data is stored.

o Does not store actual data, only metadata (file names, block locations).

• DataNode (Slave):

o Stores the actual data in blocks.

o Sends data to NameNode regularly (heartbeat).

o If a DataNode fails, Hadoop can use copies stored on other nodes.

2. Processing Layer – MapReduce (or YARN)

• JobTracker / ResourceManager (Master):

o Assigns tasks to different nodes.

o Manages job scheduling and resource allocation.

• TaskTracker / NodeManager (Slave):

o Runs the tasks (Map and Reduce).

o Reports progress back to the master.

Note: In Hadoop 2.x and later, YARN replaces the JobTracker/TaskTracker model for resource management; MapReduce then runs as an application on top of YARN.
What is MapReduce?

MapReduce is a programming model used in Hadoop to process and analyze large data sets in a
parallel and distributed way. It divides a large task into smaller parts, processes them on different
machines, and then combines the results.
It has two main functions:

• Map: Processes input data and converts it into key-value pairs

• Reduce: Takes grouped data from the map phase and produces the final result

Phases of MapReduce

1. Input Split Phase

o The input data is divided into small parts called splits.

o Each split is given to a separate Map task.

2. Map Phase

o In this phase, the data is processed into key-value pairs.

o This helps in organizing data for further processing.

3. Shuffle and Sort Phase

o After mapping, data is shuffled and sorted so that all values with the same key are
grouped together.

o This prepares the data for the reduce phase.

4. Reduce Phase

o The grouped key-value pairs are processed to produce the final output.

o The reduce function summarizes the results.

5. Output Phase

o The final result is written to the Hadoop Distributed File System (HDFS).

Workflow of MapReduce

1. Input Data Splitting

o The large input dataset is divided into smaller chunks called input splits.

2. Mapping

o Each split is processed by a Map task that reads the data and converts it into key-
value pairs.

3. Shuffle and Sort

o The output from the Map tasks is shuffled to group all pairs with the same key
together.

o The data is also sorted by key.

4. Reducing
o The grouped data is passed to Reduce tasks, which process and combine the data to
produce summarized output.

5. Output

o The final results from the Reduce tasks are written back to the Hadoop Distributed
File System (HDFS).
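The workflow above can be sketched in pure Python with the classic word-count example (an in-process illustration of the phases, not a distributed implementation):

```python
from collections import defaultdict

def map_phase(split):
    """Map: emit a (word, 1) key-value pair for every word in the split."""
    return [(word, 1) for word in split.split()]

def shuffle_and_sort(mapped):
    """Shuffle: group all values that share a key; sort keys for the reducers."""
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)
    return dict(sorted(groups.items()))

def reduce_phase(groups):
    """Reduce: summarize each group into a final (word, count) result."""
    return {key: sum(values) for key, values in groups.items()}

splits = ["big data big cloud", "cloud data"]             # input splits
mapped = [pair for s in splits for pair in map_phase(s)]  # map tasks
counts = reduce_phase(shuffle_and_sort(mapped))
print(counts)  # {'big': 2, 'cloud': 2, 'data': 2}
```

In Hadoop, each phase runs on different machines in parallel and the shuffle moves data across the network, but the logic per key-value pair is exactly this.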

Features of MapReduce

• Scalability: Can process petabytes of data by distributing tasks over many nodes.

• Fault Tolerance: Automatically reruns failed tasks on other nodes.

• Parallel Processing: Processes data blocks simultaneously, speeding up computation.

• Simplicity: Programmers only write Map and Reduce functions; the framework handles the
rest.

• Cost-effective: Runs on commodity hardware instead of expensive servers.

• Flexibility: Can process different data types like text, images, or videos.

Google App Engine (GAE)

Google App Engine is a cloud platform by Google that lets developers build and host web
applications without worrying about managing servers. It automatically handles infrastructure,
scaling, and load balancing.

Features of Google App Engine

• Automatic Scaling: Apps scale up or down based on traffic automatically.

• Fully Managed: No need to manage servers or infrastructure.

• Supports Multiple Languages: Like Python, Java, Go, Node.js, and more.

• Built-in Security: Provides security features and integrates with Google Cloud security.

• Integrated Developer Tools: Easy deployment and debugging with Google Cloud tools.

• Versioning: Supports deploying multiple versions and easy rollbacks.
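An App Engine app is described declaratively in an `app.yaml` file; a minimal sketch assuming the Python 3 standard environment (runtime version and instance limit are illustrative):

```yaml
# Minimal App Engine configuration (Python 3 standard environment assumed).
runtime: python39          # language runtime chosen from the supported list

automatic_scaling:         # App Engine adds/removes instances with traffic
  max_instances: 5

handlers:
- url: /.*
  script: auto             # route all requests to the app's entry point
```

Deploying this with the `gcloud` tool hands all server management, scaling, and load balancing to the platform.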

Services by Google App Engine:

1. Data Store

o Stores data efficiently with queries and transactions.

o Supports multiple programming languages.


o Uses high-speed cache for faster access.

2. Google Accounts

o Lets apps use Google account login.

o Shares user info if allowed.

o Uses same account details across apps.

3. URL Fetch

o Apps can fetch data from internet APIs or web services.

4. Mail

o Sends emails using Google’s infrastructure.

Supported Environments for Google App Engine:

1. Java Runtime

o Use Java tools and APIs.

o Supports Java Servlet & other enterprise tech.

2. Python Runtime

o Run Python apps with its interpreter.

o Use Python tools and APIs for data and emails.
