EX 3 - PLAN THE ARCHITECTURE FOR A REAL-TIME APPLICATION
AIM:
To design the architecture for a real-time application, considering factors such as scalability, performance, reliability, and maintainability.
Algorithm:
Here's a high-level guide to help you plan the architecture for your real-time application:
Define Requirements:
● Clearly define the functional and non-functional requirements of your real-time application.
● Identify the specific use cases and scenarios that require real-time processing.
System Components:
● Identify the major components of your system. This could include servers, databases, user
interfaces, external APIs, and more.
● Divide the system into smaller, manageable modules that can be developed, tested, and deployed
independently.
Scalability:
● Plan for scalability from the beginning. Consider how the system will handle an increase in load
and user activity.
● Use scalable infrastructure, such as cloud services, to easily adapt to changing demands.
Data Storage:
● Choose an appropriate database solution for real-time data storage and retrieval.
● Consider using in-memory databases or caching mechanisms to improve data access speed.
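The in-memory caching point can be sketched in Python: a minimal cache with a per-entry time-to-live. The `TTLCache` class, keys, and timings below are illustrative assumptions, not tied to any particular product:

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-entry time-to-live (illustrative sketch)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() > expires_at:
            # Entry has expired: evict it and fall back to the default.
            del self._store[key]
            return default
        return value

cache = TTLCache(ttl_seconds=0.1)
cache.put("user:42", {"name": "Asha"})
print(cache.get("user:42"))          # fresh read hits the cache
time.sleep(0.2)
print(cache.get("user:42", "MISS"))  # expired read falls through
```

A production system would typically use a dedicated store such as Redis or Memcached; the sketch only shows the access pattern.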
Real-time Processing:
● Decide on the technologies and frameworks for real-time processing. Options include streaming platforms and stream processors such as Apache Kafka and Apache Flink, or message brokers such as RabbitMQ.
● Implement an event-driven architecture to handle real-time events efficiently.
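The event-driven idea can be sketched as a minimal in-process publish/subscribe bus in Python; the `EventBus` class and the `order.created` event name are illustrative assumptions, not a real framework API:

```python
from collections import defaultdict

class EventBus:
    """Minimal synchronous publish/subscribe bus (event-driven sketch)."""

    def __init__(self):
        self._handlers = defaultdict(list)  # event name -> list of callbacks

    def subscribe(self, event, handler):
        self._handlers[event].append(handler)

    def publish(self, event, payload):
        # Deliver the payload to every handler registered for this event.
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order.created", lambda order: received.append(order["id"]))
bus.publish("order.created", {"id": 101, "amount": 49.95})
print(received)  # [101]
```

A real deployment would put a durable broker (Kafka, RabbitMQ) between publishers and subscribers; the control flow is the same.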
Communication:
● Establish efficient communication channels between different components of the system. APIs, message queues, and the WebSocket protocol are common choices for real-time communication.
● Ensure low latency and high throughput for communication between components.
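As a minimal stand-in for inter-component messaging, Python's standard-library `queue.Queue` can hand messages from a producer to a consumer thread; a real system would replace the in-process queue with a network broker:

```python
import queue
import threading

def consumer(q, results):
    # Drain messages until the producer signals completion with None.
    while True:
        msg = q.get()
        if msg is None:
            break
        results.append(msg)

q = queue.Queue()
results = []
t = threading.Thread(target=consumer, args=(q, results))
t.start()

for i in range(5):
    q.put(f"event-{i}")  # producer side: non-blocking hand-off
q.put(None)              # sentinel: no more messages
t.join()
print(results)
```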
Fault Tolerance:
● Design the system to be fault-tolerant. Use redundant components, implement backup and recovery strategies, and handle errors gracefully.
● Consider a microservices architecture to isolate failures and improve overall system resilience.
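Graceful error handling can be sketched as retry with exponential backoff; `call_with_retries` and the `flaky` operation below are illustrative names, not a library API:

```python
import time

def call_with_retries(operation, max_attempts=3, base_delay=0.01):
    """Retry a flaky operation with exponential backoff (illustrative sketch)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure to the caller
            time.sleep(base_delay * (2 ** (attempt - 1)))  # back off before retrying

attempts = {"count": 0}

def flaky():
    # Fails twice, then succeeds -- simulates a transient outage.
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("temporary failure")
    return "ok"

print(call_with_retries(flaky))  # "ok" after two retries
```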
Security:
● Prioritize security measures to protect real-time data and communication.
● Implement secure communication protocols, access controls, and encryption mechanisms.
Monitoring and Analytics:
● Incorporate monitoring tools to track the performance of your real-time application.
● Use analytics to gain insights into user behavior, system performance, and potential issues.
Testing:
● Develop a comprehensive testing strategy, including unit testing, integration testing, and
performance testing.
● Implement continuous integration and continuous deployment (CI/CD) pipelines to automate
testing and deployment processes.
Documentation:
● Document the architecture, APIs, and data flow to facilitate easier maintenance and future
development. Include clear documentation on how to troubleshoot and resolve common issues.
Compliance:
● Ensure that your real-time application complies with relevant regulations and standards, especially if it involves sensitive data.
ARCHITECTURE:
Explanation:
This reference architecture is generally applicable: data streams in from a variety of producers, typically delivered via Apache Kafka, Amazon Kinesis, or Azure Event Hubs, to tools that ingest it and deliver it to a range of data stores and analytics engines. Between source and destination, the data is prepared for consumption: it is normalized, PII is obfuscated, nested data is flattened, and data is filtered and joined from multiple sources.
RESULT:
Thus, the architecture for a real-time application was planned.
EX 7-CASE STUDY USING OLAP
AIM:
To perform a case study using OLAP.
Introduction:
OLAP:-
OLAP stands for "Online Analytical Processing." OLAP allows users to analyze database
information from multiple database systems at one time. While relational databases are considered to be
two-dimensional, OLAP data is multidimensional, meaning the information can be compared in many
different ways. For example, a company might compare its computer sales in June with sales in July,
and then compare those results with the sales from another location, which might be stored in a different
database. To process database information using OLAP, an OLAP server is required to organize
and compare the information. Clients can analyze different sets of data using functions built into the
OLAP server. Some popular OLAP server products include Oracle Express Server and Hyperion
Solutions Essbase.
Purpose of OLAP:-
An effective OLAP solution solves problems for both business users and IT departments. For
business users, it enables fast and intuitive access to centralized data and related calculations for the
purposes of analysis and reporting. For IT, an OLAP solution enhances a data warehouse or other
relational database with aggregate data and business calculations. In addition, by enabling business users
to do their own analyses and reporting, OLAP systems reduce demands on IT resources.
OLAP offers five key benefits:
● Business-focused multidimensional data
● Business-focused calculations
● Trustworthy data and calculations
● Speed-of-thought analysis
● Flexible, self-service reporting
OLAP operations:
These are used to analyze data in an OLAP cube. There are five basic operations:
Drill down
This makes the data more detailed by moving down the concept hierarchy or adding a new dimension. For
example, in a cube showing sales data by Quarter, drilling down would show sales data by Month.
Roll up
This makes the data less detailed by climbing up the concept hierarchy or reducing dimensions. For
example, in a cube showing sales data by City, rolling up would show sales data by Country.
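The drill-down/roll-up pair can be sketched in plain Python: starting from Month-level facts (the drilled-down view), summing along the Time hierarchy yields the Quarter-level (rolled-up) view. The sales figures are invented for illustration:

```python
from collections import defaultdict

# Sales at Month granularity (the drilled-down view).
monthly_sales = [
    ("Q1", "Jan", 100), ("Q1", "Feb", 120), ("Q1", "Mar", 90),
    ("Q2", "Apr", 110), ("Q2", "May", 130), ("Q2", "Jun", 140),
]

# Roll up: climb the Time hierarchy from Month to Quarter by summing.
quarterly_sales = defaultdict(int)
for quarter, month, amount in monthly_sales:
    quarterly_sales[quarter] += amount

print(dict(quarterly_sales))  # {'Q1': 310, 'Q2': 380}
```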
Dice
This selects a sub-cube by choosing two or more dimensions and criteria. For example, in a cube showing
sales data by Location, Time, and Item, dicing could select sales data for Delhi or Kolkata, in Q1 or Q2,
for Cars or Buses.
Slice
This selects a single dimension and creates a new sub-cube. For example, in a cube showing sales data by
Location, Time, and Item, slicing by Time would create a new sub-cube showing sales data for Q1.
Pivot
This rotates the current view to get a new representation. For example, after slicing by Time, pivoting could show the same data but with Location and Item as rows instead of columns.
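Slice and dice can be sketched as filters over a tiny fact table in Python; the locations, quarters, and items mirror the examples above, and the amounts are invented:

```python
# A tiny sales cube as (location, time, item, amount) facts.
cube = [
    ("Delhi",   "Q1", "Car", 50), ("Delhi",   "Q2", "Bus", 30),
    ("Kolkata", "Q1", "Bus", 40), ("Kolkata", "Q3", "Car", 60),
    ("Mumbai",  "Q1", "Car", 70),
]

# Slice: fix a single dimension (Time = Q1) to get a sub-cube.
q1_slice = [row for row in cube if row[1] == "Q1"]

# Dice: restrict several dimensions at once
# (Delhi or Kolkata, in Q1 or Q2, for Cars or Buses).
dice = [row for row in cube
        if row[0] in ("Delhi", "Kolkata")
        and row[1] in ("Q1", "Q2")
        and row[2] in ("Car", "Bus")]

print(q1_slice)
print(dice)
```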
RESULT:
Thus, the case study using OLAP was completed successfully.
EX 8- CASE STUDY USING OLTP
AIM:
To perform a case study using OLTP.
Introduction:
OLTP, or online transaction processing, refers to software that supports transaction-oriented applications, typically in a three-tier architecture. It facilitates and supports the execution of a large number of real-time transactions in a database.
OLTP monitors daily transactions and is typically done over an internet-based multi-access
environment. It handles query processing and, at the same time, ensures and protects data integrity. The efficacy of an OLTP system is measured by the number of transactions per second it can process. OLTP systems are optimized for transactional workloads and are hence suitable for most monetary transactions.
The defining characteristics of OLTP transactions are atomicity and concurrency. Concurrency
prevents multiple users from changing the same data simultaneously. Atomicity (or indivisibility) ensures
that all transactional steps are completed for the transaction to be successful; if one step fails or is
incomplete, the entire transaction fails.
Atomic statefulness means that committed database changes are permanent, which requires
transactions to complete successfully before their effects are made durable. OLTP systems enable
inserting, deleting, updating, and querying data in a database.
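Atomicity can be demonstrated with Python's standard-library `sqlite3`: a transfer that would overdraw an account raises an error inside the transaction, so both updates roll back together. The `accounts` schema and balances are invented for illustration:

```python
import sqlite3

# In-memory database standing in for an OLTP store (illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: both updates commit, or neither does."""
    try:
        with conn:  # the context manager commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE name = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")  # triggers rollback
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
    except ValueError:
        pass  # transaction was rolled back; balances are unchanged

transfer(conn, "alice", "bob", 500)  # fails: alice only has 100
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # balances untouched by the failed transfer
```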
OLTP system activities consist of gathering input data, processing the data, and updating it using
the data collected. OLTP is usually supported by a database management system (DBMS) and operates in
a client-server system. It also relies on advanced transaction management systems to facilitate multiple
concurrent updates.
OLTP Transaction Examples
OLTP systems facilitate many types of financial and non-financial transactions such as:
● Automated teller machines (ATMs)
● Online banking applications
● Online bookings for airline ticketing, hotel reservations, etc.
● Online and in-store credit card payment processing
● Order entry
● E-commerce and in-store purchases
● Password changes and sending text messages
OLTP systems are found in a broad spectrum of industries with a concentration in client-facing
environments.
OLTP Characteristics:
1. Short response time
OLTP systems maintain very short response times to be effective for users. For example, responses from
an ATM operation need to be quick to make the process effective, worthwhile, and convenient.
2. Process small transactions
OLTP systems support numerous small transactions, each touching a small amount of data, executed
simultaneously over the network. The workload can be a mixture of queries and Data Manipulation
Language (DML) statements, normally including insertions, deletions, updates, and related actions.
Response time measures the effectiveness of OLTP transactions, and millisecond responses are
becoming common.
3. Data maintenance operations
Data maintenance operations are data-intensive computational reporting and data update programs that run
alongside OLTP systems without interfering with user queries.
4. High-level transaction volume and multi-user access
OLTP systems are synonymous with a large number of users accessing the same data at the same time.
Online purchases of a popular or trending gadget such as an iPhone may involve an enormous number of
users all vying for the same product. The system is built to handle such situations expertly.
5. Very high concurrency
An OLTP environment experiences very high concurrency due to the large user population, small
transactions, and very short response times. However, data integrity is maintained by a concurrency
control algorithm, which prevents two or more users from altering the same data at the same time and
thereby prevents double bookings in online ticketing and double allocations in sales.
A mobile money transfer application is a good example where concurrency is very high as thousands of
users can be making transfers simultaneously on the platform at every time of the day.
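The double-booking scenario can be sketched with a lock guarding shared state; `SeatInventory` is an illustrative class, not a real booking API. Ten concurrent attempts to book the same seat yield exactly one success:

```python
import threading

class SeatInventory:
    """Guards a shared seat map with a lock so two users cannot book the same seat."""

    def __init__(self, seats):
        self._free = set(seats)
        self._lock = threading.Lock()

    def book(self, seat):
        with self._lock:  # only one booking attempt mutates the set at a time
            if seat in self._free:
                self._free.remove(seat)
                return True
            return False  # already taken: the double booking is rejected

inventory = SeatInventory(["12A"])
outcomes = []
threads = [threading.Thread(target=lambda: outcomes.append(inventory.book("12A")))
           for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(outcomes.count(True), "booking succeeded out of", len(outcomes), "attempts")
```

A real OLTP database achieves the same effect with row-level locks or optimistic concurrency control rather than an application-level mutex.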
6. Round-the-clock availability
OLTP systems often need to be available round the clock, 24/7, without interruption. A small period of
unavailability or offline operations can significantly impact a large number of people and an equally huge
transaction volume. Downtimes can also cause losses to organizations; e.g., downtime in an online
banking system has adverse consequences for the bank's bottom line. Therefore, an OLTP system
requires frequent, regular, and incremental backups.
7. Data usage patterns
OLTP systems experience periods of both high data usage and low data usage. Finance-related OLTP
systems typically see high data usage during month ends when financial obligations are settled.
8. Indexed data sets
Indexes are used to facilitate rapid query, search, and retrieval operations.
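The effect of an index can be observed with Python's standard-library `sqlite3` by inspecting the query plan; the `customers` table and index name are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO customers (email) VALUES (?)",
                 [(f"user{i}@example.com",) for i in range(1000)])

# An index on the lookup column lets point queries avoid a full table scan.
conn.execute("CREATE INDEX idx_customers_email ON customers (email)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM customers WHERE email = ?",
    ("user500@example.com",)).fetchall()
print(plan)  # the plan reports a search using idx_customers_email
```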
9. Normalized schema
OLTP systems utilize a fully normalized schema for database consistency.
10. Storage
OLTP stores data records for the past few days or about a week. It supports sophisticated data models and
tables.
OLTP System Architecture:
1. Business Strategy
The business strategy influences the design of the OLTP system. The strategy is formulated by senior
management and the board of directors.
2. Business Process
These are the processes executed by the OLTP system to accomplish the goals set by the business
strategy. The processes comprise a set of activities, tasks, and actions.
3. Product, Customer/Supplier, Transactions, Employees
The OLTP database contains information on products, transactions, employees, customers, and
suppliers.
4. Extract, Transform, Load (ETL) Process
The ETL process extracts data from the OLTP database and transforms it in a staging area, where the
data is cleansed and optimized for analysis. The transformed data is then loaded into the online
analytical processing (OLAP) database, which is synonymous with the data warehouse environment.
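The ETL flow can be sketched end-to-end with `sqlite3` standing in for both the OLTP source and the OLAP target; the `orders` schema, region values, and amounts are invented for illustration:

```python
import sqlite3

# OLTP source: raw order rows (illustrative schema).
oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "north", 10.0), (2, "north", 15.0),
                  (3, "south", 20.0), (4, None, 5.0)])

# Extract: pull the raw rows out of the OLTP store.
rows = oltp.execute("SELECT region, amount FROM orders").fetchall()

# Transform: cleanse (drop rows with a missing region) and aggregate by region.
totals = {}
for region, amount in rows:
    if region is None:
        continue  # data cleansing: skip incomplete records
    totals[region] = totals.get(region, 0.0) + amount

# Load: write the summarized facts into an OLAP-style summary table.
olap = sqlite3.connect(":memory:")
olap.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")
olap.executemany("INSERT INTO sales_by_region VALUES (?, ?)", totals.items())
print(sorted(olap.execute("SELECT * FROM sales_by_region")))
```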
5. Data Warehouse and Data Mart
Data warehouses are central repositories of integrated data from one or more disparate sources. A data
mart is an access layer of the data warehouse that is used to access specific/summarized information of a
unit or department.
6. Data Mining, Analytics, and Decision Making
The data stored in the data warehouse and data mart is used for analysis, data mining, and decision
making.
RESULT:
Thus, the case study using OLTP was completed successfully.