CAF-3-QB

The document is a comprehensive guide published by the Institute of Chartered Accountants of Pakistan, focusing on data systems and risks. It covers various topics including types of data, data governance, data analytics, and emerging technologies, along with self-test questions and answers for each chapter. The publication aims to provide insights into managing data effectively while addressing associated risks and security concerns.


CAF - 3

DATA, SYSTEMS
AND RISKS
CAF 3 - DATA SYSTEMS AND RISKS

First edition published by


The Institute of Chartered Accountants of Pakistan
Chartered Accountants Avenue
Clifton
Karachi – 75600 Pakistan
Email: ipd@icap.org.pk
www.icap.org.pk

© The Institute of Chartered Accountants of Pakistan, July 2025

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any
form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, without the prior
permission in writing of the Institute of Chartered Accountants of Pakistan, or as expressly permitted by law, or under
the terms agreed with the appropriate reprographics rights organization.

You must not circulate this book in any other binding or cover and you must impose the same condition on any
acquirer.

Notice
The Institute of Chartered Accountants of Pakistan has made every effort to ensure that at the time of writing, the
contents of this study text are accurate, but neither the Institute of Chartered Accountants of Pakistan nor its directors
or employees shall be under any liability whatsoever for any inaccurate or misleading information this work could
contain.

Faculty Note
If you identify any errors, omissions, or ambiguities in the content, please inform us at ipd@icap.org.pk so that
corrections can be incorporated in future editions.

ii THE INSTITUTE OF CHARTERED ACCOUNTANTS OF PAKISTAN


QUESTION
BANK
TABLE OF CONTENTS

CHAPTER PAGE

Chapter 1 Types of Data and Sources 1

Chapter 2 Data Governance and Management 5

Chapter 3 Introduction to Data Analytics 9

Chapter 4 Big Data 13

Chapter 5 Database Management Systems 17

Chapter 6 Database Normalization & Data Warehousing 25

Chapter 7 IT Systems Architecture 31

Chapter 8 Enterprise Resource Planning Systems 37

Chapter 9 Emerging Technologies 41

Chapter 10 Artificial Intelligence & Automation 47

Chapter 11 Cloud Computing 53

Chapter 12 Blockchain and Fintech 57

Chapter 13 Impact of Digital Disruption on Business and Accountancy 61

Chapter 14 IT Risk Management & Security 65

Chapter 15 Cyber Security & Information Security Risks 69

Chapter 16 IT General Controls for Managing Risk 73

Chapter 17 ICT's Role in Risk Management 77

Annexures

Annexure A Control Objectives for Information and Related Technologies (COBIT) 81

Annexure B ISO/IEC 27001, 27002 & 27005 83

Annexure C Regulatory Guidelines by State Bank of Pakistan (SBP) 85

Annexure D Pakistan’s Legal Framework for Cybercrimes & Digital Security 87

LONG-FORM QUESTIONS 91

LONG-FORM ANSWERS 97



CHAPTER 1
TYPES OF DATA
& SOURCES

1. Which of the following best describes qualitative data?


a) Numerical data that can be measured
b) Non-numerical data that describes characteristics
c) Data properly organized in rows and columns
d) Data collected from external sources

2. Nominal data is characterized by:


a) Categories with a meaningful order
b) Numerical values that can be averaged
c) Categories without any order or ranking
d) Continuous measurements

3. Which of the following is an example of ordinal data?


a) Temperature in Celsius
b) Customer satisfaction ratings (e.g., satisfied, neutral, dissatisfied)
c) Blood type (A, B, AB, O)
d) Number of products sold

4. Quantitative data that can take any value within a range is called:
a) Discrete data
b) Nominal data
c) Continuous data
d) Ordinal data

5. Which of the following is an example of discrete data?


a) Weight of a person
b) Time taken to complete a task
c) Number of employees in a company
d) Height of a building

6. Structured data is best defined as:


a) Data without any predefined format
b) Data organized in a fixed schema (e.g., rows and columns)
c) Data that includes text, images, and videos
d) Data stored in JSON format


7. Which of the following is an example of unstructured data?


a) A spreadsheet of customer transactions
b) A database of employee records
c) A social media post with text and images
d) An XML file containing product details

8. Semi-structured data differs from unstructured data because it:


a) Has no organizational tags
b) Is stored only in relational databases
c) Contains some organizational markers (e.g., JSON, XML)
d) Cannot be processed by computers

9. Which of the following is a case study example of structured data?


a) Analyzing customer reviews on Twitter
b) Storing bank transaction records in a database
c) Extracting sentiment from emails
d) Processing HTML webpages

10. A JSON file is an example of:


a) Structured data
b) Unstructured data
c) Semi-structured data
d) Nominal data
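Question 10 turns on JSON carrying organizational markers (keys, nesting) without a rigid schema. A minimal Python sketch, with an invented customer record, shows why such a file is treated as semi-structured: fields can be addressed by name even though no fixed table layout exists.

```python
import json

# Hypothetical record for illustration only: the key/value tags act as
# organizational markers, but there is no predefined schema as in a table.
record = '{"customer": "C-101", "rating": "satisfied", "tags": ["retail", "online"]}'

data = json.loads(record)
print(data["customer"])  # individual fields are addressable despite the flexible structure
```

Contrast this with a plain text file (unstructured, no markers) or a relational table (structured, fixed schema).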

11. Which of the following is an example of an external data source?


a) Employee attendance records
b) Customer purchase history in a CRM system
c) Stock market indices from financial websites
d) Production logs from a manufacturing plant

12. Primary data is collected:


a) From pre-existing reports
b) Firsthand for a specific purpose (e.g., surveys)
c) Only from government databases
d) Through automated web scraping

13. A company using a market research report from a consulting firm is an example of:
a) Primary data
b) Secondary data
c) Internal data
d) Unstructured data


14. Which of the following is NOT an ethical consideration in data collection?


a) Privacy
b) Transparency
c) Data manipulation to favor results
d) Security

15. A social media company failing to inform users about how their data is used violates which ethical principle?
a) Fairness
b) Transparency
c) Security
d) Data accuracy

16. Ensuring that data collection does not lead to discrimination is related to:
a) Privacy
b) Fairness
c) Timeliness
d) Data structuring


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
1 1 b
1 2 c
1 3 b
1 4 c
1 5 c
1 6 b
1 7 c
1 8 c
1 9 b
1 10 c
1 11 c
1 12 b
1 13 b
1 14 c
1 15 b
1 16 b



CHAPTER 2
DATA GOVERNANCE AND
MANAGEMENT

1. What is the primary goal of data governance?


a) To increase data storage costs
b) To ensure data is accurate, secure, and compliant with regulations
c) To eliminate all internal data sources
d) To restrict access to data for all employees

2. Which of the following is a key component of data governance?


a) Data ownership and stewardship
b) Social media marketing
c) Employee vacation policies
d) Office interior design

3. Data governance operates at which three levels?


a) Local, national, and global
b) Strategic, tactical, and operational
c) Input, process, and output
d) Primary, secondary, and tertiary

4. Which activity is performed at the operational level of data governance?


a) Setting organizational vision for data use
b) Running daily data quality checks
c) Approving annual budgets
d) Negotiating cloud storage contracts

5. A hybrid storage solution combines:


a) On-premises and cloud storage
b) Paper files and digital scans
c) Audio and video data
d) Structured and unstructured data

6. Which emerging trend stores data closer to its source (e.g., IoT devices)?
a) Blockchain storage
b) Edge storage
c) Data lakes
d) Magnetic tape storage


7. Data lakes are best suited for:


a) Storing only structured data
b) Centralizing raw, unstructured, and structured data at scale
c) Replacing all relational databases
d) Managing employee payroll systems

8. Logical integrity in databases ensures:


a) Hardware is protected from physical damage
b) Data remains consistent through constraints (e.g., foreign keys)
c) All employees have different passwords
d) Data is stored only in the cloud
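The constraint idea behind question 8 can be shown directly. This sketch uses SQLite with invented table names; the FOREIGN KEY constraint is what preserves logical integrity by rejecting an order that refers to a non-existent customer.

```python
import sqlite3

# Illustrative tables only. SQLite enforces foreign keys per connection
# once the pragma below is enabled.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id))""")
conn.execute("INSERT INTO customers VALUES (1, 'Ayesha')")
conn.execute("INSERT INTO orders VALUES (10, 1)")       # valid: customer 1 exists

try:
    conn.execute("INSERT INTO orders VALUES (11, 99)")  # invalid: no customer 99
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the inconsistent row never enters the database
```

The rejected insert is exactly the kind of inconsistency logical integrity is meant to prevent.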

9. Multi-factor authentication (MFA) is used to:


a) Increase data storage costs
b) Verify user identity through multiple methods (e.g., password + SMS)
c) Strictly implement firewall protections
d) Save metadata

10. Zero Trust Architecture follows the principle of:


a) "Trust all employees"
b) "Never trust, always verify"
c) "Encrypt all data"
d) "Store all data in the cloud"

11. Which regulation in Pakistan governs financial institutions' data practices?


a) Prevention of Electronic Crimes Act
b) SBP’s Enterprise Technology Governance and Risk Management Framework
c) SECP’s Enterprise Technology Governance and Risk Management Framework
d) All of the above

12. AI and machine learning can improve data governance by:


a) Automating data quality checks and anomaly detection
b) Replacing all human data stewards
c) Reducing data storage costs
d) All of the above

13. A key driver for data governance is:


a) Reducing the need for data backups
b) Supporting self-service business intelligence (SSBI)
c) Eliminating all metadata
d) Using data for decision-making only


14. A key challenge in the successful implementation of data governance programs is:
a) Lack of alignment between data governance and business objectives
b) Inconsistent enforcement of data management policies across departments
c) Balancing the need for data standardization with the flexibility required for innovation
d) Insufficient investment in data governance technology and tools

15. Which is a best practice for implementing data governance?


a) Starting with a "big bang" organization-wide rollout
b) Avoiding small-scale pilot projects to ensure consistency
c) Securing management support and starting small
d) Streamlining the process by bypassing existing operational procedures

16. Data stewards are responsible for:


a) Designing and maintaining digital workspace infrastructure
b) Enforcing data policies and resolving quality issues
c) Managing social media accounts
d) All of the above

17. Metadata management helps with:


a) Hiding data from users
b) Tracking data lineage and improving discoverability
c) Deleting outdated files
d) Reducing encryption requirements


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
2 1 b
2 2 a
2 3 b
2 4 b
2 5 a
2 6 b
2 7 b
2 8 b
2 9 b
2 10 b
2 11 b
2 12 a
2 13 b
2 14 c
2 15 c
2 16 b
2 17 b



CHAPTER 3
INTRODUCTION TO
DATA ANALYTICS

1. What is the primary goal of data analytics?


a) To enhance data storage and infrastructure to support larger datasets
b) To convert raw data into actionable insights that inform strategic and operational decisions
c) To eliminate unnecessary historical data to streamline current processes
d) To ensure that data access is limited to technical teams for security and control purposes

2. Which phase of the Data Analytics Cycle involves handling missing values and correcting errors?
a) Data Collection
b) Data Cleaning
c) Data Modeling
d) Decision-Making

3. Descriptive analytics primarily answers which question?


a) "What will happen in the future?"
b) "Why did it happen?"
c) "What happened?"
d) "What should we do next?"

4. Diagnostic analytics uses which technique to identify root causes?


a) Time series forecasting
b) Drill-down analysis
c) Prescriptive algorithms
d) Data aggregation

5. Predictive analytics relies heavily on:


a) Summarizing past sales data
b) Machine learning models and historical trends
c) Creating pie charts
d) Detecting outliers from datasets

6. Which stage of analytics recommends optimal actions based on predictions?


a) Descriptive
b) Diagnostic
c) Predictive
d) Prescriptive


7. A retail company forecasting holiday sales using past trends is an example of:
a) Descriptive analytics
b) Diagnostic analytics
c) Predictive analytics
d) Prescriptive analytics

8. Prescriptive analytics might use which of the following techniques?


a) Calculating averages
b) Optimization algorithms
c) Removing duplicates
d) Data visualization

9. Which technique is used in descriptive analytics to visually represent data trends?


a) Regression analysis
b) Anomaly detection
c) Data visualization
d) Clustering

10. In healthcare, analyzing patient readmission rates to find root causes is an example of:
a) Descriptive analytics
b) Diagnostic analytics
c) Predictive analytics
d) Prescriptive analytics

11. A financial institution using credit scores to forecast loan defaults applies:
a) Descriptive analytics
b) Diagnostic analytics
c) Predictive analytics
d) Prescriptive analytics

12. A key challenge in data analytics is:


a) Abundance of high-quality data
b) Ensuring data accuracy and completeness
c) Lack of computational tools
d) All of the above

13. An airline adjusting ticket prices based on demand forecasts leverages:


a) Descriptive analytics
b) Diagnostic analytics
c) Predictive analytics
d) Prescriptive analytics


14. During data exploration, analysts typically:


a) Delete all outliers
b) Create summary statistics and visualizations
c) Implement prescriptive algorithms
d) Ignore seasonality in data
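The summary statistics in question 14 take only a few lines of Python. The sales figures and the two-standard-deviation outlier rule below are assumptions chosen purely to illustrate data exploration; note that exploration flags outliers for investigation rather than deleting them.

```python
import statistics

# Hypothetical monthly sales figures used to profile the data before modelling.
sales = [120, 135, 128, 510, 122, 130, 125]

mean = statistics.mean(sales)
median = statistics.median(sales)
stdev = statistics.stdev(sales)

# A deliberately crude exploration rule: flag values more than
# 2 sample standard deviations from the mean for further review.
outliers = [x for x in sales if abs(x - mean) > 2 * stdev]
print(round(mean, 2), median, outliers)
```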

15. Interpretation and Insight Generation involves:


a) Collecting raw data
b) Translating model results into actionable business insights
c) Building databases
d) Removing all metadata

16. Which analytics stage reduces human input while increasing decision complexity?
a) Descriptive
b) Diagnostic
c) Predictive
d) Prescriptive

17. A hospital predicting disease outbreaks using historical data applies:


a) Descriptive analytics
b) Diagnostic analytics
c) Predictive analytics
d) Prescriptive analytics


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
3 1 b
3 2 b
3 3 c
3 4 b
3 5 b
3 6 d
3 7 c
3 8 b
3 9 c
3 10 b
3 11 c
3 12 b
3 13 d
3 14 b
3 15 b
3 16 d
3 17 c



CHAPTER 4
BIG DATA

1. Which of the following best defines Big Data?


a) Datasets easily managed in spreadsheets
b) Massive volumes of structured/unstructured data generated at high velocity
c) Data collected only from social media
d) Historical records containing numeric data

2. The "5 Vs" of Big Data include all EXCEPT:


a) Volume
b) Velocity
c) Veracity
d) Visualization

3. Which characteristic of Big Data refers to the speed at which data is generated?
a) Variety
b) Velocity
c) Veracity
d) Value

4. Unstructured data includes:


a) Spreadsheets
b) Social media posts and videos
c) Relational databases
d) CSV files

5. Which is NOT a common source of Big Data?


a) IoT devices
b) Physical paper records
c) Transactional records
d) Machine-generated logs

6. Web scraping is primarily used to:


a) Extract data from websites
b) Encrypt sensitive data
c) Populate data on websites
d) Monitor server performance


7. Sensor data in agriculture helps farmers:


a) Optimize crop growth by monitoring soil conditions
b) Predict stock market trends
c) Design social media ads
d) Harvest crops using mechanical tools

8. Personalized product recommendations in e-commerce rely on:


a) Analyzing customer browsing/purchase history
b) Randomly selecting items
c) Ignoring user data
d) Using only demographic surveys

9. Predictive maintenance in manufacturing uses Big Data to:


a) Schedule machinery repairs before failures occur
b) Enhance employee productivity by ensuring attendance
c) Understand production logs
d) Limit employee access to tools

10. A major challenge of Big Data is:


a) Ensuring data quality and accuracy
b) Lack of data generation
c) Over-reliance on paper records
d) Limited storage needs

11. Which technology is designed to handle Big Data storage and processing?
a) Apache Hadoop
b) Microsoft Word
c) Paper filing systems
d) GPUs

12. Data privacy in Big Data requires:


a) Strong encryption and access controls
b) Sharing data on community cloud
c) Relying on edge-device storage
d) Storing data only in the cloud

13. Edge computing processes data:


a) Only in centralized cloud servers
b) Closer to the source (e.g., IoT devices)
c) Exclusively on dedicated centralized machines
d) After a 12-hour delay


14. NoSQL databases are used for:


a) Handling unstructured/semi-structured data
b) Replacing all spreadsheets
c) Handling Structured data
d) All of the above

15. During COVID-19, Big Data helped track the virus spread by analyzing:
a) Mobile tracking and social media data
b) Using anomaly detection models
c) Prescribing medicines in real time
d) All of the above

16. A bank using machine learning to detect fraud analyzes:


a) Real-time transaction patterns
b) Employee vacation schedules
c) Office computer access controls
d) Global trends in transaction volumes
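A rough sketch of the pattern analysis in question 16, using an invented rule (flag any amount above three times the rolling average of the customer's recent transactions). A real bank's fraud model would be far more sophisticated, but the principle of comparing each transaction against the recent pattern is the same.

```python
from collections import deque

# Rolling window of the customer's five most recent transaction amounts.
recent = deque(maxlen=5)

def check(amount, history):
    """Return True when the amount breaks the recent spending pattern."""
    suspicious = bool(history) and amount > 3 * (sum(history) / len(history))
    history.append(amount)
    return suspicious

# The fifth transaction (900) is far above the pattern set by the first four.
flags = [check(a, recent) for a in [40, 55, 50, 45, 900, 60]]
print(flags)
```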

17. Smart thermostats contribute to Big Data by:


a) Collecting temperature data for energy optimization
b) Printing weather forecasts
c) Allowing internet access
d) Understanding user preferences

18. Ethical concerns in Big Data include:


a) Privacy, bias, and transparency
b) Increasing paper usage
c) Reducing data storage costs
d) Performing real-time data analytics


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
4 1 b
4 2 d
4 3 b
4 4 b
4 5 b
4 6 a
4 7 a
4 8 a
4 9 a
4 10 a
4 11 a
4 12 a
4 13 b
4 14 a
4 15 a
4 16 a
4 17 a
4 18 a



CHAPTER 5
DATABASE
MANAGEMENT SYSTEMS

1. What is the primary function of a DBMS?


a) To create and manage spreadsheets
b) To store, retrieve, and manage structured/unstructured data efficiently
c) To design user interfaces
d) To replace all paper-based records

2. Which component of a DBMS ensures data integrity during transactions?


a) Query Processor
b) Transaction Manager
c) Storage Manager
d) Security Manager

3. The ACID property that ensures "all or nothing" execution of a transaction is:
a) Atomicity
b) Consistency
c) Isolation
d) Durability
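The "all or nothing" behaviour in question 3 can be seen in miniature with SQLite. In this sketch (account data and the CHECK limit are invented), the debit succeeds but the credit violates a constraint, so the whole transfer is rolled back and neither account changes.

```python
import sqlite3

# Illustrative accounts with an arbitrary balance ceiling of 1000.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE accounts (
    id INTEGER PRIMARY KEY,
    balance INTEGER CHECK (balance BETWEEN 0 AND 1000))""")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 500), (2, 900)])
conn.commit()

try:
    with conn:  # commits on success, rolls back the transaction on error
        conn.execute("UPDATE accounts SET balance = balance - 200 WHERE id = 1")  # succeeds
        conn.execute("UPDATE accounts SET balance = balance + 200 WHERE id = 2")  # 1100 breaks CHECK
except sqlite3.IntegrityError:
    pass  # atomicity: the earlier debit is undone along with the failed credit

balances = [row[1] for row in conn.execute("SELECT id, balance FROM accounts ORDER BY id")]
print(balances)  # both balances are unchanged
```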

4. Durability in ACID properties guarantees that:


a) Transactions are isolated from each other
b) Committed transactions survive system failures
c) Data is always consistent
d) Queries are optimized for speed

5. The three-level DBMS architecture includes all EXCEPT:


a) External Level
b) Conceptual Level
c) Physical Level
d) Operational Level

6. Logical Data Independence allows changes to the ______ without affecting applications.
a) Physical storage
b) Conceptual schema
c) User interfaces
d) Network configuration


7. In the external level of DBMS architecture, different users can have:


a) Identical views of the entire database
b) Customized views based on their roles
c) Direct access to physical storage
d) No access to data

8. Which data model organizes data in tables with rows and columns?
a) Hierarchical Model
b) Network Model
c) Relational Model
d) Object-Oriented Model

9. A parent-child relationship is a feature of the:


a) Relational Model
b) Hierarchical Model
c) Object-Oriented Model
d) Entity-Relationship Model

10. The Network Model supports:


a) Only one-to-one relationships
b) Only one-to-many relationships
c) Many-to-many relationships
d) No relationships between entities

11. In an ER diagram, a rectangle represents a(n):


a) Attribute
b) Entity
c) Relationship
d) Constraint

12. A "weak entity" depends on a ______ for its existence.


a) Derived attribute
b) Strong entity
c) Composite key
d) Multivalued attribute

13. A many-to-many relationship in an ER model is represented by:


a) A single line
b) A double line
c) A diamond with connecting lines
d) A dashed line


14. Which DBMS type uses SQL for querying?


a) Hierarchical DBMS
b) Network DBMS
c) Relational DBMS
d) Object-Oriented DBMS

15. An OODBMS stores data as:


a) Tables
b) Primary and Foreign Keys
c) Objects with methods
d) Graphs

16. IBM IMS is an example of a:


a) Relational DBMS
b) Hierarchical DBMS
c) Network DBMS
d) Object-Oriented DBMS

17. A key advantage of RDBMS is:


a) Rigid data structure
b) Support for complex ad-hoc queries using SQL
c) Ensuring data integrity
d) Ensuring scalability

18. Which DBMS type is best suited for multimedia applications?


a) Hierarchical DBMS
b) Network DBMS
c) Relational DBMS
d) Object-Oriented DBMS

19. A major challenge of DBMS is:


a) Low data redundancy
b) High implementation and maintenance costs
c) Limited data security
d) Inability to handle concurrent users

20. In the ER model, a "derived attribute" is:


a) A unique identifier
b) An attribute calculated from other attributes
c) A mandatory attribute
d) A multivalued attribute


21. Which ACID property ensures that concurrent transactions do not interfere with each other?
a) Atomicity
b) Consistency
c) Isolation
d) Durability

22. In the three-level architecture, the conceptual schema defines:


a) How data is viewed by end-users
b) The physical storage details
c) The logical structure of the entire database
d) Network connectivity settings

23. A composite attribute in the ER model can be:


a) Derived from other attributes
b) Broken down into smaller sub-attributes
c) Only single-valued
d) Uniquely identifying an entity

24. Which of the following is NOT a state in a transaction lifecycle?


a) Active
b) Partially Committed
c) Terminated
d) Indexed

25. The "Lost Update Problem" in concurrent transactions occurs when:


a) Two transactions read the same data without modifying it
b) A committed transaction is rolled back
c) Changes made by one transaction are overwritten by another
d) The system crashes during a transaction

26. To resolve the "Dirty Read Problem," DBMS uses:


a) Checkpoints
b) Locking protocols
c) Data compression
d) Attribute inheritance

27. Normalization is primarily used to:


a) Increase data redundancy
b) Improve query performance
c) Eliminate data anomalies
d) Reduce the number of tables


28. A relation is in 1NF if it contains no:


a) Functional dependencies
b) Repeating groups or multivalued attributes
c) Foreign keys
d) Derived attributes

29. Boyce-Codd Normal Form (BCNF) is stricter than 3NF because it eliminates:
a) Partial dependencies
b) Transitive dependencies
c) All non-trivial functional dependencies not derived from superkeys
d) Multivalued dependencies

30. In the Object-Oriented Data Model, encapsulation refers to:


a) Storing data in hierarchical trees
b) Combining data and methods into a single unit
c) Creating many-to-many relationships
d) Using SQL for queries

31. Which data model would best represent a file system directory structure?
a) Relational
b) Hierarchical
c) Network
d) Object-Oriented

32. A junction table is used in RDBMS to resolve:


a) One-to-one relationships
b) One-to-many relationships
c) Many-to-many relationships
d) Recursive relationships

33. Which SQL command is used to enforce referential integrity?


a) CREATE INDEX
b) FOREIGN KEY
c) GROUP BY
d) TRUNCATE

34. In a distributed DBMS, fragmentation refers to:


a) Breaking the database into smaller parts stored at different locations
b) Replicating the entire database at all locations
c) Converting relational tables to objects
d) Deleting corrupted data


35. Data dictionaries in DBMS store:


a) Actual user data
b) Metadata about the database structure
c) Backup copies of transactions
d) Network configuration details

36. NoSQL databases are preferred over RDBMS for:


a) Handling highly structured data
b) Applications requiring complex joins
c) Scalability with unstructured data
d) Enforcing strict ACID properties

37. Columnar databases optimize performance for:


a) Transaction processing systems (OLTP)
b) Analytical queries (OLAP)
c) Hierarchical data models
d) Object-oriented programming

38. CAP Theorem states that a distributed system cannot simultaneously guarantee:
a) Consistency, Availability, and Partition Tolerance
b) Cardinality, Atomicity, and Persistence
c) Concurrency, Accuracy, and Performance
d) Compression, Archiving, and Processing


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
5 1 b
5 2 b
5 3 a
5 4 b
5 5 d
5 6 b
5 7 b
5 8 c
5 9 b
5 10 c
5 11 b
5 12 b
5 13 c
5 14 c
5 15 c
5 16 b
5 17 b
5 18 d
5 19 b
5 20 b
5 21 c
5 22 c
5 23 b
5 24 d
5 25 c
5 26 b
5 27 c
5 28 b
5 29 c
5 30 b
5 31 b
5 32 c
5 33 b


5 34 a
5 35 b
5 36 c
5 37 b
5 38 a



CHAPTER 6
DATABASE NORMALIZATION
& DATA WAREHOUSING

1. The primary goal of database normalization is to:


a) Increase data redundancy
b) Minimize redundancy and ensure data integrity
c) Speed up transaction processing
d) Eliminate all primary keys

2. A table is in 1NF if it:


a) Has no repeating groups and all attributes are atomic
b) Contains partial dependencies
c) Allows multivalued attributes
d) Uses composite keys exclusively
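The atomicity requirement in question 2 can be illustrated by splitting a multivalued field. The employee records below are invented; after the split, every field holds a single value and the repeating group is gone, satisfying 1NF.

```python
# A table violating 1NF: "phones" holds multiple values in one field.
unnormalised = [
    {"emp_id": 1, "name": "Sara", "phones": "0300-111, 0321-222"},
    {"emp_id": 2, "name": "Bilal", "phones": "0345-333"},
]

# Split the multivalued attribute so each row holds exactly one phone number.
first_normal_form = [
    {"emp_id": row["emp_id"], "name": row["name"], "phone": p.strip()}
    for row in unnormalised
    for p in row["phones"].split(",")
]
print(first_normal_form)  # three atomic rows instead of two non-atomic ones
```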

3. To achieve 2NF, a table must first satisfy:


a) 3NF
b) BCNF
c) 1NF
d) No normal forms

4. Partial dependencies are eliminated in:


a) 1NF
b) 2NF
c) 3NF
d) BCNF

5. In 3NF, a table must not have:


a) Atomic values
b) Transitive dependencies
c) Foreign keys
d) Composite keys

6. A data warehouse is characterized as:


a) Volatile and transaction-focused
b) Subject-oriented, integrated, time-variant, and non-volatile
c) Optimized for real-time updates
d) Exclusively for OLTP systems


7. Which is NOT a feature of a data warehouse?


a) Time-variant data
b) High normalization
c) Non-volatility
d) Subject-oriented design

8. Data warehouses are primarily used for:


a) Processing daily transactions
b) Historical data analysis and business intelligence
c) Real-time data insertion
d) Managing OLTP systems

9. OLTP systems are optimized for:


a) Complex analytical queries
b) Real-time transaction processing
c) Storing historical data
d) Data mining

10. Which statement is true about OLAP?


a) It handles high-volume, short transactions
b) It uses highly normalized databases
c) It supports complex queries for decision-making
d) It prioritizes data insertion speed over analysis

11. A key difference between OLTP and data warehouses is that OLTP systems:
a) Store historical data
b) Use denormalized schemas
c) Prioritize data integrity and concurrency
d) Are read-only

12. The "Transform" phase in ETL involves:


a) Extracting raw data from sources
b) Moving data into the target warehouse
c) Cleaning, standardizing, and enriching data
d) Deleting all complex records
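A small Python sketch of the Transform phase from question 12, with invented source systems and field names: records are cleaned (trimmed, recased) and dates standardised to a single format before loading into the warehouse.

```python
from datetime import datetime

# Raw records from two hypothetical source systems with different conventions.
raw = [
    {"name": "  ali KHAN ", "joined": "01/07/2025"},   # source A: DD/MM/YYYY
    {"name": "Zara Ahmed",  "joined": "2025-07-15"},   # source B: ISO format
]

def transform(rec):
    # Standardise both date formats to ISO and tidy the name field.
    fmt = "%d/%m/%Y" if "/" in rec["joined"] else "%Y-%m-%d"
    return {
        "name": rec["name"].strip().title(),
        "joined": datetime.strptime(rec["joined"], fmt).date().isoformat(),
    }

clean = [transform(r) for r in raw]
print(clean)
```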

13. Incremental loading in ETL refers to:


a) Populating the warehouse with all historical data at once
b) Adding only new or updated records
c) Using star schemas exclusively
d) All of the above


14. Which is a challenge in the "Extract" phase?


a) Handling heterogeneous data formats
b) Handling homogeneous data formats
c) Normalizing dimension tables
d) Writing SQL queries

15. A star schema consists of:


a) Multiple fact tables sharing dimensions
b) A single fact table linked to denormalized dimension tables
c) Fully normalized dimension tables
d) No fact tables
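The star schema of question 15 can be built in a few lines of SQLite (table names and figures are invented): a single fact table links to a denormalised dimension table, which keeps typical analytical queries down to one join.

```python
import sqlite3

# Minimal star schema: fact_sales at the centre, dim_product as a dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount INTEGER);
INSERT INTO dim_product VALUES (1, 'Ledger Pro', 'Software'), (2, 'Desk', 'Furniture');
INSERT INTO fact_sales VALUES (101, 1, 500), (102, 1, 300), (103, 2, 200);
""")

# A typical analytical query: total sales by category via a single join.
totals = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d ON f.product_id = d.product_id
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(totals)
```

In a snowflake schema, `category` would sit in its own normalised sub-table, adding a second join to the same query.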

16. Snowflake schema differs from star schema in that it:


a) Denormalizes all dimension tables
b) Normalizes dimension tables into sub-tables
c) Eliminates fact tables
d) Uses only one dimension table

17. Galaxy schema is used when:


a) A single business process is analyzed
b) Multiple fact tables share dimension tables
c) No transformations are needed
d) Data is stored in CSV files

18. Which schema offers the fastest query performance for simple analytics?
a) Star schema
b) Snowflake schema
c) Galaxy schema
d) Relational schema

19. A dependent data mart is:


a) Created directly from source systems
b) A subset of a data warehouse
c) Independent of ETL processes
d) Used only for OLTP

20. Data marts are beneficial because they:


a) Increase redundancy
b) Provide department-specific data access
c) Replace the need for a data warehouse
d) Slow down query performance


21. Normalization trade-offs include:


a) Faster queries due to fewer joins
b) Increased storage efficiency but slower inserts
c) Simplified tables but more redundancy
d) Eliminating all foreign keys

22. A normalized dimension table is typical in a:


a) Star schema
b) Snowflake schema
c) Galaxy schema
d) Flat file

23. Which is true about OLTP systems?


a) They use denormalized databases
b) They prioritize complex analytical queries
c) They enforce ACID properties
d) All of the above

24. The staging area in ETL is used to:


a) Present data to end-users
b) Temporarily store and clean data before loading
c) Replace the data warehouse
d) Bypass transformation


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
6 1 b
6 2 a
6 3 c
6 4 b
6 5 b
6 6 b
6 7 b
6 8 b
6 9 b
6 10 c
6 11 c
6 12 c
6 13 b
6 14 a
6 15 b
6 16 b
6 17 b
6 18 a
6 19 b
6 20 b
6 21 b
6 22 b
6 23 c
6 24 b



CHAPTER 7
INFORMATION SYSTEMS
ARCHITECTURE

1. The primary goal of IT systems architecture is to:


a) Minimize hardware costs
b) Align technology with business goals for efficiency and scalability
c) Eliminate all software dependencies
d) Use only on-premises infrastructure

2. Which component of IT architecture manages requests from clients and provides resources like data storage?
a) Peripheral devices
b) Servers
c) Middleware
d) User interfaces

3. Middleware in IT systems primarily enables:


a) Communication between disparate applications
b) Hardware Optimization
c) User interface design
d) Data redundancy

4. The operating system layer acts as an intermediary between:


a) Hardware and application software
b) Networks and storage
c) Users and databases
d) Cloud and edge computing

5. Which layer contains business logic and user functionality?


a) Hardware layer
b) Middleware layer
c) Application layer
d) User interface layer

6. A graphical user interface (GUI) is part of the:


a) Hardware layer
b) User interface layer
c) Middleware layer
d) Storage layer


7. Horizontal scaling involves:


a) Upgrading a single server’s CPU and RAM
b) Adding more machines to distribute workload
c) Maximum hardware capacity of a single machine
d) All of the above
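
The idea of horizontal scaling can be sketched as a load balancer rotating requests across a pool of machines; the server names are hypothetical.

```python
from itertools import cycle

servers = ["server-1", "server-2", "server-3"]  # hypothetical pool of machines
balancer = cycle(servers)  # simple round-robin distribution

# Six incoming requests are spread evenly across the pool.
assignments = [next(balancer) for _ in range(6)]
print(assignments)
```

Adding a fourth machine to `servers` raises capacity without touching the existing ones — scaling *out* rather than *up*.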

8. Vertical scaling is limited by:


a) Network latency
b) Maximum hardware capacity of a single machine
c) Adding more machines to distribute workload
d) Cloud service availability

9. A streaming service like Netflix uses horizontal scaling to:


a) Reduce subscription costs
b) Handle traffic spikes by adding servers
c) Eliminate all middleware
d) Standardize user interfaces

10. Modular design in IT systems allows:


a) Tight coupling of all components
b) Independent updates of system components
c) Enforcement of all security measures
d) Centralized control of all hardware

11. "Security by design" integrates security measures:


a) Only at the application layer
b) At every layer of the architecture
c) Exclusively for cloud systems
d) After system deployment

12. In IT infrastructure design, redundancy is a critical principle. Which of the following best describes its primary
purpose while acknowledging inherent trade-offs?
a) Maximizing data duplication across all storage tiers to ensure recoverability, even at the cost of storage
efficiency.
b) Maintaining fault-tolerant service delivery through failover mechanisms, ensuring minimal disruption
during hardware/software failures.
c) Removing the need for traditional backups by relying solely on distributed consensus protocols (e.g., Paxos,
Raft).
d) Prioritizing latency-sensitive workloads by reducing redundant network hops, even if availability
guarantees are relaxed.

13. Cloud-native architectures leverage:


a) Physical servers only
b) Kubernetes and containerization
c) Single monolithic applications
d) Legacy mainframes


14. Edge computing is beneficial for:


a) Centralized data processing in data centers
b) Real-time data processing closer to the source
c) Real-time data processing closer to the Disaster recovery site
d) Reducing hardware costs

15. Zero Trust Security requires:


a) Trusting all internal users by default
b) Continuous verification of all access requests
c) Single factor authentication for remote employees
d) Disabling all encryption during system upgrades

16. Python is widely used in:


a) Low-level system programming
b) Data science and machine learning
c) Operating system development
d) Replacing all scripting languages

17. JavaScript is essential for:


a) Backend database management
b) Adding interactivity to websites
c) Writing operating systems
d) Replacing all high-level languages

18. SQL is a domain-specific language for:


a) Managing relational databases
b) Developing Artificial Intelligence models
c) Building user interfaces
d) Automating network configurations

19. Functional programming emphasizes:


a) Imperative loops and mutable data
b) Pure functions and immutable data
c) Tight coupling of components
d) Hardware-specific optimizations
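
Functional programming in miniature, with made-up figures: pure functions (output depends only on inputs) and immutable data (a tuple cannot be mutated, so each step returns a new value).

```python
from functools import reduce

prices = (100.0, 250.0, 40.0)  # immutable tuple of amounts

def with_tax(amount, rate=0.15):
    """Pure function: no shared state is read or modified."""
    return amount * (1 + rate)

taxed = tuple(map(with_tax, prices))              # new tuple, original untouched
total = reduce(lambda a, b: a + b, taxed, 0.0)    # fold instead of a mutable loop
print(round(total, 2))  # 448.5
```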

20. Low-code/no-code platforms are popular because they:


a) Require deep programming expertise
b) Enable rapid application development with minimal coding
c) Make app development easy without involving any tool
d) Only support legacy systems


21. A healthcare provider encrypting patient data end-to-end is an example of:


a) Security by design
b) Horizontal scaling
c) Legacy system maintenance
d) Eliminating middleware

22. An e-commerce platform using auto-scaling during Black Friday demonstrates:


a) Vertical scaling
b) Cloud-based horizontal scalability
c) Horizontal scalability
d) Hardware Upgrades

23. A financial institution using multi-factor authentication (MFA) implements:


a) Zero Trust Security
b) Only physical security
c) No redundancy
d) Monolithic architecture

24. Compared to vertical scaling, horizontal scaling:


a) Is limited by single-machine hardware
b) Distributes workloads across multiple machines
c) Is cheaper for all use cases
d) All of the above


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
7 1 b
7 2 b
7 3 a
7 4 a
7 5 c
7 6 b
7 7 b
7 8 b
7 9 b
7 10 b
7 11 b
7 12 b
7 13 b
7 14 b
7 15 b
7 16 b
7 17 b
7 18 a
7 19 b
7 20 b
7 21 a
7 22 b
7 23 a
7 24 b



CHAPTER 8
ENTERPRISE RESOURCE
PLANNING SYSTEMS

1. What is the primary purpose of an ERP system?


a) To manage all financial transactions
b) To integrate core business functions into a single platform
c) To replace all human employees with automation
d) To eliminate the need for remote connectivity

2. Which of the following is NOT a core module of an ERP system?


a) Financial Management
b) Human Resource Management (HRM)
c) Social Media Marketing
d) Supply Chain Management (SCM)

3. ERP systems evolved from which earlier system in the 1960s?


a) Customer Relationship Management (CRM)
b) Material Requirements Planning (MRP)
c) Enterprise Resource Networking (ERN)
d) Cloud Computing

4. Which ERP module handles general ledger, accounts payable, and financial reporting?
a) HRM
b) Financial Management
c) CRM
d) Project Management

5. The HRM module in an ERP system is responsible for:


a) Tracking inventory levels
b) Managing payroll and employee records
c) Automating customer support tickets
d) Optimizing production schedules

6. A retail company uses the SCM module to:


a) Process employee salaries
b) Manage procurement and inventory
c) Design marketing campaigns
d) Develop supply chain apps


7. The first phase of ERP implementation is:


a) System procurement
b) Planning and Requirement Analysis
c) Data Migration
d) System Testing

8. Data Migration in ERP implementation involves:


a) Training employees on the new system
b) Transferring data from legacy systems to the ERP
c) Customizing the user interface
d) Decommissioning old system

9. On-premise ERP systems are characterized by:


a) Hosting on vendor-managed cloud servers
b) Full control over data and infrastructure
c) Private cloud
d) Lower upfront costs

10. Cloud ERP systems are advantageous because they:


a) Require low running costs
b) Offer scalability and lower upfront costs
c) Cannot be accessed remotely
d) Exclusively support large enterprises

11. A hybrid ERP model combines:


a) Only on-premise solutions
b) Only cloud solutions
c) Both on-premise and cloud solutions
d) Legacy systems with integration

12. A major challenge in ERP implementation is:


a) Resistance to change among employees
b) Immediate cost savings
c) Lack of need for data migration
d) Simplified legacy systems

13. Excessive customization in ERP systems can lead to:


a) Lower implementation costs
b) Increased complexity and higher costs
c) Increased complexity and lower costs
d) Excessive need for training


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
8 1 b
8 2 c
8 3 b
8 4 b
8 5 b
8 6 b
8 7 b
8 8 b
8 9 b
8 10 b
8 11 c
8 12 a
8 13 b



CHAPTER 9
EMERGING
TECHNOLOGIES

1. What enables AI systems to perform tasks like decision-making and language understanding?
a) Blockchain
b) Autonomous decision-making
c) 5G networks
d) Edge computing

2. Which subset of AI focuses on learning from data patterns?


a) IoT
b) Machine Learning (ML)
c) Quantum computing
d) RPA

3. In healthcare, AI is primarily used for:


a) Managing hospital finances
b) Diagnosing diseases and analyzing medical images
c) Drug manufacturing
d) Patient transportation

4. Which AI technique enables virtual assistants like Siri to understand speech?


a) Neural networks
b) Natural Language Processing (NLP)
c) Computer vision
d) Reinforcement learning

5. Supervised learning in ML is used to:


a) Mine cryptocurrencies
b) Predict outcomes from labeled data
c) Optimize 5G networks
d) Generate random outputs

6. What component adjusts physical conditions (e.g., temperature) in IoT systems?


a) Sensors
b) Actuators
c) Cloud servers
d) 5G modems


7. In smart cities, IoT traffic sensors primarily:


a) Optimize traffic flow in real time
b) Replace traffic police requirement
c) Generate cryptocurrency
d) Provide accurate weather forecasts

8. A major IoT challenge is:


a) Low data generation
b) Security and privacy risks
c) Lack of wireless protocols
d) Data modelling and interpretation

9. Blockchain transactions are secure due to:


a) Centralized servers
b) Immutability and cryptographic hashing
c) High latency
d) Manual verification

10. Which is NOT a blockchain application?


a) Bitcoin
b) Smart contracts
c) Autonomous vehicle navigation
d) Supply chain tracking

11. Consensus in blockchain is achieved via:


a) Proof of Work (PoW)
b) 5G networks
c) RPA bots
d) Edge computing

12. A smart contract is:


a) A paper contract
b) Self-executing code on blockchain
c) An AI chatbot
d) A VR simulation

13. 5G's ultra-low latency is critical for:


a) Remote surgery
b) Cryptocurrency mining
c) Printing documents
d) RPA


14. Massive Machine-Type Communications (mMTC) supports:


a) Millions of IoT devices
b) Only autonomous vehicles
c) 8K video streaming
d) Quantum computing

15. 5G improves upon 4G in:


a) Speed, latency, and device capacity
b) Cryptocurrency mining
c) Centralized data processing
d) Speed, latency, and device battery

16. Which industry benefits least from 5G?


a) Healthcare
b) Traditional agriculture
c) Autonomous vehicles
d) AR/VR gaming

17. AR differs from VR by:


a) Overlaying digital elements on the virtual world
b) Creating fully immersive environments
c) Using only haptic feedback
d) Overlaying digital elements on the real world

18. VR is used in healthcare for:


a) Replacing need for doctors
b) Surgical training and PTSD therapy
c) Drug manufacturing and testing
d) All of the above

19. Which is a VR hardware requirement?


a) 5G-only connectivity
b) Headset and controllers
c) Blockchain wallet
d) RPA bots

20. AR/VR's biggest challenge is:


a) High hardware costs
b) Lack of real-world use
c) Incompatibility with AI
d) Slow data speeds


21. Qubits differ from classical bits by:


a) Being slower
b) Using only binary states
c) Existing in superposition (0 and 1 simultaneously)
d) Requiring more energy
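
The superposition idea can be sketched numerically: a qubit state a|0⟩ + b|1⟩ holds two complex amplitudes, and measurement probabilities are their squared magnitudes. This is only the arithmetic, not a quantum simulation.

```python
import math

# Equal superposition: amplitudes a = b = 1/sqrt(2), so |a|^2 + |b|^2 = 1.
a = b = 1 / math.sqrt(2)

p0, p1 = abs(a) ** 2, abs(b) ** 2  # Born rule: probabilities of measuring 0 or 1
print(round(p0, 2), round(p1, 2))  # 0.5 0.5 — both outcomes equally likely
```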

22. Quantum supremacy refers to:


a) Ethical quantum use
b) Quantum computers outperforming classical ones
c) 5G networks
d) A new cryptocurrency

23. A problem quantum computing CANNOT solve soon:


a) Route optimization
b) Breaking RSA encryption
c) 100% accurate weather prediction
d) Molecular simulation

24. Quantum computing's biggest hurdle:


a) Too many qubits
b) Lack of algorithms
c) Error rates and decoherence
d) Overuse in smartphones

25. A limitation of edge computing is:


a) Inability to scale beyond a local network
b) High reliance on centralized data storage
c) Security risks at edge devices
d) Dependence on quantum computing

26. Which sector benefits the most from edge computing?


a) Traditional manufacturing without automation
b) Smart cities (real-time traffic management)
c) Paper-based offices with minimal technology
d) Manual payroll operations

27. Edge computing complements:


a) VR headsets with low bandwidth demands
b) IoT and 5G for real-time data processing
c) RPA bots used for back-office processes
d) Blockchain mining operations


28. RPA is best suited for:


a) Creative writing tasks requiring unique thought processes
b) Providing empathetic customer service
c) Repetitive, rule-based tasks
d) Complex decision-making scenarios involving quantum calculations

29. Which task is NOT RPA-compatible?


a) Invoice processing through standard workflows
b) Payroll management based on predefined rules
c) Data migration between structured systems
d) Designing creative marketing strategies

30. RPA improves compliance by:


a) Following predefined rules exactly and maintaining detailed audit trails
b) Using AI for decision-making in compliance tasks
c) All of these options
d) Improving the visibility of data flows across systems


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
9 1 b
9 2 b
9 3 b
9 4 b
9 5 b
9 6 b
9 7 a
9 8 b
9 9 b
9 10 c
9 11 a
9 12 b
9 13 a
9 14 a
9 15 a
9 16 b
9 17 d
9 18 b
9 19 b
9 20 a
9 21 c
9 22 b
9 23 c
9 24 c
9 25 c
9 26 b
9 27 b
9 28 c
9 29 d
9 30 a



CHAPTER 10
ARTIFICIAL INTELLIGENCE
& AUTOMATION

1. What is the primary capability of AI that enables machines to perform human-like tasks?
a) Process automation
b) Autonomous decision-making
c) Data analysis and visualization
d) Blockchain based ledgers

2. Which type of AI analyzes historical data to uncover patterns?


a) Generative AI
b) Predictive AI
c) Analytical AI
d) Reactive AI

3. Predictive AI is commonly used for:


a) Creating new images
b) Forecasting future trends
c) Processing real-time sensor data
d) Generating music

4. Generative AI creates:
a) Only financial reports
b) New content like text/images
c) Hardware components
d) Network infrastructure

5. Which algorithm is used in Generative AI for image synthesis?


a) Linear Regression
b) GANs (Generative Adversarial Networks)
c) k-Means Clustering
d) Decision Trees

6. Machine Learning (ML) differs from traditional programming because it:


a) Does not require explicit instructions for every task
b) Learns patterns from data
c) All of these options
d) Can improve over time


7. Deep Learning uses:


a) Simple linear models
b) Neural networks with multiple layers
c) Principal Component analysis
d) Manual feature extraction

8. NLP (Natural Language Processing) enables machines to:


a) Generate images containing words
b) Understand and generate human language
c) Predict stock prices
d) Optimize supply chains

9. Computer Vision is used in:


a) Speech recognition
b) Analyzing visual data
c) Text translation
d) Audio synthesis

10. Which subfield would power a chatbot?


a) Computer Vision
b) NLP (Natural Language Processing)
c) Reinforcement Learning
d) k-Means Clustering

11. Supervised Learning requires:


a) Unlabeled data
b) Labeled training data
c) No data
d) Only test data

12. Which algorithm is used for binary classification?


a) Linear Regression
b) Logistic Regression
c) k-Means
d) PCA
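
The core of logistic regression is the sigmoid, which squashes a linear score into a probability so a 0.5 threshold yields a binary class. The weights below are hand-picked for illustration, not fitted from data.

```python
import math

def predict(x, w=2.0, b=-5.0):
    """Binary classification via the sigmoid of a linear score (illustrative weights)."""
    p = 1 / (1 + math.exp(-(w * x + b)))  # sigmoid maps the score to (0, 1)
    return (1 if p >= 0.5 else 0), round(p, 3)

print(predict(1.0))  # score -3 → low probability → class 0
print(predict(4.0))  # score +3 → high probability → class 1
```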

13. Decision Trees are suitable for:


a) Financial forecasting
b) Classification and regression
c) Clustering
d) Dimensionality reduction


14. Unsupervised Learning is used when:


a) Output labels are available
b) Discovering hidden patterns in unlabeled data
c) Predicting future sales
d) Working with robotic process automation tools

15. k-Means is a:
a) Classification algorithm
b) Clustering algorithm
c) Regression algorithm
d) Reinforcement Learning technique
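
A bare-bones sketch of k-Means on one-dimensional points (k = 2), using made-up data, to show the assign-then-update loop that defines the algorithm.

```python
import statistics

points = [1.0, 1.2, 0.8, 9.0, 9.5, 10.1]  # two obvious clusters
centroids = [0.0, 5.0]                     # arbitrary initial guesses

for _ in range(10):  # fixed iteration budget instead of a convergence check
    clusters = {0: [], 1: []}
    for p in points:
        nearest = min((0, 1), key=lambda i: abs(p - centroids[i]))
        clusters[nearest].append(p)          # assignment step
    centroids = [statistics.mean(clusters[i]) if clusters[i] else centroids[i]
                 for i in (0, 1)]            # update step: recompute means

print([round(c, 2) for c in centroids])  # [1.0, 9.53]
```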

16. Custom ML models are ideal for:


a) Generic tasks
b) Unique business needs
c) Quick deployment
d) Low-budget projects

17. Off-the-shelf solutions are:


a) Highly customizable
b) Pre-built and easy to implement
c) Only for large enterprises
d) Not scalable

18. AWS SageMaker is an example of:


a) Custom ML development
b) Off-the-shelf AI solution
c) Robotics platform
d) Blockchain tool

19. Hybrid AI approaches combine:


a) Only supervised and unsupervised learning
b) Custom and off-the-shelf solutions
c) AI with quantum computing
d) NLP and Computer Vision

20. IA combines:
a) AI and RPA
b) Blockchain and IoT
c) 5G and edge computing
d) VR and AR


21. Unlike traditional RPA, IA can handle:


a) Only repetitive tasks
b) Cognitive tasks requiring decision-making
c) Cognitive tasks requiring manual input
d) Non-fungible tokens sale and purchase

22. Autonomous agents operate:


a) Always under human supervision
b) Independently to achieve goals
c) Without any sensors
d) Only in virtual environments

23. A thermostat is an example of:


a) Goal-based agent
b) Simple reflex agent
c) Utility-based agent
d) Learning agent

24. Self-driving cars use:


a) Only rule-based systems
b) Goal-based agents
c) Off-the-shelf chatbots
d) Manual controls

25. Multi-agent systems are used in:


a) Isolated single-task environments
b) Collaborative environments like supply chains
c) None of these options
d) Manual data entry

26. GANs are used for:


a) Generating realistic images
b) Predicting stock prices
c) Language translation
d) Data compression

27. Transformer models excel in:


a) Image classification
b) NLP tasks like text generation
c) Robotic movement
d) Financial auditing


28. Reinforcement Learning is used in:


a) Teaching robots via trial-and-error
b) Data visualization
c) Rule-based systems
d) Manual processes

29. PCA (Principal Component Analysis) is a:


a) Classification technique
b) Dimensionality reduction technique
c) Clustering algorithm
d) Reinforcement Learning method

30. Sentiment Analysis falls under:


a) Computer Vision
b) NLP (Natural Language Processing)
c) Robotics
d) Edge computing
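
The shape of the sentiment-analysis task can be sketched with a tiny word-lexicon scorer; real NLP sentiment models are learned from data, so this is only an illustration.

```python
POSITIVE = {"good", "great", "excellent"}   # toy lexicons, not a real resource
NEGATIVE = {"bad", "poor", "terrible"}

def sentiment(text: str) -> str:
    """Count positive vs negative words and label the text accordingly."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("The service was great and the food excellent"))  # positive
print(sentiment("Terrible delay and poor support"))               # negative
```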


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
10 1 b
10 2 c
10 3 b
10 4 b
10 5 b
10 6 c
10 7 b
10 8 b
10 9 b
10 10 b
10 11 b
10 12 b
10 13 b
10 14 b
10 15 b
10 16 b
10 17 b
10 18 b
10 19 b
10 20 a
10 21 b
10 22 b
10 23 b
10 24 b
10 25 b
10 26 a
10 27 b
10 28 a
10 29 b
10 30 b



CHAPTER 11
CLOUD
COMPUTING

1. What is the defining characteristic of cloud computing that allows automatic resource adjustment based on
demand?
a) Broad network access
b) Resource pooling
c) Rapid elasticity
d) Measured service

2. Which cloud feature enables users to provision computing resources without human intervention from the
service provider?
a) On-demand self-service
b) Resource pooling
c) Measured service
d) Private cloud

3. The pay-per-use billing model in cloud computing is made possible by which characteristic?
a) Resource pooling
b) Broad network access
c) Measured service
d) Rapid elasticity
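
Measured service in miniature: usage is metered per resource and billed per unit. The rates below are invented for illustration only.

```python
usage = {"compute_hours": 120, "storage_gb": 50}         # metered consumption
rates = {"compute_hours": 0.04, "storage_gb": 0.02}      # hypothetical $ per unit

# Pay-per-use: the bill is simply metered usage times the unit rate.
bill = sum(usage[k] * rates[k] for k in usage)
print(f"${bill:.2f}")  # $5.80
```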

4. Which cloud service model provides virtual machines, storage, and networking infrastructure?
a) SaaS
b) PaaS
c) IaaS
d) FaaS

5. A development team wants to focus on writing code without managing servers. Which service model should
they use?
a) IaaS
b) PaaS
c) SaaS
d) DaaS

6. What is the primary advantage of SaaS for business users?


a) Complete control over infrastructure
b) No need to install or maintain software
c) Unlimited customization options
d) Free hardware included


7. Which deployment model would a financial institution with strict compliance requirements most likely choose?
a) Public cloud
b) Private cloud
c) Hybrid cloud
d) Community cloud

8. A company uses AWS for development but keeps sensitive data in its own data center. What deployment model
is this?
a) Public cloud
b) Multi-cloud
c) Hybrid cloud
d) Community cloud

9. Several government agencies sharing a cloud infrastructure for common services is an example of:
a) Public cloud
b) Private cloud
c) Hybrid cloud
d) Community cloud

10. How does cloud computing help businesses reduce capital expenditures?
a) By eliminating upfront hardware costs
b) By providing free IT training to staff
c) Through guaranteed profits
d) With automatic task processing

11. Which cloud benefit is most valuable for handling seasonal traffic fluctuations?
a) Measured service
b) Rapid elasticity
c) Resource pooling
d) Broad network access

12. What disaster recovery capability is enhanced by cloud computing?


a) Automatic geographic redundancy
b) Free hardware replacements
c) Instant recovery from any disaster
d) Unlimited free backups

13. What does "vendor lock-in" refer to in cloud computing?


a) Difficulty moving between cloud providers
b) Mandatory use of certain software
c) Physical security of data centers
d) Required long-term contracts


14. Why is data security a major concern in public clouds?


a) All data is automatically public
b) Multi-tenant architecture creates potential risks
c) Providers don't use encryption
d) Hackers can't access cloud systems

15. Which cloud capability is most critical for an e-commerce site during holiday sales?
a) Cloud-based auto-scaling
b) On-premises servers
c) Local storage solutions
d) Physical backup systems

16. What cloud service helps implement continuous integration and delivery pipelines?
a) Basic IaaS
b) PaaS tools like Azure DevOps
c) SaaS email services
d) Community cloud storage

17. What is the key characteristic of serverless computing?


a) No servers are involved
b) Developers don't manage servers
c) Computers without CPUs
d) Free computing resources

18. Which strategy helps prevent vendor lock-in in cloud computing?


a) Adopting multi-cloud approaches
b) Signing longer contracts
c) Using only one provider
d) Avoiding all cloud services

19. What defines a "cloud-native" application?


a) Designed specifically for cloud environments
b) Only works offline
c) Legacy systems moved to cloud
d) Cannot scale automatically


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
11 1 c
11 2 a
11 3 c
11 4 c
11 5 b
11 6 b
11 7 b
11 8 c
11 9 d
11 10 a
11 11 b
11 12 a
11 13 a
11 14 b
11 15 a
11 16 b
11 17 b
11 18 a
11 19 a



CHAPTER 12
BLOCKCHAIN AND
FINTECH

1. What is the primary feature of blockchain that ensures data cannot be altered once recorded?
a) Decentralization
b) Immutability
c) Transparency
d) Consensus

2. Which consensus mechanism requires miners to solve complex mathematical problems to validate
transactions?
a) Proof of Stake (PoS)
b) Proof of Work (PoW)
c) Delegated Proof of Stake (DPoS)
d) Byzantine Fault Tolerance (BFT)
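
Proof of Work can be sketched as a nonce search: miners try values until the block's hash meets a difficulty target. Real networks use a vastly higher difficulty; the mechanism is the same.

```python
import hashlib

def mine(data: str, difficulty: int = 3) -> int:
    """Find a nonce whose SHA-256 digest starts with `difficulty` zero hex digits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine("block-42")  # hypothetical block payload
digest = hashlib.sha256(f"block-42{nonce}".encode()).hexdigest()
print(digest[:3])  # 000 — hard to find, trivial for any node to verify
```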

3. What is the role of "nodes" in a blockchain network?


a) They store and validate transactions.
b) They act as intermediaries for payments.
c) They mine cryptocurrencies exclusively.
d) They control the central authority.

4. Which type of blockchain is open to anyone and fully decentralized?


a) Private blockchain
b) Consortium blockchain
c) Public blockchain
d) Hybrid blockchain

5. What is the purpose of "hashing" in blockchain?


a) To increase transaction speed
b) To create a unique digital fingerprint of data
c) To replace smart contracts
d) To reduce the number of nodes
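
Hashing as a digital fingerprint can be seen directly with the standard-library `hashlib`: any change to the input yields a completely different fixed-size digest, which is what makes tampering detectable.

```python
import hashlib

h1 = hashlib.sha256(b"amount=100").hexdigest()
h2 = hashlib.sha256(b"amount=101").hexdigest()  # one-character change

print(h1 != h2)  # True — an entirely new fingerprint
print(len(h1))   # 64 — fixed-size hex digest regardless of input size
```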

6. Which of the following is NOT a key area of Fintech?


a) Digital Payments
b) Robo-Advisors
c) Traditional Banking
d) InsurTech


7. How do robo-advisors differ from traditional financial advisors?


a) They require human intervention for every decision.
b) They use algorithms to automate investment strategies.
c) None of these options.
d) They are not regulated by financial authorities.

8. What is the primary benefit of peer-to-peer (P2P) lending platforms?


a) They eliminate the need for borrowers and lenders.
b) They allow direct lending between individuals without banks.
c) They guarantee 100% returns on investments.
d) They are only available to non-individual investors.

9. Which technology is commonly used in InsurTech to assess risk and personalize policies?
a) Blockchain
b) Artificial Intelligence (AI)
c) Quantum Computing
d) Virtual Reality (VR)

10. What is the main purpose of RegTech in financial services?


a) To bypass government regulations
b) To automate compliance and risk management
c) To replace traditional banking systems
d) To reduce transaction fees

11. What is the key advantage of DeFi (Decentralized Finance) over traditional finance?
a) It relies on centralized banks.
b) It eliminates intermediaries using blockchain.
c) It only supports fiat currencies.
d) It is slower than traditional banking.

12. How do smart contracts enhance efficiency in financial agreements?


a) They rely on manual enforcement through traditional legal systems.
b) They reduce the need for intermediaries by executing automatically when pre-defined conditions are met.
c) They are primarily applicable to paper-based agreements and physical documents.
d) They increase the complexity and cost of transactions by requiring third-party oversight.

13. Which blockchain-based solution significantly reduces cross-border payment settlement times?
a) SWIFT
b) Ripple (XRP)
c) Traditional banking
d) Paper checks

14. What does "tokenization" of assets enable?


a) Converting digital assets into physical form
b) Fractional ownership of high-value assets
c) Increasing regulatory restrictions
d) Slowing down transaction speeds


15. In what way does blockchain revolutionize identity verification in Fintech?


a) By storing all user identities on a single, centralized server managed by financial institutions.
b) By providing decentralized identity solutions that enable users to manage and share their credentials
securely without relying on a central authority.
c) By mandating frequent, manual KYC (Know Your Customer) checks at each stage of the transaction
process.
d) By completely removing the need for any formal identity verification procedures in financial transactions.

16. What is a real-world example of a smart contract application?


a) Manually filing an insurance claim
b) Automatically paying out flight delay compensation
c) Using paper-based contracts for real estate
d) Requiring a bank to approve every transaction

17. Which of the following is a benefit of blockchain in voting systems?


a) Reduced risk of tampering
b) Faster vote counting
c) Secure and transparent elections
d) Centralized control by a single authority

18. Which of the following is an example of a public blockchain?


a) Hyperledger
b) R3 Corda
c) Bitcoin
d) Dragonchain


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
12 1 b
12 2 b
12 3 a
12 4 c
12 5 b
12 6 c
12 7 b
12 8 b
12 9 b
12 10 b
12 11 b
12 12 b
12 13 b
12 14 b
12 15 b
12 16 b
12 17 c
12 18 c



CHAPTER 13
IMPACT OF DIGITAL DISRUPTION ON
BUSINESS AND ACCOUNTANCY
1. What is the primary characteristic of digital disruption?
a) Incremental improvements in existing processes
b) Radical transformation of industries through technology
c) Minor adjustments to traditional business models
d) Temporary changes in consumer behavior

2. Which historical example illustrates digital disruption in the entertainment industry?


a) Blockbuster’s expansion in the 1990s
b) Netflix transitioning from DVDs to streaming
c) The invention of the television
d) Radio broadcasting in the 1920s

3. Which technology enables autonomous decision-making and is a key driver of digital disruption?
a) Blockchain
b) Artificial Intelligence (AI)
c) Virtual Reality (VR)
d) 3D Printing

4. How does the data explosion accelerate digital disruption?


a) By minimizing the importance of reliable internet connections for businesses.
b) By giving organizations access to vast amounts of data, enabling more informed and agile decision-making
processes.
c) By completely removing the need for businesses to adhere to regulatory compliance or privacy laws.
d) By lowering the expectations of consumers for personalized, data-driven services and products.

5. How does digital disruption transform customer experience?


a) By increasing the complexity of customer service interactions and making response times slower.
b) By leveraging technology to offer personalized, real-time engagement and solutions tailored to individual
preferences.
c) By limiting the need for businesses to develop mobile-friendly solutions for their customers.
d) By discouraging the use of digital platforms for transactions and payments.

6. What is a defining characteristic of platform-based business models like Uber?


a) Exclusive ownership of all underlying infrastructure, such as vehicles or property.
b) A centralized management structure that controls all transactions and services directly.
c) Facilitating connections between service providers and customers without necessarily owning any of the
physical assets involved in the transaction.
d) High barriers to entry, making it difficult for competitors to replicate the model.


7. Which framework focuses on aligning IT services with business needs?


a) COBIT
b) ITIL
c) ISO/IEC 27001
d) TOGAF

8. What is a best practice for improving ICT processes?


a) Avoiding automation to reduce errors
b) Conducting annual audits
c) Standardizing workflows across departments
d) Limiting employee training to cut costs

9. How can businesses measure the ROI of ICT investments?


a) By ignoring long-term strategic value
b) Through cost-benefit analysis and productivity gains
c) By eliminating long-form KPIs
d) By reducing data collection
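The cost-benefit reasoning behind the answer to question 9 reduces to a simple return-on-investment calculation; the figures below are invented purely for illustration:

```python
def roi(total_benefit: float, total_cost: float) -> float:
    """Simple return on investment: net gain divided by cost."""
    if total_cost <= 0:
        raise ValueError("cost must be positive")
    return (total_benefit - total_cost) / total_cost

# An ICT project costing 200,000 that yields 260,000 in measurable
# benefits (cost savings plus productivity gains) returns 30%.
print(f"{roi(260_000, 200_000):.0%}")
```

In practice the benefit side would also capture the long-term strategic value that option (a) warns against ignoring.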

10. How does blockchain enhance audit processes?


a) By increasing processing speed
b) By creating immutable, verifiable records
c) By increasing transaction verification stages
d) By reducing transparency
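The immutable-record property tested in question 10 can be illustrated with a minimal hash chain (a simplified sketch of the idea, not a production blockchain; the journal entries are invented for the example):

```python
import hashlib

def block_hash(prev_hash: str, payload: str) -> str:
    """Hash of a block = SHA-256 over the previous hash plus this payload."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(entries):
    """Chain journal entries so each block commits to everything before it."""
    chain, prev = [], "0" * 64  # genesis: all-zero previous hash
    for entry in entries:
        h = block_hash(prev, entry)
        chain.append({"entry": entry, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["entry"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = build_chain(["Invoice 1001: 5,000", "Payment 1001: 5,000"])
assert verify(ledger)
ledger[0]["entry"] = "Invoice 1001: 50,000"  # attempted tampering
assert not verify(ledger)                    # detected on re-verification
```

An auditor can re-verify the whole chain independently, which is why the records are both immutable and verifiable.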


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
13 1 b
13 2 b
13 3 b
13 4 b
13 5 b
13 6 c
13 7 b
13 8 c
13 9 b
13 10 b



CHAPTER 14
IT RISK MANAGEMENT
AND SECURITY

1. What is the primary goal of risk management?


a) To eliminate all risks
b) To minimize the negative impact of risks on an organization
c) To transfer all risks to third parties
d) To ignore low-probability risks

2. Which of the following is the first step in the IT risk management process?
a) Risk mitigation
b) Risk identification
c) Risk monitoring
d) Incident response

3. IT risk management is critical for:


a) Only large corporations
b) Ensuring business continuity and protecting digital assets
c) Reducing employee training costs
d) Eliminating the need for IT security

4. Which of the following is an example of a physical IT risk?


a) A phishing email
b) A server failure due to overheating
c) A ransomware attack
d) A misconfigured firewall

5. How can organizations mitigate digital risks like ransomware?


a) By enforcing strong password policies
b) By installing antivirus software and regular backups
c) By relocating data centers to safer regions
d) By reducing employee headcount

6. What is the most effective way to reduce human risks such as accidental data leaks?
a) Implementing multi-factor authentication (MFA)
b) Conducting regular security awareness training
c) Increasing the number of firewalls
d) Disabling all remote access


7. Which strategy helps mitigate third-party risks associated with cloud providers?
a) Avoiding all third-party vendors
b) Conducting vendor security assessments
c) Using only on-premises servers
d) Ignoring compliance requirements

8. A DDoS attack is an example of which type of IT risk?


a) Physical risk
b) Digital risk
c) Environmental risk
d) Human risk

9. What does the principle of least privilege entail?


a) Granting all employees administrative access
b) Restricting user access to only what is necessary for their role
c) Eliminating all access controls
d) Setting hard to remember passwords

10. Which technology ensures data confidentiality by converting it into an unreadable format?
a) Firewall
b) Encryption
c) Intrusion Detection System (IDS)
d) Load balancer

11. What is the purpose of a disaster recovery plan (DRP)?


a) To prevent all cyberattacks
b) To restore IT systems after a disruption
c) To reduce employee training costs
d) To eliminate the need for backups

12. Which tool aggregates and analyzes security data in real time to detect threats?
a) VPN
b) SIEM (Security Information and Event Management)
c) HTML
d) CRM

13. How does Zero Trust Architecture enhance security?


a) By assuming internal network traffic is inherently secure unless proven otherwise
b) By continuously validating every access request, irrespective of its source or context
c) By implementing a single-layer authentication system for all users and devices
d) By prioritizing perimeter-based defenses over identity verification

14. Which future trend is most likely to intensify as a result of climate change?
a) Diminished reliance on distributed data centers due to energy constraints
b) Heightened emphasis on mitigating environmental IT risks, such as flooding or extreme temperatures
c) A shift toward cyber resilience replacing physical infrastructure concerns
d) Streamlined regulatory frameworks prioritizing innovation over environmental oversight


15. AI and machine learning are increasingly integrated into IT risk management to:
a) Supplant human oversight with fully autonomous decision-making systems
b) Enhance automation of threat identification and mitigation processes
c) Obviate the need for cryptographic protocols through predictive analytics
d) Optimize network efficiency by minimizing data transmission overhead

16. Why is regulatory compliance critical in IT risk management?


a) It eliminates all risks
b) It avoids legal penalties and builds stakeholder trust
c) It reduces the need for IT security budgets
d) It allows unlimited data sharing

17. How does blockchain technology bolster IT security?


a) By consolidating data into a centralized repository for streamlined access control
b) By establishing an immutable ledger of transactions resistant to unauthorized alteration
c) By replacing authentication mechanisms like passwords with decentralized keypairs
d) By compressing data flows to minimize network latency and exposure

18. What is a key challenge of cyber-physical integration (e.g., IoT)?


a) Securing both digital and physical components
b) Reducing the number of devices
c) Avoiding all cloud services
d) Eliminating human oversight


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
14 1 b
14 2 b
14 3 b
14 4 b
14 5 b
14 6 b
14 7 b
14 8 b
14 9 b
14 10 b
14 11 b
14 12 b
14 13 b
14 14 b
14 15 b
14 16 b
14 17 b
14 18 a

CHAPTER 15
CYBERSECURITY AND
INFORMATION SECURITY RISKS

1. Which of the following is not a type of malware?


a) Worm
b) Firewall
c) Trojan Horse
d) Ransomware

2. How has the field of cybersecurity transformed since the inception of the internet?
a) It has diminished in relevance as hardware advancements inherently mitigate digital threats
b) It has narrowed its scope to safeguarding physical server infrastructure against localized risks
c) It has evolved to encompass the protection of intricate, interdependent systems alongside innovations like
IoT and cloud ecosystems
d) It has shifted toward fully autonomous defenses, rendering human oversight obsolete

3. Which attack involves overwhelming a system with traffic to disrupt its availability?
a) Phishing
b) DDoS (Distributed Denial of Service)
c) Data diddling
d) Salami attack

4. What is a defining feature of phishing attacks?


a) They leverage systemic hardware vulnerabilities to extract confidential data
b) They deploy deceptive communications, such as emails or messages, to manipulate users into disclosing
sensitive information
c) They focus on infiltrating high-security entities like governmental organizations
d) They employ advanced obfuscation techniques that render them undetectable by conventional security
measures

5. Which scenario best exemplifies an insider threat?


a) A foreign cybercriminal infiltrating a corporate network through exploited vulnerabilities
b) A staff member unintentionally exposing confidential information due to procedural oversight
c) A catastrophic event, such as a flood, compromising physical server integrity
d) An improperly configured security perimeter allowing unauthorized external access

6. What is the primary function of multi-factor authentication (MFA)?


a) To streamline authentication by minimizing reliance on traditional passwords
b) To authenticate a user’s identity through a combination of distinct verification methods, such as a
password and a one-time passcode
c) To secure network transmissions by encoding all data exchanges
d) To restrict access by preemptively denying all external connection attempts
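The one-time passcode factor mentioned in question 6 is commonly generated with the HOTP algorithm of RFC 4226 (TOTP, RFC 6238, simply derives the counter from the clock). A standard-library sketch:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226) -- the 'something you
    have' factor that complements a password in MFA."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: ASCII key "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

Because the server runs the same computation, a stolen password alone is not enough: the attacker would also need the shared secret held by the user's device.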


7. Which tool is specifically designed to monitor network traffic for anomalous behavior and issue alerts?
a) A solution that scans endpoints for malicious code and neutralizes threats
b) An Intrusion Detection System (IDS) that analyzes traffic patterns and flags potential security breaches
c) A perimeter defense mechanism that filters traffic based on predefined rules
d) A protocol that establishes secure, encrypted tunnels for remote access

8. What is the core assurance provided by encryption?


a) Enhanced network performance through optimized data compression
b) Rendering data indecipherable to individuals lacking authorized access
c) Comprehensive protection against all forms of cyber intrusions
d) Seamless deployment of software patches without user intervention

9. Which AI-related risk arises from skewed decision-making caused by prejudiced training datasets?
a) A vulnerability exploited before it is identified or patched by security teams
b) Algorithmic bias that distorts outcomes due to unrepresentative or flawed input data
c) Synthetic media generated to deceive through hyper-realistic impersonation
d) A coordinated network of compromised devices executing malicious commands

10. What makes IoT devices especially susceptible to cyberattacks?


a) They possess inherent resistance to malicious software infections
b) They frequently lack comprehensive security mechanisms and updates
c) They operate in isolation from internet connectivity by design
d) Their deployment is confined to residential environments with minimal exposure

11. What defines the shared responsibility model in cloud computing?


a) The cloud provider assumes full accountability for all security and compliance obligations
b) Security duties are apportioned between the cloud provider and the customer based on service scope
c) Customers bear exclusive responsibility for securing the physical infrastructure hosting cloud services
d) Regulatory bodies oversee and enforce all aspects of cloud security governance

12. What is the primary role of a honeypot in the context of cybersecurity?


a) To serve as a secure repository for encrypted system backups
b) To lure adversaries and analyze their tactics and techniques
c) To preemptively filter and block all inbound email communications
d) To supplant traditional perimeter defenses like firewalls

13. Which technology is specifically employed to verify the authenticity and integrity of digital documents?
a) A system that secures remote connections through encrypted channels
b) A digital signature that cryptographically binds identity to content
c) A tool that detects and removes malicious code from systems
d) A platform that simulates social engineering attacks for training purposes

14. Why is automation considered indispensable in contemporary cybersecurity practices?


a) It fully supplants human intervention with self-regulating systems
b) It accelerates threat response times while ensuring uniform application of security protocols
c) It is tailored exclusively for the operational needs of small-scale enterprises
d) It drives up infrastructure expenses by necessitating advanced hardware deployments


15. Which strategy is most effective in reducing the risk of phishing attacks?
a) Distributing login credentials among team members for operational efficiency
b) Implementing ongoing staff education combined with simulated phishing exercises
c) Deactivating email security filters to streamline message delivery
d) Retaining factory-set passwords for ease of system administration

16. What is the chief advantage of adopting a proactive defense strategy in cybersecurity?
a) Delaying response actions until after threats have materialized
b) Persistently tracking threat landscapes and dynamically adjusting safeguards to thwart attacks
c) Streamlining organizational structure by reducing workforce dependency
d) Bypassing regulatory mandates to prioritize operational flexibility

17. Which framework is designed to enable organizations to systematically address and mitigate cybersecurity
risks?
a) The NIST Cybersecurity Framework, providing structured guidance for risk management
b) A set of protocols governing secure social media engagement and data handling
c) A policy framework optimizing employee leave to enhance operational resilience
d) A contractual agreement detailing hardware maintenance and failure response obligations


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
15 1 b
15 2 c
15 3 b
15 4 b
15 5 b
15 6 b
15 7 b
15 8 b
15 9 b
15 10 b
15 11 b
15 12 b
15 13 b
15 14 b
15 15 b
15 16 b
15 17 a

CHAPTER 16
IT GENERAL CONTROLS
FOR MANAGING RISK

1. What is the fundamental objective of IT General Controls (ITGCs)?


a) To supplant manual business operations with fully automated workflows
b) To safeguard the integrity, confidentiality, and availability of IT infrastructure and associated data
c) To render supplementary cybersecurity protocols redundant through inherent system resilience
d) To optimize resource allocation by minimizing the need for IT personnel

2. Which of the following does not align with the core objectives of IT General Controls (ITGCs)?
a) Upholding the accuracy and consistency of data across systems
b) Reducing IT infrastructure costs
c) Ensuring uninterrupted access to critical IT resources
d) Facilitating adherence to statutory and industry regulations

3. How do ITGCs differ from application-specific controls?


a) ITGCs apply only to financial software.
b) ITGCs are tailored to individual systems, while application controls are broad.
c) ITGCs provide a framework for the entire IT environment, while application controls target specific
software.
d) ITGCs are optional for organizations.

4. Which control ensures that only authorized users can access sensitive systems?
a) Change management
b) Access control
c) Program development
d) Physical security

5. What is the fundamental purpose of role-based access control (RBAC) in IT security?


a) To universally assign administrative privileges across the workforce for operational flexibility
b) To limit system access according to an individual’s defined job functions and duties
c) To supplant password-based authentication with role-specific credentials
d) To streamline data protection by automating system backup processes
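The role-based restriction in question 5 can be sketched in a few lines: permissions attach to roles rather than to individual users, so access follows job function. The role and permission names below are invented for the example:

```python
# Each role maps to the minimum set of permissions its job function needs.
ROLE_PERMISSIONS = {
    "clerk":   {"invoice.read"},
    "manager": {"invoice.read", "invoice.approve"},
    "admin":   {"invoice.read", "invoice.approve", "user.manage"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Unknown roles get an empty permission set, i.e. default deny."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("clerk", "invoice.approve"))    # False: outside the clerk role
print(is_allowed("manager", "invoice.approve"))  # True
```

Assigning users to roles, instead of granting rights user by user, is also what makes periodic access reviews tractable.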

6. Why is change management a vital component of IT General Controls (ITGCs)?


a) It safeguards systems by blocking unapproved or inadequately validated modifications
b) It optimizes IT operations by diminishing the reliance on specialized personnel
c) It permits unrestricted system alterations without oversight or authorization
d) It prioritizes the fortification of physical infrastructure over digital controls


7. Which ITGC element is specifically tasked with ensuring data restoration following a system failure?
a) A process for documenting and resolving unexpected security incidents
b) A backup and recovery mechanism to preserve and reinstate critical data
c) A practice of auditing software code to identify potential vulnerabilities
d) A system of measures regulating physical conditions like temperature and humidity

8. What defines a core capability of privileged access management (PAM)?


a) It standardizes access rights across all employees for uniform system interaction
b) It oversees and constrains elevated permissions for accessing vital system resources
c) It bypasses authentication requirements for privileged users to enhance efficiency
d) It exclusively governs entry to physically secured data center environments

9. What constitutes the initial step in establishing IT General Controls (ITGCs)?


a) Equipping staff with procedural knowledge to enforce control mechanisms
b) Performing a comprehensive evaluation of potential risks to IT systems
c) Procuring advanced hardware to bolster system performance and security
d) Temporarily suspending all access restrictions to baseline system vulnerabilities

10. Why is employee training a critical factor in ensuring the effectiveness of ITGCs?
a) Human errors or oversight by employees frequently serve as the root cause of security incidents
b) Comprehensive training inherently neutralizes all potential cybersecurity threats
c) ITGC frameworks function independently of employee engagement or compliance
d) Specialized instruction is exclusively necessary for IT personnel, not general staff

11. Which tool is specifically engineered for real-time oversight of IT system activities?
a) A platform for tabular data organization and basic analytics
b) A SIEM (Security Information and Event Management) system for continuous threat monitoring
c) An application designed for document creation and text formatting
d) Software tailored for visual content development and editing

12. What represents a significant obstacle in managing IT General Controls (ITGCs) amidst rapid technological
progress?
a) IT ecosystems are increasingly streamlined and uniform in structure
b) The coexistence of legacy infrastructure and emerging technologies demands flexible control adaptations
c) The prevalence and sophistication of cybersecurity threats are steadily declining
d) Regulatory frameworks have become obsolete due to self-regulating systems

13. How do resource limitations affect the deployment of IT General Controls (ITGCs)?
a) They facilitate the recruitment of specialized cybersecurity professionals with ease
b) Constrained funding and personnel can result in incomplete or weakened control measures
c) They permit organizations to bypass risk evaluations without consequence
d) They render the establishment of ITGCs superfluous to organizational needs


14. Why does regulatory compliance pose a significant hurdle for multinational organizations?
a) Diverse regions impose distinct and continually shifting legal and technical mandates
b) Global jurisdictions uniformly enforce identical IT governance standards
c) Adherence to compliance is discretionary for entities operating internationally
d) ITGC frameworks inherently negate the need for external regulatory alignment

15. Which regulation mandates ITGCs for financial reporting integrity?


a) GDPR
b) SOX (Sarbanes-Oxley Act)
c) HIPAA
d) PCI DSS


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
16 1 b
16 2 b
16 3 c
16 4 b
16 5 b
16 6 a
16 7 b
16 8 b
16 9 b
16 10 a
16 11 b
16 12 b
16 13 b
16 14 a
16 15 b

CHAPTER 17
ICT'S ROLE IN
RISK MANAGEMENT

1. What is the primary role of ICT in modern risk management?


a) To eliminate all risks manually
b) To enhance the identification, reporting, and mitigation of risks through technology
c) To replace human decision-making entirely
d) To reduce the need for regulatory compliance

2. Which of the following is not a key risk category addressed by ICT in risk management?
a) Cybersecurity risks
b) Employee trainings
c) Operational risks
d) Compliance risks

3. How does Information and Communication Technology (ICT) enable organizations to adopt a proactive stance
in risk management?
a) By limiting responses to risks until after they have fully manifested
b) By leveraging predictive analytics and continuous real-time surveillance to foresee potential risks
c) By deliberately excluding historical data from risk evaluation processes
d) By minimizing the periodicity of risk assessments to enhance operational efficiency

4. What is the primary objective of predictive modeling in the context of risk identification?
a) To systematically discard all historical datasets in favor of real-time inputs
b) To project potential future risks by analyzing historical trends and data correlations
c) To entirely supplant human expertise with automated risk evaluation algorithms
d) To deliberately overlook nascent threats in favor of established risk profiles

5. Which tool is engineered to examine IT systems for weaknesses such as unpatched software or configuration
flaws?
a) An automated risk scanner, such as Nessus or OpenVAS
b) A learning management system like Moodle
c) An enterprise reporting tool like Oracle Financials
d) A digital analytics suite such as Adobe Analytics

6. How do dashboards improve risk reporting?


a) By executing machine learning algorithms for forecasting
b) By presenting risk metrics visually for quick decision-making
c) By developing tools to automate actions based on identified risks
d) All of the above


7. Why are centralized risk repositories considered vital for organizations?


a) They consolidate risk data to streamline external audits without internal use
b) They create a single, authoritative source of risk information for cross-team alignment
c) They aggregate risk metrics primarily to support executive-level reporting
d) They centralize risk documentation to expedite compliance with industry standards

8. Which ICT tool is specifically deployed to block unauthorized access to sensitive data?
a) A firewall, such as Cisco ASA or Palo Alto Networks
b) A performance management system like BambooHR
c) A collaboration platform such as Microsoft Teams
d) A customer relationship management tool like Salesforce
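The rule-based filtering a firewall performs (question 8) can be sketched as a first-match-wins rule table with a default-deny fallback. The addresses and rule format below are invented for illustration and do not follow any vendor's configuration syntax:

```python
import ipaddress

# Ordered rules: first match wins; anything unmatched falls through to deny.
RULES = [
    {"action": "allow", "src": "10.0.0.0/8", "port": 443},  # internal HTTPS
    {"action": "deny",  "src": "0.0.0.0/0",  "port": 443},  # everyone else
]

def decide(src_ip: str, port: int) -> str:
    addr = ipaddress.ip_address(src_ip)
    for rule in RULES:
        if port == rule["port"] and addr in ipaddress.ip_network(rule["src"]):
            return rule["action"]
    return "deny"  # default-deny posture

print(decide("10.1.2.3", 443))     # allow: internal source
print(decide("203.0.113.9", 443))  # deny: external source
```

Real products such as Cisco ASA or Palo Alto Networks apply far richer logic (stateful inspection, application awareness), but the default-deny, rule-matching core is the same.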

9. How does Information and Communication Technology (ICT) facilitate business continuity in the face of
disruptions?
a) By phasing out backup systems for efficiency
b) Through cloud disaster recovery and network redundancy
c) By automating workflows to replace manual plans
d) By focusing on monitoring instead of risk assessment

10. Why is data accuracy vital in ICT-driven risk management?


a) Inaccurate data distorts risk analysis and decisions
b) Risk management thrives despite poor data quality
c) Accurate data matters to IT specialists
d) Manual entry outperforms automated data systems

11. How does automation improve efficiency in risk management?


a) By amplifying reliance on core processes
b) By optimizing repetitive tasks like monitoring
c) By bypassing the need to identify risks every time
d) By providing centralized repository of automation risks

12. How does blockchain enhance risk management processes?


a) By increasing transaction processing speeds
b) By ensuring immutable records for audits
c) By identification of errors in data integrity
d) All of the above

13. What are potential objectives of post-incident analysis in driving continuous improvement?
a) To identify employee errors contributing to incidents
b) To refine risk strategies using insights from past events
c) To assess whether ICT system updates can be deferred
d) All of the above


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
17 1 b
17 2 b
17 3 b
17 4 b
17 5 a
17 6 b
17 7 b
17 8 a
17 9 b
17 10 a
17 11 b
17 12 b
17 13 b



ANNEXURE A

CONTROL OBJECTIVES FOR


INFORMATION AND RELATED
TECHNOLOGIES (COBIT)

1. The primary purpose of COBIT 2019 is to:


a) Replace ITIL and ISO standards
b) Govern and manage enterprise I&T to achieve strategic goals
c) Focus solely on IT audit compliance
d) Standardize software development and improvement processes

2. Which principle emphasizes adapting governance to an organization’s size, industry, and risk profile?
a) Holistic Approach
b) Governance Distinct from Management
c) Tailored to Enterprise Needs
d) Dynamic Governance System

3. The goals cascade links enterprise goals to:


a) IT department budgets
b) Alignment goals and COBIT objectives
c) Organizational hierarchy
d) Compliance audits

4. A Level 3 capability rating indicates a process is:


a) Ad hoc
b) Standardized enterprise-wide
c) Continuously optimized
d) Not implemented

5. Maturity levels assess:


a) Individual employee performance
b) The overall governance system or focus areas
c) Hardware reliability
d) Software licensing costs

6. A process rated "Largely Achieved" at Level 2 means it meets:


a) >85% of criteria
b) 50–85% of criteria
c) <15% of criteria
d) All regulatory requirements
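The achievement bands in question 6 follow the N/P/L/F rating scale that COBIT's capability assessments draw from the ISO/IEC 33000 series. Assuming those standard thresholds (boundaries at 15%, 50% and 85%), the mapping can be expressed as:

```python
def achievement_rating(percent: float) -> str:
    """Map a criteria-achievement percentage to the N/P/L/F scale
    (thresholds assumed from the ISO/IEC 33000-style scale COBIT uses)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be within 0-100")
    if percent > 85:
        return "Fully Achieved"
    if percent > 50:
        return "Largely Achieved"
    if percent > 15:
        return "Partially Achieved"
    return "Not Achieved"

print(achievement_rating(70))  # Largely Achieved
```

A process "Largely Achieved" at a level therefore meets more than half but at most 85% of that level's criteria, matching answer (b).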


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
A 1 b
A 2 c
A 3 b
A 4 b
A 5 b
A 6 b

ANNEXURE B
ISO/IEC
27001, 27002 & 27005

1. What is the core focus of ISO/IEC 27001?


a) Frameworks for agile software development
b) Building an Information Security Management System
c) Protocols for data center physical safety
d) Guidelines for auditing financial records

2. The "CIA triad" in ISO/IEC 27001 refers to:


a) Cost, Innovation, Availability
b) Confidentiality, Integrity, Availability
c) Compliance, Integrity, Accountability
d) Cybersecurity, Incident Management, Auditing

3. Which of the following is not one of the four themes in Annex A of ISO/IEC 27001:2022?
a) Organizational Controls
b) Financial Controls
c) People Controls
d) Technological Controls

4. What does ISO/IEC 27005 primarily guide organizations on?


a) Managing information security risks
b) Optimizing software development cycles
c) Securing physical facility perimeters
d) Evaluating financial exposure risks

5. Which ISO/IEC 27005 component ensures stakeholders are informed about risks?
a) Risk monitoring
b) Risk communication
c) Risk avoidance
d) Risk delegation

6. The relationship between ISO/IEC 27001 and 27002 is best described as:
a) 27001 specifies ISMS requirements; 27002 details control implementation
b) 27002 replaces 27001
c) 27001 is optional while 27002 is mandatory
d) 27002 mandates certification while 27001 mandates integration with COBIT


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
B 1 b
B 2 b
B 3 b
B 4 a
B 5 b
B 6 a

ANNEXURE C
REGULATORY GUIDELINES
BY (SBP)

1. The SBP Framework applies to all of the following EXCEPT:


a) Commercial banks
b) Microfinance banks (MFBs)
c) Non-banking financial companies (NBFCs)
d) Islamic banks

2. The Technology Governance Framework aligns IT with:


a) Business strategy
b) Employee preferences
c) Vendor priorities
d) Short-term cost savings


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
C 1 c
C 2 a

ANNEXURE D
PAKISTAN’S LEGAL
FRAMEWORK FOR CYBERCRIMES
& DIGITAL SECURITY

1. The NCSP 2021 was issued by:


a) State Bank of Pakistan
b) Ministry of Information Technology & Telecommunication (MoITT)
c) Pakistan Telecommunication Authority
d) Federal Investigation Agency

2. Which entity is responsible for national-level cyber incident response under NCSP?
a) PTA
b) nCERT (National Computer Emergency Response Team)
c) PEMRA
d) SECP

3. Critical Information Infrastructure (CII) under NCSP includes:


a) Banking, energy, and telecommunications sectors
b) Retail and agriculture
c) Social media platforms
d) Non-profit organizations

4. Non-compliance with NCSP may lead to penalties under:


a) Information Security Act, 2017
b) Prevention of Electronic Crimes Act (PECA), 2016
c) Pakistan Penal Code
d) IT Ordinance

5. Which body regulates social media under PECA?


a) PEMRA
b) Social Media Protection and Regulatory Authority (SMPRA)
c) Pakistan Cyber Force
d) National Accountability Bureau

6. Service providers must retain traffic data for:


a) 30 days
b) 1 year
c) 3 months
d) Indefinitely


7. ETO 2002 provides legal validity to:


a) Electronic signatures and documents
b) Handwritten wills
c) Verbal contracts
d) Social media posts

8. Which instrument is EXCLUDED from ETO’s scope?


a) Property sale contracts
b) Digital invoices
c) Online banking transactions
d) E-governance applications

9. Advanced Electronic Signatures under ETO are presumed authentic unless:


a) The signer is a minor
b) Proven otherwise in court
c) Used for international transactions
d) Printed on paper

10. The accreditation of Certification Service Providers (CSPs) is handled by:


a) State Bank of Pakistan
b) Electronic Certification Accreditation Council (ECAC)
c) Security & Exchange Commission of Pakistan
d) Pakistan Software Export Board

11. ETO amendments to Qanun-e-Shahadat allow:


a) Electronic documents as admissible evidence
b) Oral testimonies to override digital records
c) Only handwritten contracts in court
d) Social media posts as inadmissible evidence

12. DNS filtering under NCSP is used to:


a) Block malicious domains
b) Monitor private emails
c) Tax e-commerce transactions
d) Slow internet speeds

13. PECA’s Social Media Protection Tribunal hears appeals against:


a) Banking regulations
b) SMPRA decisions
c) None of these
d) Employment infidelity


14. Under ETO, electronic retention is valid if the content is:


a) Unaltered and accessible
b) Printed annually
c) Stored outside Pakistan
d) Encrypted for over 10 years

15. NCSP’s three-tier governance structure includes:


a) National, sectoral, and organizational levels
b) Only federal and provincial levels
c) Military and civilian divisions
d) Public and private sectors only

16. PECA defines unlawful content as material that:


a) Incites violence or hate speech
b) Criticizes government policies
c) Discusses cybersecurity
d) Criticizes digital media

17. Real-time surveillance under PECA is authorized for:


a) Serious offenses
b) All violations
c) Criticism-based comments
d) Social media activists

18. A key difference between PECA and ETO is:


a) PECA criminalizes cyber offenses; ETO validates digital transactions
b) ETO applies only to government bodies
c) PECA excludes financial crimes
d) ETO mandates internet shutdowns


ANSWERS TO SELF TEST QUESTIONS


Chapter/ Annexure Ref. Question Answer
D 1 b
D 2 b
D 3 a
D 4 b
D 5 b
D 6 b
D 7 a
D 8 a
D 9 b
D 10 b
D 11 a
D 12 a
D 13 b
D 14 a
D 15 a
D 16 a
D 17 a
D 18 a



LONG-FORM QUESTIONS

1. Data Types and Their Applications


Data can be classified into qualitative (nominal and ordinal) and quantitative (discrete and continuous) types, as
well as structured, unstructured, and semi-structured categories. Select an industry of your choice (e.g.,
healthcare, retail, or finance) and explain how each of these data types and categories might be generated and
utilized within that industry. Provide specific examples for each type and category, and discuss how
understanding these distinctions can help the industry improve its decision-making processes.

2. Data Governance Framework and Levels


Data governance operates at strategic, tactical, and operational levels, each with distinct roles in managing data
effectively. Imagine you are implementing a data governance program for a multinational e-commerce company.
a) Describe the key activities you would undertake at each of these three levels to ensure robust data
management.
b) For each level, identify a potential challenge (e.g., resistance to change, compliance issues) and propose a
solution to overcome it.
c) How do these levels work together to support the company’s overall data management objectives?

3. Role of Data Owners and Data Stewards


Explain the roles of Data Owners and Data Stewards within a data governance framework. How do their
responsibilities complement each other, and why are both roles critical to ensuring data integrity and operational
efficiency?

4. Data Lifecycle Management (DLM)


Discuss the importance of Data Lifecycle Management (DLM) in a data governance strategy. Describe each phase
of the data lifecycle and explain how governance measures at each stage help ensure compliance, data quality,
and cost optimization.

5. Data Sensitivity Levels


Explain the four levels of data sensitivity classification and provide practical examples of each. Why is it
important for organizations to distinguish between these levels?

6. Comparative Analysis of Data Classification Models


Compare and contrast the different data classification models (sensitivity-based, role-based, compliance-based,
lifecycle-based, and government classification models). In what scenarios might each model be most useful?

7. Data Analytics Cycle in Practice


The data analytics process involves a cycle of collecting data, cleaning it, exploring it, modeling it, interpreting
results, making decisions, and monitoring outcomes. Choose a real-world scenario (e.g., a retail chain analyzing
customer purchasing trends or a hospital improving patient care) and explain how each stage of this cycle could
be applied to solve a specific business problem. For each stage, describe the tools or techniques that might be
used and how the results of one stage inform the next. What are some potential obstacles that could arise, and
how might they be addressed?


8. Comparing Data Analytics Stages


A leading retail company is facing challenges in its supply chain, including frequent stockouts, excess inventory,
and inefficiencies in delivery logistics. The company wants to leverage data analytics to improve supply chain
performance and enhance profitability.
Using the four stages of data analytics—Descriptive, Diagnostic, Predictive, and Prescriptive Analytics—explain
how the company can systematically address its supply chain issues.
For each stage:
1) Describe how it applies to the scenario.
2) Provide an example of a specific technique that could be used.
3) Explain how the insights gained in one stage lead to the next.

9. Five Vs of Big Data


SmartRetail Inc., a large e-commerce company, collects vast amounts of customer data, including purchase
history, website browsing behavior, product reviews, and social media interactions. The company wants to
leverage this data to improve customer experience, optimize inventory management, and detect fraudulent
transactions.
Define Big Data and explain how SmartRetail Inc. can utilize its data effectively by considering the 5 Vs of Big
Data (Volume, Variety, Velocity, Veracity, and Value). Provide examples of how each of these characteristics
applies to their operations.

10. ACID Properties in DBMS


A banking system processes thousands of transactions daily, including fund transfers, withdrawals, and deposits.
1) Define the ACID properties and explain how each property (Atomicity, Consistency, Isolation, Durability)
ensures reliable transaction processing in this system.
2) Provide a real-world example of how atomicity prevents partial updates during a fund transfer between two
accounts.
3) How does isolation prevent concurrent transactions from interfering with each other? Illustrate with an
example.

11. Database Normalization


An e-commerce platform stores order data in a single table with repeating groups and partial dependencies:

OrderID CustomerName Product1 Product2 Product3 ProductPrice1 ProductPrice2 ProductPrice3 CustomerAddress

1 Ali Akbar Laptop Mouse Keyboard 50000 1000 2000 Lodhi Road Lahore

2 Saira Shahid Monitor Printer NULL 15000 8000 NULL Shali Road Karachi

Tasks:
1) Identify the normalization violations (e.g., 1NF, 2NF) in this table.
2) Convert the table into 1NF and 2NF, explaining each step.
3) Discuss one advantage (e.g., reduced redundancy) and one disadvantage (e.g., increased joins) of
normalization.


12. Data Warehousing vs. OLTP


1) Compare OLTP and Data Warehousing systems in terms of:
- Purpose (e.g., real-time transactions vs. historical analysis).
- Data structure (e.g., normalized vs. denormalized).
- Query types (e.g., simple inserts vs. complex aggregations).
2) Why is a data warehouse non-volatile, and how does this differ from OLTP systems?
3) Provide an example where an organization would use both systems (e.g., a retail chain processing sales vs.
analyzing yearly trends).

13. Data Warehouse Schemas


A multinational corporation analyzes sales, inventory, and supplier data.
Tasks:
1) Compare Star Schema, Snowflake Schema, and Galaxy Schema in terms of:
- Structure (e.g., fact tables, dimension tables).
- Performance implications (e.g., query speed vs. storage efficiency).
- Ideal use cases (e.g., departmental reporting vs. enterprise-wide analysis).
2) Design a Star Schema for the sales data, identifying the fact table and at least three dimension tables.
3) When would the company prefer a Galaxy Schema over a Star Schema? Justify your answer.

14. Designing a Scalable and Secure IT Architecture


Imagine you are an IT architect tasked with designing the systems architecture for a growing online retail
company that expects a significant increase in customer traffic during seasonal sales events. Explain how you
would structure the architecture to ensure scalability, flexibility, and security. Discuss the role of hardware,
software, networks, and storage in your design, and describe how you would incorporate at least two best
practices (e.g., modular design, redundancy, or security by design) to address the company’s needs. Provide
specific examples of technologies or approaches you might use, and explain how they would help the company
handle peak demand while protecting customer data.

15. The OSI Model as a Framework


Describe the OSI (Open Systems Interconnection) Model and explain the function of each of its seven layers. How
does this layered approach promote standardization, interoperability, and efficient troubleshooting in
networked systems?

16. Types of ERP Systems and Their Differences


What are the three main types of ERP systems, and how do they differ in terms of deployment, cost, and control?

17. ERP Implementation Challenges and Mitigation Strategies


Describe the key challenges organizations face during ERP implementation and suggest one strategy to overcome
each challenge.

18. Synergy of AI and IoT in Everyday Solutions


Explore how Artificial Intelligence (AI) and the Internet of Things (IoT) can work together to create innovative
solutions for everyday use. Explain how AI’s ability to analyze data and make decisions could enhance IoT’s
interconnectivity and data collection features. Provide two examples of potential applications that could benefit
diverse users, such as smart environments or personal assistants, and describe the key features of each
technology that enable these solutions. What challenges might arise when integrating AI and IoT, and how would
you address them to ensure effective functionality?


19. Blockchain and 5G for Data Management


Compare the roles of Blockchain Technology and 5G Technology in improving data management and
communication in a technology-driven world. Discuss how Blockchain’s decentralization and security features
contrast with 5G’s high-speed, low-latency capabilities. Provide examples of how each could be applied to
enhance data integrity or real-time access, such as secure record-keeping or rapid data sharing. Which
technology do you believe has a greater impact on future data systems, and why? What are two implementation
challenges for your chosen technology, and how would you mitigate them?

20. Comparing AR and VR for User Engagement in Digital Spaces


Compare the roles of Augmented Reality (AR) and Virtual Reality (VR) in enhancing user engagement within
digital environments. Discuss how AR’s integration with the physical world contrasts with VR’s creation of
isolated virtual experiences, focusing on their strengths in fostering interaction and immersion. Provide
examples of how each could be applied, such as real-time visualizations or virtual explorations, and explain which
you believe is more effective for sustaining long-term user interest, and why. What are two challenges in
deploying your chosen technology, and what strategies would you use to overcome them?

21. AI Strategy for Financial Services


GlobalBank is implementing AI across three divisions:
1) Retail Banking: Personalized customer recommendations
2) Risk Management: Real-time fraud detection
3) Operations: Automated document processing
Tasks:
a) For each division:
• Specify whether Analytical, Predictive, or Generative AI is most appropriate
• Justify your choice with two business requirements it addresses
• Name one specific algorithm that could be used (e.g., CNN, Random Forest, GPT-3)
b) The CTO insists on using only off-the-shelf AI solutions. As the AI lead:
• List three advantages and two limitations of this approach for GlobalBank

22. Robotic Process Automation (RPA) vs. Artificial Intelligence (AI)


A manufacturing company is exploring automation technologies to improve efficiency in production and supply
chain management. The management is evaluating Robotic Process Automation (RPA) and Artificial Intelligence
(AI) to streamline operations.
Tasks:
1) Define RPA and AI and explain their fundamental differences in how they process tasks.
2) Compare RPA and AI across three key aspects:
- Task Complexity: What types of tasks can each technology handle in a manufacturing setup?
- Decision-Making Ability: How do they differ in handling rule-based vs. cognitive tasks?
- Learning & Adaptability: Can they improve over time without human intervention?
3) Identify one real-world use case for RPA and one for AI in manufacturing, explaining why each technology is
best suited for the chosen task.
4) Hybrid Approach: Explain how RPA and AI can be combined for a more powerful automation solution.
Provide an example of how they might work together in a manufacturing facility.


23. Cloud Computing Characteristics


Explain the key characteristics of cloud computing and discuss how each feature benefits businesses. Provide
real-world examples to illustrate your answer.

24. Cloud Service Models


Compare the three primary cloud service models (IaaS, PaaS, SaaS) in terms of control, flexibility, management
responsibilities, and use cases. Which model would be most suitable for a software development startup, and
why?

25. Blockchain Fundamentals


Explain the role of hashing and consensus mechanisms in ensuring the security and immutability of a blockchain.
Provide an example of how altering a transaction in a block would affect the blockchain, and describe how a
specific consensus mechanism (e.g., Proof of Work or Proof of Stake) prevents such tampering.

26. Blockchain and Fintech Synergy


Decentralized Finance (DeFi) is described as a key synergy between blockchain and fintech. Define DeFi and
explain how it differs from traditional financial systems. Discuss one advantage and one potential challenge of
using DeFi platforms for lending and borrowing cryptocurrencies.

27. Digital Disruption on Accounting Profession


Discuss how digital disruption has reshaped the accounting profession, focusing on changes in roles, skills, and
tools. Contrast these changes with traditional accounting practices.

28. Understanding Risk Management


Define risk management and explain its importance in protecting an organization’s digital assets. Select two key
stages of the risk management process, i.e., risk identification and risk mitigation, and describe how they
contribute to maintaining operational resilience. Provide a hypothetical example of how an organization might
apply these stages to address a potential IT-related threat.

29. Types of IT Risks and Mitigation


Compare two distinct categories of IT risks (e.g., physical risks vs. digital risks) in terms of their sources, potential
impacts on an organization, and effective mitigation strategies. For each category, provide one specific mitigation
strategy and explain how it reduces the risk’s likelihood or impact.

30. Risk Treatment Strategies


Explain the various risk treatment strategies available to organizations, including mitigation, avoidance, transfer,
and acceptance. How should an organization determine which strategy to adopt based on its risk appetite, impact
assessment, and regulatory environment? Illustrate your answer with practical scenarios such as cybersecurity
threats or third-party risks.

31. Foundational Security Measures


Explain the concept of defense-in-depth in IT security, highlighting key foundational security measures such as
access control, encryption, and incident response. How do these layers work together to enhance organizational
security?

32. Role of Advanced and Infrastructure-level Technologies in IT security


Discuss the role of advanced and infrastructure-level technologies in modern IT security. How do innovations
like Artificial Intelligence (AI), Zero Trust Architecture, and endpoint security tools contribute to a
comprehensive cybersecurity strategy?


33. Cybersecurity Threats and Defense Strategies


Identify two major cybersecurity threats that organizations face in today’s digital landscape (e.g., malware,
phishing) and describe how they exploit vulnerabilities to compromise systems or data. For each threat, propose
one preventive measure and one detection method, explaining how these strategies work together to enhance
security. Discuss how failing to address these threats could affect an organization’s reputation or operations.

34. Emerging Cybersecurity Risks in a Tech-Driven Organization


TechNova Solutions is a rapidly growing technology firm that has recently expanded its operations to leverage
cutting-edge innovations. The company has implemented Artificial Intelligence (AI) to enhance customer
support through chatbots, deployed Internet of Things (IoT) devices to monitor its smart office infrastructure,
and migrated its data storage and processing to a cloud computing platform to improve scalability. While these
technologies have boosted efficiency and customer satisfaction, TechNova’s IT team has noticed an uptick in
security incidents, including unauthorized access attempts and data exposure risks. The leadership is concerned
about the emerging cybersecurity threats these technologies introduce and seeks a comprehensive risk
management strategy to protect the organization’s systems, data, and reputation.
Tasks:
a) AI Risks and Mitigation: Analyze two specific cybersecurity risks associated with TechNova Solutions’ use of
Artificial Intelligence (AI) for its chatbot system. For each risk, suggest one mitigation measure and explain
how it addresses the risk to enhance TechNova’s security.
b) IoT Risks and Mitigation: Identify two specific cybersecurity risks linked to TechNova Solutions’ deployment
of Internet of Things (IoT) devices in its smart office infrastructure. For each risk, propose one mitigation
measure and describe how it reduces the risk to strengthen the company’s cybersecurity posture.
c) Cloud Computing Risks and Mitigation: Examine two specific cybersecurity risks arising from TechNova
Solutions’ migration to a cloud computing platform for data storage and processing. For each risk,
recommend one mitigation measure and explain how it mitigates the risk to improve overall security.
d) Strategic Impact: Discuss how proactive management of these AI, IoT, and cloud computing risks could
position TechNova Solutions as a leader in its industry, considering factors such as customer trust,
operational resilience, and competitive advantage.

35. Objectives and Importance of IT Controls


Define IT General Controls (ITGCs) and explain their role in managing risks within an organization’s IT
environment. For two key objectives of ITGCs, i.e., ensuring data integrity and maintaining system availability,
describe how achieving these objectives helps mitigate specific IT-related risks.

36. IT General Controls and Information and Communication Technology strategies


A financial institution faces rising cybersecurity risks. Recommend ITGCs and ICT strategies to mitigate these
risks, justifying your choices.

37. National Cyber Security Policy (NCSP) 2021


The NCSP 2021 establishes a three-tier governance structure to strengthen Pakistan’s cybersecurity posture.
Identify the three levels of this structure and describe the primary function of one key entity at the national level.

38. Prevention of Electronic Crimes Act (PECA) 2016


PECA 2016 criminalizes specific cyber activities to protect critical infrastructure and individual rights. You are
required to define "cyberterrorism" under PECA and provide one example of an act that would qualify as
cyberterrorism.

39. Electronic Transactions Ordinance (ETO) 2002


The ETO 2002 provides legal validity to electronic transactions and signatures. What is the role of a Certification
Service Provider (CSP) under the ETO, and how does it ensure the authenticity of electronic signatures?



LONG-FORM ANSWERS

1 Data Types and Their Applications


In the healthcare industry, various data types play critical roles in patient care, operational efficiency, and
medical research. Understanding these distinctions enables healthcare providers to make better-informed
decisions and improve outcomes.
1) Qualitative Data:
• Nominal Data: Represents categories without order.
o Example: Patient blood types (A, B, AB, O) or gender (Male, Female, Other).
o Use: Ensures correct blood transfusions and personalized treatment plans.
• Ordinal Data: Categories with meaningful order but non-uniform intervals.
o Example: Pain scales (1-10) or tumor stages (Stage I to IV).
o Use: Prioritizes emergency cases and tracks disease progression.
2) Quantitative Data:
• Discrete Data: Countable whole numbers.
o Example: Number of patients admitted daily or medications prescribed.
o Use: Resource allocation (staffing, inventory) and capacity planning.
• Continuous Data: Measurable values within a range.
o Example: Body temperature (98.6°F), blood pressure (120/80 mmHg), or lab results
(glucose levels).
o Use: Monitors patient vitals and adjusts treatments in real-time.
3) Structured Data:
• Example: Electronic Health Records (EHRs) with fields like patient ID, diagnosis codes (ICD-10),
and treatment dates.
• Use: Enables quick retrieval of patient history and insurance billing automation.
4) Unstructured Data:
• Example: Doctor’s clinical notes (text), MRI images, or patient voice recordings.
• Use: Natural Language Processing (NLP) extracts insights from notes; AI analyzes images for
anomalies.
5) Semi-Structured Data:
• Example: Lab reports in JSON format or wearable device data (heart rate, steps) with timestamps.
• Use: Integrates IoT device data with EHRs for holistic patient monitoring.
Impact on Decision-Making:
• Precision Medicine: Combining structured (genomic data) and unstructured (research papers) data to
tailor treatments.
• Operational Efficiency: Discrete data (patient counts) optimizes staff schedules; continuous data
(equipment usage) predicts maintenance needs.
• Research: Ordinal data (clinical trial phases) and qualitative feedback (patient surveys) improve drug
development.


Conclusion:
Classifying healthcare data correctly ensures accurate analytics, reduces errors (e.g., mismatched blood
types), and enhances predictive capabilities (e.g., outbreak detection). Structured data streamlines
operations, while unstructured/semi-structured data unlocks advanced AI applications, ultimately saving
lives and cutting costs.
2 Data Governance Framework and Levels
a) Key Activities at Each Level
1) Strategic Level
• Key Activities:
o Define the vision and objectives for data governance.
o Establish a Data Governance Council with C-level executives to align data policies with
business goals.
o Develop high-level policies on data security, privacy, and quality standards.
o Allocate budget and resources for governance initiatives.
2) Tactical Level
• Key Activities:
o Design data stewardship roles (e.g., assigning Data Owners for customer, product, and
transaction data).
o Implement data standards (e.g., naming conventions, metadata tagging for product
catalogs).
o Create compliance workflows (e.g., automated data retention policies for user records).
o Conduct training programs for teams on data governance best practices.
3) Operational Level
• Key Activities:
o Execute daily data quality checks (e.g., detecting duplicate product listings).
o Enforce access controls (e.g., role-based permissions for customer PII).
o Monitor data breaches via SIEM tools and respond to incidents.
o Maintain metadata repositories to track data lineage (e.g., tracing price changes across
regions).
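The role-based permissions enforced at the operational level can be sketched in a few lines of Python (the role names and data fields are illustrative, not a production access-control design):

```python
# Each role is granted an explicit set of customer-data fields; anything
# not listed (e.g., PII such as email) is denied by default.
ROLE_PERMISSIONS = {
    "support_agent": {"name", "order_history"},
    "data_analyst": {"order_history"},
    "compliance_officer": {"name", "email", "order_history"},
}

def can_access(role: str, field: str) -> bool:
    """Return True only if the role is explicitly granted the field."""
    return field in ROLE_PERMISSIONS.get(role, set())

print(can_access("data_analyst", "email"))        # False: PII withheld
print(can_access("compliance_officer", "email"))  # True
```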
b) Challenges & Solutions

• Strategic: Lack of executive buy-in due to unclear ROI. Solution: Present case studies showing cost savings from reduced data breaches.
• Tactical: Resistance from teams accustomed to siloed data. Solution: Pilot governance in one department (e.g., product data), showcase efficiency gains, then scale.
• Operational: Overwhelming volume of data quality alerts. Solution: Use AI-driven tools to prioritize critical issues (e.g., false product pricing errors over minor metadata gaps).


c) How Levels Work Together


• Strategic → Tactical: The council’s vision (e.g., "Unified customer data") guides tactical teams to
build a master data management (MDM) system.
• Tactical → Operational: Data stewards define quality rules, enabling ops teams to flag
inconsistencies in real-time (e.g., mismatched currency conversions).
• Operational → Strategic: Incident reports inform policy updates at the strategic level.
Synergy Example:
• Strategic sets a goal to "Improve data-driven personalization."
• Tactical deploys a customer 360° platform with standardized tags.
• Operational cleanses real-time clickstream data, feeding accurate analytics for targeted ads.
Outcome: Alignment ensures compliance, reduces costs, and enhances customer trust—key to e-commerce
success.
3 Role of Data Owners and Data Stewards
Data Owners and Data Stewards are two foundational roles in a data governance framework. While they
operate at different levels of the organization, their functions are complementary and essential to the
effective management of data.
Role of Data Owners:
Data Owners are typically senior-level individuals or business leaders who are accountable for specific
data domains such as customer data, financial data, or employee data. Their responsibilities include:
• Defining how data should be used in alignment with business goals.
• Making strategic decisions regarding data access, sharing, and classification.
• Ensuring that policies and compliance requirements are met within their data domain.
• Assigning and overseeing the work of Data Stewards who manage the operational aspects of the data.
Role of Data Stewards:
Data Stewards are operational custodians of the data who ensure that the data is:
• Accurate, consistent, and well-documented.
• Properly formatted, validated, and cleansed.
• Handled according to the organization's data standards, procedures, and quality expectations.
• Maintained in a way that supports ongoing business operations.
Complementary Responsibilities:
While the Data Owner is responsible for the strategic oversight and decision-making, the Data Steward
ensures the practical, day-to-day enforcement of those decisions. For instance, if a Data Owner defines access
policies, the Data Steward ensures that those policies are implemented and monitored.
Why Both Roles Are Critical:
• Ensuring Data Integrity: Together, they reduce the risk of data errors, duplication, and inconsistency.
• Maintaining Operational Efficiency: Clear roles streamline data handling processes, improve
workflow, and reduce delays caused by data issues.
• Supporting Compliance: By maintaining clear ownership and quality control, both roles contribute
to meeting regulatory and business requirements.
Without either role, data governance becomes fragmented—policies may exist without enforcement, or data
may be maintained without strategic alignment. Their synergy creates a robust foundation for trustworthy,
high-quality data.


4 Data Lifecycle Management


Data Lifecycle Management (DLM) is a critical element of data governance that ensures data is managed
systematically and securely from its creation to its final disposal. It involves applying governance controls
across five key phases:
1) Creation/Acquisition:
This marks the beginning of the data lifecycle.
• What Happens: Data is either generated internally (e.g., via transactions, applications) or
acquired from external sources (e.g., vendors, customers).
• Governance Focus:
o Accuracy checks at point of entry.
o Classification of data (e.g., confidential, public).
o Assignment of ownership and initial documentation.
• Impact: Reduces data errors and ensures proper security from the start.
2) Storage:
The data is stored in databases, data lakes, file systems, or cloud environments.
• Governance Focus:
o Access controls to restrict unauthorized usage.
o Data encryption and backup policies.
o Storage optimization to avoid redundancy.
• Impact: Enhances security, reduces risk of data loss, and supports disaster recovery.
3) Usage:
The most active phase where data is processed, analyzed, and shared.
• Governance Focus:
o User access tracking and audit trails.
o Data cleansing and validation routines.
o Monitoring for ethical and compliant usage.
• Impact: Ensures decisions are based on accurate, timely, and authorized data.
4) Archiving:
Inactive but valuable data is moved to long-term storage.
• Governance Focus:
o Use of read-only storage with retention policies.
o Ensuring accessibility and security for audits or legal inquiries.
o Compliance with regulatory retention periods.
• Impact: Reduces storage costs while preserving institutional and legal records.
5) Deletion/Disposal:
Data that is no longer needed is permanently destroyed.
• Governance Focus:
o Use of secure deletion techniques (e.g., data wiping, cryptographic erasure).
o Documentation of destruction for audit purposes.
o Compliance with privacy regulations.
• Impact: Minimizes legal liability, prevents unauthorized recovery, and ensures compliance with
data protection laws.
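The archiving and disposal phases above can be sketched as a simple retention check (the two-year archive threshold and seven-year retention period are illustrative; actual periods come from the applicable regulations):

```python
from datetime import date, timedelta

# Illustrative governance thresholds for the later lifecycle phases
RETENTION = timedelta(days=7 * 365)      # after this, data must be disposed of
ARCHIVE_AFTER = timedelta(days=2 * 365)  # after this, data moves to archive

def lifecycle_action(created: date, today: date) -> str:
    """Decide the governance action for a record based on its age."""
    age = today - created
    if age >= RETENTION:
        return "dispose"   # secure deletion with documented destruction
    if age >= ARCHIVE_AFTER:
        return "archive"   # read-only long-term storage
    return "retain"        # active storage and usage

print(lifecycle_action(date(2015, 1, 1), date(2025, 1, 1)))  # dispose
```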


5 Data Sensitivity Levels


The four levels of sensitivity-based data classification help organizations manage and protect data according
to the risk associated with unauthorized disclosure or misuse. These levels are:
1) Public Data:
• Description: Information that is safe for public access and does not pose any risk if disclosed.
• Example: Company brochures, press releases, and published annual reports.
• Importance: While it doesn't require strong protection, it must still be managed to ensure
accuracy and consistency with the organization's messaging.
2) Internal Data:
• Description: Intended only for use within the organization. While not highly sensitive, its
exposure could cause minor issues or confusion.
• Example: Internal emails, employee handbooks, standard operating procedures.
• Importance: Helps in operational efficiency and coordination but should be shielded from
external audiences to avoid misinterpretation or competitive exposure.
3) Confidential Data:
• Description: Information that could result in harm to the organization if exposed. Requires
restricted access and protection mechanisms.
• Example: Financial records, business strategies, employee salary data.
• Importance: Unauthorized disclosure could lead to financial losses, loss of competitive edge, or
reputational harm.
4) Restricted Data:
• Description: The highest level of sensitivity. Involves legal or contractual obligations and could
cause severe damage if exposed.
• Example: Personally Identifiable Information (PII), health records, legal contracts, trade secrets.
• Importance: Mishandling can result in legal penalties, regulatory breaches, or loss of trust.
Importance of Classification:
Differentiating between these levels enables organizations to apply appropriate security controls, ensure
regulatory compliance, reduce risk, and allocate resources effectively. A clear classification scheme also
supports consistent access control and governance policies, enabling both protection and efficient use of
data.
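The practical payoff of an ordered classification scheme can be sketched in code. The following illustrative Python snippet is not from the text: the clearance model and the idea of comparing levels by rank are assumptions, shown only to demonstrate how a sensitivity hierarchy supports a simple access-control check.

```python
# Illustrative sketch: an ordered list of sensitivity levels lets an
# access check compare a user's clearance against a data item's level.
# The clearance model itself is a hypothetical example.

SENSITIVITY_ORDER = ["public", "internal", "confidential", "restricted"]

def can_access(user_clearance: str, data_level: str) -> bool:
    """A user may read data at or below their clearance level."""
    return (SENSITIVITY_ORDER.index(user_clearance)
            >= SENSITIVITY_ORDER.index(data_level))

print(can_access("internal", "public"))        # True
print(can_access("internal", "confidential"))  # False
```

In practice such checks sit inside governance tooling rather than application code, but the ordering principle is the same.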
6 Comparative Analysis of Data Classification Models
Organizations may adopt one or more data classification models depending on their needs, industry, and
regulatory environment. Here's a comparison:
1) Sensitivity-Based Classification Model:
• Basis: Degree of sensitivity and potential harm from disclosure.
• Examples of Use: Common in corporate environments to manage operational data such as
financials, HR data, and intellectual property.
• Best For: General-purpose security and risk management.
2) Role-Based Classification Model:
• Basis: User roles and responsibilities; access is granted on a need-to-know basis.
• Examples of Use: Department-specific data such as marketing strategies (accessible to
marketing) or legal contracts (legal department).
• Best For: Internal data governance and access control frameworks (e.g., in ERP or CRM systems).


3) Compliance-Based Classification Model:


• Basis: Legal, regulatory, or contractual obligations.
• Examples of Use: Handling PII under GDPR, PHI under HIPAA, or payment data under PCI-DSS.
• Best For: Regulated industries like finance, healthcare, and e-commerce.
4) Lifecycle-Based Classification Model:
• Basis: Stage of the data in its lifecycle — from creation to deletion.
• Examples of Use: Draft versions marked as "Temporary", archived audit logs, or data flagged for
secure deletion.
• Best For: Organizations focused on data lifecycle management, archival policies, and retention
schedules.
5) Government Classification Model:
• Basis: National security and government operational sensitivity.
• Examples of Use: Defense documents labeled "Top Secret" or "Confidential."
• Best For: Public sector, military, or organizations handling classified government data.
Each classification model serves a unique purpose. For example, a multinational bank may adopt
compliance-based classification for regulatory needs, sensitivity-based for internal security, and role-based
access control across departments. A government agency would prioritize the government model, while a
data-driven tech company might benefit from a lifecycle-based model to manage vast amounts of evolving
information efficiently.
7 Data Analytics Cycle in Practice
Let’s consider a retail chain analyzing customer purchasing trends to optimize inventory management and
improve customer satisfaction. Below is how each stage of the data analytics cycle applies to this scenario:
1) Data Collection
• Objective: Gather customer purchase data from multiple sources.
• Data Sources:
o Point-of-Sale (POS) systems (transaction data)
o Online sales records (e-commerce platform)
o Customer loyalty programs
o External market data (seasonal trends, competitor pricing)
• Tools & Techniques: SQL databases, API integrations, web scraping
Challenge: Data inconsistency across different sources.
Solution: Standardized data collection formats and automated extraction pipelines.
2) Data Cleaning
• Objective: Remove errors, handle missing values, and ensure consistency.
• Techniques:
o Handling missing values (imputation techniques)
o Standardizing date formats, currency, and product codes
o Removing duplicate transactions
• Tools: Python, Power Query
Challenge: Large datasets may have inconsistencies or errors.
Solution: Implement automated cleaning scripts and validation rules.
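The cleaning steps above can be sketched in plain Python. The field names, date formats, and values below are illustrative assumptions, not data from the scenario.

```python
# A minimal cleaning sketch using only the standard library:
# deduplicate transactions, standardize mixed date formats, and
# normalize amounts. Missing amounts are kept as None for later imputation.
from datetime import datetime

raw = [
    {"order_id": 1, "date": "01/03/2024", "amount": "1,500"},
    {"order_id": 1, "date": "01/03/2024", "amount": "1,500"},  # duplicate
    {"order_id": 2, "date": "2024-03-05", "amount": None},     # missing value
]

def parse_date(value):
    """Standardize mixed date formats to ISO 8601."""
    for fmt in ("%d/%m/%Y", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date: {value}")

seen, clean = set(), []
for row in raw:
    if row["order_id"] in seen:    # remove duplicate transactions
        continue
    seen.add(row["order_id"])
    clean.append({
        "order_id": row["order_id"],
        "date": parse_date(row["date"]),
        "amount": float(row["amount"].replace(",", "")) if row["amount"] else None,
    })

print(clean)  # two rows remain, dates in ISO format
```

A production pipeline would add validation rules and logging, but the pattern of standardize-then-deduplicate is the core of the stage.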


3) Data Exploration (EDA - Exploratory Data Analysis)


• Objective: Identify patterns, trends, and anomalies in purchasing behavior.
• Techniques:
o Summary statistics (mean purchase value, top-selling items)
o Visualizations (sales heatmaps, histograms)
o Correlation analysis (product category relationships)
• Tools: Python, Power BI, Tableau
Challenge: Unexpected outliers in sales data.
Solution: Investigate outliers and verify data accuracy before proceeding.
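A minimal sketch of the exploration step, using only the Python standard library on made-up sales records, shows how summary statistics like mean purchase value and top-selling items are computed:

```python
# Toy EDA sketch: the sales records are invented for illustration.
from statistics import mean, median
from collections import Counter

sales = [
    ("Laptop", 50000), ("Mouse", 1000), ("Laptop", 50000),
    ("Keyboard", 2000), ("Mouse", 1000), ("Mouse", 1000),
]

amounts = [amt for _, amt in sales]
print("mean purchase value:", mean(amounts))      # 17500
print("median purchase value:", median(amounts))  # 1500
print("top-selling item:", Counter(p for p, _ in sales).most_common(1))
```

The large gap between mean and median already signals outliers worth investigating, which is exactly the kind of anomaly this stage exists to surface.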
4) Data Modeling & Analysis
• Objective: Use machine learning or statistical models to predict customer demand.
• Techniques:
o Time series forecasting
o Market basket analysis
o Customer segmentation
• Tools: Python, R, Power BI
Challenge: Overfitting in predictive models due to noise in data.
Solution: Cross-validation and feature selection to refine models.
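Market basket analysis ultimately rests on counting which products co-occur in the same transaction. The toy sketch below shows only that counting step (the baskets are invented); a full analysis would go on to compute support, confidence, and lift.

```python
# Toy market-basket sketch: count product pairs that appear together.
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "milk"},
]

pair_counts = Counter()
for basket in transactions:
    # sort so each pair has a canonical order before counting
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(1))  # ('bread', 'butter') co-occurs twice
```

Pairs with high co-occurrence are natural candidates for the bundling promotions mentioned in the insights stage.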
5) Interpretation & Insights Generation
• Objective: Translate model outputs into actionable business insights.
• Examples of Insights:
o Peak sales occur on weekends → Adjust staffing and stock accordingly.
o High correlation between certain product purchases → Bundle them for promotions.
o Declining sales in a region → Investigate local market conditions.
• Tools: Excel, Power BI dashboards
Challenge: Non-technical stakeholders may struggle with complex analytics.
Solution: Use intuitive dashboards and visual storytelling to present findings.
6) Decision-Making & Implementation
• Objective: Use insights to optimize inventory management and marketing strategies.
• Actions Taken:
o Adjust reorder levels for fast-moving products.
o Personalize customer offers based on buying behavior.
o Allocate marketing budgets to high-revenue product categories.
• Tools: CRM systems, ERP software, automated inventory systems
Challenge: Resistance to data-driven decision-making.
Solution: Training for decision-makers on data-driven strategies.


7) Monitoring & Continuous Improvement


• Objective: Track the impact of implemented changes and refine strategies.
• Metrics Monitored:
o Sales growth
o Customer retention rates
o Stockout and overstock levels
• Tools: Real-time dashboards, automated alert systems
Challenge: Business environments change rapidly (e.g., competitor actions, market trends).
Solution: Regularly update models and dashboards with fresh data.
Conclusion:
By following the data analytics cycle, the retail chain can predict customer demand, optimize stock
levels, increase revenue, and enhance customer satisfaction. The key to success is continuous
monitoring, ensuring that strategies remain aligned with real-world conditions.
8 Comparing Data Analytics Stages
To address supply chain inefficiencies, the retail company will apply data analytics in four progressive stages,
transforming raw data into actionable strategies.
1) Descriptive Analytics – Understanding the Past
Objective: Identify key trends in supply chain performance.
Application in Scenario: The company collects and analyzes past sales data, inventory levels, and
delivery times to understand what has happened.
Techniques Used:
• Data visualization tools such as Power BI or Tableau to create dashboards showing stockouts,
delayed shipments, and high-return products.
• Summary statistics like mean delivery time, average stock levels, and sales trends.
Insights:
• Sales data from the past year shows frequent stockouts for high-demand products.
• Certain regions consistently experience delayed deliveries.
How It Helps: This stage provides a baseline understanding of supply chain inefficiencies.
Potential Drawback: It only describes what happened but does not explain why.
2) Diagnostic Analytics – Identifying Root Causes
Objective: Determine why stockouts and delays occur.
Application in Scenario:
The company examines warehouse operations, supplier performance, and seasonal trends to find
correlations between inefficiencies and root causes.
Techniques Used:
• Drill-down analysis to break down data by supplier, region, and product category.
• Correlation analysis to identify relationships between demand surges and supplier delays.
Insights:
• Stockouts are more common during holiday seasons due to slow supplier response times.
• Late deliveries occur mainly due to congestion at distribution centers.
How It Helps: By diagnosing inefficiencies, the company can explore corrective actions.
Potential Drawback: Correlation does not always imply causation, so further validation is needed.
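The correlation check described above can be sketched from first principles with Pearson's r. The weekly figures are invented, and as the drawback notes, a high coefficient on its own does not establish causation.

```python
# Illustrative correlation sketch: do supplier delays rise with demand?
# The weekly figures are made up for demonstration.
from math import sqrt

demand = [100, 120, 180, 300, 310]   # units ordered per week
delay  = [1.0, 1.2, 2.0, 3.5, 3.6]   # average supplier delay (days)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(demand, delay)
print(round(r, 2))  # close to 1: delays rise with demand in this sample,
                    # but correlation alone does not prove causation
```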


3) Predictive Analytics – Forecasting Future Trends


Objective: Anticipate future supply chain disruptions.
Application in Scenario:
The company builds predictive models to estimate demand, supplier lead times, and potential
stockouts.
Techniques Used:
• Time series forecasting methods to predict demand spikes.
• Machine learning models like Random Forest or Regression to estimate supply chain bottlenecks.
Insights:
• The model predicts an increase in demand for certain products during the holiday season.
• Specific suppliers are likely to miss deadlines based on historical delays.
How It Helps: The company can proactively adjust inventory levels and coordinate with suppliers.
Potential Drawback: Predictions depend on data quality, and unforeseen events such as a pandemic
may disrupt accuracy.
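A deliberately simple forecasting sketch: a 3-month moving average stands in here for the richer time-series models named above, and the monthly figures are illustrative.

```python
# Minimal time-series forecast: project next month's demand as the mean
# of the last `window` observations. Production forecasting would use
# richer models (seasonality, trend, exogenous factors).

monthly_demand = [120, 130, 125, 160, 170, 180]  # units per month

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` values."""
    recent = series[-window:]
    return sum(recent) / len(recent)

print(moving_average_forecast(monthly_demand))  # (160+170+180)/3 = 170.0
```

Even this naive model illustrates the predictive stage's dependence on data quality: a single erroneous spike in the input shifts the forecast directly.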
4) Prescriptive Analytics – Recommending Best Actions
Objective: Suggest optimal strategies to prevent supply chain inefficiencies.
Application in Scenario:
Using prescriptive models, the company determines the best way to allocate inventory, reroute
shipments, and negotiate supplier contracts.
Techniques Used:
• Optimization algorithms to identify the best supplier mix and inventory restocking strategies.
• Scenario analysis to simulate different supplier responses and adjust logistics accordingly.
Insights:
• The company can pre-stock warehouses with high-demand products before peak season.
• Switching to faster suppliers at slightly higher unit costs reduces overall delivery delays, and the savings from avoided lost sales more than offset the extra cost.
How It Helps: Instead of reacting to problems, the company takes preventive action for optimal supply
chain performance.
Potential Drawback: Implementing prescriptive models can be complex and require high
computational power.
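The supplier trade-off can be illustrated with a toy expected-cost comparison. All costs, probabilities, and the lost-sales penalty model below are assumptions for illustration, not figures from the scenario.

```python
# Toy prescriptive sketch: pick the supplier minimizing expected total
# cost = purchase cost + (probability of late delivery x assumed penalty).

suppliers = {
    "current": {"unit_cost": 100, "late_prob": 0.30},
    "faster":  {"unit_cost": 105, "late_prob": 0.05},
}
UNITS = 1_000
LOST_SALES_PENALTY = 40_000  # assumed cost of one late delivery

def expected_total_cost(s):
    return s["unit_cost"] * UNITS + s["late_prob"] * LOST_SALES_PENALTY

best = min(suppliers, key=lambda name: expected_total_cost(suppliers[name]))
print(best)  # "faster": 105,000 + 2,000 beats 100,000 + 12,000
```

Real prescriptive models optimize over many more variables simultaneously, which is where the computational complexity mentioned above comes from.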
9 Five Vs of Big Data
Big Data refers to the vast amounts of structured and unstructured data generated at high velocity from
various sources such as social media, IoT devices, transaction records, and more. It requires advanced
technologies and techniques to capture, store, process, and analyze in order to derive valuable insights for
decision-making.
The 5 Vs of Big Data are key characteristics used to define and understand the challenges and opportunities
associated with Big Data:
1) Volume:
• SmartRetail collects enormous amounts of data from millions of customers, including
transactions, clicks, and social media interactions.
• The company needs scalable storage solutions to handle the growing volume of data, ensuring
that the data is stored securely and can be accessed efficiently for analysis.


2) Variety:
• The data generated by SmartRetail includes structured data (transaction records, customer
profiles), semi-structured data (clickstream data), and unstructured data (customer reviews,
social media posts).
• By integrating these diverse data types, SmartRetail can get a more comprehensive
understanding of customer preferences and market trends.
3) Velocity:
• Data is generated in real-time, such as customer interactions on the website or mobile app, and
must be processed quickly to deliver personalized recommendations or detect fraudulent
activity.
• Real-time analytics help SmartRetail respond to changing customer demands instantly and
prevent fraud by detecting suspicious transactions as they happen.
4) Veracity:
• Ensuring the accuracy and trustworthiness of data is crucial for SmartRetail. For instance,
fraudulent product reviews or inaccurate sales data could lead to wrong business decisions.
• SmartRetail needs to apply data cleansing techniques to filter out unreliable or inconsistent data,
ensuring that decisions are based on reliable information.
5) Value:
• The ultimate goal of Big Data is to derive valuable insights that drive business decisions.
• For SmartRetail, this could mean identifying customer buying patterns, optimizing inventory
based on demand predictions, and enhancing customer satisfaction through targeted marketing
strategies.
By understanding and applying the 5 Vs of Big Data, SmartRetail Inc. can not only manage its vast
datasets efficiently but also leverage the insights for better decision-making and improved business
performance.
10 ACID Properties in DBMS
1) ACID Properties and their Role in Banking
The ACID properties ensure reliable transaction processing in databases, particularly critical for
banking systems where accuracy and consistency are paramount:
• Atomicity:
o Definition: Guarantees that a transaction is treated as a single, indivisible unit. Either all
operations in the transaction are completed, or none are.
o Banking Application: Ensures that a fund transfer (e.g., Rs. 100 from Account A to Account
B) either fully succeeds (both debit and credit occur) or fully fails (no partial updates).
Prevents scenarios where money is deducted but not credited.
• Consistency:
o Definition: Ensures transactions bring the database from one valid state to another,
adhering to predefined rules (e.g., no negative balances).
o Banking Application: If a withdrawal would overdraw an account, the transaction is aborted
to maintain consistency (e.g., rejecting a Rs. 500 withdrawal from an account with Rs. 400).
• Isolation:
o Definition: Ensures concurrent transactions execute as if they were sequential, preventing
interference (e.g., dirty reads or lost updates).
o Banking Application: While one transaction calculates interest on an account, another
transaction cannot modify the same account’s balance until the first completes.


• Durability:
o Definition: Once a transaction is committed, its effects persist even after system failures
(e.g., power outages).
o Banking Application: After a successful transfer, the updated balances are permanently
saved to disk. Even if the system crashes, the transaction remains intact.
2) Example of Atomicity in Fund Transfers
Scenario: Transferring Rs. 100 from Account A (balance: Rs. 500) to Account B (balance: Rs. 200).
• Atomicity in Action:
1) Step 1: Deduct Rs. 100 from Account A (Rs. 500 → Rs. 400).
2) Step 2: Add Rs. 100 to Account B (Rs. 200 → Rs. 300).
• Failure Case: If the system crashes after Step 1 but before Step 2:
o Without Atomicity: Account A loses Rs. 100, but Account B is unchanged (Rs. 400 and Rs.
200), causing inconsistency.
o With Atomicity: The entire transaction is rolled back, reverting Account A to Rs. 500. No
partial update occurs.
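The rollback behaviour can be simulated in a few lines of Python. A real DBMS handles this internally with transaction logs; this is only a sketch of the all-or-nothing idea, with the balances taken from the example above.

```python
# Simulating atomicity: both steps commit together, or the whole
# transfer rolls back to the pre-transaction snapshot.

accounts = {"A": 500, "B": 200}

def transfer(src, dst, amount, fail_midway=False):
    snapshot = dict(accounts)          # state before the transaction
    try:
        accounts[src] -= amount        # Step 1: debit
        if fail_midway:
            raise RuntimeError("system crash after debit")
        accounts[dst] += amount        # Step 2: credit
    except Exception:
        accounts.clear()
        accounts.update(snapshot)      # roll back: no partial update survives

transfer("A", "B", 100, fail_midway=True)
print(accounts)  # {'A': 500, 'B': 200} - crash rolled back cleanly

transfer("A", "B", 100)
print(accounts)  # {'A': 400, 'B': 300} - successful transfer committed
```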
3) Isolation and Concurrent Transactions
Scenario: Two customers, X and Y, share a joint account (balance: Rs. 1,000).
• Transaction 1 (T1): X withdraws Rs. 200.
• Transaction 2 (T2): Y checks the balance simultaneously.
Without Isolation:
• T1 reads Rs. 1,000 and deducts Rs. 200 (new balance: Rs. 800).
• T2 reads the balance while T1 is still in progress and may see either the stale Rs. 1,000 or an uncommitted Rs. 800, reporting an inconsistent balance.
With Isolation:
• T1 locks the account during the transaction.
• T2 is blocked until T1 completes, ensuring Y sees either Rs. 1,000 (before T1) or Rs. 800 (after
T1), never an intermediate value.
Isolation Levels:
• Read Committed: T2 sees only committed changes (avoids dirty reads).
• Serializable: Transactions execute as if sequentially, preventing phantom reads.
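The locking behaviour can be sketched with Python threads. The timing and amounts are illustrative; a database would use row or page locks rather than a single mutex, but the effect on the reader is the same.

```python
# Sketch of isolation: a lock ensures the concurrent reader sees either
# the balance before T1 (1000) or after T1 commits (800), never an
# intermediate value.
import threading

balance = 1000
lock = threading.Lock()
observed = []

def withdraw(amount):
    global balance
    with lock:                 # T1 holds the lock for the whole transaction
        current = balance
        current -= amount      # intermediate work happens inside the lock
        balance = current

def check_balance():
    with lock:                 # T2 blocks until T1 releases the lock
        observed.append(balance)

t1 = threading.Thread(target=withdraw, args=(200,))
t2 = threading.Thread(target=check_balance)
t1.start(); t2.start()
t1.join(); t2.join()

print(observed[0] in (1000, 800))  # True: never an intermediate value
```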
Conclusion:
1) ACID in Banking: Ensures transactions are reliable, consistent, and secure.
2) Atomicity Example: Prevents "half-completed" transfers, safeguarding against financial
discrepancies.
3) Isolation Example: Locks accounts during transactions to prevent concurrent access issues.


11 Database Normalization
1) Normalization Violations
a) Violation of 1NF (First Normal Form)
• Repeating Groups: The table contains multiple columns for similar data (Product1,
Product2, Product3 and ProductPrice1, ProductPrice2, ProductPrice3), which violates
atomicity (each column should contain a single value).
• NULL Values: Some fields (e.g., Product3, ProductPrice3) contain NULL values, indicating
incomplete data.
b) Violation of 2NF (Second Normal Form)
• Partial Dependency: The table has a composite key (assuming OrderID + Product as the
primary key), but non-key attributes like CustomerName and CustomerAddress depend
only on OrderID, not the full key.
2) Converting the Table into 1NF and 2NF
Step 1: Convert to 1NF (Eliminate Repeating Groups)
To satisfy 1NF, we restructure the table so that each row contains only atomic values (no repeating
groups).
New Tables in 1NF:
a) Orders Table (Stores order and customer details)
OrderID | CustomerName | CustomerAddress
1 | Ali Akbar | Lodhi Road Lahore
2 | Saira Shahid | Shali Road Karachi
b) OrderDetails Table (Stores individual products per order)
OrderID | Product | Price
1 | Laptop | 50000
1 | Mouse | 1000
1 | Keyboard | 2000
2 | Monitor | 15000
2 | Printer | 8000
Explanation:
• Each row now has a single product entry.
• NULL values are eliminated since only ordered products are stored.
• The composite key for OrderDetails is (OrderID, Product).
Step 2: Convert to 2NF (Remove Partial Dependencies)
To satisfy 2NF, we ensure that all non-key attributes depend on the entire primary key (not just part
of it).
Issue in 1NF Tables:
• In the Orders table, CustomerName and CustomerAddress depend only on OrderID (not on the
full composite key).


Solution:
a) Orders Table (Stores only order metadata)
OrderID | CustomerID (Foreign Key)
1 | 101
2 | 102
b) Customers Table (Stores customer details)
CustomerID | CustomerName | CustomerAddress
101 | Ali Akbar | Lodhi Road Lahore
102 | Saira Shahid | Shali Road Karachi
c) OrderDetails Table (Remains unchanged from 1NF)
OrderID | Product | Price
1 | Laptop | 50000
1 | Mouse | 1000
1 | Keyboard | 2000
2 | Monitor | 15000
2 | Printer | 8000
Explanation:
• CustomerName and CustomerAddress are moved to a separate Customers table.
• The Orders table now only links to customers via CustomerID.
• OrderDetails remains unchanged since Price depends on both OrderID and Product.
3) Advantages and Disadvantages of Normalization
Advantage: Reduced Redundancy
• Before Normalization: Customer details (e.g., "Ali Akbar, Lodhi Road Lahore") were repeated for
every product in an order.
• After Normalization: Customer details are stored once in the Customers table, referenced
via CustomerID.
• Benefit: Saves storage space and prevents inconsistencies (e.g., updating an address in one
place affects all related orders).
Disadvantage: Increased Joins
• Before Normalization: A single query could retrieve all order data.
• After Normalization: To get complete order details, we must join multiple tables (Orders,
Customers, OrderDetails).
• Drawback: More complex queries and potential performance overhead.
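The join overhead can be demonstrated with SQLite's in-memory database, using the 2NF tables designed above (a self-contained sketch; only the columns needed for the query are populated):

```python
# Demonstrating the 2NF design: retrieving complete order details now
# requires joining three tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Customers (CustomerID INTEGER PRIMARY KEY,
                        CustomerName TEXT, CustomerAddress TEXT);
CREATE TABLE Orders    (OrderID INTEGER PRIMARY KEY,
                        CustomerID INTEGER REFERENCES Customers);
CREATE TABLE OrderDetails (OrderID INTEGER, Product TEXT, Price INTEGER,
                           PRIMARY KEY (OrderID, Product));
INSERT INTO Customers VALUES (101, 'Ali Akbar', 'Lodhi Road Lahore');
INSERT INTO Orders VALUES (1, 101);
INSERT INTO OrderDetails VALUES (1, 'Laptop', 50000), (1, 'Mouse', 1000);
""")

rows = con.execute("""
    SELECT o.OrderID, c.CustomerName, d.Product, d.Price
    FROM Orders o
    JOIN Customers c ON c.CustomerID = o.CustomerID
    JOIN OrderDetails d ON d.OrderID = o.OrderID
    ORDER BY d.Product
""").fetchall()
print(rows)  # [(1, 'Ali Akbar', 'Laptop', 50000), (1, 'Ali Akbar', 'Mouse', 1000)]
```

Note that the customer's name and address live in one place only; the joins are the price paid for that integrity.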
Summary of Normalization Steps
Normal Form | Action Taken | Resulting Tables
1NF | Eliminated repeating groups | Orders, OrderDetails
2NF | Removed partial dependencies | Orders, Customers, OrderDetails
Conclusion:
• Normalization improves data integrity and reduces redundancy but may increase query
complexity.
• The e-commerce platform now has a scalable structure for managing orders efficiently.


12 Data Warehousing vs. OLTP


1) Comparison of OLTP and Data Warehousing Systems
Feature | OLTP Systems | Data Warehousing Systems
Primary Purpose | Real-time transaction processing | Historical data analysis and reporting
Data Structure | Highly normalized (3NF/BCNF) | Partially denormalized (star schema)
Query Types | Simple CRUD operations | Complex analytical queries
Data Volume | Current operational data | Large historical datasets
Performance Focus | Fast writes, transaction throughput | Fast reads, complex aggregations
Users | Front-line staff, applications | Business analysts, executives
Update Frequency | Continuous real-time updates | Periodic batch loads (ETL)
Key Differences Explained:
Purpose:
• OLTP: Designed for day-to-day operations like processing orders, updating inventory, and
recording transactions in real-time.
• Data Warehouse: Optimized for analyzing trends over time, generating reports, and supporting
business decisions.
Data Structure:
• OLTP: Uses normalized tables to minimize redundancy and ensure data integrity during frequent
updates.
• Data Warehouse: Employs denormalized schemas (like star schema) to optimize query
performance for analytical workloads.
Query Types:
• OLTP: Many simple transactions (e.g., checking account balance, updating customer records).
• Data Warehouse: Fewer but more complex queries involving aggregations, joins across years of
data.
2) Non-Volatility in Data Warehouses
Why Data Warehouses are Non-Volatile:
1) Historical Preservation: Once data enters the warehouse, it's never modified or deleted to
maintain a consistent historical record.
2) Decision Consistency: Ensures analytical reports remain comparable over time.
3) Audit Compliance: Provides reliable data trails for regulatory requirements.
Difference from OLTP:
• OLTP Systems: Data is constantly updated (e.g., account balances change with each transaction).
• Data Warehouses: Data is appended but never altered (e.g., daily sales are added but existing
records stay unchanged).
Example: In banking:
• OLTP: Updates your account balance when you make a withdrawal
• Data Warehouse: Preserves all historical balances for trend analysis


3) Real-World Example: Retail Chain


OLTP System Usage:
• Point-of-Sale (POS) Processing:
o Records individual sales transactions in real-time
o Updates inventory levels immediately
o Processes customer payments
Data Warehouse Usage:
• Business Intelligence:
o Analyzes yearly sales trends by product category
o Compares store performance across regions
o Forecasts inventory needs for holiday seasons
How They Work Together:
1) Daily ETL process extracts new sales from OLTP system
2) Transforms data into consistent format (e.g., standardizing product codes)
3) Loads into warehouse's sales_fact and dimension tables
4) Analysts run reports without impacting operational systems
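The four-step flow above can be sketched as an append-only load. The data values and the product-code standardization rule are illustrative assumptions, but the key property shown is real: the warehouse fact table is only ever appended to, never updated.

```python
# Toy daily ETL run: extract new sales from a simulated OLTP source,
# transform product codes to a standard format, and append rows to the
# warehouse fact table - existing history is never modified.

oltp_sales = [
    {"code": "lp-01 ", "qty": 2, "amount": 100000},
    {"code": "MS-02",  "qty": 1, "amount": 1000},
]
sales_fact = [{"code": "LP-01", "qty": 1, "amount": 50000}]  # history so far

def transform(row):
    return {"code": row["code"].strip().upper(),  # standardize product codes
            "qty": row["qty"], "amount": row["amount"]}

sales_fact.extend(transform(r) for r in oltp_sales)  # load: append only

print(len(sales_fact))  # 3 - history preserved, new rows appended
```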
Conclusion:
1) OLTP and data warehouses serve complementary purposes
2) Non-volatility enables reliable historical analysis
3) Modern organizations need both for operational and strategic needs
4) ETL processes bridge the two systems while maintaining separation of concerns
13 Data Warehouse Schemas
1) Comparison of Star Schema, Snowflake Schema, and Galaxy Schema:
Star Schema:
• Structure:
o The Star Schema consists of one central fact table and multiple dimension tables. The fact
table typically stores quantitative data like sales, revenue, or transaction counts. The
dimension tables contain descriptive attributes related to the fact table (e.g., Product, Time,
Customer).
• Performance Implications:
o Query Speed: Star Schema typically offers faster query performance because the fact table
is directly linked to the dimension tables, making joins simpler and quicker.
o Storage Efficiency: Star Schema requires more storage because the dimension tables are
not normalized. This redundancy leads to higher storage needs.
• Ideal Use Cases:
o Departmental Reporting: The Star Schema is ideal for departmental or functional reporting
where fast, straightforward queries are necessary, and denormalized data is acceptable. It's
often used in reporting for specific areas like sales or marketing.


Snowflake Schema:
• Structure:
o Similar to the Star Schema but with normalized dimension tables. The dimension tables are
split into multiple related tables to reduce redundancy. For example, instead of having a
single "Product" dimension table, it may be split into "Product Category," "Product
Subcategory," and "Product Details."
• Performance Implications:
o Query Speed: Due to the normalization of dimension tables, queries often require more
joins, which can result in slower performance compared to a Star Schema.
o Storage Efficiency: The Snowflake Schema is more storage-efficient because of the
normalization of dimension tables, reducing data redundancy.
• Ideal Use Cases:
o Enterprise-Wide Analysis: The Snowflake Schema is ideal for enterprise-wide data marts
or data warehouses where data consistency and efficiency are important, and the
complexity of joins can be handled for more accurate data representation.
Galaxy Schema (also known as Fact Constellation Schema):
• Structure:
o The Galaxy Schema is a more complex design that involves multiple fact tables sharing some
common dimension tables. It's essentially a combination of multiple Star Schemas. For
instance, a company might have a sales fact table and an inventory fact table, both sharing
common dimension tables like "Product," "Time," and "Customer."
• Performance Implications:
o Query Speed: Potentially slower than the Star Schema because of the multiple fact
tables and more complex relationships. Queries might require more joins, especially
when combining data from different fact tables.
o Storage Efficiency: Similar to the Star Schema, the Galaxy Schema can be more storage-
intensive due to the need to store multiple fact tables and their relationships to dimensions.
• Ideal Use Cases:
o Large-Scale Enterprise Reporting: The Galaxy Schema is ideal for complex, multi-faceted
analyses across different business processes. It’s typically used when the business needs to
analyze multiple metrics (e.g., sales and inventory) from a shared set of dimensions across
various departments or business units.
2) Design of a Star Schema for Sales Data
Fact Table:
• Sales_Fact: This table would contain the key metrics (measures) of sales transactions, such as:
o Sales_ID (Primary Key)
o Sales_Amount
o Quantity_Sold
o Discount_Applied
o Total_Revenue
o Product_ID (Foreign Key)
o Time_ID (Foreign Key)
o Customer_ID (Foreign Key)
o Store_ID (Foreign Key)


Dimension Tables:
• Product_Dimension: Contains descriptive attributes about products.
o Product_ID (Primary Key)
o Product_Name
o Product_Category
o Product_Description
• Time_Dimension: Contains time-related attributes.
o Time_ID (Primary Key)
o Date
o Month
o Quarter
o Year
o Weekday
• Customer_Dimension: Contains information about customers.
o Customer_ID (Primary Key)
o Customer_Name
o Customer_Location
o Customer_Segment
• Store_Dimension: Contains information about store locations.
o Store_ID (Primary Key)
o Store_Name
o Store_Location
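The design above can be prototyped in SQLite. The sketch below trims the columns for brevity and uses invented data values; it shows the characteristic star-schema query shape of one join per dimension from the fact table.

```python
# Minimal star-schema prototype using the fact and dimension tables
# designed above (columns trimmed; data values are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Product_Dimension (Product_ID INTEGER PRIMARY KEY,
                                Product_Name TEXT, Product_Category TEXT);
CREATE TABLE Time_Dimension (Time_ID INTEGER PRIMARY KEY,
                             Date TEXT, Month TEXT, Year INTEGER);
CREATE TABLE Sales_Fact (Sales_ID INTEGER PRIMARY KEY,
                         Sales_Amount INTEGER, Quantity_Sold INTEGER,
                         Product_ID INTEGER REFERENCES Product_Dimension,
                         Time_ID INTEGER REFERENCES Time_Dimension);
INSERT INTO Product_Dimension VALUES (1, 'Laptop', 'Electronics');
INSERT INTO Time_Dimension VALUES (1, '2024-03-01', 'March', 2024);
INSERT INTO Sales_Fact VALUES (1, 50000, 1, 1, 1), (2, 100000, 2, 1, 1);
""")

# A typical analytical query: aggregate the fact table, one join per dimension.
total = con.execute("""
    SELECT p.Product_Category, SUM(f.Sales_Amount)
    FROM Sales_Fact f
    JOIN Product_Dimension p ON p.Product_ID = f.Product_ID
    JOIN Time_Dimension t ON t.Time_ID = f.Time_ID
    WHERE t.Year = 2024
    GROUP BY p.Product_Category
""").fetchone()
print(total)  # ('Electronics', 150000)
```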
3) When Would the Company Prefer a Galaxy Schema Over a Star Schema?
A company might prefer a Galaxy Schema over a Star Schema in the following situations:
• Multiple Fact Tables: If the company needs to analyze multiple business processes, such as sales
and inventory, and these processes share common dimensions (e.g., Product, Time, Customer),
the Galaxy Schema would be a better fit. It allows the integration of various fact tables like sales,
inventory, and returns under a common set of dimension tables, enabling cross-analysis of data
across different metrics.
• Complex Reporting Needs: For organizations that require complex reporting and analysis across
various business units or departments, a Galaxy Schema offers more flexibility. It can
accommodate different kinds of analyses (e.g., comparing sales performance with inventory
levels) by linking multiple fact tables through shared dimensions.
• Scalability for Larger Datasets: If the data warehouse is expected to scale in the future to include
additional fact tables (e.g., marketing performance or customer feedback), the Galaxy Schema
provides a more extensible structure to grow with the business.
In contrast, a Star Schema would be preferable if the company needs to focus on simpler, departmental
reporting without involving multiple fact tables, offering quicker query performance at the expense of
less flexibility.


14 Designing a Scalable and Secure IT Architecture


As an IT architect designing the systems architecture for a growing online retail company anticipating
significant traffic spikes during seasonal sales events, my goal would be to create a robust, scalable, and
secure infrastructure that supports seamless operations and protects customer data. The architecture would
leverage hardware, software, networks, and storage in a cohesive manner, incorporating best practices like
modular design and redundancy to meet the company’s needs.
Hardware: The foundation of the architecture would rely on a combination of on-premises and cloud-based
hardware to balance control and scalability. I would deploy a small cluster of high-performance servers on-
site to handle core operations like order processing and inventory management during normal periods. For
scalability during peak demand, I’d integrate cloud infrastructure, such as Amazon Web Services (AWS) EC2
instances, which can dynamically scale by adding virtual servers. This hybrid approach ensures that the
system can handle baseline loads efficiently while tapping into cloud resources for surges, such as during
Eid promotion sales.
Software: The software stack would include a mix of system and application software tailored to the retail
environment. A Linux-based operating system would manage hardware resources due to its stability and
cost-effectiveness. For the application layer, I’d implement a microservices-based e-commerce platform (e.g.,
using Node.js or Python with Django), where functions like user authentication, payment processing, and
product catalog management are separate services. This modular design allows individual components to
scale independently—e.g., adding more payment processing instances during high-traffic events—while
simplifying updates and maintenance without disrupting the entire system.
Networks: A robust network infrastructure is critical for connecting users, systems, and cloud resources. I’d
design a hybrid network with a fast local area network (LAN) for internal operations and a wide area
network (WAN) leveraging a content delivery network (CDN) like Cloudflare to distribute static content (e.g.,
product images) globally. This reduces latency for customers and offloads traffic from core servers. Network
security would be enhanced with firewalls and intrusion detection systems to monitor and block malicious
activity, ensuring uninterrupted service during peak times.
Storage: To manage the growing volume of customer data, orders, and product information, I’d use a
combination of centralized and distributed storage. A relational database like PostgreSQL would store
structured data (e.g., customer profiles, transaction records) on-site for quick access, while a cloud-based
object storage solution like AWS S3 would handle unstructured data (e.g., product images, logs) with
scalability and redundancy. Data replication across multiple regions would ensure availability and disaster
recovery, critical during high-demand periods.
Best Practices: I’d incorporate modular design and redundancy as key principles. The microservices
architecture exemplifies modular design, enabling the company to scale specific functions (e.g., checkout
processes) without overhauling the entire system. For example, during a sale, additional instances of the
payment service could be spun up on AWS, ensuring smooth transactions. Redundancy would be achieved
by deploying servers and storage across multiple availability zones in the cloud. If one zone fails due to a
technical issue, traffic would automatically failover to another, minimizing downtime—a vital feature when
thousands of customers are shopping simultaneously.
Technologies and Benefits: Using AWS Auto Scaling, the system would automatically add resources as
traffic increases, ensuring performance during peak demand. Cloudflare’s CDN would cache content closer
to users, reducing server load and improving page load times. To protect customer data, I’d implement end-
to-end encryption (e.g., TLS for data in transit, AES-256 for data at rest) and multi-factor authentication
(MFA) for administrative access, safeguarding sensitive information like payment details against breaches.
In conclusion, this architecture ensures scalability through cloud integration and modular design, flexibility
via microservices and hybrid resources, and security with encryption and redundancy. By anticipating
challenges like cost, latency, and threats, and proactively mitigating them, the company can handle seasonal
surges efficiently while maintaining customer trust and operational continuity.

114 THE INSTITUTE OF CHARTERED ACCOUNTANTS OF PAKISTAN


CAF 3 - DATA SYSTEMS AND RISKS LONG-FORM ANSWERS

15 The OSI Model as a Framework


The OSI Model is a conceptual framework that organizes the tasks involved in computer networking into
seven abstract layers. This structure was developed to help different systems communicate across diverse
technologies and vendors by standardizing network functions.
Functions of Each Layer:
1) Physical Layer (Layer 1): This layer is concerned with the physical transmission of data. It defines
how bits are transmitted over a medium—such as the type of signal, bit rate, and how devices are
physically connected. It ensures that data is sent and received as a stream of bits.
2) Data Link Layer (Layer 2): Responsible for establishing a reliable link between two directly
connected devices. It handles framing, addressing, and error detection to ensure data frames are
transmitted accurately over the physical link.
3) Network Layer (Layer 3): This layer manages routing and addressing, allowing data to travel
between different networks. It ensures that packets are delivered from the source to the destination
even if they are not directly connected.
4) Transport Layer (Layer 4): Ensures end-to-end communication between systems. It provides
segmentation, flow control, and error recovery, ensuring that data is delivered completely and in the
correct order.
5) Session Layer (Layer 5): Manages and controls the dialogue (sessions) between two systems. It
establishes, maintains, and terminates connections, ensuring that sessions are properly synchronized
and managed.
6) Presentation Layer (Layer 6): Handles data translation, encryption, and compression. It ensures that
data from the application layer of one system is readable by the application layer of another, even if
their formats differ.
7) Application Layer (Layer 7): This is the closest layer to the end user and provides services such as
file transfer, email, and browsing. It interfaces directly with applications to provide access to network
resources.
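For orientation, some well-known protocols can be mapped to the layers they are conventionally associated with (a sketch: TLS is placed at layer 6 to match the presentation-layer description above, though in practice it straddles layers):

```python
# Conventional layer assignments for some well-known protocols.
OSI_LAYER = {
    "Ethernet": 2,  # Data Link: framing between directly connected devices
    "IP": 3,        # Network: routing between networks
    "TCP": 4,       # Transport: ordered, end-to-end delivery
    "UDP": 4,       # Transport: connectionless delivery
    "TLS": 6,       # Presentation: encryption of application data
    "HTTP": 7,      # Application: web browsing
    "SMTP": 7,      # Application: email
}

def is_below(a, b):
    """True if protocol `a` sits at a lower OSI layer than protocol `b`."""
    return OSI_LAYER[a] < OSI_LAYER[b]

print(is_below("IP", "TCP"), is_below("HTTP", "TCP"))  # True False
```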
Importance of the OSI Model:
• Standardization:
It provides a universal framework for developers and vendors, making network communication
protocols consistent and predictable.
• Interoperability:
Systems developed by different manufacturers can work together by adhering to OSI-based standards,
which improves compatibility.
• Modularity and Flexibility:
The layered structure allows for changes in one layer without affecting the others. For example, a new
encryption method can be implemented at the presentation layer without changing the network or
transport layers.
• Simplified Troubleshooting:
Network issues can be diagnosed by isolating problems to specific layers. For example, if a message is
not being received, technicians can test each layer in sequence to pinpoint the problem.
• Guidance for Network Design:
It serves as a blueprint for network planning, enabling organizations to build scalable, secure, and
reliable communication systems.


16 Types of ERP Systems and Their Differences


The three main types of ERP systems are:
1) On-Premise ERP
• Deployment: Installed locally on the organization’s servers and managed in-house.
• Cost: High upfront costs (hardware, licenses, IT infrastructure) but no recurring subscription
fees, lowering long-term costs.
• Control: Full control over data, security, and customization but requires dedicated IT staff for
maintenance.
2) Cloud ERP
• Deployment: Hosted on the vendor’s servers and accessed via the internet.
• Cost: Lower initial costs (subscription-based) but ongoing operational expenses.
• Control: Less customization; vendor manages updates/security, but dependence on internet
connectivity.
3) Hybrid ERP
• Deployment: Combines on-premise (for sensitive data) and cloud (for scalability) solutions.
• Cost: Balanced—reduces upfront costs while allowing critical systems to remain on-premise.
• Control: Flexibility to prioritize security (on-premise) or scalability (cloud) as needed.
17 ERP Implementation Challenges and Mitigation Strategies
1) High Implementation Costs
• Challenge: ERP systems require significant investment in software, hardware, and consulting.
• Solution: Adopt a phased rollout (prioritize critical modules first) or consider cloud ERP to
reduce upfront costs.
2) Resistance to Change
• Challenge: Employees may resist new workflows, leading to low adoption.
• Solution: Implement change management (training, clear communication, and involve end-users
early in the process).
3) Data Migration Issues
• Challenge: Inaccurate or incomplete data transfer from legacy systems.
• Solution: Cleanse and validate data before migration; conduct trial runs to identify errors.
4) Need for Customization
• Challenge: Over-customization can increase complexity and costs.
• Solution: Use standard ERP features where possible and limit customization to critical needs.
5) Integration Challenges
• Challenge: Difficulty connecting ERP with existing systems (e.g., CRM, legacy software).
• Solution: Use APIs/middleware for seamless integration and test compatibility early.


18 Synergy of AI and IoT in Everyday Solutions


The integration of Artificial Intelligence (AI) and the Internet of Things (IoT) is transforming everyday life
by enabling intelligent, data-driven decision-making in connected environments. While IoT provides the
infrastructure for collecting real-time data through sensors and devices, AI enhances this ecosystem by
analyzing patterns, making predictions, and automating responses. This synergy results in smart, adaptive
solutions that improve efficiency, convenience, and personalization.
Enhancing IoT with AI
IoT consists of a network of connected devices that generate massive amounts of data. However, raw data
alone is not sufficient to create intelligent solutions. AI’s machine learning (ML) algorithms and predictive
analytics enable IoT systems to process this data, extract meaningful insights, and automate decisions
without human intervention.
For example:
• AI can detect anomalies in sensor data, predicting potential failures before they occur.
• AI-powered natural language processing (NLP) allows IoT devices to interact with users in more
human-like ways (e.g., voice assistants).
• AI-driven computer vision enables IoT cameras to recognize faces, gestures, or even detect security
threats.
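The anomaly-detection idea can be sketched with a simple statistical rule rather than a trained ML model (the readings and the 2.5-standard-deviation threshold are illustrative):

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.5):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [x for x in readings if abs(x - mu) > threshold * sigma]

# Temperature sensor: stable around 70, one spike that may precede a failure.
readings = [70.1, 69.8, 70.3, 70.0, 69.9, 70.2, 95.4, 70.1, 70.0, 69.7]
print(flag_anomalies(readings))  # [95.4]
```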
Example Applications of AI and IoT Integration
1) Smart Homes and Smart Environments
AI-powered IoT devices in homes can automate and optimize energy consumption, security, and user
convenience.
• Key Technologies:
o Smart Thermostats (e.g., Nest, Ecobee) use AI to learn user preferences and adjust
temperature settings accordingly.
o AI-enabled Security Systems (e.g., Ring, Google Nest Cam) detect unusual activity and
provide real-time alerts.
o Voice Assistants (e.g., Alexa, Google Assistant) use AI to interpret voice commands and
control connected IoT devices.
• Benefits:
o Increased energy efficiency, cost savings, and personalized user experiences.
o Enhanced home security with AI-driven real-time threat detection.
2) AI-Driven Smart Healthcare Solutions
AI and IoT are revolutionizing healthcare by enabling remote monitoring, early disease detection,
and personalized treatment plans.
• Key Technologies:
o Wearable Health Monitors (e.g., Fitbit, Apple Watch) collect real-time health metrics like
heart rate, oxygen levels, and sleep patterns.
o AI-Powered Predictive Analytics detects anomalies in patient data, predicting potential
health issues before they escalate.
o IoT-Connected Smart Pills track medication adherence, ensuring patients follow prescribed
treatments.
• Benefits:
o Early detection of health risks, reducing hospital visits.
o Improved patient care through AI-driven insights and remote monitoring.


Challenges in AI-IoT Integration and Solutions

Challenge: Data Privacy & Security
• Description: IoT devices collect vast amounts of personal data, making them potential targets
for cyberattacks.
• Solution: Implement end-to-end data encryption, multi-factor authentication, and secure AI
models to prevent unauthorized access.

Challenge: High Computational Power Needs
• Description: AI models require significant processing power, which may be limited in IoT devices.
• Solution: Use edge computing to process data locally on IoT devices, reducing reliance on
cloud-based AI processing.

Challenge: Interoperability Issues
• Description: IoT devices from different manufacturers may not communicate effectively.
• Solution: Develop and adopt standardized protocols and API-based communication for seamless
device integration.

Challenge: Data Overload
• Description: IoT generates large volumes of data, making real-time analysis challenging.
• Solution: Implement AI-driven data filtering and prioritization to ensure only relevant data
is processed.

Conclusion
The fusion of AI and IoT is reshaping everyday experiences, making environments smarter and more
adaptive. From intelligent homes to healthcare solutions, AI enhances IoT by transforming data into
actionable insights. While challenges like security, interoperability, and data management exist,
adopting best practices such as encryption, edge computing, and standardization ensures smooth
integration. As AI and IoT continue to evolve, their synergy will lead to even more groundbreaking
innovations, enhancing convenience, safety, and efficiency in our daily lives.

19 Blockchain and 5G for Data Management

Blockchain Technology and 5G Technology offer distinct yet complementary approaches to enhancing data
management and communication in a technology-driven world. Blockchain’s decentralized, secure
framework contrasts with 5G’s high-speed, low-latency capabilities, each addressing different needs in data
integrity, accessibility, and efficiency.
Blockchain operates as a distributed ledger, with its decentralization eliminating central points of failure and
distributing data across a peer-to-peer network. This ensures no single entity controls the system, enhancing
resilience. Its immutability and security, backed by cryptographic algorithms, make records tamper-proof,
guaranteeing data integrity. Conversely, 5G, the fifth-generation wireless technology, excels in increased
network speed (up to 100 times faster than 4G) and low latency (as low as 1 millisecond), enabling rapid
data transmission. Its greater capacity supports millions of connected devices, making it ideal for real-time
communication in dense networks.
For example, Blockchain can enhance secure record-keeping. In a shared database scenario, such as tracking
digital assets, Blockchain ensures every entry (e.g., ownership transfer) is permanently recorded and
auditable across nodes, preventing fraud or unauthorized changes. Its transparency allows all participants
to verify data, fostering trust. On the other hand, 5G supports rapid data sharing. In a smart network of
sensors, such as environmental monitoring devices, 5G’s speed and capacity enable real-time data collection
and dissemination—e.g., instantly sharing air quality updates across a region—improving responsiveness
and decision-making.


I believe 5G has a greater impact on future data systems due to its transformative role in enabling real-time
connectivity across vast ecosystems. While Blockchain secures static data and transactions, 5G powers
dynamic, high-volume data flows essential for emerging technologies like IoT, autonomous systems, and
immersive applications (e.g., AR/VR). Its ability to connect billions of devices at unprecedented speeds will
drive the next wave of innovation, making data instantly accessible and actionable on a global scale.
However, 5G faces implementation challenges. Infrastructure costs are significant, requiring extensive
network upgrades (e.g., new towers, fiber optics). I’d mitigate this by advocating phased rollouts, prioritizing
high-demand areas, and leveraging public-private partnerships to share costs. Security risks also arise, as
more connected devices increase attack surfaces. I’d address this with robust encryption, network slicing to
isolate critical data, and AI-driven threat detection to monitor anomalies in real time.
Blockchain excels in secure, trustworthy data management, while 5G revolutionizes communication speed
and scale. 5G’s broader impact stems from its enablement of real-time, interconnected systems, though its
deployment requires overcoming cost and security hurdles with strategic planning and advanced
safeguards.

20 Comparing AR and VR for User Engagement in Digital Spaces

Augmented Reality (AR) and Virtual Reality (VR) both enhance user engagement in digital environments,
but they do so through distinct approaches. AR integrates digital elements into the physical world, fostering
interaction by overlaying information or visuals via devices like smartphones or glasses, while VR creates
isolated, fully immersive virtual experiences through headsets, emphasizing deep immersion. Their
contrasting strengths shape how they engage users and sustain interest over time.
AR’s strength lies in its integration with the physical world and interactivity. By superimposing digital
content—like 3D models or data—onto real environments, AR enhances users’ surroundings without
disconnecting them. For example, in real-time visualizations, AR could project interactive maps or
annotations onto a user’s view of a physical space, allowing them to explore additional layers of information
(e.g., historical facts about a landmark) while remaining grounded in reality. This blend of physical and
digital fosters engagement by making interactions contextual and accessible, leveraging AR’s accessibility
through common devices.
VR, conversely, excels in full immersion and simulated environments, transporting users into entirely virtual
spaces where they can interact with a 3D world. In virtual explorations, VR could enable users to navigate a
digitally crafted landscape—like a fantasy realm or a historical reenactment—offering a sense of presence
and agency through its interactive experiences. VR’s ability to isolate users from distractions creates a deeply
engaging, escapist experience, appealing to those seeking intense, focused interaction.
For sustaining long-term user interest, I believe AR is more effective. Its integration with everyday life makes
it versatile and less intrusive, encouraging repeated use without requiring users to disconnect from their
surroundings. VR’s immersive nature, while captivating initially, may fatigue users over time due to its
hardware demands and isolation, limiting casual, frequent engagement. AR’s adaptability—e.g., enhancing
daily tasks with digital overlays—offers broader, more sustainable appeal, as users can incorporate it
seamlessly into routine activities.
Deploying AR faces challenges. Content development is a hurdle, as creating high-quality, context-aware
overlays requires advanced tools and expertise, increasing costs. I’d mitigate this by using scalable platforms
(e.g., ARKit) and crowd-sourcing content creation to reduce complexity and expense. Hardware variability—
differing device capabilities—can also disrupt consistent experiences. I’d address this by designing AR
applications with adaptive rendering, optimizing for low-end devices while scaling features for high-end
ones, ensuring broad compatibility.
AR and VR both enrich digital engagement—AR through real-world enhancement, VR through virtual
immersion. AR’s practicality and accessibility make it more sustainable for long-term interest, though its
deployment requires overcoming content and hardware challenges with scalable tools and adaptive design.


21 AI Strategy for Financial Services


GlobalBank is integrating Artificial Intelligence (AI) across three divisions: Retail Banking, Risk Management,
and Operations. To maximize effectiveness, it is crucial to select the appropriate AI type, justify its
application, and evaluate the trade-offs of using off-the-shelf AI solutions versus custom-built AI models.
a) AI Type Selection, Justification, and Algorithms
1) Retail Banking – Personalized Customer Recommendations
• Appropriate AI Type: Predictive AI
• Justification:
1) Enhancing customer experience – Predictive AI personalizes product
recommendations based on customer preferences and behavior.
2) Driving revenue growth – Personalized marketing increases cross-selling and
upselling opportunities, improving customer engagement.
• Example Algorithm:
1) Association Rule Mining
2) Used by platforms like Amazon, Netflix, and Spotify for personalized
recommendations.
Implementation in GlobalBank:
• AI will analyze transaction history, browsing behavior, and past purchases to recommend
financial products like credit cards, loans, or investment plans.
• The system will leverage machine learning models trained on historical banking data to
improve prediction accuracy.
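The core association-rule metric can be sketched in a few lines: the confidence of a rule "customers who hold A also hold B" is the share of A-holders who also hold B (the product names and histories below are hypothetical):

```python
def rule_confidence(transactions, antecedent, consequent):
    """confidence(A -> B) = support(A and B) / support(A)."""
    both = sum(1 for t in transactions if antecedent in t and consequent in t)
    ante = sum(1 for t in transactions if antecedent in t)
    return both / ante if ante else 0.0

# Hypothetical product holdings per customer.
transactions = [
    {"credit_card", "savings"},
    {"credit_card", "savings", "travel_insurance"},
    {"credit_card", "travel_insurance"},
    {"savings"},
    {"credit_card", "savings"},
]
conf = rule_confidence(transactions, "credit_card", "savings")
print(round(conf, 2))  # 0.75: most card holders also hold savings
```

A high-confidence rule like this would drive a recommendation of savings products to card holders who do not yet have one.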
2) Risk Management – Real-Time Fraud Detection
• Appropriate AI Type: Analytical & Predictive AI
• Justification:
1) Identifying fraudulent transactions – AI continuously monitors transactions, flagging
anomalies in real time.
2) Reducing financial loss – Predictive AI helps proactively block fraudulent activities,
minimizing risk exposure.
• Example Algorithm:
1) Random Forest or Gradient Boosting (XGBoost, LightGBM)
2) These algorithms excel at fraud classification, analyzing vast amounts of historical
transaction data to detect suspicious activities.
Implementation in GlobalBank:
• AI will assess transactions using past fraud patterns, geolocation, device fingerprints, and
spending behavior.
• Suspicious activities (e.g., high-value transfers from new locations, rapid transactions across
accounts) will trigger alerts, allowing banks to intervene proactively.
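As a highly simplified stand-in for the trained Random Forest/boosting model, the flagging logic can be sketched with explicit rules over the same signals (amount, geolocation, device fingerprint); all field names and thresholds are illustrative:

```python
def fraud_score(txn, profile):
    """Score a transaction against the customer's usual behaviour.
    A rule-based sketch, not the actual ML classifier."""
    score = 0
    if txn["amount"] > 3 * profile["avg_amount"]:
        score += 2                      # unusually large transfer
    if txn["country"] != profile["home_country"]:
        score += 2                      # new geolocation
    if txn["device_id"] not in profile["known_devices"]:
        score += 1                      # unrecognised device fingerprint
    return score

profile = {"avg_amount": 200.0, "home_country": "PK", "known_devices": {"d-1", "d-2"}}
normal = {"amount": 180.0, "country": "PK", "device_id": "d-1"}
suspect = {"amount": 5000.0, "country": "EE", "device_id": "d-9"}

ALERT_AT = 3
print(fraud_score(normal, profile) >= ALERT_AT)   # False
print(fraud_score(suspect, profile) >= ALERT_AT)  # True
```

A real model learns these weights and interactions from historical fraud data instead of having them hand-coded.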


3) Operations – Automated Document Processing


• Appropriate AI Type: Generative AI
• Justification:
1) Automating tedious processes – AI reduces manual effort in reading, summarizing, and
extracting key details from financial documents.
2) Improving accuracy and efficiency – AI minimizes errors in regulatory reporting and
legal document processing.
• Example Algorithm:
1) GPT-4, BERT (Transformer-based NLP models)
2) These models are highly effective in document summarization, entity recognition, and
sentiment analysis.
Implementation in GlobalBank:
• AI will process customer forms, contracts, and loan applications, extracting relevant details
such as customer names, transaction values, and risk ratings.
• This will help accelerate approvals, ensure compliance with regulations, and reduce
operational costs.
b) Advantages and Limitations of Off-the-Shelf AI Solutions
The CTO prefers off-the-shelf AI solutions from cloud providers like AWS, Google Cloud, or Microsoft
Azure. While these solutions offer benefits, they also have limitations.
Advantages:
1) Faster Deployment:
• Pre-built AI models require minimal configuration, allowing quicker time-to-market
compared to custom development.
• Example: AWS Fraud Detector can be integrated into the bank’s fraud monitoring system
without building a model from scratch.
2) Lower Initial Cost & Maintenance:
• Off-the-shelf AI solutions eliminate the need for in-house AI expertise, reducing
development and maintenance costs.
• AI models are automatically updated by vendors, ensuring ongoing improvements.
3) Proven Accuracy & Scalability:
• These solutions have been trained on extensive datasets, making them highly reliable for
general-purpose use.
• They scale efficiently, supporting millions of transactions per second in high-traffic
environments.
Limitations:
1) Limited Customization:
• Pre-built AI models may not align with GlobalBank’s unique fraud detection patterns.
• Example: An AI model trained on generic fraud patterns might fail to detect local or
institution-specific fraud trends.


2) Data Privacy & Security Risks:


• Cloud-based AI solutions require sharing sensitive financial data with external vendors,
posing security risks.
• Compliance with data protection regulations and enterprise governance frameworks becomes
challenging when data is processed externally.
22 Robotic Process Automation (RPA) vs. Artificial Intelligence (AI)
1) Definition of RPA and AI
• Robotic Process Automation (RPA) is a rule-based automation technology that mimics human
actions to perform repetitive, structured tasks. It operates on predefined logic and workflows
without the ability to learn or adapt.
• Artificial Intelligence (AI), on the other hand, involves machine learning, natural language
processing, and cognitive capabilities to analyze data, recognize patterns, and make intelligent
decisions, often improving over time.
2) Comparison of RPA and AI in Manufacturing
Task Complexity
• RPA: Handles repetitive, rule-based tasks such as data entry, invoice processing, and robotic
assembly line movements.
• AI: Manages complex cognitive tasks such as predictive maintenance, defect detection, and
autonomous decision-making.
Decision-Making
• RPA: Follows predefined rules with no independent decision-making.
• AI: Uses ML/analytics for data-driven decisions and adaptation.
Learning & Adaptability
• RPA: No learning capability; requires manual workflow updates.
• AI: Continuously improves via pattern analysis and adapts to new conditions autonomously.
3) Real-World Use Cases in Manufacturing
RPA Use Case: Automated Order Processing
• In a manufacturing company, RPA can be used to automatically extract order details from emails,
input them into an Enterprise Resource Planning (ERP) system, and generate purchase orders.
• This eliminates manual data entry errors, speeds up order processing, and ensures accuracy in
supply chain operations.
AI Use Case: Predictive Maintenance in Machinery
• AI-powered systems use IoT sensor data to predict machine failures before they happen by
analyzing temperature, vibration, and usage patterns.
• Machine learning algorithms detect anomalies and recommend maintenance actions, reducing
downtime and repair costs.
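A toy version of this idea compares the latest sensor reading against a rolling baseline of recent readings (the vibration values and the 1.5x factor are illustrative; a production system would use a trained model over many signals):

```python
def needs_maintenance(vibration, window=5, factor=1.5):
    """Flag when the latest reading jumps well above the average of the
    preceding `window` readings, suggesting wear before outright failure."""
    baseline = sum(vibration[-window - 1:-1]) / window
    return vibration[-1] > factor * baseline

healthy = [0.9, 1.0, 1.1, 1.0, 0.9, 1.05]
worn    = [0.9, 1.0, 1.1, 1.0, 0.9, 1.9]
print(needs_maintenance(healthy), needs_maintenance(worn))  # False True
```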
4) Hybrid Approach: Combining RPA and AI
Manufacturers can integrate RPA and AI for a more robust automation strategy.
Example: Intelligent Quality Control
• AI-powered computer vision scans products on the assembly line to detect defects in real-time.
• RPA bots log defective items in the ERP system, notify production teams, and generate quality
control reports automatically.
This hybrid approach ensures fast, accurate defect detection while automating response actions,
improving overall production efficiency.
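The hybrid pipeline can be sketched as two cooperating steps, with the AI vision model stubbed out as a simple threshold and the RPA bot reduced to bookkeeping (all names and values are hypothetical):

```python
# "AI" step: stub for a computer-vision defect classifier.
def classify(image_features):
    return "defect" if image_features["scratch_score"] > 0.8 else "ok"

# "RPA" step: rule-based bot that logs defects and notifies the team.
erp_log, notifications = [], []

def rpa_handle(item_id, verdict):
    if verdict == "defect":
        erp_log.append({"item": item_id, "status": "rejected"})
        notifications.append(f"Defect on item {item_id}: production team notified")

for item_id, features in [("A1", {"scratch_score": 0.2}),
                          ("A2", {"scratch_score": 0.95})]:
    rpa_handle(item_id, classify(features))

print(len(erp_log), notifications[0])
```

The AI component makes the judgement; the RPA component carries out the fixed follow-up actions, which is exactly the division of labour described above.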


Conclusion
While RPA automates repetitive, rule-based tasks, AI enables intelligent decision-making. In
manufacturing, combining both technologies enhances productivity, reduces costs, and improves
overall operational efficiency.
23 Cloud Computing Characteristics
Cloud computing is defined by five essential characteristics that distinguish it from traditional IT
infrastructure. Each feature offers unique advantages to businesses, enabling cost savings, flexibility, and
scalability.
1. On-Demand Self-Service
• Definition: Users can automatically provision computing resources (e.g., virtual machines,
storage) without requiring manual intervention from the cloud provider.
• Benefit: Reduces delays in IT operations, allowing businesses to deploy resources instantly when
needed.
• Example: An e-commerce company uses AWS Auto Scaling to automatically add more servers
during peak shopping seasons (e.g., Black Friday) and remove them afterward to save costs.
2. Broad Network Access
• Definition: Cloud services are accessible over the internet from any device (laptops,
smartphones, tablets).
• Benefit: Supports remote work, global collaboration, and mobile access to applications.
• Example: A multinational company uses Microsoft 365 (SaaS) so employees can access emails,
documents, and collaboration tools from anywhere.
3. Resource Pooling
• Definition: Cloud providers use multi-tenancy to share physical resources (servers, storage)
among multiple customers while keeping data logically separated.
• Benefit: Lowers costs by maximizing resource utilization and eliminating idle capacity.
• Example: Google Cloud hosts virtual machines for multiple businesses in the same data center,
ensuring efficient resource usage while maintaining security.
4. Rapid Elasticity
• Definition: Computing resources can be scaled up or down instantly based on demand.
• Benefit: Ensures performance during traffic spikes while avoiding over-provisioning.
• Example: A video streaming platform like Netflix scales its cloud servers dynamically when a
popular show is released, ensuring smooth streaming for millions of users.
5. Measured Service
• Definition: Cloud providers monitor and charge users based on actual resource consumption
(e.g., storage used, CPU hours).
• Benefit: Transparent billing and cost optimization, as businesses only pay for what they use.
• Example: A startup using Azure Blob Storage is billed monthly based on the exact amount of data
stored, avoiding fixed costs.
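Measured service reduces to metering-times-rate arithmetic, as in this sketch (the rates and usage figures are illustrative, not any provider's actual pricing):

```python
def monthly_bill(usage, rates):
    """Pay-per-use billing: charge only for measured consumption."""
    return sum(usage[resource] * rates[resource] for resource in usage)

# Hypothetical metered rates: storage per GB-month, compute per CPU-hour.
rates = {"storage_gb": 0.023, "cpu_hours": 0.05}
usage = {"storage_gb": 120, "cpu_hours": 300}
print(round(monthly_bill(usage, rates), 2))  # 17.76
```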


24 Cloud Service Models


The three cloud service models—Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and
Software as a Service (SaaS)—differ in control, flexibility, management, and use cases.
Control
• IaaS: High (user manages OS, apps, networking)
• PaaS: Medium (focus on app development)
• SaaS: Low (fully managed by provider)
Flexibility
• IaaS: Highly customizable
• PaaS: Limited to platform tools
• SaaS: Least customizable
Management
• IaaS: User manages OS, security, apps
• PaaS: Provider handles infrastructure
• SaaS: Provider manages everything
Use Cases
• IaaS: Custom apps, legacy systems, big data
• PaaS: App development, DevOps, APIs
• SaaS: Email (Gmail), CRM (Salesforce)
Best Model for a Software Development Startup:
In my view, the best model for this scenario would be PaaS.
Why PaaS?
1) Focus on Development:
PaaS (e.g., Google App Engine, Heroku) provides pre-configured environments, databases, and
development tools, allowing developers to focus on coding rather than managing servers.
2) Faster Deployment:
Built-in CI/CD pipelines and middleware reduce setup time, enabling rapid app development and
updates.
3) Scalability:
Automatically scales applications as user demand grows, eliminating manual server management.
4) Cost Efficiency:
No need to invest in physical hardware or hire extensive IT staff for infrastructure maintenance.
25 Blockchain Fundamentals
Hashing and consensus mechanisms are foundational elements of blockchain technology that work together
to ensure its security and immutability, making it a reliable and tamper-resistant system for recording
transactions.
• Hashing:
Hashing is a cryptographic process that converts input data (e.g., transaction details) into a fixed-size
string of characters, known as a hash, which acts as a unique digital fingerprint. In a blockchain, each
block contains a hash of its own data and the hash of the previous block, creating a linked chain. This
linking ensures immutability because any change to a block’s data alters its hash, which in turn breaks
the connection to the next block. Hashing provides security by using complex algorithms (e.g., SHA-
256 in Bitcoin) that are computationally difficult to reverse, meaning an attacker cannot easily
recreate or manipulate the original data without detection. The integrity of the entire chain depends
on these cryptographic hashes remaining consistent.
• Consensus Mechanisms:
Consensus mechanisms are protocols that ensure all participants (nodes) in a blockchain network
agree on the validity of transactions and the state of the ledger, even in a decentralized environment
without a central authority. They prevent fraudulent or unauthorized changes by requiring network-
wide agreement before a block is added. This process secures the blockchain by making it extremely
difficult for a malicious actor to alter the ledger unless they can overpower the network’s consensus
rules. Common mechanisms include Proof of Work (PoW) and Proof of Stake (PoS), each with distinct
approaches to achieving this agreement.


Example of Altering a Transaction

Consider a blockchain like Bitcoin, where a block contains a transaction: Alice sends 5 BTC to Bob. This
transaction, along with others, is hashed, and the block’s hash is linked to the next block in the chain.
If an attacker tries to alter this transaction (e.g., changing it to Alice sending 10 BTC to Bob), the block’s
hash would change because the input data has been modified. Since each subsequent block references
the hash of the previous block, this alteration would invalidate the entire chain from that point
forward. For the change to go unnoticed, the attacker would need to recalculate and update the hashes
of all subsequent blocks and convince the network to accept this tampered version—an extremely
resource-intensive and unlikely feat.
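This tamper-detection property can be demonstrated with a toy chain built on SHA-256 (Python's hashlib; the block structure is deliberately simplified, and the transactions are the example above):

```python
import hashlib
import json

def block_hash(body):
    """Hash a block's contents, which include the previous block's hash."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(txn, prev_hash):
    block = {"txn": txn, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)  # hash covers txn + previous hash
    return block

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        body = {"txn": block["txn"], "prev_hash": block["prev_hash"]}
        if block["hash"] != block_hash(body):
            return False                      # block contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # link to previous block broken
    return True

chain = [make_block("Alice pays Bob 5 BTC", prev_hash="0" * 64)]
chain.append(make_block("Bob pays Carol 2 BTC", prev_hash=chain[0]["hash"]))
print(chain_is_valid(chain))                  # True

chain[0]["txn"] = "Alice pays Bob 10 BTC"     # tamper with history
print(chain_is_valid(chain))                  # False: block 0's hash no longer matches
```

Changing the 5 BTC to 10 BTC invalidates the first block's stored hash, and every later block is chained to it, exactly as described above.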

How Proof of Work (PoW) Prevents Tampering

In Bitcoin’s Proof of Work consensus mechanism, miners compete to solve complex mathematical
puzzles to validate transactions and add a new block to the blockchain. The first miner to solve the
puzzle broadcasts the block to the network, and other nodes verify its validity before accepting it. This
process requires significant computational power and energy, known as the "work." To tamper with
the aforementioned transaction, an attacker would need to redo the PoW for the altered block and all
subsequent blocks faster than the rest of the network continues to build on the original chain.
Additionally, they would need to control more than 50% of the network’s computational power (a
"51% attack") to impose their tampered version—a task that is economically and practically
unfeasible due to the vast resources required. Thus, PoW ensures security and immutability by making
tampering computationally prohibitive and ensuring that the longest, valid chain (agreed upon by the
majority) is the one the network follows.
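A toy version of the PoW puzzle can make the idea of "work" concrete. This is a simplification: real Bitcoin mining hashes a structured block header against a numeric difficulty target, whereas the sketch below simply searches for a nonce that produces a hash with a few leading zeros:

```python
import hashlib

def mine(block_data, difficulty):
    # Proof of Work: try nonces until the block hash starts with
    # `difficulty` leading zeros. Finding the nonce is expensive;
    # checking it afterwards costs just one hash.
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

nonce, digest = mine("Alice sends 5 BTC to Bob", difficulty=4)
assert digest.startswith("0000")
# Verification is cheap for every other node on the network:
assert hashlib.sha256(f"Alice sends 5 BTC to Bob|{nonce}".encode()).hexdigest() == digest
```

Raising the difficulty by one hex digit multiplies the expected search effort by sixteen, which is why redoing the work for a tampered block and all its successors is impractical.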
In summary, hashing secures the blockchain by linking blocks in an immutable chain, where any
change is immediately detectable, while consensus mechanisms like PoW enforce network agreement
and protect against unauthorized alterations by leveraging computational effort and majority rule.
Together, they create a robust, trustworthy system resistant to tampering.
26 Blockchain and Fintech Synergy
Definition of Decentralized Finance (DeFi)
Decentralized Finance (DeFi) refers to a financial ecosystem built on blockchain technology that operates
without traditional intermediaries such as banks, brokers, or financial institutions. It leverages smart
contracts—self-executing agreements coded on a blockchain—to facilitate financial services like lending,
borrowing, trading, and earning interest on digital assets. DeFi platforms are typically open-source,
permissionless, and accessible to anyone with an internet connection and a cryptocurrency wallet, enabling
peer-to-peer (P2P) interactions on a decentralized network, most commonly Ethereum.
How DeFi Differs from Traditional Financial Systems
DeFi fundamentally differs from traditional financial systems in several key ways:
• Intermediaries: Traditional finance relies on centralized entities like banks or payment processors
to facilitate transactions, verify identities, and enforce agreements. In contrast, DeFi eliminates these
intermediaries by using blockchain and smart contracts to automate and execute financial processes
directly between users.
• Control and Ownership: In traditional systems, institutions control user funds, set terms, and
maintain custody of assets. DeFi gives users full control over their assets through private keys and
self-custody wallets, promoting a "trustless" environment where trust is placed in code rather than
institutions.
• Accessibility: Traditional finance often excludes individuals without bank accounts or those in
underserved regions due to strict requirements (e.g., credit scores, documentation). DeFi is globally
accessible, requiring only an internet connection, making it more inclusive.


• Transparency: Traditional systems operate behind closed doors with limited visibility into
processes, whereas DeFi transactions and smart contract code are transparent and auditable on the
blockchain, enhancing accountability.
• Speed and Cost: Traditional financial transactions, especially cross-border ones, can be slow and
expensive due to multiple intermediaries. DeFi offers faster, cheaper transactions by leveraging
blockchain’s efficiency and removing middlemen.
Advantage of Using DeFi Platforms for Lending and Borrowing Cryptocurrencies
One significant advantage of DeFi platforms for lending and borrowing cryptocurrencies is reduced costs
and increased efficiency. In traditional lending, borrowers face high interest rates and fees due to bank
overheads, while lenders earn minimal returns on savings accounts. DeFi eliminates these intermediaries,
allowing lenders to deposit cryptocurrencies into liquidity pools and earn higher interest rates directly from
borrowers, who can access loans at lower costs.
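The liquidity-pool mechanics just described can be illustrated with a toy model. This is not any real protocol; the class, the interest rule, and the figures are invented purely to show interest flowing pro rata from a borrower directly to lenders, with no intermediary taking a spread:

```python
class LendingPool:
    """Toy DeFi lending pool (illustration only, not a real protocol)."""

    def __init__(self, rate):
        self.rate = rate          # per-period interest rate paid by borrowers
        self.deposits = {}        # lender -> amount supplied
        self.borrowed = 0.0

    def deposit(self, lender, amount):
        self.deposits[lender] = self.deposits.get(lender, 0.0) + amount

    def total_liquidity(self):
        return sum(self.deposits.values()) - self.borrowed

    def borrow(self, amount):
        if amount > self.total_liquidity():
            raise ValueError("insufficient liquidity")
        self.borrowed += amount

    def accrue_interest(self):
        # Interest paid by borrowers is split among lenders pro rata,
        # with no bank in between taking a margin.
        interest = self.borrowed * self.rate
        total = sum(self.deposits.values())
        for lender, amt in list(self.deposits.items()):
            self.deposits[lender] = amt + interest * amt / total

pool = LendingPool(rate=0.05)
pool.deposit("alice", 100.0)
pool.deposit("bob", 300.0)
pool.borrow(200.0)
pool.accrue_interest()   # 200 * 5% = 10 of interest, split 1:3
assert round(pool.deposits["alice"], 2) == 102.50
assert round(pool.deposits["bob"], 2) == 307.50
```

In a real DeFi protocol this logic would live in an on-chain smart contract, which is also why bugs in that code are the security risk discussed below.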
Potential Challenge of Using DeFi Platforms for Lending and Borrowing Cryptocurrencies
A notable challenge of DeFi platforms is smart contract vulnerabilities and security risks. Since DeFi
relies on smart contracts to automate lending and borrowing, any bugs or exploits in the code can lead to
significant financial losses.
Conclusion
DeFi represents a powerful synergy between blockchain and fintech by offering a decentralized alternative
to traditional finance, characterized by autonomy, accessibility, and transparency. While it provides
advantages like cost efficiency in lending and borrowing, challenges such as smart contract security highlight
the need for ongoing improvements to ensure its reliability and widespread adoption.
27 Digital Disruption on Accounting Profession
Digital disruption has fundamentally transformed the accounting profession, reshaping traditional roles,
required skill sets, and the tools used in daily practice. These changes represent a shift from manual,
compliance-focused work to a more strategic, technology-driven approach.
1) Changing Roles in Accounting
• Traditional Role: Accountants were primarily responsible for recording transactions, preparing
financial statements, and ensuring compliance with tax and regulatory requirements. Their work
was largely backward-looking, focusing on historical data.
• New Role: Modern accountants have evolved into strategic advisors. They analyze real-time
financial data to provide insights for business decision-making, risk management, and growth
strategies. Their role now includes forecasting, scenario modeling, and advising on digital
transformation initiatives.
2) Evolving Skill Requirements

o Traditional Skills: Proficiency in bookkeeping, manual data entry, and knowledge of accounting standards were sufficient. Familiarity with spreadsheets and basic accounting software was the norm.

o New Skills: Today’s accountants must be adept at:
  - Data Analytics: Interpreting large datasets to identify trends and opportunities.
  - AI & Automation Tools: Using AI-driven software for tasks like fraud detection and predictive analysis.
  - Blockchain & Cloud Computing: Understanding decentralized ledgers and cloud-based accounting platforms.
  - Soft Skills: Communication and strategic thinking to explain financial insights to non-financial stakeholders.


3) Transformation of Tools & Technologies

o Traditional Tools: Manual ledgers, spreadsheets, and on-premise accounting software were standard. Audits required physical verification of documents.

o Modern Tools:
  - Automation (RPA & AI): Robotic Process Automation (RPA) handles repetitive tasks like reconciliations, while AI improves accuracy in expense categorization and anomaly detection.
  - Cloud Accounting: Platforms enable real-time financial reporting and remote collaboration.
  - Blockchain for Auditing: Provides immutable transaction records, reducing fraud risks and audit time.
  - Predictive Analytics: Helps businesses forecast cash flow and financial performance more accurately.
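At its simplest, the predictive analytics mentioned above can be a moving-average forecast of the next period's cash flow. The figures and window size below are hypothetical, chosen only to show the mechanic:

```python
def forecast_next(cash_flows, window=3):
    """Naive moving-average forecast of next period's cash flow,
    a minimal stand-in for the predictive tools described above."""
    recent = cash_flows[-window:]          # take the most recent periods
    return sum(recent) / len(recent)

monthly = [120.0, 135.0, 150.0, 160.0, 170.0]   # hypothetical monthly inflows
assert forecast_next(monthly) == 160.0           # (150 + 160 + 170) / 3
```

Commercial tools layer seasonality, regression, and machine learning on top of this idea, but the shift from recording history to projecting the future is the same.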
4) Contrast with Traditional Practices

Aspect        | Traditional Accounting      | Modern Accounting
Primary Focus | Compliance & record-keeping | Strategic advisory & decision support
Data Usage    | Historical data only        | Real-time & predictive analytics
Tools Used    | Manual ledgers, Excel       | AI, blockchain, cloud platforms
Audit Process | Time-consuming, paper-based | Faster, automated, digital
Skills Needed | Bookkeeping, tax knowledge  | Data science, tech-savviness, business acumen
Conclusion
Digital disruption has elevated the accounting profession from a transactional function to a strategic
business partner. While traditional accounting emphasized accuracy and compliance, modern accounting
leverages technology to drive efficiency, transparency, and forward-looking insights. Accountants must now
embrace continuous learning to stay relevant in an increasingly automated and data-driven financial
landscape.
28 Understanding Risk Management
Definition of Risk Management:
Risk management is the systematic process of identifying, assessing, and controlling risks to minimize their
potential negative impact on an organization. It involves a structured approach to anticipating, evaluating,
and mitigating uncertainties that could lead to harm, loss, or disruption. In the context of IT, risk
management focuses on protecting digital systems, networks, and data from threats such as cyberattacks,
system failures, or human errors, ensuring the organization can operate effectively and securely.
Importance in Protecting Digital Assets:
Risk management is crucial for safeguarding an organization’s digital assets—such as sensitive data, IT
infrastructure, and software applications—because these assets are integral to modern business operations.
In today’s interconnected digital landscape, where organizations rely on technology for customer
engagement, data processing, and decision-making, any disruption or compromise can lead to financial
losses, reputational damage, and operational downtime. By proactively addressing risks, organizations can
maintain the confidentiality, integrity, and availability of their digital assets, fostering trust with
stakeholders and ensuring compliance with regulatory standards.


Two Key Stages and Their Contribution to Operational Resilience:


1) Risk Identification:
This stage involves pinpointing potential risks by analyzing IT systems, business processes, and
external factors that could lead to adverse outcomes. For example, it might include recognizing
vulnerabilities like outdated software or phishing susceptibility among employees. Risk identification
contributes to operational resilience by providing a clear understanding of threats, enabling the
organization to prepare for and prioritize them effectively. Without knowing what risks exist, an
organization cannot protect itself, making this the foundational step in maintaining continuity.
2) Risk Mitigation:
Risk mitigation entails developing and implementing strategies to reduce, avoid, or transfer identified
risks. This could involve actions like installing security patches, deploying firewalls, or purchasing
insurance. By reducing the likelihood or impact of risks, mitigation ensures that IT systems remain
functional during and after potential incidents, supporting operational resilience. It transforms
awareness into actionable safeguards, allowing the organization to recover quickly and maintain
critical operations.
Hypothetical Example:
Consider a mid-sized e-commerce company that relies on its online platform to process customer orders.
During the risk identification stage, the company conducts a threat modeling exercise and discovers that
its customer database is vulnerable to a ransomware attack due to unpatched software and weak employee
passwords. Recognizing this IT-related threat, the company moves to the risk mitigation stage by
implementing two strategies: first, it updates all software with the latest security patches to close
vulnerabilities, and second, it enforces multi-factor authentication (MFA) for all employee accounts to
prevent unauthorized access. If a ransomware attack is attempted, the updated software blocks the exploit,
and MFA stops attackers from using stolen credentials, ensuring the platform remains operational and
customer data stays secure. This proactive approach maintains the company’s ability to process orders
without interruption, preserving revenue and customer trust.
In summary, risk management protects digital assets by systematically addressing threats, with stages like
risk identification and mitigation working together to ensure operational resilience. The e-commerce
example illustrates how applying these stages can effectively neutralize an IT threat, keeping the
organization robust and responsive.
29 Types of IT Risks and Mitigation
Physical and digital risks differ fundamentally in their origins—physical risks arise from environmental or
hardware-related events, while digital risks emerge from cyber threats targeting software and networks.
Their impacts overlap in causing operational disruptions, but physical risks tend to affect hardware
availability and require physical recovery, whereas digital risks threaten data security and can spread
rapidly across systems.
Comparison of Physical Risks and Digital Risks:
1) Physical Risks
• Sources: Physical risks originate from tangible threats to IT infrastructure, such as natural
disasters (e.g., floods, earthquakes), theft, vandalism, hardware failures, or power outages. These
risks are rooted in the physical environment where IT systems like servers, data centers, and
networking equipment are located.
• Potential Impacts on an Organization: Physical risks can cause significant downtime, data loss,
or damage to critical hardware, disrupting business operations. For instance, a flood could
destroy servers, halting online services and resulting in lost revenue, customer dissatisfaction,
and costly repairs. Physical damage may also lead to prolonged recovery times if backups or
redundancies are inadequate.


• Effective Mitigation Strategy: One specific strategy is deploying hardware redundancy and maintenance. This involves setting up backup servers and failover mechanisms in separate locations and conducting regular equipment checks.
  - How It Reduces Risk: By maintaining redundant systems, the organization ensures that if a primary server fails due to a power surge or natural disaster, operations can switch to a backup system, minimizing downtime. Regular maintenance reduces the likelihood of hardware failure by addressing wear and tear proactively, thus lowering both the probability and impact of physical disruptions.
2) Digital Risks
• Sources: Digital risks stem from intangible threats within the IT ecosystem, primarily
cyberattacks such as malware, phishing, ransomware, data breaches, or Distributed Denial of
Service (DDoS) attacks. These risks exploit vulnerabilities in software, networks, or databases,
often launched remotely by malicious actors.
• Potential Impacts on an Organization: Digital risks can compromise data confidentiality, integrity, and availability, leading to financial losses, legal penalties, and reputational harm. For example, a ransomware attack could encrypt customer data, halting transactions and eroding trust, while a data breach might expose sensitive information, triggering regulatory fines and lawsuits.
• Effective Mitigation Strategy: One specific strategy is implementing multi-factor authentication (MFA). This requires users to provide multiple forms of verification (e.g., a password and a one-time code) before accessing systems or data.
  - How It Reduces Risk: MFA reduces the likelihood of unauthorized access by adding an additional layer of security beyond passwords, which can be stolen via phishing or brute force attacks. Even if credentials are compromised, attackers cannot proceed without the second factor, significantly lowering the risk of data breaches or ransomware deployment and mitigating the potential impact on operations and reputation.
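The "one-time code" second factor mentioned above is typically a time-based one-time password (TOTP). A minimal version in the spirit of RFC 6238 (HMAC-SHA-1 with dynamic truncation over 30-second windows) looks like this; the secret is of course illustrative:

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, t=None, step=30, digits=6):
    """Minimal TOTP in the spirit of RFC 6238: the code is derived from
    a shared secret and the current 30-second time window."""
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"shared-secret"          # illustrative; real secrets are random
# A stolen password alone is useless: the attacker also needs the current
# code, which changes every `step` seconds.
code_now = totp(secret, t=1_000_000)
assert code_now == totp(secret, t=1_000_010)   # same 30-second window
assert len(code_now) == 6 and code_now.isdigit()
```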
30 Risk Treatment Strategies
Risk treatment is the phase in IT risk management where organizations decide how to respond to identified
risks. The four main strategies—mitigation, avoidance, transfer, and acceptance—are selected based on the
organization’s risk appetite, the assessed impact and likelihood of each risk, and the prevailing regulatory
environment.
1) Risk Mitigation (Reduction):
This strategy involves implementing controls to reduce either the probability or impact of the risk.
Mitigation does not eliminate risk but brings it within acceptable limits.
Common Controls:
• Software patching and updates
• Access management and RBAC
• Employee awareness training
• Backup systems and business continuity plans
Scenario: A company identifies vulnerabilities in its web application. It deploys a Web Application Firewall
(WAF) and updates code to mitigate the risk of SQL injection attacks.


2) Risk Avoidance:
Avoidance involves completely eliminating the activity that introduces the risk. This is suitable when
the risk is too severe and cannot be brought within the organization’s risk tolerance.
Scenario: A bank considering entering the cryptocurrency market decides to avoid it due to high
regulatory uncertainty and volatility, thereby avoiding associated compliance and operational risks.
3) Risk Transfer:
Here, the risk is shifted to another entity, typically through contracts or insurance. This strategy is
ideal when another party is better equipped to manage the risk.
Common Mechanisms:
• Cyber insurance
• Outsourcing to third-party vendors
• Service Level Agreements (SLAs)
Scenario: A company handling sensitive customer data transfers its cloud data storage to a certified
third-party provider with higher-grade security and compliance infrastructure. This transfers some
data protection risks to the vendor.
4) Risk Acceptance:
This strategy involves acknowledging the risk and choosing not to take action, often due to its low
impact or the high cost of mitigation outweighing the benefits. Accepted risks are documented and
monitored.
Scenario: A minor bug in an internal reporting tool is identified. Since it does not impact business
operations or customer data, the organization decides to accept it with a plan to monitor for changes.
Decision-Making Criteria:
• Risk Appetite: Determines which risks are tolerable and which require action. A low appetite for
financial or compliance risk necessitates more proactive treatments.
• Impact & Likelihood: High-impact/high-probability risks often require mitigation or avoidance.
Low-impact/low-probability risks may be accepted.
• Regulatory Compliance: Legal requirements may mandate specific responses (e.g., breach
notification, encryption standards).
• Cost-Benefit Analysis: The cost of implementing controls should be weighed against potential
loss.
Example: A hospital must comply with HIPAA regulations. Even minor data breach risks must be mitigated
or transferred, regardless of cost, due to regulatory consequences.
An effective risk treatment strategy balances risk appetite, business goals, and regulatory obligations. The
selected strategy should be dynamic, revisited periodically, and supported by proper documentation and
oversight to ensure continued effectiveness and alignment with evolving risks.
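The decision-making criteria above can be combined into a simple scoring rule. The 1-to-5 scales, thresholds, and cost comparison below are illustrative assumptions, not a prescribed standard:

```python
def choose_treatment(likelihood, impact, mitigation_cost, expected_loss):
    """Toy decision rule mapping a risk score (likelihood x impact,
    each on a 1-5 scale) to one of the four treatment strategies.
    Thresholds are illustrative only."""
    score = likelihood * impact                     # 1..25
    if score >= 20:
        return "avoid"                              # beyond risk tolerance
    if score >= 10:
        # Cost-benefit: mitigate only if controls cost less than
        # the loss they prevent; otherwise shift the risk elsewhere.
        return "mitigate" if mitigation_cost < expected_loss else "transfer"
    return "accept"                                 # document and monitor

assert choose_treatment(5, 5, 10_000, 500_000) == "avoid"
assert choose_treatment(3, 4, 10_000, 500_000) == "mitigate"
assert choose_treatment(3, 4, 600_000, 500_000) == "transfer"
assert choose_treatment(1, 2, 10_000, 500_000) == "accept"
```

In practice regulatory obligations override such a rule (as in the hospital example above, where mitigation is mandatory regardless of cost), which is why the scoring output is an input to judgment, not a substitute for it.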


31 Foundational Security Measures

Defense-in-depth is a strategic approach to cybersecurity that employs multiple layers of security controls
throughout an organization’s IT environment. The objective is to create redundancy and resilience, ensuring
that if one layer fails, others remain in place to mitigate risk. This layered defense minimizes the likelihood
of a successful cyberattack or unauthorized data access.
One of the foundational layers in this strategy is access control, which ensures that only authorized users
can access systems, applications, or data. Techniques include Role-Based Access Control (RBAC) and
Attribute-Based Access Control (ABAC), which assign permissions based on user roles or contextual
attributes like location and time. The Principle of Least Privilege further restricts users to the minimum level
of access necessary, thereby reducing the attack surface. Enhancing access control, Multi-Factor
Authentication (MFA) provides an additional layer of identity verification through the use of passwords,
biometric data, and physical tokens.
Another critical component is encryption, which safeguards data confidentiality and integrity both at rest
and in transit. For example, AES-256 encryption is widely used to protect stored data, while Public Key
Infrastructure (PKI) enables secure communication over the internet using paired public-private keys and
digital certificates. These encryption methods ensure that even if data is intercepted, it cannot be understood
without the decryption keys.
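Python's standard library has no AES-256 implementation, so the sketch below uses a toy SHA-256-based keystream purely to illustrate the principle that intercepted ciphertext is unreadable without the key. Real systems must use a vetted AES library, never a hand-rolled cipher like this one:

```python
import hashlib

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: derive a keystream from the key with SHA-256
    # and XOR it with the data. Applying it twice with the same key decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

plaintext = b"payroll: confidential"
key = b"shared-secret-key"                     # illustrative key
ciphertext = xor_stream(plaintext, key)
assert ciphertext != plaintext                 # unreadable if intercepted
assert xor_stream(ciphertext, key) == plaintext        # same key decrypts
assert xor_stream(ciphertext, b"wrong-key") != plaintext
```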
Incident response (IR) forms the reactive layer of defense-in-depth. It involves detecting, analyzing, and
responding to security breaches to minimize damage. An effective Incident Response Plan (IRP) includes
steps such as preparation, detection, containment, eradication, and recovery. Complementing this is Security
Information and Event Management (SIEM) technology, which collects and analyzes logs across systems to
detect anomalies and generate real-time alerts.
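A SIEM correlation rule can be as simple as counting failed logins per user across aggregated logs. The log fields and the alert threshold below are assumptions for illustration:

```python
from collections import Counter

def flag_anomalies(events, threshold=3):
    """Tiny SIEM-style rule: alert on any user with `threshold` or more
    failed logins. Field names and threshold are illustrative."""
    failures = Counter(e["user"] for e in events if e["status"] == "fail")
    return {user for user, n in failures.items() if n >= threshold}

log = [
    {"user": "alice",   "status": "ok"},
    {"user": "mallory", "status": "fail"},
    {"user": "mallory", "status": "fail"},
    {"user": "mallory", "status": "fail"},
    {"user": "bob",     "status": "fail"},      # one failure: below threshold
]
assert flag_anomalies(log) == {"mallory"}
```

Production SIEMs apply hundreds of such rules plus statistical baselining, but each alert ultimately feeds the same IRP steps: detection, containment, eradication, recovery.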
Together, these foundational measures—access control, encryption, and incident response—provide a
robust, integrated security posture. They deter unauthorized access, protect sensitive information, and
enable swift recovery from threats, thereby strengthening the organization’s overall cyber resilience.
32 Role of Advanced and Infrastructure-Level Technologies in IT Security
As cyber threats evolve in sophistication, traditional IT security approaches are often insufficient.
Organizations are increasingly adopting advanced security technologies and infrastructure-level controls to
proactively detect, prevent, and respond to modern attacks.
Artificial Intelligence (AI) and Machine Learning (ML) have become vital tools in cybersecurity. These
technologies enable real-time analysis of vast datasets to detect anomalies and predict threats. Unlike static
rule-based systems, AI-driven tools can learn from patterns of behavior and identify previously unknown
attack vectors. For example, AI is used in behavioral analysis to detect unusual user activities, which may
indicate credential compromise or insider threats.
A paradigm shift in IT security is the implementation of Zero Trust Architecture. This model assumes that
threats can exist both outside and inside the network perimeter. As such, it enforces continuous
authentication, strict identity verification, and micro-segmentation of network resources. This means that
no user or device is automatically trusted—even those inside the corporate network. Every access request
is validated, minimizing the potential damage of breaches.
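The Zero Trust checks described here (fresh identity verification, device posture, least-privilege access to a segmented resource) can be sketched as an explicit per-request policy. All field names are illustrative:

```python
def authorize_request(user, device, resource, context):
    """Zero Trust sketch: every request is re-verified; nothing is
    trusted by network location alone. Fields are illustrative."""
    checks = [
        context.get("mfa_verified", False),        # continuous authentication
        device.get("compliant", False),            # device posture check
        resource in user.get("allowed", set()),    # micro-segmented access
    ]
    return all(checks)

user = {"name": "alice", "allowed": {"ledger-db"}}
device = {"compliant": True}
assert authorize_request(user, device, "ledger-db", {"mfa_verified": True})
# Same user and device, but no fresh MFA: denied, even "inside" the network.
assert not authorize_request(user, device, "ledger-db", {"mfa_verified": False})
```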
In parallel, endpoint security has become critical due to the rise in remote work and mobile device usage.
Endpoint Detection and Response (EDR) tools continuously monitor devices such as laptops, smartphones,
and desktops for threats and provide visibility into endpoint activities. They not only prevent malware but
also allow for rapid response and forensic investigation. Mobile Device Management (MDM) further
strengthens control by enforcing security policies across employee devices that access organizational data.
Infrastructure-level controls like Hardware Security Modules (HSMs) and Trusted Platform Modules (TPMs)
provide secure environments for cryptographic operations and secure boot processes. Additionally, network
security tools like firewalls, Intrusion Detection/Prevention Systems (IDS/IPS), VPNs, and network
segmentation enhance perimeter and internal network defenses.


By integrating these technologies—AI/ML for intelligent threat detection, Zero Trust for strong identity and
access enforcement, and endpoint and infrastructure controls for layered protection—organizations can
create a comprehensive and adaptive cybersecurity strategy. This not only helps in defending against known
threats but also in anticipating and neutralizing emerging attack vectors in real-time.
33 Cybersecurity Threats and Defense Strategies
1) Malware

o How It Exploits Vulnerabilities: Malware, such as viruses, ransomware, or trojans, is malicious
software that infiltrates systems to disrupt operations, steal data, or extort money. It exploits
vulnerabilities like unpatched software, weak access controls, or user errors (e.g., downloading
infected files). For example, ransomware can encrypt critical data by exploiting an outdated
operating system, rendering it inaccessible until a ransom is paid.

o Preventive Measure: Regular Software Patching and Updates
  - Explanation: This involves installing the latest security patches released by software vendors to fix known vulnerabilities. By keeping systems up to date, organizations close entry points that malware might exploit, reducing the likelihood of infection. For instance, patching a flaw in a web server prevents ransomware from gaining a foothold.

o Detection Method: Antivirus and Anti-Malware Software
  - Explanation: Antivirus tools scan systems in real-time, using signature-based detection to identify known malware and heuristic analysis to spot suspicious behavior from unknown threats. If malware slips through, the software can quarantine it, alerting IT teams to respond.
  - How They Work Together: Patching prevents malware from exploiting known vulnerabilities, while antivirus detects and neutralizes any malware that bypasses this defense, such as zero-day threats. This layered approach minimizes both the chance of infection and the extent of damage, enhancing overall security.
2) Phishing
o How It Exploits Vulnerabilities: Phishing involves fraudulent emails, texts, or messages that trick users into revealing credentials or clicking malicious links, often disguised as legitimate communications. It exploits human vulnerabilities, such as lack of awareness or trust in familiar brands, rather than technical flaws. For example, an employee might enter login details on a fake banking site, granting attackers access to corporate systems.

o Preventive Measure: Employee Training and Awareness Programs
  - Explanation: Regular training educates employees to recognize phishing signs (e.g., misspelled domains, urgent requests) and avoid risky actions. Simulated phishing exercises reinforce this, reducing the likelihood of successful attacks by strengthening the human firewall.

o Detection Method: Security Information and Event Management (SIEM) Systems
  - Explanation: SIEM systems aggregate and analyze logs from across the network, detecting anomalies like unusual login attempts or data exfiltration that might indicate a phishing breach. They provide real-time alerts, enabling rapid investigation.
  - How They Work Together: Training prevents employees from falling for phishing attempts, lowering the initial risk, while SIEM detects successful attacks by flagging suspicious activity (e.g., a login from an unfamiliar location). Together, they reduce both the entry and escalation of phishing incidents, bolstering security.


Impact of Failing to Address These Threats on Reputation and Operations:


Failing to address malware and phishing can severely damage an organization’s reputation and
operations. A malware attack, like ransomware, could halt operations by locking critical systems—
e.g., an e-commerce site losing sales during a peak season—leading to revenue loss and customer
frustration. If sensitive data is compromised, customers may lose trust, perceiving the organization as
unreliable, which harms its reputation long-term. Similarly, a successful phishing attack exposing
client data could trigger regulatory fines and lawsuits, further eroding credibility. Operationally, the
downtime and recovery efforts divert resources, delaying projects and disrupting supply chains. For
instance, a hospital hit by ransomware might delay patient care, amplifying reputational damage as
public safety is compromised. In both cases, the failure to act proactively signals weakness, potentially
driving customers and partners to competitors perceived as more secure.
34 Emerging Cybersecurity Risks in a Tech-Driven Organization
TechNova Solutions has embraced Artificial Intelligence (AI), Internet of Things (IoT), and cloud computing
to enhance its operations. However, these technologies introduce cybersecurity risks that threaten its
systems, data, and reputation. Given below is an analysis of specific risks for each technology, along with
mitigation measures and their impact on security.
a) AI Risks and Mitigation
Risk 1: Algorithmic Bias in Chatbots
• Analysis: TechNova’s AI-powered chatbots rely on training data to respond to customer queries.
If the data contains biases (e.g., favoring certain demographics), the chatbots could provide
discriminatory or unfair responses, alienating customers and exposing the company to legal and
reputational risks.
• Mitigation Measure: Regular Audits and Bias Detection
o Explanation: TechNova should conduct periodic audits of its AI models, reviewing training data for representativeness and testing chatbot outputs across diverse scenarios. Bias detection tools can flag unfair patterns, allowing the team to retrain models with corrected datasets. This reduces the risk by ensuring fair, accurate responses, enhancing security by maintaining customer trust and avoiding legal liabilities.
Risk 2: Lack of Transparency (Black Box Problem)
• Analysis: The AI’s decision-making process may be opaque, making it difficult to explain chatbot
actions to customers or regulators (e.g., why a query was mishandled). This lack of transparency
could erode trust and complicate compliance with data protection laws.
• Mitigation Measure: Implement Explainable AI (XAI) Techniques
o Explanation: By integrating XAI, TechNova can provide insights into how chatbots reach decisions (e.g., showing key factors influencing a response). This enhances security by increasing transparency, enabling the IT team to identify and fix errors quickly, and reassuring customers and regulators of accountability, thus reducing reputational and compliance risks.
b) IoT Risks and Mitigation
Risk 1: Inadequate Security of IoT Devices
• Analysis: TechNova’s smart office IoT devices (e.g., sensors, cameras) may have weak security
features like default passwords or unencrypted communications, making them easy targets for
hackers to gain unauthorized access and disrupt operations.


• Mitigation Measure: Strong Authentication Protocols


o Explanation: Implementing multi-factor authentication (MFA) and replacing default credentials with unique, complex passwords strengthens device security. This reduces the risk by making it harder for attackers to breach devices, even if one authentication factor is compromised, thereby enhancing TechNova’s cybersecurity posture by protecting the office infrastructure from unauthorized control or data theft.
Risk 2: Increased Attack Surface
• Analysis: Each IoT device expands TechNova’s attack surface, providing multiple entry points
for attackers. A compromised device could enable lateral movement to critical systems,
amplifying the potential damage.
• Mitigation Measure: Network Segmentation
o Explanation: Segmenting IoT devices onto a separate network isolates them from core IT systems. If a device is hacked, the attack is contained, preventing access to sensitive data or operational controls. This strengthens cybersecurity by limiting the scope of a breach, ensuring that the broader network remains secure and operational.
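Conceptually, segmentation reduces to an allow-list of permitted cross-segment flows, with everything else denied. The segment names below are hypothetical:

```python
# Network segmentation sketch: IoT devices live on their own segment,
# and a policy table decides which cross-segment flows are allowed.
ALLOWED_FLOWS = {
    ("iot", "iot-gateway"),     # sensors and cameras may reach their gateway
    ("corp", "iot-gateway"),    # IT staff may manage the gateway
}

def is_flow_allowed(src_segment, dst_segment):
    # Anything not explicitly allowed is blocked (default deny).
    return (src_segment, dst_segment) in ALLOWED_FLOWS

assert is_flow_allowed("iot", "iot-gateway")
# A compromised camera on the IoT segment cannot reach core systems:
assert not is_flow_allowed("iot", "corp")
```

Real deployments enforce this with VLANs and firewall rules, but the containment logic is the same: a breached device's reach ends at its segment boundary.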
c) Cloud Computing Risks and Mitigation
Risk 1: Misconfigured Cloud Settings
• Analysis: TechNova’s cloud platform could have misconfigured settings, such as publicly
accessible storage buckets or lax access controls, risking data exposure. For example, customer
data could be leaked due to an improperly secured database, leading to breaches and fines.
• Mitigation Measure: Regular Cloud Configuration Audits
o -o Explanation: Conducting frequent audits using tools like AWS Trusted Advisor identifies
and corrects misconfigurations (e.g., disabling public access). This mitigates the risk by
ensuring settings align with security best practices, preventing unauthorized access and
enhancing data security, thus protecting TechNova from breaches and compliance issues.
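A configuration audit of this kind can be reduced to a set of rules checked against each storage resource. The sketch below is hypothetical: the bucket names, setting fields, and rules are illustrative and not tied to any real provider's API.

```python
# Hypothetical sketch: flag risky cloud storage settings before they cause exposure.
# Bucket names and policy fields are illustrative, not a real provider's schema.

RISK_RULES = {
    "public_access": lambda cfg: cfg.get("public_access") is True,
    "no_encryption": lambda cfg: not cfg.get("encryption_at_rest", False),
    "open_acl": lambda cfg: "AllUsers" in cfg.get("acl_grants", []),
}

def audit_buckets(buckets):
    """Return a list of (bucket, issue) findings for misconfigured storage."""
    findings = []
    for name, cfg in buckets.items():
        for issue, check in RISK_RULES.items():
            if check(cfg):
                findings.append((name, issue))
    return findings

buckets = {
    "customer-data": {"public_access": True, "encryption_at_rest": True, "acl_grants": []},
    "internal-logs": {"public_access": False, "encryption_at_rest": False,
                      "acl_grants": ["AllUsers"]},
}
for bucket, issue in audit_buckets(buckets):
    print(f"FINDING: {bucket}: {issue}")
```

Running such a check on a schedule turns the one-off audit into a continuous control, which is the behaviour real tools like AWS Trusted Advisor automate.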
Risk 2: Lack of Visibility and Control
• Analysis: In a cloud environment, TechNova may lack full visibility into data flows and user
activities, creating blind spots where threats (e.g., insider misuse) go unnoticed, especially in a
multi-cloud setup.
• Mitigation Measure: Deploy Cloud Access Security Brokers (CASBs)
o Explanation: CASBs provide visibility into cloud usage, monitor activities, and enforce
policies (e.g., blocking risky behaviors). This reduces the risk by detecting anomalies in real-
time and ensuring control over data, improving security by closing visibility gaps and
safeguarding against internal and external threats.
d) Strategic Impact
Discussion:
Proactive management of these AI, IoT, and cloud computing risks can position TechNova Solutions as
an industry leader by enhancing customer trust, operational resilience, and competitive advantage. By
addressing AI risks like bias and transparency, TechNova ensures its chatbots deliver fair, reliable
service, building customer confidence and loyalty—key differentiators in a competitive market.
Mitigating IoT risks strengthens the smart office’s security, ensuring uninterrupted operations and
showcasing TechNova’s reliability, which attracts clients valuing stability. Securing the cloud platform
protects sensitive data, aligning with regulatory standards and reinforcing trust with stakeholders,
potentially avoiding costly penalties that competitors might face.


Operationally, these measures minimize disruptions—e.g., preventing ransomware entering via IoT
devices or data leaks from the cloud—allowing TechNova to maintain service continuity and focus on innovation. This
resilience supports scalability, a critical factor for growth. Competitively, TechNova can market its
robust cybersecurity as a unique selling point, appealing to security-conscious customers and
partners, especially in industries like finance or healthcare. By demonstrating leadership in managing
emerging tech risks, TechNova not only safeguards its reputation but also sets a benchmark,
potentially influencing industry standards and gaining a first-mover advantage in a tech-driven
landscape.
Conclusion
TechNova Solutions faces distinct cybersecurity risks from AI (bias, transparency), IoT (device
security, attack surface), and cloud computing (misconfiguration, visibility). Mitigation measures—
audits, XAI, MFA, segmentation, and CASBs—address these risks by preventing incidents and
enhancing detection, collectively strengthening security. Proactively managing these threats positions
TechNova as a trusted, resilient leader, driving customer loyalty and competitive edge in its industry.
35 Objectives and Importance of IT Controls
Definition of IT General Controls (ITGCs):
IT General Controls (ITGCs) are a set of policies, procedures, and practices designed to ensure the reliability,
security, and integrity of an organization’s IT systems. Unlike application-specific controls tailored to
individual software, ITGCs apply broadly across the IT environment, covering areas such as access
management, system operations, and physical security. They provide a foundational framework to safeguard
IT infrastructure, ensuring that technology supports business objectives effectively.
Role in Managing Risks within an Organization’s IT Environment:
ITGCs play a critical role in managing risks by establishing standardized processes to protect IT systems
from threats like cyberattacks, data breaches, system failures, and human errors. They mitigate
vulnerabilities by enforcing controls that maintain the confidentiality, integrity, and availability of data and
systems. By aligning IT operations with regulatory requirements and business goals, ITGCs reduce the
likelihood of disruptions, financial losses, and reputational damage, fostering operational resilience and
stakeholder trust in an increasingly digital landscape.
Key Objectives and How They Mitigate Specific IT-Related Risks:
1) Ensuring Data Integrity

o Description: This objective focuses on preventing unauthorized access, modification, or
deletion of data, ensuring it remains accurate, complete, and reliable throughout its lifecycle.
ITGCs achieve this through measures like access restrictions, data validation checks, and audit
trails.

o Mitigation of Specific Risk: Ensuring data integrity mitigates the risk of data tampering or
corruption, which could stem from insider threats or malware. For instance, if an employee or
malicious software alters financial records undetected, it could lead to fraudulent reporting or
decision-making errors. Robust access controls and monitoring prevent unauthorized changes,
reducing this risk by maintaining data trustworthiness.

o Contribution to Risk Management: By safeguarding data accuracy, this objective protects
against financial misstatements, compliance violations, and operational inefficiencies, ensuring
reliable information for critical processes.
2) Maintaining System Availability

o Description: This objective ensures IT systems remain operational and accessible when
needed, supporting business continuity. ITGCs achieve this through redundancy planning,
disaster recovery protocols, and regular maintenance to prevent outages.



o Mitigation of Specific Risk: Maintaining system availability mitigates the risk of system
downtime caused by hardware failures, cyberattacks (e.g., ransomware), or natural disasters.
For example, a DDoS attack could overwhelm servers, halting online services, but redundant
systems and recovery plans ensure continuity. This reduces the risk of operational
interruptions and revenue loss.

o Contribution to Risk Management: By keeping systems functional, this objective minimizes
disruptions to customer services, supply chains, or internal workflows, preserving operational
stability and responsiveness.
36 IT General Controls and Information and Communication Technology strategies
A financial institution facing rising cybersecurity risks—such as data breaches, ransomware, phishing,
insider threats, and regulatory non-compliance—requires a robust combination of IT General Controls
(ITGCs) and Information and Communication Technology (ICT) strategies to safeguard its systems,
data, and operations. The following recommendations are tailored to address the institution’s specific needs.
Each recommendation is justified based on its ability to mitigate identified risks, ensure compliance, and
enhance resilience.
Recommended IT General Controls (ITGCs)
ITGCs provide a foundational framework to ensure the integrity, security, and availability of IT systems. The
following components are critical for mitigating cybersecurity risks in a financial institution:
1) Access Controls

o Recommendations:
- ▪ Implement Multi-Factor Authentication (MFA) for all users, requiring a password and
a one-time code sent to a mobile device or biometric verification (e.g., fingerprint or facial
recognition).
- ▪ Use Role-Based Access Control (RBAC) to limit access to sensitive financial systems and
data based on job roles (e.g., tellers cannot access administrative systems).
- ▪ Deploy Privileged Access Management (PAM) to monitor and restrict system
administrators’ activities, ensuring elevated permissions are logged and time-limited.

o Justification:
- ▪ Financial institutions handle highly sensitive data (e.g., customer financial records),
making unauthorized access a top risk. MFA and RBAC reduce the likelihood of breaches
from stolen credentials or phishing attacks, which are prevalent in the sector. PAM
addresses insider threats by controlling privileged users who could misuse access, a
critical concern given the potential for significant financial damage.
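The RBAC recommendation above amounts to a default-deny permission lookup per role. The sketch below is a minimal illustration under assumed role and permission names; a real banking system's scheme would be far richer.

```python
# Hypothetical RBAC sketch: roles and permissions are illustrative, not an actual
# banking system's scheme. Access is denied unless the role explicitly grants it.

ROLE_PERMISSIONS = {
    "teller":  {"view_account", "post_deposit"},
    "auditor": {"view_account", "view_audit_log"},
    "admin":   {"view_account", "post_deposit", "view_audit_log", "manage_users"},
}

def is_allowed(role, action):
    """Default-deny: unknown roles or unlisted actions are refused."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("teller", "post_deposit")
assert not is_allowed("teller", "manage_users")   # tellers cannot reach admin functions
```

The default-deny design matters: an attacker who phishes a teller's credentials still cannot reach administrative functions, which is the containment effect the justification describes.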
2) Change Management Controls

o Recommendations:
- ▪ Establish a formal change approval process requiring authorization from IT and
compliance teams for system updates or software patches.
- ▪ Conduct testing and validation in a sandbox environment before deploying changes to
production systems (e.g., testing a new payment processing update).
- ▪ Maintain detailed documentation of all changes for audit trails and accountability.

o Justification:
- ▪ Uncontrolled changes can introduce vulnerabilities, such as unpatched software
exploited by ransomware. A structured change management process ensures updates are
secure and compliant with regulations like SOX, reducing operational and cybersecurity
risks.


3) IT Operations Controls

o Recommendations:
- ▪ Implement daily backups of critical financial data (e.g., transaction records) stored in a
secure, offsite location, with quarterly recovery testing.
- ▪ Develop an incident management process with predefined protocols to address
breaches or outages swiftly.
- ▪ Use real-time system monitoring tools to detect anomalies, such as unusual transaction
patterns or network traffic spikes.

o Justification:
- ▪ Ransomware and system failures threaten data availability, and backups ensure rapid
recovery. Incident management minimizes damage from breaches, while monitoring
detects threats like hacking or DDoS attacks early, critical for maintaining trust in a
financial institution.
4) Physical Security Controls

o Recommendations:
- ▪ Restrict access to data centers with biometric scanners and key card systems, logging
all entries.
- ▪ Install environmental controls (e.g., fire suppression, temperature regulation) and
uninterruptible power supplies (UPS) to protect servers.
- ▪ Deploy CCTV surveillance and motion sensors to monitor server rooms.

o Justification:
- ▪ Physical breaches can compromise servers hosting sensitive financial data. These
controls prevent unauthorized access and ensure system availability, aligning with the
need to safeguard confidentiality and integrity.
Recommended ICT Strategies
ICT enhances risk management by leveraging advanced tools and technologies for proactive identification,
reporting, and mitigation of cybersecurity risks. The following strategies are recommended:
1) Data Analytics and Real-Time Monitoring

o Recommendations:
- ▪ Deploy Security Information and Event Management (SIEM) systems to aggregate and
analyze logs from financial systems in real time.
- ▪ Use AI-driven analytics to detect anomalies, such as unusual login attempts or large fund
transfers, indicating potential breaches or fraud.

o Justification:
- ▪ Financial institutions face sophisticated threats like phishing and insider attacks. SIEM
and AI tools provide continuous monitoring and early detection, enabling rapid response
to minimize financial and reputational damage.
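In the simplest case, the anomaly detection described above compares each new event against a historical baseline. The sketch below uses a z-score threshold over transfer amounts; the threshold, data, and field choices are assumptions for illustration, far simpler than a production SIEM rule.

```python
import statistics

# Illustrative anomaly check in the spirit of SIEM analytics: flag events far outside
# a historical baseline. The z-score threshold and sample values are assumptions.

def flag_anomalies(history, new_events, z_threshold=3.0):
    """Flag amounts more than z_threshold standard deviations above the baseline mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [e for e in new_events if (e - mean) / stdev > z_threshold]

baseline = [120, 95, 130, 110, 105, 125, 118, 102]   # typical daily transfer amounts
suspicious = flag_anomalies(baseline, [115, 5000])
assert suspicious == [5000]   # only the outlier transfer is escalated for review
```

A SIEM applies the same idea continuously across many signals (logins, traffic volumes, transfer patterns), often with machine-learned baselines rather than a fixed threshold.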
2) Automated Risk Scanners and Vulnerability Management

o Recommendations:
- ▪ Implement automated vulnerability scanners to regularly check for unpatched
software, misconfigured firewalls, or weak encryption in banking systems.
- ▪ Conduct penetration testing quarterly to simulate cyberattacks and identify exploitable
weaknesses.



o Justification:
- ▪ Automated scanners proactively identify vulnerabilities that could be exploited by
malware or hackers, while penetration testing ensures defenses are robust against
advanced threats like zero-day exploits. This is vital for a sector targeted by
cybercriminals.
3) Incident Response Systems

o Recommendations:
- ▪ Use Security Orchestration, Automation, and Response (SOAR) platforms to
automate responses, such as isolating compromised systems or blocking malicious IP
addresses.
- ▪ Develop a forensic analysis capability to investigate breaches and strengthen defenses
post-incident.

o Justification:
- ▪ Rapid response is critical to limit damage from ransomware or data breaches. SOAR
automates containment, reducing downtime, while forensics ensures lessons learned
improve future resilience, aligning with financial sector needs for operational continuity.
4) Business Continuity and Disaster Recovery

o Recommendations:
- ▪ Implement cloud-based disaster recovery solutions to replicate critical systems and
data offsite, ensuring rapid restoration after an attack.
- ▪ Conduct regular disaster recovery drills to test failover to backup systems.

o Justification:
- ▪ Cyberattacks like DDoS or ransomware can disrupt services. Cloud backups and drills
ensure minimal downtime and data loss, maintaining customer trust and regulatory
compliance.
5) User Education and Phishing Awareness

o Recommendations:
- ▪ Conduct regular phishing simulations and training to educate employees on
recognizing fraudulent emails or links.
- ▪ Promote password management practices, encouraging the use of password managers
and MFA.

o Justification:
- ▪ Human error, especially via phishing, is a leading cause of breaches. Training reduces this
risk, while strong password practices bolster access security, critical for protecting
sensitive financial data.
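A password-policy gate supporting the training guidance above can be expressed as a small set of checks. The rules here (minimum length, character classes) are illustrative assumptions, not a standard; real policies should also screen against breached-password lists.

```python
import re

# Minimal sketch of a password-policy check; the rules are illustrative assumptions.

def password_ok(pw, min_length=12):
    """Require minimum length plus lower, upper, digit, and special characters."""
    checks = [
        len(pw) >= min_length,
        re.search(r"[a-z]", pw),
        re.search(r"[A-Z]", pw),
        re.search(r"\d", pw),
        re.search(r"[^A-Za-z0-9]", pw),
    ]
    return all(bool(c) for c in checks)

assert not password_ok("password123")          # common weak pattern is rejected
assert password_ok("C0rrect-Horse-Battery!")   # long, mixed character classes
```

Such a gate complements rather than replaces MFA: it raises the cost of credential guessing, while MFA limits the damage if a password is phished anyway.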
Integration and Justification Summary
• Why ITGCs and ICT Together?
o ITGCs establish a structured, foundational control environment, ensuring system reliability and
compliance, while ICT provides dynamic, technology-driven tools to address evolving threats.
Together, they create a layered defense that is both preventive and responsive.


• Alignment with Financial Sector Needs:


o The recommendations address key risks like data breaches, ransomware, and insider threats,
which are acute in finance due to the high value of data and strict regulations. They ensure data
integrity, system availability, and confidentiality, while leveraging ICT for proactive risk
management.
• Cost-Benefit Consideration:
o While implementing these controls and tools requires investment, the cost of a breach—
financial losses, fines, and reputational damage—far outweighs the expense. Automation and
scalability further optimize resource use.
By combining these ITGCs and ICT strategies, the financial institution can mitigate rising cybersecurity risks
effectively, ensuring resilience, regulatory compliance, and customer trust in an increasingly threat-laden
digital landscape.
37 National Cyber Security Policy (NCSP) 2021
Three-Tier Governance Structure & National-Level Entity
The NCSP 2021 establishes a three-tier cybersecurity governance structure:
1) National Level:

o Key Entity: National Computer Emergency Response Team (nCERT)

o Primary Function: Acts as Pakistan’s central cybersecurity watchdog, responsible for:
▪ Monitoring and responding to cyber threats in real time.
▪ Coordinating with international CERTs (e.g., INTERPOL, APNIC) for threat intelligence
sharing.
▪ Issuing alerts and advisories during cyber incidents (e.g., ransomware attacks on critical
infrastructure).
2) Sectoral Level:

o Sector-specific CERTs (e.g., Banking CERT, Telecom CERT) to address industry-specific risks.
3) Organizational Level:

o Mandates Chief Information Security Officers (CISOs) in critical organizations to implement
security policies and compliance audits.
38 Prevention of Electronic Crimes Act (PECA) 2016
Definition of "Cyberterrorism" under PECA 2016
Under Section 10 of PECA 2016, cyberterrorism is defined as any act committed using computers, networks,
or digital systems with the intent to:
• Terrorize the public or a segment of the public
• Advance sectarian, ethnic, or religious hatred
• Disrupt critical infrastructure (e.g., energy grids, banking systems, government databases)
• Coerce the government or international organizations
Example of Cyberterrorism under PECA
An act that qualifies as cyberterrorism could be:
• A coordinated cyberattack on Pakistan’s national power grid, causing prolonged blackouts.


Key Characteristics of Cyberterrorism under PECA


• Intent to Cause Harm or Fear: Unlike hacking for financial gain, cyberterrorism aims to instill terror.
• Targeting Critical Infrastructure: Attacks on systems vital to national security are treated as
cyberterrorism.
• Link to Terrorism: The offense aligns with Pakistan’s broader counterterrorism laws, allowing
authorities to prosecute suspects under both PECA and anti-terrorism statutes.
39 Electronic Transactions Ordinance (ETO) 2002
Role of a Certification Service Provider (CSP) under ETO 2002
Under the Electronic Transactions Ordinance (ETO) 2002, a Certification Service Provider (CSP) is an
accredited entity responsible for:
1. Issuing Digital Certificates:

o CSPs verify the identity of individuals/organizations and issue digital certificates that bind an
entity to its public key, ensuring the authenticity of electronic signatures.
2. Validating Electronic Signatures:

o They provide Advanced Electronic Signatures (AES), which meet strict technical standards (e.g.,
PKI-based encryption) and are legally equivalent to handwritten signatures.
3. Maintaining Security Standards:

o CSPs must comply with regulations set by the Electronic Certification Accreditation Council
(ECAC) and publish a Certification Practice Statement (CPS) detailing their security protocols.
How CSPs Ensure Authenticity:
• Identity Verification: Before issuing a certificate, CSPs authenticate the applicant’s identity through
documents (e.g., CNIC, business registration).
• Non-Repudiation: Digital signatures created via CSP-issued certificates cannot be denied by the signer,
ensuring legal enforceability.
• Revocation for Compromised Certificates: If a private key is breached, CSPs can revoke the certificate to
prevent misuse.
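The validity check a relying party performs against a CSP-issued certificate can be sketched as two tests: the certificate must be within its validity window, and its serial must not appear on the CSP's revocation list. Field names and serial values below are illustrative assumptions; real validation (e.g., under PKI standards) also verifies the issuer's signature chain.

```python
from datetime import date

# Hypothetical sketch of the validity check described above: a certificate is trusted
# only if it is within its validity window and its serial is not on the CRL.
# Field names and serials are illustrative; real PKI also checks the signature chain.

def certificate_valid(cert, crl_serials, today):
    if cert["serial"] in crl_serials:
        return False                      # revoked, e.g. after key compromise
    return cert["not_before"] <= today <= cert["not_after"]

cert = {"serial": "7F3A", "not_before": date(2024, 1, 1), "not_after": date(2026, 1, 1)}
crl = {"9B2C"}                            # serials revoked by the CSP
assert certificate_valid(cert, crl, date(2025, 6, 1))
crl.add("7F3A")                           # private key breached -> CSP revokes
assert not certificate_valid(cert, crl, date(2025, 6, 1))
```

Revocation is what preserves non-repudiation after a compromise: signatures made before revocation remain attributable, while new misuse of the stolen key is cut off.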
