
National Forensic Sciences University

An Institution of National Importance


(Ministry of Home Affairs, Government of India)

MODULE-1 Cyber-crime Investigation and Digital Forensics

Advanced Digital Forensics

Presenter:
Yash Patel
Introduction to Cyber-crime Investigation
What is Cyber-crime?

● Cyber-crime refers to illegal activities conducted via the internet or electronic devices: criminal activities involving computers, networks, or data.
Introduction to Cyber-crime Investigation
Types of Cyber-Crimes

● Hacking
● Phishing
● Ransomware
● Identity theft
● Financial fraud
● Malware Distribution
● Cyberstalking
Introduction to Cyber-crime Investigation
Phases of Investigation

● Incident Detection: Identification of the cyber-crime.


● Evidence Collection: Gathering digital evidence through techniques like log
analysis, network monitoring, and forensic imaging.
● Analysis: Analyzing collected data to uncover the offender and modus
operandi.
● Legal Proceedings: Presenting evidence in court for prosecution.
Introduction to Cyber-crime Investigation
Key Tools Used in Cyber Investigations

● Forensic Imaging Tools: FTK Imager and EnCase


● Network Analyzers: Wireshark and Tcpdump
● Malware Analysis Tools: IDA Pro and Ghidra
● Log Analysis Tools: Splunk, QRadar and Graylog
● Endpoint Detection and Response (EDR): Carbon Black and CrowdStrike
● Password Cracking Tools: John the Ripper and Hashcat
● File Carving Tools: Scalpel and Foremost
● Threat Intelligence Platforms: ThreatConnect and Recorded Future
Introduction to Cyber-crime Investigation
Challenges in Cyber-Crime Investigations

● Encryption and anonymity


● Jurisdictional issues in cross-border crimes
● Rapidly evolving technologies
Conducting an Investigation
Key Phases:

1. Identification
● Locating potential sources of evidence (devices, networks, etc.)
● Example: Detecting that a compromised employee laptop contains potential
evidence of intellectual property theft.

2. Preservation
● Ensuring that evidence is protected and not altered.
● Example: Creating a bit-by-bit forensic image of the laptop's hard drive to
preserve the original data.
Conducting an Investigation
Key Phases:

3. Collection
● Gathering digital data in a forensically sound manner.
● Example: Extracting email archives, internet history, and deleted files from the
forensic image using specialized tools.

4. Examination
● Processing and analyzing the data to uncover relevant information.
● Example: Searching through the email archives for suspicious attachments or
communications related to the incident.
Conducting an Investigation
Key Phases:

5. Analysis
● Drawing conclusions from the collected evidence to support legal or internal
investigations.
● Example: Correlating the extracted data with timestamps to reconstruct a
timeline of the suspected theft activities.
6. Presentation
● Organizing and presenting findings in a clear and understandable manner for
stakeholders or court proceedings.
● Example: Creating a report and visual aids (e.g., timelines, charts) to explain
the findings clearly to non-technical stakeholders in court.
Conducting an Investigation
Best Practices:

● Chain of Custody: Always document who handles the evidence and when to
ensure it's admissible in court.
● Forensically Sound Methods: Use write blockers and trusted forensic tools
to prevent altering the original evidence.
● Comprehensive Documentation: Record every action taken during the
investigation to ensure the process is transparent and reproducible.
● Stay Up-to-Date: Continuously update knowledge and tools to keep up with
evolving technologies and techniques.
● Legal Compliance: Ensure that all steps taken in the investigation comply
with legal and regulatory standards.
Preparing for Search and Seizure
1. Legal Authorization
● Obtain search warrant or court order
● Ensure compliance with jurisdictional laws and regulations
● Example: Acquiring a warrant to search an employee's work laptop suspected
of containing stolen data.
2. Pre-Search Planning
● Conduct reconnaissance of the location
● Identify potential evidence and target areas
● Prepare necessary tools and equipment (e.g., forensic kits, data storage
devices)
● Example: Planning the collection of both on-site and cloud-stored data for a
case involving intellectual property theft.
Preparing for Search and Seizure
3. Assemble the Team
● Select experienced investigators and technical experts
● Brief team on legal constraints and objectives
● Assign specific roles and responsibilities
● Example: Ensuring that a forensic expert, legal advisor, and IT security
professional are present during the operation.
4. Securing the Scene
● Ensure the safety of personnel and evidence
● Isolate the area to prevent tampering
● Document the condition of the scene upon entry
● Example: Locking down a section of the office where digital devices are
located and restricting access only to authorized forensic personnel.
Preparing for Search and Seizure
5. Evidence Collection
● Follow chain of custody protocols
● Collect and preserve physical and digital evidence systematically
● Use proper forensic techniques to prevent contamination
● Example: Using a write blocker to connect a suspect's hard drive to prevent
any accidental data modification.
6. Post-Seizure Documentation
● Catalog all seized items
● Create detailed reports on the search process
● Prepare for potential legal challenges regarding the seizure
● Example: Logging the exact time, location, and person responsible for
handling each seized device.
Preparing for Search and Seizure
Best Practices:

● Minimize Disruption: Plan the operation to minimize disruption to ongoing


business operations or personal lives.
● Preserve Evidence: Ensure that all devices are handled with care to avoid
damaging potential evidence.
● Document Everything: Maintain detailed records of all activities, decisions,
and findings throughout the search and seizure process.
● On-Site Analysis: Where possible, perform an initial on-site analysis to
identify and isolate relevant devices quickly.
Securing the Crime Scene
1. Isolating the Affected Systems
● Disconnect compromised systems from the network
● Block unauthorized access to prevent further damage
● Preserve volatile data by ensuring devices remain powered on if necessary

2. Identifying Points of Compromise


● Determine the entry points (e.g., breached accounts, infected devices)
● Secure vulnerable systems to prevent continued exploitation
Securing the Crime Scene
3. Containment
● Segregate compromised systems from critical infrastructure
● Implement firewall rules, access control lists, and other containment
strategies

4. Evidence Preservation
● Capture live system data (e.g., RAM, active network connections)
● Create forensic images of storage devices
● Log and record all actions taken during the scene securing
Securing the Crime Scene
5. Documentation
● Document the state of all affected systems upon arrival
● Record actions taken, including network isolation, shutdowns, and forensic
captures
● Ensure detailed notes for legal and investigative purposes

6. Communication
● Coordinate with the response team to ensure all actions are properly aligned
● Brief stakeholders on containment progress and next steps
Seizing Digital Evidence at SOC
What is SOC?

● A Security Operations Center (SOC) is a centralized unit that monitors,


detects, and responds to cybersecurity incidents within an organization.
Seizing Digital Evidence at SOC
Key Functions:

● Monitoring: Continuous real-time monitoring of networks and systems.


● Incident Detection: Identifying potential security threats or breaches.
● Response: Taking appropriate actions to mitigate or neutralize incidents.
● Reporting: Documenting and analyzing incidents for future prevention.
Seizing Digital Evidence at SOC
Key Roles:

● SOC Analysts: Front-line defenders, monitoring and analyzing security


threats.
● Incident Responders: Specialists in handling and resolving security
breaches.
● Threat Hunters: Experts proactively searching for vulnerabilities and threats.
Seizing Digital Evidence at SOC
Examples:

● Phishing Attack: A SOC detects a phishing email targeting employees,


blocks the email, and initiates user awareness training to prevent future
incidents.
● Ransomware Attack: When ransomware is detected on a workstation, the
SOC isolates the system, removes the malware, and restores affected data
from backups.
● Unusual Login Activity: The SOC spots a login attempt from an unusual
location, flags the account, and enforces a password reset to prevent
unauthorized access.
Seizing Digital Evidence at SOC
Why It’s Important:

● SOC ensures that an organization’s cybersecurity defenses are active and


ready to respond to any potential threats, helping safeguard sensitive data
and maintain business continuity.
Seizing Digital Evidence at SOC
Procedures for Collecting Evidence in a SOC:

● Network Traffic Capture: Implement continuous packet capture (PCAP) to record network traffic for forensic analysis. Use tools like Wireshark or Zeek to dissect and preserve packet data (see the sketch after this list).
● Log Aggregation & Correlation: Centralize logs from various sources
(firewalls, IDS/IPS, SIEM systems) using platforms like ELK Stack or Splunk,
ensuring they are timestamped and correlated for incident reconstruction.
● Memory Forensics: Extract volatile data from live systems (e.g., RAM) using
tools like Volatility or Rekall to gather in-memory evidence that could be lost
upon shutdown.
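As a minimal illustration of the packet-capture step above, the following sketch uses the third-party scapy library (an assumption on our part; in practice a SOC would run Wireshark/tshark or Zeek continuously) to record a small number of packets and preserve them in a PCAP file:

```python
# Hedged sketch: capture a handful of packets and persist them as a PCAP.
# Assumes the third-party `scapy` package is installed and the script runs
# with sufficient privileges to sniff the default network interface.
from scapy.all import sniff, wrpcap

packets = sniff(count=100)                 # capture 100 packets
wrpcap("incident_capture.pcap", packets)   # preserve them for later analysis
```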
Seizing Digital Evidence at SOC
Procedures for Collecting Evidence in a SOC:

● File System Imaging: Create bit-for-bit forensic images of storage devices


using tools like FTK Imager or dd, preserving the entire file system for
in-depth analysis and recovery of deleted files.
● Automated Evidence Triage: Deploy automated tools that classify and
prioritize evidence based on incident severity and potential data loss, allowing
quick identification of key data.
● Data Encryption Handling: Ensure that encrypted data is captured in its
encrypted state and consider live decryption techniques when permissible
and feasible.
Seizing Digital Evidence at SOC
Specific Challenges in SOC Environments

● Encrypted Traffic: Increasing prevalence of TLS/SSL encryption in network


traffic complicates packet analysis. Solutions include deploying SSL/TLS
interception proxies or decrypting traffic in controlled environments.
● Cloud and Hybrid Environments: Evidence collection from cloud services
(AWS, Azure) requires specialized approaches, such as API-based access to
logs and data, ensuring adherence to service provider policies.
Seizing Digital Evidence at SOC
Specific Challenges in SOC Environments

● Containerized and Microservices Architectures: Seizing evidence from


containerized environments (e.g., Docker, Kubernetes) involves snapshotting
containers, capturing logs from ephemeral services, and dealing with
distributed storage.
● Advanced Persistent Threats (APT): Handling APTs requires long-term
monitoring, complex evidence correlation across multiple vectors, and
stealthy evidence collection to avoid tipping off attackers.
Seizing Digital Evidence at SOC
Best Practices:

● Immutable Log Storage: Store logs in immutable storage (e.g., WORM


(write once, read many) drives, blockchain-based systems) to ensure
evidence cannot be altered post-collection.
● Data Anonymization: When dealing with sensitive data, implement
anonymization techniques to comply with privacy regulations while
maintaining the integrity of the evidence.
Seizing Digital Evidence at SOC
Best Practices:

● Artifact Collection Automation: Automate the collection of system artifacts, such as registry keys, scheduled tasks, and configuration files, using scripts or tools like osquery (see the sketch after this list).
● Cross-Platform Compatibility: Ensure forensic tools and methods are
compatible across various platforms (Windows, Linux, macOS, IoT devices)
for comprehensive evidence collection.
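As a hedged illustration of the automation point above, the snippet below shells out to osquery's interactive client (osqueryi, assumed to be installed and on PATH) and parses its JSON output. The startup_items table queried here is one of osquery's standard tables, but table availability varies by platform:

```python
# Hedged sketch: collect one class of system artifact via osquery.
# Assumes `osqueryi` is installed; adapt the SQL to other osquery tables
# (e.g., processes, or scheduled_tasks on Windows) as needed.
import json
import subprocess

result = subprocess.run(
    ["osqueryi", "--json", "SELECT name, path, source FROM startup_items;"],
    capture_output=True, text=True, check=True,
)
for artifact in json.loads(result.stdout):
    print(artifact)
```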
Daubert Standards
What is the Daubert Standard?

● A legal framework used in U.S. federal courts.


● Determines the admissibility of expert witness testimony.
● Established by the U.S. Supreme Court in Daubert v. Merrell Dow
Pharmaceuticals, Inc. (1993).
Daubert Standards
Purpose of the Daubert Standard

● Ensures expert testimony is:


○ Reliable: Scientifically sound.
○ Relevant: Pertinent to the case.
○ Applicable: Properly linked to the facts of the case.

● Empowers judges as gatekeepers to assess scientific validity.


Daubert Standards
The Daubert Criteria

● Testability
○ Can the theory or technique be tested?
● Peer Review
○ Has the theory or technique been peer-reviewed and published?
● Error Rate
○ What is the known or potential error rate?
● Standards
○ Are there standards controlling the operation of the technique?
● General Acceptance
○ Is the theory or technique generally accepted in the relevant scientific community?
Daubert Standards
The Role of Judges

● Judges act as gatekeepers.


● They assess the scientific validity of the expert's methods.
● They ensure that unreliable or irrelevant testimony is excluded.
Daubert Standards
Impact of the Daubert Standard

● Raised the bar for the admissibility of scientific evidence.


● Encourages rigorous scrutiny of expert testimony.
● Influences both criminal and civil cases in federal courts.
Daubert Standards
Example 1: Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993)

● Case Summary: Parents sued the pharmaceutical company, claiming that a


drug caused birth defects.

● Expert Testimony: Complainant presented scientific experts who used


unpublished studies to support their claims.

● Outcome: The Supreme Court established that courts must evaluate the
scientific validity of the evidence, leading to the creation of the Daubert
Standard.
Daubert Standards
Example 2: General Electric Co. v. Joiner (1997)

● Case Summary: An electrician claimed exposure to PCB chemicals caused


his lung cancer.

● Expert Testimony: The complainant’s experts relied on studies linking PCBs


to cancer in laboratory animals.

● Outcome: The court found that the expert testimony was too uncertain and
not directly applicable to the facts of the case, affirming the trial court’s
exclusion of the evidence under the Daubert Standard.
Daubert Standards
Example 3: Kumho Tire Co. v. Carmichael (1999)

● Case Summary: A tire blowout led to a fatal accident, and the complainant
claimed the tire was defective.

● Expert Testimony: Complainant presented an expert in tire failure analysis.

● Outcome: The Supreme Court ruled that the Daubert Standard applies to all
expert testimony, not just scientific testimony, broadening its application.
Daubert Standards
Why the Daubert Standard is Useful

● Ensures Reliable Evidence


○ Promotes the use of scientifically valid methodologies.
○ Filters out unreliable, speculative, or untested theories.

● Protects the Integrity of the Legal Process


○ Prevents misleading or irrelevant expert testimony from influencing the jury.
○ Helps ensure that court decisions are based on sound evidence.

● Enhances Judicial Gatekeeping


○ Empowers judges to critically assess the quality of expert evidence.
○ Ensures that expert opinions presented in court meet rigorous scientific standards.
Daubert Standards
Why the Daubert Standard is Useful

● Promotes Fairness in Legal Proceedings


○ Establishes a clear, consistent standard for admissibility across cases.
○ Helps level the playing field by ensuring that all parties adhere to high standards of evidence.

● Encourages Peer-Reviewed and Established Methods


○ Incentivizes experts to rely on established, peer-reviewed research.
○ Reduces the risk of "junk science" influencing legal outcomes.
ISO/IEC 27037: 2012
Comprehensive Scope

● ISO/IEC 27037:2012 provides guidelines for the initial stages of digital


evidence handling, primarily focusing on the following phases:
○ Identification
○ Collection
○ Acquisition
○ Preservation

● This standard is designed to support digital forensic activities within various


environments, such as corporate investigations, cybercrime inquiries, and
government investigations. It ensures that digital evidence is collected in a
manner that is both legally admissible and forensically sound.
ISO/IEC 27037: 2012
Phase 1: Identification of Digital Evidence

This phase involves identifying all potential sources of digital evidence, which can
range from computers, mobile devices, servers, cloud storage, and IoT devices to
network logs and databases. ISO/IEC 27037 emphasizes that:

● The identification should be performed systematically and comprehensively.


● All sources, including metadata and residual data (e.g., deleted files), should
be considered.
● Potential risks of evidence destruction or alteration should be anticipated.
ISO/IEC 27037: 2012
Phase 2: Collection of Digital Evidence

During the collection phase, it is crucial to ensure that the data is gathered in a
manner that preserves its integrity. ISO/IEC 27037:2012 outlines specific methods
for ensuring:
● Forensic soundness: This refers to collecting evidence without altering the
original data, ensuring that it remains in its original state.
● Chain of custody: A documented process that records each person or entity
that handles the evidence, ensuring its traceability.
● Minimization of Contamination: Steps should be taken to reduce the risk of
contamination, including isolating the data from the live environment, avoiding
unnecessary data access, and employing write blockers or other tools.
ISO/IEC 27037: 2012
Phase 3: Acquisition of Digital Evidence

This phase involves the actual extraction of data from digital sources. ISO/IEC
27037:2012 provides guidelines for:

● Imaging: Making exact copies of digital storage media, such as hard drives,
in a way that preserves the original data. Bit-level copies are preferred to
ensure all data, including hidden or deleted files, is captured.
ISO/IEC 27037: 2012
Phase 3: Acquisition of Digital Evidence

● Data Integrity Verification: Ensuring the integrity of acquired evidence by


using cryptographic hash functions (e.g., SHA-256, MD5) to generate unique
identifiers (hash values) for the data. These hashes can later be used to verify
that the data has not been altered.

● Handling Different Devices: ISO/IEC 27037 provides guidance on acquiring


evidence from a wide range of devices, from traditional computers and mobile
phones to more complex systems like cloud services and IoT devices.
Specific considerations for each type of device are provided, including
technical and legal challenges.
ISO/IEC 27037: 2012
Phase 4: Preservation of Digital Evidence

After acquisition, evidence must be preserved for future analysis and potential
legal proceedings. This involves:

● Storage and Archiving: Evidence should be stored in a secure environment,


with controls to prevent unauthorized access, tampering, or degradation of the
data.
ISO/IEC 27037: 2012
Phase 4: Preservation of Digital Evidence

● Documentation: Detailed records should be kept, documenting every action


taken with the evidence, from collection through to analysis. This includes
logging the tools and methods used, the personnel involved, and any
changes in custody.
● Backup: Creating backups of digital evidence ensures that original data
remains untouched while allowing for analysis on duplicate copies.
ISO/IEC 27037: 2012
Legal and Regulatory Considerations

ISO/IEC 27037:2012 emphasizes that digital evidence handling must comply with
applicable laws and regulations. This includes:

● Jurisdictional Issues: Evidence may cross international borders, and legal


obligations can vary depending on the location of the evidence. Organizations
need to understand the legal requirements in each relevant jurisdiction.
● Admissibility: To ensure that digital evidence is admissible in court, it must
be collected, acquired, and preserved following strict legal standards. This
includes maintaining the chain of custody and adhering to established
forensic principles.
ISO/IEC 27037: 2012
Roles and Responsibilities

● ISO/IEC 27037 defines roles such as first responders, digital evidence


specialists, and forensic analysts, emphasizing the importance of
collaboration.

● First responders must ensure that they handle potential evidence properly to
avoid accidental contamination, while specialists and analysts are responsible
for the more technical aspects of evidence processing.
ISO/IEC 27037: 2012
Technical Considerations

The standard provides technical guidelines for handling various types of digital
evidence, including:

● File Systems and Data Structures: Understanding different file systems


(e.g., NTFS, FAT, EXT) is crucial for effective evidence collection and
acquisition.
ISO/IEC 27037: 2012
Technical Considerations

● Network Evidence: Gathering network logs, packet captures, and other data
from network devices can be vital. The standard discusses the importance of
preserving time synchronization and log integrity.

● Volatile Data: In some cases, data stored in volatile memory (e.g., RAM) may
be critical, and specific guidelines are provided for capturing and preserving
such data before it is lost.
ISO/IEC 27037: 2012
Risk Management

● ISO/IEC 27037 encourages a risk-based approach to digital evidence


handling.
● Risks include the potential for data corruption, unauthorized access, or
unintentional modification.
● The standard recommends implementing strong controls and continuously
assessing risks throughout the forensic process.
ISO/IEC 27037: 2012
Tools and Techniques

● The standard advises on the selection and validation of forensic tools.


● Only tools that have been thoroughly tested and validated should be used for
digital evidence handling.
● It also emphasizes the importance of documentation, ensuring that all tools
and methods used during the investigation are well-documented.
ISO/IEC 27037: 2012
Chain of Custody and Documentation

● The chain of custody is critical to maintaining the integrity of digital evidence.


● Every person who handles the evidence must be documented, and the exact
location of the evidence must be recorded at all times.
● The standard stresses that even small gaps in the documentation could
render evidence inadmissible in court.
ISO/IEC 27037: 2012
Relationship with Other Standards

ISO/IEC 27000 Family

● ISO/IEC 27000: A series of standards focusing on information security


management systems (ISMS).
● ISO/IEC 27001: Requirements for establishing, implementing, maintaining,
and improving an ISMS.
● ISO/IEC 27002: Provides best practices for security controls.
ISO/IEC 27037: 2012
Relationship with Other Standards

Digital Forensics Standards

● ISO/IEC 27041: Guidelines on assuring the suitability and adequacy of


incident investigative methods.
● ISO/IEC 27042: Guidelines on analysis and interpretation of digital evidence.
● ISO/IEC 27043: Incident investigation principles and processes.
ISO/IEC 27037: 2012
Relationship with Other Standards

Complementary Standards

● ISO/IEC 17025: General requirements for the competence of testing and


calibration laboratories, applicable to forensic labs.
● NIST SP 800-86: Guide to integrating forensic techniques into incident
response.
ISO/IEC 27037: 2012
Relationship with Other Standards

Key Integration

● ISO/IEC 27037 sets the foundation for digital evidence handling, and
subsequent standards (27041, 27042, 27043) build on its guidelines for
deeper forensic analysis and investigation processes.

● Adopting a holistic approach ensures a thorough and legally defensible


handling of digital evidence from identification to legal presentation.
Data Collection Methods
Collection of Volatile Data (e.g., RAM)

● Volatile data refers to information stored in temporary memory, such as RAM,


which is lost when a device is powered down.
● This type of data is crucial in forensic investigations because it contains
active, real-time data that can reveal important clues about system activities,
running processes, and potential malicious actions.
● However, volatile data collection is time-sensitive and technically challenging.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Memory Dumping Tools:

● Volatility Framework: A powerful memory forensics tool for analyzing the contents of RAM captured from Windows, Linux, and macOS systems. Volatility supports various plugins that allow investigators to extract artifacts such as active processes, network connections, loaded drivers, and even decrypted content from encrypted applications.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Memory Dumping Tools:

● FTK Imager: Primarily used for disk imaging, FTK Imager can also capture
RAM. It provides a simple interface for acquiring memory dumps without
affecting the underlying system.

● DumpIt: A portable tool that can quickly create a full memory dump from a
system. It's designed for live response scenarios where time is critical.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Process and Thread Information:

● Active Processes: Investigators should capture a snapshot of all running


processes, including their Process IDs (PIDs), parent-child relationships, and
memory consumption. This helps in identifying unusual or malicious
processes that may not be visible through normal system monitoring tools.
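As a minimal live-response sketch of this step (assuming the third-party psutil library; full memory forensics would instead analyze a RAM image with Volatility), the following captures a process snapshot with PIDs, parent PIDs, and memory consumption:

```python
# Hedged sketch: snapshot running processes with parent-child and memory info.
# Assumes the third-party `psutil` package; intended for live triage.
import psutil

for proc in psutil.process_iter(["pid", "ppid", "name", "memory_info"]):
    info = proc.info
    if info["memory_info"] is None or info["ppid"] is None:
        continue  # skip processes we cannot inspect (access denied)
    rss_mb = info["memory_info"].rss / (1024 * 1024)
    print(f"PID {info['pid']:>6}  PPID {info['ppid']:>6}  "
          f"{rss_mb:8.1f} MB  {info['name']}")
```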
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Process and Thread Information:

● Threads and Handles: Threads within processes can be critical for


understanding how code is being executed. Investigators should also capture
handles, which represent open files, registry keys, and other system
resources being used by processes.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Network Connections:

● Open Ports and Active Connections: Volatile data collection must include capturing information about open network ports and current network connections. Tools like netstat or memory analysis can reveal ongoing communications with external servers, which may indicate data exfiltration or command-and-control activity (see the sketch below).
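A small sketch of this step, again assuming psutil (roughly a filtered netstat -an snapshot taken during live response):

```python
# Hedged sketch: record established network connections during live response.
# Assumes `psutil`; may require elevated privileges on some systems.
import psutil

for conn in psutil.net_connections(kind="inet"):
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
        print(f"PID {conn.pid}  {conn.laddr.ip}:{conn.laddr.port} -> "
              f"{conn.raddr.ip}:{conn.raddr.port}")
```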
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Network Connections:

● Network Traffic in Memory: Some tools can reconstruct network packets


directly from memory. This is particularly useful in cases where an attacker
uses encrypted communication channels that bypass traditional network
traffic logging.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Encryption Keys in Memory:

● Decrypting Encrypted Volumes: Investigators can often retrieve encryption


keys from memory, which allows for the decryption of encrypted volumes
(e.g., BitLocker, VeraCrypt). For example, the decryption keys for full-disk
encryption solutions may remain in memory as long as the system is running.

● SSL/TLS Keys: SSL/TLS encryption keys can be extracted from memory to


decrypt encrypted network traffic, which is crucial in cases involving secure
communications.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Artifacts in Memory:

● Malware Analysis: RAM often contains malware that operates solely in


memory (fileless malware). Investigators must analyze memory dumps for
indicators of such malware, including abnormal code execution patterns,
injected code, and hidden processes.
● Volatile Artifacts: Key artifacts such as user credentials, session tokens, and
encryption keys may reside in RAM and should be carefully extracted and
analyzed. These artifacts can provide crucial insights into unauthorized
access and data manipulation.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Preservation of Volatile Data Integrity:

● Minimizing System Impact: Tools used for volatile data collection must
operate in a manner that minimizes their footprint on the system. The goal is
to avoid altering memory content or introducing new processes that could
overwrite critical evidence.
● Hashing and Verification: Just like with non-volatile data, the integrity of
volatile data must be ensured. Once the memory dump is created, it should
be hashed using algorithms like MD5 or SHA-256 to ensure that it remains
unaltered during analysis.
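A minimal sketch of this verification step using only the standard library: hash the memory dump in chunks immediately after acquisition, record the digest in the chain-of-custody log, and re-hash before analysis to confirm the dump is unaltered (the file name here is hypothetical):

```python
# Hedged sketch: chunked SHA-256 of a memory dump to establish integrity.
# Standard library only; "memory.dmp" is a hypothetical acquisition output.
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1024 * 1024) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)  # stream the file so large dumps fit in memory
    return digest.hexdigest()

acquisition_hash = sha256_of_file("memory.dmp")
print("Record in the chain-of-custody log:", acquisition_hash)
```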
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:

Legal and Ethical Considerations:

● Legal Authorization: Because volatile data often includes sensitive and


private information, investigators must ensure they have the proper legal
authorization before capturing memory dumps, especially in cases involving
live systems.

● Data Protection Compliance: Volatile data may contain personal data, and investigators must handle it in compliance with data protection regulations such as GDPR or CCPA.
Data Collection Methods
Case Study 1: Sony Pictures Hack (2014)

Background:
● In November 2014, Sony Pictures Entertainment was the target of a
devastating cyber attack. The attackers stole sensitive information and
deployed wiper malware that destroyed data across the network.
Data Collection Methods
Case Study 1: Sony Pictures Hack (2014)

Volatile Data Collection:


● Investigators collected RAM dumps from affected systems to understand the
nature of the malware and its operations in memory.
● Memory forensics revealed active processes, encryption keys, and network
communication channels used by the attackers.

Outcome:
● Volatile data analysis was instrumental in attributing the attack to North
Korean state-sponsored hackers and mitigating further damage.
Data Collection Methods
Case Study 2: Operation Aurora (2009-2010)

Background:
● Operation Aurora was a series of cyberattacks conducted by Chinese hackers
against major corporations, including Google, Adobe, and other tech firms, in
late 2009 and early 2010. The attackers exploited a zero-day vulnerability to
gain access to internal systems.
Data Collection Methods
Case Study 2: Operation Aurora (2009-2010)

Volatile Data Collection:


● Investigators collected RAM dumps from compromised systems to identify the
malware, which resided primarily in memory and exploited active processes.
● Memory analysis helped identify the C2 (command-and-control) infrastructure
used by the attackers and uncovered evidence of intellectual property theft.

Outcome:
● The volatile data collection provided critical insights into the attackers'
methods and led to the discovery of the widespread nature of the attacks
across multiple organizations.
Data Collection Methods
Collection of Persistent Data (e.g., Hard Drives)

● Persistent data refers to information stored on non-volatile storage devices,


such as hard drives, SSDs, and other forms of long-term storage.
● This data remains intact even after the system is powered down and typically
includes the operating system, application files, user data, and logs.
● Collecting persistent data involves creating exact duplicates of storage
devices for forensic analysis while preserving the original evidence.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Forensic Imaging:

● Bit-by-Bit Cloning: Forensic imaging requires creating a complete bit-by-bit copy of a storage device, including unallocated space, slack space, and deleted data. This is different from a simple file copy, as it ensures that all data, including remnants of deleted files and system metadata, is preserved.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Forensic Imaging:

● Write-Blockers: To prevent any accidental alteration of the original storage


media, investigators use hardware or software write-blockers. These devices
ensure that data can be read from the storage device but not written to,
maintaining the integrity of the evidence.
● Tools: Industry-standard tools for forensic imaging include EnCase, FTK
Imager, and X-Ways Forensics. These tools allow investigators to create
forensic images in formats such as E01 (EnCase Evidence File) or raw (dd)
format.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Hidden and Encrypted Data:

● Hidden Partitions and Filesystems: Advanced persistent data collection


must account for hidden partitions, which may contain malware, encrypted
files, or hidden file systems (e.g., NTFS alternate data streams). Tools like
Sleuth Kit or Autopsy can be used to discover and analyze such hidden areas.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Hidden and Encrypted Data:

● Encrypted Data: If the storage device is encrypted, forensic investigators


need to address the challenge of decryption. This can involve acquiring
encryption keys from the device's memory, using brute-force techniques, or
leveraging known vulnerabilities in encryption schemes.

● Steganography: Investigators should also consider the possibility of


steganography, where data is hidden within other files, such as images or
videos. Specialized tools are required to detect and extract this data.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

File and Metadata Analysis:

● Filesystem Forensics: A deep analysis of the filesystem is crucial to uncover


hidden, deleted, or fragmented files. Investigators analyze metadata,
including file creation and access timestamps, to reconstruct the sequence of
events on the system.
● Journaling Filesystems: Many modern filesystems (e.g., NTFS, ext4) use
journaling, which can provide additional forensic evidence. By analyzing
journal logs, investigators can identify changes to the filesystem, such as file
deletions, renames, or modifications, even if the data has been erased.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Unallocated Space and Slack Space:

● Unallocated Space: Even after files are deleted, their contents often remain in the unallocated space of the storage device until they are overwritten. Advanced forensic tools can recover these remnants, providing access to deleted files and fragments.
● Slack Space: Slack space is the unused space within allocated clusters on
the disk. For example, if a file is smaller than the cluster size, the remaining
space may still contain data from previously deleted files. Investigators should
analyze slack space to uncover hidden data.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Log and Event Analysis:

● System Logs: Persistent data includes various system logs that can provide
valuable insights into system activity, including login attempts, software
installation, and file access. These logs are often crucial in establishing a
timeline of events.

● Application Logs: In addition to system logs, application logs (e.g., web


server logs, database logs) must be analyzed to track user actions, identify
suspicious behavior, and reconstruct malicious activities.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Cloud Storage and Remote Devices:

● Cloud Forensics: In cases involving cloud storage, investigators must follow


specific procedures to collect data from cloud environments. This often
involves acquiring access credentials and capturing data from virtual
machines, databases, and remote storage systems. Cloud service providers
may have forensic tools and processes to assist investigators, but legal and
jurisdictional challenges can arise.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Cloud Storage and Remote Devices:

● Remote Acquisition: For remote devices (e.g., mobile devices, IoT devices),
specialized forensic tools like Cellebrite or Magnet AXIOM are used to extract
data while maintaining its integrity. This often involves rooting or jailbreaking
the device to gain full access to its storage.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Data Integrity and Documentation:

● Hashing: As with volatile data, ensuring data integrity is paramount.


Investigators must compute and document cryptographic hashes (MD5,
SHA-1, or SHA-256) of the original data and forensic images. These hashes
serve as digital fingerprints to verify that the evidence has not been altered
during the investigation.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Data Integrity and Documentation:

● Chain of Custody: Every step of the data collection process must be


thoroughly documented, including who handled the evidence, how it was
stored, and any actions taken. This documentation is essential for maintaining
the chain of custody and ensuring the admissibility of the evidence in court.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:

Preservation Techniques:

● Secure Storage: After imaging, the original storage device should be


securely stored in a controlled environment, with access restricted to
authorized personnel only. The forensic image is then used for analysis,
ensuring that the original data remains untouched.
● Data Retention Policies: Forensic investigators must comply with legal and
organizational policies regarding data retention. This may involve securely
archiving forensic images and related documentation for a specific period,
ensuring that evidence can be re-examined if necessary.
Data Collection Methods
Case Study 3: BTK Killer Dennis Rader (2005)

Background:
● Dennis Rader, known as the BTK Killer, was caught after sending a floppy
disk to the media, believing it was anonymous.
Data Collection Methods
Case Study 3: BTK Killer Dennis Rader (2005)

Persistent Data Collection:


● Forensic investigators created a forensic image of the floppy disk and
analyzed its metadata.
● The metadata revealed that the document had been created on a computer at
a local church, leading directly to Dennis’s arrest.

Outcome:
● Persistent data analysis played a crucial role in solving this high-profile case,
linking digital evidence to a notorious serial killer.
Data Collection Methods
Case Study 4: Enron Investigation (2001)

Background:
● The Enron scandal involved massive corporate fraud, leading to the
company's bankruptcy and the downfall of Arthur Andersen LLP, one of the
largest audit firms at the time.
Data Collection Methods
Case Study 4: Enron Investigation (2001)

Persistent Data Collection:


● Investigators collected and analyzed massive amounts of persistent data from
Enron's servers, hard drives, and email archives.
● The forensic analysis uncovered evidence of fraudulent accounting practices,
email conversations discussing illegal activities, and deleted files related to
the fraud.

Outcome:
● Persistent data collection was key to building the case against Enron
executives and securing multiple convictions for corporate fraud.
Data Collection Methods
Case Study 5: Target Data Breach (2013)

Background:
● Target Corporation suffered a massive data breach in 2013, where hackers
stole payment card information from over 40 million customers by
compromising Target’s point-of-sale (POS) systems.
Data Collection Methods
Case Study 5: Target Data Breach (2013)

Hybrid Data Collection:


● Volatile Data: RAM data was collected from POS systems to analyze the
malware that captured unencrypted card data in memory.
● Persistent Data: Persistent data from network logs and servers was
analyzed to track the attackers’ movements and understand how they
exfiltrated the stolen data.

Outcome:
● The hybrid approach helped investigators mitigate the breach and led to
industry-wide changes in retail cybersecurity practices.
Data Collection Methods
Case Study 6: Ukraine Power Grid Cyberattack (2015)

Background:
● In December 2015, a cyberattack targeted Ukraine's power grid, causing
power outages across the Ivano-Frankivsk region. The attack was attributed
to Russian state-sponsored hackers and involved sophisticated malware.
Data Collection Methods
Case Study 6: Ukraine Power Grid Cyber Attack (2015)

Hybrid Data Collection:


● Volatile Data: RAM data was collected from compromised control systems to
analyze the malware, which operated in memory to disable industrial control
systems.
● Persistent Data: Persistent data from control system logs and hard drives
helped reconstruct the attackers’ steps and identify how they gained access
to critical infrastructure systems.
Outcome:
● The hybrid data collection allowed investigators to understand the full scope
of the attack and helped bolster defenses for critical infrastructure worldwide.
Hashing Algorithms

● Definition: Hashing is the process of converting an input (or 'message') into a fixed-size string of bytes, typically a digest that appears random. The output, known as a hash value, is effectively unique: finding two different inputs with the same hash should be computationally infeasible.

● Importance in Digital Forensics: Hashing is used to verify the integrity of


data, ensuring that evidence has not been altered. This is critical in
maintaining the chain of custody.
Hashing Algorithms
Common Hashing Algorithms:

● MD5 (Message Digest Algorithm 5): Produces a 128-bit hash value. Widely
used but vulnerable to collisions.

● SHA-1 (Secure Hash Algorithm 1): Produces a 160-bit hash value. More
secure than MD5 but has been deprecated due to vulnerabilities.

● SHA-256 (Secure Hash Algorithm 256): Part of the SHA-2 family, produces
a 256-bit hash value, offering robust security.
Hashing Algorithms
MD5 (Message Digest Algorithm 5)

● Developed by: Ronald Rivest in 1991

● Digest Size: 128-bit (32-character hexadecimal number)

● Input Block Size: 512-bit blocks

● Purpose: Originally designed for cryptographic security, MD5 was widely


used to verify data integrity, especially for file verification and storing
passwords.
Hashing Algorithms
How it works:
● Padding the Input: MD5 processes the input message in blocks of 512 bits. The message is padded with a 1 bit followed by enough zero bits to make its length congruent to 448 modulo 512, and the length of the original message is then appended as a 64-bit number (see the sketch after this list).

● Message Processing: The padded message is divided into 512-bit chunks.


MD5 operates on these chunks using a compression function that involves
bitwise operations, modular additions, and shifts.

● Final Digest: After processing all chunks, MD5 produces a 128-bit message
digest (hash).
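The padding rule above can be expressed in a few lines; this is an illustrative sketch of the shared MD5/SHA-1/SHA-256 padding scheme (MD5 encodes the length little-endian, the SHA family big-endian), not a full MD5 implementation:

```python
# Illustrative sketch of the MD5 padding rule: append 0x80, then zero bytes
# until the length is 56 mod 64 (448 mod 512 bits), then the original bit
# length as a 64-bit little-endian integer.
import struct

def md5_pad(message: bytes) -> bytes:
    bit_len = len(message) * 8
    padded = message + b"\x80"                       # the single 1 bit
    padded += b"\x00" * ((56 - len(padded) % 64) % 64)
    padded += struct.pack("<Q", bit_len)             # MD5 uses little-endian
    return padded

assert len(md5_pad(b"abc")) % 64 == 0  # always a whole number of 512-bit blocks
```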
Hashing Algorithms
Applications:

● File Integrity Checks: MD5 was extensively used to ensure files were
transferred correctly (e.g., when downloading software, the MD5 hash of the
file was used to check if the download was correct).

● Password Hashing: Early systems stored passwords in MD5 hash form to


obscure them.

● Digital Signatures: MD5 was used in certificate generation for secure


communications.
Hashing Algorithms
Vulnerabilities

● Collision Attacks: MD5 is vulnerable to collision attacks, where two different


inputs produce the same hash value. In 2004, researchers demonstrated
practical collision attacks on MD5.

● Weakness in Security: Due to its vulnerability to collisions, MD5 is


considered insecure for cryptographic purposes. Modern applications no
longer rely on MD5 for cryptographic security but may still use it for
non-cryptographic checksums and file integrity verification.
Hashing Algorithms
Collision Examples:

● MD5: In 2004, Wang et al. published two different 128-byte binary blocks with the same MD5 digest; later attacks produced pairs of meaningful colliding files (e.g., two different documents or executables with identical MD5 hashes).

● SHA-1: In 2017, the SHAttered attack by Google and CWI Amsterdam produced two different PDF files sharing the SHA-1 digest b'38762cf7f55934b34d179ae6a4c80cadccbb7f0a.

● Note: Collisions cannot be demonstrated with arbitrary short strings; they require carefully crafted binary inputs, which is why no pair of simple text inputs with matching digests can be shown here.
Hashing Algorithms
SHA-1 (Secure Hash Algorithm 1)

● Developed by: National Security Agency (NSA); published in 1995 as a revision of the original 1993 SHA (SHA-0)

● Digest Size: 160-bit (40-character hexadecimal number)

● Input Block Size: 512-bit blocks

● Purpose: SHA-1 was widely used in security protocols like SSL, TLS, and
cryptographic applications, particularly in digital signatures.
Hashing Algorithms
How It Works:
● Padding the Input: Similar to MD5, SHA-1 processes the message in 512-bit
chunks, padding it to make the message length congruent to 448 modulo 512.
The padding includes the length of the original message as a 64-bit integer at
the end.

● Message Processing: SHA-1 uses a series of logical functions and bitwise


operations on the 512-bit blocks. It applies a compression function that works
through a series of 80 rounds, each modifying an internal state that is
eventually transformed into the final hash value.

● Final Digest: After processing, SHA-1 outputs a 160-bit message digest.


Hashing Algorithms
Applications:

● Digital Signatures: SHA-1 was heavily used in secure communication


protocols, including SSL/TLS, to generate digital signatures.

● Software Integrity: Many operating systems and software used SHA-1 to


check file integrity during updates.

● Version Control Systems: Git and other version control systems used
SHA-1 to create unique hashes for identifying different file versions.
Hashing Algorithms
Vulnerabilities:

● Collision Attacks: In 2005, cryptographers found weaknesses in SHA-1,


which made collision attacks feasible. By 2017, Google publicly demonstrated
a full SHA-1 collision, confirming the algorithm's insecurity for cryptographic
uses.

● Deprecation: Due to these vulnerabilities, SHA-1 has been deprecated for


most security purposes, and SHA-2 (which includes SHA-256) is preferred for
secure applications.
Hashing Algorithms
SHA-256 (Secure Hash Algorithm 256):

● Developed by: NSA as part of the SHA-2 family in 2001

● Digest Size: 256-bit (64-character hexadecimal number)

● Input Block Size: 512-bit blocks

● Purpose: SHA-256 was designed to provide enhanced security over SHA-1.


It is widely used in security protocols, digital signatures, blockchain
technologies, and more.
Hashing Algorithms
How It Works:

● Padding the Input: Like MD5 and SHA-1, the input message is padded to
make the message length congruent to 448 modulo 512. The padding
includes the length of the message as a 64-bit integer at the end.
● Message Processing: SHA-256 processes the input in 512-bit blocks. The algorithm goes through 64 rounds of processing, involving modular additions, bitwise operations, and logical functions. It uses eight working variables and a series of round constants derived from the first 32 bits of the fractional parts of the cube roots of the first 64 prime numbers (reproduced in the sketch after this list).
● Final Digest: After processing all message blocks, SHA-256 produces a
256-bit message digest, providing stronger security than SHA-1.
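The round constants mentioned above are easy to reproduce. This short standard-library sketch derives K[0..63] exactly as described: the first 32 bits of the fractional part of the cube root of each of the first 64 primes (K[0] should come out as 0x428a2f98, from the cube root of 2):

```python
# Reproduce the SHA-256 round constants from first principles.
def first_n_primes(n):
    primes, candidate = [], 2
    while len(primes) < n:
        if all(candidate % p for p in primes):  # trial division
            primes.append(candidate)
        candidate += 1
    return primes

# Fractional part of the cube root, scaled to 32 bits and truncated.
K = [int(((p ** (1 / 3)) % 1) * 2**32) for p in first_n_primes(64)]

print(hex(K[0]))  # expected: 0x428a2f98
```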
Hashing Algorithms
Applications:

● Cryptographic Security: SHA-256 is widely used in security protocols,


including SSL/TLS and VPNs, to ensure secure communications.

● Blockchain Technology: Cryptocurrencies like Bitcoin use SHA-256 as the


basis for creating proof-of-work hashes in blockchain networks.

● Digital Signatures and Certificates: SHA-256 is used to sign digital


certificates, providing strong verification of authenticity and data integrity.
Hashing Algorithms
Feature                  MD5                                      SHA-1                                   SHA-256

Hash Size                128-bit (32-character hex)               160-bit (40-character hex)              256-bit (64-character hex)

Block Size               512 bits                                 512 bits                                512 bits

Rounds                   64 rounds                                80 rounds                               64 rounds

Security                 Vulnerable to collisions                 Vulnerable to collisions                Secure against known collision attacks

Speed                    Fast                                     Moderate                                Slower

Applications             File integrity, checksums (non-crypto)   Digital signatures, SSL (deprecated)    SSL, blockchain, certificates

Known Vulnerabilities    Collisions, broken in 2004               Collisions, broken in 2017              None known (as of 2024)
Hashing Algorithms
Properties of Hashing Algorithms

● Deterministic: The same input will always produce the same hash output,
ensuring consistency in forensic analysis.

● Fast Computation: Hash functions are designed to be computationally


efficient, enabling quick processing of large datasets.

● Pre-image Resistance: It should be computationally infeasible to reverse the


hash value back to the original input, ensuring data security.
Hashing Algorithms
Properties of Hashing Algorithms

● Collision Resistance: A good hash function minimizes the likelihood that two
different inputs will produce the same hash value, which is crucial to avoid
evidentiary issues.

● Avalanche Effect: A small change in the input should drastically change the
hash value, preventing small alterations from going unnoticed.
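A quick standard-library demonstration of the avalanche effect: changing a single character of the input produces a completely different digest (the inputs are chosen arbitrarily for illustration):

```python
# Hedged sketch: a one-character change flips the digest entirely.
import hashlib

a = hashlib.sha256(b"Forensic Analysis").hexdigest()
b = hashlib.sha256(b"Forensic Analysjs").hexdigest()  # one letter changed
print(a)
print(b)
print("identical?", a == b)  # False: the digests share no obvious relationship
```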
Hashing Algorithms
Applications of Hashing in Digital Forensics

● Verifying Integrity: Hashes are used to ensure that digital evidence remains
unaltered from the point of acquisition to analysis.

● Detecting Data Tampering: Hashes help identify unauthorized changes to


data, crucial in both criminal and civil cases.

● Ensuring Authenticity: Hashes are used to confirm the authenticity of digital


evidence, especially in court settings.
Hashing Algorithms
MD5 Example:
● Input: "Forensic Analysis"
● MD5 Hash Output (illustrative): 8c69d909f1dc6a42e3ef4c6b5a2d46ef

SHA-1 Example:
● Input: "Forensic Analysis"
● SHA-1 Hash Output (illustrative): 3d5aa13adcaebed1f524efb6f3a2040e8f92fb65

SHA-256 Example:
● Input: "Forensic Analysis"
● SHA-256 Hash Output (illustrative):
16df40b722ca02cc44d1f6f8d18e2c7686d5b865cf29c14263215c5f865c8021

(The digests above are shown to illustrate the output formats; compute the authoritative values with the snippet below.)
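The exact digest of any input can be verified with the Python standard library; a minimal check of all three algorithms on the same message:

```python
# Compute MD5, SHA-1, and SHA-256 digests of the same input for comparison.
import hashlib

message = b"Forensic Analysis"
for algo in ("md5", "sha1", "sha256"):
    print(algo, hashlib.new(algo, message).hexdigest())
```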
Cloning of Digital Exhibits
Understanding Digital Exhibits

● Definition: Digital exhibits refer to any form of digital data that can be used as
evidence in a legal context. This includes files, emails, digital photos, log files,
and even entire hard drives.
Cloning of Digital Exhibits
Understanding Digital Exhibits

Types of Digital Exhibits:


● Files: Documents, images, videos, etc.
● Emails: Content, metadata, attachments.
● Logs: System logs, network logs, audit trails.
● Disk Images: Exact copies of a storage device, preserving all data for
analysis.

● Importance: Digital exhibits often contain crucial information that can


determine the outcome of a legal case, making their integrity and authenticity
paramount.
Cloning of Digital Exhibits
What is Cloning?

● Definition: Cloning is the process of creating an exact, bit-for-bit copy of a


digital exhibit. This ensures that the original evidence is preserved while the
clone can be analyzed.

Difference Between Cloning and Copying:

● Cloning: Captures all data, including deleted files, file slack, and unallocated
space.
● Copying: Typically only copies active files and may miss hidden or deleted
data.
Cloning of Digital Exhibits
Methods of Cloning:

● Hardware-based Cloning: Uses dedicated hardware devices to clone disks,


ensuring no data is altered during the process.

● Software-based Cloning:

○ dd command: A Unix-based command-line utility that can create exact copies of files or entire
disks.
○ FTK Imager: A widely-used forensic imaging tool that can create forensic disk images and
verify them with hash values.
Cloning of Digital Exhibits
Best Practices for Cloning Digital Exhibits

● Use Write Blockers: Ensure that the original media cannot be altered during
the cloning process.

● Verify the Integrity: Generate hash values for the original and cloned
exhibits both before and after cloning to confirm no changes have occurred.

● Document Every Step: Maintain detailed records of the cloning process,


including the tools used, the operators involved, and the exact steps taken.
This documentation is crucial for legal admissibility.
Cloning of Digital Exhibits
Hashing and Cloning Workflow

● Acquire Original Digital Exhibit: Ensure the exhibit is collected in a manner


that maintains its integrity.
● Generate Hash Value of the Original: Create a hash of the original exhibit to
use as a baseline for integrity checks.
● Perform Cloning of the Exhibit: Use either hardware or software-based
methods to create an exact copy.
● Generate Hash Value of the Clone: Compare this with the original hash to
ensure the clone is identical.
● Compare Hashes to Ensure Integrity: If the hashes match, the cloning
process is verified. If not, investigate potential issues.
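A compact sketch of this workflow using only the Python standard library (the paths are hypothetical; in a real acquisition the source would sit behind a hardware write blocker and a validated imaging tool such as FTK Imager or dd would be used): read the source bit-by-bit in chunks, write the image, hash both sides, and compare.

```python
# Hedged sketch of the hash -> clone -> verify workflow (standard library only).
# "source.dd" / "evidence.img" are hypothetical paths for illustration.
import hashlib

CHUNK = 1024 * 1024  # 1 MiB chunks so large media fit in memory

def clone_and_hash(src_path: str, dst_path: str) -> tuple[str, str]:
    src_hash, dst_hash = hashlib.sha256(), hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        for chunk in iter(lambda: src.read(CHUNK), b""):
            src_hash.update(chunk)   # baseline hash of the original
            dst.write(chunk)         # bit-for-bit copy
    with open(dst_path, "rb") as dst:  # re-read the clone independently
        for chunk in iter(lambda: dst.read(CHUNK), b""):
            dst_hash.update(chunk)
    return src_hash.hexdigest(), dst_hash.hexdigest()

original, clone = clone_and_hash("source.dd", "evidence.img")
print("match:", original == clone)  # identical hashes verify the clone
```

If the two digests match, the clone is a verified bit-for-bit duplicate and analysis can proceed on the image while the original stays in secure storage.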
Cloning of Digital Exhibits
Tools for Hashing and Cloning

Hashing Tools:

● HashCalc: A simple tool that can calculate hash values using various
algorithms.

● md5sum: A command-line utility in Unix/Linux that calculates and verifies


MD5 hashes.

● sha256sum: A command-line utility similar to md5sum, but for SHA-256.


Cloning of Digital Exhibits
Tools for Hashing and Cloning

Cloning Tools:

● FTK Imager: Allows for the creation of forensic images, as well as the
verification of image integrity.

● EnCase: A comprehensive forensic suite that includes powerful imaging


capabilities.

● dd: A versatile Unix/Linux command that can create exact copies of disks and
partitions.
Cloning of Digital Exhibits
Challenges

● Hash Collisions: Although rare, hash collisions can occur, where two
different inputs produce the same hash value. This can compromise the
integrity of forensic evidence.

● Data Volatility: Live systems, such as running servers or databases, can


change rapidly. Capturing an exact state without alteration is challenging.

● Legal Considerations: Different jurisdictions may have varying requirements


for the admissibility of digital evidence. Ensuring that cloning and hashing
processes meet legal standards is crucial.
Cloning of Digital Exhibits
Case Study

● Case: A financial fraud investigation where digital evidence was crucial.


Discuss how hashing ensured the integrity of transaction records and how
cloning allowed for detailed analysis without altering the original data.

● Process: Detail the steps taken to hash, clone, and analyze the evidence,
emphasizing the importance of maintaining data integrity.

● Outcome: Highlight how the use of these forensic techniques contributed to a


successful legal outcome.
Cloning of Digital Exhibits
Key Takeaways:

● Hashing: Essential for verifying the integrity of digital evidence. Ensures that
data remains unchanged from acquisition to analysis.

● Cloning: Critical for preserving the original evidence while allowing for
in-depth analysis. Ensures that the original data remains intact and unaltered.

● Best Practices: Following established procedures and using reliable tools


ensures that digital evidence is admissible in court.
Digital Imaging Formats
Definition of Forensic Formats:
● Forensic formats are specific data structures designed to store digital
evidence acquired during an investigation.
● Essential for ensuring the integrity and authenticity of digital evidence.

Categories of Forensic Formats:


● Raw Formats: Unprocessed data dumps from digital storage media.
● Proprietary Formats: Formats developed and maintained by specific
vendors.
● Advanced Forensic Formats: Evolved formats that offer flexibility and
extended capabilities.
Digital Imaging Formats
Raw Formats

Overview:
● Raw formats represent a direct bit-by-bit copy of the original storage medium,
capturing all data as it is.

Examples:
● Common Extensions: .dd, .img, .raw
Digital Imaging Formats
Raw Formats

Advantages:

● Simplicity: No additional data processing or compression.


● Tool Support: Supported by nearly all forensic tools due to its simplicity.
● Exact Replica: Ensures an exact digital duplicate of the source media.
Digital Imaging Formats
Raw Formats

Disadvantages:

● File Size: Typically large due to the lack of compression; the image is the same size as the original media.
● Metadata Storage: Raw formats do not inherently store metadata such as
hash values, timestamps, or acquisition details, which must be managed
separately.
● No Error Detection/Correction: Cannot inherently detect errors in data
acquisition.
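
Since raw images carry no internal metadata, examiners usually keep a sidecar record alongside the image file. A minimal sketch, with every field value a placeholder rather than a prescribed schema:

```python
import json
from datetime import datetime, timezone

# Hypothetical acquisition details for a raw image named evidence.dd;
# in practice these values come from the acquisition log.
sidecar = {
    "image_file": "evidence.dd",
    "sha256": "<hash computed at acquisition>",
    "acquired_utc": datetime.now(timezone.utc).isoformat(),
    "examiner": "J. Doe",            # placeholder name
    "tool": "dd",
    "notes": "Raw format stores no metadata internally; keep this file with the image.",
}

with open("evidence.dd.json", "w") as f:
    json.dump(sidecar, f, indent=2)
```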
Digital Imaging Formats
Proprietary Formats

Overview:
● Proprietary formats are developed by specific software vendors and are
designed to offer enhanced features like compression, encryption, and error
detection.

Examples:
● EnCase Format (.E01): Widely used by EnCase forensic software.
● FTK Image Format (.AD1): Used by AccessData’s FTK.
Digital Imaging Formats
Proprietary Formats

Advantages:

● Compression: Reduces storage space, making it more efficient for large-scale investigations.
● Encryption: Ensures data confidentiality.
● Metadata Storage: Captures and stores important acquisition details, such
as hash values, time stamps, and investigator information, directly within the
format.
● Error Detection: Can include error-checking mechanisms to ensure data
integrity.
Digital Imaging Formats
Proprietary Formats

Disadvantages:

● Vendor Dependency: Requires specific tools or software from the vendor to access and analyze data.
● Compatibility Issues: May not be universally supported across different
forensic tools, limiting flexibility.
● Cost: Often requires purchase of expensive software licenses.
Digital Imaging Formats
Advanced Forensic Formats

Overview:
● Advanced Forensic Formats (AFF): Developed as an open or semi-open
standard to address limitations of raw and proprietary formats.

Examples:
● Advanced Forensic Format (.aff): Supports compression, encryption, and
extensive metadata storage.
● AFF4: An enhanced version of AFF with better support for large-scale
investigations and faster processing.
Digital Imaging Formats
Advanced Forensic Formats

Advantages:

● Flexibility: Supports complex metadata, such as multiple hash values, case notes, and evidence handling logs.
● Open Standard: Reduces vendor lock-in and encourages compatibility across different forensic tools.
● Scalability: Well-suited for large-scale investigations requiring extensive data
management.
● Modularity: Allows for easy extension of the format to include new features or
accommodate new types of evidence.
Digital Imaging Formats
Explanation of Criteria:

● Data Integrity: Ability to maintain an exact copy of the original data.


● Metadata Support: Capability to store additional information, such as
acquisition details.
● Tool Compatibility: Number of forensic tools that can read or write the
format.
● File Size: How efficiently the format handles storage space.
● Encryption Support: Ability to encrypt data within the format.
● Error Detection: Mechanisms to ensure data was captured without
corruption.
Digital Imaging Formats

Criteria           | Raw Formats | Proprietary Formats | Advanced Formats
Data Integrity     | High        | High                | High
Metadata Support   | Low         | High                | Very High
Tool Compatibility | Very High   | Medium              | Medium-High
File Size          | Large       | Compressed          | Flexible
Encryption Support | No          | Yes                 | Yes
Error Detection    | No          | Yes                 | Yes


Digital Imaging Formats
When to Use Raw Formats:

● Small-scale Investigations: Ideal for simple cases with straightforward evidence.
● Tool Compatibility Needs: When a wide range of tool compatibility is required.
● No Metadata Requirements: Suitable when additional metadata storage is
not necessary.
Digital Imaging Formats
When to Use Proprietary Formats:

● Specific Software Workflows: When using tools like EnCase or FTK that
require their proprietary formats.
● Need for Compression: Efficient storage of large volumes of evidence.
● Enhanced Security: Cases where encryption and error detection are critical.
Digital Imaging Formats
When to Use Advanced Formats:

● Large-Scale Investigations: Suitable for complex cases involving multiple sources of digital evidence.
● Need for Extensive Metadata: When tracking detailed forensic metadata is
essential.
● Future-Proofing: Open standards ensure long-term accessibility and reduce
reliance on specific vendors.
Digital Imaging Formats
Summary:

● Raw Formats: Simple and widely compatible but lack advanced features.
● Proprietary Formats: Offer enhanced capabilities but come with vendor
lock-in and compatibility issues.
● Advanced Formats: Provide flexibility and scalability, ideal for complex
investigations but may require specialized tools and knowledge.
Imaging vs cloning/copying digital evidence
Imaging Digital Evidence:

Definition:
● Imaging refers to creating a bit-by-bit copy of a digital storage device,
including all data sectors, unallocated space, and slack space.

Purpose:
● To preserve the exact state of the original device, ensuring that no data is
altered or omitted during the process.
Imaging vs cloning/copying digital evidence
Imaging Digital Evidence:

Advantages:
● Forensic Integrity: Captures all data, including deleted files and hidden
information.
● Authenticity: Generates a hash value for verification, ensuring the image is
an exact replica of the original.

Common Tools: FTK Imager, EnCase, dd (Linux utility).
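
For illustration, a typical dd invocation can be driven from Python as below. The source device and output path are placeholders, and this assumes a Unix host with the source attached behind a hardware write blocker.

```python
import subprocess

# Placeholder paths: /dev/sdb is the (write-blocked) source device.
# conv=noerror,sync keeps imaging past read errors, padding bad blocks
# so sector offsets in the image stay aligned with the source.
subprocess.run(
    ["dd", "if=/dev/sdb", "of=evidence.dd", "bs=4M", "conv=noerror,sync"],
    check=True,
)
```

After imaging, the resulting file would be hashed and compared against a hash of the source, as described earlier.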


Imaging vs cloning/copying digital evidence
Cloning/Copying Digital Evidence:

Definition:
● Cloning or copying involves replicating the file system of a device, copying
active files and directories, but not necessarily capturing slack space or
deleted data.

Purpose:
● Typically used for backup or data migration purposes, rather than forensic
analysis.
Imaging vs cloning/copying digital evidence
Cloning/Copying Digital Evidence:

Advantages:
● Speed: Generally faster than imaging, as it doesn’t include every sector.
● Practicality: Useful for scenarios where only active files are required.

Disadvantages:
● Incomplete Capture: Does not include deleted files, unallocated space, or
hidden data, which are often critical in forensic investigations.
● Lacks Forensic Integrity: May alter file metadata during the copying
process, compromising the evidence.
Imaging vs cloning/copying digital evidence
Comparison: Imaging vs. Cloning:

Imaging:
● Forensic Accuracy: High
● Data Captured: Entire storage device (including hidden and deleted data)
● Use Case: Digital forensics, legal investigations.

Cloning/Copying:
● Forensic Accuracy: Low
● Data Captured: Only active files and directories
● Use Case: Backup, data migration, non-forensic needs.
Imaging vs cloning/copying digital evidence
Key Takeaway:

● Imaging is the preferred method in digital forensics for its ability to capture the
complete digital footprint of a device, ensuring no potential evidence is lost.

● Cloning is more suited for general IT purposes where forensic integrity is not
a concern.
Data Acquisition from Various Systems
● Definition: Data acquisition is the process of extracting and preserving digital
evidence for forensic analysis.

● Goals: Maintain data integrity and authenticity, and ensure evidence is legally admissible.

● Types of Systems: Live systems, shutdown systems, remote servers, RAID arrays, encrypted devices.

● Challenges: Each system type presents unique challenges, requiring tailored approaches for proper evidence handling.
Data Acquisition from Live Systems
● Definition: Live systems are active and running, allowing access to both
volatile (e.g., RAM) and non-volatile data.

● Importance: Capture volatile memory (RAM), running processes, network connections, and system states.

Tools:
● FTK Imager: Acquires live memory and files without shutting down the
system.
● X-Ways Forensics: Handles disk imaging and live acquisition.
● Volatility Framework: Specializes in RAM and volatile data analysis.
Data Acquisition from Live Systems
Challenges:
● Volatile memory disappears if the system is shut down.
● System changes might occur during acquisition, affecting integrity.

Best Practices:
● Use write-blocking software to avoid altering data.
● Document every action taken during acquisition (time-stamped).
● Prioritize capturing volatile data first.
Data Acquisition from Live Systems
Case Study 1: A malware attack on a corporate network.

● Details: An IT security team detects unusual network traffic on a live server suspected of being compromised by ransomware. The server is still operational and critical to business functions.
● Solution: Using FTK Imager, the team captures live memory (RAM) to
identify active processes, network connections, and any encryption keys or
ransomware executables in memory.
● Outcome: Memory analysis reveals the ransomware's encryption method and its command-and-control (C2) communications, leading to an effective containment strategy without needing to power down the server.
Data Acquisition from Shutdown Systems
● Definition: Powered-down systems; focus is on acquiring non-volatile data
such as hard drive content.

● Benefits: Easier to maintain data integrity, as no changes are made while acquiring.

Tools:
● EnCase: Industry-standard tool for disk imaging and analysis.
● dd: A Unix-based tool for bit-by-bit copies of disks.
● Forensic Imager: Simplifies the process of acquiring disk images.
Data Acquisition from Shutdown Systems
Challenges:
● Inaccessible volatile data (RAM, running processes).
● Risk of damaging evidence if the system is powered on improperly.

Best Practices:
● Ensure physical access to the system without altering it.
● Always verify the hash values of disk images post-acquisition for authenticity.
● Avoid booting the system; use hardware write blockers if necessary.
Data Acquisition from Shutdown Systems
Case Study 2: Investigation of a fraudulent activity on a personal computer.

● Details: Law enforcement seizes a suspect's computer suspected of containing evidence of financial fraud. The computer is turned off at the time of acquisition.
● Solution: The investigators use EnCase to image the hard drive without
powering the system on. The image is later analyzed to uncover incriminating
emails and financial records.
● Outcome: The forensic investigation retrieves vital financial documents that
link the suspect to fraudulent transactions, leading to their conviction.
Data Acquisition from Remote Systems
● Definition: Acquisition over a network without physical access to the device.

● Common Use Cases: Cloud environments, remote servers, and corporate networks.

Tools:
● F-Response: Provides remote access to disks and volatile memory over the
network.
● Network Miner: Gathers network traffic data for forensic analysis.
● Magnet AXIOM: Acquires and analyzes data from cloud services and remote
endpoints.
Data Acquisition from Remote Systems
Challenges:
● Risk of data modification or loss during transfer.
● Legal considerations: Ensuring chain of custody across jurisdictions.
● Encryption or VPNs can slow down or block acquisition.

Best Practices:
● Use secure, encrypted channels (e.g., SSH, VPN) for transmission.
● Log all remote access sessions and actions performed.
● Document IP addresses, network configurations, and timestamps.
Data Acquisition from Remote Systems
Case Study 3: Breach of a cloud-based storage system.

● Details: A company suspects that their cloud server has been compromised,
leaking sensitive customer data. Physical access to the cloud infrastructure is
not available.
● Solution: Investigators use F-Response to remotely access the cloud server
and acquire a forensic image. Logs and volatile data from the live cloud
environment are also captured.
● Outcome: Analysis of the remote acquisition reveals unauthorized access to
the cloud server, including API calls that exposed customer data. This
evidence helps the company patch vulnerabilities and support a legal case
against the attackers.
Data Acquisition from RAID Servers
● Definition: RAID (Redundant Array of Independent Disks) servers use
multiple disks to enhance performance, reliability, or redundancy.

● RAID Levels: Understanding striping (RAID 0), mirroring (RAID 1), parity
(RAID 5/6) is critical for proper reassembly.

Tools:
● R-Studio: Forensics tool that reconstructs RAID configurations for imaging.
● X-Ways RAID Reconstructor: Rebuilds RAID arrays from individual disk
images.
● ProDiscover: Supports RAID recovery and forensic imaging.
Data Acquisition from RAID Servers
Challenges:
● Must correctly identify RAID configuration before imaging.
● RAID arrays may fail partially, requiring reconstruction from degraded disks.

Best Practices:
● Identify RAID controller type and configuration (metadata).
● Image each disk in the array individually before reconstruction.
● Ensure proper assembly of the RAID array to maintain data accuracy.
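
To see why the configuration matters, consider the simplest layout: RAID 0 interleaves fixed-size stripes across the disks in order, so reassembly must replay that order exactly. The toy sketch below assumes two disk images, a 64 KiB stripe size, and a known disk order; real arrays require recovering these parameters from controller metadata first.

```python
# Toy RAID 0 reassembly: logical stripe i lives on disk (i % len(disks)).
# Stripe size and disk order here are assumptions; using the wrong
# values scrambles the rebuilt volume, which is why identifying the
# configuration precedes any reconstruction.
STRIPE = 64 * 1024

def reassemble_raid0(disk_images: list[str], out_path: str) -> None:
    files = [open(p, "rb") for p in disk_images]
    with open(out_path, "wb") as out:
        while True:
            # One round = one stripe from each disk, in array order.
            stripes = [f.read(STRIPE) for f in files]
            if not any(stripes):
                break
            for s in stripes:
                out.write(s)
    for f in files:
        f.close()

# Placeholder image names for the individually acquired member disks.
reassemble_raid0(["disk0.img", "disk1.img"], "rebuilt_volume.img")
```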
Data Acquisition from RAID Servers
Case Study 4: Corporate server crash with RAID failure.

● Details: A business's RAID 5 server crashes, losing access to critical data. The RAID configuration is partially degraded, with one disk completely failed.
● Solution: Forensic experts use R-Studio to image the remaining functional
disks and reconstruct the RAID configuration. ProDiscover assists in
extracting important business data from the recovered RAID structure.
● Outcome: The forensic team successfully recovers 95% of the company’s
important files from the RAID array, minimizing data loss and enabling the
business to resume operations.
Data Acquisition from Encrypted Systems
● Definition: Encrypted systems protect data through encryption mechanisms,
requiring decryption before analysis.

● Types of Encryption: Full-disk encryption (FDE), file-level encryption, hardware-based encryption.

Tools:
● Passware: Decrypts and acquires encrypted drives with known passwords or
credentials.
● Elcomsoft Forensic Toolkit: Specialized in password recovery and
encryption bypass.
● Hashcat: High-speed password cracking tool for encrypted data.
Data Acquisition from Encrypted Systems
Challenges:
● Accessing encryption keys, which may reside in memory (live acquisition).
● Cracking passwords can be time-consuming, depending on the strength of
encryption.

Best Practices:
● For live systems, capture memory to extract encryption keys if possible.
● Document the encryption type and any decryption attempts thoroughly.
● Use legal authority to obtain decryption keys when possible.
Data Acquisition from Encrypted Systems
Case Study 5: Criminal investigation involving encrypted laptops.

● Details: Law enforcement seizes a laptop used in an organized crime syndicate. The laptop is protected by full-disk encryption, preventing access to stored data.
● Solution: Investigators use Passware to search for encryption keys in the
system’s memory (RAM) while the laptop is still powered on. Hashcat is
employed to attempt brute-forcing the password if the keys are unavailable.
● Outcome: The encryption key is successfully retrieved from memory, allowing
full decryption of the laptop’s contents. The data reveals incriminating
financial records and communication between syndicate members, leading to
multiple arrests.
Comparing Data Acquisition Methods
System Type       | Key Tools                           | Challenges                                                | Best Practices
Live Systems      | FTK Imager, Volatility              | Volatile data, potential system changes                   | Capture volatile data first, use minimal footprint
Shutdown Systems  | EnCase, dd, Forensic Imager         | Volatile data unavailable, risk if powered on incorrectly | Use write blockers, verify hash values
Remote Systems    | F-Response, Magnet AXIOM            | Network limitations, legal issues                         | Use secure connections, document remote activities
RAID Servers      | R-Studio, X-Ways RAID Reconstructor | Reconstructing RAID, identifying configurations           | Image disks individually, reconstruct accurately
Encrypted Systems | Passware, Elcomsoft, Hashcat        | Decrypting data, cracking passwords                       | Capture memory for keys, use legal methods for decryption
Data Acquisition from Various Systems
Key Consideration:

● Each system type requires specific tools and methods for effective data
acquisition.
● Real-world case studies demonstrate the importance of choosing the right
approach:
○ Live Systems: Fast action is crucial to capture volatile data.
○ Shutdown Systems: Ensures data integrity by focusing on non-volatile storage.
○ Remote Systems: Critical for cloud and distributed environments.
○ RAID Servers: Involves complex recovery but essential in corporate settings.
○ Encrypted Systems: Encryption adds another layer of complexity but can be overcome with
memory acquisition or brute-forcing techniques.
● Proper preparation and technique ensure that valuable evidence is preserved
and admissible in court.
Linux Validation Techniques
Checksum Tools:
● md5sum: Generates MD5 hash for comparison between source and acquired
data.
● sha256sum: Creates a SHA-256 hash for stronger integrity checks.

Built-in Forensics Tools:


● dcfldd: Enhanced version of dd, includes hashing during data acquisition.
● dd: Simple disk copying tool with optional post-process hashing.

Automated Tools:
● Guymager: Open-source tool, provides hash verification while imaging.
● AIR (Automated Image and Restore): Provides automated acquisition and
validation.
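
The hash-while-imaging idea behind dcfldd can be sketched in Python as follows; the source and destination paths are placeholders, and a real acquisition would run behind a write blocker.

```python
import hashlib

def image_and_hash(src: str, dst: str, block_size: int = 1 << 20) -> str:
    """Copy src to dst block by block, hashing on the fly (dcfldd-style),
    so acquisition and integrity baseline happen in a single pass."""
    sha256 = hashlib.sha256()
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while block := fin.read(block_size):
            sha256.update(block)
            fout.write(block)
    return sha256.hexdigest()

# Placeholder paths; on Linux the source could be a block device node.
print(image_and_hash("/dev/sdb", "evidence.raw"))
```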
Windows Validation Techniques
Checksum Tools:
● CertUtil: Windows command-line tool to generate hashes (MD5, SHA256).
● FCIV (File Checksum Integrity Verifier): Simple checksum tool for hash
validation.

Third-Party Tools:
● FTK Imager: Industry-standard tool that includes hashing and validation
during acquisition.
● X-Ways Forensics: Comprehensive tool for imaging and validation with
hashing algorithms.

Native Windows Options:


● PowerShell: Use the Get-FileHash cmdlet for hashing files and images.
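
As a cross-platform complement to these tools, acquired files can be verified against a stored hash manifest (the `<hash>  <filename>` line convention used by md5sum/sha256sum). A minimal sketch, with the manifest name as a placeholder:

```python
import hashlib
from pathlib import Path

def verify_manifest(manifest: str, algo: str = "sha256") -> bool:
    """Recompute each listed file's hash and compare to the manifest entry.
    Files are read whole for brevity; large images should be read in blocks."""
    ok = True
    for line in Path(manifest).read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        digest = hashlib.new(algo)
        digest.update(Path(name).read_bytes())
        if digest.hexdigest() != expected:
            print(f"MISMATCH: {name}")
            ok = False
    return ok

# Assumes a manifest with lines like: "ab12...  evidence.E01"
verify_manifest("evidence_hashes.txt")
```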
Digital Forensics Standard Operating Procedures (SOPs)
Definition
● A structured, step-by-step set of guidelines for digital forensic investigation
processes to ensure consistency, accuracy, and legal compliance.

Importance
● Ensures evidence integrity: SOPs maintain the original state of digital
evidence, preventing tampering or alteration.
● Reduces risk of contamination: Standardized procedures prevent
accidental or intentional corruption of digital evidence.
● Provides legal defensibility: Following SOPs ensures the investigation
process is reliable and admissible in court.
Digital Forensics SOPs
Goals of SOPs
● Maintain the chain of custody.
● Preserve the integrity of digital evidence.
● Follow legal and regulatory requirements.
● Ensure repeatability and reproducibility of results.

Common Challenges Without SOPs


● Inconsistent Findings: Different investigators could produce different results
on the same evidence without standardized procedures.
● Legal Rejection: Poor evidence handling could lead to evidence being
inadmissible in court.
● Security Risks: Unstandardized processes may expose sensitive data to
risks like unauthorized access or accidental modification.
Key Phases in Digital Forensics SOPs
1. Identification: Determining the scope of investigation and target devices.

2. Collection: Gathering evidence while ensuring it remains unaltered.

3. Examination: Analyzing data in a structured manner.

4. Analysis: Extracting relevant information from evidence.

5. Presentation: Reporting findings clearly and accurately.

6. Documentation: Keeping detailed logs of each step taken.


SOP for Evidence Collection
Steps:
1. Use proper tools to collect evidence (hardware/software).
2. Ensure write protection is enabled.
3. Create cryptographic hashes of evidence before collection.
4. Maintain detailed chain of custody documentation.

Best Practices:
● Use forensically sound methods.
● Avoid unnecessary handling.
● Label and store evidence securely.
SOP for Evidence Examination
Steps:
1. Verify integrity using hash comparisons.
2. Use designated forensic tools (e.g., EnCase, FTK).
3. Document each action performed on the evidence.

Best Practices:
● Always use verified, tested tools.
● Create working copies of data before examination.
● Maintain logs of tool versions and settings used.
SOP for Analysis and Reporting
Steps:
1. Analyze data based on the scope of the investigation.
2. Cross-validate results using different tools or methods.
3. Write clear, concise, and accurate reports.

Best Practices:
● Focus on factual findings without assumptions.
● Use visual aids (graphs, timelines) for clarity.
● Peer review the report before final submission.
Chain of Custody in Digital Forensics
Definition
The chronological documentation showing the seizure, custody, control, transfer,
analysis, and disposition of evidence.

Components
● Evidence identifier (serial number, case ID).
● Time and date stamps.
● List of all individuals handling the evidence.
● Documentation of any access to the evidence.
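
These components map naturally onto a structured, append-only log. A minimal sketch, with all field names and values illustrative rather than prescribed:

```python
import json
from datetime import datetime, timezone

def log_custody_event(log_path: str, evidence_id: str,
                      handler: str, action: str) -> None:
    """Append one time-stamped custody entry per handling event."""
    entry = {
        "evidence_id": evidence_id,   # serial number / case ID
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "handler": handler,           # who handled the evidence
        "action": action,             # seized, transferred, imaged, ...
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Illustrative values only.
log_custody_event("custody_log.jsonl", "CASE-042/EXH-001",
                  "A. Examiner", "imaged")
```

An append-only record of this kind preserves the chronological trail that courts expect when evaluating whether evidence was handled properly.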
Legal Considerations in SOPs
Regulatory Requirements
● Compliance with local, national, and international laws.
● Privacy and data protection standards (e.g., GDPR).

Court Admissibility
● Evidence must be collected and handled according to SOPs for it to be
admissible in court.
● Testimony of examiners relies on adherence to procedures.
Digital Forensics Standard Operating Procedures (SOPs)
Key Considerations

● SOPs are critical for maintaining evidence integrity and legal defensibility.
● Consistent application of SOPs ensures reliable forensic outcomes.
● Adhering to Digital Forensics SOPs is not just a best practice but a necessity
for lawful, credible investigations.
Software and Hardware Tools Used in Forensic Analysis
Definition
● Digital forensic tools, both software and hardware, are used to collect,
analyze, and preserve digital evidence while maintaining its integrity.

Categories of Tools
● Open Source: Freely available tools used for forensic investigations.
● Proprietary: Commercial tools with advanced features and support.

Importance
● Facilitates accurate evidence collection.
● Assists in analyzing complex datasets.
● Ensures legal compliance by following forensically sound methods.
Popular Open Source Tools in Digital Forensics
● Autopsy: A GUI-based platform used for digital forensic analysis. Best for
investigating hard drives and smartphones.
● Sleuth Kit (TSK): Command-line tools for recovering deleted files and
partitions.
● Volatility: Advanced memory forensics tool used for analyzing RAM.
● Wireshark: Network packet analyzer for capturing and inspecting network
traffic.
Advantages
● Free and customizable.
● Community-driven development and support.
● Useful for smaller organizations or specific forensic tasks.
Popular Proprietary Tools in Digital Forensics
● EnCase: One of the most widely used forensic tools. It supports disk imaging,
data recovery, and analysis.
● FTK (Forensic Toolkit): A comprehensive suite for disk imaging, email
analysis, and registry analysis.
● X-Ways Forensics: Lightweight but powerful forensic software for advanced
data recovery and analysis.
● Magnet AXIOM: A tool for deep analysis of mobile devices, cloud storage,
and social media.
Advantages
● Advanced features for large-scale investigations.
● Official customer support and regular updates.
● Integration with other commercial forensic hardware.
Hardware Tools in Digital Forensics
● Write Blockers: Prevent modification of data on digital storage devices while
allowing read access for analysis (e.g., Tableau Write Blocker).
● Forensic Duplicators/Imagers: Tools that create bit-for-bit copies of digital
storage media for analysis (e.g., Logicube Falcon, Image MASSter).
● Portable Forensic Workstations: Rugged laptops or workstations equipped
with forensic software for use in the field (e.g., Forensic Laptop Workstations
from Digital Intelligence).

Advantages
● Physical preservation of evidence.
● Support for high-speed data transfer and analysis.
● Rugged designs for fieldwork.
Comparison: Open Source vs Proprietary Tools
Feature       | Open Source                          | Proprietary
Cost          | Free                                 | Expensive
Customization | Highly customizable                  | Limited to licensed features
Support       | Community-driven                     | Official vendor support
Features      | Limited to basic features            | Advanced and extensive features
Usage         | Small organizations, specific tasks  | Large-scale investigations

Key Consideration:
● Open source tools are ideal for budget-conscious cases or smaller tasks,
while proprietary tools are necessary for large, complex investigations that
require advanced features and official support.
Integration of Open Source and Proprietary Tools
Hybrid Approach
● Many forensic investigators use a combination of open-source and proprietary
tools to balance cost, functionality, and the complexity of the investigation.

Example Workflow
● Use Sleuth Kit for initial file system recovery and follow up with FTK for
deeper registry analysis and email extraction.

Cost Efficiency
● A hybrid approach allows smaller organizations to handle large cases
effectively without fully relying on expensive proprietary tools.
Choosing the Right Tool for the Investigation
Criteria to Consider
● Budget: Can the organization afford the proprietary tool?
● Investigation Scope: How complex is the investigation? Does it involve
advanced data like cloud or mobile forensics?
● Legal Requirements: Are there specific legal frameworks or certifications
that require certain tools?
● Team Expertise: Do investigators have the skill set to use open-source tools
effectively?
Key Considerations
● Selecting the right forensic tools, whether open-source or proprietary,
depends on the specific needs and resources of the investigation.

● Open source tools are valuable for basic forensic tasks and smaller
organizations, while proprietary tools provide advanced features and
professional support for complex cases.

● A combination of both types of tools can optimize both cost and efficiency in
digital forensic investigations.
Challenges in Cyber-Crime Investigation
Anonymity of Cyber Criminals:
● Criminals can hide their identity using VPNs, Tor networks, and encryption,
making it difficult to trace their actions.
Global Jurisdictional or Legal Issues:
● Cybercrime often crosses international borders, complicating legal processes,
cooperation between countries, and evidence collection.
Data Volume and Complexity:
● Investigators must deal with huge amounts of data, making it challenging to
analyze and extract relevant information quickly.
Rapid Evolution of Technology:
● New technologies and attack techniques, like ransomware or zero-day
exploits, make it difficult for forensic tools and methods to keep pace.
Challenges in Digital Forensics
Encryption:
● Widespread use of encryption on digital devices makes it difficult to access
evidence without the appropriate decryption keys.
Anti-forensic Techniques:
● Cybercriminals use methods like data obfuscation, steganography, or wiping tools
to make forensic analysis more difficult.
Volatile Evidence:
● Digital evidence can be fragile and easily altered or deleted, especially in live
systems (e.g., RAM data).
Diverse Platforms and Devices:
● Investigations span across multiple device types (computers, mobile phones, IoT
devices) and operating systems (Windows, Linux, Android, etc.), each requiring
specific tools and expertise.
Thank You
