ADF - Module 1
Presenter: Yash Patel
Introduction to Cyber-crime Investigation
What is Cyber-crime?
●   Hacking
●   Phishing
●   Ransomware
●   Identity theft
●   Financial fraud
●   Malware Distribution
●   Cyberstalking
Introduction to Cyber-crime Investigation
Phases of Investigation
1. Identification
 ● Locating potential sources of evidence (devices, networks, etc.)
 ● Example: Detecting that a compromised employee laptop contains potential
     evidence of intellectual property theft.
2. Preservation
 ● Ensuring that evidence is protected and not altered.
 ● Example: Creating a bit-by-bit forensic image of the laptop's hard drive to
     preserve the original data.
Conducting an Investigation
Key Phases:
3. Collection
 ● Gathering digital data in a forensically sound manner.
 ● Example: Extracting email archives, internet history, and deleted files from the
    forensic image using specialized tools.
4. Examination
 ● Processing and analyzing the data to uncover relevant information.
 ● Example: Searching through the email archives for suspicious attachments or
    communications related to the incident.
Conducting an Investigation
Key Phases:
5. Analysis
 ● Drawing conclusions from the collected evidence to support legal or internal
     investigations.
 ● Example: Correlating the extracted data with timestamps to reconstruct a
     timeline of the suspected theft activities.
6. Presentation
 ● Organizing and presenting findings in a clear and understandable manner for
     stakeholders or court proceedings.
 ● Example: Creating a report and visual aids (e.g., timelines, charts) to explain
     the findings clearly to non-technical stakeholders in court.
Conducting an Investigation
Best Practices:
●   Chain of Custody: Always document who handles the evidence and when to
    ensure it's admissible in court.
●   Forensically Sound Methods: Use write blockers and trusted forensic tools
    to prevent altering the original evidence.
●   Comprehensive Documentation: Record every action taken during the
    investigation to ensure the process is transparent and reproducible.
●   Stay Up-to-Date: Continuously update knowledge and tools to keep up with
    evolving technologies and techniques.
●   Legal Compliance: Ensure that all steps taken in the investigation comply
    with legal and regulatory standards.
Preparing for Search and Seizure
1. Legal Authorization
 ● Obtain search warrant or court order
 ● Ensure compliance with jurisdictional laws and regulations
 ● Example: Acquiring a warrant to search an employee's work laptop suspected
     of containing stolen data.
2. Pre-Search Planning
 ● Conduct reconnaissance of the location
 ● Identify potential evidence and target areas
 ● Prepare necessary tools and equipment (e.g., forensic kits, data storage
     devices)
 ● Example: Planning the collection of both on-site and cloud-stored data for a
     case involving intellectual property theft.
Preparing for Search and Seizure
3. Assemble the Team
 ● Select experienced investigators and technical experts
 ● Brief team on legal constraints and objectives
 ● Assign specific roles and responsibilities
 ● Example: Ensuring that a forensic expert, legal advisor, and IT security
    professional are present during the operation.
4. Securing the Scene
 ● Ensure the safety of personnel and evidence
 ● Isolate the area to prevent tampering
 ● Document the condition of the scene upon entry
 ● Example: Locking down a section of the office where digital devices are
    located and restricting access only to authorized forensic personnel.
Preparing for Search and Seizure
5. Evidence Collection
 ● Follow chain of custody protocols
 ● Collect and preserve physical and digital evidence systematically
 ● Use proper forensic techniques to prevent contamination
 ● Example: Using a write blocker to connect a suspect's hard drive to prevent
    any accidental data modification.
6. Post-Seizure Documentation
 ● Catalog all seized items
 ● Create detailed reports on the search process
 ● Prepare for potential legal challenges regarding the seizure
 ● Example: Logging the exact time, location, and person responsible for
    handling each seized device.
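A minimal chain-of-custody log entry can be generated at seizure time. The sketch below is illustrative only: the file name, handler ID, and log path are hypothetical stand-ins for a real seized image and case log.

```shell
# Hypothetical paths for illustration; real cases log the actual seized image.
EVIDENCE="suspect_drive.img"
LOG="custody_log.txt"

printf 'sample evidence data' > "$EVIDENCE"   # stand-in for a seized image

# Record who, when, what, and the item's hash in one log line.
HASH=$(sha256sum "$EVIDENCE" | awk '{print $1}')
printf '%s | handler=jdoe | item=%s | sha256=%s\n' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$EVIDENCE" "$HASH" >> "$LOG"

cat "$LOG"
```

Recording the hash at seizure time means any later alteration of the item is immediately detectable by re-hashing.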
Securing the Crime Scene
4. Evidence Preservation
 ● Capture live system data (e.g., RAM, active network connections)
 ● Create forensic images of storage devices
 ● Log and record all actions taken during the scene securing
Securing the Crime Scene
5. Documentation
 ● Document the state of all affected systems upon arrival
 ● Record actions taken, including network isolation, shutdowns, and forensic
    captures
 ● Ensure detailed notes for legal and investigative purposes
6. Communication
 ● Coordinate with the response team to ensure all actions are properly aligned
 ● Brief stakeholders on containment progress and next steps
Daubert Standards
Key Admissibility Factors
●   Testability
     ○   Can the theory or technique be tested?
●   Peer Review
     ○   Has the theory or technique been peer-reviewed and published?
●   Error Rate
     ○   What is the known or potential error rate?
●   Standards
     ○   Are there standards controlling the operation of the technique?
●   General Acceptance
     ○   Is the theory or technique generally accepted in the relevant scientific community?
Daubert Standards
Example 1: Daubert v. Merrell Dow Pharmaceuticals (1993)
●   Outcome: The Supreme Court established that courts must evaluate the
    scientific validity of the evidence, leading to the creation of the Daubert
    Standard.
Daubert Standards
Example 2: General Electric Co. v. Joiner (1997)
●   Outcome: The court found that the expert testimony was too uncertain and
    not directly applicable to the facts of the case, affirming the trial court’s
    exclusion of the evidence under the Daubert Standard.
Daubert Standards
Example 3: Kumho Tire Co. v. Carmichael (1999)
●   Case Summary: A tire blowout led to a fatal accident, and the plaintiffs
    claimed the tire was defective.
●   Outcome: The Supreme Court ruled that the Daubert Standard applies to all
    expert testimony, not just scientific testimony, broadening its application.
ISO/IEC 27037: 2012
Phase 1: Identification of Digital Evidence
This phase involves identifying all potential sources of digital evidence, which can
range from computers, mobile devices, servers, cloud storage, and IoT devices to
network logs and databases.
ISO/IEC 27037: 2012
Phase 2: Collection of Digital Evidence
During the collection phase, it is crucial to ensure that the data is gathered in a
manner that preserves its integrity. ISO/IEC 27037:2012 outlines specific methods
for ensuring:
 ● Forensic soundness: This refers to collecting evidence without altering the
     original data, ensuring that it remains in its original state.
 ● Chain of custody: A documented process that records each person or entity
     that handles the evidence, ensuring its traceability.
 ● Minimization of Contamination: Steps should be taken to reduce the risk of
     contamination, including isolating the data from the live environment, avoiding
     unnecessary data access, and employing write blockers or other tools.
ISO/IEC 27037: 2012
Phase 3: Acquisition of Digital Evidence
This phase involves the actual extraction of data from digital sources. ISO/IEC
27037:2012 provides guidelines for:
●   Imaging: Making exact copies of digital storage media, such as hard drives,
    in a way that preserves the original data. Bit-level copies are preferred to
    ensure all data, including hidden or deleted files, is captured.
ISO/IEC 27037: 2012
Phase 4: Preservation of Digital Evidence
After acquisition, evidence must be preserved for future analysis and potential
legal proceedings.
ISO/IEC 27037:2012 emphasizes that digital evidence handling must comply with
applicable laws and regulations. This includes:
●   First responders must ensure that they handle potential evidence properly to
    avoid accidental contamination, while specialists and analysts are responsible
    for the more technical aspects of evidence processing.
ISO/IEC 27037: 2012
Technical Considerations
The standard provides technical guidelines for handling various types of digital
evidence, including:
●   Network Evidence: Gathering network logs, packet captures, and other data
    from network devices can be vital. The standard discusses the importance of
    preserving time synchronization and log integrity.
●   Volatile Data: In some cases, data stored in volatile memory (e.g., RAM) may
    be critical, and specific guidelines are provided for capturing and preserving
    such data before it is lost.
ISO/IEC 27037: 2012
Risk Management
Complementary Standards
Key Integration
●   ISO/IEC 27037 sets the foundation for digital evidence handling, and
    subsequent standards (27041, 27042, 27043) build on its guidelines for
    deeper forensic analysis and investigation processes.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:
●   Volatility Framework: A powerful memory forensics framework for analyzing
    the contents of RAM captured from Windows, Linux, and macOS systems.
    Volatility supports various plugins that allow investigators to extract artifacts
    such as active processes, network connections, loaded drivers, and even
    decrypted content from encrypted applications.
●   FTK Imager: Primarily used for disk imaging, FTK Imager can also capture
    RAM. It provides a simple interface for acquiring memory dumps without
    affecting the underlying system.
●   DumpIt: A portable tool that can quickly create a full memory dump from a
    system. It's designed for live response scenarios where time is critical.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:
Network Connections:
●   Open Ports and Active Connections: Volatile data collection must include
    capturing information about open network ports and current network
    connections. Tools like netstat or memory analysis can reveal ongoing
    communications with external servers, which may indicate data exfiltration or
    command-and-control activity.
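On a live Linux system, a first responder might snapshot the current connection state before it changes; the sketch below assumes a Linux host and a hypothetical output filename, falling back to the kernel's own tables when ss/netstat are not on the box.

```shell
# Snapshot active connections before they change; prefer ss/netstat when
# present, fall back to reading the kernel's own table via /proc.
OUTFILE="netstate_snapshot.txt"
if command -v ss >/dev/null 2>&1; then
    ss -tan > "$OUTFILE"
elif command -v netstat >/dev/null 2>&1; then
    netstat -tan > "$OUTFILE"
else
    cat /proc/net/tcp > "$OUTFILE"
fi

# Hash the snapshot immediately so any later alteration is detectable.
sha256sum "$OUTFILE" > "$OUTFILE.sha256"
```

Hashing the snapshot the moment it is taken ties it into the chain of custody, as with any other acquired artifact.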
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:
●   Minimizing System Impact: Tools used for volatile data collection must
    operate in a manner that minimizes their footprint on the system. The goal is
    to avoid altering memory content or introducing new processes that could
    overwrite critical evidence.
●   Hashing and Verification: Just like with non-volatile data, the integrity of
    volatile data must be ensured. Once the memory dump is created, it should
    be hashed using algorithms like MD5 or SHA-256 to ensure that it remains
    unaltered during analysis.
Data Collection Methods
Key Considerations and Techniques for Volatile Data Collection:
●   Data Protection Compliance: Volatile data may contain personal data, and
    investigators must handle it in compliance with data protection regulations
    such as the GDPR or CCPA.
Data Collection Methods
Case Study 1: Sony Pictures Hack (2014)
Background:
 ● In November 2014, Sony Pictures Entertainment was the target of a
   devastating cyber attack. The attackers stole sensitive information and
   deployed wiper malware that destroyed data across the network.
Data Collection Methods
Case Study 1: Sony Pictures Hack (2014)
Outcome:
 ● Volatile data analysis was instrumental in attributing the attack to North
    Korean state-sponsored hackers and mitigating further damage.
Data Collection Methods
Case Study 2: Operation Aurora (2009-2010)
Background:
 ● Operation Aurora was a series of cyberattacks conducted by Chinese hackers
   against major corporations, including Google, Adobe, and other tech firms, in
   late 2009 and early 2010. The attackers exploited a zero-day vulnerability to
   gain access to internal systems.
Data Collection Methods
Case Study 2: Operation Aurora (2009-2010)
Outcome:
 ● The volatile data collection provided critical insights into the attackers'
    methods and led to the discovery of the widespread nature of the attacks
    across multiple organizations.
Data Collection Methods
Collection of Persistent Data (e.g., Hard Drives)
Forensic Imaging:
●   Unallocated Space: Even when files are deleted, remnants often remain in the
    unallocated space of the storage device until they are overwritten. Advanced
    forensic tools can recover these remnants, providing access to deleted files
    and fragments.
●   Slack Space: Slack space is the unused space within allocated clusters on
    the disk. For example, if a file is smaller than the cluster size, the remaining
    space may still contain data from previously deleted files. Investigators should
    analyze slack space to uncover hidden data.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:
●   System Logs: Persistent data includes various system logs that can provide
    valuable insights into system activity, including login attempts, software
    installation, and file access. These logs are often crucial in establishing a
    timeline of events.
●   Remote Acquisition: For remote devices (e.g., mobile devices, IoT devices),
    specialized forensic tools like Cellebrite or Magnet AXIOM are used to extract
    data while maintaining its integrity. This may involve rooting or jailbreaking
    the device to gain full access to its storage.
Data Collection Methods
Key Considerations and Techniques for Persistent Data Collection:
Case Study 3: BTK Killer Dennis Rader (2005)
Background:
 ● Dennis Rader, known as the BTK Killer, was caught after sending a floppy
   disk to the media, believing it was anonymous.
Data Collection Methods
Case Study 3: BTK Killer Dennis Rader (2005)
Outcome:
 ● Persistent data analysis played a crucial role in solving this high-profile case,
    linking digital evidence to a notorious serial killer.
Data Collection Methods
Case Study 4: Enron Investigation (2001)
Background:
 ● The Enron scandal involved massive corporate fraud, leading to the
   company's bankruptcy and the downfall of Arthur Andersen LLP, one of the
   largest audit firms at the time.
Data Collection Methods
Case Study 4: Enron Investigation (2001)
Outcome:
 ● Persistent data collection was key to building the case against Enron
    executives and securing multiple convictions for corporate fraud.
Data Collection Methods
Case Study 5: Target Data Breach (2013)
Background:
 ● Target Corporation suffered a massive data breach in 2013, where hackers
   stole payment card information from over 40 million customers by
   compromising Target’s point-of-sale (POS) systems.
Data Collection Methods
Case Study 5: Target Data Breach (2013)
Outcome:
 ● The hybrid approach of collecting both volatile and persistent data helped
    investigators mitigate the breach and led to industry-wide changes in retail
    cybersecurity practices.
Data Collection Methods
Case Study 6: Ukraine Power Grid Cyberattack (2015)
Background:
 ● In December 2015, a cyberattack targeted Ukraine's power grid, causing
   power outages across the Ivano-Frankivsk region. The attack was attributed
   to Russian state-sponsored hackers and involved sophisticated malware.
Hashing Algorithms
Common Algorithms:
●   MD5 (Message Digest Algorithm 5): Produces a 128-bit hash value. Widely
    used but vulnerable to collisions.
●   SHA-1 (Secure Hash Algorithm 1): Produces a 160-bit hash value. More
    secure than MD5 but has been deprecated due to vulnerabilities.
●   SHA-256 (Secure Hash Algorithm 256): Part of the SHA-2 family, produces
    a 256-bit hash value, offering robust security.
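The digest sizes above can be checked with the standard coreutils hashing commands: 128, 160, and 256 bits print as 32, 40, and 64 hexadecimal characters respectively (the input string here is purely illustrative).

```shell
DATA='example evidence'

# Pipe the same input through each algorithm and capture the hex digest.
MD5=$(printf '%s' "$DATA" | md5sum | awk '{print $1}')
SHA1=$(printf '%s' "$DATA" | sha1sum | awk '{print $1}')
SHA256=$(printf '%s' "$DATA" | sha256sum | awk '{print $1}')

echo "MD5     (${#MD5} hex chars): $MD5"
echo "SHA-1   (${#SHA1} hex chars): $SHA1"
echo "SHA-256 (${#SHA256} hex chars): $SHA256"
```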
Hashing Algorithms
MD5 (Message Digest Algorithm 5)
●   Final Digest: After processing all chunks, MD5 produces a 128-bit message
    digest (hash).
Hashing Algorithms
Applications:
●   File Integrity Checks: MD5 was extensively used to ensure files were
    transferred correctly (e.g., when downloading software, the MD5 hash of the
    file was used to check if the download was correct).
Hashing Algorithms
SHA-1 (Secure Hash Algorithm 1)
●   Purpose: SHA-1 was widely used in security protocols like SSL and TLS, and
    in cryptographic applications, particularly in digital signatures.
Hashing Algorithms
How It Works:
 ● Padding the Input: Similar to MD5, SHA-1 processes the message in 512-bit
   chunks, padding it to make the message length congruent to 448 modulo 512.
   The padding includes the length of the original message as a 64-bit integer at
   the end.
●   Version Control Systems: Git and other version control systems used
    SHA-1 to create unique hashes for identifying different file versions.
Hashing Algorithms
SHA-256 (Secure Hash Algorithm 256)
How It Works:
●   Padding the Input: Like MD5 and SHA-1, the input message is padded to
    make the message length congruent to 448 modulo 512. The padding
    includes the length of the message as a 64-bit integer at the end.
●   Message Processing: SHA-256 processes the input in 512-bit blocks. The
    algorithm goes through 64 rounds of processing, involving modular additions,
    bitwise operations, and logical functions. It uses eight working variables and a
    series of constants derived from the first 32 bits of the fractional parts of the
    cube roots of the first 64 prime numbers.
●   Final Digest: After processing all message blocks, SHA-256 produces a
    256-bit message digest, providing stronger security than SHA-1.
Hashing Algorithms
Comparison:

                    MD5                          SHA-1                        SHA-256
Hash Size           128-bit (32-character hex)   160-bit (40-character hex)   256-bit (64-character hex)
Applications        File integrity, checksums    Digital signatures, SSL      SSL, blockchain, certificates
                    (non-crypto)                 (deprecated)
Known               Collisions, broken in 2004   Collisions, broken in 2017   None known (as of 2024)
Vulnerabilities
Hashing Algorithms
Properties of Hashing Algorithms
●   Deterministic: The same input will always produce the same hash output,
    ensuring consistency in forensic analysis.
●   Collision Resistance: A good hash function minimizes the likelihood that two
    different inputs will produce the same hash value, which is crucial to avoid
    evidentiary issues.
●   Avalanche Effect: A small change in the input should drastically change the
    hash value, preventing small alterations from going unnoticed.
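The determinism and avalanche properties are easy to observe from the command line: re-hashing the same input always matches, while appending a single character produces a completely different digest (the inputs below are illustrative).

```shell
# Same input twice: digests must match (deterministic).
H1=$(printf 'Forensic Analysis' | sha256sum | awk '{print $1}')
H2=$(printf 'Forensic Analysis' | sha256sum | awk '{print $1}')
[ "$H1" = "$H2" ] && echo "deterministic: OK"

# One extra character: digest changes completely (avalanche effect).
H3=$(printf 'Forensic Analysis.' | sha256sum | awk '{print $1}')
[ "$H1" != "$H3" ] && echo "avalanche: digests differ"

echo "$H1"
echo "$H3"
```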
Hashing Algorithms
Applications of Hashing in Digital Forensics
●   Verifying Integrity: Hashes are used to ensure that digital evidence remains
    unaltered from the point of acquisition to analysis.
SHA-1 Example:
 ● Input: "Forensic Analysis"
 ● SHA-1 Hash Output: 3d5aa13adcaebed1f524efb6f3a2040e8f92fb65
SHA-256 Example:
 ● Input: "Forensic Analysis"
 ● SHA-256 Hash Output:
   16df40b722ca02cc44d1f6f8d18e2c7686d5b865cf29c14263215c5f865c8021
Cloning of Digital Exhibits
Understanding Digital Exhibits
●   Definition: Digital exhibits refer to any form of digital data that can be used as
    evidence in a legal context. This includes files, emails, digital photos, log files,
    and even entire hard drives.
Cloning of Digital Exhibits
Understanding Digital Exhibits
●   Cloning: Captures all data, including deleted files, file slack, and unallocated
    space.
●   Copying: Typically only copies active files and may miss hidden or deleted
    data.
Cloning of Digital Exhibits
Methods of Cloning:
● Software-based Cloning:
     ○   dd command: A Unix-based command-line utility that can create exact copies of files or entire
         disks.
     ○   FTK Imager: A widely-used forensic imaging tool that can create forensic disk images and
         verify them with hash values.
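A minimal dd-based clone with before-and-after hash verification might look like the sketch below. A small scratch file stands in for the suspect medium; on a real exhibit the source would be a device node accessed through a write blocker.

```shell
# Scratch file standing in for the suspect medium (illustrative only).
dd if=/dev/zero of=suspect_disk.img bs=512 count=100 2>/dev/null
printf 'deleted-file remnant' | dd of=suspect_disk.img bs=1 seek=2048 conv=notrunc 2>/dev/null

# Hash the source, clone it bit-for-bit, then hash the clone.
SRC_HASH=$(sha256sum suspect_disk.img | awk '{print $1}')
dd if=suspect_disk.img of=clone_disk.img bs=512 conv=noerror,sync 2>/dev/null
DST_HASH=$(sha256sum clone_disk.img | awk '{print $1}')

# Matching digests show the clone is an exact replica of the source.
[ "$SRC_HASH" = "$DST_HASH" ] && echo "clone verified: $SRC_HASH"
```

The conv=noerror,sync options keep the copy going past read errors and pad bad blocks, which is why dd is commonly paired with hash comparison rather than trusted on its own.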
Cloning of Digital Exhibits
Best Practices for Cloning Digital Exhibits
●   Use Write Blockers: Ensure that the original media cannot be altered during
    the cloning process.
●   Verify the Integrity: Generate hash values for the original and cloned
    exhibits both before and after cloning to confirm no changes have occurred.
Hashing Tools:
●   HashCalc: A simple tool that can calculate hash values using various
    algorithms.
Cloning Tools:
●   FTK Imager: Allows for the creation of forensic images, as well as the
    verification of image integrity.
●   dd: A versatile Unix/Linux command that can create exact copies of disks and
    partitions.
Cloning of Digital Exhibits
Challenges
●   Hash Collisions: Although rare, hash collisions can occur, where two
    different inputs produce the same hash value. This can compromise the
    integrity of forensic evidence.
Documentation:
●   Process: Detail the steps taken to hash, clone, and analyze the evidence,
    emphasizing the importance of maintaining data integrity.
Key Takeaways:
●   Hashing: Essential for verifying the integrity of digital evidence. Ensures that
    data remains unchanged from acquisition to analysis.
●   Cloning: Critical for preserving the original evidence while allowing for
    in-depth analysis. Ensures that the original data remains intact and unaltered.
Digital Imaging Formats
Raw Formats
Overview:
 ● Raw formats represent a direct bit-by-bit copy of the original storage medium,
   capturing all data as it is.
Examples:
 ● Common Extensions: .dd, .img, .raw
Digital Imaging Formats
Raw Formats
Advantages:
●   Simplicity and Compatibility: A direct copy with no special structure,
    supported by virtually every forensic tool.
Disadvantages:
●   File Size: Typically large because of lack of compression; exact copy of the
    original media size.
●   Metadata Storage: Raw formats do not inherently store metadata such as
    hash values, timestamps, or acquisition details, which must be managed
    separately.
●   No Error Detection/Correction: Cannot inherently detect errors in data
    acquisition.
Digital Imaging Formats
Proprietary Formats
Overview:
 ● Proprietary formats are developed by specific software vendors and are
   designed to offer enhanced features like compression, encryption, and error
   detection.
Examples:
 ● EnCase Format (.E01): Widely used by EnCase forensic software.
 ● FTK Image Format (.AD1): Used by AccessData’s FTK.
Digital Imaging Formats
Proprietary Formats
Advantages:
●   Enhanced Features: Compression, encryption, error detection, and embedded
    case metadata.
Disadvantages:
●   Vendor Lock-In: Limited compatibility with tools outside the vendor's
    ecosystem.
Digital Imaging Formats
Advanced Forensic Formats
Overview:
 ● Advanced Forensic Formats (AFF): Developed as an open or semi-open
   standard to address limitations of raw and proprietary formats.
Examples:
 ● Advanced Forensic Format (.aff): Supports compression, encryption, and
   extensive metadata storage.
 ● AFF4: An enhanced version of AFF with better support for large-scale
   investigations and faster processing.
Digital Imaging Formats
Advanced Forensic Formats
Advantages:
●   Open Standard: Not tied to a single vendor's tools.
●   Rich Features: Supports compression, encryption, and extensive metadata
    storage.
●   Scalability: AFF4 adds better support for large-scale investigations and
    faster processing.
Digital Imaging Formats
When to Use Proprietary Formats:
●   Specific Software Workflows: When using tools like EnCase or FTK that
    require their proprietary formats.
●   Need for Compression: Efficient storage of large volumes of evidence.
●   Enhanced Security: Cases where encryption and error detection are critical.
Digital Imaging Formats
Summary of Formats:
●   Raw Formats: Simple and widely compatible but lack advanced features.
●   Proprietary Formats: Offer enhanced capabilities but come with vendor
    lock-in and compatibility issues.
●   Advanced Formats: Provide flexibility and scalability, ideal for complex
    investigations but may require specialized tools and knowledge.
Imaging vs cloning/copying digital evidence
Imaging Digital Evidence:
Definition:
 ● Imaging refers to creating a bit-by-bit copy of a digital storage device,
    including all data sectors, unallocated space, and slack space.
Purpose:
 ● To preserve the exact state of the original device, ensuring that no data is
    altered or omitted during the process.
Imaging vs cloning/copying digital evidence
Imaging Digital Evidence:
Advantages:
 ● Forensic Integrity: Captures all data, including deleted files and hidden
   information.
 ● Authenticity: Generates a hash value for verification, ensuring the image is
   an exact replica of the original.
Imaging vs cloning/copying digital evidence
Cloning/Copying Digital Evidence:
Definition:
 ● Cloning or copying involves replicating the file system of a device, copying
    active files and directories, but not necessarily capturing slack space or
    deleted data.
Purpose:
 ● Typically used for backup or data migration purposes, rather than forensic
    analysis.
Imaging vs cloning/copying digital evidence
Cloning/Copying Digital Evidence:
Advantages:
 ● Speed: Generally faster than imaging, as it doesn’t include every sector.
 ● Practicality: Useful for scenarios where only active files are required.
Disadvantages:
 ● Incomplete Capture: Does not include deleted files, unallocated space, or
    hidden data, which are often critical in forensic investigations.
 ● Lacks Forensic Integrity: May alter file metadata during the copying
    process, compromising the evidence.
Imaging vs cloning/copying digital evidence
Comparison: Imaging vs. Cloning:
Imaging:
 ● Forensic Accuracy: High
 ● Data Captured: Entire storage device (including hidden and deleted data)
 ● Use Case: Digital forensics, legal investigations.
Cloning/Copying:
 ● Forensic Accuracy: Low
 ● Data Captured: Only active files and directories
 ● Use Case: Backup, data migration, non-forensic needs.
Imaging vs cloning/copying digital evidence
Key Takeaway:
●   Imaging is the preferred method in digital forensics for its ability to capture the
    complete digital footprint of a device, ensuring no potential evidence is lost.
●   Cloning is more suited for general IT purposes where forensic integrity is not
    a concern.
Data Acquisition from Various Systems
●   Definition: Data acquisition is the process of extracting and preserving digital
    evidence for forensic analysis.
Data Acquisition from Live Systems
Tools:
 ● FTK Imager: Acquires live memory and files without shutting down the
    system.
 ● X-Ways Forensics: Handles disk imaging and live acquisition.
 ● Volatility Framework: Specializes in RAM and volatile data analysis.
Data Acquisition from Live Systems
Challenges:
 ● Volatile memory disappears if the system is shut down.
 ● System changes might occur during acquisition, affecting integrity.
Best Practices:
 ● Use write-blocking software to avoid altering data.
 ● Document every action taken during acquisition (time-stamped).
 ● Prioritize capturing volatile data first.
Data Acquisition from Live Systems
Case Study 1: A malware attack on a corporate network.
Data Acquisition from Shutdown Systems
Tools:
 ● EnCase: Industry-standard tool for disk imaging and analysis.
 ● dd: A Unix-based tool for bit-by-bit copies of disks.
 ● Forensic Imager: Simplifies the process of acquiring disk images.
Data Acquisition from Shutdown Systems
Challenges:
 ● Inaccessible volatile data (RAM, running processes).
 ● Risk of damaging evidence if the system is powered on improperly.
Best Practices:
 ● Ensure physical access to the system without altering it.
 ● Always verify the hash values of disk images post-acquisition for authenticity.
 ● Avoid booting the system; use hardware write blockers if necessary.
Data Acquisition from Shutdown Systems
Case Study 2: Investigation of fraudulent activity on a personal computer.
Data Acquisition from Remote Systems
Tools:
 ● F-Response: Provides remote access to disks and volatile memory over the
    network.
 ● Network Miner: Gathers network traffic data for forensic analysis.
 ● Magnet AXIOM: Acquires and analyzes data from cloud services and remote
    endpoints.
Data Acquisition from Remote Systems
Challenges:
 ● Risk of data modification or loss during transfer.
 ● Legal considerations: Ensuring chain of custody across jurisdictions.
 ● Encryption or VPNs can slow down or block acquisition.
Best Practices:
 ● Use secure, encrypted channels (e.g., SSH, VPN) for transmission.
 ● Log all remote access sessions and actions performed.
 ● Document IP addresses, network configurations, and timestamps.
Data Acquisition from Remote Systems
Case Study 3: Breach of a cloud-based storage system.
●   Details: A company suspects that their cloud server has been compromised,
    leaking sensitive customer data. Physical access to the cloud infrastructure is
    not available.
●   Solution: Investigators use F-Response to remotely access the cloud server
    and acquire a forensic image. Logs and volatile data from the live cloud
    environment are also captured.
●   Outcome: Analysis of the remote acquisition reveals unauthorized access to
    the cloud server, including API calls that exposed customer data. This
    evidence helps the company patch vulnerabilities and support a legal case
    against the attackers.
Data Acquisition from RAID Servers
●   Definition: RAID (Redundant Array of Independent Disks) servers use
    multiple disks to enhance performance, reliability, or redundancy.
●   RAID Levels: Understanding striping (RAID 0), mirroring (RAID 1), parity
    (RAID 5/6) is critical for proper reassembly.
Tools:
 ● R-Studio: Forensics tool that reconstructs RAID configurations for imaging.
 ● X-Ways RAID Reconstructor: Rebuilds RAID arrays from individual disk
    images.
 ● ProDiscover: Supports RAID recovery and forensic imaging.
Data Acquisition from RAID Servers
Challenges:
 ● Must correctly identify RAID configuration before imaging.
 ● RAID arrays may fail partially, requiring reconstruction from degraded disks.
Best Practices:
 ● Identify RAID controller type and configuration (metadata).
 ● Image each disk in the array individually before reconstruction.
 ● Ensure proper assembly of the RAID array to maintain data accuracy.
Data Acquisition from RAID Servers
Case Study 4: Corporate server crash with RAID failure.
Data Acquisition from Encrypted Systems
Tools:
 ● Passware: Decrypts and acquires encrypted drives with known passwords or
    credentials.
 ● Elcomsoft Forensic Toolkit: Specialized in password recovery and
    encryption bypass.
 ● Hashcat: High-speed password cracking tool for encrypted data.
Data Acquisition from Encrypted Systems
Challenges:
 ● Accessing encryption keys, which may reside in memory (live acquisition).
 ● Cracking passwords can be time-consuming, depending on the strength of
   encryption.
Best Practices:
 ● For live systems, capture memory to extract encryption keys if possible.
 ● Document the encryption type and any decryption attempts thoroughly.
 ● Use legal authority to obtain decryption keys when possible.
Data Acquisition from Encrypted Systems
Case Study 5: Criminal investigation involving encrypted laptops.
Summary of Acquisition Approaches:

System Type        Tools                     Challenges                    Best Practices
Live Systems       FTK Imager, Volatility    Volatile data, potential      Capture volatile data
                                             system changes                first, use minimal
                                                                           footprint
Shutdown Systems   EnCase, dd, Forensic      Non-volatile data, risk if    Use write blockers,
                   Imager                    powered on incorrectly        verify hash values
●   Each system type requires specific tools and methods for effective data
    acquisition.
●   Real-world case studies demonstrate the importance of choosing the right
    approach:
     ○   Live Systems: Fast action is crucial to capture volatile data.
     ○   Shutdown Systems: Ensures data integrity by focusing on non-volatile storage.
     ○   Remote Systems: Critical for cloud and distributed environments.
     ○   RAID Servers: Involves complex recovery but essential in corporate settings.
     ○   Encrypted Systems: Encryption adds another layer of complexity but can be overcome with
         memory acquisition or brute-forcing techniques.
●   Proper preparation and technique ensure that valuable evidence is preserved
    and admissible in court.
Linux Validation Techniques
Checksum Tools:
 ● md5sum: Generates MD5 hash for comparison between source and acquired
   data.
 ● sha256sum: Creates a SHA-256 hash for stronger integrity checks.
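The same checks can be scripted with Python's hashlib, streaming the image in chunks so even large disk images hash in constant memory (the file name below is illustrative):

```python
import hashlib

def image_digests(path: str, chunk_size: int = 1 << 20) -> tuple[str, str]:
    """Stream a file through MD5 and SHA-256, like md5sum/sha256sum.

    Hash the source and the acquired image separately; any mismatch
    means one of them was altered.
    """
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):   # read 1 MiB at a time
            md5.update(block)
            sha256.update(block)
    return md5.hexdigest(), sha256.hexdigest()

# Usage (hypothetical image file):
# md5_hex, sha256_hex = image_digests("evidence.dd")
```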
Automated Tools:
 ● Guymager: Open-source tool, provides hash verification while imaging.
 ● AIR (Automated Image and Restore): Provides automated acquisition and
    validation.
Windows Validation Techniques
Checksum Tools:
 ● CertUtil: Windows command-line tool to generate hashes (MD5, SHA256).
 ● FCIV (File Checksum Integrity Verifier): Simple checksum tool for hash
   validation.
Third-Party Tools:
 ● FTK Imager: Industry-standard tool that includes hashing and validation
    during acquisition.
 ● X-Ways Forensics: Comprehensive tool for imaging and validation with
    hashing algorithms.
Importance
 ● Ensures evidence integrity: SOPs maintain the original state of digital
   evidence, preventing tampering or alteration.
 ● Reduces risk of contamination: Standardized procedures prevent
   accidental or intentional corruption of digital evidence.
 ● Provides legal defensibility: Following SOPs ensures the investigation
   process is reliable and admissible in court.
Digital Forensics SOPs
Goals of SOPs
 ● Maintain the chain of custody.
 ● Preserve the integrity of digital evidence.
 ● Follow legal and regulatory requirements.
 ● Ensure repeatability and reproducibility of results.
Best Practices:
 ● Use forensically sound methods.
 ● Avoid unnecessary handling.
 ● Label and store evidence securely.
SOP for Evidence Examination
Steps:
1. Verify integrity using hash comparisons.
2. Use designated forensic tools (e.g., EnCase, FTK).
3. Document each action performed on the evidence.
Best Practices:
 ● Always use verified, tested tools.
 ● Create working copies of data before examination.
 ● Maintain logs of tool versions and settings used.
SOP for Analysis and Reporting
Steps:
1. Analyze data based on the scope of the investigation.
2. Cross-validate results using different tools or methods.
3. Write clear, concise, and accurate reports.
Best Practices:
 ● Focus on factual findings without assumptions.
 ● Use visual aids (graphs, timelines) for clarity.
 ● Peer review the report before final submission.
Chain of Custody in Digital Forensics
Definition
The chronological documentation showing the seizure, custody, control, transfer,
analysis, and disposition of evidence.
Components
 ● Evidence identifier (serial number, case ID).
 ● Time and date stamps.
 ● List of all individuals handling the evidence.
 ● Documentation of any access to the evidence.
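A custody record built from the components above can be sketched as a simple append-only log in Python (field and case names are illustrative, not from any specific case-management system):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustodyEvent:
    """One entry in a chain-of-custody log."""
    evidence_id: str   # serial number / case ID
    handler: str       # individual taking custody
    action: str        # seized, transferred, analyzed, returned, ...
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Append-only log: every access or transfer adds a new dated entry.
chain: list[CustodyEvent] = []
chain.append(CustodyEvent("CASE-1138-HDD01", "Examiner A", "seized at scene"))
chain.append(CustodyEvent("CASE-1138-HDD01", "Examiner B", "received for imaging"))
```

The point of the structure is that entries are only ever added, never edited, so the log reads as an unbroken chronology from seizure to disposition.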
Legal Considerations in SOPs
Regulatory Requirements
 ● Compliance with local, national, and international laws.
 ● Privacy and data protection standards (e.g., GDPR).
Court Admissibility
 ● Evidence must be collected and handled according to SOPs for it to be
   admissible in court.
 ● Testimony of examiners relies on adherence to procedures.
Digital Forensics Standard Operating Procedures (SOPs)
Key Considerations
●   SOPs are critical for maintaining evidence integrity and legal defensibility.
●   Consistent application of SOPs ensures reliable forensic outcomes.
●   Adhering to Digital Forensics SOPs is not just a best practice but a necessity
    for lawful, credible investigations.
Software and Hardware Tools Used in Forensic
Analysis
Definition
 ● Digital forensic tools, both software and hardware, are used to collect,
    analyze, and preserve digital evidence while maintaining its integrity.
Categories of Tools
 ● Open Source: Freely available tools used for forensic investigations.
 ● Proprietary: Commercial tools with advanced features and support.
Importance
 ● Facilitates accurate evidence collection.
 ● Assists in analyzing complex datasets.
 ● Ensures legal compliance by following forensically sound methods.
Popular Open Source Tools in Digital Forensics
●  Autopsy: A GUI-based platform used for digital forensic analysis. Best for
   investigating hard drives and smartphones.
 ● Sleuth Kit (TSK): Command-line tools for recovering deleted files and
   partitions.
 ● Volatility: Advanced memory forensics tool used for analyzing RAM.
 ● Wireshark: Network packet analyzer for capturing and inspecting network
   traffic.
Advantages
 ● Free and customizable.
 ● Community-driven development and support.
 ● Useful for smaller organizations or specific forensic tasks.
Popular Proprietary Tools in Digital Forensics
●  EnCase: One of the most widely used forensic tools. It supports disk imaging,
   data recovery, and analysis.
 ● FTK (Forensic Toolkit): A comprehensive suite for disk imaging, email
   analysis, and registry analysis.
 ● X-Ways Forensics: Lightweight but powerful forensic software for advanced
   data recovery and analysis.
 ● Magnet AXIOM: A tool for deep analysis of mobile devices, cloud storage,
   and social media.
Advantages
 ● Advanced features for large-scale investigations.
 ● Official customer support and regular updates.
 ● Integration with other commercial forensic hardware.
Hardware Tools in Digital Forensics
●   Write Blockers: Prevent modification of data on digital storage devices while
    allowing read access for analysis (e.g., Tableau Write Blocker).
●   Forensic Duplicators/Imagers: Tools that create bit-for-bit copies of digital
    storage media for analysis (e.g., Logicube Falcon, Image MASSter).
●   Portable Forensic Workstations: Rugged laptops or workstations equipped
    with forensic software for use in the field (e.g., Forensic Laptop Workstations
    from Digital Intelligence).
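Conceptually, a duplicator's copy-and-verify cycle looks like the Python sketch below, with file paths standing in for block devices; real hardware reads the physical media behind a write blocker and computes the hashes in hardware.

```python
import hashlib

def acquire_image(source: str, dest: str, chunk_size: int = 1 << 20) -> str:
    """Copy a source bit for bit while hashing, then re-read the copy
    to confirm it matches. Returns the verified SHA-256 hex digest."""
    src_hash = hashlib.sha256()
    with open(source, "rb") as s, open(dest, "wb") as d:
        while block := s.read(chunk_size):
            src_hash.update(block)   # hash the stream as it is copied
            d.write(block)
    dst_hash = hashlib.sha256()
    with open(dest, "rb") as f:      # independent re-read of the copy
        while block := f.read(chunk_size):
            dst_hash.update(block)
    if src_hash.hexdigest() != dst_hash.hexdigest():
        raise ValueError("acquired image does not match source")
    return src_hash.hexdigest()
```

The verified digest is what gets recorded in the acquisition notes, so any later examination can prove the working copy still matches what was seized.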
Advantages
 ● Physical preservation of evidence.
 ● Support for high-speed data transfer and analysis.
 ● Rugged designs for fieldwork.
Comparison: Open Source vs Proprietary Tools
 Feature         Open Source                              Proprietary
 Cost            Free                                     Commercial license
 Support         Community-driven                         Official support, regular updates
 Features        Suited to smaller or specific tasks      Advanced, large-scale capabilities
Key Consideration:
●   Open source tools are ideal for budget-conscious cases or smaller tasks,
    while proprietary tools are necessary for large, complex investigations that
    require advanced features and official support.
Integration of Open Source and Proprietary Tools
Hybrid Approach
 ● Many forensic investigators use a combination of open-source and proprietary
   tools to balance cost, functionality, and the complexity of the investigation.
Example Workflow
 ● Use Sleuth Kit for initial file system recovery and follow up with FTK for
   deeper registry analysis and email extraction.
Cost Efficiency
 ● A hybrid approach allows smaller organizations to handle large cases
   effectively without fully relying on expensive proprietary tools.
Choosing the Right Tool for the Investigation
Criteria to Consider
 ● Budget: Can the organization afford the proprietary tool?
 ● Investigation Scope: How complex is the investigation? Does it involve
     advanced data like cloud or mobile forensics?
 ● Legal Requirements: Are there specific legal frameworks or certifications
     that require certain tools?
 ● Team Expertise: Do investigators have the skill set to use open-source tools
     effectively?
Key Considerations
●   Selecting the right forensic tools, whether open-source or proprietary,
    depends on the specific needs and resources of the investigation.
●   Open source tools are valuable for basic forensic tasks and smaller
    organizations, while proprietary tools provide advanced features and
    professional support for complex cases.
●   A combination of both types of tools can optimize both cost and efficiency in
    digital forensic investigations.
Challenges in Cyber-Crime Investigation
Anonymity of Cyber Criminals:
 ● Criminals can hide their identity using VPNs, Tor networks, and encryption,
    making it difficult to trace their actions.
Global Jurisdictional or Legal Issues:
 ● Cybercrime often crosses international borders, complicating legal processes,
    cooperation between countries, and evidence collection.
Data Volume and Complexity:
 ● Investigators must deal with huge amounts of data, making it challenging to
    analyze and extract relevant information quickly.
Rapid Evolution of Technology:
 ● New technologies and attack techniques, like ransomware or zero-day
    exploits, make it difficult for forensic tools and methods to keep pace.
Challenges in Digital Forensics
Encryption:
 ● Widespread use of encryption on digital devices makes it difficult to access
     evidence without the appropriate decryption keys.
Anti-forensic Techniques:
 ● Cybercriminals use methods like data obfuscation, steganography, or wiping tools
     to make forensic analysis more difficult.
Volatile Evidence:
 ● Digital evidence can be fragile and easily altered or deleted, especially in live
     systems (e.g., RAM data).
Diverse Platforms and Devices:
 ● Investigations span across multiple device types (computers, mobile phones, IoT
     devices) and operating systems (Windows, Linux, Android, etc.), each requiring
     specific tools and expertise.
Thank You