noc20 cs02 lec01 Introduction
This lecture introduces the foundations of cryptography, outlining its importance in data security
in the digital age, the goals of cryptography, and its advanced applications such as
cryptocurrencies, secure communication, and more. It emphasizes the need for a strong
foundation in cryptography for aspiring information security experts, highlights the course
objectives and expected outcomes, and introduces various cryptographic primitives and their
properties.
Key Points:
Importance of Cryptography
Cryptography is crucial for ensuring data security in an increasingly digital world. It serves as a
defense against unauthorized access and is indispensable for secure communication, especially
amidst rising cyber threats.
Course Objectives
The course will cover modern cryptography's paradigms and principles, focusing on formal
security definitions and constructions of cryptographic primitives like encryption, hash functions,
and digital signatures.
Three Stage Process
Each cryptographic primitive will be developed through a three-stage process: defining formal
security requirements, designing algorithmic constructions, and providing rigorous proofs of their
security.
Secure Communication Properties
Cryptography aims to establish secure communications characterized by three core properties:
privacy (ensuring only the sender and receiver learn the contents of the communication), authenticity (verifying the sender's identity), and integrity (confirming the message contents have not been tampered with in transit).
Advanced Applications of Cryptography
The lecture explores advanced cryptographic applications, including cryptocurrency (solving the
double spending problem), secret sharing to prevent single points of failure, zero-knowledge
proofs that allow for validation without revealing underlying secrets, outsourcing computation
safely, distributed consensus among untrusting parties, and secure multi-party computation
(MPC) which simulates a trusted third party's role among mutually distrusting entities.
noc20 cs02 lec02 Symmetric key Encryption
This lecture provides an in-depth discussion of symmetric key encryption, its principles, and its
security considerations. It covers core problems of cryptography such as key agreement and
secure communication, before detailing the structure and function of symmetric encryption,
including the definitions of various attack models, and concludes with the significance of
Kerckhoffs' principle.
Key Points:
Core Problems of Cryptography
The lecture introduces the two central problems addressed by cryptography: key agreement and
secure communication. Key agreement involves two parties who exchange information to
compute a shared secret key over a public channel without prior knowledge, ensuring that
eavesdroppers are unable to ascertain the key. Secure communication ensures confidentiality
and integrity of the messages exchanged using the agreed-upon key.
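As a concrete illustration of key agreement, here is a minimal sketch of the classic Diffie-Hellman protocol, which instantiates exactly this setting (the lecture only motivates the problem; the toy parameters below are assumptions for illustration, far too small for real use):

    import secrets

    # Public parameters: a prime p and a generator g (toy values; real systems
    # use groups of 2048+ bits).
    p, g = 23, 5

    # Each party samples a private exponent and publishes g^x mod p.
    a = secrets.randbelow(p - 2) + 1       # Alice's secret
    b = secrets.randbelow(p - 2) + 1       # Bob's secret
    A, B = pow(g, a, p), pow(g, b, p)      # values sent over the public channel

    # Both sides derive the same shared key g^(ab) mod p; an eavesdropper
    # sees only p, g, A, B.
    assert pow(B, a, p) == pow(A, b, p)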
Types of Cryptographic Primitives
There are two primary types of cryptographic primitives: symmetric key (or private key) and
asymmetric key (or public key) encryption schemes. In symmetric encryption, both sender and
receiver use the same secret key, while in asymmetric encryption, one key is public, and the
other is private. Each approach has trade-offs: symmetric schemes are more efficient but require the parties to already share a secret key, while asymmetric schemes avoid pre-shared secrets at a higher computational cost.
Symmetric Key Encryption Mechanism
The symmetric encryption process involves three main algorithms: key generation (producing a
secure key), encryption (turning plaintext into ciphertext using the key), and decryption (turning
ciphertext back to plaintext using the same key). The security of this process hinges on the
secrecy of the key, not the encryption algorithm itself.
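A minimal sketch of this three-algorithm interface, with bytewise XOR (a one-time-pad-style placeholder, not any particular scheme from the course) standing in for the encryption logic:

    import secrets

    def gen(n: int) -> bytes:
        """Key generation: sample a uniformly random n-byte key."""
        return secrets.token_bytes(n)

    def enc(key: bytes, plaintext: bytes) -> bytes:
        """Encryption: mask the plaintext with the key (bytewise XOR)."""
        return bytes(m ^ k for m, k in zip(plaintext, key))

    def dec(key: bytes, ciphertext: bytes) -> bytes:
        """Decryption: XOR is self-inverse, so the same operation recovers the message."""
        return bytes(c ^ k for c, k in zip(ciphertext, key))

    msg = b"attack at dawn"
    key = gen(len(msg))
    assert dec(key, enc(key, msg)) == msg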
Attack Models in Cryptography
The lecture explains four major attack models in cryptography: Ciphertext Only Attack (COA),
Known Plaintext Attack (KPA), Chosen Plaintext Attack (CPA), and Chosen Ciphertext Attack
(CCA). Each model captures progressively stronger attacker capabilities, from merely observing ciphertexts to obtaining encryptions and decryptions of messages of the attacker's choice.
Kerckhoffs’ Principle
This principle asserts that a cryptographic system should remain secure even if everything
about it, other than the key, is made public. This emphasizes that the key must be kept secret
while allowing broad awareness of the algorithms used. This approach facilitates better
algorithm scrutiny and security assurances, minimizing reliance on proprietary or obscure
methods.
noc20 cs02 lec03 Historical Ciphers and their Cryptanalysis
This lecture covers historical ciphers such as the shift cipher, monoalphabetic substitution
cipher, and polyalphabetic substitution cipher, along with their cryptanalysis. It discusses how
these ciphers were easily broken and highlights lessons learned that can inform the
development of more secure modern cryptography.
Key Points:
Shift Cipher
The shift cipher is a symmetric encryption scheme in which each letter is shifted forward by a fixed amount k. The method is simple yet falls immediately to a brute-force attack, as there are only 26 possible keys. This illustrates the importance of a sufficiently large key space for security.
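A minimal sketch of the cipher and the brute-force attack it invites (illustrative Python, not the lecture's notation):

    def shift_encrypt(msg: str, k: int) -> str:
        """Shift each letter of an uppercase message forward by k positions."""
        return "".join(chr((ord(c) - ord("A") + k) % 26 + ord("A")) for c in msg)

    def brute_force(ct: str):
        """Try all 26 keys; a reader (or a dictionary check) spots the real plaintext."""
        return [(k, shift_encrypt(ct, -k)) for k in range(26)]

    ct = shift_encrypt("HELLOWORLD", 3)   # "KHOORZRUOG"
    candidates = brute_force(ct)          # key 3 yields the readable candidate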
Monoalphabetic Substitution Cipher
This cipher improves on the shift cipher by using a one-to-one mapping of plaintext characters
to ciphertext characters via a secret permutation. However, it remains vulnerable to frequency
analysis attacks due to the predictable nature of letter frequencies in natural languages.
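A sketch of the frequency-analysis idea, tallying ciphertext letters so their ranking can be matched against known English letter frequencies (E, T, A, O, ... are the most common):

    from collections import Counter

    def frequency_ranking(ct: str) -> list:
        """Rank ciphertext letters from most to least frequent. In long English
        texts, the top ciphertext letter very likely decrypts to 'E', and so on."""
        counts = Counter(c for c in ct.upper() if c.isalpha())
        return [letter for letter, _ in counts.most_common()]

    # Aligning this ranking with "ETAOINSHRDLU..." recovers much of the secret
    # permutation, despite its 26! (about 4 * 10^26) possible keys.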
Polyalphabetic Substitution Cipher
The Vigenère cipher addresses the weaknesses of its predecessors by interleaving several shift ciphers: the shift applied at each position is determined by the corresponding letter of a repeating keyword. While this blurs the direct relationship between plaintext and ciphertext characters, frequency analysis can still be applied to break it once the key length is determined.
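A sketch of Vigenère encryption, where position i of the message is shifted by letter i mod t of a length-t keyword:

    def vigenere_encrypt(msg: str, key: str) -> str:
        """Encrypt an uppercase message with a repeating keyword of shifts."""
        return "".join(
            chr((ord(c) - 65 + ord(key[i % len(key)]) - 65) % 26 + 65)
            for i, c in enumerate(msg)
        )

    print(vigenere_encrypt("ATTACKATDAWN", "LEMON"))  # "LXFOPVEFRNHR"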
Lessons Learned
The lecture concludes with several lessons, including the necessity for ciphers to have a large
key space to resist brute force attacks, the ease with which historical ciphers could be broken,
and the distinction between classical and modern cryptography. Modern cryptography is based
on scientific principles and formal security definitions.
Modern Cryptography Principles
Modern cryptography adheres to principles such as formal security definitions, transparent
assumptions in security constructions, and proof of construction validity. These principles aim to
ensure that cryptographic systems are systematically verified and secured against various
attack models.
noc20 cs02 lec04 Perfect Security
This lecture covers the concept of perfect secrecy as defined by Claude Shannon, emphasizing
its mathematical foundations and implications. Perfect security ensures that observing a
ciphertext reveals no additional information about the plaintext to an adversary, even if they
have unlimited computational resources. The discussion includes equivalent definitions of
perfect secrecy and how it can be mathematically formulated using probability distributions over
plaintext, ciphertext, and keys.
Key Points:
Definition of Perfect Security
Perfect security, introduced by Claude Shannon, is described as a condition where knowledge
of the ciphertext does not provide any additional information about the plaintext. This holds even
when the adversary possesses computationally unbounded power.
Probability Distributions
The lecture explores three key probability distributions: over the plaintext space (which models
prior information the adversary has), the key space (the random keys generated), and the
ciphertext space (reflecting how plaintext and keys combine to produce ciphertext).
Understanding these distributions is crucial for demonstrating perfect secrecy.
Mathematical Formalization
Perfect security is formally defined by an equality in conditional probabilities that ensures the
adversary’s knowledge remains unchanged before and after seeing the ciphertext, reflecting the
independence of the ciphertext distribution from the underlying plaintext.
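In symbols, writing M, K, and C for the random variables over plaintexts, keys, and ciphertexts, perfect secrecy requires, for every message m and every ciphertext c with \Pr[C = c] > 0:

    \Pr[M = m \mid C = c] \;=\; \Pr[M = m]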
Alternate Definitions
Three equivalent definitions of perfect secrecy are discussed: Shannon's original definition, a
condition based on independence of ciphertext from plaintext, and a game-based definition
evaluating indistinguishability of messages based on ciphertext analysis.
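The independence-based formulation states that every message induces the same ciphertext distribution: for all messages m_0, m_1 and every ciphertext c,

    \Pr_{k \leftarrow \mathrm{Gen}}[\mathrm{Enc}_k(m_0) = c] \;=\; \Pr_{k \leftarrow \mathrm{Gen}}[\mathrm{Enc}_k(m_1) = c],

while the game-based formulation requires that every adversary, even an unbounded one, identifies which of two chosen messages was encrypted with probability exactly 1/2.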
Example: Vigenère Cipher
The lecture concludes with an example using the Vigenère cipher to illustrate a scheme that fails to achieve perfect secrecy. The presenter demonstrates an attack in which an adversary can distinguish between the encryptions of two messages, so the cipher does not meet the criteria for perfect secrecy.
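One standard version of such an attack, sketched here under the simplifying assumption of a one-letter key (the lecture's exact parameters may differ): equal plaintext letters encrypt to equal ciphertext letters, which lets the adversary distinguish m0 = "AA" from m1 = "AB" with certainty.

    import random, string

    def attack_trial() -> bool:
        """One run of the distinguishing game against Vigenere with a 1-letter key."""
        m = random.choice(["AA", "AB"])                    # challenger picks a message
        k = random.choice(string.ascii_uppercase)          # and a random 1-letter key
        c = "".join(chr((ord(x) - 65 + ord(k) - 65) % 26 + 65) for x in m)
        guess = "AA" if c[0] == c[1] else "AB"             # repeated letters give it away
        return guess == m

    # The adversary wins every trial, far above the 1/2 that perfect secrecy allows.
    assert all(attack_trial() for _ in range(1000))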
noc20 cs02 lec05 Limitations of Perfect Security
This lecture discusses the limitations of perfectly secure encryption, focusing on the one-time pad, or Vernam cipher. It explains the properties, encryption, and decryption processes of the one-time pad, proves its perfect secrecy, and highlights the impractical constraints it imposes: keys as long as the message and a fresh key for every message. The lecture concludes that these limitations are inherent to any perfectly secure cipher, not just the one-time pad.
Key Points:
One-Time Pad Encryption Process
The one-time pad, also known as the Vernam cipher, encrypts by combining the plaintext with a uniformly random key using a bitwise XOR operation. The plaintext and the key must be of the same length, so that each bit of the plaintext is masked by the corresponding bit of the key. Decryption applies the same XOR operation to the ciphertext to recover the original message.
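Correctness follows from XOR being self-inverse:

    c = m \oplus k, \qquad \mathrm{Dec}_k(c) = c \oplus k = m \oplus (k \oplus k) = m.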
Proof of Perfect Security
The lecture presents a proof that the one-time pad achieves perfect secrecy, meaning that the ciphertext reveals no information about the plaintext. For any two candidate plaintexts, each is equally likely to produce any given ciphertext, so observing the ciphertext gives an adversary no basis to prefer one candidate over the other.
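The core calculation: for n-bit strings, exactly one key maps a given message to a given ciphertext, so

    \Pr[C = c \mid M = m] \;=\; \Pr[K = m \oplus c] \;=\; 2^{-n} \quad \text{for all } m, c \in \{0,1\}^n,

making the ciphertext distribution identical for every message.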
Limitations of the One-Time Pad
Despite its security, the one-time pad has significant limitations: the key must be at least as long as the plaintext, which is impractical for large files, and the same key cannot be reused to encrypt different messages. Each encryption must use a fresh, uniformly random key to maintain security.
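A sketch of why key reuse is fatal (the classic two-time-pad observation, assuming the same pad encrypts two messages): XORing the two ciphertexts cancels the key and exposes the XOR of the plaintexts.

    import secrets

    m1, m2 = b"attack at dawn", b"attack at dusk"
    key = secrets.token_bytes(len(m1))

    c1 = bytes(m ^ k for m, k in zip(m1, key))
    c2 = bytes(m ^ k for m, k in zip(m2, key))

    # The key cancels: c1 XOR c2 equals m1 XOR m2, leaking plaintext structure
    # (here, that the two messages agree everywhere except the last three bytes).
    assert bytes(x ^ y for x, y in zip(c1, c2)) == bytes(x ^ y for x, y in zip(m1, m2))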
Inherent Limitations of Perfectly Secure Ciphers
The lecture argues that the constraints on key size and key reuse are not unique to the one-time pad but are inherent to all perfectly secure encryption schemes. Perfect secrecy requires the key space to be at least as large as the message space, and it likewise forces a fresh key for each encryption instance.
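This is Shannon's bound on key size:

    \text{perfect secrecy} \;\Longrightarrow\; |\mathcal{K}| \geq |\mathcal{M}|,

since with fewer keys than messages, some message could never decrypt from an observed ciphertext, and ruling it out would itself leak information.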
noc20 cs02 lec06 Introduction to Computational Security
This lecture covers the principles of modern cryptography, focusing on the birth of
computational security and the necessary trade-offs associated with it. It outlines how modern
cryptography differs from the concept of perfect secrecy by relaxing assumptions about adversaries, and how this allows short keys to be reused across multiple messages while retaining practical security. The mathematical notions of efficient algorithms and negligible probability are introduced as key components of this approach.
Key Points:
Transition from Perfect Secrecy to Computational Security
The lecture discusses how perfect secrecy requires a key as long as the message and prohibits key reuse, making it impractical. Modern cryptography instead aims for computational security, where adversaries are computationally bounded; this permits key reuse while accepting that an adversary might learn something about the plaintext with some negligible probability.
Relaxation of Assumptions about Adversaries
In modern cryptography, the adversary is assumed to have limited computational resources, contrasting with the perfect-secrecy model, where the adversary was unbounded. This change accommodates practical encryption schemes that reuse keys while keeping the probability of a security breach exceedingly small.
Definitions of Efficient Algorithms and Negligible Probability
Efficient algorithms are defined as those whose running time is polynomial in a security parameter. A negligible probability is one that vanishes faster than the inverse of every polynomial as the security parameter grows, giving a practically acceptable measure of residual risk.
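Formally, a function \mu is negligible if for every constant c > 0 there is a threshold N beyond which \mu stays below n^{-c}:

    \forall c > 0 \;\; \exists N \;\; \forall n > N: \quad \mu(n) < n^{-c}.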
The Importance of Security Parameters
The choice of security parameter size (often the key size) is crucial in deploying cryptographic
schemes. Increasing the parameter enhances security but also raises the computational
requirements of legitimate operations, necessitating a careful balance.
Consequences of Relaxations in Security Models
Accepting a negligibly small probability of the adversary learning something, together with restricting attention to efficient adversaries, accounts for unavoidable attacks such as brute-force key search and key guessing: the former requires super-polynomial time, and the latter succeeds only with negligible probability, so neither counts as a break.
noc20 cs02 lec07 Semantic Security
This lecture discusses semantic security in the context of the ciphertext only attack model. It
defines semantic security, explores its formalization, and presents an equivalent definition
based on indistinguishability. The lecture highlights the challenges of proving semantic security,
introduces reduction-based proofs, and emphasizes the equivalence of the two definitions of
security. It also illustrates the implications of indistinguishability on an adversary's ability to
glean information about an encrypted plaintext. Key examples and explanations are provided
throughout the lecture.
Key Points:
Definition of Semantic Security
Semantic security ensures that a ciphertext reveals no additional information about the underlying plaintext, even if the attacker has prior knowledge related to it. The lecture gives the formal definition of semantic security under a ciphertext-only attack model, emphasizing that the adversary is required to be computationally bounded.
Indistinguishability Game
The lecture presents the indistinguishability game as an equivalent way to define semantic security: when the sender encrypts one of two known messages, the adversary should be unable to tell which one was encrypted with probability significantly better than random guessing. This formulation is considerably easier to work with in security proofs.
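A minimal sketch of one run of the game, with placeholder gen/enc components and a blind-guessing adversary (all hypothetical names, for illustration only):

    import secrets

    def ind_experiment(gen, enc, adversary, m0: bytes, m1: bytes) -> bool:
        """Encrypt a randomly chosen one of (m0, m1) and ask the adversary
        which it was; return True if the adversary guesses correctly."""
        b = secrets.randbelow(2)
        key = gen()
        challenge = enc(key, m0 if b == 0 else m1)
        return adversary(m0, m1, challenge) == b

    # Placeholder components: a random 16-byte key, XOR encryption, blind guessing.
    gen = lambda: secrets.token_bytes(16)
    enc = lambda k, m: bytes(x ^ y for x, y in zip(m, k))
    guesser = lambda m0, m1, c: secrets.randbelow(2)

    # Security demands every efficient adversary win with probability at most
    # 1/2 + negligible; the blind guesser achieves exactly 1/2 on average.
    wins = sum(ind_experiment(gen, enc, guesser, b"hi", b"yo") for _ in range(10_000))
    print(wins / 10_000)   # close to 0.5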
Reduction-based Proofs in Cryptography
Reduction-based proofs are central to cryptographic security. The lecture details how a system's security can be established by showing that any successful adversary against it could be transformed into an algorithm for a problem assumed to be hard; this technique is also what connects the two definitions of security.
Equivalence of Definitions
The two definitions of semantic security, one based on computing additional information and the
other based on the indistinguishability of ciphertexts, are shown to be equivalent. By
demonstrating how each condition can lead to the other, the lecture establishes a robust
framework for understanding security in cryptographic systems.
Implications of Indistinguishability
If an encryption scheme is indistinguishable, an adversary cannot compute any individual bit of the plaintext with probability significantly better than guessing. The lecture shows how this conclusion follows via a reduction-based argument, reinforcing the central role of indistinguishability in cryptographic security.
noc20 cs02 lec08 Pseudo random Generators PRGs
This lecture discusses the concept of pseudorandom generators (PRGs) and their role in
encrypting long messages using short keys. It begins by reviewing the one-time pad encryption
method, then introduces PRGs as a way around the need for keys as long as the message. The speaker details the definition and properties of PRGs, including the security requirement that their outputs be indistinguishable from truly random strings by any polynomially bounded distinguisher. The lecture concludes with the conditions that define a secure PRG and an example of a construction that fails to meet them.
Key Points:
Introduction to PRGs
The lecture starts with an introduction to pseudorandom generators as a method to encrypt long
messages using shorter keys. This section emphasizes the limitations of conventional one-time
pad encryption in terms of key size.
One-time Pad Review
The one-time pad encryption scheme is reviewed, focusing on how it uses uniformly random
keys of the same length as the message to ensure perfect secrecy, providing the strongest level
of data confidentiality against eavesdroppers.
Modification for PRGs
The modification to the one-time pad is introduced: instead of a key the same length as the message, a short, uniformly random seed is used. Encryption now XORs the message with the output bits of the pseudorandom generator G applied to that seed.
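A sketch of the modified scheme, with SHAKE-256 from Python's hashlib standing in as a plausible length-extending function G (an illustrative assumption; the lecture treats G abstractly and endorses no specific function):

    import hashlib, secrets

    def G(seed: bytes, out_len: int) -> bytes:
        """Stretch a short seed to out_len bytes (SHAKE-256 as a stand-in for G)."""
        return hashlib.shake_256(seed).digest(out_len)

    def enc(seed: bytes, msg: bytes) -> bytes:
        """One-time-pad-style encryption using the PRG output as the pad."""
        return bytes(m ^ p for m, p in zip(msg, G(seed, len(msg))))

    seed = secrets.token_bytes(16)            # a 16-byte key, independent of message length
    msg = b"a message far longer than the key" * 8
    assert enc(seed, enc(seed, msg)) == msg   # XOR decryption mirrors encryption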
Pseudorandom Generator Function G
The function G is defined as taking a short seed and producing a strictly longer pseudorandom output. It must be efficiently computable, and its output length must exceed its input length; this expansion is what allows a short key to encrypt a long message.
Indistinguishability Requirement
A key requirement for a pseudorandom generator is that its output be indistinguishable from a truly random string of the same length. This is formalized through distinguishing experiments that test whether any computationally bounded adversary can tell which of the two sources a given string came from.
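In symbols, G with stretch \ell(n) is a secure PRG if every probabilistic polynomial-time distinguisher D satisfies

    \Bigl| \Pr_{s \leftarrow \{0,1\}^n}[D(G(s)) = 1] \;-\; \Pr_{r \leftarrow \{0,1\}^{\ell(n)}}[D(r) = 1] \Bigr| \leq \mathrm{negl}(n).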
Next-bit Prediction Experiment
The next-bit prediction experiment gives an equivalent characterization of PRG security: no efficient adversary, given any prefix of the PRG's output, should be able to predict the next output bit with probability noticeably better than one half.
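In symbols, for every PPT predictor A, every position i, and y = G(s) with s uniform,

    \Pr[A(y_1 \cdots y_i) = y_{i+1}] \;\leq\; \tfrac{1}{2} + \mathrm{negl}(n).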
Examples and Failure Cases
An example is given of a proposed pseudorandom generator that fails to meet the requirements
for being considered a secure PRG, demonstrating how an efficient statistical test can
distinguish its output from truly random strings.
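A standard textbook failure case of this kind (the lecture's concrete example may differ): let G copy its seed and append the parity of the seed's bits. The output is one bit longer than the input, but the statistical test "is the last bit the XOR of the others?" accepts every output of G while accepting a truly random string only half the time.

    import secrets

    def bad_G(seed: str) -> str:
        """Candidate PRG: output the seed followed by the XOR of its bits."""
        return seed + str(sum(map(int, seed)) % 2)

    def test(y: str) -> bool:
        """Distinguisher: does the last bit equal the parity of the rest?"""
        return int(y[-1]) == sum(map(int, y[:-1])) % 2

    seed = "".join(str(secrets.randbelow(2)) for _ in range(16))
    assert test(bad_G(seed))   # passes on every pseudorandom output
    # On uniform 17-bit strings the test passes with probability exactly 1/2,
    # so the distinguishing advantage is 1/2: far from negligible.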
Conclusion and Future Lectures
The lecture concludes with a summary of the importance of good pseudorandom generator
design in cryptography and indicates topics for future lectures related to computational security.
noc20 cs02 lec09 Operations on Pseudorandom Generators
The lecture discusses operations on pseudorandom generators (PRGs), focusing on composing
multiple instances of a secure PRG to create a new one. The process of parallel composition is
explained, where multiple independent executions of an existing PRG are combined to expand
input and output sizes. The lecture introduces a proof strategy called a hybrid argument to
demonstrate that if the original PRG is secure, then the composed PRG remains secure as well.
Key Points:
Introduction to PRGs
The lecture begins with an overview of pseudorandom generators (PRGs) and their importance in cryptography, then turns to the operation of composing PRGs to expand their input and output lengths.
Parallel Composition of PRGs
The lecturer explains how to perform parallel composition on a secure PRG to increase both its input size and its output size. By running the PRG independently on disjoint blocks of a longer seed and concatenating the results, a new PRG with a larger stretch is obtained.
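A sketch of the construction, assuming some secure base PRG prg over n-byte seeds (the names here are illustrative, not the lecture's notation):

    from typing import Callable

    def parallel_compose(prg: Callable[[bytes], bytes], n: int, t: int):
        """From a PRG on n-byte seeds, build one on t*n-byte seeds by running
        the base PRG independently on each block and concatenating outputs."""
        def composed(seed: bytes) -> bytes:
            assert len(seed) == t * n
            blocks = (seed[i * n:(i + 1) * n] for i in range(t))
            return b"".join(prg(block) for block in blocks)
        return composed

    # Illustrative wiring with a stand-in base PRG that doubles its input length:
    import hashlib
    base = lambda s: hashlib.shake_256(s).digest(2 * len(s))
    G2 = parallel_compose(base, n=16, t=4)     # maps 64-byte seeds to 128-byte outputs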
Hybrid Argument Proof Strategy
A new proof strategy, the hybrid argument, is presented to show that the PRG created through composition retains security. It defines a sequence of intermediate experiments (hybrids) between the all-pseudorandom and all-random extremes and shows that each adjacent pair is indistinguishable to any polynomial-time distinguisher.
Indistinguishability in Experiments
Two experiments (H0 and H1) are set up that challenge a distinguisher to tell outputs of the newly composed PRG apart from uniformly random strings, establishing the conditions under which the two distributions should be considered indistinguishable.
General Case of PRG Composition
The lecture extends the discussion to the general case of composing the original PRG any polynomial number of times. By introducing polynomially many intermediate hybrid experiments, it is shown that the final composed PRG remains secure.
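The bound behind the argument is a telescoping sum over the t hybrids, each adjacent pair differing in a single PRG invocation:

    \bigl| \Pr[D(H_0) = 1] - \Pr[D(H_t) = 1] \bigr| \;\leq\; \sum_{i=1}^{t} \bigl| \Pr[D(H_{i-1}) = 1] - \Pr[D(H_i) = 1] \bigr| \;\leq\; t \cdot \mathrm{negl}(n),

which is still negligible because t is polynomial in n.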
Reversal of PRG Output
The lecturer concludes the discussion by exploring the idea of reversing the output of a secure
PRG to construct another secure PRG, asserting that this transformation preserves the
essential properties of pseudorandomness.
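A sketch of the reduction behind this claim (illustrative names): any distinguisher against the reversed generator is converted into one against the original by reversing its input first, so the reversed generator inherits security.

    def reverse_prg(prg):
        """G'(s) = reverse(G(s)); a secure PRG whenever prg is."""
        return lambda seed: prg(seed)[::-1]

    def reduce_distinguisher(d_prime):
        """Turn a distinguisher against G' into one against G: reverse, then ask."""
        return lambda y: d_prime(y[::-1])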
Summary of Key Concepts
In the conclusion, the lecturer summarizes the key takeaways regarding pseudorandom
generators, their definitions, and their ability to produce outputs that closely mimic truly random
outputs, ultimately affirming the importance of their secure composition.