176 results sorted by ID
Simultaneous Diophantine Approximation for Compact Discrete Gaussian Sampling
Ke Ma, Jiabo Wang, Shanxiang Lyu, Junzuo Lai, Zsolt Lángi
Implementation
Discrete Gaussian Sampling (DGS) over the integers—also known as integer Gaussian sampling—is used to generate integer values that statistically follow the discrete Gaussian distribution and plays a central role in lattice-based cryptography. Among existing approaches for integer DGS, the cumulative distribution table (CDT) method is widely adopted. However, CDT sampling typically incurs substantial storage costs due to the need to store high-precision fixed-point probability tables, where...
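As an illustration of the CDT approach described above (a minimal sketch, not code from the paper; the width, tail cut, and fixed-point precision are assumed toy values), the table is precomputed once and each sample is a binary search against a uniform fixed-point value:

```python
import bisect
import math
import random

def build_cdt(sigma, tail=10, precision=2**53):
    """Precompute a fixed-point cumulative distribution table for a centered
    discrete Gaussian of standard deviation sigma, tail-cut at tail*sigma."""
    bound = int(math.ceil(tail * sigma))
    support = list(range(-bound, bound + 1))
    weights = [math.exp(-x * x / (2.0 * sigma * sigma)) for x in support]
    total = sum(weights)
    cdt, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdt.append(int(acc * precision))   # one high-precision entry per support point
    return support, cdt

def cdt_sample(support, cdt, precision=2**53):
    """Draw one sample: binary-search a uniform fixed-point value in the table."""
    u = random.randrange(precision)
    idx = bisect.bisect_right(cdt, u)
    return support[min(idx, len(support) - 1)]

support, cdt = build_cdt(sigma=3.2)
print([cdt_sample(support, cdt) for _ in range(10)])
```

The table stores one high-precision entry per support point, which is exactly the storage cost the abstract refers to.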
Unconditionally secure encryption algorithm with unified confidentiality and integrity
Zhen-Hu Ning
Foundations
One-Time Pad (OTP), introduced by Shannon, is well-known as an unconditionally secure encryption algorithm and has become the cornerstone of modern cryptography. However, the unconditional security of OTP applies solely to confidentiality and does not extend to integrity. Hash functions such as SHA2, SHA3 or SM3 apply only to integrity, not to confidentiality, and also cannot achieve unconditional security. Encryption and digital signatures based on asymmetric cryptography can provide...
Cryptography meets worst-case complexity: Optimal security and more from iO and worst-case assumptions
Rahul Ilango, Alex Lombardi
Foundations
We study several problems in the intersection of cryptography and complexity theory based on the following high-level thesis.
1) Obfuscation can serve as a general-purpose worst-case to average-case reduction, reducing the existence of various forms of cryptography to corresponding worst-case assumptions.
2) We can therefore hope to overcome barriers in cryptography and average-case complexity by (i) making worst-case hardness assumptions beyond $\mathsf{P}\neq \mathsf{NP}$, and...
A Critique on Average-Case Noise Analysis in RLWE-Based Homomorphic Encryption
Mingyu Gao, Hongren Zheng
Public-key cryptography
Homomorphic encryption schemes based on the Ring-Learning-with-Errors problem require accurate ciphertext noise analysis to ensure correctness and security. However, ring multiplications during homomorphic computations make the noise in the result ciphertexts difficult to characterize. Existing average-case noise analyses derive a bound on the noise by either assuming it follows a Gaussian distribution, or giving empirical formulae, with strong independence assumption and the Central Limit...
Side-Channel Analysis Revisited and Evaluated
Jiangshan Long, Changhai Ou, Yukun Cheng, Kexin Qiao, Wei Cheng, Fan Zhang
Attacks and cryptanalysis
Side-channel analysis recovers a secret by exploiting the key-dependent characteristics of the leakages. Practical techniques, such as Distance-of-Means analysis (DoM), Kolmogorov-Smirnov analysis (KSA) and Cramér-von Mises analysis (CvMA), provide valuable insights about the secret from the indirect perspectives of statistical moment and cumulative distribution function (CDF) respectively, circumventing the direct and costly estimation of leakage probability densities and therefore enabling...
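For readers unfamiliar with CDF-based distinguishers of this kind, here is a minimal sketch of a Kolmogorov-Smirnov key-ranking step on simulated Hamming-weight leakage; the random S-box, noise level, and single-bit partition are assumptions of this toy example, not the paper's setup:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
SBOX = rng.permutation(256)                      # stand-in S-box for the toy example
true_key = 0x3A
plaintexts = rng.integers(0, 256, size=5000)
hamming = lambda v: np.array([bin(int(x)).count("1") for x in v])
traces = hamming(SBOX[plaintexts ^ true_key]) + rng.normal(0, 2, size=plaintexts.size)

def ks_score(key_guess):
    """Partition traces on one predicted intermediate bit and return the
    distance between the two empirical CDFs (no density estimation needed)."""
    bit = SBOX[plaintexts ^ key_guess] & 1
    return ks_2samp(traces[bit == 0], traces[bit == 1]).statistic

scores = [ks_score(k) for k in range(256)]
print("best guess:", hex(int(np.argmax(scores))), " true key:", hex(true_key))
```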
A Meta-Complexity Characterization of Quantum Cryptography
Bruno P. Cavalar, Eli Goldin, Matthew Gray, Peter Hall
Foundations
We prove the first meta-complexity characterization of a quantum cryptographic primitive. We show that one-way puzzles exist if and only if there is some quantum samplable distribution of binary strings over which it is hard to approximate Kolmogorov complexity. Therefore, we characterize one-way puzzles by the average-case hardness of an uncomputable problem. This brings to the quantum setting a recent line of work that characterizes classical cryptography with the average-case hardness of...
Trail-Estimator: An Automated Verifier for Differential Trails in Block Ciphers
Thomas Peyrin, Quan Quan Tan, Hongyi Zhang, Chunning Zhou
Attacks and cryptanalysis
Differential cryptanalysis is a powerful technique for attacking block ciphers, wherein the Markov cipher assumption and stochastic hypothesis are commonly employed to simplify the search and probability estimation of differential trails. However, these assumptions often neglect inherent algebraic constraints, potentially resulting in invalid trails and inaccurate probability estimates. Some studies identified violations of these assumptions and explored how they impose constraints on key...
Uniformly Most Powerful Tests for Ad Hoc Transactions in Monero
Brandon Goodell, Rigo Salazar, Freeman Slaughter
Cryptographic protocols
We introduce a general, low-cost, low-power statistical test for transactions in transaction protocols with small anonymity set authentication (TPSASAs), such as Monero. The test classifies transactions as ad hoc (spontaneously constructed to spend a deterministically selected key) or self-churned (constructed from a probability distribution very close to that of the default wallet software, and with the same sender and receiver). The test is a uniformly most powerful (UMP) likelihood ratio...
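As generic background for the likelihood-ratio construction (a Neyman-Pearson sketch over assumed toy geometric features, not the Monero-specific transaction statistics from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model (placeholder, not Monero's ring-selection statistics): each transaction
# exposes m features that are Geometric(p0) under H0 (self-churn) and Geometric(p1)
# under H1 (ad hoc).
p0, p1, m = 0.30, 0.55, 16

def log_likelihood_ratio(x):
    """Per-feature log [P(x | H1) / P(x | H0)] for a geometric observation x >= 1."""
    return (np.log(p1) + (x - 1) * np.log(1 - p1)) - (np.log(p0) + (x - 1) * np.log(1 - p0))

def test_statistic(X):
    """Sum of per-feature log-likelihood ratios (the Neyman-Pearson statistic)."""
    return log_likelihood_ratio(X).sum(axis=1)

h0 = rng.geometric(p0, size=(100_000, m))
h1 = rng.geometric(p1, size=(100_000, m))
threshold = np.quantile(test_statistic(h0), 0.95)   # 5% false-positive rate under H0
power = np.mean(test_statistic(h1) > threshold)
print(f"threshold = {threshold:.2f}, detection power at 5% FPR = {power:.3f}")
```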
Distributional Private Information Retrieval
Ryan Lehmkuhl, Alexandra Henzinger, Henry Corrigan-Gibbs
Cryptographic protocols
A private-information-retrieval (PIR) scheme lets a client fetch a record from a remote database without revealing which record it fetched. Classic PIR schemes treat all database records the same but, in practice, some database records are much more popular (i.e., commonly fetched) than others. We introduce distributional PIR, a new type of PIR that can run faster than classic PIR---both asymptotically and concretely---when the popularity distribution is skewed. Distributional PIR provides...
Compact Lattice Signatures via Iterative Rejection Sampling
Joel Gärtner
Public-key cryptography
One of the primary approaches for constructing lattice-based signature schemes is through the “Fiat-Shamir with aborts” methodology. Schemes constructed using this approach may abort and restart during signing, corresponding to rejection sampling of the produced signatures in order to ensure that they follow a distribution independent of the secret key. This rejection sampling is only feasible when the output distribution is sufficiently wide, limiting how compact this type of signature...
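A one-dimensional toy of the aborting step (assumed width and rejection constant, not the construction in the paper): a sample from a wide masking distribution shifted by a secret-dependent value is accepted with a probability chosen so that the accepted output is distributed independently of the shift.

```python
import math
import random

sigma = 100.0          # width of the masking distribution (assumed)
M = 3.0                # rejection constant (assumed)

def gauss_pdf(x, center, sigma):
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def sign_response(secret_shift):
    """Return z = y + secret_shift, restarting (aborting) until accepted."""
    while True:
        y = random.gauss(0.0, sigma)
        z = y + secret_shift
        # Accept with probability target(z) / (M * shifted(z)); accepted z then
        # (up to negligible tail leakage) follows the centered distribution.
        accept_prob = gauss_pdf(z, 0.0, sigma) / (M * gauss_pdf(z, secret_shift, sigma))
        if random.random() < min(1.0, accept_prob):
            return z

samples = [sign_response(secret_shift=25.0) for _ in range(20_000)]
print("mean of accepted responses:", sum(samples) / len(samples))  # close to 0, not 25
```

The wider the masking distribution relative to the secret-dependent shift, the fewer aborts are needed, which is exactly the compactness limitation the abstract refers to.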
On the Security of LWE-based KEMs under Various Distributions: A Case Study of Kyber
Mingyao Shao, Yuejun Liu, Yongbin Zhou, Yan Shao
Public-key cryptography
Evaluating the security of LWE-based KEMs involves two crucial metrics: the hardness of the underlying LWE problem and resistance to decryption failure attacks, both significantly influenced by the secret key and error distributions. To mitigate the complexity and timing vulnerabilities of Gaussian sampling, modern LWE-based schemes often adopt either the uniform or centered binomial distribution (CBD).
This work focuses on Kyber to evaluate its security under both distributions. Compared...
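For reference, a standalone sketch (not Kyber's reference implementation) of sampling the centered binomial distribution CBD_eta as a difference of two eta-bit Hamming weights:

```python
import secrets
from collections import Counter

def sample_cbd(eta):
    """Centered binomial sample in [-eta, eta]: difference of two eta-bit weights."""
    a = sum(secrets.randbits(1) for _ in range(eta))
    b = sum(secrets.randbits(1) for _ in range(eta))
    return a - b

# Empirical distribution for eta = 2 (one of the values used in Kyber).
counts = Counter(sample_cbd(2) for _ in range(100_000))
print({k: round(v / 100_000, 3) for k, v in sorted(counts.items())})
# Expected probabilities: {-2: 1/16, -1: 4/16, 0: 6/16, 1: 4/16, 2: 1/16}
```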
Quantum Cryptography from Meta-Complexity
Taiga Hiroka, Tomoyuki Morimae
Foundations
In classical cryptography, one-way functions (OWFs) are the minimal assumption, while recent active studies have demonstrated that OWFs are not necessarily the minimum assumption in quantum cryptography. Several new primitives have been introduced such as pseudorandom unitaries (PRUs), pseudorandom function-like state generators (PRFSGs), pseudorandom state generators (PRSGs), one-way state generators (OWSGs), one-way puzzles (OWPuzzs), and EFI pairs. They are believed to be weaker than...
Founding Quantum Cryptography on Quantum Advantage, or, Towards Cryptography from $\#\mathsf{P}$-Hardness
Dakshita Khurana, Kabir Tomer
Foundations
Recent oracle separations [Kretschmer, TQC'21, Kretschmer et al., STOC'23] have raised the tantalizing possibility of building quantum cryptography from sources of hardness that persist even if the polynomial hierarchy collapses. We realize this possibility by building quantum bit commitments and secure computation from unrelativized, well-studied mathematical problems that are conjectured to be hard for $\mathsf{P}^{\#\mathsf{P}}$ -- such as approximating the permanents of complex Gaussian...
Generalized one-way function and its application
Hua-Lei Yin
Cryptographic protocols
One-way functions are fundamental to classical cryptography and their existence remains a longstanding problem in computational complexity theory. Recently, a provable quantum one-way function has been identified, which maintains its one-wayness even with unlimited computational resources. Here, we extend the mathematical definition of functions to construct a generalized one-way function by virtually measuring the qubit of provable quantum one-way function and randomly assigning the...
Unconditionally secure key distribution without quantum channel
Hua-Lei Yin
Cryptographic protocols
Key distribution plays a fundamental role in cryptography. Currently, the quantum scheme stands as the only known method for achieving unconditionally secure key distribution. This method has been demonstrated over distances of 508 and 1002 kilometers in the measurement-device-independent and twin-field configurations, respectively. However, quantum key distribution faces transmission distance issues and numerous side channel attacks since the basic physical picture requires the use of...
Randomized Distributed Function Computation with Semantic Communications: Applications to Privacy
Onur Gunlu
Foundations
Randomized distributed function computation refers to remote function computation where transmitters send data to receivers which compute function outputs that are randomized functions of the inputs. We study the applications of semantic communications in randomized distributed function computation to illustrate significant reductions in the communication load, with a particular focus on privacy. The semantic communication framework leverages generalized remote source coding methods, where...
Probabilistic Attacks and Enhanced Security for "Private Set Intersection in the Internet Setting from Lightweight Oblivious PRF"
Zhuang Shan, Leyou Zhang, Qing Wu, Qiqi Lai
Cryptographic protocols
Private Set Intersection (PSI) has been an important research topic within privacy-preserving computation. Its main function is to allow two parties to compute the intersection of their private sets without revealing any other private information. Therefore, PSI can be applied to various real-world scenarios.
Chase and Miao presented an impressive construction ``Private set intersection in the Internet setting from lightweight oblivious prf'' (CM20 for short) at Crypto 2020, highlighting its...
Hidden $\Delta$-fairness: A Novel Notion for Fair Secure Two-Party Computation
Saskia Bayreuther, Robin Berger, Felix Dörre, Jeremias Mechler, Jörn Müller-Quade
Cryptographic protocols
Secure two-party computation allows two mutually distrusting parties to compute a joint function over their inputs, guaranteeing properties such as input privacy or correctness.
For many tasks, such as joint computation of statistics, it is important that when one party receives the result of the computation, the other party also receives the result.
Unfortunately, this property, which is called fairness, is unattainable in the two-party setting for arbitrary functions. So weaker...
Theoretical Explanation and Improvement of Deep Learning-aided Cryptanalysis
Weixi Zheng, Liu Zhang, Zilong Wang
Attacks and cryptanalysis
At CRYPTO 2019, Gohr demonstrated that differential-neural distinguishers (DNDs) for Speck32/64 can learn more features than classical cryptanalysis's differential distribution tables (DDT). Furthermore, a non-classical key recovery procedure is devised by combining the Upper Confidence Bound (UCB) strategy and the BayesianKeySearch algorithm. Consequently, the time complexity of 11-round key recovery attacks on Speck32/64 is significantly reduced compared with the state-of-the-art results...
Mirrored Commitment: Fixing ``Randomized Partial Checking'' and Applications
Paweł Lorek, Moti Yung, Filip Zagórski
Cryptographic protocols
Randomized Partial Checking (RPC) was proposed by Jakobsson, Juels, and Rivest and attracted attention as an efficient method of verifying the correctness of the mixing process in numerous applied scenarios. In fact,
RPC is a building block for many electronic voting schemes, including Prêt à Voter, Civitas, Scantegrity II as well as voting-systems used in real-world elections (e.g., in Australia). Mixing is also used in anonymous transfers of cryptocurrencies.
It turned out, however,...
Lossy Cryptography from Code-Based Assumptions
Quang Dao, Aayush Jain
Public-key cryptography
Over the past few decades, we have seen a proliferation of advanced cryptographic primitives with lossy or homomorphic properties built from various assumptions such as Quadratic Residuosity, Decisional Diffie-Hellman, and Learning with Errors. These primitives imply hard problems in the complexity class $\mathcal{SZK}$ (statistical zero-knowledge); as a consequence, they can only be based on assumptions that are broken in $\mathcal{BPP}^{\mathcal{SZK}}$. This poses a barrier for building...
Too Hot To Be True: Temperature Calibration for Higher Confidence in NN-assisted Side-channel Analysis
Seyedmohammad Nouraniboosjin, Fatemeh Ganji
Attacks and cryptanalysis
The past years have witnessed a considerable increase in research efforts put into neural network-assisted profiled side-channel analysis (SCA). Studies have also identified challenges, e.g., closing the gap between metrics for machine learning (ML) classification and side-channel attack evaluation. In fact, in the context of NN-assisted SCA, the NN’s output distribution forms the basis for successful key recovery. In this respect, related work has covered various aspects of integrating...
A Refined Hardness Estimation of LWE in Two-step Mode
Wenwen Xia, Leizhang Wang, Geng Wang, Dawu Gu, Baocang Wang
Public-key cryptography
Recently, researchers have proposed many LWE estimators, such as lattice-estimator (Albrecht et al., Asiacrypt 2017) and leaky-LWE-Estimator (Dachman-Soled et al., Crypto 2020), while the latter has already been used in estimating the security level of Kyber and Dilithium using only BKZ. However, we prove in this paper that solving LWE by combining a lattice reduction step (by LLL or BKZ) and a
target vector searching step (by enumeration or sieving), which we call a Two-step mode, is more...
Cryptanalysis of the Peregrine Lattice-Based Signature Scheme
Xiuhan Lin, Moeto Suzuki, Shiduo Zhang, Thomas Espitau, Yang Yu, Mehdi Tibouchi, Masayuki Abe
Attacks and cryptanalysis
The Peregrine signature scheme is one of the candidates in the ongoing Korean post-quantum cryptography competition. It is proposed as a high-speed variant of Falcon, which is a hash-and-sign signature scheme over NTRU lattices and one of the schemes selected by NIST for standardization. To this end, Peregrine replaces the lattice Gaussian sampler in the Falcon signing procedure with a new sampler based on the centered binomial distribution. While this modification offers significant...
Formal Analysis of Non-profiled Deep-learning Based Side-channel Attacks
Akira Ito, Rei Ueno, Rikuma Tanaka, Naofumi Homma
Attacks and cryptanalysis
This paper formally analyzes two major non-profiled deep-learning-based side-channel attacks (DL-SCAs): differential deep-learning analysis (DDLA) by Timon and collision DL-SCA by Staib and Moradi. These DL-SCAs leverage supervised learning in non-profiled scenarios. Although some intuitive descriptions of these DL-SCAs exist, their formal analyses have rarely been conducted, which makes it unclear why and when the attacks succeed and how they can be improved. In this paper, we...
More Insight on Deep Learning-aided Cryptanalysis
Zhenzhen Bao, Jinyu Lu, Yiran Yao, Liu Zhang
Attacks and cryptanalysis
In CRYPTO 2019, Gohr showed that well-trained neural networks could perform cryptanalytic distinguishing tasks superior to differential distribution table (DDT)-based distinguishers. This suggests that the differential-neural distinguisher (ND) may use additional information besides pure ciphertext differences. However, the explicit knowledge beyond differential distribution is still unclear. In this work, we provide explicit rules that can be used alongside DDTs to enhance the effectiveness...
Oracle Recording for Non-Uniform Random Oracles, and its Applications
Minki Hhan, Aaram Yun
Foundations
In Crypto 2019, Zhandry showed how to define compressed oracles, which record quantum superposition queries to the quantum random oracle. In this paper, we extend Zhandry's compressed oracle technique to non-uniformly distributed functions with independently sampled outputs. We define two quantum oracles $\mathsf{CStO}_D$ and $\mathsf{CPhsO}_D$, which are indistinguishable to the non-uniform quantum random oracle where quantum access is given to a random function $H$ whose images $H(x)$...
Key-Agreement with Perfect Completeness from Random Oracles
Noam Mazor
Foundations
In the Random Oracle Model (ROM) all parties have oracle access to a common random function, and the parties are limited in the number of queries they can make to the oracle. Merkle’s Puzzles protocol, introduced by Merkle [CACM ’78], is a key-agreement protocol in the ROM with a quadratic gap between the query complexity of the honest parties and the eavesdropper. This quadratic gap is known to be optimal, by the works of Impagliazzo and Rudich [STOC ’89] and Barak and Mahmoody [Crypto...
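A toy version of the quadratic gap (a hash-based stand-in for the random oracle and small assumed parameters, not the formal ROM protocol analyzed in the paper):

```python
import hashlib
import secrets

N = 2**10                  # number of puzzles; honest parties make ~N queries, Eve ~N^2
DOMAIN = N * N             # preimages are drawn from a domain of size N^2

def H(x: int) -> bytes:
    """Stand-in for the common random oracle."""
    return hashlib.sha256(x.to_bytes(16, "big")).digest()

# Alice: publish the hashes of N random domain points (the "puzzles").  Cost: N queries.
alice_preimages = [secrets.randbelow(DOMAIN) for _ in range(N)]
board = {H(x): i for i, x in enumerate(alice_preimages)}

# Bob: query random points until one hits the board.  Expected cost: DOMAIN / N = N queries.
queries = 0
while True:
    y = secrets.randbelow(DOMAIN)
    queries += 1
    if H(y) in board:
        break

# Bob announces the matching index in the clear; both derive the key from the preimage.
index = board[H(y)]
alice_key = H(alice_preimages[index] + DOMAIN)   # shift as crude domain separation
bob_key = H(y + DOMAIN)
print(f"Bob used {queries} queries; keys agree: {alice_key == bob_key}")
# Eve sees only the board and the index, so she must invert one hash: ~DOMAIN/2 queries.
```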
Not optimal but efficient: a distinguisher based on the Kruskal-Wallis test
Yan Yan, Arnab Roy, Elisabeth Oswald
Implementation
Research about the theoretical properties of side channel distinguishers revealed the rules by which to maximise the probability of first order success (``optimal distinguishers'') under different assumptions about the leakage model and noise distribution. Simultaneously, research into bounding first order success (as a function of the number of observations) has revealed universal bounds, which suggest that (even optimal) distinguishers are not able to reach theoretically possible success...
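A minimal sketch of a Kruskal-Wallis distinguisher on simulated Hamming-weight leakage (assumed random S-box and noise level, not the paper's experiments): traces are grouped by the predicted intermediate's Hamming weight and the H statistic ranks key guesses without any density estimation.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(3)
SBOX = rng.permutation(256)                      # stand-in S-box for the toy example
true_key = 0xC7
pt = rng.integers(0, 256, size=3000)
hamming = lambda v: np.array([bin(int(x)).count("1") for x in v])
traces = hamming(SBOX[pt ^ true_key]) + rng.normal(0, 3, size=pt.size)

def kw_score(guess):
    """Group traces by predicted Hamming weight and apply the Kruskal-Wallis H test."""
    groups = hamming(SBOX[pt ^ guess])
    buckets = [traces[groups == w] for w in range(9) if np.any(groups == w)]
    return kruskal(*buckets).statistic

ranking = sorted(range(256), key=kw_score, reverse=True)
print("top guess:", hex(ranking[0]), " true key:", hex(true_key))
```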
Generalized Hybrid Search and Applications
Alexandru Cojocaru, Juan Garay, Fang Song
Cryptographic protocols
In this work we first examine the hardness of solving various search problems by hybrid quantum-classical strategies, namely, by algorithms that have both quantum and classical capabilities. We then construct a hybrid quantum-classical search algorithm and analyze its success probability.
Regarding the former, for search problems that are allowed to have multiple solutions and in which the input is sampled according to arbitrary distributions we establish their hybrid quantum-classical...
Super-Quadratic Quantum Speed-Ups and Guessing Many Likely Keys
Timo Glaser, Alexander May, Julian Nowakowski
Foundations
We study the fundamental problem of guessing cryptographic keys, drawn from some non-uniform probability distribution $\mathcal D$, as e.g. in LPN, LWE or for passwords. The optimal classical algorithm enumerates keys in decreasing order of likelihood. The optimal quantum algorithm, due to Montanaro (2011), is a sophisticated Grover search.
We give the first tight analysis for Montanaro's algorithm, showing that its runtime is $2^{\operatorname{H}_{2/3}(\mathcal D)/2}$, where...
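For concreteness, the Rényi entropy of order 2/3 appearing in the runtime exponent can be computed directly; the distribution below is an assumed toy example, not one from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, in bits, of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-(p * np.log2(p)).sum())        # Shannon entropy as the limit
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

# Assumed toy distribution of a single key coordinate.
p = [0.05, 0.25, 0.40, 0.25, 0.05]
for alpha in (2 / 3, 1.0, 2.0):
    print(f"H_{alpha:.3g}(D) = {renyi_entropy(p, alpha):.4f} bits")
# Per the abstract, the quantum guessing runtime scales as 2^(H_{2/3}(D)/2);
# Renyi entropy is additive over independent coordinates of a product distribution.
```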
New Bounds on the Accuracy of Majority Voting for Multi-Class Classification
Sina Aeeneh
Foundations
Majority voting is a simple mathematical function that returns the value that appears most often in a set. As a popular decision fusion technique, the majority voting function (MVF) finds applications in resolving conflicts, where a number of independent voters report their opinions on a classification problem. Despite its importance and its various applications in ensemble learning, data crowd-sourcing, remote sensing, and data oracles for blockchains, the accuracy of the MVF for the...
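A small Monte Carlo sketch of the MVF under an assumed symmetric error model (independent voters, errors spread uniformly over the wrong classes), not the paper's analytical bound:

```python
from collections import Counter
import random

def majority_vote(votes):
    """Return the most frequent label; break ties uniformly at random."""
    counts = Counter(votes)
    top = max(counts.values())
    return random.choice([label for label, c in counts.items() if c == top])

def mvf_accuracy(n=11, q=0.6, classes=4, trials=20_000):
    """Empirical accuracy of majority voting with n voters, each correct with
    probability q, and errors uniform over the remaining classes (assumed model)."""
    correct = 0
    for _ in range(trials):
        votes = [0 if random.random() < q else random.randint(1, classes - 1)
                 for _ in range(n)]
        correct += majority_vote(votes) == 0      # class 0 plays the true label
    return correct / trials

print(mvf_accuracy())
```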
Unconditionally secure ciphers with a short key for a source with unknown statistics
Boris Ryabko
Secret-key cryptography
We consider the problem of constructing an unconditionally secure cipher with a short key for the case where the probability distribution of encrypted messages is unknown. Note that unconditional security means that an adversary with no computational constraints can obtain only a negligible amount of information ("leakage") about an encrypted message (without knowing the key).
Here we consider the case of a priori (partially) unknown message source statistics.
More specifically, the...
CNF Characterization of Sets over $\mathbb{Z}_2^n$ and Its Applications in Cryptography
Hu Xiaobo, Xu Shengyuan, Tu Yinzi, Feng Xiutao
Attacks and cryptanalysis
In recent years, automatic search has been widely used to search for differential characteristics and linear approximations with high probability/correlation. Among these methods, the automatic search with the Boolean Satisfiability Problem (SAT, in short) has gradually become a powerful cryptanalysis tool in symmetric ciphers. A key problem in the automatic search method is how to fully characterize a set $S \subseteq \{0,1\}^n$ with as few Conjunctive Normal Form (CNF, in short) clauses as...
Impossibility of Efficient Information-Theoretic Fuzzy Extraction
Benjamin Fuller
Foundations
Fuzzy extractors convert noisy signals from the physical world into reliable cryptographic keys. Fuzzy min-entropy is an important measure of the ability of a fuzzy extractor to distill keys from a distribution: in particular, it bounds the length of the key that can be derived (Fuller, Reyzin, and Smith, IEEE Transactions on Information Theory 2020).
In general, fuzzy min-entropy that is superlogarithmic in the security parameter is required for a noisy distribution to be suitable for key...
Mind Your Path: On (Key) Dependencies in Differential Characteristics
Thomas Peyrin, Quan Quan Tan
Attacks and cryptanalysis
Cryptanalysts have been looking for differential characteristics in ciphers for decades and it remains unclear how the subkey values, and more generally the Markov assumption, exactly impact their probability estimation. There were theoretical efforts considering some simple linear relationships between differential characteristics and subkey values, but the community has not yet explored many possible nonlinear dependencies one can find in differential characteristics. Meanwhile, the...
Extensible Decentralized Secret Sharing and Application to Schnorr Signatures
Michele Battagliola, Riccardo Longo, Alessio Meneghetti
Cryptographic protocols
Starting from links between coding theory and secret sharing we develop an extensible and decentralized version of Shamir Secret Sharing, that allows the addition of new users after the initial share distribution.
On top of it we design a totally decentralized $(t,n)$-threshold Schnorr signature scheme that needs only $t$ users online during the key generation phase, while the others join later.
Under standard assumptions we prove our scheme secure against adaptive malicious adversaries....
How to Enumerate LWE Keys as Narrow as in Kyber/Dilithium
Timo Glaser, Alexander May
Attacks and cryptanalysis
In the Learning with Errors (LWE) problem we are given a matrix $A \in \mathbb{Z}_q^{N \times N}$ and a target vector $\vec{t} \in \mathbb{Z}_q^N$ such that there exists small-norm $\vec{s}, \vec{e}\in \mathbb{Z}_q^N$ satisfying $A\cdot\vec{s} = \vec{t} + \vec{e} \bmod q$.
Modern cryptosystems often sample $\vec{s}, \vec{e}$ from narrow distributions that take integer values in a small range $[-\eta, \eta].$ Kyber and Dilithium both choose $\eta=2$ and $\eta=3$ using either a Centered...
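To fix notation, a toy LWE instance generator with narrow centered-binomial secrets (an illustrative sketch with an assumed small dimension, not a secure parameter set):

```python
import numpy as np

def sample_cbd(eta, size, rng):
    """Centered binomial noise in [-eta, eta], i.e. the narrow distributions above."""
    bits_a = rng.integers(0, 2, size=(size, eta)).sum(axis=1)
    bits_b = rng.integers(0, 2, size=(size, eta)).sum(axis=1)
    return bits_a - bits_b

def lwe_instance(N=64, q=3329, eta=2, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.integers(0, q, size=(N, N))
    s = sample_cbd(eta, N, rng)
    e = sample_cbd(eta, N, rng)
    t = (A @ s - e) % q          # matches A·s = t + e (mod q) from the abstract
    return A, t, s, e

A, t, s, e = lwe_instance()
print("secret coefficients lie in", sorted(set(s.tolist())))
```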
How to Sample a Discrete Gaussian (and more) from a Random Oracle
George Lu, Brent Waters
Foundations
The random oracle methodology is central to the design of many practical cryptosystems. A common challenge faced in several systems is the need to have a random oracle that outputs from a structured distribution $\mathcal{D}$, even though most heuristic implementations such as SHA-3 are best suited for outputting bitstrings. Our work explores the problem of sampling from discrete Gaussian (and related) distributions in a manner that they can be programmed into random oracles. We...
On Module Unique-SVP and NTRU
Joël Felderhoff, Alice Pellet-Mary, Damien Stehlé
Foundations
The NTRU problem can be viewed as an instance of finding a short non-zero vector in a lattice, under the promise that it contains an exceptionally short vector. Further, the lattice under scope has the structure of a rank-2 module over the ring of integers of a number field. Let us refer to this problem as the module unique Shortest Vector Problem, or mod-uSVP for short. We exhibit two reductions that together provide evidence that the NTRU problem is not just a particular case of mod-uSVP, but...
Lattice Enumeration with Discrete Pruning: Improvement, Cost Estimation and Optimal Parameters
Luan Luan, Chunxiang Gu, Yonghui Zheng, Yanan Shi
Foundations
Lattice enumeration is a linear-space algorithm for solving the shortest lattice vector problem (SVP). Extreme pruning is a practical technique for accelerating lattice enumeration, which has mature theoretical analysis and practical implementation. However, such work remains to be done for discrete pruning. In this paper, we improve the discrete pruned enumeration (DP enumeration), and give a solution to the problem proposed by Leo Ducas and Damien Stehle about the cost estimation...
Differential Cryptanalysis in the Fixed-Key Model
Tim Beyne, Vincent Rijmen
Secret-key cryptography
A systematic approach to the fixed-key analysis of differential probabilities is proposed. It is based on the propagation of 'quasidifferential trails', which keep track of probabilistic linear relations on the values satisfying a differential characteristic in a theoretically sound way. It is shown that the fixed-key probability of a differential can be expressed as the sum of the correlations of its quasidifferential trails.
The theoretical foundations of the method are based on an...
On the Success Rate of Side-Channel Attacks on Masked Implementations: Information-Theoretical Bounds and Their Practical Usage
Akira Ito, Rei Ueno, Naofumi Homma
Secret-key cryptography
This study derives information-theoretical bounds of the success rate (SR) of side-channel attacks on masked implementations. We first develop a communication channel model representing side-channel attacks on masked implementations. We then derive two SR bounds based on the conditional probability distribution and mutual information of shares. The basic idea is to evaluate the upper-bound of the mutual information between the non-masked secret value and the side-channel trace by the...
Secure Sampling of Constant-Weight Words – Application to BIKE
Nicolas Sendrier
Public-key cryptography
The pseudorandom sampling of constant weight words, as it is currently implemented in cryptographic schemes like BIKE or HQC, is prone to the leakage of information on the seed being used for the pseudorandom number generation. This creates a vulnerability when the semantic security conversion requires a deterministic re-encryption. This observation was first made in [HLS21] about HQC and a timing attack was presented to recover the secret key. As suggested in [HLS21] a similar attack...
On Quantum Query Complexities of Collision-Finding in Non-Uniform Random Functions
Tianci Peng, Shujiao Cao, Rui Xue
Foundations
Collision resistance and collision finding are now extensively exploited in cryptography, especially in the case of quantum computing. For any function $f:[M]\to[N]$ with $f(x)$ uniformly distributed over $[N]$, Zhandry has shown that the number $\Theta(N^{1/3})$ of queries is both necessary and sufficient for finding a collision in $f$ with constant probability. However, there is still a gap between the upper and the lower bounds of query complexity in general non-uniform...
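As a classical baseline for the uniform case (a memory-based birthday search; the $\Theta(N^{1/3})$ bound quoted above concerns quantum queries):

```python
import hashlib
import random
from statistics import mean

def make_random_function(N, salt):
    """Stand-in for a random function f: large domain -> [N] with uniform outputs."""
    def f(x: int) -> int:
        h = hashlib.sha256(salt + x.to_bytes(8, "big")).digest()
        return int.from_bytes(h, "big") % N
    return f

def collision_queries(N, f):
    """Classical memory-heavy search: query fresh inputs until two outputs collide."""
    seen, q = {}, 0
    while True:
        x = random.getrandbits(48)
        y = f(x)
        q += 1
        if y in seen and seen[y] != x:
            return q
        seen[y] = x

N = 1_000_000
runs = [collision_queries(N, make_random_function(N, bytes([i]))) for i in range(20)]
print(f"average queries: {mean(runs):.0f}  (~sqrt(N) = {N ** 0.5:.0f} classically; "
      f"Theta(N^(1/3)) is the quantum query bound)")
```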
A Note on the Pseudorandomness of Low-Degree Polynomials over the Integers
Aayush Jain, Alexis Korb, Paul Lou, Amit Sahai
Foundations
We initiate the study of a problem called the Polynomial Independence Distinguishing Problem (PIDP). The problem is parameterized by a set of polynomials $\mathcal{Q}=(q_1,\ldots, q_m)$ where each $q_i:\mathbb{R}^n\rightarrow \mathbb{R}$ and an input distribution $\mathcal{D}$ over the reals. The goal of the problem is to distinguish a tuple of the form $\{ q_i,q_i(\mathbf{x})\}_{i\in [m]} $ from $\{ q_i,q_i(\mathbf{x}_i)\}_{i\in [m]} $ where $\mathbf{x}, \mathbf{x}_1,\ldots ,...
Modeling Large S-box in MILP and a (Related-key) Differential Attack on Full Round PIPO-64/128
Tarun Yadav, Manoj Kumar
Secret-key cryptography
Mixed integer linear programming (MILP) based tools are used to estimate the strength of block ciphers against cryptanalytic attacks. The existing tools use the partial difference distribution table (p-DDT) approach to optimize the probability of differential characteristics for large (≥8-bit) S-box based ciphers. We propose to use the full difference distribution table (DDT) with the probability of each possible propagation for MILP modeling of large S-boxes. This requires more than 16...
Anonymous Whistleblowing over Authenticated Channels
Thomas Agrikola, Geoffroy Couteau, Sven Maier
Foundations
The goal of anonymous whistleblowing is to publicly disclose a message while at the same time hiding the identity of the sender in a way that even if suspected of being the sender, this cannot be proven.
While many solutions to this problem have been proposed over the years, they all require some form of interaction with trusted or non-colluding parties. In this work, we ask whether this is fundamentally inherent. We put forth the notion of anonymous transfer as a primitive allowing to...
Clustering Effect in Simon and Simeck
Gaëtan Leurent, Clara Pernot, André Schrottenloher
Secret-key cryptography
Simon and Simeck are two lightweight block ciphers with a simple round function using only word rotations and a bit-wise AND operation. Previous work has shown a strong clustering effect for differential and linear cryptanalysis, due to the existence of many trails with the same inputs and outputs.
In this paper, we explore this clustering effect by exhibiting a class of high probability differential and linear trails where the active bits stay in a fixed window of $w$ bits. Instead of...
Improve Neural Distinguisher for Cryptanalysis
Zezhou Hou, Jiongjiong Ren, Shaozhen Chen
At CRYPTO'19, Gohr built a bridge between deep learning and cryptanalysis. Based on deep neural networks, he trained neural distinguishers of Speck32/64 using a plaintext difference and a single ciphertext pair. Compared with purely differential distinguishers, neural distinguishers successfully use features of the ciphertext pairs. Besides, with the help of neural distinguishers, he attacked 11-round Speck32/64 using Bayesian optimization. At EUROCRYPT'21, Benamira proposed a detailed...
Generalized Galbraith's Test: Characterization and Applications to Anonymous IBE Schemes
Paul Cotan, George Teseleanu
Public-key cryptography
The main approaches currently used to construct identity based encryption (IBE) schemes are based on bilinear mappings, quadratic residues and lattices. Among them, the most attractive approach is the one based on quadratic residues, due to the fact that the underlying security assumption is a well understood hard problem. The first such IBE scheme was constructed by Cocks and some of its deficiencies were addressed in subsequent works. In this paper, we will focus on two constructions that...
Polar Coding for Ring-LWE-Based Public Key Encryption
Jiabo Wang, Cong Ling
Public-key cryptography
Cryptographic constructions based on $\textit{ring learning with errors}$ (RLWE) have emerged as one of the front runners for the standardization of post quantum public key cryptography. As the standardization process continues, optimizing specific parts of proposed schemes becomes a worthwhile endeavor. In this work we focus on using error correcting codes to alleviate a natural trade-off present in most schemes; namely, we would like a wider error distribution to increase security, but a...
Upslices, Downslices, and Secret-Sharing with Complexity of $1.5^n$
Benny Applebaum, Oded Nir
A secret-sharing scheme allows one to distribute a secret $s$ among $n$ parties such that only some predefined ``authorized'' sets of parties can reconstruct the secret, and all other ``unauthorized'' sets learn nothing about $s$.
The collection of authorized/unauthorized sets can be captured by a monotone function $f:\{0,1\}^n\rightarrow \{0,1\}$.
In this paper, we focus on monotone functions all of whose min-terms are sets of size $a$, and on their duals -- monotone functions whose max-terms...
Limitations on Uncloneable Encryption and Simultaneous One-Way-to-Hiding
Christian Majenz, Christian Schaffner, Mehrdad Tahmasbi
Foundations
We study uncloneable quantum encryption schemes for classical messages as recently proposed by Broadbent and Lord. We focus on the information-theoretic setting and give several limitations on the structure and security of these schemes: Concretely, 1) We give an explicit cloning-indistinguishable attack that succeeds with probability $\frac12 + \mu/16$ where $\mu$ is related to the largest eigenvalue of the resulting quantum ciphertexts. 2) For a uniform message distribution, we partially...
On Security Properties of All-or-nothing Transforms
Navid Nasr Esfahani, Douglas R. Stinson
Foundations
All-or-nothing transforms have been defined as bijective mappings on all s-tuples over a specified finite alphabet. These mappings are required to satisfy certain "perfect security" conditions specified using entropies of the probability distribution defined on the input s-tuples. Alternatively, purely combinatorial definitions of AONTs have been given, which involve certain kinds of "unbiased arrays". However, the combinatorial definition makes no reference to probability definitions.
In...
Rank Estimation with Bounded Error via Exponential Sampling
Liron David, Avishai Wool
Applications
Efficient rank estimation algorithms are of prime interest in security evaluation against side-channel attacks (SCA) and recently also for password strength estimators. In a side channel setting it allows estimating the remaining security after an attack has been performed, quantified as the time complexity and the memory consumption required to brute force the key given the leakages as probability distributions over $d$ subkeys (usually key bytes). In password strength estimators the rank...
Leakage-resilience of the Shamir Secret-sharing Scheme against Physical-bit Leakages
Hemanta K. Maji, Hai H. Nguyen, Anat Paskin-Cherniavsky, Tom Suad, Mingyuan Wang
Foundations
Efficient Reed-Solomon code reconstruction algorithms, for example, by Guruswami and Wootters (STOC--2016), translate into local leakage attacks on Shamir secret-sharing schemes over characteristic-2 fields. However, Benhamouda, Degwekar, Ishai, and Rabin (CRYPTO--2018) showed that the Shamir secret sharing scheme over prime-fields is leakage resilient to one-bit local leakage if the reconstruction threshold is roughly 0.87 times the total number of parties. In several application scenarios,...
Replacing Probability Distributions in Security Games via Hellinger Distance
Kenji Yasunaga
Foundations
Security of cryptographic primitives is usually proved by assuming ``ideal'' probability distributions. We need to replace them with approximated ``real'' distributions in the real-world systems without losing the security level. We demonstrate that the Hellinger distance is useful for this problem, while the statistical distance is mainly used in the cryptographic literature. First, we show that for preserving $\lambda$-bit security of a given security game, the closeness of...
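For comparison, both distances are straightforward to compute on a pair of discrete distributions; the ``ideal'' and perturbed ``real'' distributions below are assumed toy inputs, not an example from the paper.

```python
import numpy as np

def statistical_distance(p, q):
    """Total variation distance between two probability vectors."""
    return 0.5 * np.abs(np.asarray(p) - np.asarray(q)).sum()

def hellinger_distance(p, q):
    """Hellinger distance between two probability vectors."""
    p, q = np.asarray(p), np.asarray(q)
    return np.sqrt(0.5 * ((np.sqrt(p) - np.sqrt(q)) ** 2).sum())

# Ideal distribution versus a slightly perturbed "real" implementation of it.
xs = np.arange(-8, 9)
ideal = np.exp(-xs**2 / 8.0); ideal /= ideal.sum()
real = ideal * (1 + 0.01 * np.sign(xs)); real /= real.sum()

print(f"statistical distance = {statistical_distance(ideal, real):.6f}")
print(f"Hellinger distance   = {hellinger_distance(ideal, real):.6f}")
```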
Efficient Lattice Gadget Decomposition Algorithm with Bounded Uniform Distribution
Sohyun Jeon, Hyang-Sook Lee, Jeongeun Park
Foundations
A gadget decomposition algorithm is commonly used in many advanced lattice cryptography applications that support homomorphic operations over ciphertexts to control the noise growth. For a special structure of a gadget, the algorithm is digit decomposition. If such an algorithm samples from a subgaussian distribution, that is, if the output is randomized, it offers better output quality. One of the important advantages is Pythagorean additivity, which makes the resulting noise contained in a...
$P_4$-free Partition and Cover Numbers and Application
Alexander R. Block, Simina Branzei, Hemanta K. Maji, Himanshi Mehta, Tamalika Mukherjee, Hai H. Nguyen
$P_4$-free graphs -- also known as cographs, complement-reducible graphs, or hereditary Dacey graphs -- have been well studied in graph theory.
Motivated by computer science and information theory applications, our work encodes (flat) joint probability distributions and Boolean functions as bipartite graphs and studies bipartite $P_4$-free graphs.
For these applications, the graph properties of edge partitioning and covering a bipartite graph using the minimum number of these graphs are...
Key Dependency of Differentials: Experiments in the Differential Cryptanalysis of Block Ciphers Using Small S-boxes
Howard M. Heys
Secret-key cryptography
In this paper, we investigate the key dependency of differentials in block ciphers by examining the results of numerous experiments applied to the substitution-permutation network (SPN) structure using 4-bit S-boxes. In particular, we consider two cipher structures: a toy 16-bit SPN and a realistic 64-bit SPN. For both ciphers, we generate many different experimental results by inserting the S-boxes used in many lightweight cipher proposals and applying different forms of round key...
Multi-stage Proof-of-Works: Properties and Vulnerabilities
Paolo D'Arco, Zahra Ebadi Ansaroudi, Francesco Mogavero
Applications
Since its appearance in 2008, Bitcoin has attracted considerable attention. So far, it has been the most successful cryptocurrency, with the highest market capitalization. Nevertheless, due to the method it uses to append new transactions and blocks to the blockchain, based on a Proof-of-Work, Bitcoin suffers from poor scalability, which strongly limits the number of transactions per second and, hence, its adoption as a global payment layer for everyday uses. In this paper we analyze some...
Bit Security Estimation Using Various Information-Theoretic Measures
Dong-Hoon Lee, Young-Sik Kim, Jong-Seon No
Foundations
In this paper, various quantitative information-theoretic security reductions are proposed that relate the statistical difference between two probability distributions to the gap in security level between two cryptographic schemes. Security is the most important prerequisite for cryptographic primitives. In general, there are two kinds of security: computational security and information-theoretic security. We focus on the latter in this paper, especially from the viewpoint of...
A New Code Based Signature Scheme without Trapdoors
Zhe Li, Chaoping Xing, Sze Ling Yeo
Public-key cryptography
We present a signature scheme for Hamming metric random linear codes via the Schnorr-Lyubashevsky framework that employs the rejection sampling on appropriate probability distributions instead of using trapdoors. Such an approach has been widely believed to be more challenging for linear codes as compared to lattices with Gaussian distributions. We prove that our signature scheme achieves EUF-CMA security under the assumption of the decoding one out of many problem or achieves strong EUF-CMA...
A Scalable Simulation of the BB84 Protocol Involving Eavesdropping
Mihai-Zicu Mina, Emil Simion
In this article we present the BB84 quantum key distribution scheme from two perspectives. First, we provide a theoretical discussion of the steps Alice and Bob take to reach a shared secret using this protocol, while an eavesdropper Eve is either involved or not. Then, we offer and discuss two distinct implementations that simulate BB84 using IBM’s Qiskit framework, the first being an exercise solved during the “IBM Quantum Challenge” event in early May 2020, while the other was...
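A plain-Python toy of the protocol flow (no Qiskit; the intercept-resend eavesdropper is an assumption of this sketch), illustrating basis sifting and the error rate Eve induces:

```python
import random

def bb84(n_qubits=2000, eve_present=False):
    alice_bits = [random.getrandbits(1) for _ in range(n_qubits)]
    alice_bases = [random.getrandbits(1) for _ in range(n_qubits)]   # 0 = Z, 1 = X
    bob_bases = [random.getrandbits(1) for _ in range(n_qubits)]

    bob_results = []
    for bit, prep_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eve_present:
            # Intercept-resend: Eve measures in a random basis and re-prepares.
            e_basis = random.getrandbits(1)
            bit = bit if e_basis == prep_basis else random.getrandbits(1)
            prep_basis = e_basis
        # Bob's measurement: deterministic in the matching basis, random otherwise.
        bob_results.append(bit if b_basis == prep_basis else random.getrandbits(1))

    # Sifting: keep positions where Alice's and Bob's announced bases agree.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_results, alice_bases, bob_bases) if ab == bb]
    qber = sum(a != b for a, b in sifted) / len(sifted)
    return len(sifted), qber

print("no Eve  :", bb84(eve_present=False))   # QBER ~ 0
print("with Eve:", bb84(eve_present=True))    # QBER ~ 0.25
```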
Persistent Fault Analysis With Few Encryptions
Sebastien Carre, Sylvain Guilley, Olivier Rioul
Secret-key cryptography
Persistent fault analysis (PFA) consists in guessing block cipher secret keys by biasing their substitution box. This paper improves the original attack of Zhang et al. on AES-128 presented at CHES 2018. By a thorough analysis, the exact probability distribution of the ciphertext (under a uniformly distributed plaintext) is derived, and the maximum likelihood key recovery estimator is computed exactly. Its expression is turned into an attack algorithm, which is shown to be twice more...
SILVER - Statistical Independence and Leakage Verification
David Knichel, Pascal Sasdrich, Amir Moradi
Implementation
Implementing cryptographic functions securely in the presence of physical adversaries is still a challenge, although the lion's share of research in the physical security domain has been put into the development of countermeasures. Among several protection schemes, masking has absorbed the most attention of research in both the academic and industrial communities, due to its theoretical foundation, which allows one to provide proofs of, or to model, the achieved security level. In return, masking schemes are difficult to...
Optimally-secure Coin-tossing against a Byzantine Adversary
Hamidreza Amini Khorasgani, Hemanta K. Maji, Mingyuan Wang
Foundations
In their seminal work, Ben-Or and Linial (1985) introduced the full information model for collective coin-tossing protocols involving $n$ processors with unbounded computational power using a common broadcast channel for all their communications. The design and analysis of coin-tossing protocols in the full information model have close connections to diverse fields like extremal graph theory, randomness extraction, cryptographic protocol design, game theory, distributed protocols, and...
Cryptography from Information Loss
Marshall Ball, Elette Boyle, Akshay Degwekar, Apoorvaa Deshpande, Alon Rosen, Vinod Vaikuntanathan, Prashant Nalini Vasudevan
Foundations
Reductions between problems, the mainstay of theoretical computer science, efficiently map an instance of one problem to an instance of another in such a way that solving the latter allows solving the former. The subject of this work is ``lossy'' reductions, where the reduction loses some information about the input instance. We show that such reductions, when they exist, have interesting and powerful consequences for lifting hardness into ``useful'' hardness, namely cryptography.
Our...
An n/2 byzantine node tolerated blockchain sharding approach
Yibin Xu, Yangyu Huang
Cryptographic protocols
Traditional Blockchain Sharding approaches can only tolerate up to n/3 of nodes being adversarial because they rely on the hypergeometric distribution to make a failure (where an adversary that does not control n/3 of the nodes globally can still manipulate the consensus of a single Shard) hard to happen. The system must maintain a large Shard size (the number of nodes inside a Shard) to sustain the low failure probability, so that only a small number of Shards may exist. In this paper, we present a new approach of...
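A small calculation in the spirit of this argument (assumed example sizes, not the paper's parameters): the hypergeometric tail gives the probability that a uniformly filled shard of size m contains at least m/3 adversarial nodes.

```python
from math import ceil
from scipy.stats import hypergeom

def shard_failure_prob(n, t, m, threshold=1/3):
    """P[#adversarial nodes in a uniformly chosen shard of size m >= threshold*m],
    with t adversarial nodes among n in total (hypergeometric tail)."""
    k = ceil(threshold * m)
    return float(hypergeom.sf(k - 1, n, t, m))

n = 2000
for frac, m in [(1/4, 100), (1/4, 400), (1/3, 400)]:
    print(f"t = {frac:.2f}*n, shard size {m}: failure prob = "
          f"{shard_failure_prob(n, int(frac * n), m):.2e}")
```

With a quarter of the nodes adversarial the failure probability only becomes negligible for large shards, and at t close to n/3 it stays large regardless, which is the limitation the abstract starts from.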
Secure k-ish nearest neighbors classifier
Hayim Shaul, Dan Feldman, Daniela Rus
Applications
The $k$-nearest neighbors ($k$NN) classifier predicts a class of a query, $q$, by taking the majority class of its $k$ neighbors in an existing (already classified) database, $S$.
In secure $k$NN, $q$ and $S$ are owned by two different parties and $q$ is classified without sharing data.
In this work we present a classifier based on $k$NN that is more efficient to implement with homomorphic encryption (HE).
The efficiency of our classifier comes from a relaxation we make to consider $\kappa$...
NIZK from LPN and Trapdoor Hash via Correlation Intractability for Approximable Relations
Zvika Brakerski, Venkata Koppula, Tamer Mour
Cryptographic protocols
We present new non-interactive zero-knowledge argument systems (NIZK), based on standard assumptions that were previously not known to imply it. In particular, we rely on the hardness of both the learning parity with noise (LPN) assumption, and the existence of trapdoor hash functions (TDH, defined by Döttling et al., Crypto 2019). Such TDH can be based on a number of standard assumptions, including DDH, QR, DCR, and LWE. We revisit the correlation intractability (CI) framework for...
SynFi: Automatic Synthetic Fingerprint Generation
M. Sadegh Riazi, Seyed M. Chavoshian, Farinaz Koushanfar
Applications
Authentication and identification methods based on human fingerprints are ubiquitous in several systems ranging from government organizations to consumer products. The performance and reliability of such systems directly rely on the volume of data on which they have been verified. Unfortunately, a large volume of fingerprint databases is not publicly available due to many privacy and security concerns. In this paper, we introduce a new approach to automatically generate high-fidelity...
Hardness of LWE on General Entropic Distributions
Zvika Brakerski, Nico Döttling
Foundations
The hardness of the Learning with Errors (LWE) problem is by now a cornerstone of the cryptographic landscape. In many of its applications the so called ``LWE secret'' is not sampled uniformly, but comes from a distribution with some min-entropy. This variant, known as ``Entropic LWE'', has been studied in a number of works, starting with Goldwasser et al. (ICS 2010). However, so far it was only known how to prove the hardness of Entropic LWE for secret distributions supported inside a ball...
Analysis on Aigis-Enc: asymmetrical and symmetrical
Yupu Hu, Siyue Dong, Xingting Dong
Public-key cryptography
Aigis-Enc is an encryption algorithm based on asymmetrical LWE. In this algorithm, the compression process is utilized during both key generation and encryption (which is equivalent to adding some LWR noise). Then encapsulation is realized by FO transformation. It is well known that FO transformation is not considered for discussing CPA security. On the other hand, since the security reduction of LWR is hard to carry out, it is not considered for discussing the CPA security of Aigis-Enc. But...
The Influence of LWE/RLWE Parameters on the Stochastic Dependence of Decryption Failures
Georg Maringer, Tim Fritzmann, Johanna Sepúlveda
Public-key cryptography
Learning with Errors (LWE) and Ring-LWE (RLWE) problems allow the construction of efficient key exchange and public-key encryption schemes. However, while improving the security through the use of error distributions with large standard deviations, the decryption failure rate increases as well. Currently, the independence of individual coefficient failures is assumed to estimate the overall decryption failure rate of many LWE/RLWE schemes. However, previous work has shown that this...
Key Recovery from Gram-Schmidt Norm Leakage in Hash-and-Sign Signatures over NTRU Lattices
Pierre-Alain Fouque, Paul Kirchner, Mehdi Tibouchi, Alexandre Wallet, Yang Yu
Public-key cryptography
In this paper, we initiate the study of side-channel leakage in hash-and-sign lattice-based signatures, with particular emphasis on the two efficient implementations of the original GPV lattice-trapdoor paradigm for signatures, namely NIST second-round candidate Falcon and its simpler predecessor DLP. Both of these schemes implement the GPV signature scheme over NTRU lattices, achieving great speed-ups over the general lattice case. Our results are mainly threefold.
First, we identify a...
Another Look at Key Randomisation Hypotheses
Subhabrata Samajder, Palash Sarkar
Secret-key cryptography
In the context of linear cryptanalysis of block ciphers, let $p_0$ (resp. $p_1$) be the probability that a particular linear approximation holds for the right (resp. a wrong) key choice. The standard right key randomisation hypothesis states that $p_0$ is a constant $p\neq 1/2$ and the standard wrong key randomisation hypothesis states that $p_1=1/2$. Using these hypotheses, the success probability $P_S$ of the attack can be expressed in terms of the data complexity $N$. The resulting...
Cryptanalysis of Reduced-Round SipHash
Le He, Hongbo Yu
Secret-key cryptography
SipHash is a family of ARX-based MAC algorithms optimized for short inputs. Already, a lot of implementations and applications for SipHash have been proposed, whereas the cryptanalysis of SipHash still lags behind. In this paper, we study the property of truncated differential in SipHash and find out the output bits with the most imbalanced differential biases. Making use of these results, we construct distinguishers with practical complexity $2^{10}$ for SipHash-2-1 and $2^{36}$ for...
Estimating Gaps in Martingales and Applications to Coin-Tossing: Constructions and Hardness
Hamidreza Amini Khorasgani, Hemanta Maji, Tamalika Mukherjee
Foundations
Consider the representative task of designing a distributed coin-tossing protocol for $n$ processors such that the probability of heads is $X_0\in[0,1]$.
This protocol should be robust to an adversary who can reset one processor to change the distribution of the final outcome.
For $X_0=1/2$, in the information-theoretic setting, no adversary can deviate the probability of the outcome of the well-known Blum's ``majority protocol'' by more than $\frac1{\sqrt{2\pi n}}$, i.e., it is...
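A quick simulation of the quoted bound (one simple reset strategy as the adversary, not necessarily the optimal one):

```python
import math
import random

def blum_majority_with_reset(n, trials=100_000):
    """Each processor broadcasts a fair coin; the outcome is the majority (n odd).
    Adversary strategy (one of many): if tails currently wins by exactly one vote,
    reset one tails-processor and let it re-flip."""
    heads_wins = 0
    for _ in range(trials):
        coins = [random.getrandbits(1) for _ in range(n)]
        if sum(coins) == n // 2:                             # heads loses by one vote
            coins[coins.index(0)] = random.getrandbits(1)    # reset one tails coin
        heads_wins += sum(coins) > n // 2
    return heads_wins / trials

n = 101
bias = blum_majority_with_reset(n) - 0.5
print(f"empirical bias = {bias:.4f}, bound 1/sqrt(2*pi*n) = {1 / math.sqrt(2 * math.pi * n):.4f}")
```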
A Short Note on a Weight Probability Distribution Related to SPNs
Sondre Rønjom
Secret-key cryptography
We report on a simple technique that supports some recent developments on AES by Grassi and Rechberger and Bao, Guo and List. We construct a weight transition probability matrix related to AES that characterises fixed configurations of active bytes in differences of ciphertexts when plaintext differences are fixed to some (possibly other) configuration of active bytes. The construction is very simple and requires only a little bit of linear algebra. The derived probabilities are essentially...
Neural Network Model Assessment for Side-Channel Analysis
Guilherme Perin, Baris Ege, Lukasz Chmielewski
Applications
Leakage assessment of cryptographic implementations with side-channel analysis relies on two important assumptions: leakage model and the number of side-channel traces. In the context of profiled side-channel attacks, having these assumptions correctly defined is a sufficient first step to evaluate the security of a crypto implementation with template attacks. This method assumes that the features (leakages or points of interest) follow a univariate or multi-variate Gaussian distribution for...
On the Distribution of Quadratic Residues and Non-residues Modulo Composite Integers and Applications to Cryptography
Ferucio Laurentiu Tiplea, Sorin Iftene, George Teseleanu, Anca-Maria Nica
Public-key cryptography
We develop exact formulas for the distribution of quadratic residues and non-residues in sets of the form $a+X=\{(a+x)\bmod n\mid x\in X\}$, where $n$ is a prime or the product of two primes and $X$ is a subset of integers with given Jacobi symbols modulo prime factors of $n$. We then present applications of these formulas to Cocks' identity-based encryption scheme and statistical indistinguishability.
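A brute-force check of the kind of set studied above for a toy modulus (assumed small primes; the paper's exact formulas are not reproduced here):

```python
def jacobi(a, n):
    """Jacobi symbol (a|n) for odd n > 0, via the standard reciprocity algorithm."""
    assert n > 0 and n % 2 == 1
    a %= n
    result = 1
    while a:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

# Toy modulus n = p*q; X collects elements that are quadratic residues mod both
# prime factors, and we count residues in the shifted set a + X (assumed example).
p, q = 103, 107
n = p * q
X = [x for x in range(1, n) if jacobi(x, p) == 1 and jacobi(x, q) == 1]
a = 2025 % n
qrs = sum(1 for x in X
          if jacobi((a + x) % n, p) == 1 and jacobi((a + x) % n, q) == 1)
print(f"|X| = {len(X)}, quadratic residues in a+X: {qrs} ({qrs / len(X):.3f} of the set)")
```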
Discretisation and Product Distributions in Ring-LWE
Sean Murphy, Rachel Player
A statistical framework applicable to Ring-LWE was outlined by Murphy and Player (IACR eprint 2019/452). Its applicability was demonstrated with an analysis of the decryption failure probability for degree-1 and degree-2 ciphertexts in the homomorphic encryption scheme of Lyubashevsky, Peikert and Regev (IACR eprint 2013/293). In this paper, we clarify and extend results presented by Murphy and Player. Firstly, we make precise the approximation of the discretisation of a Normal random...
Stopping time signatures for some algorithms in cryptography
Percy Deift, Stephen D. Miller, Thomas Trogdon
Foundations
We consider the normalized distribution of the overall running times of some cryptographic algorithms, and what information they reveal about the algorithms. Recent work of Deift, Menon, Olver, Pfrang, and Trogdon has shown that certain numerical algorithms applied to large random matrices exhibit a characteristic distribution of running times, which depends only on the algorithm but are independent of the choice of probability distributions for the matrices. Different algorithms often...
A Note on Sub-Gaussian Random Variables
Benjamin M. Case, Colin Gallagher, Shuhong Gao
Foundations
A sub-Gaussian distribution is any probability distribution that has tails bounded by a Gaussian and has a mean of zero. It is well known that the sum of independent sub-Gaussians is again sub-Gaussian. This note generalizes this result to sums of sub-Gaussians that may not be independent, under the assumption that a certain conditional distribution is also sub-Gaussian. This general result is useful in the study of noise growth in (fully) homomorphic encryption schemes [CGHX19, CGGI17], and...
On MILP-Based Automatic Search for Differential Trails Through Modular Additions with Application to Bel-T
Muhammad ElSheikh, Ahmed Abdelkhalek, Amr M. Youssef
Secret-key cryptography
Modular addition is frequently used as a source of nonlinearity in many symmetric-key structures such as ARX and Lai--Massey schemes. At FSE'16, Fu et al. proposed a Mixed Integer Linear Programming (MILP)-based method to handle the propagation of differential trails through modular additions assuming
that the two inputs to the modular addition and the consecutive rounds are independent. However, this assumption does not necessarily hold. In this paper, we study the propagation of the...
A Central Limit Framework for Ring-LWE Noise Analysis
Sean Murphy, Rachel Player
This paper develops Central Limit arguments for analysing the noise in ciphertexts in two homomorphic encryption schemes that are based on Ring-LWE. The first main contribution of this paper is to present and evaluate an average-case noise analysis for the BGV scheme. Our approach relies on the recent work of Costache et al. (SAC 2023) that gives the approximation of a polynomial product as a multivariate Normal distribution. We show how this result can be applied in the BGV context and...
Exploring the Monero Peer-to-Peer Network
Tong Cao, Jiangshan Yu, Jérémie Decouchant, Xiapu Luo, Paulo Verissimo
As of September 2019, Monero is the most capitalized privacy-preserving cryptocurrency, and is ranked tenth among all cryptocurrencies. Monero’s on-chain data privacy guarantees, i.e., how mixins are selected in each transaction, have been extensively studied. However, despite Monero’s prominence, the network of peers running Monero clients has not been analyzed. Such analysis is of prime importance, since potential vulnerabilities in the peer-to-peer network may lead to attacks...
Full Database Reconstruction with Access and Search Pattern Leakage
Evangelia Anna Markatou, Roberto Tamassia
Applications
The widespread use of cloud computing has enabled several database providers to store their data on servers in the cloud and answer queries from those servers. In order to protect the confidentiality of data in the cloud, a database can be stored in encrypted form and all queries can be executed on the encrypted database. Recent research results suggest that a curious cloud provider may be able to decrypt some of the items in the database after seeing a large number of queries and their...
Achieving secure and efficient lattice-based public-key encryption: the impact of the secret-key distribution
Sauvik Bhattacharya, Oscar Garcia-Morchon, Rachel Player, Ludo Tolhuizen
Public-key cryptography
Lattice-based public-key encryption has a large number of design choices that can be combined in diverse ways to obtain different tradeoffs. One of these choices is the distribution from which secret keys are sampled. Numerous secret-key distributions exist in the state of the art, including (discrete) Gaussian, binomial, ternary, and fixed-weight ternary. Although the secret-key distribution impacts both the concrete security and the performance of the schemes, it has not been compared in a...
Leakage Certification Revisited: Bounding Model Errors in Side-Channel Security Evaluations
Olivier Bronchain, Julien M. Hendrickx, Clément Massart, Alex Olshevsky, François-Xavier Standaert
Implementation
Leakage certification aims at guaranteeing that the statistical models used in side-channel security evaluations are close to the true statistical distribution of the leakages, hence can be used to approximate a worst-case security level. Previous works in this direction were only qualitative: for a given amount of measurements available to an evaluation laboratory, they rated a model as "good enough" if the model assumption errors (i.e., the errors due to an incorrect choice of model...
Encrypted Multi-Maps with Computationally-Secure Leakage
Seny Kamara, Tarik Moataz
We initiate the study of structured encryption schemes with computationally-secure leakage. Specifically, we focus on the design of volume-hiding encrypted multi-maps; that is, of encrypted multi-maps that hide the response length to computationally-bounded adversaries. We describe the first volume-hiding STE schemes that do not rely on naive padding; that is, padding all tuples to the same length. Our first construction has efficient query complexity and storage but can be lossy. We...
Information Entropy Based Leakage Certification
Changhai Ou, Xinping Zhou, Siew-Kei Lam
Implementation
Side-channel attacks and evaluations typically utilize leakage models to extract sensitive information from measurements of cryptographic implementations. Establishing a true leakage model has remained an active area of research since Kocher proposed Differential Power Analysis (DPA) in 1999. Leakage certification plays an important role in this respect by addressing the following question: "how good is my leakage model?". However, existing leakage certification methods still need to...
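As a minimal illustration of the kind of assumed leakage model such certification is meant to validate (not this paper's method), the sketch below simulates leakage under the common Hamming-weight-plus-Gaussian-noise model; the key, intermediate values, and noise level are arbitrary.

```python
import random

def hamming_weight(x):
    return bin(x).count("1")

def simulate_trace_point(intermediate_byte, noise_sigma=1.0):
    """One simulated leakage sample: Hamming weight of the intermediate
    value plus Gaussian noise."""
    return hamming_weight(intermediate_byte) + random.gauss(0.0, noise_sigma)

# Example: leakage of a toy intermediate value, key XOR plaintext.
key = 0x3C
plaintexts = [random.randrange(256) for _ in range(5)]
print([round(simulate_trace_point(p ^ key), 2) for p in plaintexts])
```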
Secure Certification of Mixed Quantum States with Application to Two-Party Randomness Generation
Frédéric Dupuis, Serge Fehr, Philippe Lamontagne, Louis Salvail
We investigate sampling procedures that certify that an arbitrary quantum state on $n$ subsystems is close to an ideal mixed state $\varphi^{\otimes n}$ for a given reference state $\varphi$, up to errors on a few positions. This task makes no sense classically: it would correspond to certifying that a given bitstring was generated according to some desired probability distribution. However, in the quantum case, this is possible if one has access to a prover who can supply a purification of...
Estimation of the Success Probability of Random Sampling by the Gram-Charlier Approximation
Yoshitatsu Matsuda, Tadanori Teruya, Kenji Kasiwabara
Public-key cryptography
The lattice basis reduction algorithm is a method for solving the Shortest Vector Problem (SVP) on lattices. There are many variants of the lattice basis reduction algorithm such as LLL, BKZ, and RSR. Though BKZ has been used most widely, it has recently been shown that some variants of RSR are quite efficient for solving a high-dimensional SVP (they achieved many best scores in the TU Darmstadt SVP challenge). RSR repeats alternately the generation of new very short lattice vectors from the current...
Mind the Gap - A Closer Look at the Security of Block Ciphers against Differential Cryptanalysis
Ralph Ankele, Stefan Kölbl
Secret-key cryptography
Resistance against differential cryptanalysis is an important design criterion for any modern block cipher, and most designs rely on finding some upper bound on the probability of single differential characteristics. However, already at EUROCRYPT'91, Lai et al. observed that differential cryptanalysis rather uses differentials instead of single characteristics.
In this paper, we consider exactly the gap between these two approaches and investigate this gap in the context of recent lightweight...
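The gap in question starts from tables such as the one computed by this short sketch: the difference distribution table (DDT) of a 4-bit S-box, from which single-characteristic probabilities are read off, whereas a differential aggregates all characteristics sharing its input and output difference. The sketch is illustrative only and not taken from the paper; the PRESENT S-box is used purely as a familiar example.

```python
# DDT of a 4-bit S-box; entry [a][b] counts inputs x with S(x) ^ S(x ^ a) = b.
PRESENT_SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def ddt(sbox):
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for a in range(n):              # input difference
        for x in range(n):
            table[a][sbox[x] ^ sbox[x ^ a]] += 1
    return table

T = ddt(PRESENT_SBOX)
print(T[0][0])                                  # 16: zero difference maps to zero
print(max(max(row) for row in T[1:]))           # largest nontrivial entry (probability = entry/16)
```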
PRank: Fast Analytical Rank Estimation via Pareto Distributions
Liron David, Avishai Wool
Secret-key cryptography
Rank estimation is an important tool for side-channel evaluation laboratories. It allows estimating the remaining security after an attack has been performed, quantified as the time complexity and the memory consumption required to brute force the key given the leakages as probability distributions over $d$ subkeys (usually key bytes). These estimations are particularly useful where the key is not reachable with exhaustive search. We propose a new method called PRank for rank estimation,...
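For intuition, the sketch below computes the exact rank by naive enumeration for two toy 4-bit subkeys; analytical estimators such as the one proposed above exist precisely because this brute-force count is infeasible for realistic key sizes. The subkey distributions here are synthetic and purely illustrative.

```python
import itertools
import random

random.seed(0)
SUBKEY_SPACE = 16          # toy subkeys of 4 bits each
correct = (3, 11)

def synthetic_distribution(correct_value):
    """A made-up attack output: probabilities over one subkey, biased
    towards the correct value."""
    scores = [random.random() for _ in range(SUBKEY_SPACE)]
    scores[correct_value] += 1.0
    total = sum(scores)
    return [s / total for s in scores]

dists = [synthetic_distribution(c) for c in correct]

# Rank = number of full-key candidates at least as likely as the correct key.
p_correct = dists[0][correct[0]] * dists[1][correct[1]]
rank = sum(1 for k0, k1 in itertools.product(range(SUBKEY_SPACE), repeat=2)
           if dists[0][k0] * dists[1][k1] >= p_correct)
print("rank of the correct key:", rank)
```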
Continuous-Source Fuzzy Extractors: Source uncertainty and security
Benjamin Fuller, Lowen Peng
Secret-key cryptography
Fuzzy extractors (Dodis et al., Eurocrypt 2004) convert repeated noisy readings of a high-entropy source into the same uniformly distributed key. The functionality of a fuzzy extractor outputs the key when provided with a value close to the original reading of the source. A necessary condition for security, called fuzzy min-entropy, is that the probability of every ball of values of the noisy source is small.
Many noisy sources are best modeled using continuous metric spaces. To build...
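As a toy illustration of the fuzzy min-entropy condition mentioned above (for a small discrete source rather than the continuous setting this paper targets), the sketch below brute-forces $-\log_2 \max_y \Pr[W \in B_t(y)]$ over 6-bit strings with Hamming-distance balls; the source distribution is arbitrary.

```python
import itertools
import math

n, t = 6, 1
points = list(itertools.product([0, 1], repeat=n))

# Arbitrary toy source: heavier weight on low-Hamming-weight readings.
weights = [2.0 ** (-sum(p)) for p in points]
total = sum(weights)
prob = {p: w / total for p, w in zip(points, weights)}

def ball_mass(center):
    """Total probability of the Hamming ball of radius t around `center`."""
    return sum(q for p, q in prob.items()
               if sum(a != b for a, b in zip(p, center)) <= t)

fuzzy_min_entropy = -math.log2(max(ball_mass(y) for y in points))
print("fuzzy min-entropy (bits):", round(fuzzy_min_entropy, 3))
```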
Masking the GLP Lattice-Based Signature Scheme at Any Order
Gilles Barthe, Sonia Belaïd, Thomas Espitau, Pierre-Alain Fouque, Benjamin Grégoire, Mélissa Rossi, Mehdi Tibouchi
Recently, numerous physical attacks have been demonstrated against lattice-based schemes, often exploiting their unique properties such as the reliance on Gaussian distributions, rejection sampling and FFT-based polynomial multiplication. As the call for concrete implementations and deployment of postquantum cryptography becomes more pressing, protecting against those attacks is an important problem. However, few countermeasures have been proposed so far. In particular, masking has been...
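As background on the countermeasure discussed in this entry, the sketch below shows plain additive (arithmetic) masking of a single coefficient modulo $q$: the secret is split into $d+1$ uniform shares whose sum recovers it, so any $d$ shares are jointly independent of the secret. This is generic masking, not the paper's GLP-specific gadgets; the modulus is an arbitrary example.

```python
import random

q = 8380417          # an arbitrary example modulus

def mask(secret, order):
    """Return order+1 additive shares of `secret` modulo q."""
    shares = [random.randrange(q) for _ in range(order)]
    shares.append((secret - sum(shares)) % q)
    return shares

def unmask(shares):
    return sum(shares) % q

s = 12345
shares = mask(s, order=3)
assert unmask(shares) == s
print(shares)
```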