To receive the announcements of this seminar by e-mail, you can subscribe to the
crypto mailing list.
Upcoming Seminars
Thursday, March 28, 2013: Itai Dinur
The Weizmann Institute - Israel
10:30 a.m. in the Amphi Évariste Galois - NIR
Collision Attacks on Up to 5 Rounds of SHA-3 Using Generalized Internal Differentials
- Abstract
- On October 2nd, 2012, NIST announced its selection of the Keccak scheme as the new SHA-3 hash standard. In this talk I will present the first published collision finding attacks on reduced-round versions of Keccak-384 and Keccak-512, providing actual collisions for 3-round versions, and describing an attack which is $2^{45}$ times faster than birthday attacks for 4-round Keccak-384. For Keccak-256, we increase the number of rounds which can be attacked to 5. All these results are based on a generalization of the internal differential attack (introduced by Peyrin at Crypto 2010), which we use to map a large number of Keccak inputs into a relatively small subset of possible outputs with a surprisingly large probability. In such a squeeze attack it is easier to find random collisions in the reduced target subset by a standard birthday argument.
Joint work with Orr Dunkelman and Adi Shamir.
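To illustrate the birthday step of such a squeeze attack, here is a minimal Python sketch that restricts a toy hash (a 24-bit truncation of SHA-256, standing in for reduced-round Keccak) to a small output subset and looks for collisions inside it; the internal differential machinery that actually steers inputs into the subset is not modelled.

```python
# A toy "squeeze"-style birthday search (illustration only): a 24-bit
# truncation of SHA-256 replaces reduced-round Keccak, and we simply filter
# for outputs in a small subset instead of steering inputs into it with
# internal differentials as in the talk.
import hashlib
import itertools

def h(msg: bytes) -> bytes:
    return hashlib.sha256(msg).digest()[:3]        # 24-bit toy hash

seen = {}
tried = 0
for counter in itertools.count():
    msg = counter.to_bytes(8, "big")
    d = h(msg)
    if d[0] != 0:            # restrict to the small subset: top byte = 0x00
        continue
    tried += 1
    if d in seen:
        print(f"collision after {tried} subset members:",
              seen[d].hex(), msg.hex(), "->", d.hex())
        break
    seen[d] = msg
```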
Friday, April 5, 2013: Nico Döttling
Karlsruher Institut für Technologie - Germany
10:30 a.m. in Room U/V - Level -2 - Passage Rouge
Lossy Codes and a New Variant of the Learning-With-Errors Problem
- Abstract
- The hardness of the Learning With Errors (LWE) problem has become one of the most useful assumptions in cryptography. Many applications, such as public-key cryptography, hierarchical identity-based encryption, and fully homomorphic encryption, can be based on this assumption. Furthermore, the LWE problem admits a worst-to-average-case reduction, which makes the LWE assumption very plausible.
This worst-to-average-case reduction is based on a Fourier argument, and therefore the error distribution in current applications of LWE has to be Gaussian. Sampling from a Gaussian distribution is cumbersome, so it is highly desirable to prove worst-to-average-case reductions for other distributions, in particular for the uniform distribution, which can be sampled very efficiently.
We present the first worst-to-average-case reduction for an LWE problem with uniformly distributed errors. This new worst-to-average-case connection comes with a slight drawback: we need to use a bounded variant of the LWE problem, in which the number of samples to learn from is fixed in advance. The proof is based on a new tool called lossy codes, which might be of interest in the context of other lattice- and coding-based hardness assumptions.
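For reference, the sketch below generates a bounded-sample LWE instance with uniformly distributed errors; the dimensions, modulus and error bound are illustrative placeholders, not the parameters analysed in the talk.

```python
# Minimal sketch of a bounded-sample LWE instance with uniform errors.
# Parameters are illustrative placeholders, not those from the paper.
import numpy as np

n, m, q, B = 64, 128, 4093, 4   # dimension, fixed number of samples, modulus, error bound

rng = np.random.default_rng(0)
s = rng.integers(0, q, size=n)               # secret
A = rng.integers(0, q, size=(m, n))          # public matrix (m is fixed in advance)
e = rng.integers(-B, B + 1, size=m)          # errors drawn uniformly from [-B, B]
b = (A @ s + e) % q                          # LWE samples (A, b)

# Search LWE: recover s from (A, b).  With Gaussian errors a worst-case
# reduction is classical; the talk gives one for uniform errors when the
# number of samples m is fixed in advance.
print(A.shape, b.shape)
```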
Past Seminars
Program for the first term of 2012/2013
Thursday, September 27, 2012: PhD defense of Olivier Blazy
ENS
10:00 a.m. in the Amphi Évariste Galois - NIR
Interactive and non-interactive proofs of knowledge.
Jury:
- David Pointcheval (École normale supérieure / CNRS, thesis advisor)
- Jean-Sébastien Coron (Université du Luxembourg, reviewer)
- Fabien Laguillaumie (Université Lyon 1 / IFSA, reviewer)
- Marc Fischlin (Université de Darmstadt, reviewer)
- Michel Abdalla (École normale supérieure / CNRS, examiner)
- Eike Kiltz (Université de Bochum, examiner)
- Antoine Joux (Université de Versailles Saint-Quentin, examiner)
- Damien Vergnaud (École normale supérieure, examiner)
Summary:
In this thesis, we propose and use new building blocks that lead to efficient protocols within a modular approach to cryptography: first in a non-interactive setting, relying on Groth-Sahai proofs, and then with interaction, by enabling Smooth Projective Hash Functions to handle new languages.
First, this thesis builds on the methodology introduced by Groth and Sahai for non-interactive proofs of knowledge to develop several group signature protocols in the standard model, and then blind signatures. To this end, we propose a system for signing a ciphertext in such a way that, given knowledge of a secret independent of the signature protocol, it is possible to recover a signature on the plaintext. Among other things, this approach allows us to propose a blind signature protocol with an optimal number of interactions, at the end of which the requester can use an ordinary signature, under classical assumptions in the standard model.
We then propose a new methodology for implicit proofs of knowledge in an interactive setting without random oracles. To do so we use smooth projective hash functions, first to build Oblivious Signature-Based Envelopes, and then in authentication and key agreement protocols. In the process we refine the notion of language and greatly broaden the spectrum of languages that these SPHFs can handle. This result lets us introduce the concept of LAKE (Language Authenticated Key Exchange): a way for two users to agree on a key provided each holds a secret satisfying a constraint expected by the other. We then show how to instantiate several key exchange protocols in this framework more efficiently than with existing techniques, and we prove the security of our instantiations in the UC model under standard assumptions.
Abstract:
In this thesis, we create new building blocks and use them to present new efficient protocols via a modular design.
We begin by using the Groth-Sahai methodology for non-interactive proofs to design various group signature protocols in the standard model. We also present a new approach that allows one to sign a ciphertext and then, given knowledge of a secret independent of the signature protocol, to recover a signature on the underlying plaintext. This creates a form of commutativity between signature and encryption, in which decrypting a signature on a ciphertext yields a signature on the associated plaintext. This approach allows us to build a round-optimal blind signature scheme in which the user ultimately obtains a regular signature. We prove the security of this construction under classical assumptions in the standard model.
We then present a new methodology for implicit proofs of knowledge in an interactive setting without random oracles. For that we use Smooth Projective Hash Functions, first to instantiate Oblivious Signature-Based Envelope schemes, and then to build authenticated key exchange schemes. Along the way we further refine the notion of language and greatly widen the set of languages manageable via SPHFs. This last result allows us to introduce the concept of LAKE (Language Authenticated Key Exchange), a new AKE design in which two users can share a common key if each possesses a secret word in a language expected by the other. We then show how to build standard AKE schemes (such as Password Authenticated Key Exchange) in our framework, and show that our design is more efficient than pre-existing solutions. We prove the security of our design in the UC framework under standard assumptions.
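To make the smooth projective hashing used throughout the thesis concrete, here is a minimal sketch of the classical SPHF for the language of Diffie-Hellman pairs {(g^r, h^r)}, with deliberately tiny, insecure parameters chosen only for the example: the hash computed from the hashing key and the one computed from the projection key together with a witness agree exactly on words of the language.

```python
# Toy SPHF for the Diffie-Hellman language {(g^r, h^r)} in a tiny group.
# Parameters are deliberately small and insecure; illustration only.
import secrets

p = 23          # modulus; the multiplicative group has a subgroup of order 11
q = 11          # order of the subgroup
g, h = 4, 9     # two generators of the order-11 subgroup

# Hashing key and the corresponding (public) projection key.
k1, k2 = secrets.randbelow(q), secrets.randbelow(q)
hp = (pow(g, k1, p) * pow(h, k2, p)) % p

# A word in the language, together with its witness r.
r = secrets.randbelow(q)
word = (pow(g, r, p), pow(h, r, p))

# Hash computed with the hashing key (no witness needed) ...
hash_hk = (pow(word[0], k1, p) * pow(word[1], k2, p)) % p
# ... and hash computed with the projection key plus the witness r.
hash_hp = pow(hp, r, p)

assert hash_hk == hash_hp   # equal exactly because the word is in the language
print(hash_hk, hash_hp)
```

For a word outside the language, the value computed with the hashing key is statistically independent of the projection key; this smoothness property is what the implicit proofs of knowledge described above rely on.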
Wednesday, October 17, 2012: Adi Shamir
The Weizmann Institute - Israel
2:00 p.m. in the Amphi Évariste Galois - NIR
Improved Algorithms for Composite Search Problems
Thursday, October 18, 2012: Hoeteck Wee
George Washington University
10:30 a.m. in the Amphi Évariste Galois - NIR
Functional Encryption with Bounded Collusions via Multi-Party Computation
- Abstract
- We construct functional encryption schemes for polynomial-time computable functions secure against an a-priori bounded polynomial number of collusions. Our constructions require only semantically secure public-key encryption schemes and pseudorandom generators computable by small-depth circuits (known to be implied by most concrete intractability assumptions). For certain special cases such as predicate encryption schemes with public index, the construction requires only semantically secure encryption schemes.
Along the way, we show a "bootstrapping theorem" that builds a q-query functional encryption scheme for arbitrary functions starting from a q-query functional encryption scheme for bounded-degree functions. All our constructions rely heavily on techniques from secure multi-party computation and randomized encodings.
Our constructions are secure under a strong simulation-based definition of functional encryption.
Joint work with Sergey Gorbunov and Vinod Vaikuntanathan.
Thursday, November 15, 2012: David Xiao
LIAFA, Université Paris Diderot
10:30 a.m. in the Amphi Évariste Galois - NIR
Languages with Zero-Knowledge PCPs are in SZK
- Abstract
- Probabilistically Checkable Proofs (PCPs) allow a verifier to check the validity of a proof by querying very few random positions in the proof string. Zero Knowledge (ZK) Proofs allow a prover to convince a verifier of a statement without revealing any information beyond the validity of the statement. We study for what class of languages it is possible to achieve both, namely to build ZK-PCPs, where additionally we require that the proof be generated efficiently. Such ZK-PCPs could potentially be useful for building UC-secure protocols in the tamper-proof token model.
We show that all languages with efficient statistical ZK-PCPs (i.e. where the ZK property holds against unbounded verifiers) must be in SZK (the class of languages with interactive statistical ZK proofs). We do this by reducing any ZK-PCP to an instance of the Conditional Entropy Approximation problem, which is known to be in SZK. This implies in particular that such ZK-PCPs are unlikely to exist for NP-complete problems such as SAT.
This is joint work with Mohammad Mahmoody.
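The following toy Python sketch only illustrates the local-query flavour of a PCP verifier, checking one random clause of a made-up 3-SAT instance; it is not a real PCP (no proof encoding or soundness amplification) and certainly not zero-knowledge.

```python
# Toy illustration of the PCP query model: the verifier reads only the three
# proof positions touched by one random clause.  Not a real PCP and not
# zero-knowledge; the formula and assignment are made up for the demo.
import random

# A 3-CNF formula: each clause is a list of signed variable indices,
# e.g. -2 means "NOT x2" (variables are 1-indexed).
formula = [[1, -2, 3], [-1, 2, 4], [2, -3, -4]]

# The "proof string" is simply a truth assignment, one bit per variable.
proof = {1: True, 2: True, 3: False, 4: True}

def verifier(formula, proof, rng=random):
    clause = rng.choice(formula)                              # one random clause ...
    queried = {abs(lit): proof[abs(lit)] for lit in clause}   # ... query only 3 positions
    return any(queried[abs(lit)] == (lit > 0) for lit in clause)

# A correct proof is always accepted; a single clause check catches a wrong
# assignment only with some probability, hence real PCPs need encodings and
# amplification.
print(all(verifier(formula, proof) for _ in range(20)))
```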
Wednesday, November 21, 2012: PhD defense of Roch Lescuyer
Orange Labs & ENS
10:00 a.m. in Room W
Cryptographic tools for anonymous credentials
Jury:
- David Pointcheval (CNRS Senior Researcher (DR), ENS), advisor
- Louis Goubin (Professor, Université de Versailles), reviewer
- Marc Joye (Research Engineer, HDR, Technicolor), reviewer
- Sébastien Canard (Research Engineer, HDR, Orange Labs), examiner
- Fabien Laguillaumie (Professor, Université Lyon I), examiner
- David Lubicz (Research Engineer, HDR, DGA-MI, associate researcher at IRMAR, Université Rennes I), examiner
- David Naccache (Professor, Université Paris II, ENS Paris), examiner
- Damien Vergnaud (Associate Professor, ENS Paris), examiner
Summary:
One of the roles of modern cryptography is to provide authentication for access to digital services. In this context, the traceability of users is often the other side of the coin. To address this major privacy issue while maintaining access-rights policies, it is desirable to reconcile authentication with anonymity. Among the tools cryptography offers for this purpose, anonymous credentials allow the anonymous use of certified attributes to access a service in an authenticated way. This thesis presents several contributions to the field of anonymous credentials.
We first propose to use sanitizable signatures in the context of anonymous credentials. These signatures allow the controlled modification, by an authorized person, of a signed document after the signature has been generated. Here we propose to use such sanitizable signatures to control which certified personal data are given to the service provider during the authentication protocol. We then propose to use aggregate signatures in the context of anonymous credentials. Aggregate signatures allow several individual signatures to be combined into an aggregate of constant size; their use in anonymous credentials thus simplifies the use of several credentials within a single authentication protocol. Finally, we answer an open problem in the affirmative by exhibiting a multi-show construction of an anonymous credential system in which the certified attributes are encrypted.
Abstract:
Modern cryptography aims, among other things, to provide authentication mechanisms for access to digital services. In this context, the traceability of users is often the flip side of the coin. To address this major privacy issue while maintaining access-rights policies, it may be desirable to combine authentication and anonymity. Among the tools offered by cryptography, anonymous credentials allow the anonymous use of certified attributes to access services. This thesis presents several contributions to the field of anonymous credentials.
Firstly, we propose to use sanitizable signatures in the context of anonymous credentials. Sanitizable signatures allow the modification, by an authorized person, of a signed document after the signature has been generated. We propose to control, through the use of these sanitizable signatures, which certified personal data are revealed to the service provider during an authentication protocol. We then propose to use aggregate signatures within anonymous credential systems. Aggregate signatures allow several individual signatures to be combined into an aggregate of constant size; their use in an anonymous credential system simplifies the use of multiple credentials within the same authentication protocol. Finally, we answer an open problem in the affirmative by presenting a construction of a multi-show anonymous credential system in which the certified attributes are encrypted.
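Sanitizable signatures are commonly built from chameleon (trapdoor) hashes: the signer signs the chameleon hash of the modifiable block, and the holder of the trapdoor can later swap that block without invalidating the signature. The sketch below shows only this chameleon-hash step, in the Krawczyk-Rabin discrete-log style and with tiny insecure parameters; it is not the construction from the thesis.

```python
# Toy chameleon (trapdoor) hash, a standard building block behind sanitizable
# signatures: the signer signs CH(m, r); whoever knows the trapdoor can replace
# m by m' and publish a matching r'.  Tiny, insecure parameters; demo only.
import secrets

p = 1019                                   # p = 2q + 1, both prime (toy sizes)
q = 509
g = 4                                      # generator of the order-q subgroup
x = secrets.randbelow(q - 1) + 1           # trapdoor held by the sanitizer
h = pow(g, x, p)                           # public chameleon-hash key

def ch(m: int, r: int) -> int:
    return (pow(g, m % q, p) * pow(h, r, p)) % p

# Signer: hash the sanitizable block and (conceptually) sign the digest.
m, r = 1234, secrets.randbelow(q)
digest = ch(m, r)

# Sanitizer: pick a new block m' and use the trapdoor to keep the same digest.
m_new = 9999
r_new = (r + (m - m_new) * pow(x, -1, q)) % q
assert ch(m_new, r_new) == digest          # same digest, so the signature still verifies
print(digest, ch(m_new, r_new))
```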
Program for the second term of 2012/2013
Thursday, February 21, 2013: Christophe Petit
UCL, Belgium
10:30 a.m. in the Amphi Évariste Galois - NIR
On polynomial systems arising from a Weil descent
- Abstract
- Polynomial systems of equations appearing in cryptography tend to have special structures that simplify their resolution. In this talk, we discuss a class of polynomial systems that arise when a multivariate polynomial equation over an extension field is rewritten as a system of polynomial equations over the ground prime field (a technique commonly called a Weil descent); a toy example of such a descent is sketched at the end of this entry.
We provide theoretical and experimental evidence that the degrees of regularity of these systems are very low, in fact only slightly larger than the maximal degrees of the equations.
We then discuss cryptographic applications of (particular instances of) these systems to the hidden field equation (HFE) cryptosystem, to the factorization problem in SL(2,2^n) and to the elliptic curve discrete logarithm over binary fields. In particular, we show (under a classical heuristic assumption in algebraic cryptanalysis) that an elliptic curve index calculus algorithm due to Claus Diem has subexponential time complexity $O(2^{c\,n^{2/3}\log n})$ over the binary field GF(2^n), where c is a constant smaller than 2.
Based on joint work with Jean-Charles Faugère, Ludovic Perret, Jean-Jacques Quisquater and Guénaël Renault.
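As a toy illustration of the Weil descent mentioned above, the following sympy sketch rewrites the single equation x^2 = t over GF(4) = GF(2)[t]/(t^2 + t + 1) as two polynomial equations over GF(2); the equation is made up for the demo, whereas the systems studied in the talk come from much larger fields GF(2^n).

```python
# Toy Weil descent from GF(4) = GF(2)[t]/(t^2 + t + 1) down to GF(2).
from sympy import symbols, expand, Poly

t, x0, x1 = symbols('t x0 x1')

x = x0 + x1 * t                          # unknown of GF(4) over the basis {1, t}
f = expand(x**2 + t)                     # the equation x^2 + t = 0 over GF(4)
f = expand(f.subs(t**2, t + 1))          # reduce with the defining relation t^2 = t + 1

def to_gf2(e):
    """Reduce a coefficient polynomial to GF(2), using x_i^2 = x_i."""
    e = expand(e).subs({x0**2: x0, x1**2: x1})
    return Poly(e, x0, x1, modulus=2).as_expr()

eq_const = to_gf2(f.coeff(t, 0))         # component along 1
eq_t     = to_gf2(f.coeff(t, 1))         # component along t

# Two equations over GF(2) in the binary unknowns x0, x1:
print(eq_const, "= 0")                   # x0 + x1 = 0
print(eq_t, "= 0")                       # x1 + 1 = 0  ->  x0 = x1 = 1, i.e. x = t + 1
```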
Thursday, March 14, 2013: Gregory Neven
IBM Zurich, Switzerland
10:00 a.m. in the Amphi Évariste Galois - NIR
One Person, One Password: Practical Yet Universally Composable Two-Server Password-Authenticated Secret Sharing
- Abstract
- Password-authenticated secret sharing (PASS) schemes, first introduced by Bagherzandi et al. at CCS 2011, allow users to distribute data among several servers so that the data can be recovered using a single human-memorizable password, but no single server (nor any collusion of servers up to a certain size) can mount an off-line dictionary attack on the password or learn anything about the data. We propose a new, universally composable (UC) security definition for the two-server case (2PASS) in the public-key setting that addresses a number of relevant limitations of the previous, non-UC definition. For example, our definition makes no prior assumptions on the distribution of passwords, preserves security when honest users mistype their passwords, and guarantees secure composition with other protocols in spite of the unavoidable non-negligible success rate of online dictionary attacks. We further present a concrete 2PASS protocol and prove that it meets our definition. Given the strong security guarantees, our protocol is surprisingly efficient: in its most efficient instantiation under the DDH assumption in the random-oracle model, it requires fewer than twenty elliptic-curve exponentiations on the user’s device. We achieve our results by careful protocol design and by exclusively focusing on the two-server public-key setting.
Joint work with Jan Camenisch and Anna Lysyanskaya, appeared at ACM CCS 2012.
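As background for the "no single server learns anything about the data" requirement, here is the elementary 2-out-of-2 sharing of the stored data in Python; the actual 2PASS protocol additionally binds retrieval to the password, prevents off-line dictionary attacks and comes with a UC proof, none of which this toy captures.

```python
# Elementary 2-out-of-2 secret sharing of the stored data (background only:
# the real 2PASS protocol also authenticates retrieval with a password and
# resists off-line dictionary attacks, which this toy does not attempt).
import secrets

def split(data: bytes):
    share1 = secrets.token_bytes(len(data))                 # uniformly random
    share2 = bytes(a ^ b for a, b in zip(data, share1))     # data XOR share1
    return share1, share2                                   # one share per server

def reconstruct(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

secret = b"account recovery key"
s1, s2 = split(secret)
assert reconstruct(s1, s2) == secret
# Each share alone is a uniformly random string, so a single (honest-but-curious)
# server learns nothing about the data from its share.
print(s1.hex(), s2.hex(), sep="\n")
```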
Thursday, March 14, 2013: Hoeteck Wee
George Washington University
11:00 a.m. in the Amphi Évariste Galois - NIR
Attribute-Based Encryption for Circuits
- Abstract
- In an attribute-based encryption (ABE) scheme, a ciphertext is associated with a descriptive value x, a secret key is associated with a function f, and the secret key decrypts the ciphertext to recover the plaintext iff f(x) = 1. Moreover, the scheme should be secure against collusions of users: given secret keys for polynomially many functions, an adversary learns nothing about the message if none of the secret keys can individually decrypt the ciphertext. A toy sketch of this interface is given after this entry.
We present ABE schemes for arbitrary polynomial size circuits, where the public parameters and ciphertext grow linearly with the depth of the circuit. Our construction is secure under the standard learning with errors (LWE) assumption. Previous constructions of attribute-based encryption were for Boolean formulas, captured by the complexity class NC1.
In the course of our construction, we present a novel framework for constructing ABE schemes based on a new primitive, a two-to-one recoding scheme.
Joint work with Sergey Gorbunov and Vinod Vaikuntanathan.
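To fix the syntax, here is a toy mock of the ABE interface described above: keys are issued for a function f, ciphertexts carry an attribute x, and decryption succeeds exactly when f(x) = 1. It illustrates functionality only, with no security, and bears no relation to the LWE-based construction of the talk.

```python
# Toy mock of the ABE interface only: keys for a function f, ciphertexts
# labelled with an attribute string x, decryption exactly when f(x) = 1.
# No security whatsoever; the real construction in the talk is based on LWE
# and two-to-one recodings.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Ciphertext:
    attribute: str
    payload: str          # in a real scheme this would be hidden

@dataclass
class SecretKey:
    f: Callable[[str], int]

def encrypt(attribute: str, message: str) -> Ciphertext:
    return Ciphertext(attribute, message)

def keygen(f: Callable[[str], int]) -> SecretKey:
    return SecretKey(f)

def decrypt(sk: SecretKey, ct: Ciphertext) -> Optional[str]:
    return ct.payload if sk.f(ct.attribute) == 1 else None

# A key for "the attribute contains the substring 'admin'".
sk = keygen(lambda x: int("admin" in x))
print(decrypt(sk, encrypt("role=admin;dept=crypto", "secret report")))  # decrypts
print(decrypt(sk, encrypt("role=guest", "secret report")))              # returns None
```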