|Speaker:||Craig Gentry (IBM Research)|
|Title:||Computing on Encrypted Data|
|Abstract:||What if you want to query a search engine, but don't want to tell the search engine what you are looking for? Is there a way that you can encrypt your query, such that the search engine can process your query without your decryption key, and send back an (encrypted) response that is well-formed and concise (up to some upper bound on length that you specify)? The answer is yes, if you use a "fully homomorphic" encryption scheme. As another application, you can store your encrypted data in the "cloud", and later ask the server to retrieve only those files that contain a particular (Boolean) combination of keywords, without the server being able to "see" either these keywords or your files.
We will present a recent fully homomorphic encryption scheme. In particular, we will highlight the main ideas of the construction, discuss issues concerning the scheme's performance, and mention other applications.
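For intuition, here is a minimal sketch (not from the talk, and not secure) of the kind of homomorphic property such schemes build on: textbook RSA lets anyone multiply two ciphertexts, without the private key, to obtain an encryption of the product of the plaintexts. A fully homomorphic scheme supports both addition and multiplication on ciphertexts, and hence arbitrary computation.

```python
# Toy illustration only: textbook RSA is multiplicatively homomorphic.
# A server holding only ciphertexts and the public key can compute an
# encryption of m1 * m2. (Fully homomorphic encryption additionally
# supports addition, enabling arbitrary circuits over encrypted data.)

p, q = 61, 53                       # tiny primes, for illustration only
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 12
c_prod = (enc(m1) * enc(m2)) % n    # computed from ciphertexts alone
assert dec(c_prod) == (m1 * m2) % n # decrypts to the product, 84
```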
|Biography:||Craig Gentry is a research staff member in the Cryptography group at IBM T.J. Watson Research Center. His research tends towards the mathematical side of applied cryptography, both constructive (designing efficient and highly-functional cryptosystems) and destructive (cryptanalysis). From 2000 to 2005, he worked as a senior research engineer at DoCoMo USA Labs on the security and cryptography project. He recently obtained his Ph.D. in computer science from Stanford, with Dan Boneh as his advisor.|
|Speaker:||Adrian Perrig (Carnegie Mellon University)|
|Title:||Building Secure Networked Systems with Code Attestation|
|Abstract:||Attestation is a promising approach for building secure systems. The Trusted Platform Module (TPM), recently developed by the Trusted Computing Group (TCG) and now being deployed in common laptop and desktop platforms, is fueling research in attestation mechanisms. In this talk, I will present approaches for building secure systems with advanced TPM architectures. In particular, we have designed an approach for fine-grained attestation that enables the design of efficient secure distributed systems and other network protocols. We demonstrate this approach by designing a secure routing protocol.|
|Biography:||Adrian Perrig is a Professor in Electrical and Computer Engineering, Engineering and Public Policy, and Computer Science at Carnegie Mellon University. Adrian also serves as the technical director for Carnegie Mellon's Cybersecurity Laboratory (CyLab) and for the iCast project. He earned his Ph.D. in Computer Science from Carnegie Mellon University, spending three years of his doctoral studies at the University of California at Berkeley, where he worked with his advisor Doug Tygar. He received his B.Sc. degree in Computer Engineering from the Swiss Federal Institute of Technology in Lausanne (EPFL). Adrian's research revolves around building secure systems and includes network security, trustworthy computing, and security for social networks. More specifically, he is interested in trust establishment, trustworthy code execution in the presence of malware, and how to design secure next-generation networks. More information about his research is available on Adrian's web page. He is a recipient of the NSF CAREER award in 2004, IBM faculty fellowships in 2004 and 2005, and the Sloan research fellowship in 2006.|
|Speaker:||Adam Smith (Penn State Computer Science & Engineering)|
|Title:||A Cryptographer's-eye View of Privacy in Statistical Databases|
|Abstract:||Consider an agency holding a large database of sensitive personal information (perhaps medical records, census survey answers, or web search records). The agency would like to discover and publicly release global characteristics of the data (say, to inform policy and business decisions) while protecting the privacy of individuals' records. This problem is known variously as "statistical disclosure control", "privacy-preserving data mining" or simply "database privacy".
We will review recent progress on establishing rigorous foundations for privacy in such databases. Time permitting, we will discuss both "lower bounds", namely algorithmic reconstruction results that imply the impossibility of releasing very accurate statistics privately, and "upper bounds", namely new publication mechanisms that satisfy clear and strong definitions of privacy such as differential privacy.
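As a concrete instance of such a publication mechanism, here is a hypothetical sketch (not necessarily a construction covered in the talk) of the Laplace mechanism: to answer a counting query with epsilon-differential privacy, release the true count plus Laplace noise scaled to the query's sensitivity divided by epsilon. A counting query has sensitivity 1, since changing one individual's record changes the answer by at most 1.

```python
import math
import random

def laplace(scale):
    """Sample from Laplace(0, scale) as the difference of two
    Exp(1) draws (X - Y ~ Laplace when X, Y ~ Exp(1))."""
    x = -math.log(1.0 - random.random())
    y = -math.log(1.0 - random.random())
    return scale * (x - y)

def private_count(records, predicate, epsilon):
    """Release a count satisfying epsilon-differential privacy.
    Sensitivity of a counting query is 1, so the noise scale
    is 1/epsilon."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace(1.0 / epsilon)

# e.g. a noisy count of even-valued records:
noisy = private_count([1, 2, 3, 4, 5], lambda r: r % 2 == 0, epsilon=0.5)
```

Smaller epsilon means stronger privacy but noisier (less accurate) answers, which is exactly the accuracy/privacy trade-off the lower-bound results above constrain.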
|Biography:||Adam Smith is an assistant professor in the Department of Computer Science and Engineering at the Pennsylvania State University. His research interests lie in cryptography, privacy and their connections to information theory, quantum computing and statistics. He received his Ph.D. from MIT in 2004 and was subsequently a visiting scholar at the Weizmann Institute of Science and UCLA.|