Program

Tuesday July 9

PETools: 1st Workshop on Privacy Enhancing Tools (Program)

7:30am–Noon and 7:00pm–8:30pm Registration (Whittenberger Auditorium)

7:30pm Welcome Cocktail (University Club in IMU)

Wednesday July 10

8:00–5:00 Registration (Whittenberger Auditorium)

8:00–8:30 Breakfast (Georgian Room, First Floor)

8:45 Opening Remarks (Whittenberger Auditorium)

9:00 Session 1: Privacy-oriented Cryptography (Chair: Aniket Kate)
10:15 Coffee Break (Georgian Room, First Floor)

10:45 PETS Keynote Address (Chair: Kelly Caine)
12:00 Lunch (Frangipani Room, Mezzanine Floor)

2:00 Panel: Inferring Privacy Expectations (Abstract)
3:30 Coffee Break (Georgian Room, First Floor)

4:00 Session 2: Data Privacy (Chair: Claudia Diaz)
4:50 Mini-Break

5:00 Session 3: Location Privacy (Chair: Janne Lindqvist)
6:00 PET Award Presentation

6:15 PET Award Reception (IU Art Museum)

Thursday July 11

8:00–5:00 Registration (Whittenberger Auditorium)

8:00–8:30 Breakfast (Georgian Room, First Floor)

8:40 Session 4: Tor Performance (Chair: Roger Dingledine)
9:30 Mini-Break

9:40 Session 5: Censorship Evasion and Traffic Analysis (Chair: Paul Syverson)
10:30 Coffee Break (Georgian Room, First Floor)

11:00 Panel: PETS Publishing (Abstract)
12:30 Lunch (Frangipani Room, Mezzanine Floor)

2:30 Session 6: User-Related Privacy Perspectives (Chair: Jean Camp)
3:20 Coffee Break (Georgian Room, First Floor)

3:50 Rump session 1 (Chair: Andrei Serjantov)

4:40 Mini-break

4:50 Rump session 2 (Chair: Roger Dingledine)

7:00 Social Event and Gala Dinner (Presidents Hall in Franklin Hall)

Friday July 12 (HotPETS)

8:00–Noon Registration (Whittenberger Auditorium)

8:00–8:30 Breakfast (Georgian Room, First Floor)

9:15 Opening Remarks

9:30 Session 1: Privacy in Action (Chair: Konstantinos Chatzikokolakis)
10:45 Coffee Break (Georgian Room, First Floor)

11:15 Invited Speaker (Chair: Paul Syverson)
12:30 Lunch (Frangipani Room, Mezzanine Floor)

2:00 Session 2: Anonymous Communication (Chair: Aaron Johnson)
2:50 Coffee Break (Georgian Room, First Floor)

3:10 Session 3: Data Privacy (Chair: Claudia Diaz)
4:00 Coffee Break (Georgian Room, First Floor)

4:20 Session 4: Censorship Resistance (Chair: Eugene Vasserman)
5:10 Closing Remarks

Saturday July 13

8:30am–9:30am Light Breakfast (Tree Suite Lounge, Mezzanine Floor IMU)

10:00am–2:30pm Group Hike "Walk in the Woods"

Invited Speakers and Panels

Lorrie Cranor: Privacy Notice and Choice in Practice

Abstract: "Notice and choice" are key principles of modern information privacy protection. The various sets of fair information practice principles and the privacy laws based on these principles include requirements for providing notice about data practices and allowing individuals to exercise control over those practices. In the United States, privacy self-regulatory efforts focus heavily on notice and choice. However, there has been little follow-up to evaluate the effectiveness of notice and choice efforts in practice: to determine whether individuals provided with notice are able to make informed choices that align with their expectations of privacy. The CyLab Usable Privacy and Security Laboratory (CUPS) at Carnegie Mellon has conducted empirical evaluations of a variety of notice and choice mechanisms, including privacy policies, the Platform for Privacy Preferences (P3P), online behavioral advertising opt-out tools, privacy nutrition labels, the AdChoices icon, and the standard form privacy notice for financial institutions. In this talk I will provide background on notice and choice and present several of our empirical studies, highlighting both areas where notice and choice approaches show promise and areas where existing notice and choice tools appear to be largely ineffective. Besides discussing features of the notices and tools themselves, I will discuss the problems of incentives and enforcement, which continue to plague notice and choice efforts.

Bio: Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering master's program. She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, and other topics. She has played a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability (O'Reilly 2005) and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P (O'Reilly 2002). She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. She was previously a researcher at AT&T Labs Research and taught in the Stern School of Business at New York University. Lorrie has spent the 2012-13 academic year on sabbatical as a fellow in the Frank-Ratchye STUDIO for Creative Inquiry at CMU, working on fiber arts projects that combine her interests in privacy and security, quilting, computers, and technology.

Panel: Inferring Privacy Expectations (Moderator: Julien Freudiger)

Abstract: The deep social implications of information technology make it difficult to predict user privacy expectations. Without such understanding, even well-intentioned corporations might make erroneous decisions, leading to "privacy-allergic" designs: designs that inadvertently harm privacy. Similarly, privacy activists face the risk of lobbying for privacy conservationism. This panel seeks to discuss the science behind understanding user privacy expectations, how companies develop privacy practices, and how to successfully deploy technologies with privacy in mind.

Panel: PETS Publishing (Moderator: Aaron Johnson)

Abstract: The nature of academic publishing is currently undergoing significant change, particularly in computer science. There has been much activity in the open-access movement, including the Elsevier boycott, the Research Without Walls initiative, the ACM "Author-Izer" service, and the recent White House Memorandum on research accessibility. In addition, many in the computer science community are pushing for, or experimenting with, alternative structures for reviewing, publication, and conferences.

During this panel, experienced members of the research community discuss these issues as they relate to PETS and to research in computer security and privacy more broadly. They consider the open-access publishing options currently under consideration by the PETS Advisory Board, explore the alternative forms of reviewing and publication adopted by conferences other than PETS, and examine longer-term trends in research publishing along with the opportunities and hazards these pose for computer security research organizations. Finally, the results of a survey of the thoughts and opinions of the PETS community are presented.

Helen Nissenbaum: DIY Privacy with Obfuscation

Abstract: In limited domains, data obfuscation promises relief against powerful machinations of surveillance, aggregation, mining, and profiling. Whether it can withstand countervailing data analytics remains an open question; equally important are charges that it is unethical, illegitimate, or, at best, ungenerous. My talk explores the potential of obfuscation as a "weapon of the weak" and reveals its technical, moral, and political vulnerabilities. It locates the sources of these vulnerabilities, in particular by exploring the extent of our obligation to provide information about ourselves to others, sometimes in the name of the common good.

Bio: Helen Nissenbaum is Professor of Media, Culture and Communication, and Computer Science, and Director of the Information Law Institute at New York University. Her work, focusing on social, ethical, and political implications of information technology and digital media, has appeared in journals of philosophy, politics, law, media studies, information studies, and computer science. She has written and edited four books, including Privacy in Context: Technology, Policy, and the Integrity of Social Life, which was published in 2010 by Stanford University Press. The National Science Foundation, Air Force Office of Scientific Research, Ford Foundation, U.S. Department of Homeland Security, the U.S. Department of Health and Human Services Office of the National Coordinator, Intel and Microsoft have supported her work on privacy, trust online, and security, as well as several studies of values embodied in computer system design, including search engines, digital games, facial recognition technology, and health information systems. Nissenbaum holds a Ph.D. in philosophy from Stanford University and a B.A. (Hons) from the University of the Witwatersrand. Before joining the faculty at NYU, she served as Associate Director of the Center for Human Values at Princeton University.