Privacy Through Accountability
Privacy through accountability refers to the principle that entities holding personal information about individuals are accountable for adopting measures that protect the privacy of the data subjects. This article focuses on computational treatments of this principle. This research area has produced precise definitions of privacy properties and computational accountability mechanisms to aid in their enforcement.
Formally, privacy properties impose restrictions on personal information flows. Information flow types encompass context-specific direct flows (e.g., the transfer of health information from a hospital to an insurance company) [2, 3, 4], implicit flows (e.g., the use of users’ location in a web advertising system), and flows of noisy statistics computed from databases of personal information (e.g., the use of customers’ ratings to recommend movies). The restrictions on these types of information flow include role-based restrictions (e.g., permitting certain types of...
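The two flow types above can be made concrete with a minimal sketch. The policy table, role names, and query below are illustrative assumptions, not taken from any of the cited formalisms: a lookup table stands in for a context-specific restriction on direct flows, and a Laplace-noised count stands in for releasing noisy statistics from a database of personal records.

```python
import math
import random

# Hypothetical policy table for context-specific direct flows:
# (sender_role, receiver_role, data_type) -> permitted?
# Roles and data types here are illustrative only.
POLICY = {
    ("hospital", "physician", "health_record"): True,
    ("hospital", "insurer", "health_record"): False,
}

def flow_permitted(sender_role, receiver_role, data_type):
    """Permit a direct flow only if the policy explicitly allows it
    (default deny)."""
    return POLICY.get((sender_role, receiver_role, data_type), False)

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def noisy_count(records, predicate, epsilon):
    """Release a differentially private count: a counting query has
    sensitivity 1, so adding Laplace(1/epsilon) noise suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

For example, `flow_permitted("hospital", "insurer", "health_record")` is denied by the table, while `noisy_count` answers an aggregate query over customer ratings without revealing any single customer's record exactly.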
- 1. OECD. Fair information practices principles. http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm
- 2. Barth A, Datta A, Mitchell JC, Nissenbaum H. Privacy and contextual integrity: framework and applications. In: Proceedings of the 2006 IEEE Symposium on Security and Privacy; 2006. p. 184–98.
- 3. DeYoung H, Garg D, Jia L, Kaynar DK, Datta A. Experiences in the logical specification of the HIPAA and GLBA privacy laws. In: Proceedings of the 2010 ACM Workshop on Privacy in the Electronic Society; 2010. p. 73–82.
- 4. May MJ, Gunter CA, Lee I. Privacy APIs: access control techniques to analyze and verify legal privacy policies. In: Proceedings of the 19th IEEE Computer Security Foundations Workshop; 2006. p. 85–97.
- 5. Sen S, Guha S, Datta A, Rajamani S, Tsai J, Wing JM. Bootstrapping privacy compliance in big data systems. In: Proceedings of the 2014 IEEE Symposium on Security and Privacy; 2014.
- 6. Dwork C. Differential privacy. In: Proceedings of the 33rd International Colloquium on Automata, Languages, and Programming; 2006. p. 1–12.
- 7. Tschantz MC. Formalizing and enforcing purpose restrictions. PhD thesis, Computer Science Department, Carnegie Mellon University, Technical Report CMU-CS-12-117, May 2012.
- 8. Garg D, Jia L, Datta A. Policy auditing over incomplete logs: theory, implementation and applications. In: Proceedings of the 18th ACM Conference on Computer and Communications Security; 2011. p. 151–62.
- 9. Basin DA, Klaedtke F, Müller S, Pfitzmann B. Runtime monitoring of metric first-order temporal properties. In: Proceedings of the 28th International Conference on Foundations of Software Technology and Theoretical Computer Science; 2008. p. 49–60.
- 10. Oh SE, Chun JY, Jia L, Garg D, Gunter CA, Datta A. Privacy-preserving audit for broker-based health information exchange. In: Proceedings of the 4th ACM Conference on Data and Application Security and Privacy; 2014. p. 313–20.
- 11. Tschantz MC, Datta A, Datta A, Wing JM. A methodology for information flow experiments. CoRR abs/1405.2376. 2014.
- 12. Lecuyer M, Ducoffe G, Lan F, Papancea A, Petsios T, Spahn R, Chaintreau A, Geambasu R. XRay: increasing the web's transparency with differential correlation. In: Proceedings of the 23rd USENIX Security Symposium; 2014.
- 13. Reed J, Pierce BC. Distance makes the types grow stronger: a calculus for differential privacy. In: Proceedings of the 15th ACM SIGPLAN International Conference on Functional Programming; 2010. p. 157–68.