Abstract
This chapter focuses on big data analytics and, in this context, investigates the opportunity to consider informational privacy and data protection as collective rights. From this perspective, privacy and data protection are not interpreted as referring to a given individual, but as common to the individuals that are grouped into various categories by data gatherers.
The peculiar nature of the groups generated by big data analytics requires an approach that cannot be exclusively based on individual rights. The new scale of data collection entails the recognition of a new layer, represented by groups’ need for the safeguard of their collective privacy and data protection rights.
This dimension requires a specific regulatory framework, which should be mainly focused on the legal representation of these collective interests, on the provision of a mandatory multiple-impact assessment of the use of big data analytics and on the role played by data protection authorities.
Notes
- 1.
See below Sect. 8.2.
- 2.
- 3.
- 4.
See Brandeis’ opinions in Olmstead v. United States, 277 U.S. 438, 471 (1928). See also Sweezy v. New Hampshire 354 US 234 (1957); NAACP v. Alabama 357 US 449 (1958); Massiah v. U.S. 377 US 201 (1964); Griswold v. Connecticut, 381 US 479 (1965); Roe v. Wade 410 US 113 (1973).
- 5.
See, e.g., Trib. civ. Seine, 16 June 1858, D.P., 1858.3.62; see also Whitman (2004).
- 6.
See the influential decision adopted by the Federal German Constitutional Court (Bundesverfassungsgericht), 15 December 1983, Neue Juristische Wochenschrift, 1984. https://www.zensus2011.de/SharedDocs/Downloads/DE/Gesetze/Volkszaehlungsurteil_1983.pdf?__blob=publicationFile&v=9. Accessed 25 June 2014.
- 7.
See inter alia Article 29 Data Protection Working Party. 2013. Letter to Mr. Larry Page, Chief Executive Officer. http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130618_letter_to_google_glass_en.pdf. Accessed 27 February 2014; Irish Data Protection Commissioner. 2012. Facebook Ireland Ltd. Report of Re-Audit. http://dataprotection.ie/documents/press/Facebook_Ireland_Audit_Review_Report_21_Sept_2012.pdf. Accessed 27 February 2014; Italian Data Protection Authority. 2013. Injunction and Order Issued Against Google Inc. http://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/3133945. Accessed 27 February 2014. Only in a few cases is this collective dimension recognised as autonomous and distinct from the individual one. This happens in labour law, where the representatives of employees take part, on behalf of the workers, in decisions concerning workplace surveillance, accepting limits to privacy in these contexts. See European Commission. Undated. Second stage consultation of social partners on the protection of workers’ personal data, 7, 10, 16–17. http://ec.europa.eu/social/main.jsp?catId=708. Accessed 10 January 2015. See also specific references to the provisions of European national labour laws in Freedland (1999); Hendrickx (Undated). See also Article 4 of the Italian labour statute (L. 300/1970).
- 8.
- 9.
See above fn. 7.
- 10.
See Bloustein (1978). In the description of the different contexts in which the right to privacy is relevant with regard to the group dimension, the author considers marital, priest-penitent, lawyer-client and physician-patient relationships. In these contexts, the right to privacy is mainly related to intimacy and secrecy.
- 11.
See Bloustein (1978: 125): “Group privacy is an extension of individual privacy. The interest protected by group privacy is the desire and need of people to come together, to exchange information, share feelings, make plans and act in concert to attain their objectives”. This notion of group privacy focuses on secrecy and intimacy and, for this reason, it is fundamentally based on the level of trust existing among the members of a group. The consequence is a duty of confidentiality. The right concerns the nature of this duty and the breach of this obligation.
- 12.
See Westin (1970). Moreover, the author points out the dimension of privacy concerning the communications among different groups.
- 13.
The dynamics related to group privacy draw their origin from individuals, who are aware of their level of interaction and of the related social or legal consequences (Bloustein 1978). Therefore, group privacy becomes the aggregation of individual rights in the specific context of a group. This approach is consistent with individualistic sociological theories, which consider the group as an aggregation in which individuals interact with each other in a continuous and relatively stable manner. Moreover, the members of a group are conscious of being part of it, and the group is usually also recognised as an autonomous social structure by third parties.
- 14.
On the debate regarding the application of the concept of privacy to collective entities, see Bygrave (2002).
- 15.
It should be noted that only a few data protection laws take into account the issues related to group privacy, mainly in terms of protection of information about legal entities. See Article 4 (original wording) of the Italian Data Protection Code (D. Lgs. 196/2003) (“ ‘data processor’ shall mean any natural or legal person, public administration, body, association or other agency that processes personal data on the controller’s behalf”). The article was amended in 2011, deleting any reference to legal persons. See also Article 2(4) of the Austrian data protection law, Datenschutzgesetz 2000 – DSG 2000 (“ ‘Data Subject’ [‘Betroffener’]: any natural or legal person or group of natural persons not identical with the controller, whose data are processed (sub-para. 8)”).
- 16.
See above Sect. 8.2.
- 17.
- 18.
See also Mantelero (2014a).
- 19.
Big data analytics identify patterns in collective behaviours, even without identifying single individuals.
- 20.
In many cases, private companies and governments have no interest in profiling single customers or citizens, but are interested in the attitudes of clusters of individuals. Their main goal is to predict the future behaviour of given segments of the population and, consequently, to act on these predictions to achieve economic or political goals. See Bollier (2010).
- 21.
As mentioned before, the notions of (individual) privacy and data protection have an influence on the definition of the boundaries of the collective dimension of privacy, but the larger scale affects the morphology of the related interests and their enforcement. At the same time, the notion of group privacy as hitherto described by legal scholars represents the dimension of privacy closest to the idea of collective privacy. For this reason, previous theoretical studies on group privacy can provide further elements to define a new set of rules to protect the collective dimension of privacy.
- 22.
Criticisms about the notion of collective privacy have been expressed by Vedder (1997).
- 23.
See Newman (2004: 128): “We can distinguish a collectivity from a set. A set is a collection of persons that we would identify as a different set were the persons included in the set to change. A collectivity is a collection of persons such that we would still identify it as the same collectivity were some or all of the persons in the collectivity to change (provided that the collectivity continued to meet some other conditions) and such that the persons who are in the collectivity identify themselves in some non-trivial way as members of this collectivity”.
- 24.
In this sense, commercial discrimination affecting a given set of users is relevant because the set represents a small portion of consumers, who, in general, have a collective right not to be discriminated against in negotiations.
- 25.
See Federal Trade Commission (2014).
- 26.
For example, the fact that a consumer belongs to a data segment for “Biker Enthusiasts” gives him/her a greater chance of receiving consumer coupons from motorcycle dealerships, but the same information may have a negative impact on his/her insurance profile, due to the supposedly high probability of engaging in risky behaviour. See Federal Trade Commission (2014): “Similarly, while data brokers have a data category for “Diabetes Interest” that a manufacturer of sugar-free products could use to offer product discounts, an insurance company could use that same category to classify a consumer as higher risk”.
- 27.
These individuals are divided into clusters on the basis of information retrieved from dozens of different sources and using hundreds of variables for their assessment. See Dixon and Gellman (2014).
- 28.
See Newman (2004: 131).
- 29.
On the contrary, an aggregative approach seems to be consistent with the notion of group privacy described by Bloustein (1978).
- 30.
This distinction between aggregative and non-aggregative interests is made by Newman (2004), who defines these two categories of interests respectively as “shared” and “collective” interests. As observed by Finnis (1984), a collective interest in which the contrast is attenuated may become a shared interest.
- 31.
The same divergence of interests exists with regard to government social surveillance for crime prevention and national security.
- 32.
In the light of the above, the rights related to the collective dimension of privacy assume the nature of “group-differentiated rights”, which are held by the members of groups on account of their group membership (Kymlicka 1995). From this perspective, groups not only have collective interests that represent the aggregation of individual interests, but also specific interests focused on the group itself rather than on each of its members.
- 33.
See Bloustein (1978: 123–186).
- 34.
See Bygrave (2002: 173–298).
- 35.
An example is represented by the predictive software adopted by U.S. police departments to predict and prevent crimes on the basis of the extensive collection of information about previous crimes. Regarding this big data application, there have been cases in which people were included in lists of potential offenders due to merely remote connections with the perpetrators of serious crimes (Gorner 2013; Perry et al. 2013; Koss 2015; Mantelero and Vaciago 2014).
- 36.
Nevertheless, this collective interest, which is relevant with regard to collective privacy, is not necessarily a shared interest. As mentioned above, single data subjects may accept invasive scrutiny of their behaviours to receive more customised services or for security reasons.
- 37.
See Bloustein (1978: 182).
- 38.
It should be noted that big data analytics can also extract predictive inferences and correlations from publicly available information and from data voluntarily disclosed by data subjects. On the risks related to the interplay between private (commercial) surveillance and public surveillance conducted by government agencies, see also Mantelero and Vaciago (2014).
- 39.
See the Italian Statute of the Workers’ Rights, Articles 4 and 8, Act 300, 20 May 1970.
- 40.
See above fn. 7. See also Bygrave and Schartum (2009).
- 41.
On the role of group actions in order to protect individual and collective interests concerning personal information, see Bygrave (2002).
- 42.
It should be noted that, in the field of big data analytics, the partially hidden nature of these processes and their complexity often make it difficult to bring timely class actions, unlike in product liability cases, where the nature of the damage is more evident and this facilitates the victims’ reaction. As demonstrated by the recent NSA revelations, people are not usually aware of being under surveillance, and only leaked information can disclose these practices and open a debate on their legitimacy, as well as give individuals the chance to bring legal actions. See also European Parliament (2014).
- 43.
See Article 80 Regulation (EU) 2016/679.
- 44.
In this sense, a stronger preference for a market-oriented society or for government surveillance deeply affects the quantity and quality of the remedies provided by the law.
- 45.
See below Sect. 8.5.
- 46.
See above fn. 7.
- 47.
See above fn. 7.
- 48.
See below Sect. 8.5.
- 49.
In their role as representatives of collective interests, these entities could also bring legal actions for non-pecuniary damages, and they should also be able to exercise the traditional individual rights on behalf of data subjects. See also Article 80 Regulation (EU) 2016/679.
- 50.
In this sense, the stakeholders may have the right to access the documents that describe the architecture and general purposes of big data processing. Nevertheless, in order to protect the legitimate interests of companies and governments, DPAs can limit this disclosure to third parties. In the big data context, these issues are also related to the transparency of the algorithms used by companies (Citron and Pasquale 2014). See also Mayer-Schönberger and Cukier (2013), who suggest a model based on independent internal and external audits. Wider access to the logic of the algorithms is required by Article 29 Data Protection Working Party. 2013. Opinion 03/2013 on purpose limitation, 47. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf. Accessed 27 February 2014. See also Gillespie (2014).
- 51.
See Schwartz (2011); Wright (2011); Floridi (2014); Nissenbaum (2010); Calo (2013); Dwork and Mulligan (2013); Bygrave (2002); Cohen (2013); Hofmann (2005); Richards and King (2013); Article 29 Data Protection Working Party. 2014. Statement on the role of a risk-based approach in data protection legal frameworks, 4. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf. Accessed 27 February 2014.
- 52.
In this sense, for example, there are legal systems that give broad relevance to national security interests, which in many cases prevail over individual and collective privacy. On the contrary, there are countries where extensive forms of social surveillance are considered disproportionate and invasive.
- 53.
See, e.g., the different attitude of the U.S. government with regard to surveillance before and after the September 11, 2001 terrorist attacks. See also Bygrave (2004).
- 54.
On the data protection impact assessment, see Article 35 Regulation (EU) 2016/679. It should be noted that the data protection impact assessment does not represent a new approach to data protection, as the privacy impact assessment exists in different jurisdictions around the world and has represented an important tool since the mid-1990s. On the origins of the notion of privacy impact assessment, see Clarke (2009). Nevertheless, the existing difference between privacy and data protection necessarily affects the scope of these assessments, which investigate fields that do not completely overlap.
- 55.
On this multi-criteria risk analysis, see more extensively Mantelero (2014b).
- 56.
See above fn. 67.
- 57.
See also Article 29 Data Protection Working Party. Statement on the role of a risk-based approach.
- 58.
It is worth pointing out that the social and ethical assessments are similar to the data protection impact assessment in their nature, since they are prior assessments based on risk analysis. Nevertheless, in these cases, the wide range of interests that should be considered requires the involvement of different stakeholders and experts.
- 59.
See Mantelero (2014b). The article, which deals with personal data and big data analytics, suggests adopting a new paradigm based on a mandatory multiple assessment coupled with an opt-out regime. In this model, although the assessment represents an economic burden for companies, it allows those who pass it to use data for complex and multiple purposes, without requiring users to opt in. At the same time, a prior assessment conducted by independent authorities and an opt-out model seem to offer more safeguards to users than the apparent, but inconsistent, self-determination offered by the opt-in model.
- 60.
See also Wright (2011); Citron (2008). A different assessment exclusively based on the adoption of security standards or corporate self-regulation would not have the same extent and independence. This does not mean that, in this framework, forms of standardization or co-regulation cannot be adopted (Calo 2013).
- 61.
- 62.
- 63.
It should be noted that regulations requiring extensive prior risk assessments under the supervision of independent authorities are not new. They are already in force in other fields characterised by the presence of risks for individuals and society (e.g. the authorisation procedure for human medicines, the mandatory security standards adopted by product liability laws).
- 64.
For a more detailed description of the model proposed here, which also implies a review of the opt-in model in the big data context, see Mantelero (2014b).
Bibliography
Article 29 Data Protection Working Party. 2013. Letter to Mr. Larry Page, Chief Executive Officer. http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130618_letter_to_google_glass_en.pdf. Accessed 27 Feb 2014.
Article 29 Data Protection Working Party. 2013. Opinion 03/2013 on purpose limitation. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf. Accessed 27 Feb 2014.
Article 29 Data Protection Working Party. 2014. Statement on the role of a risk-based approach in data protection legal frameworks. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf. Accessed 27 Feb 2014.
Bennett, C.J. 1992. Regulating privacy: Data protection and public policy in Europe and the United States. Ithaca: Cornell University Press.
Bloustein, E.J. 1977. Group privacy: The right to huddle. Rutgers-Camden Law Journal 8: 219–283.
Bloustein, E.J. 1978. Individual and group privacy. New Brunswick: Transaction Books.
Bollier, D. 2010. The promise and peril of big data. Washington, DC: Aspen Institute, Communications and Society Program. http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf. Accessed 27 Feb 2014.
Breckenridge, A.C. 1970. The right to privacy. Lincoln: University of Nebraska Press.
Brenton, M. 1964. The privacy invaders. New York: Coward-McCann.
Bygrave, L.A. 2002. Data protection law. Approaching its rationale, logic and limits. The Hague: Kluwer Law International.
Bygrave, L. 2004. Privacy protection in a global context. A comparative overview. Scandinavian Studies in Law 7(319): 319–348.
Bygrave, L.A., and D.W. Schartum. 2009. Consent, proportionality and collective power. In Reinventing data protection? ed. Serge Gutwirth et al., 157–173. Dordrecht: Springer.
Calo, R.M. 2013. Consumer subject review boards: A thought experiment. Stanford Law Review Online 66: 97–102.
Cate, F.H., and V. Mayer-Schönberger. 2013. Data use and impact. Global workshop. The Center for Information Policy Research and The Center for Applied Cybersecurity Research, Indiana University. http://cacr.iu.edu/sites/cacr.iu.edu/files/Use_Workshop_Report.pdf. Accessed 27 Feb 2014.
Chamoux, J. 1981. Data protection in Europe: The problem of the physical person and the legal person. Journal of Media Law & Practice 2: 70–83.
Citron, D.K. 2008. Technological due process. Washington University Law Review 85(6): 1249–1313.
Citron, D.K., and F. Pasquale. 2014. The scored society: Due process for automated predictions. Washington Law Review 89(1): 1–33.
Clarke, R. 2009. Privacy impact assessment: Its origins and development. Computer Law & Security Review 25(2): 123–135.
Cohen, J.E. 2013. What privacy is for. Harvard Law Review 126: 1904–1933.
Dixon, P., and R. Gellman. 2014. The scoring of America: How secret consumer scores threaten your privacy and your future. 43–46, http://www.worldprivacyforum.org/wp-content/uploads/2014/04/WPF_Scoring_of_America_April2014_fs.pdf. Accessed 15 Apr 2015.
Dwork, C., and D.K. Mulligan. 2013. It’s not privacy and it’s not fair. Stanford Law Review Online 66: 35–40.
Etzioni, A. 1999. The limits of privacy. New York: Basic Books.
European Parliament. 2014. Resolution of 12 March 2014 on the US NSA surveillance programme, surveillance bodies in various Member States and their impact on EU citizens’ fundamental rights and on transatlantic cooperation in Justice and Home Affairs. http://www.europarl.europa.eu/sides/getDoc.do?type=TA&language=EN&reference=P7-TA-2014-0230. Accessed 26 Feb 2015.
Federal Trade Commission. 2014. Data brokers: A call for transparency and accountability. Appendix B. https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf. Accessed 14 May 2015.
Finnis, J. 1984. The authority of law in the predicament of contemporary social theory. Journal of Law Ethics & Public Policy 1: 115–137.
Flaherty, D. 2000. Privacy impact assessments: An essential tool for data protection. Privacy Law & Policy Reporter 7(5): 45 http://www.austlii.edu.au/cgi-bin/sinodisp/au/journals/PrivLawPRpr/2000/45.html?stem=0&synonyms=0&query=flaherty. Accessed 11 Nov 2014.
Floridi, L. 1999. Information ethics: On the philosophical foundation of computer ethics. Ethics and Information Technology 1: 37–56.
Floridi, L. 2013. The ethics of information. New York: Oxford University Press.
Floridi, L. 2014. The 4th revolution: How the infosphere is reshaping human reality. New York/Oxford: Oxford University Press.
FRA – European Union Agency for Fundamental Rights. 2013. Access to data protection remedies in EU Member States. http://fra.europa.eu/sites/default/files/fra-2014-access-data-protection-remedies_en_0.pdf. Accessed 27 Feb 2014.
Freedland, M. 1999. Data protection and employment in the European Union. An analytical study of the law and practice of data protection and the employment relationship in the EU and its member states. http://ec.europa.eu/social/main.jsp?catId=708. Accessed 25 Jan 2015.
Giesker, H. 1905. Das Recht der Privaten an der eigenen Geheimsphäre. Ein Beitrag zu der Lehre von den Individualrechten. Zürich: Müller.
Gillespie, T. 2014. The relevance of algorithms. In Media technologies. Essays on communication, materiality, and society, ed. T. Gillespie, P.J. Boczkowski, and K.A. Foot, 167–194. Cambridge, MA: MIT Press.
Golle, P. 2006. Revisiting the uniqueness of simple demographics in the US population. In Proceedings of the 5th ACM workshop on privacy in electronic society, ed. A. Juels. New York: ACM 2006.
Gorner, J. 2013. Chicago police use ‘heat list’ as strategy to prevent violence. Officials generate analysis to predict who will likely be involved in crime, as perpetrator or victim, and go door to door to issue warnings. Chicago Tribune, August 21. http://articles.chicagotribune.com/2013-08-21/news/ct-met-heat-list-20130821_1_chicago-police-commander-andrew-papachristos-heat-list. Accessed 25 Feb 2015.
Hendrickx, F. Undated. Protection of workers’ personal data in the European union, 33–35, 98–101. http://ec.europa.eu/social/main.jsp?catId=708. Accessed 18 Jan 2015.
Hofmann, B. 2005. On value-judgments and ethics in health technology assessment. Poiesis & Praxis 3: 277–295.
Information Commissioner’s Office. 2011. Budget 2011–12. Spending plans 2012–13 to 2014–15. http://ico.org.uk/about_us/boards_committees_and_minutes/~/media/documents/library/Corporate/Detailed_specialist_guides/ico_budget_2011-12.ashx. Accessed 27 Feb 2014.
Irish Data Protection Commissioner. 2012. Facebook Ireland Ltd. Report of Re-Audit. http://dataprotection.ie/documents/press/Facebook_Ireland_Audit_Review_Report_21_Sept_2012.pdf. Accessed 27 Feb 2014.
Italian Data Protection Authority. 2013. Injunction and Order Issued Against Google Inc. http://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/3133945. Accessed 27 Feb 2014.
Kohler, J. 1907. Urheberrecht an Schriftwerken und Verlagsrecht. Stuttgart: F. Enke.
Koss, K.K. 2015. Leveraging predictive policing algorithms to restore Fourth Amendment protections in high-crime areas in a post-Wardlow world. Chicago-Kent Law Review 90: 301–334.
Kymlicka, W. 1995. Multicultural citizenship. New York: Oxford University Press.
Leebron, D.W. 1991. The right to privacy’s place in the intellectual history of tort law. Case Western Reserve Law Review 41: 769–810.
Mantelero, A., and G. Vaciago. 2014. Social media and big data. In Cyber crime & cyber terrorism. Investigators’ handbook, ed. B. Akhgar, A. Staniforth, and F.M. Bosco. Waltham: Elsevier.
Mantelero, A. 2014a. Social control, transparency, and participation in the big data world. Journal of Internet Law, April: 23–29.
Mantelero, A. 2014b. The future of consumer data protection in the E.U. Rethinking the “notice and consent” paradigm in the new era of predictive analytics. Computer Law & Security Review 30: 643–660.
Mayer-Schönberger, V., and K. Cukier. 2013. Big data. A revolution that will transform how we live, work and think. London: John Murray.
Mayer-Schönberger, V. 1997. Generational development of data protection in Europe. In Technology and privacy: The new landscape, ed. P.E. Agre and M. Rotenberg. Cambridge, MA: MIT Press.
Miller, A.R. 1971. The assault on privacy: Computers, data banks, dossiers, 54–67. Ann Arbor: University of Michigan Press.
Newman, D.G. 2004. Collective interests and collective rights. American Journal of Jurisprudence 49(1): 127–163.
Nissenbaum, H. 2010. Privacy in context: Technology, policy, and the integrity of social life. Stanford: Stanford University Press, 231.
Ohm, P. 2010. Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review 57: 1701–1777.
Packard, V. 1964. The naked society. New York: David McKay.
Perry, W.L. et al. 2013. Predictive policing. The Role of Crime Forecasting in Law Enforcement Operations. http://www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf. Accessed 10 Mar 2015.
Post, R.C. 1990. Rereading Warren and Brandeis: Privacy, property and appropriation. Case Western Reserve Law Review 41: 647–680.
Richards, N.M., and J.H. King. 2013. Three paradoxes of big data. Stanford Law Review 66: 41–46.
Schudson, M. 1978. Discovering the news. A social history of American newspapers. New York: Basic Books.
Schütz, P. 2012. Comparing formal independence of data protection authorities in selected EU Member States. Conference Paper for the 4th Biennial ECPR Standing Group for Regulatory Governance Conference 2012. 17, fn. 73, and 18. http://regulation.upf.edu/exeter-12-papers/Paper%20265%20-%20Schuetz%202012%20-%20Comparing%20formal%20independence%20of%20data%20protection%20authorities%20in%20selected%20EU%20Member%20States.pdf. Accessed 27 Feb 2014.
Schwartz, P.M. 2011. Data protection law and the ethical use of analytics. http://www.huntonfiles.com/files/webupload/CIPL_Ethical_Undperinnings_of_Analytics_Paper.pdf. Accessed 27 Feb 2014.
Secretary’s Advisory Committee on Automated Personal Data Systems. 1973. Records, computers and the rights of citizens. http://epic.org/privacy/hew1973report/. Accessed 27 Feb 2014.
Simitis, S. 1987. Reviewing privacy in an information society. University of Pennsylvania Law Review 135(3): 707–746.
Solove, D.J. 2008. Understanding privacy. Cambridge, MA/London: Harvard University Press.
Strömholm, S. 1967. Right of privacy and rights of personality. A comparative survey. Stockholm: Norstedt & Söners.
Sweeney, L. 2000a. Foundations of privacy protection from a computer science perspective. In Proceedings Joint Statistical Meeting, AAAS, Indianapolis. http://dataprivacylab.org/projects/disclosurecontrol/paper1.pdf. Accessed 24 Jan 2015.
Sweeney, L. 2000b. Simple demographics often identify people uniquely. Pittsburgh: Carnegie Mellon University. http://dataprivacylab.org/projects/identifiability/paper1.pdf. Accessed 24 Jan 2015.
Vedder, A.H. 1997. Privatization, information technology and privacy: Reconsidering the social responsibilities of private organizations. In Business ethics: Principles and practice, ed. Geoff Moore, 215–226. Sunderland: Business Education Publishers.
The White House, Executive Office of the President. 2014. Big data: Seizing opportunities, preserving values. Washington, DC. http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf. Accessed 26 Dec 2014.
Warren, S.D., and L.D. Brandeis. 1890. The right to privacy. Harvard Law Review 4(5): 193–220.
Westin, A.F. 1970. Privacy and freedom. New York: Atheneum.
Whitman, J.Q. 2004. The two western cultures of privacy: Dignity versus liberty. Yale Law Journal 113: 1151–1221.
Wright, David. 2011. A framework for the ethical impact assessment of information technology. Ethics and Information Technology 13(3): 199–226.
Wright, D. 2012. The state of the art in privacy impact assessment. Computer Law & Security Review 28(1): 54–61.
Wright, D., and P. de Hert (eds.). 2012. Privacy impact assessment. Dordrecht: Springer.
Wright, D., M. Friedewald, and R. Gellert. 2015. Developing and testing a surveillance impact assessment methodology. International Data Privacy Law 5(1): 40–53.
© 2017 Springer International Publishing AG
Mantelero, A. (2017). From Group Privacy to Collective Privacy: Towards a New Dimension of Privacy and Data Protection in the Big Data Era. In: Taylor, L., Floridi, L., van der Sloot, B. (eds) Group Privacy. Philosophical Studies Series, vol 126. Springer, Cham. https://doi.org/10.1007/978-3-319-46608-8_8