
From Group Privacy to Collective Privacy: Towards a New Dimension of Privacy and Data Protection in the Big Data Era

Chapter in Group Privacy. Part of the book series: Philosophical Studies Series (PSSP, volume 126).

Abstract

This chapter focuses on big data analytics and, in this context, investigates the case for considering informational privacy and data protection as collective rights. From this perspective, privacy and data protection are interpreted not as referring to a given individual, but as common to the individuals that are grouped into various categories by data gatherers.

The peculiar nature of the groups generated by big data analytics requires an approach that cannot be based exclusively on individual rights. The new scale of data collection entails the recognition of a new layer of protection, represented by groups’ need to safeguard their collective privacy and data protection rights.

This dimension requires a specific regulatory framework, focused mainly on the legal representation of these collective interests, on a mandatory multiple-impact assessment of the use of big data analytics, and on the role played by data protection authorities.


Notes

1. See below Sect. 8.2.

2. It should be noted that these extensive analyses are also possible without directly identifying data subjects. See also Ohm (2010); Golle (2006); Sweeney (2000a, b).

3. Among the potential negative consequences of data processing at group level, the impacts on social surveillance and the risks of group discrimination or stigmatization should be mentioned. See The White House (2014) and Bygrave (2002).

4. See Brandeis’ dissenting opinion in Olmstead v. United States, 277 U.S. 438, 471 (1928). See also Sweezy v. New Hampshire 354 US 234 (1957); NAACP v. Alabama 357 US 449 (1958); Massiah v. U.S. 377 US 201 (1964); Griswold v. Connecticut, 381 US 479 (1965); Roe v. Wade 410 US 113 (1973).

5. See, e.g., Trib. civ. Seine, 16 June 1858, D.P., 1858.3.62; see also Whitman (2004).

6. See the influential decision adopted by the German Federal Constitutional Court (Bundesverfassungsgericht), 15 December 1983, Neue Juristische Wochenschrift, 1984. https://www.zensus2011.de/SharedDocs/Downloads/DE/Gesetze/Volkszaehlungsurteil_1983.pdf?__blob=publicationFile&v=9. Accessed 25 June 2014.

7. See inter alia Article 29 Data Protection Working Party. 2013. Letter to Mr. Larry Page, Chief Executive Officer. http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130618_letter_to_google_glass_en.pdf. Accessed 27 February 2014; Irish Data Protection Commissioner. 2012. Facebook Ireland Ltd. Report of Re-Audit. http://dataprotection.ie/documents/press/Facebook_Ireland_Audit_Review_Report_21_Sept_2012.pdf. Accessed 27 February 2014; Italian Data Protection Authority. 2013. Injunction and Order Issued Against Google Inc. http://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/3133945. Accessed 27 February 2014. Only in a few cases is this collective dimension recognised as autonomous and distinct from the individual one. This happens in labour law, where employee representatives take part, on behalf of the workers, in decisions concerning workplace surveillance, accepting limits to privacy in these contexts. See European Commission. Undated. Second stage consultation of social partners on the protection of workers’ personal data, 7, 10, 16–17. http://ec.europa.eu/social/main.jsp?catId=708. Accessed 10 January 2015. See also the specific references to the provisions of European national labour laws in Freedland (1999) and Hendrickx (Undated), as well as Article 4 of the Italian labour statute (L. 300/1970).

8. See Westin (1970); Breckenridge (1970); Solove (2008); Brenton (1964); Miller (1971); Mayer-Schönberger (1997); Secretary’s Advisory Committee on Automated Personal Data Systems. 1973. Records, Computers and the Rights of Citizens. http://epic.org/privacy/hew1973report/. Accessed 27 February 2014.

9. See above fn. 7.

10. See Bloustein (1978). In describing the different contexts in which the right to privacy is relevant to the group dimension, the author considers marital, priest-penitent, lawyer-client and physician-patient relationships. In these contexts, the right to privacy is mainly related to intimacy and secrecy.

11. See Bloustein (1978: 125): “Group privacy is an extension of individual privacy. The interest protected by group privacy is the desire and need of people to come together, to exchange information, share feelings, make plans and act in concert to attain their objectives”. This notion of group privacy focuses on secrecy and intimacy and, for this reason, it is fundamentally based on the level of trust existing among the members of a group. The consequence is a duty of confidentiality. The right concerns the nature of this duty and the breach of this obligation.

12. See Westin (1970). The author also points out the dimension of privacy concerning communications among different groups.

13. The dynamics related to group privacy originate with individuals, who are aware of their level of interaction and of the related social or legal consequences (Bloustein 1978). Group privacy thus becomes the aggregation of individual rights in the specific context of a group. This approach is consistent with individualistic sociological theories, which consider the group as an aggregation in which individuals interact with each other in a continuous and relatively stable manner. Moreover, the members of a group are aware of being part of it, and the group is usually also recognised as an autonomous social structure by third parties.

14. On the debate regarding the application of the concept of privacy to collective entities, see Bygrave (2002).

15. It should be noted that only a few data protection laws take into account the issues related to group privacy, mainly in terms of protection of information about legal entities. See Article 4 (original wording) of the Italian Data Protection Code (D. Lgs. 196/2003) (“ ‘data processor’ shall mean any natural or legal person, public administration, body, association or other agency that processes personal data on the controller’s behalf”). The article was amended in 2011, deleting any reference to legal persons. See also Article 2(4) of the Austrian data protection law, Datenschutzgesetz 2000 – DSG 2000 (“ ‘Data Subject’ [‘Betroffener’]: any natural or legal person or group of natural persons not identical with the controller, whose data are processed (sub-para. 8)”).

16. See above Sect. 8.2.

17. See Floridi (2013) and Floridi (1999).

18. See also Mantelero (2014a).

19. Big data analytics identify patterns in collective behaviours, even without identifying single individuals.

20. In many cases, private companies and governments have no interest in profiling single customers or citizens, but are interested in the attitudes of clusters of individuals. Their main goal is to predict the future behaviour of given segments of the population and, consequently, to act on these predictions for economic or political purposes. See Bollier (2010).

21. As mentioned before, the notions of (individual) privacy and data protection influence the definition of the boundaries of the collective dimension of privacy, but the larger scale affects the morphology of the related interests and their enforcement. At the same time, the notion of group privacy as hitherto described by legal scholars represents the dimension of privacy closest to the idea of collective privacy. For this reason, previous theoretical studies on group privacy can provide further elements to define a new set of rules to protect the collective dimension of privacy.

22. Criticisms of the notion of collective privacy have been expressed by Vedder (1997).

23. See Newman (2004: 128): “We can distinguish a collectivity from a set. A set is a collection of persons that we would identify as a different set were the persons included in the set to change. A collectivity is a collection of persons such that we would still identify it as the same collectivity were some or all of the persons in the collectivity to change (provided that the collectivity continued to meet some other conditions) and such that the persons who are in the collectivity identify themselves in some non-trivial way as members of this collectivity”.

24. In this sense, commercial discrimination that affects a given set of users is relevant because the set represents a small portion of consumers, who, in general, have a collective right not to be discriminated against in negotiations.

25. See Federal Trade Commission (2014).

26. For example, the fact that a consumer belongs to a data segment for “Biker Enthusiasts” gives him/her a greater chance of receiving consumer coupons from motorcycle dealerships, but the same information may have a negative impact on his/her insurance profile, due to the presumed high probability of engaging in risky behaviour. See Federal Trade Commission (2014): “Similarly, while data brokers have a data category for “Diabetes Interest” that a manufacturer of sugar-free products could use to offer product discounts, an insurance company could use that same category to classify a consumer as higher risk”.

27. These individuals are divided into clusters on the basis of information retrieved from dozens of different sources, using hundreds of variables for their assessment. See Dixon and Gellman (2014).

28. See Newman (2004: 131).

29. By contrast, an aggregative approach seems to be consistent with the notion of group privacy described by Bloustein (1978).

30. This distinction between aggregative and non-aggregative interests is made by Newman (2004), who defines these two categories of interests respectively as “shared” and “collective” interests. As observed by Finnis (1984), a collective interest in which the contrast is attenuated may become a shared interest.

31. The same divergence of interests exists with regard to government social surveillance for crime prevention and national security.

32. In the light of the above, the rights related to the collective dimension of privacy assume the nature of “group-differentiated rights”, which are held by members of groups on account of their group membership (Kymlicka 1995). From this perspective, groups have not only collective interests that represent the aggregation of individual interests, but also specific interests focused on the group itself rather than on each of its members.

33. Bloustein (1978: 123–186).

34. Bygrave (2002: 173–298).

35. An example is the predictive software adopted by U.S. police departments to predict and prevent crimes on the basis of extensive collections of information about previous crimes. In this big data application, there have been cases in which people were placed on lists of potential offenders due to merely remote connections with the perpetrators of serious crimes (Gorner 2013; Perry et al. 2013; Koss 2015; Mantelero and Vaciago 2014).

36. Nevertheless, this collective interest, which is relevant with regard to collective privacy, is not necessarily a shared interest. As mentioned above, single data subjects may accept invasive scrutiny of their behaviours to receive more customised services or for security reasons.

37. Bloustein (1978: 182).

38. It should be noted that big data analytics can also extract predictive inferences and correlations from publicly available information and from data voluntarily disclosed by data subjects. On the risks related to the interplay between private (commercial) surveillance and public surveillance conducted by government agencies, see also Mantelero and Vaciago (2014).

39. See the Italian Statute of the Workers’ Rights, Articles 4 and 8, Act 300, 20 May 1970.

40. See above fn. 7. See also Bygrave and Schartum (2009).

41. On the role of group actions in protecting individual and collective interests concerning personal information, see Bygrave (2002).

42. It should be noted that, in the field of big data analytics, the partially hidden nature of the processes and their complexity often make it difficult to bring timely class actions, unlike in product liability cases, where the nature of the damage is more evident, which facilitates the reaction of the victims. As demonstrated by the recent NSA revelations, people are usually unaware of being under surveillance; only a leak of information can disclose these practices and open a debate on their legitimacy, as well as give individuals the chance to bring legal actions. See also European Parliament (2014).

43. See Article 80 Regulation (EU) 2016/679.

44. In this sense, a stronger preference for a market-oriented society or for government surveillance deeply affects the quantity and quality of the remedies provided by the law.

45. See below Sect. 8.5.

46. See above fn. 7.

47. See above fn. 7.

48. See below Sect. 8.5.

49. In their role as representatives of collective interests, these entities could also bring legal actions for non-pecuniary damages, and they should be able to exercise the traditional individual rights on behalf of data subjects. See also Article 80 Regulation (EU) 2016/679.

50. In this sense, stakeholders may have a right of access to the documents that describe the architecture and general purposes of big data processing. Nevertheless, in order to protect the legitimate interests of companies and governments, DPAs can limit this disclosure to third parties. In the big data context, these issues are also related to the transparency of the algorithms used by companies (Citron and Pasquale 2014). See also Mayer-Schönberger and Cukier (2013), who suggest a model based on independent internal and external audits. Wider access to the logic of the algorithms is advocated by Article 29 Data Protection Working Party. 2013. Opinion 03/2013 on purpose limitation, 47. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf. Accessed 27 February 2014. See also Gillespie (2014).

51. See Schwartz (2011); Wright (2011); Floridi (2014); Nissenbaum (2010); Calo (2013); Dwork and Mulligan (2013); Bygrave (2002); Cohen (2013); Hofmann (2005); Richards and King (2013); Article 29 Data Protection Working Party. 2014. Statement on the role of a risk-based approach in data protection legal frameworks, 4. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf. Accessed 27 February 2014.

52. In this sense, for example, there are legal systems that give broad relevance to national and security interests, which in many cases prevail over individual and collective privacy. By contrast, there are countries where extensive forms of social surveillance are considered disproportionate and invasive.

53. See, e.g., the different attitudes of the U.S. government with regard to surveillance before and after the September 11, 2001 terrorist attacks. See also Bygrave (2004).

54. On the data protection impact assessment, see Article 35 Regulation (EU) 2016/679. It should be noted that the data protection impact assessment does not represent a new approach to data protection: the privacy impact assessment exists in different jurisdictions around the world and has been an important tool since the mid-1990s. On the origins of the notion of privacy impact assessment, see Clarke (2009). Nevertheless, the existing difference between privacy and data protection necessarily affects the extent of these assessments, which investigate fields that do not completely overlap.

55. On this multi-criteria risk analysis, see more extensively Mantelero (2014b).

56. See above fn. 67.

57. See also Article 29 Data Protection Working Party. Statement on the role of a risk-based approach.

58. It is worth pointing out that social and ethical assessments are similar in nature to the data protection impact assessment, since they are prior assessments based on risk analysis. Nevertheless, in these cases, the wide range of interests that should be considered requires the involvement of different stakeholders and experts.

59. See Mantelero (2014b). The article, which deals with personal data and big data analytics, suggests adopting a new paradigm based on a mandatory multiple assessment coupled with an opt-out regime. In this model, although the assessment represents an economic burden for companies, it allows those who pass it to use data for complex and multiple purposes without requiring users to opt in. At the same time, a prior assessment conducted by independent authorities, combined with an opt-out model, seems to offer more safeguards to users than the apparent, but inconsistent, user self-determination of the opt-in model.

60. See also Wright (2011); Citron (2008). A different assessment based exclusively on the adoption of security standards or corporate self-regulation would not have the same extent and independence. This does not mean that, in this framework, forms of standardization or co-regulation cannot be adopted (Calo 2013).

61. See also FRA – European Union Agency for Fundamental Rights (2013) and Simitis (1987).

62. This self-financing model, based on licensing or notification fees, was adopted in the past in Sweden and the United Kingdom (Schütz 2012; Information Commissioner’s Office 2011). See also the fee-based model adopted by the European Medicines Agency.

63. It should be noted that regulations requiring extensive prior risk assessments under the supervision of independent authorities are not new. They are already in force in other fields characterized by risks for individuals and society (e.g. the authorization procedure for human medicines and the mandatory security standards adopted by product liability laws).

64. For a more detailed description of the model proposed here, which also implies a review of the opt-in model in the big data context, see Mantelero (2014b).


Author information

Correspondence to Alessandro Mantelero.


Copyright information

© 2017 Springer International Publishing AG

Cite this chapter

Mantelero, A. (2017). From Group Privacy to Collective Privacy: Towards a New Dimension of Privacy and Data Protection in the Big Data Era. In: Taylor, L., Floridi, L., van der Sloot, B. (eds) Group Privacy. Philosophical Studies Series, vol 126. Springer, Cham. https://doi.org/10.1007/978-3-319-46608-8_8
