Abstract
We offer an ethical assessment of the market for data used to generate what are sometimes called “consumer scores” (i.e., numerical expressions that describe or predict people’s dispositions and behavior), and we argue that this assessment has implications for how the market for consumer scoring data should be regulated. To conduct the assessment, we employ two heuristics for evaluating markets. One is the “harm” criterion, which concerns whether the market produces serious harms, either for participants in the market, for third parties, or for society as a whole. The other is the “agency” criterion, which concerns whether participants understand the nature and significance of the exchanges they are making, whether they can be guaranteed fair representation, and whether there is differential need for the market’s good. We argue that consumer scoring data should be subject to the same sort of regulation as the older FICO credit scores. Although the movement in the 1990s to regulate FICO scores was not aimed at restraining a market per se, we argue that the reforms were underwritten by concerns about the same sorts of problems as those identified by our heuristics. Consumer data should therefore be subject to the same sort of regulation.
Notes
A note about terminology: we’ll understand “personal data” in the same way as the General Data Protection Regulation, which defines personal data as “any information which are related to an identified or identifiable natural person,” where “data subjects are identifiable if they can be directly or indirectly identified, especially by reference to an identifier such as a name, an identification number, location data, an online identifier or one of several special characteristics, which expresses the physical, physiological, genetic, mental, commercial, cultural or social identity of these natural persons” (EU GDPR Art. 4(1)(1) 2016). We’ll use “personal data” and “personal information” interchangeably.
For a helpful overview see Acquisti et al. (2016).
Laudon (1996) explores some of these issues. He identifies loss of privacy as a serious moral concern associated with data markets as they exist in their present, unregulated form. He understands this loss of privacy as an externality, and as a solution recommends the creation of a more formal marketplace—a “National Information Market.” Laudon’s analysis is valuable and prescient, but his critical analysis of the trade in data does not capture the full moral salience of the situation, primarily because he understands the underlying moral problem of the market as an externality. As we explain below, there are moral problems associated with markets in user data that are not externalities. Because his critical analysis is limited in this way, so is his solution: it does not guard against serious harms and wrongs that can befall consumers and society even after externalities have been internalized by the market; markets that have no externalities can have other morally problematic features.
One notable exception is van den Hoven (2008), who focuses on moral reasons for protecting personal data that do not stem directly from a right to privacy.
See Posner (1981).
The Federal Trade Commission (FTC) is a United States government agency charged with, among other things, consumer protection.
See Sandel (2013).
See Walzer (1983).
Note that there arguably are certain kinds of data whose social value would be undermined if they were commodified. Kaplan (2015), for example, argues that selling medical data distorts the special relationship between doctors and patients.
See Satz (2012), p. 95.
See Satz (2012), p. 98.
“Jim Crow” refers to a social arrangement in the United States that, from around the late nineteenth century to around the mid-twentieth century, enforced a strict racial caste system (Alexander and West 2012). The “redlining” in “digital redlining” harks back to a practice, originating in the Jim Crow era with the establishment of the Federal Housing Administration in 1934, that segregated neighborhoods by denying access to credit in and near black neighborhoods (Rothstein 2017).
Zone Improvement Plan (ZIP) codes are postal codes used in the United States. They divide the country into geographical segments of varying sizes. The ZIP + 4 code system is an expansion of the ZIP code system. The ZIP + 4 system divides the country into narrower segments, such as city blocks.
While our focus here is on the weak epistemic agency of consumers, it is worth mentioning that firms buying and selling data can be weak epistemic agents themselves. In a recent study, Latanya Sweeney demonstrated that anonymized health data could be, unbeknownst to its seller (Washington State), easily de-anonymized (Sweeney 2015).
This is borne out in empirical studies, which show, for example, that most Americans (falsely) believe that it is illegal for online and brick-and-mortar stores to use consumer data to charge different prices to different people buying identical products at the same time of day (Turow et al. 2014).
HIPAA (1996) is a United States law that regulates the sharing of medical information.
The “FICO” score (named for its creator, originally the Fair, Isaac and Company) is a consumer score that estimates risk of default on a loan. In the United States, the FICO score is the primary means by which one’s creditworthiness is determined (O’Neil 2016).
The FDA regulates food and drugs in the United States for safety and quality.
References
Acquisti, A., Taylor, C. R., & Wagman, L. (2016). The economics of privacy. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network. Retrieved March 8, 2016 from https://papers.ssrn.com/abstract=2580411.
Akerlof, G. A. (1970). The market for ‘Lemons’: Quality uncertainty and the market mechanism. The Quarterly Journal of Economics, 84(3), 488–500. https://doi.org/10.2307/1879431.
Alexander, M., & West, C. (2012). The New Jim Crow: mass incarceration in the age of colorblindness. New York: The New Press.
Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989. https://doi.org/10.1177/1461444816676645.
Anderson, E. (1990). The ethical limitations of the market. Economics and Philosophy, 6(2), 179–205. https://doi.org/10.1017/S0266267100001218.
Anderson, E. (1995). Value in ethics and economics (Revised edn.). Cambridge: Harvard University Press.
Angwin, J., & Larson, J. (2015). The tiger mom tax: Asians are nearly twice as likely to get a higher price from Princeton Review. ProPublica. Retrieved September 1, 2015 from https://www.propublica.org/article/asians-nearly-twice-as-likely-to-get-higher-price-from-princeton-review.
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016a). Machine bias. ProPublica. Retrieved May 23, 2016 from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
Angwin, J., & Parris, T., Jr. (2016b). Facebook lets advertisers exclude users by race. ProPublica. Retrieved October 28, 2016 from https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race.
Angwin, J., Tobin, A., & Varner, M. (2017a). Facebook (still) letting housing advertisers exclude users by race. ProPublica. Retrieved November 21, 2017 from https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.
Angwin, J., Varner, M., & Tobin, A. (2017b). Facebook enabled advertisers to reach ‘Jew haters’. ProPublica. Retrieved September 14, 2017 from https://www.propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters.
Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday. https://doi.org/10.5210/fm.v11i9.1394.
Berendt, B., Günther, O., & Spiekermann, S. (2005). Privacy in E-Commerce: Stated preferences vs. actual behavior. Communications of the ACM, 48(4), 101–106. https://doi.org/10.1145/1053291.1053295.
Citron, D., & Pasquale, F. (2014). The scored society: Due process for automated predictions. Faculty Scholarship. Retrieved January 1, 2014 from http://digitalcommons.law.umaryland.edu/fac_pubs/1431.
DeCew, J. (2018). Privacy. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2018 edition). Metaphysics Research Lab, Stanford University. Retrieved from https://plato.stanford.edu/archives/spr2018/entries/privacy/.
Dixon, P., & Gellman, B. (2014). The scoring of America: How secret consumer scores threaten your privacy and your future. World Privacy Forum.
Duhigg, C. (2012). How companies learn your secrets. The New York Times, February 16, 2012, sec. Magazine. Retrieved from https://www.nytimes.com/2012/02/19/magazine/shopping-habits.html.
EU General Data Protection Regulation (GDPR). (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1.
Experian. (2017). Summarized credit statistics. Retrieved February 1, 2017 from http://www.experian.com/assets/marketing-services/productsheets/summarized-credit-stat.pdf.
Fair Isaac Corporation. (2000). Fair, Isaac ‘Demystifies’ FICO scores with list of score factors, Web-Based Explanation Service. Retrieved June 8, 2000 from http://www.prnewswire.com/news-releases/fair-isaac-demystifies-fico-scoreswith-list-of-score-factors-web-based-explanation-service-73492572.html.
Federal Trade Commission. (2015). Data brokers: A call for transparency and accountability. Scotts Valley: CreateSpace Independent Publishing Platform.
Glenn, T., & Monteith, S. (2014). Privacy in the digital world: Medical and health data outside of HIPAA protections. Current Psychiatry Reports, 16(11), 494.
Harris, E., Perlroth, N., Popper, N., & Stout, H. (2014). A sneaky path into target customers’ wallets. The New York Times, January 17, 2014.
Hoofnagle, C. J., & King, J. (2007). Consumer information sharing: Where the sun still don’t shine. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network. Retrieved December 17, 2007 from https://papers.ssrn.com/abstract=1137990.
Kaplan, B. (2015). Selling health data: De-identification, privacy, and speech. Cambridge Quarterly of Healthcare Ethics: CQ: The International Journal of Healthcare Ethics Committees, 24(3), 256–271. https://doi.org/10.1017/S0963180114000589.
Laudon, K. C. (1996). Markets and privacy. Communications of the ACM, 39(9), 92–104. https://doi.org/10.1145/234215.234476.
LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. https://doi.org/10.1038/nature14539.
Mui, Y. Q. (2011). Little-known firms tracking data used in credit scores. The Washington Post, July 16, 2011.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy (1st ed.). New York: Crown.
Obar, J. A., & Oeldorf-Hirsch, A. (2016). The biggest lie on the internet: Ignoring the privacy policies and terms of service policies of social networking services. In TPRC 44: The 44th Research Conference on Communication, Information and Internet Policy. https://doi.org/10.2139/ssrn.2757465. Retrieved from SSRN https://ssrn.com/abstract=2757465.
Pasquale, F. (2016). The Black Box Society: The secret algorithms that control money and information. Cambridge: Harvard University Press.
Posner, R. (1981). The economics of privacy. American Economic Review, 71(2), 405–409.
PublicData. Policies and positions. Retrieved February 1, 2018 from http://www.publicdata.com/mobilepandp.html.
Ramirez, E. (2013). The privacy challenges of big data: A view from the lifeguard’s chair. Keynote address, Aspen, CO.
Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018). How Trump consultants exploited the Facebook data of millions. The New York Times, March 17, 2018.
Rothstein, R. (2017). The color of law: A forgotten history of how our government segregated America (1st ed.). New York: Liveright.
Sandel, M. J. (2013). What Money Can’t Buy: The Moral Limits of Markets. (Reprint ed.). New York: Farrar Straus Giroux.
Satz, D. (2008). XIV—the moral limits of markets: The case of human kidneys. Proceedings of the Aristotelian Society, 108(1), 269–288. https://doi.org/10.1111/j.1467-9264.2008.00246.x.
Satz, D. (2012). Why some things should not be for sale: The moral limits of markets (Reprint ed.). New York: Oxford University Press.
Sen, A. (1999). On ethics & economics. New Delhi: Oxford.
Sweeney, L. (2013). Discrimination in online ad delivery. arXiv:1301.6822 [cs]. Retrieved January 28, 2013 from http://arxiv.org/abs/1301.6822.
Sweeney, L. (2015) Only You, Your Doctor, and Many Others May Know. Technology Science. Retrieved September 29, 2015 from https://techscience.org/a/2015092903/.
Turow, J. (2017). The Aisles have eyes: How retailers track your shopping, strip your privacy, and define your power. New Haven: Yale University Press.
Turow, J., Bleakley, A., Bracken, J., Carpini, M. X. D., Draper, N., Feldman, L., Good, N., et al. (2014). Americans, marketers, and the internet: 1999–2012. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network. Retrieved April 11, 2014 from https://papers.ssrn.com/abstract=2423753.
Tutt, A. (2016). An FDA for Algorithms. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network. Retrieved March 15, 2016 from https://papers.ssrn.com/abstract=2747994.
United States. (2004). The Health Insurance Portability and Accountability Act (HIPAA). Washington, D.C.: U.S. Dept. of Labor, Employee Benefits Security Administration. http://purl.fdlp.gov/GPO/gpo10291.
van den Hoven, J. (2008). Information technology, privacy, and the protection of personal data. In M. J. van den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (p. 301). Cambridge: Cambridge University Press.
van den Hoven, J., Blaauw, M., Pieters, W., & Warnier, M. (2018). Privacy and information technology. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Summer 2018 edition). Metaphysics Research Lab, Stanford University. Retrieved from https://plato.stanford.edu/archives/sum2018/entries/it-privacy/.
Walzer, M. (1983). Spheres of justice: A defense of pluralism and equality. New York: Basic Books.
Yu, P. S., & Dietrich, S. M. (2012). Broken records: How errors by criminal background checking companies harm workers and businesses. National Consumer Law Center, April 11, 2012.
Acknowledgements
We are grateful to Hadley Cooney, Myriam Garcia, Dan Hausman, Zi Lin, Harvey D. Long, Farid Masrour, Kian Mintz-Woo, Joshua Mund, David O'Brien, Emi Okayasu, Emma Prendergast, Ben Schwan, Olav Vassend, participants at the 2018 Zicklin Center Normative Business Ethics Workshop Series, the 2018 Information Ethics Roundtable and the 2017 Great Lakes Philosophy Conference, audiences at California State University, Sacramento and the iSchool at UW-Madison, and three anonymous referees at Ethics and Information Technology. We are especially grateful to Alan Rubel for extensive feedback.
Cite this article
Pham, A., Castro, C. The moral limits of the market: the case of consumer scoring data. Ethics Inf Technol 21, 117–126 (2019). https://doi.org/10.1007/s10676-019-09500-7