Privacy, Literacy, and Awareness Paradox in Big Data Ecosystems

The abstract and a copy of the presentation that we (Mihaela Popescu & Lemi Baruh) gave at the IAMCR conference in Dublin are available below. The paper introduces the concept of the “awareness paradox” to discuss privacy literacy in an era of Big Data analytics. Thanks to all listeners for their responses and questions. The full paper is still a work in progress.

Title: Digital Literacy & Privacy Self-governance: A Value-based Approach to Privacy in Big Data Ecosystems

Abstract: The growth of interactive and mobile technologies, which equip institutions with vast capabilities to amass an unprecedented amount of information about individuals, has enabled the development of new business models. These business models rely on data-mining techniques and real-time or near real-time Big Data analytics designed to uncover hidden patterns and relations. Increasingly, the use of personal information, including behavioral and locational data, for large-scale data mining is becoming a normative capital-generating practice.

This new regime of data-intensive surveillance is akin to a fishing expedition that starts by comparing each data-point to the population base, signaling any deviation as a potential risk to avoid or an opportunity to capitalize on. More importantly, by relying on algorithmic analysis of data, this regime of surveillance removes humans from the interpretation process, makes the process increasingly opaque, and adds an aura of objectivity that preempts challenges to the epistemological foundations of its inferences. At the same time, as policies for digital media use shift toward promoting self-management, they increasingly assume an ideal “omni-competent” user who is at once able to “benefit” from being open to information sharing and communication availability, weigh these benefits against the potential risks in the digital environment, and engage in risk-aversive behaviors. This assumption is distinctly at odds with the reality of individuals’ understanding of privacy risks. The ubiquitous and technically specialized nature of modern data collection makes it increasingly difficult for online and mobile users to understand which entities are collecting data about them and how. Moreover, studies show that less than a third of online users read, however partially, a website’s privacy policy (McDonald et al. 2009), with only about 2% likely to read it thoroughly (Turow et al. 2007). Ironically, as the work of the Ponemon Institute in the United States demonstrates, absent legislation to the contrary, Big Data also means that companies are able to identify privacy-centric customers and treat them differently in order to allay their concerns, rather than providing privacy-conscious policies for all customers (Urbanski, 2013).

This paper positions the discussion of privacy self-governance in the context of normative digital literacy skills. The paper seeks to unpack and question the nature of these apparently conflicting digital literacy competencies as they relate to current privacy policies in the United States, policies which emphasize individual responsibility, choice, and informed consent by educated consumers. Taking as its point of departure an exploration of what it means to be aware of privacy risks, the paper analyzes recent policy documents issued by governmental agencies as well as commercial discourse in marketing trade journals, in order to examine how state and market-based entities frame both the meaning of user privacy risk awareness and the meaning of voluntary opt-in into data collection regimes. Next, by referencing data collection and data mining practices in Big Data ecosystems, the paper discusses to what extent these differential framings translate into a coherent set of principles that position privacy education among the “new” literacies said to enhance individual participation in the digital economy. Given the “take it or leave it” approach adopted by corporations that engage in data collection and use, the paper questions the potential of digital literacy to translate into meaningful user actions capable of prompting a change in the data practices that dominate the U.S. market. The paper concludes with a discussion of the policy implications of this “awareness paradox”—a concept that describes how greater digital literacy about privacy may lead some users to withdraw from the market for certain types of online services, which, in turn, may decrease the incentives for the market to cater to their privacy needs.

Click Here for the Presentation

In case it is useful, please cite as: Popescu, M. & Baruh, L. (2013, June). Digital Literacy and Privacy Self-Governance: A Value-Based Approach to Privacy in Big Data Ecosystems. Paper presented at the annual meeting of the International Association for Media and Communication Research Conference, Dublin, Ireland.