Abstract
Users of social, economic, or medical networks share personal information in exchange for tangible benefits, but may be harmed by leakage and misuse of the shared information. This paper analyzes the effects of privacy enhancements on the tradeoffs faced by such privacy-concerned individuals. The main insights are that different privacy enhancements may have opposite effects on the volume of information sharing, and that while they always appear beneficial to non-strategic users, privacy enhancements may backfire when users are strategic.

The observation that privacy regulation may be harmful is not new: the burgeoning empirical and experimental literature on the topic has shown that the effects of regulation may be positive or negative, depending on the context [1]. The theoretical literature on privacy has its roots in the work of [3] and [4], who derive a similar conclusion in a signaling context: under stronger privacy regimes individuals can more readily hide negative traits, which may harm other market participants and social welfare. This paper's goal is to identify properties of interactions that determine the effects of various privacy regulations.

The point of departure is the observation that the conception of privacy commonly studied in the theory literature, namely as a technology for altering the signaling capabilities of individuals, misses a key dimension of privacy harm: a concern not about third parties' inferences about individuals' types, but rather about misuse of the information itself. Victims of identity theft, spam, harassment, stalking, re-identification, online tracking, excessive profiling, and targeted advertising care less about whether the information leaked about them is positive or negative, and more about the fact that information has been leaked and misused in the first place. We frame the paper within the context of online social networks, but describe its applicability in other contexts as well. In the model, agents are not concerned about what the leaked information signals about their type, but rather about the quantity of personal information leaked.

As a preliminary formalization, consider a user of a social network who wishes to share some information. The user derives a benefit from sharing information on the platform, captured by an increasing function of the amount of information shared. However, there is some chance of information leakage and misuse, in which case the user incurs a cost, captured by another increasing function of the amount of information shared. The user thus faces a tradeoff between benefit and cost. Is a privacy enhancement, in the form of lowering the probability of leakage, beneficial to the user? The answer is easily seen to be positive.
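A minimal sketch of this preliminary formalization, under assumed notation that is not taken from the abstract (x for the amount of information shared, v and c for the benefit and cost functions, p for the leakage probability):

```latex
% Illustrative notation (assumed, not from the paper):
%   x     amount of personal information shared
%   v(x)  benefit from sharing, increasing in x
%   c(x)  cost incurred if the information leaks, increasing in x
%   p     probability of leakage and misuse
\[
  U(x; p) \;=\; v(x) \;-\; p\, c(x),
  \qquad
  \frac{\partial U}{\partial p} \;=\; -\,c(x) \;<\; 0 .
\]
% Holding x fixed (a non-strategic user), lowering p can only raise U.
% A strategic user instead re-optimizes x^*(p) \in \arg\max_x U(x; p),
% so the amount shared may change with p and the welfare comparison
% is no longer immediate; this is where a privacy enhancement can backfire.
```

This sketch only illustrates the fixed-sharing case the abstract calls "easily seen to be positive"; the paper's analysis concerns what happens once users adjust their sharing decisions in response to the enhancement.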
Original language | English |
---|---|
Title of host publication | EC 2017 - Proceedings of the 2017 ACM Conference on Economics and Computation |
Publisher | Association for Computing Machinery, Inc |
Pages | 349-350 |
Number of pages | 2 |
ISBN (Electronic) | 9781450345279 |
DOIs | |
State | Published - 20 Jun 2017 |
Externally published | Yes |
Event | 18th ACM Conference on Economics and Computation, EC 2017 - Cambridge, United States Duration: 26 Jun 2017 → 30 Jun 2017 |
Publication series
Name | EC 2017 - Proceedings of the 2017 ACM Conference on Economics and Computation |
---|---|
Conference
Conference | 18th ACM Conference on Economics and Computation, EC 2017 |
---|---|
Country/Territory | United States |
City | Cambridge |
Period | 26/06/17 → 30/06/17 |
Bibliographical note
Publisher Copyright: © 2017 ACM.
ASJC Scopus subject areas
- Computer Science (miscellaneous)
- Statistics and Probability
- Computational Mathematics
- Economics and Econometrics