Abstract
Privacy-by-Design (PbD) suggests designing the fundamental architecture and features of computing systems with privacy in mind. Although PbD has been widely adopted by regulatory frameworks, a growing number of critics question whether its focus on compliance with privacy regulation prevents it from addressing users' specific privacy attitudes and expectations. Motivated to enhance user-centered privacy-by-design processes, we examine the consequences of how privacy questions are framed to crowd users, and how the personal characteristics of those users affect their responses. We recruited a total of 665 participants: 456 via Amazon Mechanical Turk (AMT) and 209 university students. We show that framing computing systems' features as data flows results in less critical evaluations of those features than framing them as descriptions of personal experiences. We also find, based on the student sample, that students with professional engineering experience are less critical than those with no work experience when assessing the features' appropriateness. We discuss how our results can be used to enhance privacy-by-design processes and encourage user-centered privacy engineering.
| Original language | English |
| --- | --- |
| Article number | 102641 |
| Journal | International Journal of Human-Computer Studies |
| Volume | 154 |
| DOIs | |
| State | Published - Oct 2021 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2021 Elsevier Ltd
Keywords
- Framing
- Privacy
- Privacy-by-design
- User-centered design
- Vignette study
ASJC Scopus subject areas
- Human Factors and Ergonomics
- Software
- Education
- General Engineering
- Human-Computer Interaction
- Hardware and Architecture