Improving the prediction of users’ disclosure behavior… by making them disclose more predictably?

Helping users with information disclosure decisions: Potential for adaptation

2013

Personalization relies on personal data about each individual user. Users, however, are often reluctant to disclose information about themselves and to be "tracked" by a system. We investigated whether different types of rationales (justifications) for disclosure suggested in the privacy literature would increase users' willingness to divulge demographic and contextual information about themselves, and would raise their satisfaction with the system. Prompted by findings from the literature, we also examined the effect of the order of requests. Our experiment with a mockup of a mobile app recommender shows that no single strategy is optimal for everyone. Heuristics can be defined, however, that select for each user the most effective justification for raising disclosure or satisfaction, taking into account the user's gender, disclosure tendency, and the type of solicited personal information. We discuss the implications of these findings for research aimed at personalizing privacy strategies to each individual user.
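
The per-user selection this abstract describes can be pictured as a simple rule table. The sketch below is purely illustrative, assuming hypothetical attribute values and justification labels; it is not the paper's actual heuristic or its fitted model.

```python
# Illustrative sketch only: a lookup-style heuristic that picks a disclosure
# justification per user, in the spirit of the adaptation described above.
# The rule table, attribute names, and justification labels are hypothetical.

JUSTIFICATION_RULES = {
    # (gender, disclosure_tendency, info_type) -> justification strategy
    ("female", "low",  "demographic"): "usefulness-for-others",
    ("female", "high", "demographic"): "none",
    ("male",   "low",  "contextual"):  "usefulness-for-you",
    ("male",   "high", "contextual"):  "number-of-others-disclosing",
}

def select_justification(gender: str, tendency: str, info_type: str) -> str:
    """Return the justification expected to maximize disclosure (or
    satisfaction) for this user and this type of solicited information."""
    return JUSTIFICATION_RULES.get((gender, tendency, info_type), "none")

print(select_justification("female", "low", "demographic"))  # usefulness-for-others
```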

Privacy & Personalization: Preliminary Results of an Empirical Study of Disclosure Behavior

2005

This paper describes empirical research into the privacy preferences and behaviors of individuals regarding personalization in music recommender systems. The study concerns music recommendations based on two different types of user information: preferences for music genres, and personality traits. Our results indicate similar disclosure behavior by users for both types of personal information, which contradicts the attitudes users reported in post-experiment questionnaires and interviews. Factors found to influence disclosure behavior are: information about the purpose of the disclosure and the recipients of the information, the degree of confidentiality of the information involved, and the benefits people expect to gain from disclosing personal information.

Human Aspects and Perception of Privacy in Relation to Personalization

ArXiv, 2017

The concept of privacy is inherently intertwined with human attitudes and behaviours, as most computer systems are primarily designed for human use. This is especially true of Recommender Systems, which feed on information provided by individuals: their efficacy critically depends on whether information is externalized and, if it is, on how much of that information contributes positively to their performance and accuracy. In this paper, we discuss the impact of several factors on users' information disclosure behaviours and privacy-related attitudes, and how users of recommender systems can be nudged into making better privacy decisions for themselves. We also address the problem of privacy adaptation, i.e., effectively tailoring Recommender Systems by gaining a deeper understanding of people's cognitive decision-making processes.

Online Privacy Heuristics that Predict Information Disclosure

Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems

Online users' attitudes toward privacy are context-dependent. Studies show that contextual cues are quite influential in motivating users to disclose personal information. Increasingly, these cues are embedded in the interface, but the mechanisms of their effects (e.g., unprofessional design contributing to more disclosure) are not fully understood. We posit that each cue triggers a specific "cognitive heuristic" that provides a rationale for decision-making. Using a national survey (N = 786) that elicited participants' disclosure intentions in common online scenarios, we identify 12 distinct heuristics relevant to privacy and demonstrate that they are systematically associated with information disclosure. The data show that participants with higher accessibility to a given heuristic are more likely to disclose information. Design implications for the protection of online privacy and security are discussed.

Does self-disclosure matter? A dynamic two-stage perspective for the personalization-privacy paradox

Although marketing managers rely increasingly on customer data, insight into the best approaches for resolving the personalization-privacy paradox remains limited. We argue that successful personalization integrates two stages: a self-disclosure stage and a personalization stage. Using a conceptual framework grounded in the foot-in-the-door effect, we argue that compliance with an initial small request, a commitment to self-disclosure, induces greater compliance with the later target request. The results of a large-scale two-stage field experiment, based on a combined propensity score matching and difference-in-differences model, show a positive causal effect of the act of self-disclosure, and of the intensity of self-disclosure, on purchase responses to personalized promotions. The results also indicate that a combination of privacy assurance and personalization declaration drives customers' act of self-disclosure and increases the intensity of self-disclosure. The findings empower managers to capitalize on new opportunities in personalization.
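
To make the named analysis concrete, here is a minimal sketch of propensity score matching combined with a difference-in-differences estimate, run on synthetic data. The column names, covariates, and effect size are hypothetical stand-ins, not the authors' dataset or specification.

```python
# Minimal sketch: propensity-score matching + difference-in-differences,
# illustrating the analysis strategy described above on synthetic data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "prior_purchases": rng.poisson(3, n),
})
# Treatment: whether the customer self-disclosed at the first stage.
df["disclosed"] = rng.binomial(1, 0.4, n)
# Outcome: purchase response before/after the personalized promotion
# (a true effect of 1.5 is baked into the synthetic data).
df["purchase_pre"] = rng.normal(10, 2, n)
df["purchase_post"] = df["purchase_pre"] + 1.5 * df["disclosed"] + rng.normal(0, 2, n)

# 1) Estimate propensity scores from observed covariates.
X = df[["age", "prior_purchases"]]
df["ps"] = LogisticRegression().fit(X, df["disclosed"]).predict_proba(X)[:, 1]

# 2) Match each treated customer to the nearest control (with replacement).
treated = df[df["disclosed"] == 1]
control = df[df["disclosed"] == 0]
matched_idx = [(control["ps"] - p).abs().idxmin() for p in treated["ps"]]
matched_control = control.loc[matched_idx]

# 3) Difference-in-differences on the matched sample.
did = ((treated["purchase_post"].mean() - treated["purchase_pre"].mean())
       - (matched_control["purchase_post"].mean() - matched_control["purchase_pre"].mean()))
print(f"DiD estimate of the self-disclosure effect: {did:.2f}")
```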

Tell me more, tell me more: repeated personal data requests increase disclosure

Journal of Cybersecurity

Personal data is of great commercial benefit and potential sensitivity. For the consumers who provide their personal data, however, doing so comes with potential costs, benefits, and security risks. Typically, consumers have the option to consent to the use of personal/sensitive data, but existing research suggests consumer choices may be only weakly related to their concerns (the privacy paradox). Here, we examine whether the repetitive nature of data requests alters behaviour but not concern, thereby explaining the divergence. This work is theoretically grounded in 'foot in the door' research, in which small initial requests facilitate subsequent larger requests. An initial laboratory study asking for real, personal data demonstrated increased information disclosure at a subsequent request. A second online study replicated the increased-disclosure effect and found no change in associated privacy concern. We find that this supports foot-in-the-door as one explanation of the privacy paradox.

Impacts of user privacy preferences on personalized systems

2004

Personalized (or “user-adaptive”) systems have gained substantial momentum with the rise of the World Wide Web. The market research firm Jupiter (Foster, 2000) defines personalization as predictive analysis of consumer data used to adapt targeted media, advertising and merchandising to consumer needs.

Unlocking the privacy paradox

CHI '13 Extended Abstracts on Human Factors in Computing Systems, 2013

Even though users have become increasingly concerned about their privacy online, they continue to disclose deeply personal information in a number of online venues, including e-commerce portals and social networking sites. Scholars have tried to explain this inconsistency between attitudes and behavior by suggesting that online users consciously weigh the trade-off between the costs and benefits of online information disclosure. We argue that online user behaviors are not always rational, but may occur due to expedient decision-making in the heat of the moment. Such decisions are based on cognitive heuristics (i.e., rules of thumb) rather than on a careful analysis of each transaction. Based on this premise, we seek to identify the specific triggers for disclosure of private information online. In the experiment reported here, we explore the operation of two specific heuristics, benefit and fuzzy boundary, in influencing privacy-related attitudes and behaviors. Theoretical and design implications are discussed.

Easing the Burden of Setting Privacy Preferences: A Machine Learning Approach

Communications in Computer and Information Science, 2017

Setting appropriate privacy preferences is both a difficult and cumbersome task for users. In this paper, we propose a solution that addresses users' privacy concerns by easing the burden of manually configuring appropriate privacy settings when they register with a new system or service. To achieve this, we implemented a machine learning approach that provides users with personalized privacy-by-default settings. In particular, the proposed approach combines prediction and clustering techniques to model and infer the privacy profiles associated with users' privacy preferences, taking into account combinations of service providers, types of personal data, and usage purposes. Based on a minimal number of questions that users answer at registration, it predicts their privacy preferences and sets an optimal default privacy setting. We evaluated our approach on a data set from a questionnaire administered to 10,000 participants. Results show that with limited user input of only five answers, the system predicts personalized privacy settings with 85% accuracy.
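
The described pipeline, clustering full preference profiles and then predicting a new user's profile from a handful of registration answers, can be sketched as follows. This is a minimal illustration with synthetic data and arbitrary parameters, not the paper's actual model or questionnaire.

```python
# Minimal "cluster, then predict" sketch for privacy-by-default settings.
# All data, the five registration questions, and parameters are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Historical users: full privacy-preference vectors, one allow/deny decision
# per (service provider, data type, usage purpose) combination.
n_users, n_settings = 2000, 60
full_prefs = (rng.random((n_users, n_settings)) < 0.5).astype(int)

# 1) Cluster users into a handful of privacy profiles.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(full_prefs)
profiles = kmeans.labels_

# 2) Train a classifier to predict a user's profile from only 5 answers
#    collected at registration (here: 5 of the settings, chosen arbitrarily).
question_cols = [0, 12, 25, 40, 55]
clf = RandomForestClassifier(random_state=0).fit(full_prefs[:, question_cols], profiles)

# 3) For a new user, predict the profile and apply its centroid (rounded to
#    allow/deny) as the personalized default settings.
new_answers = full_prefs[:1, question_cols]          # stand-in for a new registrant
profile = clf.predict(new_answers)[0]
default_settings = (kmeans.cluster_centers_[profile] > 0.5).astype(int)
print(f"Predicted profile {profile}; first defaults: {default_settings[:10]}")
```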

Counteracting the negative effect of form auto-completion on the privacy calculus

When filling out web forms, people typically do not want to submit every piece of requested information to every website. Instead, they selectively disclose information after weighing the potential benefits and risks of disclosure: a process called the "privacy calculus". Giving users control over what to enter is a prerequisite for this selective disclosure behavior. Exercising this control by manually filling out a form is a burden, though. Modern browsers therefore offer an auto-completion feature that automatically fills out forms with previously stored values. This feature is convenient, but it makes it so easy to submit a fully completed form that users seem to skip the privacy calculus altogether. In an experiment, we compare this traditional auto-completion tool with two alternative tools that give users more control. While users of the traditional tool indeed forego their selective disclosure behavior, the alternative tools effectively reinstate the privacy calculus.