Never trust anything that can think for itself, if you can’t control its privacy settings: The influence of a robot’s privacy settings on users’ attitudes and willingness to self-disclose

Stapels JG, Penner A, Diekmann N, Eyssel F (2023)
International Journal of Social Robotics 15: 1487–1505.

Journal Article | Published | English
Author(s)
Stapels, Julia G.; Penner, Angelika (UniBi); Diekmann, Niels; Eyssel, Friederike (UniBi)
Abstract
When encountering social robots, potential users often face a dilemma between privacy and utility. That is, high utility often comes at the cost of lenient privacy settings that allow the robot to store personal data and to connect to the internet permanently, which entails associated data security risks. However, to date it remains unclear how this dilemma affects attitudes and behavioral intentions towards the respective robot. To shed light on the influence of a social robot’s privacy settings on robot-related attitudes and behavioral intentions, we conducted two online experiments with a total sample of N = 320 German university students. In Experiment 1, we hypothesized that strict privacy settings, compared to lenient privacy settings, would result in more favorable attitudes and behavioral intentions towards the robot. For Experiment 2, we expected more favorable attitudes and behavioral intentions when participants chose the robot’s privacy settings independently rather than evaluating preset privacy settings. However, these two manipulations seemed to influence attitudes towards the robot in diverging domains: while strict privacy settings increased trust, decreased subjective ambivalence, and increased the willingness to self-disclose compared to lenient privacy settings, the choice of privacy settings seemed to primarily affect robot likeability, contact intentions, and the depth of potential self-disclosure. Strict privacy settings, compared to lenient ones, might reduce the risk associated with robot contact and thereby also reduce risk-related attitudes and increase trust-dependent behavioral intentions. However, when allowed to choose, people make the robot ‘their own’ by making a privacy–utility tradeoff. This tradeoff is likely a compromise between full privacy and full utility and thus does not reduce the risks of robot contact as much as strict privacy settings do.
Future experiments should replicate these results using real-life human–robot interaction and different scenarios to further investigate the psychological mechanisms causing such divergences.
Keywords
Social robot; Data protection; Privacy; Self-disclosure; Attitudes towards robots; Ambivalence
Year of Publication
2023
Journal Title
International Journal of Social Robotics
Volume
15
Page(s)
1487–1505
ISSN
1875-4791
eISSN
1875-4805
Funding Information
Open access publication costs were funded by Bielefeld University under the DEAL agreement.
Page URI
https://pub.uni-bielefeld.de/record/2983122

Cite

Stapels, J. G., Penner, A., Diekmann, N., & Eyssel, F. (2023). Never trust anything that can think for itself, if you can’t control its privacy settings: The influence of a robot’s privacy settings on users’ attitudes and willingness to self-disclose. International Journal of Social Robotics, 15, 1487–1505. https://doi.org/10.1007/s12369-023-01043-8
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Full Text(s)
Access Level
OA Open Access
Last Uploaded
2024-07-03T07:55:36Z
MD5 Checksum
0ee1b609afa917272ab6ddfc1c8fcc2a

