Ghost-in-the-Machine reveals human social signals for human–robot interaction
Loth S, Jettka K, Giuliani M, de Ruiter J (2015)
Frontiers in Psychology 6: 1641.
Journal Article
| Published | English
Author
Department
Abstract / Notes
We used a new method called “Ghost-in-the-Machine” (GiM) to investigate social interactions with a robotic bartender taking orders for drinks and serving them. The GiM paradigm allowed us to identify how human participants recognize the intentions of customers on the basis of the output of the robotic recognizers. Specifically, we measured which recognizer modalities (e.g., speech, the distance to the bar) were relevant at different stages of the interaction. This provided insights into the human social behavior necessary for developing socially competent robots. When customers initiated the drink-order interaction, the most important recognizers were those based on computer vision. When drink orders were being placed, however, the most important information source was speech recognition. Interestingly, the participants used only a subset of the available information, focusing on a few relevant recognizers while ignoring the others. This reduced the risk of acting on erroneous sensor data and enabled them to complete service interactions more swiftly than a robot using all available sensor data. We also investigated socially appropriate response strategies. In their responses, the participants preferred to use the same modality as the customers’ requests, e.g., they tended to respond verbally to verbal requests. They also added redundancy to their responses, for instance by using echo questions. We argue that incorporating the social strategies discovered with the GiM paradigm into multimodal grammars of human–robot interactions improves the robustness and ease of use of these interactions, and therefore provides a smoother user experience.
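The stage-dependent use of recognizer subsets and the modality-matched, redundant responses described in the abstract can be illustrated with a minimal Python sketch. Everything in this fragment is hypothetical: the recognizer fields, stage labels, thresholds, and function names are assumptions made for illustration and do not reproduce the robotic bartender's actual sensor interface or interaction grammar.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: field names and thresholds are assumptions,
# not the actual recognizer outputs of the robotic bartender.
@dataclass
class RecognizerFrame:
    distance_to_bar_m: Optional[float] = None  # computer-vision estimate
    facing_bartender: Optional[bool] = None    # computer-vision estimate
    speech_text: Optional[str] = None          # ASR hypothesis
    speech_confidence: float = 0.0             # ASR confidence (0..1)
    gesture: Optional[str] = None              # deliberately ignored below


def interpret(frame: RecognizerFrame, stage: str) -> Optional[dict]:
    """Read only a stage-relevant subset of recognizers.

    Initiation relies on vision-based cues (distance, orientation);
    ordering relies on speech recognition; other channels are ignored,
    which reduces the risk of acting on erroneous sensor data.
    """
    if stage == "initiation":
        if (frame.distance_to_bar_m is not None
                and frame.distance_to_bar_m < 1.0  # assumed threshold
                and frame.facing_bartender):
            return {"intent": "wants_service", "modality": "vision"}
        return None
    if stage == "ordering":
        if frame.speech_text and frame.speech_confidence >= 0.5:  # assumed threshold
            return {"intent": "order", "content": frame.speech_text,
                    "modality": "speech"}
        return None
    return None


def respond(interpretation: dict) -> str:
    """Answer in the customer's modality and add redundancy (echo question)."""
    if interpretation["modality"] == "speech":
        return f"So that's {interpretation['content']}, right?"
    return "nod_and_establish_eye_contact"  # non-verbal reply to a non-verbal bid


# Example: a confident spoken order is echoed back verbally.
frame = RecognizerFrame(speech_text="a pint of lager", speech_confidence=0.8)
print(respond(interpret(frame, stage="ordering")))  # So that's a pint of lager, right?
```

Restricting each stage to one or two recognizers mirrors the finding that participants ignored most available channels, and the echo question adds the redundancy the paper reports in the participants' verbal responses.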
Year of Publication
2015
Journal Title
Frontiers in Psychology
Volume
6
Article Number
1641
ISSN
1664-1078
Funding Information
Open access publication costs were funded by the Deutsche Forschungsgemeinschaft and Universität Bielefeld.
Page URI
https://pub.uni-bielefeld.de/record/2785561
Cite
Loth S, Jettka K, Giuliani M, de Ruiter J. Ghost-in-the-Machine reveals human social signals for human–robot interaction. Frontiers in Psychology. 2015;6:1641.
Loth, S., Jettka, K., Giuliani, M., & de Ruiter, J. (2015). Ghost-in-the-Machine reveals human social signals for human–robot interaction. Frontiers in Psychology, 6, 1641. https://doi.org/10.3389/fpsyg.2015.01641
Loth, Sebastian, Jettka, Katharina, Giuliani, Manuel, and de Ruiter, Jan. 2015. “Ghost-in-the-Machine reveals human social signals for human–robot interaction”. Frontiers in Psychology 6: 1641.
Loth, S., Jettka, K., Giuliani, M., and de Ruiter, J. (2015). Ghost-in-the-Machine reveals human social signals for human–robot interaction. Frontiers in Psychology 6:1641.
Loth, S., et al., 2015. Ghost-in-the-Machine reveals human social signals for human–robot interaction. Frontiers in Psychology, 6: 1641.
S. Loth, et al., “Ghost-in-the-Machine reveals human social signals for human–robot interaction”, Frontiers in Psychology, vol. 6, 2015: 1641.
Loth, S., Jettka, K., Giuliani, M., de Ruiter, J.: Ghost-in-the-Machine reveals human social signals for human–robot interaction. Frontiers in Psychology. 6, 1641 (2015).
Loth, Sebastian, Jettka, Katharina, Giuliani, Manuel, and de Ruiter, Jan. “Ghost-in-the-Machine reveals human social signals for human–robot interaction”. Frontiers in Psychology 6 (2015): 1641.
All files are available under the following license(s):
Copyright Statement:
This object is protected by copyright and/or related rights. [...]
Full Text(s)
Access Level
Open Access
Last Uploaded
2019-09-25T06:44:18Z
MD5 Checksum
b2120d3258a82f5fabb7a8a2c9fd3ddb
Data provided by the European Bioinformatics Institute (EBI)
3 Citations in Europe PMC
Data provided by Europe PubMed Central.
Seat Choice in a Crowded Café: Effects of Eye Contact, Distance, and Anchoring.
Staats H, Groot P., Front Psychol 10(), 2019
PMID: 30837924
Confidence in uncertainty: Error cost and commitment in early speech hypotheses.
Loth S, Jettka K, Giuliani M, Kopp S, de Ruiter JP., PLoS One 13(8), 2018
PMID: 30067853
Sources
PMID: 26582998
PubMed | Europe PMC