Teaching area: Digital Infrastructure and Business and Process Management in E-Commerce
Office: 04-1.328 (Parkstadt Mülheim) / 01.206 (Campus Bottrop)
Phone: +49 208 88 254-794
Dr. Carolin Straßmann studied Applied Cognitive and Media Science (Bachelor's and Master's) at the University of Duisburg-Essen. She then worked as a research associate in several BMBF-funded projects at the Chair of Social Psychology: Media and Communication (headed by Prof. Dr. Nicole Krämer). At the end of 2018, she completed her doctorate on "All Eyes on The Agent's Appearance?! Investigation of Target-group-related Social Effects of a Virtual Agent's Appearance in Longitudinal Human-Agent Interactions".
Since March 2024, Dr. Straßmann has held the professorship for Digital Infrastructure and Business and Process Management in E-Commerce. Before that, she had worked as a lecturer for special duties (Lehrkraft für besondere Aufgaben) at the Institute of Computer Science since April 2018 and, from March 2019, gained industry experience at celano GmbH as a participant in the state program "Karrierewege FH-Professur".
Her research focuses primarily on the effects and design of virtual agents and social robots. She examines aspects such as appearance and non-verbal behavior in order to adapt technologies optimally to people's needs. The vision of positive, helpful, and adaptive technology design that contributes to the good of society is the constant motivation behind her work.
WORK AND RESEARCH FOCUS
- Human-technology interaction
- Social effects of virtual agents and social robots
- Long-term interactions with AI-based systems
- Persuasive effects of innovative technologies
- Positive impact of innovative technologies
COURSES
- Applied Statistics
- Software Ergonomics and Usability Engineering
- Cognitive, Communication, and Media Psychology
- Competence Development
- Supervision of theses and student projects
PROJECTS
SELECTED PUBLICATIONS
2024
27. Arntz, Alexander; Helgert, André; Straßmann, Carolin; Eimler, Sabrina C.
Enhancing Human-Robot Interaction Research by Using a Virtual Reality Lab Approach Proceedings Article
In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), pp. 340-344, 2024.
@inproceedings{10445600,
title = {Enhancing Human-Robot Interaction Research by Using a Virtual Reality Lab Approach},
author = {Alexander Arntz and André Helgert and Carolin Straßmann and Sabrina C. Eimler},
doi = {10.1109/AIxVR59861.2024.00058},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR)},
pages = {340-344},
abstract = {Human-robot interaction (HRI) research often faces limitations in real-world environments due to uncontrollable external factors. This applies in particular to field study setups in public spaces, as these can limit the validity of the study results, e.g. due to unpredictable and unsystematic changes in the environment, noise, people passing, etc. Especially for interdisciplinary studies involving psychological perspectives, virtual reality (VR) has emerged as a promising solution, offering realistic, controlled, and reproducible environments. Also, recent technological advancements enable detailed observation of human behavior and physiological responses via eye tracking, physiological assessments, and motion capture. To effectively add value by using VR as a tool, immersion, and presence in the virtual environment are essential preconditions. Besides, the manipulability of the VR environment during runtime is a bonus in exploring human behavior in interaction with robot-enriched spaces. As a methodological innovation in HRI studies, this paper presents a VR lab as a research tool that provides a virtual model of the robot Pepper along with interfaces for easy navigation and adaptive robot behavior. Moreover, the presented Wizard of Oz dashboard allows to flexibly react to the scenery by allowing the manipulation of several robot parameters during runtime. With the help of the VR lab, a framework for a variety of interdisciplinary research purposes in human-robot interaction (not only) in public spaces is provided.},
keywords = {Technological innovation;Solid modeling;Runtime;Human-robot interaction;Virtual environments;Physiology;Robots;Virtual Reality;Human-Robot Interaction;Empirical Studies;Research Platform;Study Tool;Wizard of Oz},
pubstate = {published},
tppubtype = {inproceedings}
}
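The paper above describes a Wizard-of-Oz dashboard that lets an experimenter change robot parameters while a VR session is running. The following minimal Python sketch only illustrates that idea of runtime parameter manipulation with an audit trail; the class, field, and method names (RobotParameters, SessionState, apply_command) are assumptions for illustration and not the interface of the published VR lab.

# Illustrative sketch (not the published system): a Wizard-of-Oz controller
# that applies experimenter commands to a virtual robot's parameters at runtime.
from dataclasses import dataclass, field

@dataclass
class RobotParameters:
    # Hypothetical parameters a wizard might adjust during a session.
    speech_rate: float = 1.0        # relative speaking speed
    gesture_intensity: float = 0.5  # 0 = none, 1 = very expressive
    proxemic_distance: float = 1.2  # preferred distance to the user in meters

@dataclass
class SessionState:
    parameters: RobotParameters = field(default_factory=RobotParameters)
    log: list = field(default_factory=list)  # audit trail for later analysis

    def apply_command(self, name: str, value: float) -> None:
        """Apply a dashboard command and record it together with the old value."""
        old = getattr(self.parameters, name)
        setattr(self.parameters, name, value)
        self.log.append((name, old, value))

# Example: the wizard slows the robot's speech and tones down its gestures.
session = SessionState()
session.apply_command("speech_rate", 0.8)
session.apply_command("gesture_intensity", 0.3)
print(session.parameters, session.log)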
26. Helgert, André; Straßmann, Carolin; Eimler, Sabrina C.
Unlocking Potentials of Virtual Reality as a Research Tool in Human-Robot Interaction: A Wizard-of-Oz Approach Proceedings Article
In: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 535–539, Association for Computing Machinery, Boulder, CO, USA, 2024, ISBN: 9798400703232.
@inproceedings{10.1145/3610978.3640741,
title = {Unlocking Potentials of Virtual Reality as a Research Tool in Human-Robot Interaction: A Wizard-of-Oz Approach},
author = {André Helgert and Carolin Straßmann and Sabrina C. Eimler},
url = {https://doi.org/10.1145/3610978.3640741},
doi = {10.1145/3610978.3640741},
isbn = {9798400703232},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction},
pages = {535–539},
publisher = {Association for Computing Machinery},
address = {Boulder, CO, USA},
series = {HRI '24},
abstract = {Wizard-of-Oz (WoZ) systems represent a widespread method in HRI research. While they are cost-effective, flexible and are often preferred over developing autonomous dialogs in experimental settings, they are typically tailored to specific use cases. In addition, WoZ systems are mainly used in lab studies that deviate from real world scenarios. Here, virtual reality (VR) can be used to immerse the user in a real world interaction scenario with robots. This article highlights the necessity for a modularized and customizable WoZ system, using the benefits of VR. The proposed system integrates well-established features like speech and gesture control, while expanding functionality to encompass a data dashboard and dynamic robot navigation using VR technology. The discussion emphasizes the importance of developing technical systems, like the WoZ system, in a modularized and customizable way, particularly for non-technical researchers. Overcoming usability hurdles is crucial to establishing this tool's role in the HRI research field.},
keywords = {accessibility, social robots, virtual reality, Wizard-of-Oz},
pubstate = {published},
tppubtype = {inproceedings}
}
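The abstract above stresses that a Wizard-of-Oz system should be modular and customizable so that non-technical researchers can extend it. The short sketch below shows one generic way such modularity could look, as a registry of control modules dispatched by name; the registry class and the example modules ("speech", "navigate") are hypothetical and do not reflect the published implementation.

# Illustrative sketch of a modular Wizard-of-Oz command registry
# (class, module, and function names are hypothetical, not the published system).
from typing import Callable, Dict

class WozRegistry:
    def __init__(self) -> None:
        self._modules: Dict[str, Callable[..., str]] = {}

    def register(self, name: str, handler: Callable[..., str]) -> None:
        """Add a control module (e.g. speech, gesture, navigation)."""
        self._modules[name] = handler

    def dispatch(self, name: str, **kwargs) -> str:
        """Route a wizard command to the matching module."""
        if name not in self._modules:
            raise KeyError(f"no module registered for '{name}'")
        return self._modules[name](**kwargs)

registry = WozRegistry()
registry.register("speech", lambda text: f"robot says: {text}")
registry.register("navigate", lambda x, y: f"robot moves to ({x}, {y})")

print(registry.dispatch("speech", text="Hello, how can I help you?"))
print(registry.dispatch("navigate", x=2.0, y=3.5))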
25. Arntz, Alexander; Di Dia, Agostino; Riebner, Tim; Straßmann, Carolin; Eimler, Sabrina C.
Teamwork Makes the Dream Work: A Virtual Reality-based Human-Robot Collaboration Sandbox Simulating Multiple Teams Proceedings Article
In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), pp. 335-339, 2024, ISSN: 2771-7453.
@inproceedings{10445597,
title = {Teamwork Makes the Dream Work: A Virtual Reality-based Human-Robot Collaboration Sandbox Simulating Multiple Teams},
author = {Alexander Arntz and Agostino Di Dia and Tim Riebner and Carolin Straßmann and Sabrina C. Eimler},
doi = {10.1109/AIxVR59861.2024.00057},
issn = {2771-7453},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR)},
pages = {335-339},
abstract = {We present a virtual reality-based Human-Robot Collaboration sandbox that allows the representation of multiple teams composed of humans and robots. Within the sandbox, virtual robots and humans can collaborate with their respective partners and interact with other teams to coordinate the required procedures while accomplishing a shared task. For this purpose, the virtual reality sandbox is equipped with a variety of interaction mechanics that enable a range of different shared tasks. The network integration allows for multiple users within the virtual environment. The VR application contains a library of different industrial robots that can act autonomously controlled by machine learning agents and interact with the user through verbal commands. The sandbox is specifically designed to serve as a research tool to explore new concepts and validate existing approaches in the domain of Human-Robot Collaboration involving autonomous robots in a series of upcoming studies.},
keywords = {Robot kinematics;Virtual assistants;Virtual environments;Industrial robots;Libraries;Teamwork;Task analysis;Human-Robot Collaboration;Virtual Reality;Machine Learning;Artificial Intelligence},
pubstate = {published},
tppubtype = {inproceedings}
}
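To make the idea of multiple mixed human-robot teams coordinating on a shared task a bit more tangible, the sketch below models teams drawing from a common task pool. The data structures (Team, the shared task deque) and the round-robin assignment are assumptions chosen for illustration only; the sandbox's actual data model is not described at that level in the abstract.

# Illustrative sketch: mixed human-robot teams drawing from a shared task pool.
# Class names, fields, and the assignment strategy are assumptions.
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Team:
    name: str
    humans: list
    robots: list
    completed: list = field(default_factory=list)

shared_tasks = deque(["assemble part A", "inspect weld", "pack crate"])

teams = [
    Team("alpha", humans=["participant 1"], robots=["industrial robot (simulated)"]),
    Team("beta", humans=["participant 2"], robots=["industrial robot (simulated)"]),
]

# Round-robin: each team takes the next open task until the pool is empty.
while shared_tasks:
    for team in teams:
        if not shared_tasks:
            break
        team.completed.append(shared_tasks.popleft())

for team in teams:
    print(team.name, "->", team.completed)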
2023
24. Erle, Lukas; Timm, Lara; Straßmann, Carolin; Eimler, Sabrina C.
Using Focus Group Interviews to Examine Biased Experiences in Human-Robot-Interaction Article
In: arXiv, p. 4, 2023.
@article{nokey,
title = {Using Focus Group Interviews to Examine Biased Experiences in Human-Robot-Interaction},
author = {Lukas Erle and Lara Timm and Carolin Straßmann and Sabrina C. Eimler},
editor = {ArXiv},
doi = {10.48550/arXiv.2310.01421},
year = {2023},
date = {2023-09-27},
pages = {4},
abstract = {When deploying interactive agents like (social) robots in public spaces they need to be able to interact with a diverse audience, with members each having individual diversity characteristics and prior experiences with interactive systems. To cater for these various predispositions, it is important to examine what experiences citizens have made with interactive systems and how these experiences might create a bias towards such systems. To analyze these bias-inducing experiences, focus group interviews have been conducted to learn of citizens individual discrimination experiences, their attitudes towards and arguments for and against the deployment of social robots in public spaces. This extended abstract focuses especially on the method and measurement of diversity.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
23. Helgert, André; Eimler, Sabrina C.; Straßmann, Carolin
Virtual Reality as a Tool for Studying Diversity and Inclusion in Human-Robot Interaction: Advantages and Challenges Article
In: arXiv, p. 4, 2023.
@article{nokey,
title = {Virtual Reality as a Tool for Studying Diversity and Inclusion in Human-Robot Interaction: Advantages and Challenges},
author = {André Helgert and Sabrina C. Eimler and Carolin Straßmann},
editor = {ArXiv},
doi = {10.48550/arXiv.2309.14937},
year = {2023},
date = {2023-09-23},
pages = {4},
abstract = {This paper investigates the potential of Virtual Reality (VR) as a research tool for studying diversity and inclusion characteristics in the context of human-robot interactions (HRI). Some exclusive advantages of using VR in HRI are discussed, such as a controllable environment, the possibility to manipulate the variables related to the robot and the human-robot interaction, flexibility in the design of the robot and the environment, and advanced measurement methods related e.g. to eye tracking and physiological data. At the same time, the challenges of researching diversity and inclusion in HRI are described, especially in accessibility, cyber sickness and bias when developing VR-environments. Furthermore, solutions to these challenges are being discussed to fully harness the benefits of VR for the studying of diversity and inclusion.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
22. Erle, Lukas; Timm, Lara; Straßmann, Carolin; Eimler, Sabrina C.
Algorithmic Bias and Digital Divide – An Examination of Discrimination Experiences in Human-System Interactions (Poster) Proceedings Article
In: Melzer, André; Wagener, Gary Lee (Eds.): Proceedings of the 13th Conference of the Media Psychology Division (DGPs), Melusina Press, 2023.
@inproceedings{Erle2023,
title = {Algorithmic Bias and Digital Divide – An Examination of Discrimination Experiences in Human-System Interactions (Poster)},
author = {Lukas Erle and Lara Timm and Carolin Straßmann and Sabrina C. Eimler},
editor = {André Melzer and Gary Lee Wagener},
url = {https://doi.org/10.26298/1981-5555},
year = {2023},
date = {2023-09-06},
booktitle = {Proceedings of the 13th Conference of the Media Psychology Division (DGPs)},
publisher = {Melusina Press},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
21. Straßmann, Carolin; Helgert, André; Breil, Valentin; Settelmayer, Lina; Diehl, Inga
Exploring the Use of Colored Ambient Lights to Convey Emotional Cues With Conversational Agents: An Experimental Study Proceedings Article
In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 99-105, 2023.
@inproceedings{10309310,
title = {Exploring the Use of Colored Ambient Lights to Convey Emotional Cues With Conversational Agents: An Experimental Study},
author = {Carolin Straßmann and André Helgert and Valentin Breil and Lina Settelmayer and Inga Diehl},
doi = {10.1109/RO-MAN57019.2023.10309310},
year = {2023},
date = {2023-01-01},
urldate = {2023-01-01},
booktitle = {2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)},
pages = {99-105},
abstract = {Conversational agents (CAs) lack of possibilities to enrich the interaction with emotional cues, although this makes the conversation more human-like and enhances user engagement. Thus, the potential of CAs is not fully exploit and possibilities to convey emotional cues are needed. In this work, CAs use colored ambient lights to display moral emotions during the interaction. To evaluate this approach, a between-subject lab experiment (N=64) was conducted. Participants played a cooperation game with Amazon’s Alexa. Depending on the experimental condition participants received different light expressions: no light, neutral light, or morally emotional light (yellow = joy, blue = sorrow, red = anger matching the game decisions). The effect of the light expressions on the perception of the CA, users’ empathy and cooperation behavior was tested. Against our assumptions, the results indicated no positive effect of the emotional light cues. Limitations, next steps, and implications are discussed.},
keywords = {Ethics;Emotion recognition;Virtual assistants;Games;Behavioral sciences;Robots},
pubstate = {published},
tppubtype = {inproceedings}
}
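The experimental manipulation in this study pairs moral emotions with light colors (yellow = joy, blue = sorrow, red = anger) alongside neutral-light and no-light conditions. The sketch below only illustrates that mapping; the concrete RGB values, condition labels, and function name are assumptions, since the paper does not specify them here.

# Illustrative mapping of the emotion-to-light conditions to RGB values.
# Only the color-emotion pairing (yellow = joy, blue = sorrow, red = anger)
# comes from the abstract; the RGB triples and labels are assumptions.
EMOTION_COLORS = {
    "joy": (255, 210, 0),      # yellow
    "sorrow": (0, 80, 255),    # blue
    "anger": (220, 0, 0),      # red
    "neutral": (255, 255, 255),
}

def light_for_event(condition: str, emotion: str):
    """Return the RGB value the ambient light should show, or None for no light."""
    if condition == "no_light":
        return None
    if condition == "neutral_light":
        return EMOTION_COLORS["neutral"]
    return EMOTION_COLORS.get(emotion, EMOTION_COLORS["neutral"])

print(light_for_event("emotional_light", "anger"))  # (220, 0, 0)
print(light_for_event("no_light", "joy"))           # None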
20. Straßmann, Carolin; Helgert, André; Lingnau, Andreas
Psychological Outcomes and Effectiveness of a Collaborative Video-Based Learning Tool for Synchronous Discussions Proceedings Article
In: Fulantelli, Giovanni; Burgos, Daniel; Casalino, Gabriella; Cimitile, Marta; Bosco, Giosuè Lo; Taibi, Davide (Eds.): Higher Education Learning Methodologies and Technologies Online, pp. 691–705, Springer Nature Switzerland, Cham, 2023, ISBN: 978-3-031-29800-4.
@inproceedings{10.1007/978-3-031-29800-4_52,
title = {Psychological Outcomes and Effectiveness of a Collaborative Video-Based Learning Tool for Synchronous Discussions},
author = {Carolin Straßmann and André Helgert and Andreas Lingnau},
editor = {Giovanni Fulantelli and Daniel Burgos and Gabriella Casalino and Marta Cimitile and Giosuè Lo Bosco and Davide Taibi},
isbn = {978-3-031-29800-4},
year = {2023},
date = {2023-01-01},
urldate = {2023-01-01},
booktitle = {Higher Education Learning Methodologies and Technologies Online},
pages = {691–705},
publisher = {Springer Nature Switzerland},
address = {Cham},
abstract = {In the context of digitization and the COVID-19 pandemic, online teaching comes more and more into focus of higher education. Various online learning methodologies such as using videos as a teaching method are promising, but can also create problems in the area of social relations or learning habits. The concept of collaborative learning, especially in an online environment, could counteract these problems. This paper presents a collaborative online learning tool that allows students to get together in learning groups to watch educational videos together. Various functions, such as a chat, help students to communicate with each other. A psychological evaluation was conducted to investigate the effects on students. The evaluation demonstrated positive effects of the tool, since it enhances important psychological processes (like flow and cognitive load) within the learning process. Moreover, its usability was rated as good and participants showed a high usage intention for the tool. Nevertheless, further investigations in long-term learning courses are needed to finally confirm the tool's effectiveness.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2022
19. Helgert, André; Canbulat, Anil; Lingnau, Andreas; Straßmann, Carolin
A Framework for Analyzing Interactions in a Video-based Collaborative Learning Environment Proceedings Article
In: 2022 International Conference on Advanced Learning Technologies (ICALT), pp. 125-127, 2022, ISSN: 2161-377X.
@inproceedings{9853790,
title = {A Framework for Analyzing Interactions in a Video-based Collaborative Learning Environment},
author = {André Helgert and Anil Canbulat and Andreas Lingnau and Carolin Straßmann},
doi = {10.1109/ICALT55010.2022.00045},
issn = {2161-377X},
year = {2022},
date = {2022-07-01},
urldate = {2022-07-01},
booktitle = {2022 International Conference on Advanced Learning Technologies (ICALT)},
pages = {125-127},
abstract = {Studying in social isolation is a reality for many students that was further reinforced after the start of the COVID-19 pandemic. Research shows that isolation can lead to decreased learning efficiency and is intensified by the increased asynchronous online teaching during the pandemic. This change is not only challenging for students, but also for teachers, as students do not have a direct communication and feedback channel when learning content is presented in form of pre-recorded videos in a learning management system. In this paper, we present VGather2Learn Analytics, which is an extension to the already existing collaborative learning system VGather2Learn, which makes it possible for teachers to analyse the learning behavior of students in asynchronous video-teaching. The information presented in a dashboard will allow teachers to better understand how students interact while watching learning videos collaboratively and can improve online-teaching.},
keywords = {COVID-19;Learning management systems;Pandemics;Distance learning;Collaboration;Collaborative work;Behavioral sciences;Learning Analytics;Computer-supported Collaborative Learning;Social Learning},
pubstate = {published},
tppubtype = {inproceedings}
}
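The analytics extension described above aggregates students' interactions while they watch learning videos together and presents them to teachers in a dashboard. The sketch below shows one plausible aggregation step under that reading; the event structure, field names, and counting logic are assumptions and not VGather2Learn's actual schema.

# Illustrative aggregation of interaction events for a teacher-facing dashboard.
# Event structure and field names are hypothetical, not the tool's schema.
from collections import Counter, defaultdict

events = [
    {"student": "s1", "video": "lecture-03", "action": "pause"},
    {"student": "s1", "video": "lecture-03", "action": "chat"},
    {"student": "s2", "video": "lecture-03", "action": "seek"},
    {"student": "s2", "video": "lecture-04", "action": "chat"},
]

# Count, per video, how often each interaction type occurred.
actions_per_video = defaultdict(Counter)
for e in events:
    actions_per_video[e["video"]][e["action"]] += 1

for video, counts in actions_per_video.items():
    print(video, dict(counts))
# lecture-03 {'pause': 1, 'chat': 1, 'seek': 1}
# lecture-04 {'chat': 1}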
18. Straßmann, Carolin; Eimler, Sabrina C.; Kololli, Linda; Arntz, Alexander; Sand, Katharina; Rietz, Annika
Effects of the Surroundings in Human-Robot Interaction: Stereotypical Perception of Robots and Its Anthropomorphism Proceedings Article
In: Salvendy, Gavriel; Wei, June (Eds.): Design, Operation and Evaluation of Mobile Communications, pp. 363–377, Springer International Publishing, Cham, 2022, ISBN: 978-3-031-05014-5.
@inproceedings{10.1007/978-3-031-05014-5_30,
title = {Effects of the Surroundings in Human-Robot Interaction: Stereotypical Perception of Robots and Its Anthropomorphism},
author = {Carolin Straßmann and Sabrina C. Eimler and Linda Kololli and Alexander Arntz and Katharina Sand and Annika Rietz},
editor = {Gavriel Salvendy and June Wei},
isbn = {978-3-031-05014-5},
year = {2022},
date = {2022-01-01},
booktitle = {Design, Operation and Evaluation of Mobile Communications},
pages = {363--377},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {Stereotypes and scripts guide human perception and expectations in everyday life. Research has found that a robot's appearance influences the perceived fit in different application domains (e.g. industrial or social) and that the role a robot is presented in predicts its perceived personality. However, it is unclear how the surroundings as such can elicit a halo effect leading to stereotypical perceptions. This paper presents the results of an experimental study in which 206 participants saw 8 cartoon pictures of the robot Pepper in different application domains in a within-subjects online study. Results indicate that the environment a robot is placed in has an effect on the users' evaluation of the robot's warmth, competence, status in society, competition, anthropomorphism, and morality. As the first impression has an effect on users' expectations and evaluation of the robot and the interaction with it, the effect of the application scenarios has to be considered carefully.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}