Office: 04.002
Phone: +49 208 88 254-796
After completing his Bachelor's degree in the Human-Technology Interaction (Mensch-Technik-Interaktion) program, Alexander Arntz remained at Hochschule Ruhr West to complete the Master's program in Computer Science.
Since 2018, he has been a research associate at the Institute of Computer Science at Hochschule Ruhr West, working under the direction of Prof. Dr. Sabrina Eimler (Human Factors & Gender Studies). His work focuses on the conception, development, and study of augmented reality (AR) and virtual reality (VR) systems in various contexts (learning, education and training, and work support), as well as on the conception and development of application scenarios for human-robot collaboration.
In June 2022, Alexander Arntz completed his doctorate at the University of Duisburg-Essen as a doctoral candidate in the INKO group, supervised by Prof. Dr. Ulrich Hoppe and Prof. Dr. Sabrina Eimler.
Alexander Arntz is a member of the Institut Positive Computing and of the Graduierteninstitut NRW.
WORK AND RESEARCH FOCUS AREAS
- Augmented and virtual reality
- AR-based work support in heavy-industry contexts (EFRE-funded project DamokleS 4.0)
- VR/AR applications for the design of innovative teaching (project VR-Coop-Lab)
- AR, VR, and IoT in building operation 4.0
- Acceptance and technology assessment research, e.g., in the areas of future work environments and human-robot collaboration
COURSES
- Human Factors and Ergonomics (Master)
- ???
- ???
PROJECTS
- VIPER: Virtuelle Plattform zur Erfahrung von Roboter (virtual platform for experiencing robots)
- ???
- ???
SELECTED PUBLICATIONS
2024
44. Arntz, Alexander; Helgert, André; Straßmann, Carolin; Eimler, Sabrina C.
Enhancing Human-Robot Interaction Research by Using a Virtual Reality Lab Approach (Proceedings Article)
In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), pp. 340-344, 2024.
@inproceedings{10445600,
title = {Enhancing Human-Robot Interaction Research by Using a Virtual Reality Lab Approach},
author = {Alexander Arntz and André Helgert and Carolin Straßmann and Sabrina C. Eimler},
doi = {10.1109/AIxVR59861.2024.00058},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR)},
pages = {340-344},
abstract = {Human-robot interaction (HRI) research often faces limitations in real-world environments due to uncontrollable external factors. This applies in particular to field study setups in public spaces, as these can limit the validity of the study results, e.g. due to unpredictable and unsystematic changes in the environment, noise, people passing, etc. Especially for interdisciplinary studies involving psychological perspectives, virtual reality (VR) has emerged as a promising solution, offering realistic, controlled, and reproducible environments. Also, recent technological advancements enable detailed observation of human behavior and physiological responses via eye tracking, physiological assessments, and motion capture. To effectively add value by using VR as a tool, immersion, and presence in the virtual environment are essential preconditions. Besides, the manipulability of the VR environment during runtime is a bonus in exploring human behavior in interaction with robot-enriched spaces. As a methodological innovation in HRI studies, this paper presents a VR lab as a research tool that provides a virtual model of the robot Pepper along with interfaces for easy navigation and adaptive robot behavior. Moreover, the presented Wizard of Oz dashboard allows to flexibly react to the scenery by allowing the manipulation of several robot parameters during runtime. With the help of the VR lab, a framework for a variety of interdisciplinary research purposes in human-robot interaction (not only) in public spaces is provided.},
keywords = {Technological innovation;Solid modeling;Runtime;Human-robot interaction;Virtual environments;Physiology;Robots;Virtual Reality;Human-Robot Interaction;Empirical Studies;Research Platform;Study Tool;Wizard of Oz},
pubstate = {published},
tppubtype = {inproceedings}
}
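To make the runtime manipulability described in the abstract above concrete, here is a minimal sketch of a Wizard-of-Oz parameter push: an experimenter console sends robot parameter updates to the running VR scene over the network. All names (RobotParams, the field set, host, and port) are hypothetical illustrations, not details taken from the paper:

# Minimal Wizard-of-Oz sketch: the experimenter pushes robot parameter
# updates to a running VR scene over UDP (all names are assumptions).
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class RobotParams:
    speech_rate: float = 1.0        # relative speed of the robot's speech
    gesture_amplitude: float = 0.5  # 0 = none, 1 = full-body gestures
    locomotion_speed: float = 0.3   # m/s in the virtual environment

def push_update(params: RobotParams, host: str = "127.0.0.1", port: int = 9999) -> None:
    """Serialize the current parameter set and send it to the VR application."""
    payload = json.dumps(asdict(params)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

if __name__ == "__main__":
    params = RobotParams()
    params.gesture_amplitude = 0.8  # wizard reacts to the scene at runtime
    push_update(params)

On the receiving side, the VR application would deserialize each datagram and apply the new values on the next frame, which is one plausible way to realize the dashboard-driven adaptation the paper emphasizes.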
43. Arntz, Alexander
Enabling Safe Empirical Studies for Human-Robot Collaboration: Implementation of a Sensor Array Driven Control Interface (Proceedings Article)
In: Kurosu, Masaaki; Hashizume, Ayako (Eds.): Human-Computer Interaction, pp. 42–57, Springer Nature Switzerland, Cham, 2024, ISBN: 978-3-031-60412-6.
@inproceedings{10.1007/978-3-031-60412-6_4,
title = {Enabling Safe Empirical Studies for Human-Robot Collaboration: Implementation of a Sensor Array Driven Control Interface},
author = {Alexander Arntz},
editor = {Masaaki Kurosu and Ayako Hashizume},
isbn = {978-3-031-60412-6},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {Human-Computer Interaction},
pages = {42–57},
publisher = {Springer Nature Switzerland},
address = {Cham},
abstract = {In response to the growing relevance of collaborative robots, the need for empirical user studies in the domain of Human-Robot Collaboration become increasingly important. While collaborative robots incorporate internal safety features, their usage for user studies remains associated with inherent safety risks. This project addresses these challenges by introducing a toolbox for a robot arm to conduct Wizard-of-Oz studies by using advanced controls complemented by a sophisticated security system leveraging microcontrollers and human presence detection sensors. This approach unifies both control systems within a single application, seamlessly monitoring and synchronizing their respective inputs. The gamepad control scheme offers Wizard-of-Oz study supervisors an intuitive means of interacting with the robot, enabling precise and responsive control while maintaining safety. Meanwhile, the security system, built on microcontroller technology and human presence detection sensors, acts as a vigilant guardian, continuously assessing the robot's surroundings for potential risks. This integrated application not only empowers users with effective control over the xArm 7 but also provides real-time feedback on the security status, enhancing the overall safety and usability of collaborative robots in various industrial settings. By bridging the gap between human operators and robots, this project contributes to the evolution of safer and more user-friendly human-robot collaboration.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
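The core pattern in the abstract above, gamepad commands that only reach the robot arm while the presence sensors report a clear workspace, reduces to a small gating function. The sketch below is a simplification under stated assumptions; the distance threshold, command format, and sensor layout are invented, not taken from the paper:

# Sketch of a sensor-gated control loop: operator commands pass through
# only while all presence sensors report a safe distance (values assumed).
from typing import Sequence

SAFE_DISTANCE_M = 0.5  # assumed minimum human-robot distance

def gate_command(command: dict, distances_m: Sequence[float]) -> dict:
    """Return the operator's command if the workspace is clear, else a stop."""
    if any(d < SAFE_DISTANCE_M for d in distances_m):
        return {"type": "stop"}  # safety system overrides the gamepad
    return command

if __name__ == "__main__":
    joystick_cmd = {"type": "move", "dx": 0.02, "dy": 0.0, "dz": -0.01}
    print(gate_command(joystick_cmd, [1.2, 0.9, 0.8]))  # clear -> move passes
    print(gate_command(joystick_cmd, [1.2, 0.4, 0.8]))  # human close -> stop

Unifying both inputs in one function mirrors the paper's point about monitoring and synchronizing the control and safety systems within a single application.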
42. Arntz, Alexander; Dia, Agostino Di; Riebner, Tim; Straßmann, Carolin; Eimler, Sabrina C.
Teamwork Makes the Dream Work: A Virtual Reality-based Human-Robot Collaboration Sandbox Simulating Multiple Teams (Proceedings Article)
In: 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), pp. 335-339, 2024, ISSN: 2771-7453.
@inproceedings{10445597,
title = {Teamwork Makes the Dream Work: A Virtual Reality-based Human-Robot Collaboration Sandbox Simulating Multiple Teams},
author = {Alexander Arntz and Agostino Di Dia and Tim Riebner and Carolin Straßmann and Sabrina C. Eimler},
doi = {10.1109/AIxVR59861.2024.00057},
issn = {2771-7453},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR)},
pages = {335-339},
abstract = {We present a virtual reality-based Human-Robot Collaboration sandbox that allows the representation of multiple teams composed of humans and robots. Within the sandbox, virtual robots and humans can collaborate with their respective partners and interact with other teams to coordinate the required procedures while accomplishing a shared task. For this purpose, the virtual reality sandbox is equipped with a variety of interaction mechanics that enable a range of different shared tasks. The network integration allows for multiple users within the virtual environment. The VR application contains a library of different industrial robots that can act autonomously controlled by machine learning agents and interact with the user through verbal commands. The sandbox is specifically designed to serve as a research tool to explore new concepts and validate existing approaches in the domain of Human-Robot Collaboration involving autonomous robots in a series of upcoming studies.},
keywords = {Robot kinematics;Virtual assistants;Virtual environments;Industrial robots;Libraries;Teamwork;Task analysis;Human-Robot Collaboration;Virtual Reality;Machine Learning;Artificial Intelligence},
pubstate = {published},
tppubtype = {inproceedings}
}
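One way to picture the multi-team coordination described in the abstract above is several human-robot teams drawing from a shared task pool. The toy model below is an assumption-laden simplification, not the sandbox's actual architecture; class and field names are illustrative only:

# Toy model: multiple human-robot teams coordinate on a shared task pool
# (names and task list invented for illustration).
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Team:
    name: str
    members: list                         # e.g. ["human:p1", "robot:xarm7"]
    done: list = field(default_factory=list)

shared_tasks = deque(["pick_part", "assemble", "inspect", "package"])
teams = [Team("A", ["human:p1", "robot:xarm7"]),
         Team("B", ["human:p2", "robot:ur5"])]

# Round-robin claiming: each team takes the next open task, so teams must
# implicitly synchronize on the shared pool while pursuing the shared goal.
while shared_tasks:
    for team in teams:
        if shared_tasks:
            team.done.append(shared_tasks.popleft())

for team in teams:
    print(team.name, team.done)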
2023
41. Dia, Agostino Di; Riebner, Tim; Arntz, Alexander; Jansen, Marc
Prototyping a Smart Contract Application for Fair Reward Distribution in Software Development Projects (Proceedings Article)
In: Prieto, Javier; Martínez, Francisco Luis Benítez; Ferretti, Stefano; Guardeño, David Arroyo; Nevado-Batalla, Pedro Tomás (Eds.): Blockchain and Applications, 4th International Congress, pp. 131–141, Springer International Publishing, Cham, 2023, ISBN: 978-3-031-21229-1.
@inproceedings{10.1007/978-3-031-21229-1_13,
title = {Prototyping a Smart Contract Application for Fair Reward Distribution in Software Development Projects},
author = {Agostino Di Dia and Tim Riebner and Alexander Arntz and Marc Jansen},
editor = {Javier Prieto and Francisco Luis Benítez Martínez and Stefano Ferretti and David Arroyo Guardeño and Pedro Tomás Nevado-Batalla},
isbn = {978-3-031-21229-1},
year = {2023},
date = {2023-01-01},
booktitle = {Blockchain and Applications, 4th International Congress},
pages = {131--141},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {The following work describes the development of a project management platform based on the waves blockchain technology. The aim is to use the platform to optimally reward both fairness and the achievement of quality standards in software development. Due to the outsourcing of software projects and the associated processes, situations arise in which different project participants work on the same tasks, while paid differently and in most cases independent of the delivered software quality. This is based on the respective contract conditions an individual software developer has negotiated, without factoring in the actual quality of the code the developer provided. The proposal of this work is a prototypical software application that takes the requirements of a project and measures the corresponding contributions of the developers based on their software quality. Project sponsors can also pay out the enrolled project partners via the platform.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
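The fairness mechanism outlined in the abstract above, rewarding contributions weighted by measured software quality, can be reduced to a small allocation function. The weighting scheme below is one plausible reading, not the paper's actual formula, and a production version would live on the Waves blockchain rather than in Python:

# Quality-weighted payout sketch: each developer's share of the budget
# scales with contribution size times a quality score (scheme assumed).
def distribute_rewards(budget: float, contributions: dict) -> dict:
    """contributions maps developer -> (effort, quality score in [0, 1])."""
    weights = {dev: effort * quality
               for dev, (effort, quality) in contributions.items()}
    total = sum(weights.values())
    if total == 0:
        return {dev: 0.0 for dev in contributions}
    return {dev: round(budget * w / total, 2) for dev, w in weights.items()}

if __name__ == "__main__":
    payouts = distribute_rewards(10_000.0, {
        "dev_a": (120, 0.9),  # fewer contributions, high measured quality
        "dev_b": (200, 0.5),  # more contributions, weaker quality metrics
    })
    print(payouts)  # dev_a earns more per unit of effort

The key design point matching the paper's motivation: payout depends on delivered quality, not only on negotiated contract conditions or raw volume of work.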
2022
40. Straßmann, Carolin; Eimler, Sabrina C.; Kololli, Linda; Arntz, Alexander; Sand, Katharina; Rietz, Annika
Effects of the Surroundings in Human-Robot Interaction: Stereotypical Perception of Robots and Its Anthropomorphism (Proceedings Article)
In: Salvendy, Gavriel; Wei, June (Eds.): Design, Operation and Evaluation of Mobile Communications, pp. 363–377, Springer International Publishing, Cham, 2022, ISBN: 978-3-031-05014-5.
@inproceedings{10.1007/978-3-031-05014-5_30,
title = {Effects of the Surroundings in Human-Robot Interaction: Stereotypical Perception of Robots and Its Anthropomorphism},
author = {Carolin Straßmann and Sabrina C. Eimler and Linda Kololli and Alexander Arntz and Katharina Sand and Annika Rietz},
editor = {Gavriel Salvendy and June Wei},
isbn = {978-3-031-05014-5},
year = {2022},
date = {2022-01-01},
booktitle = {Design, Operation and Evaluation of Mobile Communications},
pages = {363--377},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {Stereotypes and scripts guide human perception and expectations in everyday life. Research has found that a robot's appearance influences the perceived fit in different application domains (e.g. industrial or social) and that the role a robot is presented in predicts its perceived personality. However, it is unclear how the surroundings as such can elicit a halo effect leading to stereotypical perceptions. This paper presents the results of an experimental study in which 206 participants saw 8 cartoon pictures of the robot Pepper in different application domains in a within-subjects online study. Results indicate that the environment a robot is placed in has an effect on the users' evaluation of the robot's warmth, competence, status in society, competition, anthropomorphism, and morality. As the first impression has an effect on users' expectations and evaluation of the robot and the interaction with it, the effect of the application scenarios has to be considered carefully.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
39. Arntz, Alexander; Adler, Felix; Kitzmann, Dennis; Eimler, Sabrina C.
Augmented Reality Supported Real-Time Data Processing Using Internet of Things Sensor Technology (Proceedings Article)
In: Streitz, Norbert A.; Konomi, Shin'ichi (Eds.): Distributed, Ambient and Pervasive Interactions. Smart Living, Learning, Well-being and Health, Art and Creativity, pp. 3–17, Springer International Publishing, Cham, 2022, ISBN: 978-3-031-05431-0.
@inproceedings{10.1007/978-3-031-05431-0_1,
title = {Augmented Reality Supported Real-Time Data Processing Using Internet of Things Sensor Technology},
author = {Alexander Arntz and Felix Adler and Dennis Kitzmann and Sabrina C. Eimler},
editor = {Norbert A. Streitz and Shin'ichi Konomi},
isbn = {978-3-031-05431-0},
year = {2022},
date = {2022-01-01},
booktitle = {Distributed, Ambient and Pervasive Interactions. Smart Living, Learning, Well-being and Health, Art and Creativity},
pages = {3--17},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {Internet of things (IoT) devices increasingly permeate everyday life and provide vital and convenient information. Augmented reality (AR) enables the embedding of this information in the environment using visualizations that can contextualize data for various applications such as Smart Home. Current applications providing a visual representation of the information are often limited to graphs or bar charts, neglecting the variety of possible coherence between the subject and the visualization. We present a setup for real-time AR-based visualizations of data collected by IoT devices. Three distinct battery-powered IoT microcontroller systems were designed and programmed. Each is outfitted with numerous sensors, i.e. for humidity or temperature, to interact with the developed AR application through a network connection. The AR application was developed using Unity3D and the Vuforia AR SDK for Android-based mobile devices with the goal of providing processed and visualized information that is comprehensible for the respective context. Inspired by weather applications for mobile devices, the visualization contains animated dioramas, with changing attributes based on the input data from the IoT microcontroller. This work contains the configuration of the IoT microcontroller hardware, the network interface used, the development process of the AR application, and its usage, complemented by possible future extensions described in an outlook.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
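The exchange described in the abstract above boils down to microcontroller nodes periodically sending sensor payloads over the network to the Unity3D/Vuforia client. The script below mimics one such publisher in Python for illustration; the real firmware runs on microcontrollers, and the message schema, node name, and address are assumptions:

# Simulated IoT node: broadcasts sensor readings as JSON datagrams, the
# kind of payload an AR client could map onto an animated diorama.
import json
import random
import socket
import time

AR_HOST, AR_PORT = "127.0.0.1", 5005  # placeholder for the AR device address

def read_sensors() -> dict:
    """Stand-in for real humidity/temperature sensor drivers."""
    return {"node": "iot-1",
            "temperature_c": round(random.uniform(18, 26), 1),
            "humidity_pct": round(random.uniform(35, 60), 1)}

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    for _ in range(3):  # a few demo packets; firmware would loop forever
        sock.sendto(json.dumps(read_sensors()).encode(), (AR_HOST, AR_PORT))
        time.sleep(1.0)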
38. Dia, Agostino Di; Riebner, Tim; Arntz, Alexander; Eimler, Sabrina C.
Augmented-Reality-Based Real-Time Patient Information for Nursing (Proceedings Article)
In: Yamamoto, Sakae; Mori, Hirohiko (Eds.): Human Interface and the Management of Information: Applications in Complex Technological Environments, pp. 195–208, Springer International Publishing, Cham, 2022, ISBN: 978-3-031-06509-5.
@inproceedings{10.1007/978-3-031-06509-5_14,
title = {Augmented-Reality-Based Real-Time Patient Information for Nursing},
author = {Agostino Di Dia and Tim Riebner and Alexander Arntz and Sabrina C. Eimler},
editor = {Sakae Yamamoto and Hirohiko Mori},
isbn = {978-3-031-06509-5},
year = {2022},
date = {2022-01-01},
booktitle = {Human Interface and the Management of Information: Applications in Complex Technological Environments},
pages = {195--208},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {While the usage of digital systems in the medical sector has increased, nursing activities are still mostly performed without any form of digital assistance. Considering the complex and demanding procedures the medical personnel is confronted with, a high task load is expected which is prone to human errors. Solutions, however, need to match staff requirements and ideally involve them in the development process to ensure acceptance and usage. Based on desired application scenarios, we introduce a concept of an augmented reality (AR)-based patient data application that provides context-relevant information for nursing staff and doctors. Developed for the Hololens 2, the application allows the retrieval and synchronization of the patient data from the host network of the respective hospital information system. For this purpose, a system infrastructure consisting of several software components was developed to simulate the exchange between the AR device and the independent hospital environment. The paper outlines the conceptual approach based on requirements collected from nurses, related work, the technical implementation and discusses limitations and future developments.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
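To make the simulated exchange in the abstract above concrete, here is a mock of a stand-in hospital information system with an AR-side fetch that returns only the context-relevant subset of a record. All patient fields, IDs, and function names are invented for illustration and do not reflect the paper's actual data model:

# Mock hospital information system (HIS) plus an AR-side fetch that keeps
# only the fields a nurse's overlay would display (all fields invented).
MOCK_HIS = {
    "P-1001": {"name": "Jane Doe", "room": "3B",
               "allergies": ["penicillin"],
               "vitals": {"pulse": 72, "temp_c": 36.8},
               "billing": {"insurer": "ACME"}},  # irrelevant at the bedside
}

def fetch_for_nursing(patient_id: str) -> dict:
    """Return the context-relevant subset of a patient record."""
    record = MOCK_HIS[patient_id]
    return {"name": record["name"], "room": record["room"],
            "allergies": record["allergies"], "vitals": record["vitals"]}

print(fetch_for_nursing("P-1001"))

Filtering on the server side of the exchange reflects the paper's aim of showing context-relevant information rather than mirroring the full hospital record onto the Hololens 2.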
37. Arntz, Alexander; Straßmann, Carolin; Völker, Stefanie; Eimler, Sabrina C.
Collaborating eye to eye: Effects of workplace design on the perception of dominance of collaboration robots (Journal Article)
In: Frontiers in Robotics and AI, vol. 9, 2022, ISSN: 2296-9144.
@article{10.3389/frobt.2022.999308,
title = {Collaborating eye to eye: Effects of workplace design on the perception of dominance of collaboration robots},
author = {Alexander Arntz and Carolin Straßmann and Stefanie Völker and Sabrina C. Eimler},
url = {https://www.frontiersin.org/articles/10.3389/frobt.2022.999308},
doi = {10.3389/frobt.2022.999308},
issn = {2296-9144},
year = {2022},
date = {2022-01-01},
journal = {Frontiers in Robotics and AI},
volume = {9},
abstract = {The concept of Human-Robot Collaboration (HRC) describes innovative industrial work procedures, in which human staff works in close vicinity with robots on a shared task. Current HRC scenarios often deploy hand-guided robots or remote controls operated by the human collaboration partner. As HRC envisions active collaboration between both parties, ongoing research efforts aim to enhance the capabilities of industrial robots not only in the technical dimension but also in the robot’s socio-interactive features. Apart from enabling the robot to autonomously complete the respective shared task in conjunction with a human partner, one essential aspect lifted from the group collaboration among humans is the communication between both entities. State-of-the-art research has identified communication as a significant contributor to successful collaboration between humans and industrial robots. Non-verbal gestures have been shown to be contributing aspect in conveying the respective state of the robot during the collaboration procedure. Research indicates that, depending on the viewing perspective, the usage of non-verbal gestures in humans can impact the interpersonal attribution of certain characteristics. Applied to collaborative robots such as the Yumi IRB 14000, which is equipped with two arms, specifically to mimic human actions, the perception of the robots’ non-verbal behavior can affect the collaboration. Most important in this context are dominance emitting gestures by the robot that can reinforce negative attitudes towards robots, thus hampering the users’ willingness and effectiveness to collaborate with the robot. By using a 3 × 3 within-subjects design online study, we investigated the effect of dominance gestures (Akimbo, crossing arms, and large arm spread) working in a standing position with an average male height, working in a standing position with an average female height, and working in a seated position on the perception of dominance of the robot. Overall 115 participants (58 female and 57 male) with an average age of 23 years evaluated nine videos of the robot. Results indicated that all presented gestures affect a person’s perception of the robot in regards to its perceived characteristics and willingness to cooperate with the robot. The data also showed participants’ increased attribution of dominance based on the presented viewing perspective.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
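The 3 × 3 within-subjects design in the abstract above means every participant rated all nine gesture-by-perspective videos. A minimal sketch of aggregating such long-format ratings into per-condition means follows; the data and column names are invented for illustration:

# Per-cell means for a 3x3 within-subjects design: each participant rates
# all gesture x perspective combinations (sample data illustrative only).
from collections import defaultdict

ratings = [  # (participant, gesture, perspective, dominance rating)
    (1, "akimbo", "standing_male", 4.0), (1, "akimbo", "seated", 3.0),
    (2, "akimbo", "standing_male", 5.0), (2, "akimbo", "seated", 2.5),
]

cells = defaultdict(list)
for _, gesture, perspective, score in ratings:
    cells[(gesture, perspective)].append(score)

for cell, scores in sorted(cells.items()):
    print(cell, sum(scores) / len(scores))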
36. Helgert, André; Zielinska, Laura; Groeneveld, Anna; Kloos, Chiara; Arntz, Alexander; Straßmann, Carolin; Eimler, Sabrina C.
DiSensity: Ein hochschulweites Virtual Reality Sensibilisierungsprogramm [a university-wide virtual reality awareness program] (Proceedings Article)
In: Söbke, Heinrich; Zender, Raphael (Eds.): Wettbewerbsband AVRiL 2022, pp. 17-22, Gesellschaft für Informatik e.V., Bonn, 2022.
@inproceedings{mci/Helgert2022,
title = {DiSensity: Ein hochschulweites Virtual Reality Sensibilisierungsprogramm},
author = {André Helgert and Laura Zielinska and Anna Groeneveld and Chiara Kloos and Alexander Arntz and Carolin Straßmann and Sabrina C. Eimler},
editor = {Heinrich Söbke and Raphael Zender},
doi = {10.18420/avril2022_03},
year = {2022},
date = {2022-01-01},
urldate = {2022-01-01},
booktitle = {Wettbewerbsband AVRiL 2022},
pages = {17-22},
publisher = {Gesellschaft für Informatik e.V.},
address = {Bonn},
abstract = {Diversity-related challenges confront most students, teachers, and staff time and again in everyday university life. Sometimes we are not aware of this because we are not sufficiently sensitized to the topic and situations appear unproblematic from our own perspective, or we avoid them when they make us uncomfortable. These experiences can create an imbalance in communication or in the perception of the other person, producing problems, conflicts, or a feeling of social isolation. This paper presents an immersive virtual reality gallery developed by stakeholders from the academic and service departments together with students. The goal is to raise awareness of diversity and its significance in teaching and learning across the university and among all stakeholder groups. Through the use of multimedia content and various interaction mechanics in the virtual world, DiSensity can serve as an efficient, cost-effective, and flexible alternative to previous diversity trainings.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2021
35. Arntz, Alexander; Eimler, Sabrina C.; Keßler, Dustin; Thomas, Jan; Helgert, André; Rehm, Markus; Graf, Eduard; Wientzek, Sebastian; Budur, Burak
Walking on the Bright Sight: Evaluating a Photovoltaics Virtual Reality Education Application (Proceedings Article)
In: 2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), pp. 295-301, 2021.
@inproceedings{9644336,
title = {Walking on the Bright Sight: Evaluating a Photovoltaics Virtual Reality Education Application},
author = {Alexander Arntz and Sabrina C. Eimler and Dustin Keßler and Jan Thomas and André Helgert and Markus Rehm and Eduard Graf and Sebastian Wientzek and Burak Budur},
doi = {10.1109/AIVR52153.2021.00063},
year = {2021},
date = {2021-01-01},
booktitle = {2021 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)},
pages = {295-301},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}