2023
@article{ayazhussainAdvancementsUpperBody2023,
title = {Advancements in Upper Body Exoskeleton: Implementing Active Gravity Compensation with a Feedforward Controller},
author = {Muhammad Ayaz Hussain and Ioannis Iossifidis},
url = {https://doi.org/10.48550/arXiv.2309.04698},
doi = {10.48550/arXiv.2309.04698},
year = {2023},
date = {2023-09-09},
urldate = {2023-09-09},
journal = {arXiv:2309.04698 [cs.RO]},
abstract = {In this study, we present a feedforward control system designed for active gravity compensation on an upper body exoskeleton. The system utilizes only positional data from internal motor sensors to calculate torque, employing analytical control equations based on Newton-Euler Inverse Dynamics. Compared to feedback control systems, the feedforward approach offers several advantages. It eliminates the need for external torque sensors, resulting in reduced hardware complexity and weight. Moreover, the feedforward control exhibits a more proactive response, leading to enhanced performance. The exoskeleton used in the experiments is lightweight and comprises four degrees of freedom, closely mimicking human upper body kinematics and three-dimensional range of motion. We conducted tests on both hardware and simulations of the exoskeleton, demonstrating stable performance. The system maintained its position over an extended period, exhibiting minimal friction and avoiding undesired slewing.},
keywords = {Autonomous robotics, BCI, Computer Science - Artificial Intelligence, Computer Science - Information Theory, Computer Science - Machine Learning, Exoskeleton},
pubstate = {published},
tppubtype = {article}
}
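The feedforward idea in the abstract above, computing gravity torques from joint positions alone, can be sketched for a planar two-link arm. This is an illustrative reduction, not the paper's controller: the actual exoskeleton has four degrees of freedom, and all masses and lengths below are made-up values.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def gravity_torque(q, m=(2.0, 1.5), l1=0.3, lc=(0.15, 0.12)):
    """Feedforward gravity-compensation torques for a planar 2-link arm,
    computed from joint positions alone (no external torque sensing).
    q: joint angles in rad, measured from the horizontal.
    m, l1, lc: illustrative link masses, upper-link length, and
    centre-of-mass offsets (not values from the paper)."""
    q1, q2 = q
    m1, m2 = m
    lc1, lc2 = lc
    tau2 = m2 * G * lc2 * np.cos(q1 + q2)                # gravity load on joint 2
    tau1 = (m1 * lc1 + m2 * l1) * G * np.cos(q1) + tau2  # joint 1 carries both links
    return np.array([tau1, tau2])
```

With the arm hanging straight down the cosines vanish and both torques are zero; the horizontal pose is the worst case, with the shoulder joint carrying the largest load.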
2022
@inproceedings{doliwaBiologicallyInspiredModel2022,
title = {Biologically Inspired Model for Timed Motion in Robotic Systems},
author = {Sebastian Doliwa and Muhammad Ayaz Hussain and Tim Sziburis and Ioannis Iossifidis},
year = {2022},
date = {2022-08-12},
urldate = {2022-08-12},
booktitle = {9th IEEE RAS/EMBS International Conference on Biomedical Robotics & Biomechatronics},
publisher = {IEEE},
address = {Seoul, South Korea},
keywords = {Autonomous robotics, Dynamical systems},
pubstate = {published},
tppubtype = {inproceedings}
}
2021
@article{doliwaBiologicallyInspiredModel2021,
title = {Biologically Inspired Model for Timed Motion in Robotic Systems},
author = {Sebastian Doliwa and Muhammad Ayaz Hussain and Tim Sziburis and Ioannis Iossifidis},
year = {2021},
date = {2021-07-01},
urldate = {2021-07-01},
journal = {arXiv:2106.15864 [cs, math]},
abstract = {The goal of this work is the development of a motion model for sequentially timed movement actions in robotic systems under specific consideration of temporal stabilization, that is maintaining an approximately constant overall movement time (isochronous behavior). This is demonstrated both in simulation and on a physical robotic system for the task of intercepting a moving target in three-dimensional space. Motivated from humanoid motion, timing plays a vital role to generate a naturalistic behavior in interaction with the dynamic environment as well as adaptively planning and executing action sequences on-line. In biological systems, many of the physiological and anatomical functions follow a particular level of periodicity and stabilization, which exhibit a certain extent of resilience against external disturbances. A main aspect thereof is stabilizing movement timing against limited perturbations. Especially human arm movement, namely when it is tasked to reach a certain goal point, pose or configuration, shows a stabilizing behavior. This work incorporates the utilization of an extended Kalman filter (EKF) which was implemented to predict the target position while coping with non-linear system dynamics. The periodicity and temporal stabilization in biological systems was artificially generated by a Hopf oscillator, yielding a sinusoidal velocity profile for smooth and repeatable motion.},
keywords = {attractor dynamics approach, Autonomous robotics, Dynamical systems},
pubstate = {published},
tppubtype = {article}
}
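The Hopf oscillator mentioned in the abstract above produces, on its limit cycle, the sinusoidal velocity profile used for smooth, repeatable motion. A minimal numerical sketch (Euler integration; parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def hopf_step(x, y, mu, omega, dt):
    """One Euler step of a Hopf oscillator; the limit cycle has
    radius sqrt(mu) and angular frequency omega."""
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

def velocity_profile(T=1.0, mu=1.0, dt=1e-3, n_periods=10):
    """Integrate the oscillator; x(t) relaxes onto a sinusoid of period T,
    usable as a smooth, repeatable velocity profile."""
    omega = 2.0 * np.pi / T
    x, y = 0.1, 0.0  # start slightly off the unstable fixed point at the origin
    xs = []
    for _ in range(int(n_periods * T / dt)):
        x, y = hopf_step(x, y, mu, omega, dt)
        xs.append(x)
    return np.array(xs)
```

The stable limit cycle is what provides the temporal stabilization: after a perturbation the state relaxes back onto the same periodic orbit, so the overall movement time stays approximately constant.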
2018
@inproceedings{bccn18,
title = {Toward a Model of Timed Arm Movement Based on Temporal Tuning of Neurons in Primary Motor (MI) and Posterior Parietal Cortex (PPC)},
author = {Muhammad Ayaz Hussain and Christian Klaes and Ioannis Iossifidis},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
booktitle = {BC18 : Computational Neuroscience & Neurotechnology Bernstein Conference 2018},
publisher = {BCCN},
abstract = {To study driver behavior we set up a lab with fixed base driving simulators. In order to compensate for the lack of physical feedback in this scenario, we aimed for another means of increasing the realism of our system. In the following, we propose an efficient method of head tracking and its integration in our driving simulation. Furthermore, we illuminate why this is a promising boost of the subject's immersion in the virtual world. Our idea for increasing the feeling of immersion is to give the subject feedback on head movements relative to the screen. A real driver sometimes moves his head in order to see something better or to look behind an occluding object. In addition to these intentional movements, a study conducted by Zirkovitz and Harris has revealed that drivers involuntarily tilt their heads when they go around corners in order to maximize the use of visual information available in the scene. Our system reflects the visual changes of any head movement and hence gives feedback on both involuntary and intentional motion. If, for example, subjects move to the left, they will see more of the right-hand side of the scene. If, on the other hand, they move upwards, a larger fraction of the engine hood will be visible. The same holds for the rear-view mirror.},
keywords = {Autonomous robotics, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {inproceedings}
}
2017
@misc{Iossifidis2017a,
title = {Temporal stabilized arm movement for efficient neuroprosthetic control by individuals with tetraplegia},
author = {Ioannis Iossifidis and Muhammad Ayaz Hussain and Christian Klaes},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
publisher = {SfN 2017},
abstract = {The generation of discrete movement with distinct and stable time courses characterizes each human movement and reflects the need to perform catching and interception tasks and timed action sequences, incorporating dynamically changing environmental constraints. Several lines of evidence suggest neuronal mechanisms for the initiation of movements, e.g. in the supplementary motor area (SMA) and the premotor cortex, and movement-planning mechanisms generating velocity profiles that satisfy time constraints. In order to meet the requirements of on-line evolving trajectories, we propose a model based on dynamical systems which describes goal-directed trajectories in humans and generates trajectories for redundant anthropomorphic robotic arms. The current study aims to evaluate the temporal characteristics of primary motor and posterior parietal cortex in patients with tetraplegia, using an interception task implemented in virtual reality. The participants will be implanted with two 96-channel intracortical microelectrode arrays in the Primary Motor and Posterior Parietal Cortex. In the training phase the participants will observe a robotic arm intercepting the bob of a pendulum at the lowest point of its trajectory (maximum velocity): the end effector reaches the lowest point of the trajectory at the same time as the bob, performing a perfectly timed movement. The arm is positioned perpendicular to the oscillation plane, exactly at the height of the interception point, to generate a one-dimensional trajectory to the target. The time to contact between the robot's end effector and the bob of the pendulum is kept constant, and over the different sessions the distance between the end effector and the point of interception is gradually increased. In order to catch up and reach in time, either the velocity formation or the initiation time of the movement has to be changed. Both effects will be investigated independently.
For the decoding of movement-related information we introduce a framework exploiting a deep learning approach with convolutional neural networks.},
keywords = {Autonomous robotics, Dynamical systems, movement model, neuroprosthetic},
pubstate = {published},
tppubtype = {misc}
}
@misc{Iossifidis2017b,
title = {Low dimensional representation of human arm movement for efficient neuroprosthetic control by individuals with tetraplegia},
author = {Ioannis Iossifidis and Christian Klaes},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
publisher = {SfN 2017},
abstract = {Over the last decades the generation mechanism and the representation of goal-directed movements have been topics of intensive neurophysiological research. Investigations in the motor, premotor, and parietal areas led to the discovery that the direction of the hand's movement in space is encoded by populations of neurons in these areas, together with many other movement parameters. These distributions of population activation reflect how movements are prepared ahead of movement initiation, as revealed by activity induced by cues that precede the imperative signal (Georgopoulos, 1991). Inspired by those findings, a model based on dynamical systems was proposed both to model goal-directed trajectories in humans and to generate trajectories for redundant anthropomorphic robotic arms. The analysis of the attractor dynamics, based on the qualitative comparison with measurements of resulting trajectories taken from arm movement experiments with humans (Grimme et al., 2012), created a framework able to reproduce and to generate naturalistic, human-like arm trajectories (Iossifidis and Rano, 2013; Iossifidis, Schöner et al., 2006). The main idea of the methodology is to choose low-dimensional behavioral variables such that the goal task can be represented as attractor states of those variables. The movement is generated through a dynamical system with attractors and repellers on the behavioral space, at the goal and constraint positions respectively. When the motion of the robot evolves according to the dynamics of these systems, the behavioral variables will be stabilized at their attractors. Movement is represented by the polar coordinates $\phi$, $\theta$ of the movement direction (heading direction) and the angular frequency $\omega$ of a Hopf oscillator, which generates the velocity profile of the arm movement. Therefore, the system dynamics are expressed in terms of these variables.
The target and each obstacle induce vector fields over these variables such that states where the hand is moving closer to the target are attractive, while states where it is moving towards an obstacle are repellent. Contributions from different sources are weighted by different factors; e.g., in the vicinity of an obstacle, the contribution from that obstacle must dominate the behavior to guarantee constraint satisfaction (collision prevention). Based on three parameters, the presented framework is able to generate temporally stabilized (timed) discrete movements, dealing with disturbances and maintaining an approximately constant movement time. In the current study we will implant two 96-channel intracortical microelectrode arrays in the primary motor and the posterior parietal cortex (PPC) of an individual with tetraplegia. In the training phase the parameters of the dynamical systems will be tuned and optimized by machine learning algorithms. Rather than controlling the arm movement directly and continuously adjusting parameters, the patient adjusts by his or her thoughts the three parameters of the dynamics, which remain almost constant during the movement. Only when the motion plan changes do the parameters have to be readjusted. The target-directed trajectory evolves from the attractor solution of the dynamical systems equations, which means that the trajectory is generated while the system is in a stable stationary state, a fixed-point attractor. The increased degree of assistance lowers the cognitive load of the patient and enables the accomplishment of the desired task without frustration. In addition we aim to replace the robotic manipulator by an exoskeleton for the upper body, which will enable the patient to move his or her own limbs, completing the development of a real neuroprosthetic device for everyday use.},
keywords = {Autonomous robotics, BCI, Dynamical systems, movement model, neuroprosthetic},
pubstate = {published},
tppubtype = {misc}
}
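The attractor/repeller construction described in the abstract above can be illustrated for a single heading-direction variable: the target direction contributes an attractor, each obstacle direction a range-limited repeller. This is a sketch with made-up gains, not the paper's implementation:

```python
import numpy as np

def heading_dynamics(phi, psi_tar, psi_obs, a=2.0, b=4.0, sigma=0.4):
    """Rate of change of the heading direction phi: the target direction
    psi_tar contributes an attractor, each obstacle direction in psi_obs a
    range-limited repeller (gains a, b and range sigma are illustrative)."""
    f = -a * np.sin(phi - psi_tar)                      # attractor at psi_tar
    for psi_o in psi_obs:
        d = phi - psi_o
        f += b * d * np.exp(-d * d / (2.0 * sigma**2))  # repeller at psi_o
    return f

def relax(phi0, psi_tar, psi_obs, dt=0.01, steps=2000):
    """Let phi evolve until it settles in an attractor of the combined field."""
    phi = phi0
    for _ in range(steps):
        phi += dt * heading_dynamics(phi, psi_tar, psi_obs)
    return phi
```

Because the repeller is local (Gaussian-windowed in distance), it dominates only near the obstacle direction, which is how the weighting argument in the abstract guarantees collision prevention without disturbing the behavior elsewhere.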
@conference{nokey,
title = {Low dimensional representation of human arm movement for efficient neuroprosthetic control by individuals with tetraplegia},
author = {Christian Klaes and Ioannis Iossifidis},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
booktitle = {SfN Meeting 2017},
keywords = {Autonomous robotics, BCI, Dynamical systems, movement model, neuroprosthetic},
pubstate = {published},
tppubtype = {conference}
}
2014
@inproceedings{Iossifidis2014a,
title = {Simulated Framework for the Development and Evaluation of Redundant Robotic Systems},
author = {Ioannis Iossifidis},
year = {2014},
date = {2014-01-01},
booktitle = {International Conference on Pervasive and Embedded and Communication Systems, 2014, PECCS2014},
abstract = {In the current work we present a simulated environment for the development and evaluation of multi-redundant open chain manipulators. The framework is implemented in Matlab and provides solutions for the kinematics and dynamics of an arbitrary open chain manipulator. For an anthropomorphic trunk-shoulder-arm configuration with nine degrees of freedom in total, a closed form solution of the inverse kinematics problem is derived. The attractor dynamics approach to motion generation was evaluated within this framework and the results are verified on the real anthropomorphic robotic assistant Cora.},
keywords = {Autonomous robotics, man machine interaction, simulated reality},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Iossifidis2014b,
title = {Development of a Haptic Interface for Safe Human Robot Collaboration},
author = {Ioannis Iossifidis},
year = {2014},
date = {2014-01-01},
booktitle = {International Conference on Pervasive and Embedded and Communication Systems, 2014, PECCS2014},
abstract = {In the context of the increasing number of collaborative workplaces in industrial environments, where humans and robots share the same workspace, safety and intuitive interaction are prerequisites. This means that (1) the robot may have contact with its own body and the surrounding objects, (2) the motion of the robot can be corrected online by the human user just by touching its artificial skin, and (3) the action can be interrupted in dangerous situations. In the current work we introduce a haptic interface (artificial skin) which is utilized to cover the arms of an anthropomorphic robotic assistant. The touch-induced input of the artificial skin is interpreted and fed into the motor control algorithm to generate the desired motion and to avoid harm to human and machine.},
keywords = {Autonomous robotics, direct physical interaction, haptic interface, human robot collaboration, man machine interaction},
pubstate = {published},
tppubtype = {inproceedings}
}
2013
@inproceedings{Iossifidis2013c,
title = {Utilizing artificial skin for direct physical interaction},
author = {Ioannis Iossifidis},
doi = {10.1109/ROBIO.2013.6739562},
year = {2013},
date = {2013-01-01},
urldate = {2013-01-01},
booktitle = {2013 IEEE International Conference on Robotics and Biomimetics, ROBIO 2013},
abstract = {Focusing on the development of flexible robots for industrial and household environments, we identify intuitive teaching as the key feature and direct physical interaction and guidance as the most important interface. In the current work we introduce a multi-redundant robotic assistant equipped with a touch-sensitive skin around the upper arm and the forearm, in order to incorporate contact forces into the arm control. A context-sensitive interpretation of the contact forces is used to guide the attention of the robot, to avoid obstacles, and to let the human operator move the robot arm directly. © 2013 IEEE.},
keywords = {Autonomous robotics, direct physical interaction, haptic interface, human robot collaboration, man machine interaction},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Iossifidis2013a,
title = {Modeling Human Arm Motion by Means of Attractor Dynamics Approach},
author = {Ioannis Iossifidis and Inaki Rano},
year = {2013},
date = {2013-01-01},
booktitle = {Proc. IEEE/RSJ International Conference on Robotics and Biomimetics (RoBio2013)},
abstract = {Autonomous robots with limited computational capacity call for control approaches that generate meaningful, goal-directed behavior without using a large amount of resources. The attractor dynamics approach to movement generation is a framework that links sensor data to motor commands via coupled dynamical systems that have attractors at behaviorally desired states. The low computational demands leave enough system resources for higher-level functions like forming a sequence of local goals to reach a distant one. The comparatively high performance of local behavior generation allows the global planning to be relatively simple.
In the present paper, we apply this approach to generate walking trajectories for a small humanoid robot, the Aldebaran Nao, that are goal-directed and avoid obstacles. The sensor information is a single camera in the head of the robot. The limited field of vision is compensated by head movements. The design of the dynamical system for motion generation and the choice of state variable makes a computationally expensive scene representation or local map building unnecessary.},
keywords = {arm movement model, Autonomous robotics, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Iossifidis2013b,
title = {Motion constraint satisfaction by means of closed form solution for redundant robot arms},
author = {Ioannis Iossifidis},
doi = {10.1109/ROBIO.2013.6739780},
isbn = {978-1-4799-2744-9},
year = {2013},
date = {2013-01-01},
booktitle = {2013 IEEE International Conference on Robotics and Biomimetics, ROBIO 2013},
pages = {2106--2111},
abstract = {Generation of flexible goal-directed movement is the key skill of autonomous articulated robots. Critical points are still the accomplishment of reaching and grasping tasks while satisfying static and dynamically changing constraints given by the environment or caused by the human operator in a collaborative situation. This means that the motion planning dynamics has to incorporate multiple contributions of different qualities, which should be formulated in constraint-specific reference frames and then transformed into the frame of joint velocities, whereby the handling of the contributions to motion planning is determined by the solution of the inverse kinematics problem. In this work a closed form solution of the inverse kinematics problem for an eight degree of freedom arm is presented. The geometrical properties of the multi-redundant arm and the resulting free parameter which determines its null space motion are utilized to satisfy constraints of the desired motion. We implement this system on an eight-DoF redundant manipulator and show its feasibility in a simulation. © 2013 IEEE.},
keywords = {Autonomous robotics, inverse kinematics, motion constraints, redundant robot},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Rano2013,
title = {Modelling human arm motion through the attractor dynamics approach},
author = {Inaki Rano and Ioannis Iossifidis},
doi = {10.1109/ROBIO.2013.6739777},
isbn = {9781479927449},
year = {2013},
date = {2013-01-01},
booktitle = {2013 IEEE International Conference on Robotics and Biomimetics, ROBIO 2013},
pages = {2088--2093},
abstract = {Movement generation in robotics is an old problem with many excellent solutions. Most of them, however, look for optimality according to some metrics, but have no biological inspiration or cannot be used to imitate biological motion. To a human, these techniques behave in a non-naturalistic way. This poses a problem for instance in human-robot interaction and, in general, for a good acceptance of robots in society. The present work presents a new analysis of the attractor dynamics approach to movement generation used in an anthropomorphic robot arm. Our analysis points to the possibility of using this approach to generate human-like arm trajectories in robots. One key property of human trajectories in pick-and-place tasks is the planarity of the trajectory of the end effector in 3D space. We show that this feature is also displayed by the attractor dynamics approach and, therefore, that it is a good candidate for the generation of naturalistic arm movements. © 2013 IEEE.},
keywords = {arm movement model, Autonomous robotics, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {inproceedings}
}
2012
@inproceedings{Noth2012b,
title = {A Versatile Simulated Reality Framework: From Embedded Components to ADAS},
author = {Sebastian Noth and Johann Edelbrunner and Ioannis Iossifidis},
year = {2012},
date = {2012-01-01},
booktitle = {International Conference on Pervasive and Embedded and Communication Systems, 2012, PECCS2012},
keywords = {Autonomous robotics, Machine Learning, simulated reality, Simulation, virtual reality},
pubstate = {published},
tppubtype = {inproceedings}
}
@conference{Iossifidis2012,
title = {Sequence Generation for Grasping Tasks by Means of Dynamical Systems},
author = {Ioannis Iossifidis},
year = {2012},
date = {2012-01-01},
booktitle = {BC12 : Computational Neuroscience \& Neurotechnology Bernstein Conference \& Neurex Annual Meeting 2012},
keywords = {Autonomous robotics, Dynamical systems, grasping, sequence generation},
pubstate = {published},
tppubtype = {conference}
}
2011
@article{Zibner2011,
title = {Dynamic neural fields as building blocks of a cortex-inspired architecture for robotic scene representation},
author = {S K U Zibner and Christian Faubel and Ioannis Iossifidis and G Schöner},
doi = {10.1109/TAMD.2011.2109714},
issn = {19430604},
year = {2011},
date = {2011-01-01},
urldate = {2011-01-01},
journal = {IEEE Transactions on Autonomous Mental Development},
volume = {3},
number = {1},
abstract = {Based on the concepts of dynamic field theory (DFT), we present an architecture that autonomously generates scene representations by controlling gaze and attention, creating visual objects in the foreground, tracking objects, reading them into working memory, and taking into account their visibility. At the core of this architecture are three-dimensional dynamic neural fields (DNFs) that link feature to spatial information. These three-dimensional fields couple into lower dimensional fields, which provide the links to the sensory surface and to the motor systems. We discuss how DNFs can be used as building blocks for cognitive architectures, characterize the critical bifurcations in DNFs, as well as the possible coupling structures among DNFs. In a series of robotic experiments, we demonstrate how the DNF architecture provides the core functionalities of a scene representation. © 2011 IEEE.},
keywords = {Autonomous robotics, dynamic field theory (DFT), Dynamical systems, embodied cognition, neural processing},
pubstate = {published},
tppubtype = {article}
}
@inproceedings{Noth2011,
title = {Simulated reality environment for development and assessment of cognitive robotic systems},
author = {Sebastian Noth and Ioannis Iossifidis},
year = {2011},
date = {2011-01-01},
urldate = {2011-01-01},
booktitle = {Proc. IEEE/RSJ International Conference on Robotics and Biomimetics (RoBio2011)},
abstract = {A simulated reality environment incorporating humans and physically plausibly behaving robots, providing natural interaction channels, with the option to link the simulator to real perception and motion, is gaining importance for the development of cognitive, intuitively interacting and collaborating robotic systems.
In the present work we introduce a head tracking system which is utilized to incorporate human ego motion into the simulated environment, improving immersion in the context of human-robot collaborative tasks.},
keywords = {Autonomous robotics, Machine Learning, Simulation, virtual reality},
pubstate = {published},
tppubtype = {inproceedings}
}
@conference{Noth2011a,
title = {Benefits of ego motion feedback for interactive experiments in virtual reality scenarios},
author = {S Noth and Ioannis Iossifidis},
year = {2011},
date = {2011-01-01},
urldate = {2011-01-01},
booktitle = {BC11 : Computational Neuroscience \& Neurotechnology Bernstein Conference \& Neurex Annual Meeting 2011},
keywords = {Autonomous robotics, Machine Learning, simulated reality, Simulation, virtual reality},
pubstate = {published},
tppubtype = {conference}
}
2010
@inproceedings{Reimann2010a,
title = {Integrating orientation constraints into the attractor dynamics approach for autonomous manipulation},
author = {Hendrik Reimann and Ioannis Iossifidis and Gregor Schöner},
url = {http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5686349},
doi = {10.1109/ICHR.2010.5686349},
isbn = {978-1-4244-8688-5},
year = {2010},
date = {2010-12-01},
urldate = {2010-12-01},
booktitle = {2010 10th IEEE-RAS International Conference on Humanoid Robots},
pages = {294--301},
publisher = {IEEE},
abstract = {When autonomous robots generate behavior in complex environments they must satisfy multiple different constraints such as moving toward a target, avoidance of obstacles, or alignment of the gripper with a particular orientation. It is often convenient to represent each type of constraint in a specific reference frame, so that the satisfaction of all constraints requires transformation into a shared base frame. In the attractor dynamics approach, behavior is generated as an attractor solution of a dynamical system that is formulated in such a base frame to enable control. Each constraint contributes an attractive (for targets) or repulsive (for obstacles) component to the vector field. Here we show how these dynamic contributions can be formulated in different reference frames suited to each constraint and then be transformed and integrated within the base frame. Building on earlier work, we show how the orientation of the gripper can be integrated with other constraints on the movement of the manipulator. We also show, how an attractor dynamics of “neural” activation variables can be designed that activates and deactivates the different contributions to the vector field over time to generate a sequence of component movements. As a demonstration, we treat a manipulation task in which grasping oblong cylindrical objects is decomposed into an ensemble of separate constraints that are integrated and resolved using the attractor dynamics approach. The system is implemented on the small humanoid robot Nao, and illustrated in two exemplary movement tasks.},
keywords = {attractor dynamics approach, Autonomous robotics, Dynamical systems, inverse kinematics},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibner2010,
title = {Scenes and tracking with dynamic neural fields: How to update a robotic scene representation},
author = {S K U Zibner and Christian Faubel and Ioannis Iossifidis and G Schöner and J P Spencer},
doi = {10.1109/DEVLRN.2010.5578837},
isbn = {9781424469024},
year = {2010},
date = {2010-01-01},
urldate = {2010-01-01},
booktitle = {2010 IEEE 9th International Conference on Development and Learning, ICDL-2010 - Conference Program},
abstract = {We present an architecture based on the Dynamic Field Theory for the problem of scene representation. At the core of this architecture are three-dimensional neural fields linking feature to spatial information. These three-dimensional fields are coupled to lower-dimensional fields that provide both a close link to the sensory surface and a close link to motor behavior. We highlight the updating mechanism of this architecture, both when a single object is selected and followed by the robot's head in smooth pursuit and in multi-item tracking when several items move simultaneously. © 2010 IEEE.},
keywords = {Autonomous robotics, dynamic field theory (DFT), Dynamical systems, embodied cognition, neural processing},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibneri,
title = {Scene and Tracking with Dynamic Neural Field Approach},
author = {Stephan Zibner and Christian Faubel and Ioannis Iossifidis and Gregor Schöner and John P Spencer},
year = {2010},
date = {2010-01-01},
booktitle = {ISR / ROBOTIK 2010},
address = {Munich, Germany},
abstract = {An internal representation of a scene is essential to generate actions on scene objects. A stabilized storage of object location and features offers the flexibility to process queries phrased in human-based terms relating to objects, which may not be in the current camera view. Scene representation is therefore an internal representation of the surrounding world that is stabilized against head and body movement. It contains associated information about location and features of objects. Because objects and bodies move, scene representation is not a one-time process, but a constantly scene-adapting mechanism of scanning for, storing, updating, and deleting information.
Our novel architecture incorporates the generation of autonomous scanning sequences on real-time camera images. The head can then be oriented towards a selected object and the color feature can be extracted. Object location and feature information are associatively stored in a three-dimensional Dynamic Neural Field. Changes in the scene, even for multiple objects, can be tracked simultaneously. The stored information is used to generate behavior for cued recall. Cues can be table regions, features, or object labels. The robot demonstrates a successful recall by centering its gaze on the stated object.},
keywords = {Autonomous robotics, dynamic neural field, Dynamical systems, man machine interaction, scene representation, speech recognition},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibnersubmittedb,
title = {Scene Representation with Dynamic Neural Fields: An Example of Complex Cognitive Architectures Based on Dynamic Neural Field Theory},
author = {Stephan K U Zibner and Christian Faubel and Ioannis Iossifidis and Gregor Schöner},
year = {2010},
date = {2010-01-01},
booktitle = {Proc. Int. Conf. on Development and Learning (ICDL10)},
keywords = {Autonomous robotics, dynamic neural field, Dynamical systems, man machine interaction, scene representation, speech recognition},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibner2010c,
title = {Scenes and Tracking with Dynamic Neural Fields: How to Update a Robotic Scene Representation},
author = {Stephan K U Zibner and Christian Faubel and John P Spencer and Ioannis Iossifidis and Gregor Schöner},
year = {2010},
date = {2010-01-01},
booktitle = {Proc. Int. Conf. on Development and Learning (ICDL10)},
keywords = {Autonomous robotics, dynamic neural field, Dynamical systems, man machine interaction, scene representation, speech recognition},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibner2010ab,
title = {Scene Representation for Anthropomorphic Robots: A Dynamic Neural Field Approach},
author = {Stephan K U Zibner and Christian Faubel and Ioannis Iossifidis and Gregor Schöner},
url = {http://www.vde-verlag.de/proceedings-en/453273138.html},
year = {2010},
date = {2010-01-01},
booktitle = {ISR / ROBOTIK 2010},
number = {Isr},
publisher = {VDE VERLAG GmbH},
address = {Munich, Germany},
abstract = {An internal representation of a scene is essential to generate actions on scene objects. A stabilized storage of object location and features offers the flexibility to process queries phrased in human-based terms relating to objects, which may not be in the current camera view. Scene representation is therefore an internal representation of the surrounding world that is stabilized against head and body movement. It contains associated information about location and features of objects. Because objects and bodies move, scene representation is not a one-time process, but a constantly scene-adapting mechanism of scanning for, storing, updating, and deleting information.
Our novel architecture incorporates the generation of autonomous scanning sequences on real-time camera images. The head can then be oriented towards a selected object and the color feature can be extracted. Object location and feature information are associatively stored in a three-dimensional Dynamic Neural Field. Changes in the scene, even for multiple objects, can be tracked simultaneously. The stored information is used to generate behavior for cued recall. Cues can be table regions, features, or object labels. The robot demonstrates a successful recall by centering its gaze on the stated object.},
keywords = {Autonomous robotics, dynamic neural field, Dynamical systems, man machine interaction, scene representation, speech recognition},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Sandamirskayasubmitted,
title = {Natural human-robot interaction through spatial language: a dynamic neural fields approach},
author = {Yulia Sandamirskaya and John Lipinski and Ioannis Iossifidis and G Schöner},
url = {http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5598671},
issn = {1944-9445},
year = {2010},
date = {2010-01-01},
booktitle = {Proc. 19th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2010)},
pages = {600--607},
publisher = {IEEE},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, man machine interaction, movement model, speech recognition},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Reimannd,
title = {End-effector obstacle avoidance using multiple dynamic variables},
author = {Hendrik Reimann and Ioannis Iossifidis and Gregor Schöner},
year = {2010},
date = {2010-01-01},
booktitle = {ISR / ROBOTIK 2010},
address = {Munich, Germany},
abstract = {The avoidance of obstacles is a crucial part of the generation of behavior for autonomous robotic agents. A standard method to produce trajectories to a given target that avoid a number of possibly mobile obstacles is the potential field approach introduced by Khatib, where an artificial potential field is constructed around target and obstacles, with the target acting as a global minimum and the obstacles as local maxima, the gradient of which is used to determine the (artificial) force acting on the robot at any moment. While the potential field approach has been used extensively for vehicle motion in a plane, applications for robotic manipulators suffer from a high level of complexity, because the formulation of constraints as forces necessitates the inclusion of dynamic properties of the manipulator into the system. We pursue a different solution to the problem of manipulator obstacle avoidance based on the dynamic approach to robotics, which states that all behavioral constraints for the generation of movement should be formulated as attractors or repellors of a dynamical system. The problem of behavior design is thus separated from the control problem of how to realize the designed behavior, bringing the advantage of simplicity in the formulation of the former.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model, obstacle avoidance},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Zibner2010b,
title = {Scene representation for anthropomorphic robots: A dynamic neural field approach},
author = {S K U Zibner and Christian Faubel and Ioannis Iossifidis and G Schöner},
isbn = {9781617387197},
year = {2010},
date = {2010-01-01},
urldate = {2010-01-01},
booktitle = {Joint 41st International Symposium on Robotics and 6th German Conference on Robotics 2010, ISR/ROBOTIK 2010},
volume = {2},
abstract = {For autonomous robotic systems, the ability to represent a scene, to memorize and track objects and their associated features is a prerequisite for reasonable interactive behavior. In this paper, we present a biologically inspired architecture for scene representation that is based on Dynamic Field Theory. At the core of the architecture we make use of three-dimensional Dynamic Neural Fields for representing space-feature associations. These associations are built up autonomously in a sequential way and they are maintained and continuously updated. We demonstrate these capabilities in two experiments on an anthropomorphic robotic platform. In the first experiment we show the sequential scanning of a scene. The second experiment demonstrates the maintenance of associations for objects, which get out of view, and the correct update of the scene representation, if such objects are removed.},
keywords = {Autonomous robotics, dynamic field theory (DFT), Dynamical systems, embodied cognition, neural processing},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{Grimm2010b,
title = {Behavioral Organization for Mobile Robotic Systems: An Attractor Dynamics Approach},
author = {Matthias Grimm and Ioannis Iossifidis},
year = {2010},
date = {2010-01-01},
booktitle = {ISR / ROBOTIK 2010},
address = {Munich, Germany},
abstract = {Autonomous systems generate different behaviors based on the perceived environmental situation. The organization of a set of behaviors plays an important role in the field of autonomous robotics. The organization architecture must be flexible, so that behavioral changes are possible if the sensory information changes. Furthermore, behavioral organization must be stable, so that small changes in sensory information do not lead to oscillations. To achieve this, all behaviors, but also the underlying organization architecture, are based on continuous dynamical systems. They are characterized by a set of dynamical variables, also referred to as state variables. These variables represent the activation or deactivation of a particular behavior. Elementary behaviors depend on the sensor input in such a way that changes in the sensory information lead to qualitatively different behaviors. The so-called sensor context denotes whether a behavior is applicable in the current sensor situation or not. However, for complex systems consisting of many elementary behaviors, it is necessary to take logical conditions into account to generate a sequence of behaviors. Furthermore, some elementary behaviors can or even must run in parallel, while others exclude each other. This internal information requires knowledge about the logical interaction of the behaviors and is stored within binary matrices. This makes the overall organization structure very flexible and easy to extend. We present the architecture using the example of approaching and passing a door. The robot has to navigate from one room to another while simultaneously avoiding obstacles in its pathway.},
keywords = {Autonomous robotics, behavior generation, Dynamical systems, movement model, mobile robot},
pubstate = {published},
tppubtype = {inproceedings}
}
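The activation dynamics described in the abstract above can be illustrated with a minimal sketch. The pitchfork-plus-competition form, the parameter names, and all numerical values are illustrative assumptions, not taken from the paper; only the idea of state variables gated by a sensor context and coupled through a binary exclusion matrix follows the abstract.

```python
import numpy as np

def step(a, M, applicable, alpha=2.0, beta=4.0, dt=0.01):
    # Each activation variable a_i grows toward 1 while its behavior is
    # applicable (sensor context), and is suppressed by active competitors
    # listed in the binary exclusion matrix M (assumed form).
    growth = alpha * applicable * a * (1.0 - a * a)
    competition = beta * (M @ (a * a)) * a
    return np.clip(a + dt * (growth - competition), 0.0, 1.0)

# Two mutually exclusive behaviors; only the first fits the sensor context.
M = np.array([[0, 1], [1, 0]], dtype=float)
a = np.array([0.1, 0.1])
applicable = np.array([1.0, 0.0])
for _ in range(2000):
    a = step(a, M, applicable)
print(a.round(2))  # first behavior active, second suppressed
```

Small sensory fluctuations only perturb the state around a stable fixed point, which is the stability property the abstract emphasizes.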
2009
@inproceedings{Tuma2009b,
title = {Temporal stabilization of discrete movement in variable environments: An attractor dynamics approach},
author = {M Tuma and Ioannis Iossifidis and G Schöner},
url = {http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5152562},
doi = {10.1109/ROBOT.2009.5152562},
isbn = {978-1-4244-2788-8},
year = {2009},
date = {2009-05-01},
urldate = {2009-05-01},
booktitle = {2009 IEEE International Conference on Robotics and Automation},
pages = {863--868},
publisher = {IEEE},
abstract = {The ability to generate discrete movement with distinct and stable time courses is important for interaction scenarios both between different robots and with human partners, for catching and interception tasks, and for timed action sequences. In dynamic environments, where trajectories are evolving online, this is not a trivial task. The dynamical systems approach to robotics provides a framework for robust incorporation of fluctuating sensor information, but control of movement time is usually restricted to rhythmic motion and realized through stable limit cycles. The present work uses a Hopf oscillator to produce discrete motion and formulates an online adaptation rule to stabilize total movement time against a wide range of disturbances. This is integrated into a dynamical systems framework for the sequencing of movement phases and for directional navigation, using 2D-planar motion as an example. The approach is demonstrated on a Khepera mobile unit in order to show its reliability even when depending on low-level sensor information.},
keywords = {attractor dynamics approach, Autonomous robotics, Dynamical systems, hopf oscillator},
pubstate = {published},
tppubtype = {inproceedings}
}
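The Hopf oscillator at the core of this approach can be sketched as follows. The step size, parameters, and plain Euler integration are assumptions for illustration, and the paper's online adaptation rule for total movement time is omitted; the sketch only shows the stable limit cycle that makes the timing robust.

```python
import numpy as np

def hopf_step(x, y, mu, omega, dt):
    # Hopf normal form: a stable limit cycle of radius sqrt(mu),
    # traversed with angular velocity omega.
    r2 = x * x + y * y
    return (x + dt * ((mu - r2) * x - omega * y),
            y + dt * ((mu - r2) * y + omega * x))

x, y = 0.1, 0.0                      # small perturbation off the origin
mu, omega, dt = 1.0, 2.0 * np.pi, 0.001
for _ in range(20000):               # 20 s of simulated time
    x, y = hopf_step(x, y, mu, omega, dt)
print(round(np.hypot(x, y), 2))      # settles near sqrt(mu) = 1.0
```

Because the cycle is attracting, perturbations of the state decay and the period (set by omega) is recovered, which is what the adaptation rule in the paper exploits to stabilize movement time.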
@inproceedings{Tuma2009,
title = {Temporal Stabilization of Discrete Movement in Variable Environments: An Attractor Dynamics Approach},
author = {Matthias Tuma and Ioannis Iossifidis and Gregor Schöner},
year = {2009},
date = {2009-01-01},
booktitle = {Proc. IEEE International Conference on Robotics and Automation ICRA '09},
pages = {863--868},
address = {Kobe, Japan},
abstract = {The ability to generate discrete movement with distinct and stable time courses
is important for interaction scenarios both between different robots and with human partners,
for catching and interception tasks, and for timed action sequences.
In dynamic environments, where trajectories are evolving on-line, this is not a trivial task.
The dynamical systems approach to robotics provides a framework for robust
incorporation of fluctuating sensor information, but control of movement time is usually
restricted to rhythmic motion and realized through stable limit cycles. The present work
uses a Hopf oscillator to produce discrete motion and formulates an on-line adaptation rule
to stabilize total movement time against a wide range of disturbances. This is integrated into
a dynamical systems framework for the sequencing of movement phases and for directional navigation, using 2D-planar motion
as an example. The approach is demonstrated on a Khepera mobile unit in order to show its
reliability even when depending on low-level sensor information.},
keywords = {attractor dynamics approach, Autonomous robotics, Dynamical systems, hopf oscillator},
pubstate = {published},
tppubtype = {inproceedings}
}
2006
@book{Iossifidis2006b,
title = {Dynamische Systeme zur Steuerung anthropomorpher Roboterarme in autonomen Robotersystemen},
author = {Ioannis Iossifidis},
url = {http://www.logos-verlag.de/cgi-bin/engbuchmid?isbn=1305&lng=deu&id=},
year = {2006},
date = {2006-08-01},
urldate = {2006-08-01},
isbn = {3-8325-1305-1},
pages = {160},
publisher = {Logos Verlag Berlin},
abstract = {The broader research field in which this work is embedded is concerned with the study of information-processing in the brain and the application of the resulting insights to technical systems.
In analogy to biological systems, whose structure results from the demands the environment places on their behavior, anthropomorphism is derived as a design principle for the structure of robotic assistance systems that interact with humans.
In this work, the author addresses the problem of generating motor behavior in three-dimensional space, using an anthropomorphic robot arm within an anthropomorphic robotic assistance system as an example.
To this end, a general approach was developed that comprises and unifies the generation of motor behavior in 3D space, the forward simulation of dynamical systems for system diagnosis and for finding desired system states, and a concept for the organization of behavior.
Nonlinear dynamical systems form the mathematical foundation and the uniform formal language of the approach, through which both the robot's motor behavior and its time-continuous subsystems are coupled by feedback.},
keywords = {Autonomous robotics, Dynamical systems, inverse kinematics},
pubstate = {published},
tppubtype = {book}
}
@phdthesis{Iossifidis2006c,
title = {Dynamische Systeme zur Steuerung anthropomorpher Roboterarme in autonomen Robotersystemen},
author = {Ioannis Iossifidis},
url = {http://www.logos-verlag.de/cgi-bin/engbuchmid?isbn=1305&lng=deu&id=},
year = {2006},
date = {2006-01-01},
urldate = {2006-01-01},
isbn = {3-8325-1305-1},
pages = {160},
publisher = {Logos Verlag Berlin},
address = {Bochum, Germany},
school = {Faculty for Physics and Astronomy, Ruhr-University Bochum},
abstract = {The broader research field in which this work is embedded is concerned with the study of information-processing in the brain and the application of the resulting insights to technical systems. In analogy to biological systems, whose structure results from the demands the environment places on their behavior, anthropomorphism is derived as a design principle for the structure of robotic assistance systems that interact with humans. In this work, the author addresses the problem of generating motor behavior in three-dimensional space, using an anthropomorphic robot arm within an anthropomorphic robotic assistance system as an example. To this end, a general approach was developed that comprises and unifies the generation of motor behavior in 3D space, the forward simulation of dynamical systems for system diagnosis and for finding desired system states, and a concept for the organization of behavior. Nonlinear dynamical systems form the mathematical foundation and the uniform formal language of the approach, through which both the robot's motor behavior and its time-continuous subsystems are coupled by feedback.},
keywords = {Autonomous robotics, Dynamical systems, inverse kinematics},
pubstate = {published},
tppubtype = {phdthesis}
}
2005
@book{Iossifidis2005a,
title = {A cooperative robotic assistant for human environments},
author = {Ioannis Iossifidis and Carsten Bruckhoff and C Theis and Claudia Grote and Christian Faubel and G Schöner},
issn = {1610-7438},
year = {2005},
date = {2005-01-01},
urldate = {2005-01-01},
booktitle = {Springer Tracts in Advanced Robotics},
volume = {14},
abstract = {CoRA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels comprising vision, audition, haptics, and force sensing are used to extract perceptual information about speech, gestures and gaze of the operator, and object recognition. The anthropomorphic robot arm makes goal-directed movements to pick up and hand-over objects. The human operator may mechanically interact with the arm by pushing it away (haptics) or by taking an object out of the robot's gripper (force sensing). The design objective has been to exploit the human operator's intuition by modeling the mechanical structure, the senses, and the behaviors of the assistant on human anatomy, human perception, and human motor behavior. © Springer-Verlag Berlin Heidelberg 2005.},
keywords = {Autonomous robotics, direct physical interaction, haptic interface, human robot collaboration, man machine interaction},
pubstate = {published},
tppubtype = {book}
}
@book{Iossifidis2005c,
title = {Behavior generation for Anthropomorphic robots by means of dynamical systems},
author = {Ioannis Iossifidis and A Steinhage},
issn = {1610-7438},
year = {2005},
date = {2005-01-01},
urldate = {2005-01-01},
booktitle = {Springer Tracts in Advanced Robotics},
volume = {14},
abstract = {This article describes the current state of our research on anthropomorphic robots. Our aim is to make the reader familiar with the two basic principles our work is based on: anthropomorphism and dynamics. The principle of anthropomorphism means a restriction to human-like robots which use vision, audition and touch as their only sensors so that natural man-machine interaction is possible. The principle of dynamics stands for the mathematical framework based on which our robots generate their behavior. Both principles have their root in the idea that concepts of biological behavior and information processing can be exploited to control technical systems. © Springer-Verlag Berlin Heidelberg 2005.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {book}
}
2004
@book{Prassler2004,
title = {Advances in Human Robot Interaction},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-102-22-35029562-0,00.html?changeHeader=true},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
isbn = {3-540-23211-7},
pages = {414},
publisher = {Springer Press},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {Human Robot Interaction and Cooperation
Motion Coordination
Multi-Modal Robot Interfaces
Physical Interaction between Humans and Robots
Robot Learning
Visual Instruction of Robots},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {book}
}
@book{Prassler2004c,
title = {Advances in Human Robot Interaction (Springer Tracts in Advanced Robotics)},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.amazon.co.uk/Advances-Interaction-Springer-Advanced-Robotics/dp/3540232117},
isbn = {3540232117},
year = {2004},
date = {2004-01-01},
pages = {414},
publisher = {Springer},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm},
pubstate = {published},
tppubtype = {book}
}
@book{Prassler2004b,
title = {Advances in Human Robot Interaction},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-102-22-35029562-0,00.html?changeHeader=true},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
pages = {414},
publisher = {Springer Press},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {Human Robot Interaction and Cooperation Motion Coordination Multi-Modal Robot Interfaces Physical Interaction between Humans and Robots Robot Learning Visual Instruction of Robots},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm},
pubstate = {published},
tppubtype = {book}
}
@incollection{Iossifidis2004e,
title = {Behavior Generation For Anthropomorphic Robots by Means of Dynamical Systems},
author = {Ioannis Iossifidis and Axel Steinhage},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springerlink.com/index/96DD6AB012CF71E7},
doi = {10.1007/b97960},
isbn = {3-540-23211-7},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
pages = {269--300},
publisher = {Springer Press},
chapter = {6},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {This article describes the current state of our research on anthropomorphic robots. Our aim is to make the reader familiar with the two basic principles our work is based on: anthropomorphism and dynamics. The principle of anthropomorphism means a restriction to human-like robots which use vision, audition and touch as their only sensors so that natural man-machine interaction is possible. The principle of dynamics stands for the mathematical framework based on which our robots generate their behavior. Both principles have their root in the idea that concepts of biological behavior and information processing can be exploited to control technical systems.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {incollection}
}
@incollection{Iossifidis2004c,
title = {Towards Benchmarking of Domestic Robotic Assistants},
author = {Ioannis Iossifidis and Gisbert Lawitzky and Stephan Knoop and Raoul Zöllner},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springerlink.com/index/AB4F63B9DADFE299},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
isbn = {3-540-23211-7},
pages = {403--414},
publisher = {Springer Press},
chapter = {7},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {As service robotics research advances rapidly, availability of objective, reproducible test specifications and evaluation criteria and also of benchmarking is more and more felt to be desirable in the community. As a first step towards benchmarking, in this paper we propose a formalization of tests - exemplified for domestic grasp&place tasks. The underlying philosophy of our approach is to confront the robot system in a black-box manner with requirements of a "rational customer", and characterize the performance of the system in an objective way by the outcomes of a test-suite tailored to this scenario. A formalized single test description consists of a clear and reproducible specification of the robot's task and the full context on the one hand, and a number of figures which objectively characterize the test result on the other hand. We illustrate this methodology for the domestic assistance scenario.},
keywords = {Autonomous robotics, benchmarking, human robot collaboration, man machine interaction},
pubstate = {published},
tppubtype = {incollection}
}
@incollection{Iossifidis2004d,
title = {A Cooperative Robot Assistant CoRA For Human Environments},
author = {Ioannis Iossifidis and Carsten Bruckhoff and Christoph Theis and Claudia Grote and Christian Faubel and Gregor Schöner},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springerlink.com/index/91656F7B99CD2C2C},
doi = {10.1007/b97960},
isbn = {3-540-23211-7},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
pages = {385--401},
publisher = {Springer Press},
chapter = {7},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {CoRA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels comprising vision, audition, haptics, and force sensing are used to extract perceptual information about speech, gestures and gaze of the operator, and object recognition. The anthropomorphic robot arm makes goal-directed movements to pick up and hand-over objects. The human operator may mechanically interact with the arm by pushing it away (haptics) or by taking an object out of the robot's gripper (force sensing). The design objective has been to exploit the human operator's intuition by modeling the mechanical structure, the senses, and the behaviors of the assistant on human anatomy, human perception, and human motor behavior.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {incollection}
}
@inproceedings{Iossifidis2004a,
title = {Attractor dynamics approach for autonomous collision-free path generation in 3d-space for an 7 dof robot arm},
author = {Ioannis Iossifidis and Gregor Schöner},
year = {2004},
date = {2004-01-01},
booktitle = {Proceedings of the ROBOTIK 2004, Leistungsstand - Anwendungen - Visionen - Trends, number 1841 in VDI-Berichte},
pages = {815--822},
publisher = {VDI Verlag},
address = {München, Germany},
organization = {VDI/VDE},
keywords = {arm movement model, Autonomous robotics, collision avoidance, Dynamical systems, inverse kinematics, movement model},
pubstate = {published},
tppubtype = {inproceedings}
}
2001
@inproceedings{Seelen2001,
title = {Visually guided behavior of an autonomous robot with a neuronal architecture},
author = {Werner Seelen and Ioannis Iossifidis and Axel Steinhage},
year = {2001},
date = {2001-01-01},
booktitle = {2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, CIRA 2001},
address = {Banff, Canada},
organization = {IEEE},
series = {CIRA 2001, Workshop Vision-Based Object Recognition in Robotics},
abstract = {We constructed two robot systems, both with a "neuronal architecture". The first (ARNOLD) is able to visually explore an unknown environment, to navigate in this environment, and to use its 7-DOF arm to grasp and transport objects. The system can be guided by gestures and a limited set of spoken commands. The second system (CORA) is stationary and is intended to cooperate with a human at a production line in an interactive assembly process. Our contribution focuses on the vision problems. In both cases we use a 2-DOF stereo camera system. Visual navigation is based on "place fields" obtained by correlating the current view with stored views captured at strategic points. This can be combined with trajectory finding on the basis of nonlinear dynamics. Obstacles are avoided by repellors in the trajectory equation and by inverse perspective mapping. The position and form of objects are evaluated with respect to finding an appropriate grasping configuration for selected objects. Scene analysis in the CORA system presupposes estimating the view direction of the human partner. Then a limited set of objects can be detected and, if necessary, tracked (Hausdorff distance). The actual analysis of the entire scene relies on the relation of the detected objects to each other within the environment, on the task to be fulfilled, and on the step reached within the entire task. The different estimations and detections required within the sequences are coded in terms of neural fields. In this way the visual perception, the interactive communication, and the visually guided behavior are realized in the same format.},
keywords = {active stereo camera system, Autonomous robotics, human hand tracking, human pointing gesture, image processing, interactive robot control, Machine Learning, multi-modal man-machine interaction system},
pubstate = {published},
tppubtype = {inproceedings}
}
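The place-field navigation described in this abstract, matching the current view against stored snapshot views, can be illustrated with a minimal sketch. The use of normalized cross-correlation, the image sizes, and the noise level are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def ncc(a, b):
    # Normalized cross-correlation (Pearson correlation of pixel values);
    # assumed here as the view-similarity measure.
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

rng = np.random.default_rng(0)
stored = [rng.normal(size=(32, 32)) for _ in range(3)]  # views at known places
current = stored[1] + 0.1 * rng.normal(size=(32, 32))   # noisy revisit of place 1
scores = [ncc(current, v) for v in stored]
print(int(np.argmax(scores)))  # → 1: the best-matching snapshot localizes the robot
```

The resulting place estimate can then feed the nonlinear trajectory dynamics mentioned in the abstract, with obstacles entering as repellors.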
1999
@phdthesis{Iossifidis1999c,
title = {Visuelle Navigation auf einem autonomen mobilen Roboter},
author = {Ioannis Iossifidis},
year = {1999},
date = {1999-01-01},
school = {Fakultät für Physik, Technische Universität Dortmund},
keywords = {Autonomous robotics, Machine Learning, place cells, robot navigation},
pubstate = {published},
tppubtype = {phdthesis}
}