Teaching area: Theoretical Computer Science and Artificial Intelligence
Office: 01.214
Lab: 04.105
Phone: +49 208 88254-806
E-mail:
Ioannis Iossifidis studied physics (specializing in theoretical particle physics) at the University of Dortmund and received his doctorate in 2006 from the Faculty of Physics and Astronomy at the Ruhr University Bochum.
At the Institut für Neuroinformatik, Prof. Dr. Iossifidis headed the Autonomous Robotics group and, together with his research group, successfully participated in numerous research projects on artificial intelligence funded by the BMBF and the EU. Since 1 October 2010 he has been working at the HRW in the Computer Science Institute, where he holds the chair of Theoretical Computer Science – Artificial Intelligence.
For more than 20 years, Prof. Dr. Ioannis Iossifidis has been developing biologically inspired, anthropomorphic, autonomous robot systems, which are both part and product of his research in computational neuroscience. In this context he has developed models of information processing in the human brain and applied them to technical systems.
Established focal points of his scientific work in recent years are the modeling of human arm movements, the design of so-called "simulated realities" for simulating and evaluating the interaction between humans, machines, and the environment, and the development of cortical exoprosthetic components. The development of the theory and application of machine learning algorithms based on deep neural architectures forms the cross-cutting theme of his research.
Ioannis Iossifidis' research has been supported, among others, by funding within large grant programs of the BMBF (NEUROS, MORPHA, LOKI, DESIRE, Bernstein Focus: Neural Basis of Learning, etc.), the DFG ("Motor-parietal cortical neuroprosthesis with somatosensory feedback for restoring hand and arm functions in tetraplegic patients"), and the EU (Neural Dynamics (STREP), EUCogII, EUCogIII), and he is among the winners of the Leitmarkt (lead market) competitions Gesundheit.NRW and IKT.NRW 2019.
MAIN AREAS OF WORK AND RESEARCH
- Computational Neuroscience
- Brain-Computer Interfaces
- Development of cortical exoprosthetic components
- Theory of neural networks
- Modeling of human arm movements
- Simulated reality
SCIENTIFIC FACILITIES
- Laboratory (with link)
- ???
- ???
COURSES
- ???
- ???
- ???
PROJECTS
- Project (with link)
- ???
- ???
RESEARCH STAFF
Felix Grün
Office: 02.216 (Campus Bottrop)
Marie Schmidt
Office: 02.216 (Campus Bottrop)
Aline Xavier Fidencio
Visiting researcher
Muhammad Ayaz Hussain
Doctoral candidate
Tim Sziburis
Doctoral candidate
Farhad Rahmat
Student assistant
SELECTED PUBLICATIONS
2014
16. Iossifidis, Ioannis: Simulated Framework for the Development and Evaluation of Redundant Robotic Systems. Proceedings article.
In: International Conference on Pervasive and Embedded and Communication Systems (PECCS 2014), 2014.
Keywords: Autonomous robotics, man-machine interaction, simulated reality.
Abstract: In this work we present a simulated environment for the development and evaluation of multi-redundant open-chain manipulators. The framework is implemented in Matlab and provides solutions for the kinematics and dynamics of an arbitrary open-chain manipulator. For an anthropomorphic trunk-shoulder-arm configuration with nine degrees of freedom in total, a closed-form solution of the inverse kinematics problem is derived. The attractor dynamics approach to motion generation was evaluated within this framework, and the results were verified on the real anthropomorphic robotic assistant Cora.
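Note: the framework itself is Matlab-based and is not reproduced on this page. As a rough illustration of the kind of computation it wraps (forward kinematics of an arbitrary open kinematic chain), the following Python sketch composes homogeneous transforms from Denavit-Hartenberg parameters; the parameter values are placeholders, not those of the robot Cora.

import numpy as np

def dh_transform(theta, d, a, alpha):
    # Homogeneous transform of one link, standard Denavit-Hartenberg convention.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    # Pose of the last link in the base frame: product of the per-joint transforms.
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder 9-DOF trunk-shoulder-arm-like chain; NOT the real Cora parameters.
dh_params = [(0.3, 0.0, np.pi / 2)] + [(0.0, 0.25, 0.0)] * 8
pose = forward_kinematics(np.zeros(9), dh_params)
print(pose[:3, 3])  # Cartesian position of the end effector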
15. Iossifidis, Ioannis: Development of a Haptic Interface for Safe Human Robot Collaboration. Proceedings article.
In: International Conference on Pervasive and Embedded and Communication Systems (PECCS 2014), 2014.
Keywords: Autonomous robotics, direct physical interaction, haptic interface, human-robot collaboration, man-machine interaction.
Abstract: Given the increasing number of collaborative workplaces in industrial environments, where humans and robots share the same workspace, safety and intuitive interaction are prerequisites. This means that the robot can (1) tolerate contact with its own body and with surrounding objects, (2) have its motion corrected online by the human user simply by touching its artificial skin, and (3) interrupt its action in dangerous situations. In this work we introduce a haptic interface (artificial skin) which is used to cover the arms of an anthropomorphic robotic assistant. The touch-induced input from the artificial skin is interpreted and fed into the motor control algorithm to generate the desired motion and to avoid harm to human and machine.
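Note: the exact control law linking the skin signal to the arm controller is not given on this page. A minimal, hypothetical Python sketch of the general idea (an admittance-style correction with made-up gains and a made-up safety threshold) could look like this:

import numpy as np

# Hypothetical admittance-style reaction to skin contact: the measured contact
# force becomes a small Cartesian velocity correction, so the arm yields to a
# touch or stops on hard contact. Gains and threshold are illustrative only.
COMPLIANCE_GAIN = 0.02   # m/s per N (made-up value)
STOP_FORCE = 40.0        # N; contact above this aborts the motion

def touch_corrected_velocity(planned_velocity, contact_force):
    # planned_velocity, contact_force: 3-vectors in the robot base frame
    if np.linalg.norm(contact_force) > STOP_FORCE:
        return np.zeros(3)  # safety stop on hard contact
    # Yield in the direction of the applied force (direct physical guidance).
    return planned_velocity + COMPLIANCE_GAIN * contact_force

v_cmd = touch_corrected_velocity(np.array([0.1, 0.0, 0.0]),
                                 np.array([0.0, 5.0, 0.0]))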
14. Iossifidis, Ioannis: Development of a Haptic Interface for Safe Human Robot Collaboration. Proceedings article.
In: International Conference on Pervasive and Embedded and Communication Systems, 2012, PECCS2014, 2014.
Keywords: Autonomous robotics, direct physical interaction, haptic interface, human-robot collaboration, man-machine interaction.
13. Iossifidis, Ioannis: Simulated Framework for the Development and Evaluation of Redundant Robotic Systems. Proceedings article.
In: International Conference on Pervasive and Embedded and Communication Systems, 2012, PECCS2014, 2014.
Keywords: Autonomous robotics, man-machine interaction, simulated reality.
12. Iossifidis, Ioannis: Development of a Haptic Interface for Safe Human Robot Collaboration. Proceedings article.
In: International Conference on Pervasive and Embedded and Communication Systems, 2012, PECCS2014, 2014.
Keywords: Autonomous robotics, direct physical interaction, haptic interface, human-robot collaboration, man-machine interaction.
2013
11. Iossifidis, Ioannis: Utilizing artificial skin for direct physical interaction. Proceedings article.
In: 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO 2013), 2013. DOI: 10.1109/ROBIO.2013.6739562.
Keywords: Autonomous robotics, direct physical interaction, haptic interface, human-robot collaboration, man-machine interaction.
Abstract: Focusing on the development of flexible robots for industrial and household environments, we identify intuitive teaching as the key feature and direct physical interaction and guidance as the most important interface. In this work we introduce a multi-redundant robotic assistant equipped with a touch-sensitive skin around the upper arm and the forearm, in order to incorporate contact forces into the arm control. A context-sensitive interpretation of the contact forces is used to guide the attention of the robot, to avoid obstacles, and to let the human operator move the robot arm directly. © 2013 IEEE.
10. Iossifidis, Ioannis: Utilizing Artificial Skin for Direct Physical Interaction. Proceedings article.
In: Proc. IEEE/RSJ International Conference on Robotics and Biomimetics (RoBio 2013), 2013.
Keywords: Autonomous robotics, direct physical interaction, haptic interface, human-robot collaboration, man-machine interaction.
Abstract: Autonomous robots with limited computational capacity call for control approaches that generate meaningful, goal-directed behavior without using a large amount of resources. The attractor dynamics approach to movement generation is a framework that links sensor data to motor commands via coupled dynamical systems that have attractors at behaviorally desired states. The low computational demands leave enough system resources for higher-level functions such as forming a sequence of local goals to reach a distant one. The comparatively high performance of local behavior generation allows the global planning to be relatively simple. In this paper, we apply the approach to generate walking trajectories for a small humanoid robot, the Aldebaran Nao, that are goal-directed and avoid obstacles. The sensor information comes from a single camera in the head of the robot; the limited field of vision is compensated by head movements. The design of the dynamical system for motion generation and the choice of state variables make a computationally expensive scene representation or local map building unnecessary.
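Note: the attractor dynamics approach to movement generation referred to above is commonly written as a differential equation for the heading direction, with an attractor at the target direction and repellers at obstacle directions. A generic form (not necessarily the exact parametrization used in this paper) is

\dot{\varphi} \;=\; -\lambda_{\mathrm{tar}}\,\sin\!\bigl(\varphi-\psi_{\mathrm{tar}}\bigr) \;+\; \sum_{i} \lambda_{\mathrm{obs},i}\,\bigl(\varphi-\psi_{\mathrm{obs},i}\bigr)\, \exp\!\left(-\frac{\bigl(\varphi-\psi_{\mathrm{obs},i}\bigr)^{2}}{2\sigma_{i}^{2}}\right)

where \psi_{\mathrm{tar}} and \psi_{\mathrm{obs},i} are the directions of the target and of obstacle i as seen from the robot, and the \lambda and \sigma parameters set the strengths and angular ranges of the attractive and repulsive contributions.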
2010
9. Sandamirskaya, Yulia; Lipinski, John; Iossifidis, Ioannis; Schöner, G.: Natural human-robot interaction through spatial language: a dynamic neural fields approach. Proceedings article.
In: Proc. 19th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2010), pp. 600–607, IEEE, 2010, ISSN: 1944-9445. URL: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5598671.
Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, man-machine interaction, movement model, speech recognition.
8. Zibner, Stephan K. U.; Faubel, Christian; Iossifidis, Ioannis; Schöner, Gregor: Scene Representation Based on Dynamic Field Theory: From Human to Machine. Article.
In: Front. Comput. Neurosci., Conference Abstract: Bernstein Conference on Computational Neuroscience, 2010. DOI: 10.3389/conf.fncom.2010.51.00019.
Keywords: dynamic neural field, Dynamical systems, man-machine interaction, scene representation, speech recognition.
7. Zibner, Stephan K. U.; Faubel, Christian; Iossifidis, Ioannis; Schöner, Gregor: Scene Representation for Anthropomorphic Robots: A Dynamic Neural Field Approach. Proceedings article.
In: ISR / ROBOTIK 2010, VDE VERLAG GmbH, Munich, Germany, 2010. URL: http://www.vde-verlag.de/proceedings-en/453273138.html.
Keywords: Autonomous robotics, dynamic neural field, Dynamical systems, man-machine interaction, scene representation, speech recognition.
Abstract: An internal representation of a scene is essential for generating actions on scene objects. A stabilized store of object locations and features offers the flexibility to process queries phrased in human terms relating to objects that may not be in the current camera view. Scene representation is therefore an internal representation of the surrounding world that is stabilized against head and body movement. It contains associated information about the locations and features of objects. Because objects and bodies move, scene representation is not a one-time process but a constantly adapting mechanism of scanning for, storing, updating, and deleting information.
Our novel architecture incorporates the generation of autonomous scanning sequences on real-time camera images. The head can then be oriented towards a selected object and the color feature can be extracted. Object location and feature information are associatively stored in a three-dimensional Dynamic Neural Field. Changes in the scene, even for multiple objects, can be tracked simultaneously. The stored information is used to generate behavior for cued recall. Cues can be table regions, features, or object labels. The robot demonstrates a successful recall by centering its gaze on the stated object.
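Note: Dynamic Neural Fields of the kind used for this scene representation are typically governed by an Amari-type field equation; a generic one-dimensional form (the architecture described above uses a three-dimensional field over spatial and feature dimensions) is

\tau\,\dot{u}(x,t) \;=\; -u(x,t) \;+\; h \;+\; S(x,t) \;+\; \int w(x-x')\,f\bigl(u(x',t)\bigr)\,dx'

where u(x,t) is the field activation, h < 0 the resting level, S(x,t) the external input, w a local-excitation/lateral-inhibition interaction kernel, and f a sigmoidal output nonlinearity.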