
Teaching area: Theoretical Computer Science and Artificial Intelligence
Office: 01.214
Lab: 04.105
Phone: +49 208 88254-806
E-mail:
Web: http://lab.iossifidis.net

Ioannis Iossifidis studied physics (with a focus on theoretical particle physics) at the University of Dortmund and received his doctorate in 2006 from the Faculty of Physics and Astronomy at Ruhr University Bochum.
At the Institut für Neuroinformatik, Prof. Dr. Iossifidis headed the Autonomous Robotics group and, together with his research group, successfully participated in numerous research projects in the field of artificial intelligence funded by the BMBF and the EU. Since 1 October 2010 he has been working at HRW in the Institut Informatik, where he holds the professorship for Theoretical Computer Science – Artificial Intelligence.
For more than 20 years, Prof. Dr. Ioannis Iossifidis has been developing biologically inspired, anthropomorphic, autonomous robot systems, which are both part and result of his research in computational neuroscience. In this context he has developed models of information processing in the human brain and applied them to technical systems.
The main focal points of his scientific work in recent years are the modeling of human arm movements, the design of so-called "simulated realities" for simulating and evaluating the interaction between humans, machines and the environment, and the development of cortical exoprosthetic components. The development of the theory and the application of machine learning algorithms based on deep neural architectures form the cross-cutting theme of his research.
Ioannis Iossifidis' research has been supported by funding within major research programs of the BMBF (NEUROS, MORPHA, LOKI, DESIRE, Bernstein Fokus: Neuronale Grundlagen des Lernens, etc.), the DFG ("Motor-parietal cortical neuroprosthesis with somatosensory feedback for restoring hand and arm functions in tetraplegic patients") and the EU (Neural Dynamics (STREP), EUCogII, EUCogIII), and he is among the winners of the 2019 Leitmarkt competitions Gesundheit.NRW and IKT.NRW.
WORK AND RESEARCH FOCUS AREAS
- Computational Neuroscience
- Brain-Computer Interfaces
- Development of cortical exoprosthetic components
- Theory of neural networks
- Modeling of human arm movements
- Simulated reality
SCIENTIFIC FACILITIES
- Lab (with link)
- ???
- ???
COURSES
- ???
- ???
- ???
PROJECTS
- Project (with link)
- ???
- ???
RESEARCH STAFF

Felix Grün
Office: 02.216 (Campus Bottrop)

Marie Schmidt
Office: 02.216 (Campus Bottrop)

Aline Xavier Fidencio
Visiting researcher

Muhammad Ayaz Hussain
Doctoral candidate

Tim Sziburis
Doctoral candidate

Farhad Rahmat
Student assistant
GOOGLE SCHOLAR PROFILE

Articles
Schmidt, Marie Dominique; Glasmachers, Tobias; Iossifidis, Ioannis
From Motion to Muscle Article
In: arXiv: 2201.11501 [cs.LG], 2022.
Links | BibTeX | Keywords: BCI, Machine Learning, movement model, muscle signal generator, recurrent neural network
@article{schmidt2022motion,
title = {From Motion to Muscle},
author = {Marie Dominique Schmidt and Tobias Glasmachers and Ioannis Iossifidis},
doi = {10.48550/arXiv.2201.11501},
year = {2022},
date = {2022-01-01},
urldate = {2022-01-01},
journal = {arXiv: 2201.11501 [cs.LG]},
keywords = {BCI, Machine Learning, movement model, muscle signal generator, recurrent neural network},
pubstate = {published},
tppubtype = {article}
}
Books
Iossifidis, Ioannis; Steinhage, A
Behavior generation for Anthropomorphic robots by means of dynamical systems Book
2005, ISSN: 1610-7438.
Abstract | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model
@book{Iossifidis2005c,
title = {Behavior generation for Anthropomorphic robots by means of dynamical systems},
author = {Ioannis Iossifidis and A Steinhage},
issn = {1610-7438},
year = {2005},
date = {2005-01-01},
urldate = {2005-01-01},
booktitle = {Springer Tracts in Advanced Robotics},
volume = {14},
abstract = {This article describes the current state of our research on anthropomorphic robots. Our aim is to make the reader familiar with the two basic principles our work is based on: anthropomorphism and dynamics. The principle of anthropomorphism means a restriction to human-like robots which use vision, audition and touch as their only sensors so that natural man-machine interaction is possible. The principle of dynamics stands for the mathematical framework based on which our robots generate their behavior. Both principles have their root in the idea that concepts of biological behavior and information processing can be exploited to control technical systems. © Springer-Verlag Berlin Heidelberg 2005.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {book}
}
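The abstract above describes the attractor-dynamics approach only in words. As a rough illustration (not the authors' implementation; all function names and parameter values are assumptions), the following Python sketch shows the kind of dynamical system commonly used for heading-direction control, with the target direction acting as an attractor and each obstacle as a local repellor:

# Illustrative sketch only, not the authors' code: attractor dynamics for a
# robot's heading direction phi. The target direction acts as an attractor,
# each obstacle as a local repellor; all parameter values are assumptions.
import numpy as np

def heading_rate(phi, psi_target, obstacles, lam=1.0):
    # Attractor term: pulls phi towards the target direction
    dphi = -lam * np.sin(phi - psi_target)
    # Repellor terms: push phi away from obstacle directions, with a strength
    # that decays with distance and acts only in a narrow angular range
    for psi_obs, dist in obstacles:
        strength = 5.0 * np.exp(-dist / 0.8)
        sigma = 0.4
        dphi += strength * (phi - psi_obs) * np.exp(-((phi - psi_obs) ** 2) / (2 * sigma ** 2))
    return dphi

# Euler integration: the heading relaxes to a target at 0.5 rad while
# avoiding an obstacle seen at 0.3 rad, 1.2 m away
phi, dt = 0.0, 0.01
for _ in range(2000):
    phi += dt * heading_rate(phi, psi_target=0.5, obstacles=[(0.3, 1.2)])
print(round(phi, 3))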
Prassler, Erwin; Lawitzky, Gisbert; Stopp, Andreas; Grunwald, Gerhard; Hägele, Martin; Dillmann, Rüdiger; Iossifidis, Ioannis
Advances in Human Robot Interaction (Springer Tracts in Advanced Robotics) Book
Springer, 2004, ISBN: 3540232117.
Links | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm
@book{Prassler2004c,
title = {Advances in Human Robot Interaction (Springer Tracts in Advanced Robotics)},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.amazon.co.uk/Advances-Interaction-Springer-Advanced-Robotics/dp/3540232117},
isbn = {3540232117},
year = {2004},
date = {2004-01-01},
pages = {414},
publisher = {Springer},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm},
pubstate = {published},
tppubtype = {book}
}
Prassler, Erwin; Lawitzky, Gisbert; Stopp, Andreas; Grunwald, Gerhard; Hägele, Martin; Dillmann, Rüdiger; Iossifidis, Ioannis
Advances in Human Robot Interaction Book
Springer Press, 2004.
Abstract | Links | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm
@book{Prassler2004b,
title = {Advances in Human Robot Interaction},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-102-22-35029562-0,00.html?changeHeader=true},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
pages = {414},
publisher = {Springer Press},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {Human Robot Interaction and Cooperation; Motion Coordination; Multi-Modal Robot Interfaces; Physical Interaction between Humans and Robots; Robot Learning; Visual Instruction of Robots},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, inverse kinematics, movement model, redundant robot arm},
pubstate = {published},
tppubtype = {book}
}
Prassler, Erwin; Lawitzky, Gisbert; Stopp, Andreas; Grunwald, Gerhard; Hägele, Martin; Dillmann, Rüdiger; Iossifidis, Ioannis
Advances in Human Robot Interaction Book
Springer Press, 2004.
Abstract | Links | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model
@book{Prassler2004,
title = {Advances in Human Robot Interaction},
author = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-102-22-35029562-0,00.html?changeHeader=true},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
number = {ISBN: 3-540-23211-7},
pages = {414},
publisher = {Springer Press},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {Human Robot Interaction and Cooperation
Motion Coordination
Multi-Modal Robot Interfaces
Physical Interaction between Humans and Robots
Robot Learning
Visual Instruction of Robots},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {book}
}
Conferences
Klaes, Christian; Iossifidis, Ioannis
Low dimensional representation of human arm movement for efficient neuroprosthetic control by individuals with tetraplegia Conference
SfN Meeting 2017, 2017.
BibTeX | Keywords: Autonomous robotics, BCI, Dynamical systems, movement model, neuroprosthetic
@conference{nokey,
title = {Low dimensional representation of human arm movement for efficient neuroprosthetic control by individuals with tetraplegia},
author = {Christian Klaes and Ioannis Iossifidis},
year = {2017},
date = {2017-01-01},
urldate = {2017-01-01},
booktitle = {SfN Meeting 2017},
keywords = {Autonomous robotics, BCI, Dynamical systems, movement model, neuroprosthetic},
pubstate = {published},
tppubtype = {conference}
}
Book Chapters
Iossifidis, Ioannis; Steinhage, Axel
Behavior Generation For Anthropomorphic Robots by Means of Dynamical Systems Book Chapter
In: Prassler, Erwin; Lawitzky, Gisbert; Stopp, Andreas; Grunwald, Gerhard; Hägele, Martin; Dillmann, Rüdiger; Iossifidis, Ioannis (Eds.): Advances in Human Robot Interaction, vol. 14/2004, pp. 269–300, Springer Press, 2004, ISBN: 3-540-23211-7.
Abstract | Links | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model
@incollection{Iossifidis2004e,
title = {Behavior Generation For Anthropomorphic Robots by Means of Dynamical Systems},
author = {Ioannis Iossifidis and Axel Steinhage},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springerlink.com/index/96DD6AB012CF71E7},
doi = {10.1007/b97960},
isbn = {3-540-23211-7},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
number = {ISBN: 3-540-23211-7},
pages = {269--300},
publisher = {Springer Press},
chapter = {6},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {This article describes the current state of our research on anthropomorphic robots. Our aim is to make the reader familiar with the two basic principles our work is based on: anthropomorphism and dynamics. The principle of anthropomorphism means a restriction to human-like robots which use vision, audition and touch as their only sensors so that natural man-machine interaction is possible. The principle of dynamics stands for the mathematical framework based on which our robots generate their behavior. Both principles have their root in the idea that concepts of biological behavior and information processing can be exploited to control technical systems.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {incollection}
}
Iossifidis, Ioannis; Bruckhoff, Carsten; Theis, Christoph; Grote, Claudia; Faubel, Christian; Schöner, Gregor
A Cooperative Robot Assistant CoRA For Human Environments Book Chapter
In: Prassler, Erwin; Lawitzky, Gisbert; Stopp, Andreas; Grunwald, Gerhard; Hägele, Martin; Dillmann, Rüdiger; Iossifidis, Ioannis (Eds.): Advances in Human Robot Interaction, vol. 14/2004, pp. 385–401, Springer Press, 2004, ISBN: 3-540-23211-7.
Abstract | Links | BibTeX | Keywords: arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model
@incollection{Iossifidis2004d,
title = {A Cooperative Robot Assistant CoRA For Human Environments},
author = {Ioannis Iossifidis and Carsten Bruckhoff and Christoph Theis and Claudia Grote and Christian Faubel and Gregor Schöner},
editor = {Erwin Prassler and Gisbert Lawitzky and Andreas Stopp and Gerhard Grunwald and Martin Hägele and Rüdiger Dillmann and Ioannis Iossifidis},
url = {http://www.springerlink.com/index/91656F7B99CD2C2C},
doi = {10.1007/b97960},
isbn = {3-540-23211-7},
year = {2004},
date = {2004-01-01},
booktitle = {Advances in Human Robot Interaction},
volume = {14/2004},
number = {ISBN: 3-540-23211-7},
pages = {385--401},
publisher = {Springer Press},
chapter = {7},
series = {Springer Tracts in Advanced Robotics STAR},
abstract = {CoRA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels comprising vision, audition, haptics, and force sensing are used to extract perceptual information about speech, gestures and gaze of the operator, and object recognition. The anthropomorphic robot arm makes goal-directed movements to pick up and hand-over objects. The human operator may mechanically interact with the arm by pushing it away (haptics) or by taking an object out of the robot's gripper (force sensing). The design objective has been to exploit the human operator's intuition by modeling the mechanical structure, the senses, and the behaviors of the assistant on human anatomy, human perception, and human motor behavior.},
keywords = {arm movement model, Autonomous robotics, behavior generation, Dynamical systems, movement model},
pubstate = {published},
tppubtype = {incollection}
}
Proceedings Articles
Sziburis, Tim; Blex, Susanne; Iossifidis, Ioannis
Variability Study of Human Hand Motion during 3D Center-out Tasks Captured for the Diagnosis of Movement Disorders Proceedings Article
In: BC23 : Computational Neuroscience & Neurotechnology Bernstein Conference 2022, BCCN Bernstein Network Computational Network, 2023.
Abstract | BibTeX | Keywords: movement model
@inproceedings{sziburisVariabilityStudyHuman2023,
title = {Variability Study of Human Hand Motion during 3D Center-out Tasks Captured for the Diagnosis of Movement Disorders},
author = {Tim Sziburis and Susanne Blex and Ioannis Iossifidis},
year = {2023},
date = {2023-09-15},
urldate = {2023-09-15},
booktitle = {BC23 : Computational Neuroscience & Neurotechnology Bernstein Conference 2022},
publisher = {BCCN Bernstein Network Computational Network},
abstract = {Variability analysis bears the potential to differentiate between healthy and pathological human movements [1]. Our study is conducted in the context of developing a portable glove for the diagnosis of movement disorders. This proposal has methodical as well as technical requirements. Generally, the identification of movement disorders via an analysis of motion data needs to be confirmed within the given setup. Typically, rhythmic movements like gait or posture control are examined for their variability, but here, the characteristic pathological traits of arm movement like tremors are under observation. In addition, the usability of a portable sensor instead of a stationary tracking system has to be validated. In this part of the project, human motion data are recorded redundantly by both an optical tracking system and an IMU. In our setup, a small cylinder is transported in three-dimensional space from a unified start position to one of nine target positions, which are equidistantly aligned on a semicircle. 10 trials are performed per target and hand, resulting in 180 trials per participant in total. 31 participants (11 female and 20 male) without known movement disorders, aged between 21 and 78 years, took part in the study. In addition, the 10-item EHI is used. The purpose of the analysis is to compare different variability measures to uncover differences between trials (intra-subject variability) and participants (inter-subject variability), especially in terms of age and handedness effects. Particularly, a novel variability measure is introduced which makes use of the characteristic planarity of the examined hand paths [2]. For this, the angle of the plane which best fits the travel phase of the trajectory is determined. In addition to neurological motivation, the advantage of this measure is that it allows the comparison of trials of different time spans and to different target directions without depending on trajectory warping. In the future, measurements of the same experimental setup with patients experiencing movement disorders are planned. For the subsequent pathological analysis, this study provides a basis in terms of methodological considerations and ground truth data of healthy participants. In parallel, the captured motion data are modelled utilizing dynamical systems (extended attractor dynamics approach). For this approach, the recorded and modelled data can be compared by the variability measures examined in this study.},
keywords = {movement model},
pubstate = {published},
tppubtype = {inproceedings}
}
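The abstract above mentions a planarity-based variability measure: the plane that best fits the travel phase of a hand path is determined and its angle compared across trials. The sketch below shows one standard way such a plane fit can be done (least-squares fit via SVD); it is an illustrative assumption, not the code used in the study, and all names are invented:

# Hypothetical illustration of a planarity-based measure: least-squares plane
# fit to a 3D hand path via SVD, returning the plane normal and its tilt
# relative to the horizontal. Not the code used in the study.
import numpy as np

def fit_trajectory_plane(points):
    """points: (N, 3) array of hand positions during the travel phase."""
    centered = points - points.mean(axis=0)
    # The right-singular vector with the smallest singular value is the
    # normal of the best-fitting plane.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    # Angle between the fitted movement plane and the horizontal (z-axis up)
    tilt_deg = np.degrees(np.arccos(abs(normal @ np.array([0.0, 0.0, 1.0]))))
    return normal, tilt_deg

# Example with a synthetic, slightly noisy, nearly planar center-out reach
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
x = 0.3 * t
y = 0.1 * np.sin(np.pi * t)
z = 0.5 * x + 0.2 * y                      # points lie in a tilted plane
trajectory = np.c_[x, y, z] + 1e-4 * rng.standard_normal((100, 3))
normal, tilt = fit_trajectory_plane(trajectory)
print(normal, tilt)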
Schmidt, Marie Dominique; Glasmachers, Tobias; Iossifidis, Ioannis
Motion Intention Prediction Proceedings Article
In: FENS, Forum 2022, FENS, Federation of European Neuroscience Societies, 2022.
Abstract | BibTeX | Keywords: BCI, Machine Learning, movement model, muscle signal generator, recurrent neural network
@inproceedings{schmidtMotionIntentionPrediction2022a,
title = {Motion Intention Prediction},
author = {Marie Dominique Schmidt and Tobias Glasmachers and Ioannis Iossifidis},
year = {2022},
date = {2022-01-01},
urldate = {2022-01-01},
booktitle = {FENS, Forum 2022},
publisher = {FENS, Federation of European Neuroscience Societies},
abstract = {Motion intention prediction is the key to robot-assisted rehabilitation systems. These can rely on various biological signals. One commonly used signal is the muscle activity measured by an electromyogram that occurs between 50-100 milliseconds before the actual movement, allowing a real-world application to assist in time. We show that upper limb motion can be estimated from the corresponding muscle activity. To this end, eight arm muscles are mapped to the joint angle, velocity, and acceleration of the shoulder, elbow, and wrist. For this purpose, we specifically develop an artificial neural network that estimates complex motions involving multiple upper limb joints. The network model is evaluated concerning its ability to generalize across subjects as well as for new motions. This is achieved through training on multiple subjects and additional transfer learning methods so that the prediction for new subjects is significantly improved. In particular, this is beneficial for a robust real-world application. Furthermore, we investigate the importance of the different parameters such as angle, velocity, and acceleration for simple and complex motions. Predictions for simple motions along with the main components of complex motions achieve excellent accuracy while joints that do not play a dominant role during the motion have comparatively lower accuracy.},
keywords = {BCI, Machine Learning, movement model, muscle signal generator, recurrent neural network},
pubstate = {published},
tppubtype = {inproceedings}
}
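As a rough illustration of the mapping described in the abstract above (eight EMG channels to joint angle, velocity and acceleration of shoulder, elbow and wrist), the sketch below shows a minimal recurrent regression model in PyTorch. The framework choice, the layer sizes and the assumption of seven joint angles are not taken from the paper:

# Minimal sketch, not the authors' model: a recurrent network mapping 8 EMG
# channels to angle, velocity and acceleration of an (assumed) 7-DoF arm.
import torch
import torch.nn as nn

class EMGToKinematics(nn.Module):
    def __init__(self, n_muscles=8, n_joints=7, hidden=128):
        super().__init__()
        # The recurrent layer can exploit the ~50-100 ms lead of muscle
        # activity over the actual movement mentioned in the abstract.
        self.rnn = nn.GRU(n_muscles, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3 * n_joints)   # angle, velocity, acceleration

    def forward(self, emg):            # emg: (batch, time, n_muscles)
        features, _ = self.rnn(emg)
        return self.head(features)     # (batch, time, 3 * n_joints)

# Dummy forward pass on random data, just to show the shapes
model = EMGToKinematics()
out = model(torch.randn(4, 200, 8))
print(out.shape)                       # torch.Size([4, 200, 21])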