Teaching area: Theoretical Computer Science and Artificial Intelligence
Office: 01.214
Lab: 04.105
Phone: +49 208 88254-806
E-mail:
Ioannis Iossifidis studied physics (specializing in theoretical particle physics) at the University of Dortmund and received his doctorate in 2006 from the Faculty of Physics and Astronomy at Ruhr University Bochum.
At the Institut für Neuroinformatik, Prof. Dr. Iossifidis headed the Autonomous Robotics group and, together with his research group, successfully took part in numerous BMBF- and EU-funded research projects in the field of artificial intelligence. Since 1 October 2010 he has been working at the HRW in the Institut Informatik, where he holds the professorship for Theoretical Computer Science – Artificial Intelligence.
For more than 20 years, Prof. Dr. Ioannis Iossifidis has been developing biologically inspired, anthropomorphic, autonomous robot systems that are both part and product of his research in computational neuroscience. In this context he has developed models of information processing in the human brain and applied them to technical systems.
Established focal points of his scientific work in recent years are the modeling of human arm movements, the design of so-called "simulated realities" for simulating and evaluating the interactions between humans, machines, and their environment, and the development of cortical exoprosthetic components. The development of the theory and application of machine-learning algorithms based on deep neural architectures forms the cross-cutting theme of his research.
Ioannis Iossifidis' research has been supported by funding within major BMBF programs (NEUROS, MORPHA, LOKI, DESIRE, Bernstein Focus: Neural Basis of Learning, among others), the DFG ("Motor-parietal cortical neuroprosthesis with somatosensory feedback for restoring hand and arm functions in tetraplegic patients"), and the EU (Neural Dynamics (STREP), EUCogII, EUCogIII), and he is among the winners of the NRW lead-market competitions Gesundheit.NRW and IKT.NRW 2019.
WORK AND RESEARCH FOCUS
- Computational Neuroscience
- Brain-Computer Interfaces
- Development of cortical exoprosthetic components
- Theory of neural networks
- Modeling of human arm movements
- Simulated reality
RESEARCH STAFF
Felix Grün
Office: 02.216 (Campus Bottrop)
Marie Schmidt
Office: 02.216 (Campus Bottrop)
Aline Xavier Fidencio
Visiting researcher
Muhammad Ayaz Hussain
Doctoral researcher
Tim Sziburis
Doctoral researcher
Farhad Rahmat
Student assistant
SELECTED PUBLICATIONS
2001
9. Iossifidis, Ioannis; Steinhage, Axel
Dynamical systems: A framework for man machine interaction (Book)
2001, ISBN: 9608052440.
Abstract | BibTeX | Keywords: Anthropomorphic Robot, Dynamic Approach, Dynamical systems, Machine Learning
@book{Iossifidis2001a,
title = {Dynamical systems: A framework for man machine interaction},
author = {Ioannis Iossifidis and A Steinhage},
isbn = {9608052440},
year = {2001},
date = {2001-01-01},
urldate = {2001-01-01},
booktitle = {Advances in Automation, Multimedia and Video Systems, and Modern Computer Science},
abstract = {We present an architecture to generate behavior for anthropomorphic robots. The goal is to equip the robots with the capacity to interact naturally with a human sharing the same interaction-channels. Motivated by the research on biological systems, our basic assumption is that the behavior to perform determines the external and internal structure of the behaving system. We describe the anthropomorphic design of our robots and present a distributed control system that generates human-like navigation and manipulation behavior. As the mathematical framework for this purpose we have developed a control system which is entirely based on dynamical systems in the form of instantiated dynamics and neural fields.},
keywords = {Anthropomorphic Robot, Dynamic Approach, Dynamical systems, Machine Learning},
pubstate = {published},
tppubtype = {book}
}
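The attractor-dynamics approach described in this abstract can be illustrated with a minimal heading-direction system. This is a sketch under assumed parameters: the rate constants, the Gaussian repellor shape, and all numeric values below are illustrative choices, not taken from the book.

```python
import math

# Illustrative sketch (parameters are assumptions, not from the publication):
# a robot's heading phi evolves under an attractor at the target direction
# and Gaussian-shaped repellors at obstacle directions.

def heading_rate(phi, psi_target, obstacle_dirs=(), lam=2.0, beta=4.0, sigma=0.4):
    """Time derivative of the heading direction phi (radians)."""
    dphi = -lam * math.sin(phi - psi_target)  # attractor at psi_target
    for psi_obs in obstacle_dirs:
        diff = phi - psi_obs
        # repellor: pushes phi away from the obstacle direction
        dphi += beta * diff * math.exp(-diff ** 2 / (2 * sigma ** 2))
    return dphi

def simulate_heading(phi0, psi_target, obstacle_dirs=(), dt=0.01, steps=2000):
    """Euler-integrate the heading dynamics from the initial heading phi0."""
    phi = phi0
    for _ in range(steps):
        phi += dt * heading_rate(phi, psi_target, obstacle_dirs)
    return phi

# With no obstacles, the heading relaxes onto the target direction.
final_heading = simulate_heading(phi0=1.0, psi_target=0.0)
```

The point of the formulation is that behavior emerges from the stable states of the differential equation: adding an obstacle term simply deforms the vector field rather than requiring explicit replanning.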
8. Iossifidis, Ioannis; Steinhage, Axel
Dynamical Systems: A Framework for Man Machine Interaction (Proceedings Article)
In: Kluev, V V; D'Attellis, C E; Mastorakis, N E (Eds.): Proceedings of the International Conference On Automation and Information: Theory and Applications 2001 (AITA 2001), WSES Press, Skiathos Island, Greece, 2001.
Abstract | BibTeX | Keywords: Anthropomorphic Robot, Dynamic Approach, Dynamical systems, Machine Learning
@inproceedings{Iossifidis2001b,
title = {Dynamical Systems: A Framework for Man Machine Interaction},
author = {Ioannis Iossifidis and Axel Steinhage},
editor = {V V Kluev and C E D'Attellis and N E Mastorakis},
year = {2001},
date = {2001-01-01},
booktitle = {Proceedings of the International Conference On Automation and Information: Theory and Applications 2001, (AITA 2001)},
number = {ISBN:960-8052-44-0},
publisher = {WSES Press},
address = {Skiathos Island, Greece},
organization = {WSES},
abstract = {We present an architecture to generate behavior for anthropomorphic robots. The goal is to equip the robots with the capacity to interact naturally with a human sharing the same interaction- channels. Motivated by the research on biological systems, our basic assumption is that the behavior to perform determines the external and internal structure of the behaving system. We describe the anthropomorphic design of our robots and present a distributed control system that generates human- like navigation and manipulation behavior. As the mathematical framework for this purpose we have developed a control system which is entirely based on dynamical systems in the form of instantiated dynamics and neural fields.},
keywords = {Anthropomorphic Robot, Dynamic Approach, Dynamical systems, Machine Learning},
pubstate = {published},
tppubtype = {inproceedings}
}
7. Iossifidis, Ioannis; Steinhage, Axel
Control of an 8 DoF Manipulator by Means of Neural Fields (Proceedings Article)
In: Halme, Aarne; Chatila, Raja; Prassler, Erwin (Eds.): Proceedings of the International Conference On Field and Service Robotics, FSR2001, Yleisjäljennös-Painopörssi, Helsinki, Finland, 2001.
Abstract | BibTeX | Keywords: inverse kinematics, Machine Learning, neural fields, robot manipulator control
@inproceedings{Iossifidis2001c,
title = {Control of an 8 DoF Manipulator by Means of Neural Fields},
author = {Ioannis Iossifidis and Axel Steinhage},
editor = {Aarne Halme and Raja Chatila and Erwin Prassler},
year = {2001},
date = {2001-01-01},
booktitle = {Proceedings of the International Conference On Field and Service Robotics, FSR2001},
publisher = {Yleisjäljennös-Painopörssi},
address = {Helsinki, Finland},
abstract = {In this article we present a new approach for the control of a redundant robot arm. Our approach contains two parts: first, we have implemented a concept which deals with the underdetermined problem of the inverse kinematics of robot arms with more than six degrees of freedom. This concept guarantees a one-to-one mapping between the task coordinates (position and orientation) and the joint coordinates of the robot arm and allows to use the additional degrees of freedom for additional task requirements such as obstacle avoidance and smoothness of the trajectory. Second, we apply a mathematical concept known as neural fields to the control of the end-effector's position. The application of neural fields to the problem of trajectory generation solves two problems: a smooth end-effector trajectory is generated and obstacles are avoided. After presenting our hardware platform, an anthropomorphic assistance robot, we will describe the basic concepts of our approach.},
keywords = {inverse kinematics, Machine Learning, neural fields, robot manipulator control},
pubstate = {published},
tppubtype = {inproceedings}
}
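The neural fields mentioned in this abstract are dynamic fields of the Amari type: a peak of activation self-stabilizes at the location of a localized input and can encode, e.g., a movement target. Below is a minimal one-dimensional sketch; the time constant, resting level, kernel widths, and amplitudes are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

# Illustrative sketch of a 1-D Amari-style neural field (all parameter
# values are assumptions): local excitation plus global inhibition lets a
# self-stabilized activation peak form at the site of a localized stimulus.

def simulate_field(stim_center, n=101, length=10.0, steps=400, dt=0.1,
                   tau=1.0, h=-2.0):
    x = np.linspace(0.0, length, n)
    dx = x[1] - x[0]
    u = np.full(n, h)                          # activation at resting level h
    dist = x[:, None] - x[None, :]
    # interaction kernel: narrow excitation minus constant inhibition
    W = 3.0 * np.exp(-dist ** 2 / (2 * 0.5 ** 2)) - 1.0
    stimulus = 4.0 * np.exp(-(x - stim_center) ** 2 / (2 * 0.5 ** 2))
    for _ in range(steps):
        f = 1.0 / (1.0 + np.exp(-5.0 * u))     # sigmoid output nonlinearity
        du = -u + h + stimulus + (W @ f) * dx  # Amari field equation (Euler)
        u += dt / tau * du
    return x, u

x, u = simulate_field(stim_center=3.0)
peak_location = x[np.argmax(u)]   # the peak forms at the stimulus location
```

In the manipulator setting, such a peak over task space can represent the currently selected end-effector target, with obstacles entering as inhibitory inputs that shift where the peak can form.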
6. Theis, Christoph; Iossifidis, Ioannis; Steinhage, Axel
Image processing methods for interactive robot control (Proceedings Article)
In: Proc. 10th IEEE International Workshop on Robot and Human Interactive Communication, pp. 424–429, 2001.
Abstract | Links | BibTeX | Keywords: active stereo camera system, human hand tracking, human pointing gesture, image processing, interactive robot control, Machine Learning, multi-modal man-machine interaction system
@inproceedings{Theis2001,
title = {Image processing methods for interactive robot control},
author = {Christoph Theis and Ioannis Iossifidis and Axel Steinhage},
doi = {10.1109/ROMAN.2001.981941},
year = {2001},
date = {2001-01-01},
urldate = {2001-01-01},
booktitle = {Proc. 10th IEEE International Workshop on Robot and Human Interactive Communication},
pages = {424--429},
abstract = {In this paper we describe a straightforward technique for tracking a human hand based on images acquired by an active stereo camera system. We demonstrate the implementation of this method on an anthropomorphic assistance robot as part of a multi-modal man-machine interaction system: detecting the hand-position, the robot can interpret a human pointing gesture as the specification of a target object to grasp.},
keywords = {active stereo camera system, human hand tracking, human pointing gesture, image processing, interactive robot control, Machine Learning, multi-modal man-machine interaction system},
pubstate = {published},
tppubtype = {inproceedings}
}
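The geometric core of localizing a hand with a stereo camera system is triangulation; for a rectified pair it reduces to depth from disparity, Z = f·B/d. A minimal sketch follows; the focal length, baseline, and pixel coordinates are made-up example values, not the hardware parameters of the robot described above.

```python
# Illustrative sketch: depth from disparity for a rectified, calibrated
# stereo pair. Focal length, baseline, and pixel coordinates below are
# hypothetical example values.

def depth_from_disparity(focal_px, baseline_m, x_left, x_right):
    """Triangulated depth Z = f * B / d for a rectified stereo pair.

    focal_px: focal length in pixels; baseline_m: camera separation in
    meters; x_left/x_right: horizontal image coordinates of the same point.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

# Example: f = 700 px, baseline 0.12 m, disparity 28 px.
z = depth_from_disparity(700.0, 0.12, x_left=350.0, x_right=322.0)
```

An active (pan/tilt) stereo head as described in the abstract adds the gaze angles on top of this, but the disparity relation remains the depth cue once the images are rectified.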
5. Seelen, Werner; Iossifidis, Ioannis; Steinhage, Axel
Visually guided behavior of an autonomous robot with a neuronal architecture (Proceedings Article)
In: 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, CIRA 2001, IEEE, Banff, Canada, 2001.
Abstract | BibTeX | Keywords: active stereo camera system, Autonomous robotics, human hand tracking, human pointing gesture, image processing, interactive robot control, Machine Learning, multi-modal man-machine interaction system
@inproceedings{Seelen2001,
title = {Visually guided behavior of an autonomous robot with a neuronal architecture},
author = {Werner Seelen and Ioannis Iossifidis and Axel Steinhage},
year = {2001},
date = {2001-01-01},
booktitle = {2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, CIRA 2001},
address = {Banff, Canada},
organization = {IEEE},
series = {CIRA 2001, Workshop Vision-Based Object Recognition in Robotics},
abstract = {We constructed two Robot Systems. Both have a "neuronal architecture". The first (ARNOLD) is able to explore visually an unknown environment, to navigate in this environment and to use his 7DOF-arm to grasp and transport objects. The system can be guided by gestures and a limited set of spoken commands. The second system (CORA) is stationary and shall cooperate with a human at a production line in an interactive assembly process. Our contribution is focused on the vision problems. In both cases we use a 2DOF stereo camera system. The visual navigation is based on "place fields" obtained by correlating the current view with stored views captured at strategic points. This can be combined with a trajectory finding on the basis of nonlinear dynamics. Obstacles are avoided by repellors in the trajectory-equation and by inverse perspective mapping. Position and form of objects are evaluated in the sense of finding an appropriate grasping configuration for selected objects. The scene analysis in the CORA-system presupposes the estimation of the view-direction of the human partner. Then a limited set of objects can be detected and tracked if this is necessary (Hausdorff distance). The actual analysis of the entire scene relies on the relation of the detected objects to each other within the environment, on the task to be fulfilled and on the step that is reached within the entire task. The different necessary estimations and detections within the sequences are coded in terms of Neural fields. In this way the visual perception, the interactive communication and the visually guided behaviour is realised in the same format.},
keywords = {active stereo camera system, Autonomous robotics, human hand tracking, human pointing gesture, image processing, interactive robot control, Machine Learning, multi-modal man-machine interaction system},
pubstate = {published},
tppubtype = {inproceedings}
}
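The "place fields" obtained by correlating the current view with stored views, as described in this abstract, can be sketched as a nearest-view lookup by normalized correlation. The synthetic views below are random vectors purely for illustration; the helper names are hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): recognize the
# current place by normalized correlation of the current view against a
# set of stored reference views captured at strategic points.

def normalized(v):
    """Zero-mean, unit-variance version of a view vector."""
    v = np.asarray(v, dtype=float)
    return (v - v.mean()) / (v.std() + 1e-12)

def best_place(current_view, stored_views):
    """Index of the stored view most correlated with the current view."""
    c = normalized(current_view)
    scores = [float(np.mean(c * normalized(v))) for v in stored_views]
    return int(np.argmax(scores))

# Synthetic example: a noisy copy of stored view 2 still matches view 2.
rng = np.random.default_rng(0)
stored = [rng.normal(size=64) for _ in range(4)]
current = stored[2] + 0.1 * rng.normal(size=64)
match = best_place(current, stored)
```

The per-view correlation scores form the "place field" activation over stored locations; combined with the nonlinear trajectory dynamics mentioned in the abstract, the strongest response selects the navigation reference.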
1999
3. Iossifidis, Ioannis
Visuelle Navigation auf einem autonomen mobilen Roboter (PhD Thesis)
Fakultät für Physik, Technische Universität Dortmund, 1999.
BibTeX | Keywords: Autonomous robotics, Machine Learning, place cells, robot navigation
@phdthesis{Iossifidis1999c,
title = {Visuelle Navigation auf einem autonomen mobilen Roboter},
author = {Ioannis Iossifidis},
year = {1999},
date = {1999-01-01},
school = {Fakultät für Physik, Technische Universität Dortmund},
keywords = {Autonomous robotics, Machine Learning, place cells, robot navigation},
pubstate = {published},
tppubtype = {phdthesis}
}