CINACS
Cross-modal Interaction in Natural and Artificial Cognitive Systems
Topic: "Real world planning based on the retrieval of episodic memories"
Within this international graduate college we will investigate the principles of cross-modal interaction in natural cognitive systems in order to implement them in artificial systems. Research will primarily consider three sensory systems (vision, hearing and haptics) and their interactions. We will study multisensory interaction in natural systems with behavioural, electrophysiological and neuroimaging techniques. Different paradigms, including cross-modal association learning, sensorimotor control, cross-modal illusions and multisensory language perception, will be used to uncover the principles of multisensory processing and multimodal representation.
My part within the CINACS project mainly focuses on the development of grounded multi-modal memory in robots. I will investigate approaches to planning real-world perception and action based on multimodal memories. The system should support reasoning about subsequent actions with respect to the situated context by drawing on past experience. The information about past events - the experience - should therefore be stored in a kind of episodic memory, analogous to its human archetype.
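To make the idea of such an episodic store more concrete, the following sketch shows one possible representation of a single experience; the names and fields (Episode, EpisodicMemory, per-modality feature vectors, a scalar outcome) are illustrative assumptions, not part of the CINACS design.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Episode:
    """One stored experience: multimodal percepts, the action taken, and its outcome."""
    timestamp: float
    percepts: Dict[str, List[float]]   # feature vector per modality, e.g. "vision", "audio", "haptics"
    action: str                        # symbolic label of the executed action
    outcome: float                     # scalar success measure of that action (assumed)

@dataclass
class EpisodicMemory:
    """Append-only store of past episodes."""
    episodes: List[Episode] = field(default_factory=list)

    def store(self, episode: Episode) -> None:
        self.episodes.append(episode)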
Some of the main questions during my Ph.D. will be (a) how to represent episodic information, (b) how to access this memory through an episodic information retrieval system, and finally (c) how to plan and reason about the next action our TASER service robot should take in order to reach its goal.
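Questions (a) to (c) can be illustrated with a deliberately simple retrieve-and-select loop. Everything in the sketch below (flat situation feature vectors, cosine similarity as the retrieval measure, symbolic action labels) is a hypothetical stand-in for whatever representation the project eventually settles on.

import math
from typing import List, Tuple

# (a) A stored episode as (situation feature vector, action label, outcome score).
EpisodeRecord = Tuple[List[float], str, float]

def similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two situation descriptions."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm > 0 else 0.0

def retrieve(memory: List[EpisodeRecord], situation: List[float], k: int = 3) -> List[EpisodeRecord]:
    """(b) Retrieve the k episodes whose stored situation is most similar to the current one."""
    return sorted(memory, key=lambda ep: similarity(ep[0], situation), reverse=True)[:k]

def plan_next_action(memory: List[EpisodeRecord], situation: List[float]) -> str:
    """(c) Propose the action that worked best in the most similar past situations."""
    candidates = retrieve(memory, situation)
    best = max(candidates, key=lambda ep: ep[2])  # highest outcome among retrieved episodes
    return best[1]

# Hypothetical usage: two remembered situations and a new one to act in.
memory = [([1.0, 0.0, 0.2], "grasp_cup", 0.9),
          ([0.1, 1.0, 0.0], "open_door", 0.7)]
print(plan_next_action(memory, [0.9, 0.1, 0.3]))  # -> "grasp_cup"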