Cognitive robotics for collaboration
For robots to become an effective component of our society, they must become primarily cognitive systems, endowed with a cognitive architecture that enables them to adapt, acquire experience, act and interact proactively and predictively with the environment, and communicate with human partners. Human communication depends on mutual understanding: I know how to communicate because I entertain and adapt a model of you, which enables me to select an effective way to convey what I want and to have an intuition of your internal states (what you need, prefer, fear or desire). Such intuition enables me to perceive properties that would otherwise be inaccessible to my perception, such as goals, emotions or effort, and to establish a common ground, a shared perception, with the partner.

Our contribution to the roadmap toward the architecture of cognitive systems leverages a humanoid robot (iCub) as a tool for investigating human cognitive abilities. Moreover, the robot plays a crucial role in testing some of our assumptions on how to build a cognitive interactive agent. We attempt to model the minimal skills necessary for cognitive development, starting from the visual features that enable the recognition of the presence and behavior of other agents in the scene and foster automatic coordination in human-robot interactive tasks. In a dual approach, we are trying to understand how to modulate robot behavior to elicit better human understanding and to express different characteristics of the interaction, from the mood to the level of commitment. This approach lays the groundwork for the creation of a cognitive system by helping define what is relevant to attend to, starting from signals originating from the intrinsic characteristics of the human body.
In parallel, we are investigating novel learning frameworks to enable the robot to adapt autonomously to the partner's needs and preferences and to foster a solid relationship with the machine that extends beyond a single interaction. We believe that only a structured effort toward cognition will, in the future, lead to more humane machines, able to see the world and people as we do and to engage with them in a meaningful manner.
Alessandra received her Ph.D. in Humanoid Technologies from the University of Genova (Italy) in 2010. After a postdoc at the Italian Institute of Technology (IIT) and two research stays in the USA and Japan, she became the scientific lead of the Cognitive Robotics and Interaction Laboratory of the RBCS Department at IIT. After serving as Assistant Professor in Bioengineering at DIBRIS, University of Genoa, she is now a Tenure-Track Researcher at the Italian Institute of Technology, where she heads the COgNiTive Architecture for Collaborative Technologies (CONTACT) unit. In 2018 she was awarded the ERC Starting Grant wHiSPER, focused on the investigation of joint perception between humans and robots. She has published more than 60 papers and abstracts and participated in the coordination of the CODEFROR European IRSES project. She is an Associate Editor of Robotics and Autonomous Systems, Cognitive Systems Research, and the International Journal of Humanoid Robotics, and she has served on the Program Committee of the International Conference on Human-Agent Interaction and the IEEE International Conference on Development and Learning and Epigenetic Robotics. The scientific aim of her research is to investigate the sensory and motor mechanisms underlying mutual understanding in human-human and human-robot interaction.
2020-11-03 at 3:00 pm