Bidirectional Mental State Alignment for Human-Machine Collaboration

Xiaofeng Gao
PhD, 2022
Zhu, Song-Chun
For machines working alongside humans, it is necessary to understand humans' mental states, including their desires, beliefs, and intentions, to enable better interactions. In turn, humans need to understand machines' capabilities and limitations in order to trust and rely on them appropriately. Achieving such bidirectional mental state alignment is crucial to the success of human-machine collaboration. This dissertation addresses this core challenge from both directions, spanning the domains of embodied artificial intelligence, autonomous driving, and human-robot interaction. In the first direction, machines understanding humans, I propose a virtual environment in which embodied agents and human users work on daily activities via simulation. To enable embodied agents to better understand and execute human commands, I propose a benchmark that allows them to actively ask questions to resolve language ambiguities. For autonomous driving systems to maintain an accurate mental model of drivers, I propose a novel protocol to evaluate the effects of human-machine interfaces on drivers' situational awareness under different traffic conditions. In the second direction, I study how robots can generate communicative actions to be better understood by humans. I propose i) an action parsing algorithm based on an And-Or graph representation that generates explanations of task plans, and ii) a task and motion planning framework that calibrates human understanding of the robot's reachable workspace through expressive motions. My work culminates in a computational framework for bidirectional value alignment, evaluated in a human-machine collaborative scout exploration game.