Human-Robot Interaction in Virtual Reality

We have built a virtual-reality simulator to help design quadrotor robots that increase the independence of elderly persons living alone. The simulator lets us measure under what conditions a robot makes someone feel intrigued, alarmed, or unconcerned, extending to nonhumanoid robots the emotion studies previously done on robots with facial expressions. This improved model of how the virtual robot is perceived lets us rapidly refine its appearance and behavior, containing the combinatorial explosion of parameters that would otherwise have to be studied in costlier real-world experiments.

The residence and the robot are rendered with Unity and viewed through an Oculus Rift. Audio corresponding to flight maneuvers is resynthesized from recordings of an actual quadrotor, and flight dynamics using L1 adaptive control run in MATLAB and Simulink. Through menus operated by a Leap Motion hand tracker, the human subject commands the quadrotor to fly to various rooms (Unity waypoints) along precomputed, collision-free paths. The flight path is varied through a mathematical mapping to Laban Motion Factors (e.g., directness, sustainedness, lightness, and boundness) to provoke a range of responses. The subject's emotional state is estimated from heart rate, skin conductance, and head tilt measured by the Rift's IMU, the last serving as a quick proxy for discomfort.
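To illustrate the idea behind the Laban mapping, the sketch below shows one plausible way to bend and retime a straight waypoint segment according to normalized Laban factors. It is a minimal illustration, not the project's actual mapping; the struct fields, the detour and bobbing terms, and the constants are all assumptions.

```csharp
// Hypothetical sketch of a Laban-factor trajectory modulation (not the
// project's actual mapping): bend and retime a straight waypoint segment.
using UnityEngine;

public struct LabanFactors
{
    public float Directness;    // 1 = straight toward the goal, 0 = indirect/curved
    public float Sustainedness; // 1 = smooth, sustained timing, 0 = sudden
    public float Lightness;     // 1 = buoyant, 0 = heavy
    public float Boundness;     // 1 = restrained; would clamp acceleration (not shown)
}

public static class LabanTrajectory
{
    // Sample a position along the segment from 'from' to 'to' at normalized time t in [0, 1].
    public static Vector3 Sample(Vector3 from, Vector3 to, float t, LabanFactors f)
    {
        // Sustainedness blends linear timing toward an eased (smoothstep) profile.
        float eased = Mathf.Lerp(t, Mathf.SmoothStep(0f, 1f, t), f.Sustainedness);

        // Indirectness adds a lateral detour that vanishes at both endpoints.
        Vector3 straight = Vector3.Lerp(from, to, eased);
        Vector3 side = Vector3.Cross((to - from).normalized, Vector3.up);
        Vector3 pos = straight + side * (1f - f.Directness) * Mathf.Sin(eased * Mathf.PI);

        // Lightness adds a gentle vertical bob so the motion reads as buoyant.
        pos.y += f.Lightness * 0.15f * Mathf.Sin(eased * 2f * Mathf.PI);
        return pos;
    }
}
```

In a pipeline like ours, any such modulated path would still be checked against the precomputed collision-free corridor before being handed to the flight controller.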
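The head-tilt discomfort proxy can be computed directly from the headset pose that Unity exposes. The following is a minimal sketch under that assumption; the component name, smoothing scheme, and time constant are illustrative rather than the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch: a rough discomfort proxy from head tilt, assuming the
// Rift's head pose is available as a Transform (e.g., the VR camera) in Unity.
public class HeadTiltMonitor : MonoBehaviour
{
    [SerializeField] Transform head;           // headset transform driven by the Rift's IMU
    [SerializeField] float smoothingSeconds = 2f;

    public float SmoothedTiltDegrees { get; private set; }

    void Update()
    {
        // Angle between the head's up axis and world vertical: 0 when upright.
        float tilt = Vector3.Angle(head.up, Vector3.up);

        // Exponential moving average so brief glances do not register as discomfort.
        float alpha = Time.deltaTime / Mathf.Max(smoothingSeconds, Time.deltaTime);
        SmoothedTiltDegrees = Mathf.Lerp(SmoothedTiltDegrees, tilt, alpha);
    }
}
```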

[youtube]https://youtu.be/gRN_dfu94dY[/youtube]
Video submitted to the IEEE-VR Conference