Human Centered Robotic System Design

The goal of this project is to provide the foundations for addressing human-related concerns that arise in human-robot systems, where robots must perform tasks in the presence of (and in cooperation with) humans. In particular, we target a fundamental understanding of two issues that are crucial to integrating robotic systems into real-life, human-populated environments: first, how humans perceive autonomous mobile robots as a function of the robots' appearance and behavior; second, how to design and control mobile robots to improve the comfort and perceived safety of the people present in the environment.

Working with our collaborators in psychology, we have set up experiments that use galvanic skin response (GSR), heart rate, and head-motion sensors to measure a person's comfort level in real time as they interact with different UAVs in a virtual environment. This method allows rapid, inexpensive testing in an environment that is far more controlled than real-life flight. With this information we can determine which flight characteristics contribute to perceived discomfort and which flight paths are the most comfortable. We can also use the same virtual reality platform to test how the physical appearance of a vehicle affects a person's sense of safety.
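To make the measurement pipeline concrete, the sketch below shows one simple way to fuse windowed physiological signals into a single arousal score. This is a hypothetical illustration only: the feature choices, window size, and weights are assumptions for exposition, not the project's actual model (the publications below used deep recurrent neural networks for this regression).

```python
# Hypothetical sketch: fuse synchronized GSR and heart-rate streams into a
# per-window "arousal index". All feature definitions and weights here are
# illustrative assumptions, not the authors' pipeline.

from statistics import mean, stdev

def window_features(gsr, hr, win=10):
    """Slide a non-overlapping window over synchronized GSR and heart-rate
    samples; return per-window features: mean GSR, GSR rise, mean HR."""
    feats = []
    for i in range(0, len(gsr) - win + 1, win):
        g, h = gsr[i:i + win], hr[i:i + win]
        feats.append((mean(g), g[-1] - g[0], mean(h)))
    return feats

def arousal_index(feats, weights=(0.5, 0.3, 0.2)):
    """Z-score each feature column across windows, then take a weighted
    sum per window; higher values suggest higher physiological arousal."""
    cols = list(zip(*feats))
    zcols = []
    for col in cols:
        m, s = mean(col), stdev(col) or 1.0  # guard against zero spread
        zcols.append([(x - m) / s for x in col])
    return [sum(w * z for w, z in zip(weights, row)) for row in zip(*zcols)]
```

In a real experiment the windows would be time-aligned to UAV trajectory events (e.g., a close flyby), so that elevated index values can be attributed to specific flight characteristics.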



  • H.-J. Yoon and C. Widdowson, “Regression of Human Physiological Arousal Induced by Flying Robots Using Deep Recurrent Neural Networks,” Coordinated Science Lab Student Conference, University of Illinois at Urbana-Champaign, 2017.
  • C. Widdowson, H.-J. Yoon, V. Cichella, F. Wang, and N. Hovakimyan, “VR Environment for the Study of Co-Located Interaction Between Small UAVs and Humans,” 8th International Conference on Applied Human Factors and Ergonomics and the Affiliated Conferences, Los Angeles, USA, 2017.
  • T. Marinho, A. Lakshmanan, V. Cichella, C. Widdowson, H. Cui, R. Mitchell Jones, B. Sebastian, and C. Goudeseune, “VR Study of Human-Multicopter Interaction in a Residential Setting,” IEEE Virtual Reality (VR), March 2016.


NICER Robotics Awarded CSL Video of the Month
NICER Robotics – Virtual Reality with L1 Adaptive Control Quad

This project is supported by an NSF EAGER grant.