
Biometrically Secure Human Robot Interaction

Researchers will develop behavioral biometric algorithms to secure non-verbal, proximal human-robot interaction systems, equipping autonomous robots to assess the behavior of their human partners.

Project funded by: CCI Hub


Project Investigators

Principal Investigator (PI): Patrick Martin, assistant professor, Virginia Commonwealth University Department of Electrical and Computer Engineering

Co-PI: Kate Sicchio, assistant professor, Virginia Commonwealth University Department of Dance and Choreography and Department of Kinetic Imaging

Co-PI: Joseph Shelton, assistant professor, Virginia State University Department of Computer Science

Rationale and Background

Autonomous robots will soon be frequent collaborators across multiple economic sectors, including agriculture, manufacturing, health care, transportation, and national security.

Present-day human-robot interaction (HRI) methods will be augmented, or even supplanted, by non-verbal communication.

When autonomous systems are integrated into applications where humans work in close proximity, security flaws could lead to physical harm and the loss of privileged information. Recent studies have demonstrated multiple cybersecurity issues with HRI, but those studies focus on computing and networking aspects.

As non-verbal, proximal HRI becomes a more prevalent communication method, the potential for cyber-physical security issues increases.

Methodology

The technical approach will investigate methods for robots to learn models of trustworthy, gesture-based HRI and use those models to assess whether a human's interaction is nominal or a potential threat to the task.
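To make this assessment concrete, the sketch below treats trustworthy gestures as the inlier class of a one-class model and flags outliers as potential threats. The fixed-length feature vectors (stand-ins for pose-derived gesture features) and the choice of a one-class SVM are illustrative assumptions, not the project's actual method.

    # Minimal sketch: flag gestures that deviate from trusted interaction data.
    # Feature layout and model choice (one-class SVM) are illustrative assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)

    # Stand-in training data: 200 gestures x 16 features (e.g., flattened
    # pose-keypoint statistics) recorded during trusted interactions.
    trusted_gestures = rng.normal(size=(200, 16))

    # Fit an inlier model of nominal, trustworthy gesture behavior.
    model = make_pipeline(StandardScaler(), OneClassSVM(nu=0.05, gamma="scale"))
    model.fit(trusted_gestures)

    def assess_gesture(features: np.ndarray) -> bool:
        """True if the gesture looks nominal; False if potentially a threat."""
        return model.predict(features.reshape(1, -1))[0] == 1  # +1 inlier, -1 outlier

    new_gesture = rng.normal(size=16)
    print("nominal" if assess_gesture(new_gesture) else "potential threat")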

The project will leverage expertise from movement and dance, autonomous robotics, and behavioral biometrics to develop new HRI data sets and human interaction assessment algorithms. 

Dance and choreography are key to this work, as choreographers create situations for movement-based, non-verbal communication and dancers perform goal-oriented motions to express intent. 

These skills will support the creation of motion data sets and shape how the project's algorithmic trust model identifies abnormal human interactions and how autonomous robots should respond to them.
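As a purely illustrative example of what one record in such a motion data set might hold, the hypothetical layout below pairs the choreographed intent with the raw pose data; every field name is an assumption, not the project's published format.

    # Hypothetical record layout for a gesture-based proximal HRI data set.
    # All field names are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class GestureSample:
        participant_id: str           # anonymized performer identifier
        gesture_label: str            # choreographed intent, e.g. "handoff_request"
        keypoints: list[list[float]]  # per-frame pose keypoints (frames x joints * 2)
        frame_rate_hz: float          # capture rate of the motion recording
        trusted: bool                 # whether the interaction was scripted as nominal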

The team will execute a series of research and development tasks:

  • Gesture-based HRI prototype and data collection.
  • Human trustworthiness modeling.
  • Evaluation and performance development.
  • Inverting the relationship such that the robot assesses the user’s trustworthiness.
  • Incorporating gesture-based biometric data as part of the user assessment (see the fusion sketch after this list).
  • Measuring the effectiveness of the gesture-based HRI and human trustworthiness model with user studies at the end of each development iteration.
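The sketch below illustrates how the last two assessment tasks might compose: a gesture-biometric identity score and a behavioral score are fused into a single trust estimate that selects a graded robot response. The weights, thresholds, and response strings are all hypothetical.

    # Hypothetical fusion of gesture-biometric and behavioral evidence into a
    # trust score that gates the robot's response. Weights, thresholds, and
    # responses are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Assessment:
        biometric_match: float  # [0, 1] similarity to the enrolled partner's gestures
        behavior_score: float   # [0, 1] how nominal the current motion appears

    def trust_score(a: Assessment, w_biometric: float = 0.6) -> float:
        """Weighted fusion of identity and behavior evidence."""
        return w_biometric * a.biometric_match + (1.0 - w_biometric) * a.behavior_score

    def robot_response(a: Assessment) -> str:
        """Grade the response rather than making a hard accept/reject decision."""
        t = trust_score(a)
        if t >= 0.8:
            return "continue collaborative task"
        if t >= 0.5:
            return "slow down and request a confirmation gesture"
        return "pause task and retreat to a safe standoff distance"

    print(robot_response(Assessment(biometric_match=0.9, behavior_score=0.4)))

A graded response lets the robot degrade gracefully under suspicion rather than refusing all interaction on a single low score.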

Projected Outcomes

This research will increase autonomous robots’ ability to secure human interaction by assessing the proximal, biometric behavior of their human partners. Researchers will:

  • Develop a gesture-based proximal HRI dataset to inform algorithm development: Provide a new, open HRI dataset that other researchers can leverage.
  • Develop a human trustworthiness framework using biometric and behavioral data: Secure human-robot teaming by including gesture biometrics and user behaviors.
  • Create a live performance that integrates humans and machines with secure HRI: Raise public awareness of this proposal’s intellectual and technological results.
  • Provide PIs with the preliminary results required to develop proposals for federally funded research: Researchers anticipate that the resulting intellectual property will be distinctive enough to support patents related to securing human-robot interaction.