BioInstrumentation

Vision-Based Human-Robot Interaction

Navid Fallahinia, Ph.D. Candidate

Human-robot interaction (HRI) refers to communication and interaction between a human and a robotic system, either directly through physical contact or through a user interface such as gesture recognition. Although advances in physical human-robot interaction (pHRI) have opened up novel application domains, these techniques depend heavily on the ability to sense the physical interactions between the human and the environment. In vision-based human-robot interaction (vHRI), by contrast, visual feedback of the human pose and gestures is used to detect and recognize meaningful expressions of human movement and intent. However, because no force measurement is used, the robotic system has very limited knowledge of the interaction forces between the human and the object or environment during grasping or manipulation tasks. This project aims to bridge the gap between these two approaches by using the fingernail imaging method, which relies solely on visual feedback to estimate both human hand pose and grasp force in real time.
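To illustrate the core idea behind force estimation from fingernail images, the sketch below fits a simple least-squares linear model mapping registered nail-image pixel intensities to a 3-D fingertip force. This is a minimal illustration with synthetic data, not the lab's actual calibration pipeline; the image size, sample counts, and the linear coloration model are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each registered fingernail image is flattened into a
# vector of pixel intensities; a force/torque sensor provides ground-truth
# 3-D force during calibration grasps.
n_pixels = 64      # flattened nail-image patch (illustrative size)
n_samples = 200    # calibration frames

# Synthetic "true" linear coloration model: force -> pixel intensities.
A_true = rng.normal(size=(n_pixels, 3))
forces = rng.uniform(0.0, 10.0, size=(n_samples, 3))   # N, per axis
images = forces @ A_true.T + 0.05 * rng.normal(size=(n_samples, n_pixels))

# Least-squares calibration: solve images @ W ~= forces for W.
W, *_ = np.linalg.lstsq(images, forces, rcond=None)

# Predict force for a new frame from its pixel intensities alone.
f_new = np.array([3.0, 1.0, 5.0])
img_new = A_true @ f_new
f_est = img_new @ W
print(np.round(f_est, 2))   # approximately [3, 1, 5]
```

A linear map is only a first-order stand-in; the group's recent work cited below replaces it with deep neural networks to capture the nonlinear relationship between nail coloration and force.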

This research will enable a co-robot to detect the individual finger forces of a human partner using a technique that does not interfere with the human's haptic sense. This capability has a wide range of potential applications: vision-based HRI can support the development of ergonomic assistive devices and rehabilitation technology, and inferring human intent for collaborative robot planning can improve visually guided surgical robotic systems.


Human Grasp Study via Fingernail Imaging


Navid Fallahinia, Ph.D. Candidate

Fingernail imaging has proven to be an effective method for estimating human contact forces on multiple fingers during grasping or manipulation using vision alone. Because the method requires no instrumented objects, it extends to any grasped object. Moreover, fingernail imaging estimates force without constraining how the human grasps an object, including finger placement and the distribution of force among the fingers. However, fingernail imaging is limited by the need to maintain a proper view of the human hand and fingers with one or more cameras. To overcome this problem, we propose a scalable system of multiple collaborative robots with cameras mounted on their end-effectors. These robots follow the motion of the human's hand, continuously imaging the fingernails to track finger forces as the human manipulates objects.
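The hand-following behavior can be sketched as a simple image-based visual servoing loop: each camera-equipped robot drives the fingernail's centroid toward the image center with a proportional control law. The gains, camera model, and function names below are illustrative assumptions, not the lab's actual controller.

```python
import numpy as np

IMAGE_CENTER = np.array([320.0, 240.0])  # pixels, assumed 640x480 camera
GAIN = 0.5                               # proportional gain (1/s)
PIX_PER_M = 2000.0                       # assumed pixels-per-meter at the hand
DT = 0.02                                # 50 Hz control loop

def servo_step(cam_pos, nail_centroid_px):
    """One control step: move the camera so as to reduce the pixel error
    between the detected nail centroid and the image center."""
    err_px = nail_centroid_px - IMAGE_CENTER
    vel = GAIN * err_px / PIX_PER_M      # camera velocity in the image plane (m/s)
    return cam_pos + vel * DT

# Simulate a stationary fingertip; the camera starts offset, so the nail
# appears away from the image center, and the loop re-centers it.
finger_pos = np.array([0.10, 0.05])      # m, in a fixed plane
cam_pos = np.array([0.0, 0.0])
for _ in range(500):
    # Nail centroid in pixels given the camera offset (pinhole-style sketch).
    centroid = IMAGE_CENTER + PIX_PER_M * (finger_pos - cam_pos)
    cam_pos = servo_step(cam_pos, centroid)

print(np.round(cam_pos, 3))   # converges toward the fingertip position
```

In the proposed system, the same error-driven principle extends to multiple robots in 3-D, each keeping one or more fingernails in view as the hand moves.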

This system provides a method for totally unconstrained grasp force measurement during real dexterous manipulation tasks. The increased understanding of precision grasping could be applied in machine learning and artificial manipulators, as robots observe and duplicate grasps learned from human subjects. A co-robot could learn to imitate the grasp forces, not simply the kinematic motions, of a human teacher. Unconstrained grasping experiments using fingernail imaging may also improve the development of prosthetic devices by deepening our understanding of human grasping and its relationship with the central nervous system.


Publications:

  • Fallahinia, N., and Mascaro, S. (2020). Real-Time Tactile Force Estimation of Fingertip via Fingernail Imaging Using Deep Neural Networks. IEEE Robotics and Automation Letters (Under Review).
  • Fallahinia, N., and Mascaro, S. (2020). The Effect of Contact Surface on Tactile Touch Force Prediction Using Fingernail Imaging. IEEE Transactions on Haptics (Under Review).
  • Grieve, T., Fallahinia, N., Hollerbach, J., and Mascaro, S. (2020). Image Registration Using Active Appearance Models for Calibration of Fingernail Imaging. Computer Vision and Image Understanding (Under Review).
  • Fallahinia, N., and Mascaro, S. (2020). Comparison of Constrained and Unconstrained Human Grasp Forces Using Fingernail Imaging and Visual Servoing. 2020 IEEE International Conference on Robotics and Automation (ICRA).
  • Fallahinia, N., and Mascaro, S. (2020). The Effect of Contact Surface Curvature on the Accuracy of Fingernail Imaging for Tactile Force Measurement. 2020 IEEE Haptics Symposium.
  • Fallahinia, N., and Mascaro, S. (2019). Feasibility Study of Force Measurement for Multi-Digit Unconstrained Grasping via Fingernail Imaging and Visual Servoing. ASME Letters in Dynamic Systems and Control.
  • Fallahinia, N., Harris, S., and Mascaro, S. (2018). Grasp Force Sensing Using Visual Servoing and Fingernail Imaging. ASME 2018 Dynamic Systems and Control Conference.
  • Grieve, T. R., Hollerbach, J. M., and Mascaro, S. A. (2016). 3-D Fingertip Touch Force Prediction Using Fingernail Imaging with Automated Calibration. IEEE Transactions on Robotics.
  • Grieve, T. R., Hollerbach, J. M., and Mascaro, S. A. (2016). Optimizing Fingernail Imaging Calibration for 3-D Force Magnitude Prediction. IEEE Transactions on Haptics.
  • Grieve, T. R., Doyle, C. E., Hollerbach, J. M., and Mascaro, S. A. (2014). Calibration of Fingernail Imaging for Multidigit Force Measurement. 2014 IEEE Haptics Symposium.
  • Grieve, T. R., Hollerbach, J. M., and Mascaro, S. A. (2013). Fingernail Image Registration Using Active Appearance Models. 2013 IEEE International Conference on Robotics and Automation (ICRA).
  • Grieve, T. R., Hollerbach, J. M., and Mascaro, S. A. (2013). Force Prediction by Fingernail Imaging Using Active Appearance Models. 2013 World Haptics Conference.