TeleSign: A Flexible and Intuitive Gesture-Based Teleoperation Framework — Academic Article in Scopus

abstract

  • This paper presents TeleSign, a novel and flexible gesture-based teleoperation framework inspired by American Sign Language. Using either a static or wearable RGB camera, TeleSign integrates Machine Learning with NVIDIA Isaac Sim and ROS2 to enable intuitive control of a manipulator's end effector through hand gestures, allowing for precise and responsive movements. We evaluated TeleSign through multiple experiments, assessing the performance of a Random Forest classifier, gathering user feedback, and testing its effectiveness in complex manipulation tasks. The results demonstrate strong performance across all areas, highlighting TeleSign's potential for efficient and intuitive robot teleoperation. © 2025 IEEE.
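The abstract mentions a Random Forest classifier mapping hand gestures to manipulator commands. As a rough illustration only (the paper's actual features and gesture set are not given here), the sketch below assumes MediaPipe-style hand landmarks (21 points × 3 coordinates) and placeholder gesture labels, trained on synthetic stand-in data:

```python
# Hypothetical sketch of gesture classification with a Random Forest.
# The 21-landmark x 3-coordinate feature layout and the gesture labels
# are assumptions, not details taken from the TeleSign paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_LANDMARKS, N_COORDS = 21, 3              # assumed per-frame feature layout
GESTURES = ["open_palm", "fist", "point"]  # placeholder gesture labels

# Synthetic stand-in data: one well-separated cluster of landmark
# vectors per gesture class.
X = np.vstack([rng.normal(loc=i, scale=0.1, size=(100, N_LANDMARKS * N_COORDS))
               for i in range(len(GESTURES))])
y = np.repeat(np.arange(len(GESTURES)), 100)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A new frame's landmark vector maps to a gesture label, which a
# teleoperation node could translate into an end-effector command.
frame = rng.normal(loc=1, scale=0.1, size=(1, N_LANDMARKS * N_COORDS))
print(GESTURES[clf.predict(frame)[0]])  # → fist
```

In a real pipeline, the landmark vectors would come from an RGB camera stream rather than synthetic clusters, and the predicted label would be published (e.g. over ROS2) as a motion command.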

publication date

  • January 1, 2025