Gesture-based-Feedback-in-Human-Robot-Interaction

The steps the robot takes in the simulation

Gesture-based Feedback in Human-Robot Interaction for Object Manipulation

Human-Robot Interaction is currently a highly active research area, with many advances in interfaces that give humans and robots bi-directional feedback about their intentions. However, in an industrial setting, current robot feedback methods struggle to deliver their messages successfully, since the environment makes it difficult and inconvenient for the user to perceive them. This paper proposes a novel method for robot feedback that adds social cues to robot movement in order to notify the human of the robot's intentions. Through the use of robotic gestures, we believe it is possible to successfully convey the robot's goals in interactions with humans. To verify this hypothesis, a proof of concept was developed in a simulated environment using a robotic arm manipulator that notifies the user with gestures when it needs to correct the pose of an object.

Implementation

Motion Planner Algorithm

Unity's Object Pose Estimation Demo was used as a reference for implementing the social cues, and the tutorial was modified to fit our needs.

First, the overall motion planner was improved by replacing the RRTConnect algorithm with RRT*. This change significantly improved the robot's movements. The animations below show examples of the robot's movements with the two motion planning algorithms (top: RRTConnect; bottom: RRT*).

Top: the robot's path constructed with the RRTConnect algorithm.
Bottom: the robot's path constructed with the RRT* algorithm.
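
In the Unity demo, trajectory planning is handled on the ROS side through MoveIt, so switching planners mostly comes down to requesting a different OMPL planner ID. The snippet below is a minimal sketch in Python with moveit_commander, not the repository's actual code: the planning group name "arm", the planner ID "RRTstar", and the named pose "home" are assumptions and must match the MoveIt configuration (ompl_planning.yaml and SRDF) used in practice.

```python
# Minimal sketch: selecting RRT* instead of RRTConnect through MoveIt's Python API.
# "arm", "RRTstar" and "home" are assumed names; they must exist in the MoveIt config.
import sys

import moveit_commander
import rospy

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("rrt_star_planner_sketch", anonymous=True)

arm = moveit_commander.MoveGroupCommander("arm")
arm.set_planner_id("RRTstar")   # OMPL uses its default planner (RRTConnect) if unset
arm.set_planning_time(5.0)      # RRT* keeps optimizing, so allow a larger time budget

arm.set_named_target("home")    # "home" is an assumed named pose from the SRDF
plan_result = arm.plan()        # return type differs between MoveIt releases
```

Because RRT* is asymptotically optimal while RRTConnect only searches for a feasible path, the extra planning time typically pays off in straighter, smoother trajectories, which is consistent with the difference visible in the animations above.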

Circular Motion

The implemented proof of concept consists of restricting the robot's pick-and-place action to a specific cube orientation. Incoming messages from Unity are therefore checked for the cube's orientation about the Y-axis. If the cube is in the correct orientation, the robot places it at the goal position. Otherwise, the robot notifies the user with a circular gesture that the cube needs to be rotated. The user can then rotate the cube using the keyboard.

The Robot's Behavior
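
The decision described above is a small piece of logic on the ROS side. The sketch below is illustrative only: the tolerance value and the helpers pick_and_place() and circular_gesture() are hypothetical stand-ins for the demo's actual pick-and-place routine and circular-gesture motion, and the Y rotation is assumed to arrive in degrees.

```python
# Illustrative sketch of the orientation check, not the repository's actual code.
# Assumes the cube's Y rotation arrives in degrees; the helpers below are stubs.

Y_TOLERANCE_DEG = 5.0  # assumed tolerance around the required orientation


def pick_and_place() -> None:
    print("Orientation OK: picking up the cube and placing it at the goal pose")


def circular_gesture() -> None:
    print("Wrong orientation: performing a circular gesture to ask for a rotation")


def handle_cube_pose(y_rotation_deg: float, target_y_deg: float = 0.0) -> None:
    """Choose between the normal pick-and-place routine and the feedback gesture."""
    # Wrap the angular error into [-180, 180) before comparing against the tolerance.
    error = abs((y_rotation_deg - target_y_deg + 180.0) % 360.0 - 180.0)
    if error <= Y_TOLERANCE_DEG:
        pick_and_place()
    else:
        circular_gesture()


if __name__ == "__main__":
    handle_cube_pose(y_rotation_deg=93.0)  # example: cube rotated 93 degrees -> gesture
```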
