Head control of a 3DoF robot arm using Visual-SLAM and IMU inspired by head-bobbing in birds
Birds have remarkable head-stabilization capabilities. A common example is head-bobbing, the characteristic back-and-forth head motion that birds exhibit while walking. This movement can be divided into two phases: a hold phase, during which the head remains stationary in space, and a thrust phase, during which the head moves rapidly forward. By using their necks to keep their heads still while the body moves, birds achieve stable vision during walking. Applying such movements to robots could enable a robot arm capable of stable manipulation while its base is moving. The goal of this study is to realize head-bobbing with a 3DoF robot arm. Birds are believed to rely primarily on their visual and vestibular systems to perform head-bobbing; accordingly, we estimate the head posture of the robot arm using Visual SLAM for vision and an IMU for the vestibular system. Based on the estimated head posture, PID control keeps the head stationary during the hold phase and drives a swift forward motion of the head during the thrust phase. The proposed system realizes head-bobbing without requiring a specific visual target or odometry during movement.
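The hold/thrust scheme described above can be illustrated with a minimal sketch: a PID loop driving a toy one-dimensional plant, where the setpoint is held fixed during the hold phase and stepped forward to start the thrust phase. All gains, the plant model, and the function names below are illustrative assumptions, not the controller used in this work.

```python
# Minimal 1D sketch of hold-phase / thrust-phase PID control.
# The plant, gains, and phase timing are illustrative assumptions.

def pid_step(state, setpoint, measurement, dt, kp, ki, kd):
    """One PID update; state carries (integral, previous_error)."""
    integral, prev_error = state
    error = setpoint - measurement
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

def simulate(duration=2.0, dt=0.01):
    """Hold the head at x=0 for 1 s, then thrust the setpoint to x=0.1 m."""
    x, v = 0.0, 0.0          # head position and velocity (toy double-integrator plant)
    state = (0.0, 0.0)
    trajectory = []
    t = 0.0
    while t < duration:
        setpoint = 0.0 if t < 1.0 else 0.1   # hold phase, then thrust phase
        u, state = pid_step(state, setpoint, x, dt, kp=40.0, ki=5.0, kd=8.0)
        v += u * dt          # treat controller output as commanded acceleration
        x += v * dt
        trajectory.append((t, x))
        t += dt
    return trajectory

traj = simulate()
```

In the real system, the measurement fed to the controller would come from the fused Visual-SLAM and IMU pose estimate rather than a simulated plant state.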

