Fruit Ninja
January 2024 - April 2024
For the course Robot Autonomy (16-662), I worked on the project Fruit Ninja. The goal was to emulate the mobile game “Fruit Ninja” with a Franka Emika Panda robotic arm in a simplified setting: a single “fruit” (an orange table tennis ball) is tossed into the workspace, and the arm must move to intersect its path. The system generally succeeds when the ball is tossed into the workspace, but that workspace is relatively small and the system is not robust. The accompanying video shows a few successful demonstrations.
First, the Azure Kinect DK depth camera captures an image of the scene, and a fine-tuned YOLOv8 model detects the ball in the frame. Next, the ball's 3D position is computed from the detection using the camera intrinsics and extrinsics and stored. The history of positions is then fed into the kinematic equations to predict where the ball will intersect the plane in which the robotic arm sits. This prediction is passed to the motion-planning subsystem, which commands the arm to take small but rapid steps that move the end effector toward the predicted landing position. Because the arm is already nearby while the landing estimate is being refined, it only needs small corrections to reach its final position.
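The back-projection from a detected pixel and its depth value to a 3D point can be sketched as follows. This is only an illustration: the intrinsic matrix K, the camera-to-robot-base transform, and the pixel and depth values below are placeholders, not the project's actual calibration.

    import numpy as np

    def pixel_to_world(u, v, depth_m, K, T_cam_to_base):
        """Convert a detected pixel (u, v) with measured depth into robot-base coordinates."""
        # Back-project through the pinhole model: p_cam = depth * K^-1 [u, v, 1]^T
        ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
        p_cam = depth_m * ray                    # 3D point in the camera frame
        p_cam_h = np.append(p_cam, 1.0)          # homogeneous coordinates
        p_base_h = T_cam_to_base @ p_cam_h       # apply the camera-to-base extrinsics
        return p_base_h[:3]

    # Example with made-up calibration values
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0, 0.0, 1.0]])
    T_cam_to_base = np.eye(4)                    # identity stands in for the real extrinsics
    print(pixel_to_world(350, 200, 1.2, K, T_cam_to_base))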
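The prediction step amounts to fitting a ballistic model to the timestamped position history and solving for the time at which the trajectory crosses the arm's plane. The sketch below assumes that plane is horizontal at height z = z_plane in the robot base frame and uses simple least-squares fits; the frame convention, plane orientation, and fitting method are illustrative assumptions, not necessarily how the project's code does it.

    import numpy as np

    def predict_landing(times, positions, z_plane, g=9.81):
        """Predict when and where the ball crosses the horizontal plane z = z_plane."""
        t = np.asarray(times, dtype=float)
        p = np.asarray(positions, dtype=float)   # shape (N, 3), robot-base frame

        # Least-squares fits: x and y follow constant-velocity motion,
        # z follows constant downward acceleration of magnitude g.
        x0, vx = np.polyfit(t, p[:, 0], 1)[::-1]
        y0, vy = np.polyfit(t, p[:, 1], 1)[::-1]
        z0, vz = np.polyfit(t, p[:, 2] + 0.5 * g * t**2, 1)[::-1]

        # Solve z0 + vz*t - 0.5*g*t^2 = z_plane and keep the later (descending) root.
        roots = np.roots([-0.5 * g, vz, z0 - z_plane])
        t_hit = max(roots.real)

        x_hit = x0 + vx * t_hit
        y_hit = y0 + vy * t_hit
        return t_hit, x_hit, y_hit

    # Example with a fabricated toss history (times in seconds, positions in metres)
    times = [0.00, 0.05, 0.10, 0.15]
    positions = [[1.50, 0.30, 0.80],
                 [1.40, 0.28, 0.86],
                 [1.30, 0.26, 0.90],
                 [1.20, 0.24, 0.92]]
    print(predict_landing(times, positions, z_plane=0.10))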
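The "small but rapid steps" behaviour can be sketched as a bounded move toward the latest prediction at every control tick, so the arm keeps closing in while the estimate is still changing. The function below is a hypothetical illustration of that idea, not the actual Franka control interface used in the project.

    import numpy as np

    def step_toward(current_pos, target_pos, max_step=0.02):
        """Return the next end-effector position, at most max_step metres closer to the target."""
        delta = np.asarray(target_pos, dtype=float) - np.asarray(current_pos, dtype=float)
        dist = np.linalg.norm(delta)
        if dist <= max_step:
            return np.asarray(target_pos, dtype=float)   # close enough: go straight to the target
        return np.asarray(current_pos, dtype=float) + (max_step / dist) * delta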
The code for the perception system is hosted on GitHub.