Robotic cooking through pose extraction from natural human cooking using OpenPose
Robotic cooking is a difficult task, as translating raw recipe instructions into robotic actions involving precise motions and tool handling is challenging. This paper introduces automated recipe-technique recognition based on recording the pose trajectories of human demonstrators carrying out a pancake recipe, as an intuitive way for untrained demonstrators to show a robot how to perform their recipe variant. A Kinect 2 sensor and the OpenPose neural network are used to extract key timings from the demonstrations, which are then replicated when the recipe is carried out by a UR5 arm. Comparing several human-cooked pancakes with their robot-replicated counterparts, the robot's pancake quality scores were only slightly inferior to the humans', suggesting that the selected key parameters encompass the most important variations in cooking technique. Finally, we discuss preliminary results on inferring the relationship between the cooking parameters and the quality scores.
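As a rough illustration of how key timings might be recovered from recorded pose trajectories, the sketch below detects candidate action events as peaks in wrist speed over a synthetic OpenPose-style keypoint track. This is an assumed approach, not the paper's actual method: the keypoint choice, the event-detection rule, and the threshold value are all hypothetical.

```python
import numpy as np

def extract_key_timings(wrist_xy, fps=30.0, speed_thresh=0.5):
    """Find candidate action timings as local peaks in wrist speed.

    wrist_xy: (T, 2) array of a 2D wrist keypoint over T frames
    (e.g., one OpenPose keypoint tracked through a demonstration).
    Returns times (in seconds) of speed maxima above speed_thresh.
    """
    # Frame-to-frame displacement, scaled by fps to get speed per second.
    speed = np.linalg.norm(np.diff(wrist_xy, axis=0), axis=1) * fps
    # Local maxima above the threshold mark candidate action events.
    peaks = [i for i in range(1, len(speed) - 1)
             if speed[i] > speed_thresh
             and speed[i] >= speed[i - 1]
             and speed[i] > speed[i + 1]]
    return np.array(peaks, dtype=float) / fps

# Synthetic trajectory: stationary wrist, then a brief "flip"-like burst.
T = 60
traj = np.zeros((T, 2))
traj[30:35, 1] = [0.1, 0.4, 0.9, 0.4, 0.1]  # short vertical excursion
print(extract_key_timings(traj, fps=30.0))   # one event near t ≈ 1.07 s
```

In practice the trajectory would come from OpenPose keypoints on Kinect 2 frames, and a smoothing step before peak detection would likely be needed to cope with keypoint jitter.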