Personal service robots will soon be part of our world, assisting humans in their daily chores. A highly efficient way to communicate with people is through basic gestures. In this work, we present an efficient body-gesture interface that gives the user practical control of a personal service robot. The robot interprets two body gestures of the subject and performs the actions associated with them. The service robot's setup consists of a Pioneer P3-DX research robot, a Kinect sensor, and a portable workstation. The gesture recognition system tracks the user's skeleton to obtain the relative 3D positions of the body parts. In addition, the system takes depth images from the sensor and extracts their Haar features, which are used to train the AdaBoost algorithm that classifies the gesture. The system was developed using the ROS framework and showed good performance during experimental evaluation with users. Our body-gesture-based interface may serve as a baseline for developing practical and natural interfaces to communicate with service robots in the near future.