Grasping

Summary

This section gives details about the grasping handler service in the grasping package. This service allows DE NIRO to move his arm to a specified location and grab an object.

Hardware Requirements

  • Baxter Robot

Software Requirements

  • Access to Baxter interface
  • Inverse kinematics solver (a modified version of ik_service_client.py is used here)

Setup Tests

The following steps are required to run this module independently of the brain:

  1. Switch into DE NIRO’s environment by running the baxter.sh script.

  2. In a terminal window, run roslaunch object_recognition detection_drivers.launch.

  3. In another window, run roslaunch object_recognition object_recognition.launch.

  4. In another window, run rosrun grasping grasping_status_server.py. The definition of the service class can be found in the GraspingStatus.srv file.

  5. Send a request to this server to initialise the robot, move the arms to the neutral position and calibrate the grippers.

    # GraspingStatusClient is assumed to be provided by the grasping package's client scripts
    from grasping.srv import GraspingStatus, GraspingStatusResponse
    client = GraspingStatusClient('grasping_status_service', GraspingStatus)
    client.make_request(True)  # initialise, move to neutral and calibrate the grippers
    

Implementation

The following figure gives a high-level overview of where DE NIRO moves his arm during the grasping sequence developed by Team Fezzik.

[Figure: grasp_seq.jpg - overview of the grasping sequence]

The service class for grasping has been defined in the GraspingHandler.srv file as

geometry_msgs/Point object_location
---
uint8 grasping_status

A geometry_msgs/Point instance has 3 attributes: x, y, z.

Note

We make the assumption that a point has been received from the Kinect which has a different coordinate system than DE NIRO. In the Kinect’s frame of reference, positive x is left, positive y is up and positive z is forward. In DE NIRO’s base frame of reference, positive x is forward and away from him, positive y is to his left and positive z is upwards.
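
Based on the axis conventions above, converting a Kinect-frame point into DE NIRO's base frame is essentially a reordering of the coordinates. The following sketch is illustrative only: the function name is hypothetical and any translation between the Kinect's origin and DE NIRO's base frame is ignored.

from geometry_msgs.msg import Point

def kinect_to_deniro(kinect_point):
    # Kinect frame:  +x left, +y up, +z forward
    # DE NIRO frame: +x forward, +y left, +z up
    # (Illustrative only - the offset between the two origins is ignored.)
    return Point(
        x=kinect_point.z,  # forward
        y=kinect_point.x,  # left
        z=kinect_point.y,  # up
    )

# Example: a point 0.8 m in front of the Kinect, 0.2 m to its left and 0.1 m up
deniro_point = kinect_to_deniro(Point(x=0.2, y=0.1, z=0.8))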

The specific arm to move (“right” or “left”) is set on the ROS parameter server as "limb_to_move", e.g.

rospy.set_param("limb_to_move", "right")
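
The corresponding lookup on the server side would be:

import rospy

limb = rospy.get_param("limb_to_move")  # "right" or "left"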

Server Behaviour

This describes the behaviour of the GraspingHandlerServer class found in grasping_handler_server.py.

There are three possible return statuses from the server: CANT_REACH=0, OBJECT_GRABBED=1 and OBJECT_MOVED=2.

Upon receiving a request, the Kinect coordinates will be transformed into DE NIRO’s frame of reference. DE NIRO will then attempt to move his arm, as specified on the ROS parameter server, to the requested location. He will do this via an intermediate point (currently -15cm along the x-axis).

At this point, if he is unable to move to this intermediate position, CANT_REACH will be returned. If successful, DE NIRO will move to this intermediate position and then attempt to move to the final position. Again, if he can’t reach this, CANT_REACH will be returned and the arm will be returned to the neutral position.
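
For reference, a minimal sketch of the return statuses and the intermediate point is given below, assuming the -15 cm is an offset applied to the target's x-coordinate in DE NIRO's base frame; the helper name is hypothetical.

from geometry_msgs.msg import Point

# Return statuses, matching the values given above
CANT_REACH = 0
OBJECT_GRABBED = 1
OBJECT_MOVED = 2

def intermediate_point(target, offset_x=-0.15):
    # Point 15 cm short of the target (closer to DE NIRO) along his x-axis
    return Point(x=target.x + offset_x, y=target.y, z=target.z)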

Note

The neutral position for DE NIRO’s left and right arms is set on the ROS parameter server in the brain/scripts/initialise_parameters.py file. The parameters are "left_arm_neutral" and "right_arm_neutral" respectively.

If DE NIRO is able to move his arm to the final position, he will attempt to grip the object. If the object has been moved, DE NIRO will detect this, as the gripper position will read 0. The arm will return to the neutral position and the server will return OBJECT_MOVED.

If he does grip the object, DE NIRO will raise the object up into the ‘Hero’ position to celebrate.
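
As an illustration of the gripper check described above, the snippet below uses the baxter_interface gripper API. It assumes the gripper has already been calibrated (Setup Tests, step 5); the exact check in grasping_handler_server.py may differ.

import rospy
import baxter_interface

rospy.init_node("gripper_check_example")
gripper = baxter_interface.Gripper(rospy.get_param("limb_to_move"))  # arm selected via the parameter above

gripper.close()
rospy.sleep(1.0)

# A fully closed gripper (position 0) means nothing was grabbed
if gripper.position() == 0:
    print("Object has been moved")
else:
    print("Object grabbed")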

Client Behaviour

This describes the behaviour of the GraspingHandlerClient class found in grasping_handler_client.py.

The client makes use of the ObjectRecognitionServer to receive the object’s location. A request is then made to the GraspingHandlerServer to move the arm to this location.

We noticed that there are certain ‘blind spots’ where DE NIRO cannot move his arm even though the object appears to be directly in front of him. To overcome this limitation, we programmed the client to make two requests, one with each arm. He will only use his second arm if he can’t reach the object with the first, and he will not do this if the object has been moved.

The order is decided based on the object’s location: if the object lies to DE NIRO’s left in his base frame, the first request uses his left arm and the second his right; if it lies to his right, the right arm goes first.
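
A minimal sketch of this ordering is shown below; it assumes the left/right test is done on the object's y-coordinate in DE NIRO's base frame (positive y is to his left, per the note above), and the actual check in grasping_handler_client.py may differ.

from geometry_msgs.msg import Point

def arm_order(object_location):
    # Try the arm nearest the object first, the other as a fallback
    if object_location.y > 0:  # object to DE NIRO's left
        return ["left", "right"]
    return ["right", "left"]

# Example: an object slightly to DE NIRO's left -> try the left arm first
print(arm_order(Point(x=0.6, y=0.2, z=0.1)))  # ['left', 'right']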

If the grab is successful, the "grabbed_object" ROS parameter is set to True; otherwise it is set to False. This informs how the OfferingObject state in the brain is executed.

Usage

Run the grasping handler server with rosrun grasping grasping_handler_server.py, then send a request to it. The service name has been set to grasping_handler_service; this can be changed in the file above.

import rospy

# ObjectRecognitionClient and GraspingHandlerClient are the client wrapper classes
# provided by the object_recognition and grasping packages respectively
from grasping.srv import GraspingHandler, GraspingHandlerResponse
from object_recognition.srv import ObjectRecognition, ObjectRecognitionResponse

client_object_location = ObjectRecognitionClient('object_recognition_service', ObjectRecognition)
rospy.set_param("requested_object", "water")  # Pick up water
object_location = client_object_location.make_request()

client_grasping = GraspingHandlerClient('grasping_handler_service', GraspingHandler)
client_grasping.make_request(object_location)

The response will be 0, 1 or 2 as defined above in the Server Behaviour section.
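
Continuing the example above, and assuming make_request returns the grasping_status field of the response (as suggested by the object recognition client, which returns the object's location), the result can be checked directly:

status = client_grasping.make_request(object_location)
if status == 1:    # OBJECT_GRABBED
    print("Object grabbed")
elif status == 2:  # OBJECT_MOVED
    print("Object was moved")
else:              # CANT_REACH
    print("Could not reach the object")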

Limitations

  • The inverse kinematics solver will sometimes return a sub-optimal trajectory, and there is a danger that such a path will cause DE NIRO to hit obstacles. Someone must therefore always be ready either to press the emergency stop or to manipulate his arms manually.