RAPP Platform Wiki
v0.6.0
RAPP Platform is a collection of ROS nodes and back-end processes that aim to deliver ready-to-use generic services to robots
In this tutorial we will showcase how to create a more complex application that utilizes external (non-RAPP) packages. The application will be created as a ROS node, in order to take advantage of other ROS packages.
In this application we will track an object using the NAO front camera and the OpenTLD algorithm, along with the OpenTLD ROS wrapper developed by the PANDORA robotics team.
Thus, for this tutorial we will need:

- This application's code, which can be found here in the form of a ROS package.
- The NaoQi Python libraries: check [here](#naoqi-python-libraries-setup) for details.
- The RAPP Robots API libraries: check [here](#rapp-robots-api-libraries-setup) for details.
- ROS Indigo: the installation procedure for the Ubuntu 14.04 distribution can be found here.
After ROS installation, install the NaoQi ROS packages by:
```bash
sudo apt-get install ros-indigo-naoqi-*
sudo apt-get install ros-indigo-nao-bringup ros-indigo-nao-apps ros-indigo-usb-cam
```
Initially, a catkin workspace must be created and the ROS packages cloned into it:
```bash
mkdir -p ~/rapp_nao/rapps && cd ~/rapp_nao/rapps
mkdir -p rapp_catkin_ws/src && cd rapp_catkin_ws/src
catkin_init_workspace
git clone https://github.com/pandora-auth-ros-pkg/pandora_tld.git
git clone https://github.com/pandora-auth-ros-pkg/open_tld.git
cd ~/rapp_nao/rapps/rapp_catkin_ws && catkin_make -j1
echo 'source ~/rapp_nao/rapps/rapp_catkin_ws/devel/setup.bash --extend' >> ~/.bashrc
source ~/.bashrc
```
Next, you must change TLD's input image topic, in order to bind its callback to the NAO camera stream. The `~/rapp_nao/rapps/rapp_catkin_ws/src/pandora_tld/config/predator_topics.yaml` file must include the following:
```yaml
predator_alert: /vision/predator_alert
input_image_topic: /nao_robot/camera/top/camera/image_raw
begin_hunt_topic: /predator/hunt
```
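Before moving on, you can verify that this topic is actually being published once the NAO drivers (started via the `nao_bringup.launch` file created later in this tutorial) are running. Below is a minimal sketch, not part of the tutorial's package; the node name `camera_topic_check` is arbitrary:

```python
#!/usr/bin/env python
# Sanity check: block until one frame arrives on the topic that TLD
# will consume, then print its dimensions.
import rospy
from sensor_msgs.msg import Image

rospy.init_node('camera_topic_check', anonymous=True)
msg = rospy.wait_for_message('/nao_robot/camera/top/camera/image_raw', \
    Image, timeout=10.0)
print "Received a %dx%d image" % (msg.width, msg.height)
```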
Next, a ROS package must be created to host the code. The package will have `rospy` as a dependency and will be named `tld_tracker_nao`:
```bash
cd ~/rapp_nao/rapps/rapp_catkin_ws/src
catkin_create_pkg tld_tracker_nao rospy geometry_msgs sensor_msgs std_msgs
mkdir -p tld_tracker_nao/scripts
touch tld_tracker_nao/scripts/tracker.py
chmod +x tld_tracker_nao/scripts/tracker.py
```
The contents of `tracker.py` follow:
```python
#!/usr/bin/env python

from rapp_robot_api import RappRobot
from geometry_msgs.msg import Polygon
from geometry_msgs.msg import Twist
from naoqi_bridge_msgs.msg import JointAnglesWithSpeed

import rospy
import sys
import time

class NaoTldTracker:

    def __init__(self):
        self.rh = RappRobot()
        # Use naoqi_bridge to move the head
        self.joint_pub = rospy.Publisher('/joint_angles', \
            JointAnglesWithSpeed, queue_size=1)
        # NAO stands up
        self.rh.motion.enableMotors()
        self.rh.humanoid_motion.goToPosture("Stand", 0.7)

        self.lost_object_counter = 20
        self.lock_motion = False
        self.hunt_initiated = False

        # These are the NAO body velocities. They are automatically published
        # in the self.set_velocities_callback
        self.x_vel = 0
        self.y_vel = 0
        self.theta_vel = 0

        # Subscription to the TLD tracking alerts
        self.predator_sub = rospy.Subscriber("/vision/predator_alert", \
            Polygon, self.track_bounding_box)

        # Timer callback to check if tracking has been lost
        rospy.Timer(rospy.Duration(0.1), self.lost_object_callback)
        # Callback to set the velocities periodically
        rospy.Timer(rospy.Duration(0.1), self.set_velocities_callback)

    # Callback of the TLD tracker. Called when the object has been detected
    def track_bounding_box(self, polygon):
        self.hunt_initiated = True
        # Set the timeout counter to 2 seconds
        self.lost_object_counter = 20

        # Velocities message initialization
        joint = JointAnglesWithSpeed()
        joint.joint_names.append("HeadYaw")
        joint.joint_names.append("HeadPitch")
        joint.speed = 0.1
        joint.relative = True

        # Get center of detected object and calculate the head turns
        target_x = polygon.points[0].x + 0.5 * polygon.points[1].x
        target_y = polygon.points[0].y + 0.5 * polygon.points[1].y

        sub_x = target_x - 320 / 2.0
        sub_y = target_y - 240 / 2.0

        var_x = (sub_x / 160.0)
        var_y = (sub_y / 120.0)

        joint.joint_angles.append(-var_x * 0.05)
        joint.joint_angles.append(var_y * 0.05)

        ans = self.rh.humanoid_motion.getJointAngles(['HeadYaw', 'HeadPitch'])
        head_pitch = ans['angles'][1]
        head_yaw = ans['angles'][0]

        # Get the sonar measurements
        sonars = self.rh.sensors.getSonarsMeasurements()

        # Check if NAO is close to an obstacle
        if sonars['sonars']['front_left'] <= 0.3 or \
                sonars['sonars']['front_right'] <= 0.3:
            self.lock_motion = True
            rospy.loginfo("Locked due to sonars")
        # Check if NAO's head looks way too far down or up
        elif head_pitch >= 0.4 or head_pitch <= -0.4:
            self.lock_motion = True
            rospy.loginfo("Locked due to head pitch")
        # Else approach the object
        elif self.lock_motion is False:
            self.theta_vel = head_yaw * 0.1
            if -0.2 < head_yaw < 0.2:
                print "Approaching"
                self.x_vel = 0.5
                self.y_vel = 0.0
            else:
                self.x_vel = 0.0
                self.y_vel = 0.0
                print "Centering"
            self.joint_pub.publish(joint)

        # Check the battery levels
        batt = self.rh.sensors.getBatteryLevels()
        battery = batt['levels'][0]
        if battery < 25:
            self.rh.audio.setVolume(100)
            self.rh.audio.speak("My battery is low")
            self.predator_sub.unregister()
            self.rh.humanoid_motion.goToPosture("Sit", 0.7)
            self.rh.motion.disableMotors()
            sys.exit(1)

    # Callback invoked every 0.1 seconds to check for loss of object tracking
    def lost_object_callback(self, event):
        # Continues only after the user has selected an object
        if self.hunt_initiated:
            self.lost_object_counter -= 1
            # If 2 seconds have passed without tracking activity the robot stops
            if self.lost_object_counter < 0:
                self.lock_motion = True
                self.x_vel = 0.0
                self.y_vel = 0.0
                self.theta_vel = 0.0
                rospy.loginfo("Locked due to 2 seconds of non-tracking")
                self.predator_sub.unregister()

    # Callback to periodically (0.1 sec) set the velocities, except when
    # the robot has locked its motion
    def set_velocities_callback(self, event):
        if not self.lock_motion:
            self.rh.motion.moveByVelocity(self.x_vel, self.y_vel, \
                self.theta_vel)
        else:
            self.rh.motion.moveByVelocity(0, 0, 0)

if __name__ == "__main__":
    rospy.init_node('nao_tld_tracker', anonymous=True)
    nao = NaoTldTracker()
    rospy.spin()
```
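To make the head-centering math in `track_bounding_box` more concrete, here is a small worked example. It assumes a 320x240 image and that the TLD polygon's first point is the bounding box's top-left corner while its second point carries the box width and height (the convention the callback above relies on); the detection values are hypothetical:

```python
# Hypothetical detection: a 60x40 box whose top-left corner is at (200, 90)
top_left_x, top_left_y = 200.0, 90.0
width, height = 60.0, 40.0

# Center of the bounding box in pixel coordinates
target_x = top_left_x + 0.5 * width   # 230.0
target_y = top_left_y + 0.5 * height  # 110.0

# Offset from the image center (320x240), normalized to [-1, 1]
var_x = (target_x - 320 / 2.0) / 160.0  # 0.4375 -> object is to the right
var_y = (target_y - 240 / 2.0) / 120.0  # ~-0.083 -> object is slightly high

# The callback commands small relative head steps toward the object:
# negative yaw turns the head right, negative pitch tilts it up
print "HeadYaw step: %f rad" % (-var_x * 0.05)
print "HeadPitch step: %f rad" % (var_y * 0.05)
```

Since the steps are relative and capped at 0.05 rad, the head converges smoothly on the object over successive detections instead of overshooting.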
The next step is to create a ROS launcher to deploy the node:
```bash
cd ~/rapp_nao/rapps/rapp_catkin_ws/src/tld_tracker_nao/
mkdir launch && cd launch
touch tld_tracker_nao.launch
```
The `tld_tracker_nao.launch` file must contain:
```xml
<launch>
    <arg name="nao_ip" default="$(optenv NAO_IP 127.0.0.1)" />

    <!-- NOTE: the <include> elements were lost when this page was exported;
         the targets below are a best-effort reconstruction. Bring up the
         NAO drivers via the launcher created in the next step, and start
         the TLD tracker from the pandora_tld package (adjust the file name
         to the launcher shipped with your pandora_tld checkout). -->
    <include file="$(find tld_tracker_nao)/launch/nao_bringup.launch">
        <arg name="nao_ip" value="$(arg nao_ip)" />
    </include>
    <include file="$(find pandora_tld)/launch/predator.launch" />

    <node name="nao_tld_tracker_node" type="tracker.py" pkg="tld_tracker_nao" output="screen" />
</launch>
```
Furthermore, we need another launch file called `nao_bringup.launch`. First we create it:
```bash
cd ~/rapp_nao/rapps/rapp_catkin_ws/src/tld_tracker_nao/launch
touch nao_bringup.launch
```
And then we fill it with:
```xml
<launch>
    <arg name="nao_ip" />
    <arg name="nao_port" default="$(optenv NAO_PORT 9559)" />

    <node pkg="diagnostic_aggregator" type="aggregator_node" name="diag_agg" clear_params="true">
        <rosparam command="load" file="$(find nao_bringup)/config/nao_analysers.yaml" />
    </node>

    <!-- NOTE: the <include> elements were lost when this page was exported.
         This file mirrors the standard nao_bringup launcher from the
         ros-naoqi stack, so the targets below are a best-effort
         reconstruction of the launchers the surviving arguments belong to. -->

    <!-- Upload the NAO robot model (V40) -->
    <include file="$(find nao_description)/launch/robot_state_publisher.launch">
        <arg name="version" value="V40" />
    </include>

    <!-- Joint states and basic actuators -->
    <include file="$(find naoqi_driver_py)/launch/naoqi_driver.launch">
        <arg name="nao_ip" value="$(arg nao_ip)" />
    </include>

    <!-- Pose manager -->
    <include file="$(find naoqi_pose)/launch/pose_manager.launch">
        <arg name="nao_ip" value="$(arg nao_ip)" />
    </include>

    <!-- Walker -->
    <include file="$(find nao_apps)/launch/walker.launch">
        <arg name="nao_ip" value="$(arg nao_ip)" />
        <arg name="nao_port" value="$(arg nao_port)" />
    </include>

    <!-- Top and bottom cameras -->
    <include file="$(find naoqi_sensors_py)/launch/camera.launch" ns="nao_robot/camera/top">
        <arg name="nao_ip" value="$(arg nao_ip)" />
        <arg name="resolution" value="1" />
    </include>
    <include file="$(find naoqi_sensors_py)/launch/camera.launch" ns="nao_robot/camera/bottom">
        <arg name="nao_ip" value="$(arg nao_ip)" />
        <arg name="source" value="1" />
    </include>

    <!-- Left and right sonars -->
    <include file="$(find naoqi_sensors_py)/launch/sonar.launch" ns="nao_robot/sonar/left">
        <arg name="nao_ip" value="$(arg nao_ip)" />
        <arg name="memory_key" value="Device/SubDeviceList/US/Left/Sensor/Value" />
        <arg name="frame_id" value="LSonar_frame" />
    </include>
    <include file="$(find naoqi_sensors_py)/launch/sonar.launch" ns="nao_robot/sonar/right">
        <arg name="nao_ip" value="$(arg nao_ip)" />
        <arg name="memory_key" value="Device/SubDeviceList/US/Right/Sensor/Value" />
        <arg name="frame_id" value="RSonar_frame" />
    </include>

    <!-- Tactile buttons -->
    <include file="$(find nao_apps)/launch/tactile.launch">
        <arg name="nao_ip" value="$(arg nao_ip)" />
    </include>
</launch>
```
Finally, the application can be launched by:
```bash
roslaunch tld_tracker_nao tld_tracker_nao.launch
```
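Since the launch file resolves `$(optenv NAO_IP ...)`, exporting `NAO_IP` before launching is enough to point everything at the robot. Alternatively, if you want to start the application from a Python script, below is a minimal sketch using the standard `roslaunch` Python API (this is an optional alternative, not part of the tutorial's package):

```python
#!/usr/bin/env python
# Start tld_tracker_nao.launch programmatically, equivalent to the
# roslaunch command above. Assumes the workspace has been sourced.
import roslaunch

uuid = roslaunch.rlutil.get_or_generate_uuid(None, False)
roslaunch.configure_logging(uuid)

# Resolve the launch file path from the package and file names
launch_file = roslaunch.rlutil.resolve_launch_arguments(
    ['tld_tracker_nao', 'tld_tracker_nao.launch'])[0]

parent = roslaunch.parent.ROSLaunchParent(uuid, [launch_file])
parent.start()
try:
    parent.spin()
except KeyboardInterrupt:
    parent.shutdown()
```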
If everything is correct, a TLD window should appear, showing the live stream from the NAO front camera. Click on this window, press `r`, and select a rectangle with your mouse for the NAO to track. Keep in mind that the object to track must have sufficient visual features for TLD to succeed (e.g. a white wall won't work).
Then, NAO should center the object in its view by moving its head and start moving towards it. There are four reasons for NAO to stop, corresponding to the checks in `tracker.py`:

- a sonar measures an obstacle closer than 0.3 m,
- its head pitch exceeds ±0.4 rad (the object is way too high or too low),
- the object has not been tracked for 2 seconds, or
- its battery drops below 25% (NAO announces it, sits down, and the node exits).
Below you can see an example where NAO tracks an orange!