11. Lab 6: Real-robot lab

Worksheet Contact: Xiao Wang (x.wang16@leeds.ac.uk)

Note

You can access the robots to practise what you learnt in the labs, or to test the modules for your project, from Week 6 onwards. Access to the robots will be provided upon request during lab hours.

11.1. Setup your PC and learn how to use the real-robot

Before beginning the exercises, please carefully read this page and set up your account accordingly.

11.2. Exercises

Please complete the following exercises to ensure that you understand how to fully operate the robot, including actuation, using the camera for OpenCV tasks, and navigation and mapping.

Note

To access the lab6 files, execute the following commands:

  • In a terminal, change into your workspace source directory: cd ~/ros2_ws/src.

  • Run: git clone git@github.com:COMP3631-2026/lab6 lab6

    • Note the lab6 at the end of the command (git clone [...] lab6): it sets the name of the destination directory.

  • Run cd $HOME/ros2_ws and then colcon build to build the new package.

First things first, make sure that you:

  1. SSH to your robot (if you don’t know how, you should read this page first).

  2. Start the robot drivers using:

ros2 launch turtlebot3_bringup robot.launch.py

That’s the only command you need to run on the robot. Keep the terminal window with the SSH session open, however.

The rest of the commands mentioned in this worksheet should be carried out on a desktop PC, not on the robot.

Here is an easy way to get started:

  1. Enter a Singularity environment on the desktop PC using the ros command.

  2. Launch Code by running code. Use the embedded terminals in Code to execute the rest of the commands. However:

  3. When you open a new terminal in Code, either (1) run workon_real_robot and pick the robot's name from the list, (2) run workon_real_robot <name_of_robot> directly, or (3) put workon_real_robot <name_of_robot> in your .bashrc so you don't have to repeat this in every new terminal window.

11.2.1. Exercise 1: Attempt to run firstwalk.py

Warning

Please never command the robot to drive faster than 0.2 m/s.

In a terminal window on the desktop PC, run:

ros2 run lab6 firstwalk

This is essentially the same file as in lab 3; no modifications were needed to make it work on the real robot, thanks to the ROS abstraction. If the robot moves back and forth, you've done it! Great job. Feel free to experiment with different velocities, staying below 0.2 m/s (and don't forget to colcon build after any changes).
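The 0.2 m/s cap from the warning above is easy to enforce in code. Below is a small sketch of a helper you could call before publishing any velocity command; the name clamp_speed is ours, not part of the lab6 package:

```python
# Hypothetical helper (not part of lab6): cap the requested forward
# speed at the lab's hard limit before it reaches the robot.

MAX_LINEAR_SPEED = 0.2  # m/s, the limit stated in the worksheet


def clamp_speed(linear_x: float, limit: float = MAX_LINEAR_SPEED) -> float:
    """Clamp a requested forward speed to the range [-limit, +limit]."""
    return max(-limit, min(limit, linear_x))


if __name__ == "__main__":
    # In a real node you would do something like:
    #   twist.linear.x = clamp_speed(requested_speed)
    #   self.publisher.publish(twist)
    print(clamp_speed(0.5))   # too fast forwards  -> 0.2
    print(clamp_speed(-1.0))  # too fast backwards -> -0.2
    print(clamp_speed(0.1))   # within the limit   -> 0.1
```

Calling this on every outgoing command means a typo in a velocity constant can never drive the robot over the limit.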

11.2.2. Exercise 2: Make the robot move in a circle and a square

Now, try creating square.py and circle.py files under ~/ros2_ws/src/lab6/lab6 on the PC. Make the robot move in a circle and in a square (you can refer to your solution from lab 3). Remember to modify setup.py so that ROS knows about your new scripts. After making these changes, run colcon build under ~/ros2_ws.
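Before writing the nodes, it helps to work out the velocities on paper: a circle of radius r at linear speed v needs angular velocity w = v / r, while a square alternates straight segments of side / v seconds with 90-degree turns of (pi / 2) / w seconds. A rough sketch of that arithmetic, with names of our own choosing (inside your nodes these values would fill the Twist message you publish on /cmd_vel):

```python
import math

# Hypothetical planning helpers (not part of lab6). The returned values
# would populate Twist.linear.x and Twist.angular.z in your nodes.


def circle_velocities(radius: float, speed: float = 0.2):
    """Constant (linear, angular) velocities that trace a circle of `radius`."""
    return speed, speed / radius  # w = v / r


def square_timings(side: float, speed: float = 0.2, turn_rate: float = 0.5):
    """Seconds to drive one side, and seconds for each 90-degree turn."""
    return side / speed, (math.pi / 2) / turn_rate


lin, ang = circle_velocities(radius=0.5)    # 0.2 m/s forward, 0.4 rad/s turn
drive_t, turn_t = square_timings(side=1.0)  # 5.0 s per side, pi s per corner
```

Also remember that in a ROS 2 Python package, setup.py must list each script under entry_points (console_scripts) for ros2 run to find it, e.g. a line of the form 'circle = lab6.circle:main'.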

Test that your scripts work on the real robot by running:

ros2 run lab6 circle

and then:

ros2 run lab6 square

11.2.3. Exercise 3: working with image frames

There is a camera on the TurtleBot. It starts running when you launch turtlebot3_bringup.

Initially, we considered using a third-party library for the sensor that published the image view to the same topic you subscribed to in the simulation labs (i.e. /camera/image_raw). However, this resulted in slow streaming with a lot of lag. Therefore, we developed our own ROS node that reads the image frame directly from the robot, compresses it, and sends the compressed image over the network at our desired frame rate. This led to much more efficient streaming with minimal lag. You can find our custom code here if you are interested in reading it.

Now, check the list of topics (ros2 topic list). You should see new topic names, including /camera/image/compressed. If you don’t see anything, make sure you run workon_real_robot.

This technical issue also presents a new learning opportunity for you. Your task is to create a new node that bridges the compressed image topic to the image_raw one (which does not currently exist), so that your code stays consistent whether you work in simulation or with the real robot. You will run this new node when working with the real robot to make /camera/image_raw available.

You will find a script named compressed_to_image_raw.py. Open it and complete the TODO comments.

Once you have finished the TODO comments, run colcon build and then run (don't forget workon_real_robot):

ros2 run lab6 compressed_to_image_raw

Now your node should read the compressed image over the network, decompress it, and republish it as a raw image on /camera/image_raw. Since the conversion and re-publishing happen on the PC, subscribing to /camera/image_raw there is faster than subscribing to a raw image streamed over the network by the robot.

Test it (don’t forget workon_real_robot):

ros2 run image_view image_view image:=/camera/image_raw

Wave to the camera and smile.

11.3. Remarks and Checklist

By the end of this worksheet, please check that:

  • you understand how to use the real robot.

  • you know how to send velocities to the robot.

  • you know how to utilise the camera with OpenCV.

  • you understand how to run code with the real robot (i.e., you only run robot.launch.py on the robot itself; everything else runs on the desktop PC).

If you have any questions or problems, please ask a member of the module's teaching team for help during lab hours.