Some time ago we decided to invest in a low-cost wheeled platform and take our first steps in the robotics world. We then started a mini-project that we called “Dynamic Knowledge Acquisition with robots”, or DKA-robo.

The idea of DKA-robo is the following:

  1. we set up a robot with computation and navigation skills
  2. we map an environment
  3. we create an RDF knowledge base about that environment (e.g. which are the rooms and the open spaces, how cold or busy they are, etc.)
  4. we create a system that allows users to query the knowledge base for the status of the environment; if some information in the KB has expired, the robot is sent out as a mobile sensor to collect the new information.

In this post, we are going to talk about points 1 and 2.

How-to


(Note: these links were working as of January 2017)

We assume a little knowledge of ROS, and that you have a catkin workspace to work in.

If you do not know ROS, see http://wiki.ros.org/indigo/Installation/Ubuntu.

If you do not have a catkin workspace, see http://wiki.ros.org/ROS/Tutorials/InstallingandConfiguringROSEnvironment .


What we used


  • To move the iRobot Create, you need a ROS node that controls the base. Luckily, there is a very useful node at https://github.com/CentroEPiaggio/irobotcreate2ros (it used to be at a different repo location until June 2016) that acts as an interface between ROS and the iRobot Create open interface.

** Note: you need to have the ROS serial packages installed, so you will probably need to install them as

sudo apt-get install ros-indigo-serial

sudo apt-get install ros-indigo-rosserial

** Note: you will have to change the USB serial port name from ttyUSB0 to the one the laser actually gets. Since we have both the robot and the laser connected, the laser ends up on ttyUSB1 instead of ttyUSB0, which means we always have to plug in the laser after the robot.
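As an alternative to relying on the plug-in order, you can pin the laser to a fixed device name with a udev rule. This is only a sketch: the vendor/product IDs below are the ones of the CP210x USB-serial adapter commonly found on RPLidar units, so check yours with lsusb before using it.

```
# /etc/udev/rules.d/rplidar.rules
# Assumed IDs (CP210x adapter): verify with `lsusb` before trusting them
KERNEL=="ttyUSB*", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", MODE:="0666", SYMLINK+="rplidar"
```

After reloading the rules (sudo udevadm control --reload-rules, then replugging the laser), the laser also appears as /dev/rplidar regardless of the order in which you plug the devices.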

  • To make the KMi map, we used the ROS gmapping package, which allows you to create a 2D map from laser and pose data collected by the robot. Gmapping can simply be installed via the command line:

sudo apt-get install ros-indigo-slam-gmapping


How to run?

There are two ways of running the system:

  1. by opening several terminals, and running each ROS node separately (don’t forget to source the setup.bash in the devel folder of your catkin workspace, or the ros commands will not be found!)
  2. by creating a launch file that will run everything for you

Let’s go step-by-step with option 1:

  • Open a new terminal and start roscore
  • In a second terminal, start the robot:

rosrun irobotcreate2 irobotcreate2

(we modified the irobotcreate2 so that we could stop the robot with the spacebar, see  https://github.com/McKracken/kmi_ros/blob/master/irobotcreate2ros/src/irobotcreate2_our.cpp)

  • In a third terminal, run the laser node

roslaunch rplidar_ros rplidar.launch

  • In a fourth terminal, run gmapping. Ideally, it should work simply with

rosrun gmapping slam_gmapping scan:=base_scan


Considering this was not working very well for us (low-quality map, the robot losing odometry, etc.), we created a launch file in which we tuned some of the parameters for gmapping, as suggested in http://geduino.blogspot.co.uk/2015/04/gmapping-and-rplidar.html.

The launchfile we used is at https://github.com/McKracken/kmi_ros/blob/master/kmi_navigation/launch/slam.launch
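For reference, a launch file with tuned gmapping parameters looks roughly like the sketch below. The values here are illustrative only, not the ones from our slam.launch; in particular, maxUrange should stay below the laser's real maximum range.

```xml
<launch>
  <!-- Sketch only: parameter values are examples, check the linked slam.launch -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <remap from="scan" to="base_scan"/>
    <param name="maxUrange" value="5.5"/>        <!-- usable laser range, below sensor max -->
    <param name="minimumScore" value="50"/>      <!-- helps when odometry is unreliable -->
    <param name="particles" value="80"/>
    <param name="linearUpdate" value="0.2"/>     <!-- metres travelled between scan processing -->
    <param name="angularUpdate" value="0.25"/>   <!-- radians turned between scan processing -->
    <param name="map_update_interval" value="2.0"/>
    <param name="delta" value="0.02"/>           <!-- map resolution in metres -->
  </node>
</launch>
```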

  • In a fifth terminal, run the static transform publisher between base_scan and laser to connect the base and the laser:

rosrun tf static_transform_publisher 0 0 0 0 0 0 base_scan laser 100

  • Finally, from another laptop, run the teleoperation node

rosrun irobotcreate2 keyboard_node

(we also modified this one, so that our robot would move and turn faster: https://github.com/McKracken/kmi_ros/blob/master/irobotcreate2ros/src/keyboard_node_our.cpp)


Now, considering all the terminals we had to use, you understand why it is better to go for option 2! Here’s our launch file
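As a sketch, a launch file bundling the steps above could look like the following. The node and package names are assumed from the commands above, and the gmapping parameters are left at their defaults here; our actual file is linked below.

```xml
<launch>
  <!-- iRobot Create base driver -->
  <node pkg="irobotcreate2" type="irobotcreate2" name="irobotcreate2" output="screen"/>

  <!-- RPLidar driver -->
  <include file="$(find rplidar_ros)/launch/rplidar.launch"/>

  <!-- static transform connecting the base and the laser -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_laser"
        args="0 0 0 0 0 0 base_scan laser 100"/>

  <!-- gmapping, reading the laser scans -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <remap from="scan" to="base_scan"/>
  </node>
</launch>
```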



Simply run

roslaunch <your_slam_package> <your_slam_launchfile>

e.g. roslaunch kmi_slam kmi_slam_gmapping.launch, and that should get your map built!

KMi robots: making a map with a Roomba, an RPlidarA2 and gmapping