SLAM (Simultaneous Localisation And Mapping) is the process of building a map by sensing the environment around a robot while at the same time using that map to locate the robot and navigate it. The topic has been a hot item in robotics research for many years and is a core technology in self-driving cars and even robotic vacuum cleaners! The good news is that with a little work it is possible to do what was once cutting-edge research with a LEGO EV3!
For me, this is the end point of a long journey with LEGO robots. My very first use of LEGO robotics (using the Cybermaster kit) was to build a mapping robot. My first leJOS project (using an early version for the NXT) was another map-building robot, based on the design in Brian Bagnall’s book. Although both of these worked, they didn’t really allow me to duplicate some of the things I had been reading about in academic papers. Much of my work with leJOS and LEGO Mindstorms over the years has been aimed at reproducing those experiments. You can see some of this journey in the following articles:
BNO055 IMU (This provides the IMU used by this project)
Improving position tracking of a mobile robot (combining the IMU with Odometry)
I2C on the EV3 (Fast I2C that allows capture of LIDAR scans)
EV3 LIDAR Sensor (The LIDAR sensor at the heart of this project)
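To give a flavour of the IMU-plus-odometry idea from the position-tracking article above, here is a minimal complementary-filter sketch. This is my own illustrative code, not the actual Surveyor implementation; the class, parameter names, and the 0.98 weighting are assumptions chosen for the example:

```java
// A minimal complementary filter: trust the gyro for short-term heading
// changes, and lean on wheel odometry to correct long-term gyro drift.
public class HeadingFilter {
    private final double alpha;   // weight given to the integrated gyro rate
    private double heading;       // fused heading estimate, in radians

    public HeadingFilter(double alpha) {
        this.alpha = alpha;
    }

    /**
     * gyroRate:   angular velocity from the IMU, rad/s
     * odoHeading: heading derived from wheel odometry, rad
     * dt:         time since the last update, seconds
     */
    public double update(double gyroRate, double odoHeading, double dt) {
        // Integrate the gyro, then blend with the odometry heading.
        heading = alpha * (heading + gyroRate * dt) + (1 - alpha) * odoHeading;
        return heading;
    }

    public static void main(String[] args) {
        HeadingFilter filter = new HeadingFilter(0.98);
        double h = 0;
        // Simulate 2 s of a steady 0.1 rad/s turn, sampled at 50 Hz,
        // with odometry reporting the same turn.
        for (int i = 0; i < 100; i++) {
            h = filter.update(0.1, 0.1 * (i + 1) * 0.02, 0.02);
        }
        System.out.printf("fused heading after 2 s: %.3f rad%n", h);
    }
}
```

In a real robot the two sources disagree (wheel slip corrupts odometry, gyro bias corrupts integration), and the filter's job is to combine the best of both.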
The above components, combined with other key leJOS features (the Chassis, motor synchronization, and the communication capabilities that allow a program to be split between the EV3 and a PC, as explored in Line Following & Cooperating Robots (part 1)), have finally allowed me to build a robot that can explore my house, build a map, avoid bumping into things (well, mostly), and go where I tell it (some of the time!).
The robot that does the exploring (I call it Surveyor) can be seen above, with the LIDAR scanner and IMU clearly visible. The data captured by these sensors, along with odometry, is sent to a PC-based control program that handles the map building, localization and navigation functions. The display of this program looks like this:
To see the robot in action and watch the map being built and used, take a look at the following video:
You may be wondering why I didn’t simply plug the robot into something like ROS and make use of the many SLAM implementations and GUI control systems available for it. The simple answer is that I wanted to understand as much as I could about how this all works; I’ve been using this long-running project to educate myself!
I’ve based my code on a Java implementation (BreezySLAM) of a very simple SLAM system (coreSLAM), but have made extensive modifications (the slow scan rate of my LIDAR complicates things) and improvements. The navigation system is based on something called a Gradient Planner, and the exploration algorithm is a fairly natural extension of it.
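To give a flavour of how a gradient planner works, here is a minimal sketch; this is my own illustrative code under simplifying assumptions (a boolean occupancy grid, unit step costs), not the Surveyor implementation. A cost wavefront is propagated outward from the goal across the grid, and from any cell the robot then simply steps "downhill":

```java
import java.util.ArrayDeque;
import java.util.Arrays;

// A gradient planner in miniature: breadth-first wavefront from the goal,
// then greedy descent of the resulting cost field.
public class GradientPlanner {
    private static final int[][] DIRS = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};

    /** Propagate a cost wavefront outward from the goal cell.
     *  Obstacles and unreachable cells remain at Integer.MAX_VALUE. */
    public static int[][] costField(boolean[][] occupied, int goalX, int goalY) {
        int w = occupied.length, h = occupied[0].length;
        int[][] cost = new int[w][h];
        for (int[] row : cost) Arrays.fill(row, Integer.MAX_VALUE);
        ArrayDeque<int[]> queue = new ArrayDeque<>();
        cost[goalX][goalY] = 0;
        queue.add(new int[]{goalX, goalY});
        while (!queue.isEmpty()) {
            int[] c = queue.poll();
            for (int[] d : DIRS) {
                int nx = c[0] + d[0], ny = c[1] + d[1];
                if (nx >= 0 && ny >= 0 && nx < w && ny < h
                        && !occupied[nx][ny] && cost[nx][ny] == Integer.MAX_VALUE) {
                    cost[nx][ny] = cost[c[0]][c[1]] + 1;
                    queue.add(new int[]{nx, ny});
                }
            }
        }
        return cost;
    }

    /** From (x, y), return the neighbouring cell with the lowest cost
     *  (i.e. one step along the gradient towards the goal). */
    public static int[] nextStep(int[][] cost, int x, int y) {
        int[] best = {x, y};
        int bestCost = cost[x][y];
        for (int[] d : DIRS) {
            int nx = x + d[0], ny = y + d[1];
            if (nx >= 0 && ny >= 0 && nx < cost.length && ny < cost[0].length
                    && cost[nx][ny] < bestCost) {
                bestCost = cost[nx][ny];
                best = new int[]{nx, ny};
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // 5x5 grid with a wall at x = 2 that leaves a gap at y = 4.
        boolean[][] occ = new boolean[5][5];
        for (int y = 0; y < 4; y++) occ[2][y] = true;
        int[][] cost = costField(occ, 4, 0);
        System.out.println("cost from (0,0) to goal: " + cost[0][0]); // 12: around the wall
    }
}
```

The appeal of this approach for navigation is that one wavefront computation answers "how do I get to the goal from anywhere?", which also makes it a natural base for exploration: seed the wavefront from unexplored frontier cells instead of a single goal.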
I hope to write a few more articles describing the SLAM, navigation and exploration models used. If you follow the above links you will get a good background. If you want to see the (very rough) source code it is here:
Note that this project uses some non-standard hardware (the laser range finder and the IMU), some of which is no longer available (I use the first version of the rangefinder). Because of this, recreating this project is not trivial and may not even be possible without hacking some expensive hardware. The newer versions of the range finder do not seem to work well with the EV3, and since I don’t have the new hardware I can’t help you get it going, sorry.