
SLAM


SLAM (Simultaneous Localisation And Mapping) is the process of building a map by sensing the environment around a robot while at the same time using that map to locate the robot and navigate it. This topic has been something of a hot item in robotics research for many years and is a core technology in self-driving cars and even robotic vacuum cleaners! The good news is that, with a little work, it is possible to do what was once cutting-edge research with a LEGO EV3!
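To give a flavour of the map-building half of SLAM, here is a minimal occupancy-grid update for a single range reading. This is an illustrative sketch, not code from this project: the log-odds constants and cell coordinates are assumptions, and a real system would also transform each beam from robot coordinates into map coordinates first.

```java
// Minimal occupancy-grid update for one range reading (illustrative only).
// Each cell holds a log-odds value: > 0 means likely occupied, < 0 likely free.
public class OccupancyGrid {
    final double[][] logOdds;
    static final double L_OCC = 0.9;    // evidence added at a beam endpoint
    static final double L_FREE = -0.4;  // evidence added along the beam

    OccupancyGrid(int size) {
        logOdds = new double[size][size];
    }

    // Integrate one beam from the robot at cell (x0,y0) that hit an
    // obstacle at cell (x1,y1), tracing the free cells in between with
    // Bresenham's line algorithm.
    void update(int x0, int y0, int x1, int y1) {
        int dx = Math.abs(x1 - x0), dy = -Math.abs(y1 - y0);
        int sx = x0 < x1 ? 1 : -1, sy = y0 < y1 ? 1 : -1;
        int err = dx + dy;
        while (x0 != x1 || y0 != y1) {
            logOdds[y0][x0] += L_FREE;  // beam passed through: likely free
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }
            if (e2 <= dx) { err += dx; y0 += sy; }
        }
        logOdds[y1][x1] += L_OCC;       // beam endpoint: likely occupied
    }
}
```

Repeated scans from different poses accumulate evidence, so a cell clipped by many beams ends up strongly free or strongly occupied even with noisy individual readings.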

For me, this is the end point of a long journey with LEGO robots. My very first use of LEGO robotics (using the Cybermaster kit) was to build a mapping robot. My first leJOS project (using an early version for the NXT) was to build another map-building robot based on the design in Brian Bagnall’s book. Although both of these worked, they didn’t really allow me to duplicate some of the things I had been reading about in academic papers. Much of my work with leJOS and LEGO Mindstorms over the years has been aimed at getting to the point of being able to reproduce those experiments. You can see some of this journey in the following articles:
- BNO055 IMU (provides the IMU used by this project)
- Improving position tracking of a mobile robot (combining the IMU with odometry)
- I2C on the EV3 (fast I2C that allows capture of LIDAR scans)
- EV3 LIDAR Sensor (the LIDAR sensor at the heart of this project)
The above components, combined with other key leJOS features like the Chassis class, motor synchronization, and the communication capabilities that allow a program to be split between the EV3 and a PC (as explored in Line Following & Cooperating Robots (part 1)), have allowed me to finally build a robot that can explore my house, build a map, avoid bumping into things (well, mostly) and go where I tell it (some of the time!).

[Photo: the Surveyor robot]

The robot that does the exploring (I call it Surveyor) can be seen above. The LIDAR scanner and IMU can clearly be seen. The data captured by these sensors along with odometry is sent to a PC based control program that handles the map building, localization and navigation functions. The display of this program looks like this:
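The split between the robot and the PC can be pictured as a simple streaming link: each LIDAR scan, together with the current odometry/IMU pose, is sent to the PC for processing. The sketch below is a hypothetical illustration of that idea, not the project's actual protocol; the packet layout (pose followed by a block of range readings) is my own assumption.

```java
import java.io.DataOutputStream;
import java.io.IOException;

// Hypothetical sketch of the EV3-side uplink: stream one LIDAR scan plus
// the current pose estimate to the PC control program. The wire format
// (3 floats of pose, a count, then one float per range) is illustrative.
public class ScanUplink {
    public static void send(DataOutputStream out, float x, float y,
                            float heading, float[] ranges) throws IOException {
        out.writeFloat(x);           // odometry position
        out.writeFloat(y);
        out.writeFloat(heading);     // IMU-corrected heading
        out.writeInt(ranges.length); // number of readings in this scan
        for (float r : ranges)       // one distance per scan angle
            out.writeFloat(r);
        out.flush();                 // push the whole scan in one go
    }
}
```

On the PC side a matching DataInputStream would read the pose and ranges back in the same order and hand them to the SLAM update.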

[Screenshot: the PC control program display]

To see the robot in action and watch the map being built and used, take a look at the following video:

[Embedded video]

You may be wondering why I didn’t simply plug the robot into something like ROS and make use of the many SLAM implementations and GUI control systems available for it. The simple answer is that I wanted to understand as much as I could about how this all worked; I’ve been using this long-running project to educate myself!

I’ve based my code on a Java implementation (BreezySLAM) of a very simple SLAM system (CoreSLAM), but have made extensive modifications (the slow scan rate of my LIDAR complicates things) and improvements. The navigation system is based on something called a Gradient Planner, and the exploration algorithm is a fairly natural extension of that.
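The core idea behind a gradient planner can be sketched in a few lines: a breadth-first wavefront spreads out from the goal, assigning each free cell its distance to the goal, and the robot then simply moves "downhill" toward lower cost. This is a minimal sketch under simplifying assumptions (4-connected grid, unit step costs); a real planner would also inflate obstacles and use finer-grained costs.

```java
import java.util.ArrayDeque;

// Sketch of the wavefront stage of a gradient planner: BFS from the goal
// labels every reachable free cell with its distance to the goal.
public class GradientPlanner {
    static final int WALL = -2, UNSET = -1;

    // grid: true = obstacle. Returns cost-to-goal per cell (WALL for obstacles,
    // UNSET for unreachable cells); the goal cell (gr,gc) gets cost 0.
    static int[][] costToGoal(boolean[][] grid, int gr, int gc) {
        int rows = grid.length, cols = grid[0].length;
        int[][] cost = new int[rows][cols];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                cost[r][c] = grid[r][c] ? WALL : UNSET;

        ArrayDeque<int[]> queue = new ArrayDeque<>();
        cost[gr][gc] = 0;
        queue.add(new int[]{gr, gc});
        int[][] moves = {{1,0},{-1,0},{0,1},{0,-1}}; // 4-connected neighbours

        while (!queue.isEmpty()) {
            int[] cell = queue.poll();
            for (int[] m : moves) {
                int nr = cell[0] + m[0], nc = cell[1] + m[1];
                if (nr >= 0 && nr < rows && nc >= 0 && nc < cols
                        && cost[nr][nc] == UNSET) {
                    cost[nr][nc] = cost[cell[0]][cell[1]] + 1;
                    queue.add(new int[]{nr, nc});
                }
            }
        }
        return cost;
    }
}
```

To drive to the goal, the robot repeatedly steps to whichever neighbouring cell has the lowest cost; exploration falls out naturally by planning toward the nearest cell on the boundary between mapped and unmapped space instead of a fixed goal.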

I hope to write a few more articles describing the SLAM, navigation and exploration models used. Following the above links will give you a good background. If you want to see the (very rough) source code, it is here:

https://github.com/gloomyandy/surveyor/tree/master

Note that this project uses some non-standard hardware (the laser rangefinder and the IMU), some of which is no longer available (I use the first version of the rangefinder). Because of this, recreating this project is not trivial and may not even be possible without hacking some expensive hardware. The newer versions of the rangefinder do not seem to work well with the EV3; I don’t have the new hardware, so I can’t help you get it going, sorry.


Discussion

6 thoughts on “SLAM”

  1. Wow, impressive. I might have to build one. I have started using ROS again, but not with Lego hardware. I am having another go at building a ROS turtlebot that will do SLAM using a Roomba base and a Microsoft Kinect, but am having a lot of problems with incompatible hardware. I might buy a Turtlebot3 which has a laser rangefinder.

    Posted by Lawrie Griffiths | 2017/07/18, 10:06
  2. Congrats on the code, you’ve done an amazing job!

    I have successfully linked the BreezySLAM code, run the SurveyorManager, and everything runs. The GUI panel comes up as seen in the pictures above. I get, however, a few errors like:

    1. java.lang.OutOfMemoryError: Java heap space
    at ZoomAndPanPanel.paintComponent(ZoomAndPanPanel.java:42)
    at TrackDisplay.paintComponent(TrackDisplay.java:166)
    -> I have increased my JVM heap size to the maximum, but the error is still there and is generated from the GUI panel.

    2. Exception in thread “AWT-EventQueue-0” java.lang.NullPointerException
    at RobotInfo.setTargetPose(RobotInfo.java:322)
    at Track$TrackInfoView.setTarget(Track.java:158)
    at Track$22.actionPerformed(Track.java:511)

    Any suggestions on how to eliminate those errors?
    Is there any newer version of this code that comes with no errors?
    Is there any documentation for the use of the GUI and the overall code as a whole?

    Many thanks!

    Posted by robo1 | 2017/12/18, 02:08
    • Sorry, I can’t really help. I don’t remember ever seeing those errors. I did have to increase the amount of memory allocated to the VM, but only to allow the use of long routes (each pose is stored). What sort of system are you running this code on? How much memory does it have, and does it have virtual memory enabled? Have you made any changes to the source? (Line 42 in ZoomAndPanPanel.java does not seem to match any real code.) I’ve happily run this code on a couple of different systems (one Linux, one Windows).

      Posted by gloomyandy | 2017/12/18, 09:03
      • Hi mate, thanks for your reply.

        I’m running the code in IntelliJ – Win7 x64 OS, and I don’t have any LEGO brick connected to my PC.

        When I run log27.dat (the example map shown above) with the max VM memory set to -Xmx1600M (the max for IntelliJ), I still get the heap space error, especially when I try to zoom in. Despite the memory error, I can see the example running (robot mapping). With -Xmx1600M the other logs (smaller maps) don’t run out of memory, though.

        No, I haven’t made any changes to the Java code, but like you said, the NullPointerException is also pointing to line 42 in ZoomAndPanPanel.java. Do I have to make any changes to this line to eliminate the NullPointerException? For me this exception is a much more serious issue than the memory error, which I can work around, so any suggestion on how to tackle it would be appreciated.

        I’d like to sort out these errors and understand the code before I integrate it into a real robot with a Hokuyo laser.

        Posted by ROBO1 | 2017/12/18, 13:31
      • I’ve never seen a null pointer exception, sorry. What happens if you run the code outside of IntelliJ? Or, even better, what happens if you use Eclipse (which is what I used)?

        Posted by gloomyandy | 2017/12/18, 18:09
      • Also, can you provide details of how you cause the null pointer exception? It looks like you are trying to set a target pose; what operating mode do you have things in when you do this? It could be that you are trying to do something that I’ve never done!

        Posted by gloomyandy | 2017/12/18, 18:18


About leJOS News

leJOS News keeps you up-to-date with leJOS. It features latest news, explains cool features, shows advanced techniques and highlights amazing projects. Be sure to subscribe to leJOS News and never miss an article again. Best of all, subscription is free!