Homemade IMU

Here is an example of a more useful homemade EV3 UART sensor. It is an IMU that uses an MPU-9150 breakout board, which can be bought cheaply on eBay. I am using a cheap Arduino Nano clone.

The sensor supports 4 modes: Temperature, Compass, Gyro and Acceleration. The last three are all 3D and have X, Y and Z values. You can program the sensor to do interesting things like on-board fusion of the gyro and acceleration data.

The USB cable in the picture is used for programming the Arduino Nano, and for diagnostics. When used in anger, the device is powered from the EV3.

The EV3UARTEmulation library is used to implement the EV3 UART functionality, and the code for driving the MPU-9150 is based on the code from the Arduino Playground page.

This is the Arduino code for the sensor:

// Some code copied from the MPU-9150 Arduino Playground page.
#include <EV3UARTEmulation.h>
#include <SoftwareSerial.h>
#include <Wire.h>

EV3UARTEmulation sensor(7, 8, 99, 38400);

#define MPU9150_TEMP_OUT_H         0x41
#define MPU9150_TEMP_OUT_L         0x42
#define MPU9150_PWR_MGMT_1         0x6B

#define MPU9150_CMPS_XOUT_L        0x4A   // R
#define MPU9150_CMPS_XOUT_H        0x4B   // R
#define MPU9150_CMPS_YOUT_L        0x4C   // R
#define MPU9150_CMPS_YOUT_H        0x4D   // R
#define MPU9150_CMPS_ZOUT_L        0x4E   // R
#define MPU9150_CMPS_ZOUT_H        0x4F   // R

#define MPU9150_GYRO_XOUT_H        0x43   // R  
#define MPU9150_GYRO_XOUT_L        0x44   // R  
#define MPU9150_GYRO_YOUT_H        0x45   // R  
#define MPU9150_GYRO_YOUT_L        0x46   // R  
#define MPU9150_GYRO_ZOUT_H        0x47   // R  
#define MPU9150_GYRO_ZOUT_L        0x48   // R

#define MPU9150_ACCEL_XOUT_H       0x3B   // R  
#define MPU9150_ACCEL_XOUT_L       0x3C   // R  
#define MPU9150_ACCEL_YOUT_H       0x3D   // R  
#define MPU9150_ACCEL_YOUT_L       0x3E   // R  
#define MPU9150_ACCEL_ZOUT_H       0x3F   // R  
#define MPU9150_ACCEL_ZOUT_L       0x40   // R 

void setup() {
  Serial.begin(9600);
  
  Wire.begin();
  
  // Clear the 'sleep' bit to start the sensor.
  MPU9150_writeSensor(MPU9150_PWR_MGMT_1, 0);
  
  MPU9150_setupCompass();
  
  sensor.create_mode("Temperature", true, DATA16, 1, 2, 0);
  sensor.create_mode("Compass", true, DATA16, 3, 5, 0);
  sensor.create_mode("Gyro", true, DATA16, 3, 5, 0);
  sensor.create_mode("Accel", true, DATA16, 3, 5, 0);
  
  sensor.reset();
}

unsigned long last_reading = 0;
short heading[4];
short gyro[4];
short accel[4];
short t;

void loop() {
  sensor.heart_beat();
  if (millis() - last_reading > 100) {
    switch (sensor.get_current_mode()) {
    case 0:
      t = ((int) MPU9150_readSensor(MPU9150_TEMP_OUT_L,MPU9150_TEMP_OUT_H) + 12412.0) / 340.0;
      sensor.send_data16(t);
      break;
    case 1:
      heading[0] = MPU9150_readSensor(MPU9150_CMPS_XOUT_L,MPU9150_CMPS_XOUT_H);
      heading[1] = MPU9150_readSensor(MPU9150_CMPS_YOUT_L,MPU9150_CMPS_YOUT_H);
      heading[2] = MPU9150_readSensor(MPU9150_CMPS_ZOUT_L,MPU9150_CMPS_ZOUT_H);
      sensor.send_data16(heading, 4);
      break;
    case 2:
      gyro[0] = MPU9150_readSensor(MPU9150_GYRO_XOUT_L,MPU9150_GYRO_XOUT_H);
      gyro[1] = MPU9150_readSensor(MPU9150_GYRO_YOUT_L,MPU9150_GYRO_YOUT_H);
      gyro[2] = MPU9150_readSensor(MPU9150_GYRO_ZOUT_L,MPU9150_GYRO_ZOUT_H);
      sensor.send_data16(gyro, 4);
      break;
    case 3:
      accel[0] = MPU9150_readSensor(MPU9150_ACCEL_XOUT_L,MPU9150_ACCEL_XOUT_H);
      accel[1] = MPU9150_readSensor(MPU9150_ACCEL_YOUT_L,MPU9150_ACCEL_YOUT_H);
      accel[2] = MPU9150_readSensor(MPU9150_ACCEL_ZOUT_L,MPU9150_ACCEL_ZOUT_H);
      sensor.send_data16(accel, 4);
      break;     
    }
    last_reading = millis();
  }
}

int MPU9150_I2C_ADDRESS = 0x68;

int MPU9150_readSensor(int addr){
  Wire.beginTransmission(MPU9150_I2C_ADDRESS);
  Wire.write(addr);
  Wire.endTransmission(false);

  Wire.requestFrom(MPU9150_I2C_ADDRESS, 1, true);
  return Wire.read();
}

int MPU9150_readSensor(int addrL, int addrH){
  Wire.beginTransmission(MPU9150_I2C_ADDRESS);
  Wire.write(addrL);
  Wire.endTransmission(false);

  Wire.requestFrom(MPU9150_I2C_ADDRESS, 1, true);
  byte L = Wire.read();

  Wire.beginTransmission(MPU9150_I2C_ADDRESS);
  Wire.write(addrH);
  Wire.endTransmission(false);

  Wire.requestFrom(MPU9150_I2C_ADDRESS, 1, true);
  byte H = Wire.read();

  return (int16_t)((H<<8)+L);
}

int MPU9150_writeSensor(int addr,int data){
  Wire.beginTransmission(MPU9150_I2C_ADDRESS);
  Wire.write(addr);
  Wire.write(data);
  Wire.endTransmission(true);

  return 1;
}

void MPU9150_setupCompass(){
  MPU9150_I2C_ADDRESS = 0x0C;      //change Adress to Compass

  MPU9150_writeSensor(0x0A, 0x00); //PowerDownMode
  MPU9150_writeSensor(0x0A, 0x0F); //SelfTest
  MPU9150_writeSensor(0x0A, 0x00); //PowerDownMode

  MPU9150_I2C_ADDRESS = 0x68;      //change Adress to MPU

  MPU9150_writeSensor(0x24, 0x40); //Wait for Data at Slave0
  MPU9150_writeSensor(0x25, 0x8C); //Set i2c address at slave0 at 0x0C
  MPU9150_writeSensor(0x26, 0x02); //Set where reading at slave 0 starts
  MPU9150_writeSensor(0x27, 0x88); //set offset at start reading and enable
  MPU9150_writeSensor(0x28, 0x0C); //set i2c address at slv1 at 0x0C
  MPU9150_writeSensor(0x29, 0x0A); //Set where reading at slave 1 starts
  MPU9150_writeSensor(0x2A, 0x81); //Enable at set length to 1
  MPU9150_writeSensor(0x64, 0x01); //overvride register
  MPU9150_writeSensor(0x67, 0x03); //set delay rate
  MPU9150_writeSensor(0x01, 0x80);

  MPU9150_writeSensor(0x34, 0x04); //set i2c slv4 delay
  MPU9150_writeSensor(0x64, 0x00); //override register
  MPU9150_writeSensor(0x6A, 0x00); //clear usr setting
  MPU9150_writeSensor(0x64, 0x01); //override register
  MPU9150_writeSensor(0x6A, 0x20); //enable master i2c mode
  MPU9150_writeSensor(0x34, 0x13); //disable slv4
}

And here is the leJOS sensor driver class:

import lejos.hardware.port.Port;
import lejos.hardware.sensor.SensorMode;
import lejos.hardware.sensor.UARTSensor;
import lejos.robotics.SampleProvider;

public class IMU extends UARTSensor  {
	private static final long SWITCHDELAY = 200;

	public IMU(Port port) {
		super(port);
		setModes(new SensorMode[] { new TemperatureMode(), new CompassMode(), new GyroMode(), new AccelerationMode()});
	}
	
	private class TemperatureMode implements SensorMode {
	    private static final int   MODE = 0;
		private int temp;
	
	    @Override
	    public int sampleSize() {
	    	return 1;
	    }
	
	    @Override
	    public void fetchSample(float[] sample, int offset) {
	    	switchMode(MODE, SWITCHDELAY);
	    	temp = port.getShort();
			sample[offset] = temp;
	    }
	
	    @Override
	    public String getName() {
	    	return "Temperature";
	    }
	}
	
	public SampleProvider getTemperatureMode() {
		return getMode(0);
	}
	
	private class CompassMode implements SensorMode {
	    private static final int   MODE = 1;
		private short[] heading = new short[sampleSize()];
	
	    @Override
	    public int sampleSize() {
	    	return 3;
	    }
	
	    @Override
	    public void fetchSample(float[] sample, int offset) {
	    	switchMode(MODE, SWITCHDELAY);
			port.getShorts(heading, 0 ,3);
			for(int i=0;i<sampleSize();i++) sample[offset+i] = heading[i];
	    }
	
	    @Override
	    public String getName() {
	    	return "Compass";
	    }
	}
	
	public SampleProvider getCompassMode() {
		return getMode(1);
	}
	
	private class GyroMode implements SensorMode {
	    private static final int   MODE = 2;
		private short[] gyro = new short[sampleSize()];
	
	    @Override
	    public int sampleSize() {
	    	return 3;
	    }
	
	    @Override
	    public void fetchSample(float[] sample, int offset) {
	    	switchMode(MODE, SWITCHDELAY);
			port.getShorts(gyro, 0 ,3);
			for(int i=0;i<sampleSize();i++) sample[offset+i] = gyro[i];
	    }
	
	    @Override
	    public String getName() {
	    	return "Gyro";
	    }
	}
	
	public SampleProvider getGyroMode() {
		return getMode(2);
	}
	
	private class AccelerationMode implements SensorMode {
	    private static final int   MODE = 3;
		private short[] accel = new short[sampleSize()];
	
	    @Override
	    public int sampleSize() {
	    	return 3;
	    }
	
	    @Override
	    public void fetchSample(float[] sample, int offset) {
	    	switchMode(MODE, SWITCHDELAY);
			port.getShorts(accel, 0 ,3);
			for(int i=0;i<sampleSize();i++) sample[offset+i] = accel[i];
	    }
	
	    @Override
	    public String getName() {
	    	return "Acceleration";
	    }
	}
	
	public SampleProvider getAccelerationMode() {
		return getMode(3);
	}
}

And a test program:

import lejos.hardware.Button;
import lejos.hardware.lcd.LCD;
import lejos.hardware.port.SensorPort;
import lejos.robotics.SampleProvider;
import lejos.utility.Delay;

public class IMUTest {
	public static void main(String[] args) {
		IMU sensor = new IMU(SensorPort.S1);
		SampleProvider temp = sensor.getTemperatureMode();
		SampleProvider heading = sensor.getCompassMode();
		SampleProvider gyro = sensor.getGyroMode();
		SampleProvider accel = sensor.getAccelerationMode();
		float[] sample = new float[heading.sampleSize()];
		
		LCD.drawString("Temp:", 0, 1);
		LCD.drawString("H X       Y", 0, 2);
		LCD.drawString("  Z", 0, 3);
		LCD.drawString("G X       Y", 0, 4);
		LCD.drawString("  Z", 0, 5);
		LCD.drawString("A X       Y", 0, 6);
		LCD.drawString("  Z", 0, 7);
		
		while(Button.ESCAPE.isUp()) {
			temp.fetchSample(sample, 0);
			LCD.drawInt((int) sample[0], 4, 5, 1);
			heading.fetchSample(sample,0);
			for (int i=0;i<2;i++) LCD.drawInt((int)sample[i], 6, 3 + i*8, 2);
			LCD.drawInt((int)sample[2], 6, 3, 3);
			gyro.fetchSample(sample, 0);
			for (int i=0;i<2;i++) LCD.drawInt((int)sample[i], 6, 3 + i*8, 4);
			LCD.drawInt((int)sample[2], 6, 3, 5);
		    accel.fetchSample(sample, 0);
			for (int i=0;i<2;i++) LCD.drawInt((int)sample[i], 6, 3 + i*8, 6);
			LCD.drawInt((int)sample[2], 6, 3, 7);
		}	
		sensor.close();
	}
}

Webcam as Color Detector

You can also use a webcam to detect colors:

package mypackage;

import java.io.File;
import java.io.IOException;

import lejos.hardware.BrickFinder;
import lejos.hardware.Button;
import lejos.hardware.Sound;
import lejos.hardware.ev3.EV3;
import lejos.hardware.video.Video;

public class ColorDetector {
	private static final int WIDTH = 160;
	private static final int HEIGHT = 120;
	private static final int NUM_PIXELS = WIDTH * HEIGHT;
	private static final File BLACK = new File("black.wav");
	private static final File WHITE = new File("white.wav");
	private static final File RED = new File("red.wav");
	private static final File GREEN = new File("green.wav");
	
	private static long lastPlay = 0;

	public static void main(String[] args) throws IOException  {
		
		EV3 ev3 = (EV3) BrickFinder.getLocal();
		Video video = ev3.getVideo();
		video.open(WIDTH, HEIGHT);
		byte[] frame = video.createFrame();
		
		while(Button.ESCAPE.isUp()) {
			video.grabFrame(frame);
			int tr = 0, tg = 0, tb = 0;
			
			for(int i=0;i<frame.length;i+=4) {
				int y1 = frame[i] & 0xFF;
				int y2 = frame[i+2] & 0xFF;
				int u = frame[i+1] & 0xFF;
				int v = frame[i+3] & 0xFF;
				int rgb1 = convertYUVtoARGB(y1,u,v);
				int rgb2 = convertYUVtoARGB(y2,u,v);
				
				tr += ((rgb1 >> 16) & 0xff) + ((rgb2 >> 16) & 0xff);
				tg += ((rgb1 >> 8) & 0xff) + ((rgb2 >> 8) & 0xff);
				tb += (rgb1 & 0xff) + (rgb2 & 0xff);
			}
			
			float ar = tr/NUM_PIXELS, ag = tg/NUM_PIXELS, ab = tb/NUM_PIXELS;
			System.out.println((int) ar + " , " + (int) ag + " , " + (int) ab);
			if (ar < 10 && ag < 10 && ab < 10) play(BLACK);
			else if (ar > 250 && ag > 250 && ab > 250) play(WHITE);
			else if ( ar > 1.8*ag && ar > 2*ab) play(RED);
			else if ( ag > 1.8*ar && ag > ab) play(GREEN);

		}
		video.close();
	}
	
	private static void play(File file) {
		long now = System.currentTimeMillis();
		
		if (now - lastPlay > 2000) {
			System.out.println("Playing " + file.getName());
			Sound.playSample(file);
			lastPlay = now;
		}
	}
	
	private static int convertYUVtoARGB(int y, int u, int v) {
		int c = y - 16;
		int d = u - 128;
		int e = v - 128;
		int r = (298*c+409*e+128)/256;
		int g = (298*c-100*d-208*e+128)/256;
		int b = (298*c+516*d+128)/256;
	    r = r>255? 255 : r<0 ? 0 : r;
	    g = g>255? 255 : g<0 ? 0 : g;
	    b = b>255? 255 : b<0 ? 0 : b;
	    return 0xff000000 | (r<<16) | (g<<8) | b;
	}
}

You will need to record some mono wav files in a program such as Audacity, export them as 16-bit PCM and upload them to the EV3, in order to get the name of the color spoken when it is detected.

Webcam Streaming

Following on from Andy’s post on Webcam support, here is how to use your Webcam to stream color video to your PC.

The leJOS program is:

package mypackage;

import java.io.BufferedOutputStream;
import java.io.IOException;
import java.net.Socket;

import lejos.hardware.BrickFinder;
import lejos.hardware.Button;
import lejos.hardware.ev3.EV3;
import lejos.hardware.lcd.LCD;
import lejos.hardware.video.Video;

public class Stream {
	private static final int WIDTH = 160;
	private static final int HEIGHT = 120;
	private static final String HOST = "192.168.0.2";
	private static final int PORT = 55555;

	public static void main(String[] args) throws IOException  {

		EV3 ev3 = (EV3) BrickFinder.getLocal();
		Video video = ev3.getVideo();
		video.open(WIDTH, HEIGHT);
		byte[] frame = video.createFrame();

		Socket sock = new Socket(HOST, PORT);
		BufferedOutputStream bos = new BufferedOutputStream(sock.getOutputStream());
		long start = System.currentTimeMillis();
		int frames = 0;
		LCD.drawString("fps:", 0, 2);

		while(Button.ESCAPE.isUp()) {
			try {
				video.grabFrame(frame);
				LCD.drawString("" + (++frames * 1000f/(System.currentTimeMillis() - start)), 5,2);

				bos.write(frame);
				bos.flush();
			} catch (IOException e) {
				break;
			}
		}

		bos.close();
		sock.close();
		video.close();
	}
}

And the PC program is:

package mypackage;

import java.awt.Dimension;
import java.awt.Graphics;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;
import java.awt.image.BufferedImage;
import java.io.BufferedInputStream;
import java.net.ServerSocket;
import java.net.Socket;

import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;

public class CameraFrame {
	private static final int WIDTH = 160;
	private static final int HEIGHT = 120;
	private static final int NUM_PIXELS = WIDTH * HEIGHT;
	private static final int BUFFER_SIZE = NUM_PIXELS * 2;
	private static final int PORT = 55555;

	private ServerSocket ss;
	private Socket sock;
	private byte[] buffer = new byte[BUFFER_SIZE];
	private BufferedInputStream bis;
	private BufferedImage image;
	private CameraPanel panel = new CameraPanel();
	private JFrame frame;

	public CameraFrame() {	
		try {
			ss = new ServerSocket(PORT);
			sock = ss.accept();
			bis = new BufferedInputStream(sock.getInputStream());
		} catch (Exception e) {
			System.err.println("Failed to connect: " + e);
			System.exit(1);
		}

		image = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);
	}

	public void createAndShowGUI() {
		frame = new JFrame("EV3 Camera View");

		frame.getContentPane().add(panel);
		frame.setPreferredSize(new Dimension(WIDTH, HEIGHT));

		frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
	    frame.addWindowListener(new WindowAdapter() {
	        @Override
	        public void windowClosing(WindowEvent e) {
	        	close();
	        }
	    });

		frame.pack();
		frame.setVisible(true);
	}

	public void close() {
    	try {
    		if (bis != null) bis.close();
    		if (sock != null) sock.close();
			if (ss != null) ss.close();
		} catch (Exception e1) {
			System.err.println("Exception closing window: " + e1);
		}
	}

	private int convertYUVtoARGB(int y, int u, int v) {
		int c = y - 16;
		int d = u - 128;
		int e = v - 128;
		int r = (298*c+409*e+128)/256;
		int g = (298*c-100*d-208*e+128)/256;
		int b = (298*c+516*d+128)/256;
	    r = r>255? 255 : r<0 ? 0 : r;
	    g = g>255? 255 : g<0 ? 0 : g;
	    b = b>255? 255 : b<0 ? 0 : b;
	    return 0xff000000 | (r<<16) | (g<<8) | b;
	}

	public void run() {
		while(true) {
			synchronized (this) {
				try {
					int offset = 0;
					while (offset < BUFFER_SIZE) {
						offset += bis.read(buffer, offset, BUFFER_SIZE - offset);
					}
					for(int i=0;i<BUFFER_SIZE;i+=4) {
						int y1 = buffer[i] & 0xFF;
						int y2 = buffer[i+2] & 0xFF;
						int u = buffer[i+1] & 0xFF;
						int v = buffer[i+3] & 0xFF;
						int rgb1 = convertYUVtoARGB(y1,u,v);
						int rgb2 = convertYUVtoARGB(y2,u,v);
						image.setRGB((i % (WIDTH * 2)) / 2, i / (WIDTH * 2), rgb1);
						image.setRGB((i % (WIDTH * 2)) / 2 + 1, i / (WIDTH * 2), rgb2);
					}
				} catch (Exception e) {
					break;
				}
			}
			panel.repaint(1);
		}
	}

	class CameraPanel extends JPanel {
		private static final long serialVersionUID = 1L;

	    @Override
	    protected void paintComponent(Graphics g) {
	        super.paintComponent(g);
	        // Ensure that we don't paint while the image is being refreshed
	        synchronized(CameraFrame.this) {
	        	g.drawImage(image, 0, 0, null);
	        }
	    }	
	}

	public static void main(String[] args) {	
		final CameraFrame cameraFrame = new CameraFrame();
		SwingUtilities.invokeLater(new Runnable() {
			@Override
			public void run() {
				cameraFrame.createAndShowGUI();	
			}
		});
		cameraFrame.run();
	}
}

Webcam support

One of the really cool things about the EV3 is that it runs on top of a fairly standard Linux system. This means that it is reasonably easy to hack the system to do new things and support new devices. One of our users, Gabriel Ferrer, has been doing just that. In this thread and in a series of posts on his blog, Gabe describes the process of enabling video support in the Linux kernel and providing access to it from a leJOS program. Gabe was happy to contribute this great work to leJOS and I’ve now incorporated the code into leJOS and the leJOS SD card (at the moment this support is only available in the git master version). Using this updated version I was able to use the EV3 and my Logitech Pro 9000 to take a “selfie”.

The above image is actually a screen grab of the EV3 LCD display using the leJOS EV3Control program. The program running on the EV3 basically grabs frames from the webcam and converts them into a mono image for display on the EV3. The code to do this is actually pretty simple (and is based on a test program written by Gabe).

import java.io.IOException;  
import lejos.hardware.video.Video;
import lejos.hardware.video.YUYVImage;
import lejos.hardware.BrickFinder;
import lejos.hardware.Button;  
import lejos.hardware.lcd.GraphicsLCD;

   
 public class CameraDemo {  
     public static void main(String[] args) { 
         try {  
             Video wc = BrickFinder.getDefault().getVideo();
             wc.open(160,120);
             byte[] frame = wc.createFrame();
             YUYVImage img = new YUYVImage(frame, wc.getWidth(), wc.getHeight());
             GraphicsLCD g = BrickFinder.getDefault().getGraphicsLCD();
             int threshold = 128;
             boolean auto = true;
             while (!Button.ESCAPE.isDown()) {
                 wc.grabFrame(frame);
                 if (auto)
                     threshold = img.getMeanY();
                 img.display(g, 0, 0, threshold);  
                 if (Button.UP.isDown()) {
                     threshold++;
                     if (threshold > 255)
                         threshold = 255;
                     auto = false;
                 }
                 if (Button.DOWN.isDown()) {
                     threshold--;
                     if (threshold < 0)
                         threshold = 0;
                     auto = false;
                 }
                 if (Button.ENTER.isDown()) {
                     auto = true;
                 }
             }
             wc.close();  
             g.clear();  
         } catch (IOException ioe) {  
             ioe.printStackTrace();  
             System.out.println("Driver exception: " + ioe.getMessage());  
         }  
     }  
 }  

The conversion to monochrome is pretty simplistic: any pixel with a luminance below a defined threshold becomes black, and everything else becomes white. The program uses the UP/DOWN keys to change the threshold setting. ENTER enables an auto mode which simply uses the mean luminance as the threshold.

You may be wondering why we chose this rather odd YUYV format (also known as YUV422 and YUY2) rather than simple RGB pixel values. The above program shows how YUYV can be easier to work with. This format separates the colour information from the luminance, which in this case makes it very easy to perform the threshold operation. It is often useful to perform operations on the luminance of an image (edge detection etc.), so we use this format. We may extend the video classes in the future to support other formats, but for now YUYV is what you get! If you want to know more, including how to convert to RGB, take a look at this Wikipedia entry.
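
As a rough illustration of the layout, here is a minimal sketch (mine, not from the original post) that walks a YUYV frame buffer and computes the mean luminance, which is essentially what YUYVImage.getMeanY() did for us in the program above. Every group of four bytes holds Y1, U, Y2, V: two horizontally adjacent pixels that share the same colour information.

public class YUYVMean {
	// Returns the mean luminance (Y) of a YUYV frame buffer.
	public static int meanLuminance(byte[] frame) {
		long total = 0;
		for (int i = 0; i < frame.length; i += 4) {
			int y1 = frame[i] & 0xFF;     // luminance of the first pixel
			int y2 = frame[i + 2] & 0xFF; // luminance of the second pixel
			// frame[i + 1] and frame[i + 3] are the shared U and V bytes,
			// which the threshold operation does not need at all
			total += y1 + y2;
		}
		return (int) (total / (frame.length / 2));
	}
}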

Uart Sensor using Sparkfun Pro Micro

Here is an example of an EV3 UART sensor based on a Sparkfun Pro Micro Arduino clone and a Dexter Industries breadboard adapter. It uses the EV3UARTEmulationHard library. This could be used with a variety of cheap sensors; it is shown here with an MQ-2 gas sensor. There is also a Bluetooth serial adapter, which could be used to access the sensor wirelessly.

The EV3UARTEmulationHard library is the same as the EV3UARTEmulation library, except that it uses hardware serial rather than software serial. It is suitable for Arduino devices such as the Leonardo, the Sparkfun Pro Micro or the Mega, which have a Serial1 hardware UART. For those devices, the USB serial device can be used for diagnostics and Serial1 can be used for the EV3 communication. Hardware serial should be able to support higher bit rates than software serial.

The Arduino code for this device, using EV3UARTEmulationHard is:

#include <EV3UARTEmulationHard.h>

EV3UARTEmulation sensor(&Serial1, 99, 38400);

void setup() {
  Serial.begin(9600);
  while(!Serial1);
  sensor.create_mode("TEST", true, DATA16, 1, 3, 0);
  sensor.reset();
}

unsigned long last_reading = 0;

void loop() {
  sensor.heart_beat();
  if (millis() - last_reading > 100) {
    int r = analogRead(0);
    sensor.send_data16(r);
    Serial.print("Reading: ");
    Serial.println(r);
    last_reading = millis();
  }
}

UART Compass Sensor

This is an example of a homemade EV3 UART sensor using an HMC5883L compass module, which can be bought cheaply from eBay or Amazon. I am using an Arduino Uno, but a smaller device like an Arduino Nano or a Sparkfun Pro Micro would be better.

The code below implements two modes: raw and calculated heading. The EV3 UART protocol only supports sending a number of data items that is a power of 2, so 4 items rather than 3 are sent for the raw mode. The EV3UARTEmulation library could do the necessary padding of the data, but currently does not.

The heading is not adjusted for tilt of the sensor. See the HMC5883L library and example program for more details on this.

Here is the Arduino code:

#include <SoftwareSerial.h>
#include <EV3UARTEmulation.h>

EV3UARTEmulation sensor(10, 11, 99, 38400);

// Reference the I2C Library
#include <Wire.h>
// Reference the HMC5883L Compass Library
#include <HMC5883L.h>

// Store our compass as a variable.
HMC5883L compass;
// Record any errors that may occur in the compass.
int error = 0;

void setup() {
 Serial.begin(9600);
 sensor.create_mode("RAW", true, DATA16, 3, 5, 0);
 sensor.create_mode("DEGREES", true, DATA16, 1, 3, 0);
 sensor.reset();

 Serial.println("Starting the I2C interface.");
 Wire.begin(); // Start the I2C interface.

 Serial.println("Constructing new HMC5883L");
 compass = HMC5883L(); // Construct a new HMC5883 compass.

 Serial.println("Setting measurement mode to continous.");
 error = compass.SetMeasurementMode(Measurement_Continuous); // Set the measurement mode to Continuous
 if(error != 0) // If there is an error, print it out.
 Serial.println(compass.GetErrorText(error));

 Serial.println("Setting scale to +/- 1.3 Ga");
 error = compass.SetScale(1.3); // Set the scale of the compass.
 if(error != 0) // If there is an error, print it out.
 Serial.println(compass.GetErrorText(error));
}

short heading[4];
unsigned long last_reading = 0;

void loop() {
 sensor.heart_beat();
 if (millis() - last_reading > 100) {
 MagnetometerRaw raw;
 MagnetometerScaled scaled;
 switch (sensor.get_current_mode()) {
 case 0:
   // Retrieve the raw values from the compass (not scaled).
   raw = compass.ReadRawAxis();
   heading[0] = raw.XAxis;
   heading[1] = raw.YAxis;
   heading[2] = raw.ZAxis;
   Serial.print("Raw X: ");
   Serial.print(raw.XAxis);
   Serial.print(" Y: ");
   Serial.print(raw.YAxis);
   Serial.print(" Z: ");
   Serial.println(raw.ZAxis);
   sensor.send_data16(heading, 4);
   break;
 case 1:
   // Retrieve the scaled values from the compass (scaled to the configured scale).
   scaled = compass.ReadScaledAxis();

   // Calculate heading when the magnetometer is level, then correct for signs of axis.
   short degrees = atan2(scaled.YAxis, scaled.XAxis) * 180/M_PI;

   if (degrees < 0) degrees += 360;
   Serial.print("Heading: ");
   Serial.println(degrees);
   sensor.send_data16(degrees);
   break;
 }
 last_reading = millis();
 }
}

The corresponding Java code for the Compass sensor is:

import lejos.hardware.port.Port;
import lejos.hardware.sensor.SensorMode;
import lejos.hardware.sensor.UARTSensor;
import lejos.robotics.SampleProvider;

public class CompassSensor extends UARTSensor {
 private static final long SWITCHDELAY = 200;

 public CompassSensor(Port port) {
   super(port);
   setModes(new SensorMode[] { new RawMode(), new HeadingMode()});
 }

 private class RawMode implements SensorMode {
   private static final int MODE = 0;
   private short[] raw = new short[sampleSize()];

   @Override
   public int sampleSize() {
     return 3;
   }

   @Override
   public void fetchSample(float[] sample, int offset) {
     switchMode(MODE, SWITCHDELAY);
     port.getShorts(raw,0,sampleSize());
     for(int i=0;i<sampleSize();i++) sample[offset+i] = raw[i];
   }

   @Override
   public String getName() {
     return "Raw";
   }
 }

 public SampleProvider getRawMode() {
   return getMode(0);
 }

 private class HeadingMode implements SensorMode {
   private static final int MODE = 1;
   private int heading;

   @Override
   public int sampleSize() {
     return 1;
   }

   @Override
   public void fetchSample(float[] sample, int offset) {
     switchMode(MODE, SWITCHDELAY);
     heading = port.getShort();
     sample[offset] = heading;
   }

   @Override
   public String getName() {
     return "Heading";
   }
 }

 public SampleProvider getHeadingMode() {
   return getMode(1);
 }
}

And here is a test program:

import lejos.hardware.Button;
import lejos.hardware.lcd.LCD;
import lejos.hardware.port.SensorPort;
import lejos.robotics.SampleProvider;
import lejos.utility.Delay;

public class CompassTest {
	public static void main(String[] args) {
		CompassSensor sensor = new CompassSensor(SensorPort.S1);
		SampleProvider raw = sensor.getRawMode();
		SampleProvider heading = sensor.getHeadingMode();
		float[] sample = new float[raw.sampleSize()];
		float[] degrees = new float[1];
		LCD.drawString("Raw:", 3, 0);
		LCD.drawString("X:",0, 3);
		LCD.drawString("Y:",0, 4);
		LCD.drawString("Z:",0, 5);
		LCD.drawString("Heading:", 0, 7);

		while(Button.ESCAPE.isUp()) {
			raw.fetchSample(sample,0);
			for (int i=0;i<3;i++) LCD.drawInt((int)sample[i],5,4,3+i);
			heading.fetchSample(degrees, 0);
			LCD.drawInt((int) degrees[0], 3, 10, 7);
			Delay.msDelay(1000);
		}
		sensor.close();
	}
}

Getting heading from a gyro

The EV3 comes with a gyro sensor. This sensor measures rate of turn or angle about the vertical axis. Rate of turn is the speed at which the sensor rotates; it is also known as angular speed and is expressed (in leJOS) in degrees per second. Angle is the change in position of the sensor since the sensor was first used, or since the last time it was reset. Angle is very much like a compass heading. The difference is that a compass measures the angle from an absolute heading, north, whereas the gyro in angle mode measures from a relative heading, its starting position.

Angle is not measured directly by the sensor. The sensor can only measure rate of turn directly; angle is calculated from rate of turn by integration. This calculation is very sensitive to small errors in rate of turn, so Lego advises resetting the sensor before every turn. During a reset the angle is reset to zero, but more importantly, the offset error of the sensor is recalculated. The offset error is the difference between the actual rate of turn and the rate of turn the sensor measures. If this error is known, one can correct for it. To determine this error the sensor must be motionless. The offset error is not constant, so it must be recalculated regularly.
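
To make the integration step concrete, here is a minimal sketch (mine, not part of leJOS) of how an angle is accumulated from rate-of-turn readings, and of why the offset matters: an offset error of just 0.1 degrees per second grows into an angle error of 6 degrees after a minute.

public class NaiveGyroIntegrator {
	private double angle = 0;   // accumulated angle in degrees
	private double offset;      // estimated reading when the sensor is motionless
	private long lastTime = System.currentTimeMillis();

	public NaiveGyroIntegrator(double offset) {
		this.offset = offset;
	}

	// Call whenever a new rate-of-turn reading (in degrees per second) is available
	public void update(double rate) {
		long now = System.currentTimeMillis();
		double dt = (now - lastTime) / 1000.0; // seconds since the last reading
		angle += (rate - offset) * dt;         // integrate the corrected rate
		lastTime = now;
	}

	public double getAngle() {
		return angle;
	}
}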

There are several third party gyro sensors available for the NXT and EV3. Some are able to measure rate of turn over three axes, but none is able to provide angle. This is a clear advantage of the Lego sensor. With leJOS, however, it is very easy to calculate angle from rate of turn. All you need to do is apply two standard filters, the OffsetCorrectionFilter and the IntegrationFilter. The OffsetCorrectionFilter is applied first; the IntegrationFilter comes second. The OffsetCorrectionFilter uses advanced statistics to continuously update the offset error, so there is no need to instruct it to recalculate it. You only have to make sure that the sensor is motionless during the first few reads (>15) from the sensor. The IntegrationFilter also needs little care. It performs best when the sensor is queried often, but it is not necessary to read it at precise regular intervals. You can reset the accumulated angle every so often. If you happen to have another reference, like a compass, you can use the angle it provides in the reset. This gives your gyro-generated angle an absolute reference.
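
Here is a sketch of how the two filters chain together, assuming the EV3 gyro sensor in rate mode on port S2 and the single-argument constructors of OffsetCorrectionFilter and IntegrationFilter; keep the sensor motionless for the first reads so the offset estimate can settle.

import lejos.hardware.Button;
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3GyroSensor;
import lejos.robotics.SampleProvider;
import lejos.robotics.filter.IntegrationFilter;
import lejos.robotics.filter.OffsetCorrectionFilter;
import lejos.utility.Delay;

public class GyroHeading {
	public static void main(String[] args) {
		EV3GyroSensor gyro = new EV3GyroSensor(SensorPort.S2);
		SampleProvider rate = gyro.getRateMode();

		// Correct the offset first, then integrate rate of turn into an angle
		SampleProvider corrected = new OffsetCorrectionFilter(rate);
		SampleProvider angle = new IntegrationFilter(corrected);

		float[] sample = new float[angle.sampleSize()];
		while (Button.ESCAPE.isUp()) {
			angle.fetchSample(sample, 0);
			System.out.println("Angle: " + sample[0]);
			Delay.msDelay(100);
		}
		gyro.close();
	}
}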

It is interesting to know how a software solution performs compared to the Lego gyro sensor. I ran some simple tests to compare three different setups: a Dexter Industries dIMU sensor using the OffsetCorrection and Integration filters, an EV3 gyro sensor in angle mode, and the same EV3 gyro sensor in rate mode using the two filters. I tested the drift over a period of 5 minutes while the sensors were motionless. I also tested the outcome of a ninety degree turn and back, at high (90 degrees/s) and at slow (22 degrees/s) speed. The three setups were tested simultaneously, thus cancelling out the effect of variation in test conditions. I turned the test device by hand.
Below are the test results. The row labelled RMS in each group is the root mean squared error from the expected value; the bigger this number, the bigger the error.

Test                  dIMU   EV3 angle   EV3 rate
5 min drift              2           0         20
                         0           0         36
  RMS                  1.0         0.0       20.6
90 degree turn          87          91         88
                        89          91         89
                        88          91         89
                        87          90         88
  RMS                  2.4         0.9        1.6
Back                     0           1          2
                         0           2          4
                         0           1          2
                        -1           0          1
  RMS                  0.5         1.2        2.5
Slow 90 degree turn     89          87         79
                        87          92         87
                        87          92         86
                        89          90         85
  RMS                  2.2         2.1        6.5
Slow back               -1          -4         -6
                        -3           5          7
                        -1           4          4
                         0           1          3
  RMS                  1.7         3.8        5.2

The results show that the software solution does not work very well on the EV3 gyro sensor; this setup gave the biggest errors. I think this is because the rate that this sensor returns to the brick is rounded (or truncated) to whole degrees. The resolution of the dIMU sensor is much better, 1/70 of a degree, and it gives much better results. The EV3 gyro sensor in angle mode gives the best absolute results, although not much better than the software solution. It is interesting to see that the dIMU performs best in the turn and turn back test. This makes me believe that the dIMU has a scaling error. It would take another filter to compensate for that.

The EV3 in angle mode seems not to drift at all. In rate mode the drift is huge, all on account of rounding errors. The dIMU has very little drift. A test with really, really slow turns showed that the angle given by the EV3 in angle mode remains zero, even after a ninety degree turn, so something in the sensor makes it ignore very small changes.

Overall, both the software solution and the EV3 sensor in angle mode perform satisfactorily. There is no clear advantage to either of them. If one goes for the software solution, one must use a sensor that has a good resolution.

The tests were run using leJOS 0.8.2 beta. This is not released at the time of writing. The OffsetCorrectionFilter of earlier releases does not perform well.