
Webcam HTTP streaming

To follow up my post on webcam streaming, here is an example of HTTP MJPEG streaming. This allows you to open the stream in a browser using a URL such as http://10.0.1.1:8080/. Chrome, however, will not display the raw stream at that URL directly; it must be embedded in an HTML document, e.g.:

<html>
<body>
  <img src="http://10.0.1.1:8080"/>
</body>
</html>
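
If you just want to check that the EV3 is serving frames at all, a tool such as curl can dump the raw multipart stream to a file for inspection (stop it with Ctrl-C, since the stream never ends):

curl http://10.0.1.1:8080/ --output stream.bin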

Here is the EV3 code, which could be improved a lot:

package mypackage;

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

import javax.imageio.ImageIO;

import lejos.hardware.BrickFinder;
import lejos.hardware.Button;
import lejos.hardware.ev3.EV3;
import lejos.hardware.video.Video;

public class HttpStream {

    private static final int WIDTH = 160;
    private static final int HEIGHT = 120;
    private static final int NUM_PIXELS = WIDTH * HEIGHT;
    // The webcam delivers YUYV frames: 2 bytes per pixel
    private static final int FRAME_SIZE = NUM_PIXELS * 2;

    public static void main(String[] args) throws IOException {
        EV3 ev3 = (EV3) BrickFinder.getLocal();
        Video video = ev3.getVideo();
        video.open(WIDTH, HEIGHT);
        byte[] frame = video.createFrame();
        BufferedImage img = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);

        // Wait for a single browser connection on port 8080
        ServerSocket ss = new ServerSocket(8080);
        Socket sock = ss.accept();
        String boundary = "Thats it folks!";
        writeHeader(sock.getOutputStream(), boundary);

        while (Button.ESCAPE.isUp()) {
            video.grabFrame(frame);
            // Each 4-byte group encodes two adjacent pixels as Y1 U Y2 V,
            // with the U and V components shared between the two pixels
            for (int i = 0; i < FRAME_SIZE; i += 4) {
                int y1 = frame[i] & 0xFF;
                int y2 = frame[i+2] & 0xFF;
                int u = frame[i+1] & 0xFF;
                int v = frame[i+3] & 0xFF;
                int rgb1 = convertYUVtoARGB(y1, u, v);
                int rgb2 = convertYUVtoARGB(y2, u, v);
                // A row is WIDTH * 2 bytes, so convert the byte offset to pixel coordinates
                img.setRGB((i % (WIDTH * 2)) / 2, i / (WIDTH * 2), rgb1);
                img.setRGB((i % (WIDTH * 2)) / 2 + 1, i / (WIDTH * 2), rgb2);
            }
            writeJpg(sock.getOutputStream(), img, boundary);
        }

        video.close();
        sock.close();
        ss.close();
    }

    // Send the HTTP response header announcing an MJPEG (multipart/x-mixed-replace) stream
    private static void writeHeader(OutputStream stream, String boundary) throws IOException {
        stream.write(("HTTP/1.0 200 OK\r\n" +
                "Connection: close\r\n" +
                "Max-Age: 0\r\n" +
                "Expires: 0\r\n" +
                "Cache-Control: no-store, no-cache, must-revalidate, pre-check=0, post-check=0, max-age=0\r\n" +
                "Pragma: no-cache\r\n" +
                "Content-Type: multipart/x-mixed-replace; " +
                "boundary=" + boundary + "\r\n" +
                "\r\n" +
                "--" + boundary + "\r\n").getBytes());
    }

    // JPEG-encode the image and write it as one part of the multipart stream
    private static void writeJpg(OutputStream stream, BufferedImage img, String boundary) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, "jpg", baos);
        byte[] imageBytes = baos.toByteArray();
        stream.write(("Content-type: image/jpeg\r\n" +
                "Content-Length: " + imageBytes.length + "\r\n" +
                "\r\n").getBytes());
        stream.write(imageBytes);
        stream.write(("\r\n--" + boundary + "\r\n").getBytes());
    }

    // Standard integer approximation of the BT.601 YUV to RGB conversion,
    // with each component clamped to the range 0-255
    private static int convertYUVtoARGB(int y, int u, int v) {
        int c = y - 16;
        int d = u - 128;
        int e = v - 128;
        int r = (298 * c + 409 * e + 128) / 256;
        int g = (298 * c - 100 * d - 208 * e + 128) / 256;
        int b = (298 * c + 516 * d + 128) / 256;
        r = r > 255 ? 255 : r < 0 ? 0 : r;
        g = g > 255 ? 255 : g < 0 ? 0 : g;
        b = b > 255 ? 255 : b < 0 ? 0 : b;
        return 0xff000000 | (r << 16) | (g << 8) | b;
    }
}
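
For reference, the resulting byte stream is a single HTTP response that never ends: the header from writeHeader, then one boundary-delimited JPEG part per frame, roughly:

HTTP/1.0 200 OK
Content-Type: multipart/x-mixed-replace; boundary=Thats it folks!
(other headers as above)

--Thats it folks!
Content-type: image/jpeg
Content-Length: <size of the JPEG data>

<JPEG data>
--Thats it folks!
Content-type: image/jpeg
...

The browser replaces the displayed image each time a new part arrives, which is what produces the video effect.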

You won’t get a very good frame rate from this. It would be faster if the camera produced JPEG output rather than a YUV format image, as converting to JPEG in Java is slow. This program uses java.awt.image and javax.imageio classes, so it won’t work if you are using a Java 8 profile that omits them.
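
If the JPEG encoding turns out to be the bottleneck, one thing that may help is lowering the compression quality, which shrinks each frame and can reduce the encoding work. Here is a minimal, untested sketch of a replacement for the ImageIO.write call in writeJpg, using the standard javax.imageio writer API (it has the same Java 8 profile restriction as the rest of the program):

import javax.imageio.IIOImage;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

// Encode img into baos as a lower-quality, smaller JPEG
ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
ImageWriteParam param = writer.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionQuality(0.3f); // 0.0 = smallest file, 1.0 = best quality
ByteArrayOutputStream baos = new ByteArrayOutputStream();
writer.setOutput(new MemoryCacheImageOutputStream(baos));
writer.write(null, new IIOImage(img, null, null), param);
writer.dispose();
byte[] imageBytes = baos.toByteArray();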

Discussion

3 thoughts on “Webcam HTTP streaming”

  1. Hello. What about the microphone? How do I use it?

    Posted by MotoR | 2015/09/15, 12:36
    • leJOS does not currently support USB microphones. It is possible, but we would have to add the necessary kernel drivers, etc. to the Linux system.

      Posted by Lawrie Griffiths | 2015/11/16, 12:39
  2. Hi Lawrie,

    I have an application similar to this, in which I want to send a BufferedImage over a socket to be read by a client program on the PC.

    At the server (EV3) end, the critical bit is, as you have above, e.g.

    video.grabFrame(frame);
    image = convertFrameToBuffImage(frame); // a function to convert YUV to RGB, as above

    OutputStream stream = videoSock.getOutputStream();
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    ImageIO.write(image, "JPEG", baos);
    byte[] imageBytes = baos.toByteArray();
    stream.write(imageBytes);
    stream.flush();

    At the client end I have:

    private static final int WIDTH = 160;
    private static final int HEIGHT = 120;

    private static final int NUM_PIXELS = WIDTH * HEIGHT;
    private static final int BUFFER_SIZE = NUM_PIXELS * 2;
    private static byte[] buffer = new byte[BUFFER_SIZE];

    int offset = 0;
    while (offset < BUFFER_SIZE)
    {
        offset += (videoSock.getInputStream()).read(buffer, offset, BUFFER_SIZE - offset);
    }
    image = ImageIO.read(new ByteArrayInputStream(buffer));

    This does not work (the image does not display when assigned to a JPanel).

    Previously, I had the code here working well:
    https://lejosnews.wordpress.com/2014/09/04/webcam-streaming/
    where the frame is sent as a byte array and the conversion from YUV to RGB and assignment to a BufferedImage is done at the PC end.

    The reason I now want to do the image manipulation at the EV3 end is that OpenCV has become available for the brick, so I don't have to do it on the PC. Also, I was hoping to compress the image data a bit, as I am using Bluetooth rather than TCP/IP and it is quite slow. But if, as you say, JPEG conversion is also slow, perhaps I won't see much better performance.

    What I don't understand about the method of sending a BufferedImage above is this. In the old code, I just grabbed a frame from the camera and sent it as a bunch of bytes to the PC. On the PC end, I read the bytes into a buffer, did some manipulations on them (i.e. the YUV to RGB conversion above) and built a BufferedImage. When sending a BufferedImage, however, I have to convert the bytes to an image format (JPEG, PNG, etc.) for ImageIO. Surely more data than just the pixel values is sent? In which case, how is it unpicked at the receiving end? Perhaps this is why it doesn't display?

    Thanks for your help.

    Rob

    Posted by Rob Buttrose | 2016/03/08, 05:03
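
A note on the framing problem in the comment above: a JPEG is variable-length, so reading a fixed BUFFER_SIZE (which is the size of the raw YUV frame, not of the encoded image) will not line up with what was actually sent. One simple fix is to length-prefix each frame. A minimal sketch, reusing the variable names from the snippets above:

// Server (EV3) end: send a 4-byte length, then the JPEG bytes
DataOutputStream out = new DataOutputStream(videoSock.getOutputStream());
out.writeInt(imageBytes.length);
out.write(imageBytes);
out.flush();

// Client (PC) end: read the length, then exactly that many bytes
DataInputStream in = new DataInputStream(videoSock.getInputStream());
int length = in.readInt();
byte[] data = new byte[length];
in.readFully(data);
BufferedImage image = ImageIO.read(new ByteArrayInputStream(data));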
