
MIDI controlled robots

When I was a young kid we had the most intriguing device in our local department store. It was a glass box with monkeys in it. Not real monkeys, but to me they seemed alive. Each monkey had an instrument: a flute, a banjo, drums, and so on, and they sat amid nice scenery with palm trees. Every time you put a dime in that box the monkeys would play a song. It was always the same song, but I never got bored with it.

Now I am not so young anymore, and I decided to recreate my childhood fascination using Mindstorms. I wanted to create a band of small robots, each with an instrument. They should be able to play together and give the impression that they actually make music, just like the monkeys from my youth.
But I also wanted to improve on the monkeys. They always played the same song and, as I found out when I grew up, they didn't even play in sync with the music. The robots should be able to do both: play different songs and play in time as well.

I created a prototype of the software and a rudimentary robot to prove that this could work. This prototype drew the attention of Matthias Paul Scholz, who then teamed up with me to further develop the concept.

I am proud to now show you our music playing robots. I hope they will intrigue you as those monkeys intrigued me. Here are a few examples of the robots playing music; below I will explain how they work.



(Note: For some examples I had to overdub the music because of disturbances from the ultrasonic sensor. That is why you don't always hear the motors running.)

Like it? Curious how it works? This is how. The music you hear is generated by a synthesizer that interprets a MIDI sequence. MIDI is the standard for electronic music. It is a set of instructions, called messages, that tell electronic instruments what sound to create. A MIDI message might say "make sounds like a piano"; the next message could be "play note x", and so on. MIDI messages can be generated by a keyboard. They can also be stored in a file. You can find thousands of MIDI files on the Internet.
Most MIDI devices are electronic instruments, like synthesizers, drum machines or samplers. But that is not a rule; anything that deals with MIDI messages is a MIDI device. The software that drives these robots is a MIDI device too. This software is called BoR, short for Band of Robots.
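To give an idea of what such messages look like in code, here is a small example using the javax.sound.midi classes of the Java sound API; the note, velocity and channel values are arbitrary.

    import javax.sound.midi.ShortMessage;

    public class MidiMessageExample {
        public static void main(String[] args) throws Exception {
            // "Make sounds like a piano": a program change to instrument 0
            // (acoustic grand piano) on channel 0
            ShortMessage programChange = new ShortMessage();
            programChange.setMessage(ShortMessage.PROGRAM_CHANGE, 0, 0, 0);

            // "Play note x": a note-on for middle C (60) at velocity 93 on channel 0
            ShortMessage noteOn = new ShortMessage();
            noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);

            System.out.println("Created " + programChange.getLength() + "-byte and "
                    + noteOn.getLength() + "-byte MIDI messages");
        }
    }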

Let us look at the internals of this software. The ultimate goal of the software is to translate a MIDI message into an action of a robot. The sum of all these actions should give the impression of a robot playing music.
The MIDI messages are generated from a MIDI file by a device called a sequencer. The BoR software uses the MIDI sequencer that ships with the Java sound API. The sequencer is a transmitter of MIDI messages and implements the Transmitter interface (from the Java sound API). One can connect other MIDI devices, like synthesizers, to transmitters like the sequencer to process the MIDI messages. In the Java sound API these devices implement the Receiver interface. One can connect any number of transmitters to any number of receivers. In the case of the BoR software there are two receivers. One is a software synthesizer that is also part of the Java sound API; this device is responsible for generating sound. The other is the BoRConductor; its job is to filter and distribute MIDI messages to the robots.
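A minimal sketch of that wiring, using the standard javax.sound.midi API, could look as follows. The file name is a placeholder and BoRConductor stands in for the real conductor class (a simplified sketch of it appears further down); this is not the actual BoR code.

    import javax.sound.midi.*;
    import java.io.File;

    public class BandSetup {
        public static void main(String[] args) throws Exception {
            // Ask for a sequencer that is not auto-connected to the default
            // synthesizer, so both receivers can be wired up explicitly.
            Sequencer sequencer = MidiSystem.getSequencer(false);
            Synthesizer synthesizer = MidiSystem.getSynthesizer();
            sequencer.open();
            synthesizer.open();

            // Receiver 1: the Java sound synthesizer, responsible for the audio
            sequencer.getTransmitter().setReceiver(synthesizer.getReceiver());

            // Receiver 2: the conductor that filters and distributes messages
            // to the robots (simplified placeholder for the real BoRConductor)
            Receiver conductor = new BoRConductor();
            sequencer.getTransmitter().setReceiver(conductor);

            // Load a MIDI file and start playback
            Sequence song = MidiSystem.getSequence(new File("song.mid"));
            sequencer.setSequence(song);
            sequencer.start();
        }
    }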

The first task of the BoRConductor is to filter the MIDI messages it receives. Not all messages should be translated into robot actions.
Its second task is to distribute messages over the different robots. You want the piano player to receive only the piano messages, not the percussion messages. Distribution is based on MIDI channels. MIDI messages are grouped in channels, and a channel normally corresponds to an instrument. The BoR software lets you assign a robot to a channel; it will only receive the messages coming from that channel. This is how each robot is able to play a different part of the song. There are two ways to assign robots to channels: as a musician or as a singer. This allows the robots to play the music on their instrument and to sing a melody at the same time without messing the two up. The assignments are specific to each song, as the structure of MIDI files varies and the instruments used in each song differ.
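The sketch below shows, in simplified form, how such filtering and channel-based distribution can be implemented as a Receiver. The real BoRConductor is more elaborate, and the BoRBrick type and its enqueue method are assumptions used for illustration (a sketch of it follows the next paragraph).

    import javax.sound.midi.*;
    import java.util.HashMap;
    import java.util.Map;

    // Simplified conductor: keeps only real note-on messages and forwards each
    // one to the robot assigned to its MIDI channel.
    public class BoRConductor implements Receiver {
        private final Map<Integer, BoRBrick> musicians = new HashMap<>();

        public void assignMusician(int channel, BoRBrick robot) {
            musicians.put(channel, robot);
        }

        @Override
        public void send(MidiMessage message, long timeStamp) {
            if (!(message instanceof ShortMessage)) {
                return;                               // filter: ignore meta and sysex messages
            }
            ShortMessage sm = (ShortMessage) message;
            if (sm.getCommand() != ShortMessage.NOTE_ON || sm.getData2() == 0) {
                return;                               // filter: only note-ons with a real velocity
            }
            BoRBrick robot = musicians.get(sm.getChannel());
            if (robot != null) {
                robot.enqueue(sm.getData1(), sm.getData2());   // tone, intensity
            }
        }

        @Override
        public void close() {
        }
    }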

So the BoRConductor takes care of filtering and distribution. But it does not transport the messages to the robots. For this it hands the messages over to objects that represent the robots in the band. These objects are called BoRBricks. They accept the MIDI messages from the BoRConductor, store them in a buffer and send them to the brick as soon as the brick is ready to accept a message. There is also some filtering done here. If the messages come in faster than the robot can process them, all messages except the most recent one are discarded. This ensures that the robots never lag behind the music.
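One way to get this "keep only the newest message" behaviour is a one-slot buffer that a worker thread drains towards the robot. The sketch below is an assumption about how a BoRBrick could do this, not the actual implementation; Musician is the remote interface of the robot, described in the next paragraphs.

    import java.util.concurrent.atomic.AtomicReference;

    // Simplified BoRBrick: a one-slot buffer in which a newer note simply
    // overwrites an older one that the robot has not consumed yet.
    public class BoRBrick {
        private static class Note {
            final int tone;
            final int intensity;
            Note(int tone, int intensity) {
                this.tone = tone;
                this.intensity = intensity;
            }
        }

        private final AtomicReference<Note> pending = new AtomicReference<>();
        private final Musician robot;   // RMI stub of the robot, see below

        public BoRBrick(Musician robot) {
            this.robot = robot;
            Thread worker = new Thread(this::drain);
            worker.setDaemon(true);
            worker.start();
        }

        // Called by the conductor; overwrites any note the robot has not played yet
        public void enqueue(int tone, int intensity) {
            pending.set(new Note(tone, intensity));
        }

        // Sends the most recent note to the robot as soon as the robot is ready
        private void drain() {
            try {
                while (true) {
                    Note note = pending.getAndSet(null);
                    if (note == null) {
                        Thread.sleep(1);              // nothing new, check again shortly
                    } else {
                        robot.noteOn(note.tone, note.intensity);   // blocks until the robot accepts it
                    }
                }
            } catch (Exception e) {
                // connection to the robot lost or thread interrupted: stop draining
            }
        }
    }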
The BoRBricks translate the MIDI messages into remote object calls to the robot. This Java technique, called RMI (Remote Method Invocation), allows a method of an object that exists on a remote machine to be executed. In this case the remote machine is the robot. RMI uses TCP to communicate with the remote machine, which makes it easy to use an existing (wireless) network for transport.
On the receiving side are the robots. Each robot is an RMI server capable of executing remotely invoked methods. These methods are defined by the Musician interface. Its most important methods are noteOn and voiceOn. These methods have two parameters, tone and intensity: tone gives information about the pitch and intensity about the volume of the note. So what started as a MIDI message on a PC is now a method invocation on the EV3.
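In standard Java RMI terms, such an interface could look roughly like the sketch below. The method names follow the description above; the exact signatures in the real BoR code are an assumption.

    import java.rmi.Remote;
    import java.rmi.RemoteException;

    // Remote interface implemented by every robot in the band (sketch).
    public interface Musician extends Remote {
        // Play a note on the instrument: tone encodes the pitch, intensity the volume
        void noteOn(int tone, int intensity) throws RemoteException;

        // Sing a note, for robots that are assigned to a channel as a singer
        void voiceOn(int tone, int intensity) throws RemoteException;
    }

On the PC side a BoRBrick would obtain a stub of this interface with something like (Musician) Naming.lookup("//" + robotAddress + "/musician"), where robotAddress is the network address of the EV3 and the registry name is just an illustrative placeholder.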
How to implement these methods is up to the robot that implements the Musician interface. It is different for each robot, but in general a call is translated into some kind of motor action that makes the robot move. The guitar player, for example, positions its left hand according to the pitch and moves its right hand at a speed related to the volume.
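As an illustration of such a translation, here is a rough guitar-player-style sketch using the leJOS EV3 motor classes. The motor ports, the gearing and the mapping from pitch to fret position are invented for the example and do not describe the actual robot.

    import lejos.hardware.motor.EV3LargeRegulatedMotor;
    import lejos.hardware.port.MotorPort;

    // Sketch of a guitar-playing Musician: the left hand selects the pitch,
    // the right hand strums at a speed derived from the note's intensity.
    public class GuitarPlayer implements Musician {
        private final EV3LargeRegulatedMotor leftHand = new EV3LargeRegulatedMotor(MotorPort.A);
        private final EV3LargeRegulatedMotor rightHand = new EV3LargeRegulatedMotor(MotorPort.B);
        private boolean strumUp = false;

        @Override
        public void noteOn(int tone, int intensity) {
            // Map the MIDI pitch (0-127) onto a fretboard position in degrees (invented scaling)
            int fretAngle = (tone - 40) * 10;
            leftHand.rotateTo(fretAngle, true);           // position the left hand, do not block

            // Strum faster for louder notes (intensity 0-127, invented scaling)
            rightHand.setSpeed(100 + intensity * 5);
            rightHand.rotateTo(strumUp ? 0 : 90, true);   // alternate up and down strokes
            strumUp = !strumUp;
        }

        @Override
        public void voiceOn(int tone, int intensity) {
            // This robot plays rather than sings, so voiceOn does nothing here
        }
    }

On the EV3 such an object would also have to be exported and registered as an RMI server before the PC can reach it; that plumbing is left out of the sketch.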

To control the BoR software we created a shell interface. The interface accepts commands to select a song, to assign robots to MIDI channels and to start and stop a song. One can also run playlists of songs.
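A session could then look something like the transcript below; the command names here are invented to illustrate the idea and are not the real BoR shell syntax.

    load songs/example.mid
    assign drummer channel 10 musician
    assign singer channel 1 singer
    play
    stop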

With the BoR software Matthias and I created a band called the S3NSORH3ADS. It currently has four members: a singer, a piano player, a guitar player and a drummer. We were given the opportunity to show the band at a robot event in Prague. The expressions on the faces of the audience told me that I had succeeded in my initial intention: to fascinate.


