Wednesday, October 28, 2009

Tuesday, October 20, 2009

Web Cam Color Data -> Arduino Output

One of the issues we've been looking into is creating some sort of link between the input and output mechanisms. We've looked at using libraries such as OpenCV or GSVideo to grab webcam data and output the color values or play notes. We've also looked into using Arduinos to drive stepper motors. So far there has been no interaction between the two, and that's part of what this demo tried to accomplish.

The basic concept behind this demo was to grab some sort of data from the webcam and have it affect the Arduino's actions. This is more of a "behind the scenes" type of obstacle: finding a way to communicate between multiple independent groups of functionality.

The main goals of this demo were as follows:
  • Interaction between the webcam and the Arduino
  • Interoperability between different pieces of the project - moving everything to Eclipse
  • Pulling stress off the Arduino, letting the computer do the calculations and just pass pin manipulations down so the Arduino doesn't have to worry about them
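The "just pass pin manipulations" goal can be sketched as a trivial serial message format. To be clear, this is a toy illustration of the idea, not the actual Firmata wire protocol, and all names here are mine:

```java
// Toy sketch: the computer does the thinking and sends the Arduino only
// bare (pin, value) commands, two bytes each. Illustrative only - this is
// NOT the real Firmata message format.
public class PinCommand {
    // Pack a pin number and a 0-255 value into a two-byte message.
    public static byte[] encode(int pin, int value) {
        return new byte[] { (byte) pin, (byte) value };
    }

    // Unpack a message back into {pin, value}.
    public static int[] decode(byte[] msg) {
        return new int[] { msg[0] & 0xFF, msg[1] & 0xFF };
    }

    public static void main(String[] args) {
        byte[] m = encode(13, 200);
        int[] d = decode(m);
        System.out.println(d[0] + " " + d[1]); // 13 200
    }
}
```

The point is that the firmware side only ever sees dumb pin writes; all interpretation of the color data stays on the computer.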

Hardware:
Though the hardware is not the focus of the demo, it is difficult to understand pieces of the demo without understanding how it's structured.

The parts were mostly selected by what I had around.
There are 17 LEDs: 8 red, 8 green and 1 blue.

The 8 red LEDs are wired directly to 8 Arduino pins. They are set up like a percent meter: the first LED lighting means the value has reached 32 (256/8), the second means 64, and so on.
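The meter logic above can be sketched in a few lines of Java. This is my own illustration, assuming LED i lights once the 0-255 value reaches 32 * (i + 1); the class and method names are not from the demo source:

```java
// Sketch of the red-LED "percent meter" mapping described above.
// Assumes LED i (0-7) lights when the 0-255 value reaches 32 * (i + 1),
// so a value of 255 lights 7 LEDs (the 8th threshold, 256, is unreachable).
public class LedMeter {
    // Number of LEDs to light for a given 0-255 value.
    public static int litCount(int value) {
        return value / 32; // 32 = 256 / 8 LEDs
    }

    public static void main(String[] args) {
        System.out.println(litCount(100)); // 3
    }
}
```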

The 8 green LEDs are driven through a shift register IC, which has 8 digital outputs but needs only 3 inputs: data, latch and clock. This was needed to fit all the LEDs into the limited number of Arduino pins. These display in the same fashion as the red LEDs.
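The shift-register update amounts to packing the meter into a single byte, shifting it out over data/clock, and pulsing the latch. A hedged sketch of the packing step (the bar-graph bit layout is my assumption, not taken from the demo wiring):

```java
// Sketch: packing the green-LED bar graph into one byte for the shift
// register. Bit i of the byte drives LED i; on an Arduino this byte would
// then go out via shiftOut(dataPin, clockPin, LSBFIRST, pattern) followed
// by a latch pulse. Illustrative only.
public class ShiftMeter {
    // Byte with the lowest n bits set, e.g. n = 3 -> 0b00000111.
    public static int barPattern(int n) {
        return (1 << n) - 1;
    }

    public static void main(String[] args) {
        System.out.println(Integer.toBinaryString(barPattern(5))); // 11111
    }
}
```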

The blue LED is wired to one of the Arduino's PWM (analog output) pins. It displays its value simply by brightness.


Software:
The software used is as follows:

Processing - Manipulates images, grabs colors
GSVideo - Processing library, grabs webcam images to be used
Firmata - Arduino firmware plus a matching Processing library, used to control the Arduino from Processing/Java
Eclipse - All libraries are running within Eclipse. This overcomes an important hurdle: we are no longer stuck inside the Processing IDE, which has some limits and very little in the way of interoperability. Running from Eclipse gives us full access to Java and any Java libraries, which broadens what we can use and makes interacting with other services or languages easier. Not to mention, support will generally be more available. Finding ways to coerce each library to work in the same Eclipse project was a large hurdle to overcome. I'm curious whether it would be easy to move all of this over to Parrot, which would give us interoperability with MANY languages, most notably C, all of .NET, Perl, Python, and PHP.




The demo is structured into two main pieces.

Processing Applet:
First, the webcam data is acquired using GSVideo for Processing. GSVideo lets us easily grab webcam images and use them in Processing, and Processing can easily pull out color values from the scene, as Chris demonstrated in an earlier demo. A lot of the code for this is copied from his demo and modified for this one. It is done in a Processing applet so we can display the values back and let the user select which pixel to sample with the mouse. Source Code Here

Arduino Controller:
Second, another object works as an intermediate controller for the Arduino. This object keeps track of the current color data and is responsible for passing it to the Arduino. It knows which pins are which, how to write to each pin, what port the Arduino is on, and how to interpret the color data. It is implemented using the Firmata setup: firmware loaded onto the Arduino that listens to serial input, with a matching Processing library on the computer side. This object also writes to the Arduino on a separate thread, allowing it to run independently of the applet. More on this below.... Source Code Here
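The separate-thread handoff can be sketched as a latest-value mailbox: the applet always overwrites with the newest color, and the writer thread drains whatever is newest when it gets around to it. This is my own illustrative sketch, not the demo's actual code:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the controller's thread handoff. The applet publishes the
// newest color; a writer thread takes it at its own pace, so slow serial
// writes never block frame capture. Stale unsent values are simply dropped.
public class ColorHandoff {
    private static final int EMPTY = -1;
    private final AtomicInteger latest = new AtomicInteger(EMPTY);

    // Applet thread: overwrite with the newest color.
    public void publish(int color) { latest.set(color); }

    // Writer thread: take the newest value, or EMPTY if nothing new arrived.
    public int take() { return latest.getAndSet(EMPTY); }

    public static void main(String[] args) {
        ColorHandoff h = new ColorHandoff();
        h.publish(10);
        h.publish(42);                // 10 is superseded before it's ever sent
        System.out.println(h.take()); // 42
        System.out.println(h.take()); // -1 (nothing new)
    }
}
```

In the real demo the writer thread would follow each take() with the Firmata pin writes; here that part is left out.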


These two pieces give us an important abstraction: the Arduino controller doesn't need to know anything about what the applet is doing, and the applet doesn't need to know anything about how the data is being used. This allows either end to be interchanged. We could swap in a different interpretation of the data that doesn't use the Arduino at all, and this would only affect a couple of simple lines in the applet. The opposite is true as well: we can completely change which pixel, or even where the color data comes from, and the Arduino controller will not know the difference.
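In Java terms, that swap point is naturally an interface. A minimal sketch of the idea (interface and class names are my own, not from the demo source):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the abstraction described above: the applet only talks to a
// ColorSink, so the Arduino controller can be swapped for anything else
// without the applet changing.
interface ColorSink {
    void accept(int r, int g, int b);
}

// A stand-in sink that just records what it was given - e.g. for testing
// the applet without any hardware attached.
class RecordingSink implements ColorSink {
    final List<int[]> seen = new ArrayList<>();
    public void accept(int r, int g, int b) { seen.add(new int[]{r, g, b}); }
}

public class SinkDemo {
    public static void main(String[] args) {
        ColorSink sink = new RecordingSink(); // could be an Arduino-backed sink instead
        sink.accept(255, 128, 0);             // what the applet would call per frame
        System.out.println(((RecordingSink) sink).seen.size()); // 1
    }
}
```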

Another important point is that this lets the Arduino serial writes live on a different thread, so the CPU can keep updating webcam data while writes to the Arduino are pending. Each pin write takes about 20 ms. That would kill applet performance if it couldn't update until all 17 lights were written: with the shift register that's around 35 writes, or 700 ms = 0.7 sec per update, which would limit us to about 1.4 fps.



The full Eclipse project is available zipped here



Video


In this video I'm passing a spectrum printed on a piece of paper across the webcam. The lighting is pretty bad and it's printed paper, so the colors are not terrific on either end. The functionality is all still there; I just don't have a good medium or lighting setup around at the moment to demonstrate it with.

Monday, October 5, 2009

Sunday, October 4, 2009

Music Creation from Video

One of the challenges of this project is the final sonification of all the data input that our system will generate. There are so many ways to go about it, and each of them has the possibility to output interesting soundscapes; it's pretty daunting just to decide!
But since the project is still in the early formative stages, we still have time to mull over those possibilities and hopefully as more of the visual aspects are set, a complementary audio accompaniment methodology will become apparent.
In the meantime, figuring out what language/program/technology would allow for easy creation of data-to-audio systems has been a personal focus of mine for a couple weeks now.
So far, I've been leaning towards Pure Data due to its relative ease of use and for how well the GEM multimedia library handles webcam feeds.
Now it's too early to say what we'll be doing with our video data, but I can show you a possibility: below is a patch I put together yesterday to show a simple motion-sensing MIDI generator. To put it simply, it monitors five pixels located in a vertical line down the middle of the video window, and when anything changes, it starts to output MIDI notes. I could go into more detail, but I believe in the beauty of self-education, so located below are a download link for the patch and a short demo video of me waving my arm to make some sweet sounds.
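The patch itself is in Pure Data, but the core idea can be sketched in Java. This is my own reconstruction of the concept: the threshold, the note numbers, and all names are made up for illustration, not taken from the patch:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the patch's idea: watch a few sampled pixel brightnesses and
// emit a MIDI note number for each one that changed enough since the last
// frame. Note choices and threshold are illustrative assumptions.
public class MotionNotes {
    static final int[] NOTES = {60, 62, 64, 65, 67}; // one note per watched pixel

    // Compare previous and current brightness samples; return triggered notes.
    public static List<Integer> trigger(int[] prev, int[] cur, int threshold) {
        List<Integer> notes = new ArrayList<>();
        for (int i = 0; i < cur.length; i++) {
            if (Math.abs(cur[i] - prev[i]) > threshold) {
                notes.add(NOTES[i]);
            }
        }
        return notes;
    }

    public static void main(String[] args) {
        int[] before = {10, 10, 10, 10, 10};
        int[] after  = {10, 200, 10, 10, 90};   // pixels 1 and 4 changed
        System.out.println(trigger(before, after, 50)); // [62, 67]
    }
}
```

An arm waved across the camera changes several pixels per frame, which is what produces the bursts of notes in the video.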
Get the patch!


Make sure to watch in fullscreen!

But this is only one part of the possible sonic range we can traverse. With the MIDI data being generated from the video, we can now think about how to utilize those notes. One option is to pass the notes to a separate program that generates the sounds. To see if this was possible, I threw together a quick MIDI-controlled track in Jeskola Buzz, and after some fiddling with MIDI Yoke I got everything to work together.
I've recorded a quick session with the Buzz track and my Monome running flin. [so I have some midi to work with]







Click the image for the audio!

So that's all I'll say for now about the audio side of things, but there will surely be more info to come in the coming months. Stay tuned!

Thursday, October 1, 2009

Stepper demo



As a prelude to a full scale model of the video read head component of the project, we first needed to show that interaction between the computer and a stepper motor was possible.

In this setup, we use the computer mouse to control the speed of a single six-wire stepper motor. The stepper motor is connected to an Arduino motor shield, which connects to my laptop through a USB port and is powered by a 12-volt supply. On the Arduino we are running a modified version of the motor shield demo code found here. On the laptop we are running a modified version of David A. Mellis's Dimmer project, developed in Processing.

The Processing code creates a small window with a color gradient; as you move the mouse from one side to the other, the rate of the stepper motor's steps decreases or increases. On a side note, the program seems to fail if the mouse ever leaves the created window. The issue is in all likelihood a trivial fix, but it functions sufficiently as is to demonstrate the viability of this concept.
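The failure wasn't diagnosed, but since the sketch writes mouseX straight to the serial port as a single byte, one plausible defensive fix (an assumption on my part, not a confirmed cause) is to clamp the value to the 0-255 range before writing it:

```java
// Hedged guess at a guard for the crash described above: if mouse
// coordinates outside the window can reach port.write(), clamping them to
// a valid byte range should make the write safe.
public class Clamp {
    // Force any int into the 0-255 range a single serial byte can carry.
    public static int clampByte(int x) {
        return Math.max(0, Math.min(255, x));
    }

    public static void main(String[] args) {
        System.out.println(clampByte(-40)); // 0
        System.out.println(clampByte(300)); // 255
        System.out.println(clampByte(128)); // 128
    }
}
```

In the Processing sketch itself this would be `port.write(constrain(mouseX, 0, 255));`.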

Laptop Code:

/* Processing code for this example */
// Dimmer - sends bytes over a serial port
// by David A. Mellis

import processing.serial.*;

Serial port;

void setup() {
  size(256, 150);

  println("Available serial ports:");
  println(Serial.list());

  // Open the port the Arduino board is on (index 2 on this machine;
  // change the index to match yours). The last parameter (9600) is the
  // communication speed and must match the value passed to Serial.begin()
  // in the Arduino sketch.
  port = new Serial(this, Serial.list()[2], 9600);

  // If you know the name of the port used by the Arduino board, you
  // can specify it directly like this:
  //port = new Serial(this, "COM1", 9600);
}

void draw() {
  // draw a gradient from black to white
  for (int i = 0; i < 256; i++) {
    stroke(i);
    line(i, 0, i, 150);
  }

  // write the current X-position of the mouse to the serial port as
  // a single byte
  port.write(mouseX);
}



Arduino Code:

/* Based on code by David A. Mellis
 * Modified by Nick Smith
 */

#include <AFMotor.h>  // Adafruit Motor Shield library (provides AF_Stepper)

AF_Stepper motor(48, 1);

void setup() {
  Serial.begin(9600);
  motor.setSpeed(10);
  //motor.step(100, FORWARD, SINGLE);
  motor.release();
  delay(1000);
}

void loop() {
  byte brightness;
  if (Serial.available()) {
    brightness = Serial.read();
    motor.setSpeed(brightness / 5);
    motor.step(10, FORWARD, INTERLEAVE);
    //motor.step(100, BACKWARD, MICROSTEP);
    Serial.flush();
  }
}

NOTES THUR 10/1/09
