Sunday, December 27, 2009

Introductions and Updates

Hello internet people,

After looking at the bulk of the posts we have up here, it's come to my attention that we have a good amount of data but not much in the way of explanation.

In this post I'm going to give you an overview of this project and where we stand in terms of design and assembly.

Introduction

To get a good idea of what this project is about, we have written a descriptive proposal that covers all aspects, technical and artistic, of the installation in progress:

“SaraSong is an interactive installation that uses color-pencil pictures drawn by participants on a special robotically-augmented paper canvas to create live visual and auditory compositions that reflect on the cyclical nature of life on earth.”

SaraSong Proposal on Google Docs

The Past

Over the last two months, the IPA-IQP team has been able to implement a few of the main elements described in the proposal:

The wooden stand that will serve as the user’s drawing surface and hardware mounting platform has been fully designed and assembled.

The Java framework for generating multi-channel midi sequences from the webcam color/luminance data has been designed and is now in a workable state with more features currently being implemented.

The current audio synthesis program, Jeskola Buzz, has been explored by the team and an initial instrument/sample-bank setup has been created for testing the midi output from the music generation Java program.

And finally, the hardware for the vertical plotter [hector clone] has been machined and installed on the wooden stand. In addition, the code for controlling the movement of the plotter’s payload has been written and is in workable condition.

The Present

At this point, all of the team members are on a much-needed holiday break. Even so, some work is being done to add data-massaging capability to the audio generation framework, so as to widen the auditory possibilities we can test, and designs are being laid out for the second [eraser] vertical plotter. A few members are also condensing the large amount of behind-the-scenes data and information that has accumulated, in preparation for a public release near the end of the year-long project.

The Future

Beginning January 14th, the team will mount an aggressive campaign to finish all the design work and, ultimately, fully assemble all physical elements, so that more time can be dedicated to the computer-side data manipulation and audio generation and to making sure that all user-experience elements can be accomplished.

One part of this project that needs more work is what you are reading now, the blog. We are currently discussing how to better keep the public updated with our progress and hopefully create a dialog with interested readers. With our work falling in so many domains and the amount of time we have to create these posts being very limited, other expressive venues are being explored.

Keep an eye out for more updates in the coming weeks!

Lastly, near the conclusion of the project, a comprehensive set of design data and documentation will be released to the public, with the hope that our work will live on in the public arena through derivative works by other artists and students.

~ Chris

Saturday, December 5, 2009

Wednesday, October 28, 2009

Tuesday, October 20, 2009

Web Cam Color Data -> Arduino Output

One of the issues we've been looking into is creating a link between the input and output mechanisms. We've looked at using libraries such as OpenCV or GSVideo to grab webcam data and output the color values, or play notes. We've also looked into using Arduinos to drive stepper motors. Until now there hasn't been any interaction between the two, and that's part of what this demo tried to accomplish.

The basic concept behind this demo was to grab some data from the webcam and have it affect the Arduino's actions. This is more of a "behind the scenes" type of obstacle: finding a way to communicate between multiple independent groups of functionality.

The main goals of this demo were as follows:
  • Interaction between the webcam and the Arduino
  • Interoperability between different pieces of the project - moving everything to Eclipse
  • Pulling stress off the Arduino, letting the computer do the calculations and just pass pin manipulations down to the board

Hardware:
Though the hardware is not the focus of the demo, it is hard to understand pieces of the demo without understanding how it's structured.

Everything was mostly selected based on what parts I had around.
There are 17 LEDs: 8 red, 8 green, and 1 blue.

The 8 red LEDs are wired directly to 8 Arduino pins. They are set up like a percent meter: the first one on means 32 (256/8), the second means 64, and so on.
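The "percent meter" mapping can be sketched in a few lines of Java (the class and method names here are mine, not from the project code): each red LED represents one 32-value step of the 0-255 range, so the number of lit LEDs is just integer division.

```java
// Illustrative sketch of the red-LED percent-meter mapping.
// Class/method names are hypothetical, not from the project source.
public class BarMeter {
    // Number of LEDs (0-8) to light for a 0-255 value: the first LED
    // turns on at 32 (256/8), the second at 64, and so on.
    static int ledsLit(int value) {
        return value / 32;
    }
}
```

For example, a value of 64 lights two LEDs, while 255 lights seven; all eight would require the (unreachable) byte value 256.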

The 8 green LEDs are wired through a shift register IC. It has 8 digital outputs but uses only 3 inputs: data, latch, and clock. This was needed so I could fit all the LEDs into the limited number of Arduino pins. These display in the same fashion as the red LEDs.
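As a rough sketch of what the shift register wiring buys us: one 8-bit pattern is clocked out over the single data pin, one bit per clock pulse, and a latch pulse then moves all eight bits to the outputs at once. The Java below only simulates the bit sequence; the MSB-first order and the names are my assumptions, and the actual part and wiring may differ.

```java
import java.util.ArrayList;
import java.util.List;

// Simulates clocking one byte into an 8-bit shift register:
// 8 data bits (here MSB-first), each accompanied by a clock pulse,
// followed by a single latch pulse to update the outputs.
public class ShiftRegisterDemo {
    static List<Integer> shiftOutMsbFirst(int pattern) {
        List<Integer> bits = new ArrayList<>();
        for (int i = 7; i >= 0; i--) {
            bits.add((pattern >> i) & 1); // set data pin, then pulse clock
        }
        // pulse the latch pin here on real hardware
        return bits;
    }
}
```

This is why three Arduino pins are enough to drive eight LEDs: the per-LED state lives in the register, not in the Arduino's pin count.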

The blue LED is wired to an analog (PWM) pin on the Arduino. It displays its value simply by brightness.


Software
The software used is as follows:

Processing - Manipulate images, grab colors
GSVideo - Processing library, grabs webcam images to be used
Firmata - Processing functions, arduino firmware. Used to control arduino from processing/java.
Eclipse - All libraries are running within Eclipse. This overcomes an important hurdle: we are no longer stuck inside the Processing IDE, which has some limits and very little in the way of interoperability. Running from Eclipse gives us full access to Java and any Java libraries, which broadens what we can use and makes interacting with other services or languages relatively easy; support will also generally be more available. Finding ways to coerce each library to work in the same Eclipse project was a large hurdle to overcome. I'm curious whether it would be easy to move all of this over to Parrot, which would give us interoperability with MANY languages, most notably C, all of .NET, Perl, Python, and PHP.




The demo is structured into two main pieces.

Processing Applet:
First, the webcam data is acquired using GSVideo for Processing. GSVideo lets us easily grab webcam images and use them in Processing, and Processing can easily pull out the color values in the scene, as Chris demonstrated in an earlier demo. A lot of the code for this is copied from his demo and modified for this one. It runs in a Processing applet so we can display the values back and let the user select which pixel to sample with the mouse. Source Code Here

Arduino Controller:
Second, another object works as an intermediate controller for the Arduino. This object keeps track of the current color data and is responsible for passing it to the Arduino. It knows which pins are which, how to write to each pin, what port the Arduino is on, and how to interpret the color data. It is implemented using the Firmata setup: firmware loaded onto the Arduino that listens to serial input, for which there is a Processing library. This object also writes to the Arduino on a separate thread, allowing it to run independently of the applet. More on this below. Source Code Here
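The separate-thread idea can be sketched roughly like this (all names are mine and the Firmata calls are stubbed out; this is an illustration, not the project's actual controller). The applet just publishes the newest color into a one-slot mailbox, and the background thread drains it whenever the slow serial link is free.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of a controller that pushes the latest color to the Arduino on
// its own thread. Names are hypothetical; Firmata writes are stubbed.
public class ControllerSketch implements Runnable {
    private final AtomicInteger latest = new AtomicInteger(-1); // -1 = nothing new
    private volatile boolean running = true;

    // Called from the applet thread; overwrites any value not yet sent.
    void setColor(int rgb) { latest.set(rgb); }

    // Takes the newest value and clears the mailbox (-1 if empty).
    int takeLatest() { return latest.getAndSet(-1); }

    void stop() { running = false; }

    @Override public void run() {
        while (running) {
            int rgb = takeLatest();
            if (rgb >= 0) {
                writeToArduino(rgb); // slow serial writes happen off the applet thread
            } else {
                try { Thread.sleep(1); } catch (InterruptedException e) { return; }
            }
        }
    }

    void writeToArduino(int rgb) { /* Firmata pin writes would go here */ }
}
```

Because the mailbox only keeps the newest color, a backed-up serial link never builds a queue of stale frames; the Arduino always receives the most recent sample.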


These two pieces give us an important abstraction: the Arduino controller doesn't need to know anything about what the applet is doing, and the applet doesn't need to know anything about how its data is being used. This allows either end to be swapped out. We could substitute a different interpretation of the data that doesn't use the Arduino at all, and only a couple of simple lines in the applet would change. The opposite is true as well: we can completely change which pixel is sampled, or even where the color data comes from, and the Arduino controller will not know the difference.
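In Java terms, this decoupling amounts to a small interface between the two halves. The names below are my own illustrative choices, not the project's actual API:

```java
// The applet only talks to this interface, so the Arduino-backed
// implementation can be swapped for anything else that consumes colors.
// Names are illustrative, not from the project source.
interface ColorSink {
    void onColor(int r, int g, int b);
}

// A stand-in consumer that just remembers the last sample, e.g. for
// testing or for an audio mapping that never touches the Arduino.
class RecordingSink implements ColorSink {
    int lastR, lastG, lastB;
    public void onColor(int r, int g, int b) {
        lastR = r; lastG = g; lastB = b;
    }
}
```

The applet would hold a ColorSink and call onColor() once per sampled frame; swapping hardware output for a software consumer then becomes a one-line change.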

Another important point is that this lets the Arduino serial writes reside on a different thread, so the CPU can keep updating webcam data while it waits on the writes. Serial writes take about 20 ms per pin. That would kill applet performance if it couldn't update until all 17 lights had been written (which, due to the shift register, is around 35 writes = 700 ms = 0.7 s, limiting us to roughly 1.4 fps).



The full Eclipse project is available zipped here



Video


In this video I'm passing a spectrum printed on a piece of paper across the webcam. The lighting is pretty bad and it's printed paper, so the colors aren't terrific on either end. The functionality is all still there; I just don't have a good medium or lighting setup around at the moment to demonstrate it with.

Monday, October 5, 2009

Sunday, October 4, 2009

Music Creation from Video

One of the challenges of this project is the final sonification of all the data input that our system will generate. There are so many ways to go about it, and each of them could output interesting soundscapes; it's pretty daunting just to decide!
But since the project is still in its early formative stages, we have time to mull over those possibilities, and hopefully, as more of the visual aspects are settled, a complementary audio accompaniment methodology will become apparent.
In the meantime, figuring out what language/program/technology would allow for easy creation of data-to-audio systems has been a personal focus of mine for a couple weeks now.
So far, I've been leaning towards Pure Data due to its relative ease of use and for how well the GEM multimedia library handles webcam feeds.
It's too early to say what we'll be doing with our video data, but I can show you a possibility: below is a patch I put together yesterday to demonstrate a simple motion-sensing MIDI generator. Put simply, it monitors five pixels located in a vertical line down the middle of the video window, and when anything changes, it starts to output MIDI notes. I could go into more detail, but I believe in the beauty of self-education, so below are a download link for the patch and a short demo video of me waving my arm to make some sweet sounds.
Get the patch!
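For readers without Pure Data installed, the core per-frame logic can be sketched in Java. The threshold and the pixel-to-note mapping below are my own illustrative choices, not necessarily the values used in the patch:

```java
// Sketch of the motion-to-MIDI idea: compare each monitored pixel's
// luminance against the previous frame and emit a note on change.
// The threshold and note mapping are illustrative assumptions.
public class MotionNotes {
    static final int THRESHOLD = 30; // minimum luminance change that counts as motion

    // Returns a MIDI note number for sample i (0..4), or -1 for "no motion".
    static int noteForSample(int i, int prevLuma, int currLuma) {
        if (Math.abs(currLuma - prevLuma) < THRESHOLD) return -1;
        return 60 + i * 4; // spread the five samples upward from middle C
    }
}
```

A driver loop would call this once per monitored pixel each frame and send any non-negative result out as a MIDI note-on.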


Make sure to watch in fullscreen!

But this is only one part of the possible sonic range we can traverse. With MIDI data being generated from the video, we can now think about how to utilize those notes. One option is to pass the notes to a separate program that generates the sounds. To see if this was possible, I threw together a quick MIDI-controlled track in Jeskola Buzz, and after some fidgeting with MIDI Yoke I got everything to work together.
I've recorded a quick session with the Buzz track and my Monome running flin [so I have some MIDI to work with].







Click the image for the audio!

So that's all I'll say for now about the audio side of things, but there will surely be more info to come in the coming months. Stay tuned!

Thursday, October 1, 2009

Stepper demo



As a prelude to a full-scale model of the video read-head component of the project, we first needed to show that interaction between the computer and a stepper motor was possible.

In this setup, we use the computer mouse to control the speed of a single six-wire stepper motor. The stepper motor is connected to an Arduino motor shield, which connects to my laptop through a USB port and is powered by a 12-volt supply. On the Arduino we are running a modified version of the motor shield demo code found here. On the laptop we are running a modified version of David A. Mellis's dimmer project, developed in Processing.

The Processing code creates a small window with a color gradient; as you move the mouse from one side to the other, you decrease or increase the rate of the stepper motor's steps. On a side note, the program seems to fail if the mouse ever leaves the window. The issue is in all likelihood a trivial fix, but the program functions well enough as is to demonstrate the viability of this concept.

Laptop Code:

/* Processing code for this example */
// Dimmer - sends bytes over a serial port
// by David A. Mellis

import processing.serial.*;

Serial port;

void setup() {
  size(256, 150);

  println("Available serial ports:");
  println(Serial.list());

  // Opens the port at the given index in the list printed above. Change
  // the index to select the port corresponding to your Arduino board.
  // The last parameter (e.g. 9600) is the speed of the communication;
  // it has to match the value passed to Serial.begin() in your Arduino
  // sketch.
  port = new Serial(this, Serial.list()[2], 9600);

  // If you know the name of the port used by the Arduino board, you
  // can specify it directly like this.
  //port = new Serial(this, "COM1", 9600);
}

void draw() {
  // draw a gradient from black to white
  for (int i = 0; i < 256; i++) {
    stroke(i);
    line(i, 0, i, 150);
  }

  // write the current X-position of the mouse to the serial port as
  // a single byte
  port.write(mouseX);
}



Arduino Code

/* Based on code by David A. Mellis
 * Modified by Nick Smith
 */

#include <AFMotor.h>  // Adafruit Motor Shield library (provides AF_Stepper)

AF_Stepper motor(48, 1);

void setup() {
  Serial.begin(9600);
  motor.setSpeed(10);
  //motor.step(100, FORWARD, SINGLE);
  motor.release();
  delay(1000);
}

void loop() {
  byte brightness;
  if (Serial.available()) {
    brightness = Serial.read();
    motor.setSpeed(brightness / 5);
    motor.step(10, FORWARD, INTERLEAVE);
    //motor.step(100, BACKWARD, MICROSTEP);
    Serial.flush();
  }
}

NOTES THUR 10/1/09

Wednesday, September 30, 2009

Scale Model Camera System

Math. analysis & pre-coding work
Horizontal beam and stepper motor mount made out of cheap cheap cardboard for scale model proof of concept.


Tuesday, September 29, 2009

NOTES SUN 9/27/09

A board overview of project component pieces and
some stepper motors connected to an Arduino motor shield




Thursday, September 17, 2009

FOCUS IDEAS:
-paper loop
-theremin
-bringing people together
-loss between analog and digital

NOTES THUR 9/17/09

NOTES WED 9/16/09

Wednesday, September 16, 2009

Monday, September 14, 2009

Notes THUR 9/10/09

Limiting Factors / Rules
  • Must have overarching theme
  • Must be some physical component
  • Accessible publicly (perhaps through internet)
  • Persisting information of previous interactions
  • Immediate interactivity & state of activity w/out ppl
  • Audio components
  • Social issue component of society & tech
Make ppl aware “important issues to the world”
  • Magical/wow factor
  • Get ppl interacting in some way they aren’t used to. Novel interaction
  • Machine aspect abstracted
  • Sense of living machine
  • Logistics
Not too hard to move around. Must be break down-able

Saturday, September 5, 2009

Blog Launch

The blog is now running.

Posting requires a Google Account, but as we discussed in the first meeting, I don't think that should be much of a problem.

Email me your Google Account names and I'll add everyone :D