

Today I presented my final project (the figure-skating-inspired movement painting) in ICM (Introduction to Computational Media), and yesterday I presented it in Pcomp (Physical Computing).  The presentation entailed putting on a jacket that had the accelerometer and XBee sender in a pocket and a green circle sewn onto the front.  I started the Processing sketch, clicked on the green circle (for tracking purposes) and then, once the blank white “canvas” came on screen, I moved around to create an etching along with other effects.  I really enjoyed hearing the commands of “jump” and “spin” from some classmates and the sounds of glee when my line-bows, spirals and spinning arcs appeared in reaction to my movements.  Here’s a video demo:

Pcomp-ICM Final Project demo from N Z on Vimeo.

I approached this project in three main stages. Rather than create one really long post, I’ve listed the project stages below with links to the individual posts (photos and/or videos are included in those posts):

Since the final Processing sketch depends on both camera tracking and the XBee/accelerometer to work, it is available as a zip download via the link rather than as an online sketch.  Of course there is much more that I’d like to do to improve this project, but I must now focus on getting my freelance work done and completing my film on adult competitive figure skaters.

The final test for my project was to incorporate camera tracking via colors/blobs in order to map a person’s movement to the etching effect in my Processing sketch.  I spent a fair amount of time reviewing the color tracking, brightness threshold and motion tracking examples in Learning Processing, but none of them did what I needed, and I couldn’t quite figure out how to manipulate the code to accurately track the color I click on and identify its center x/y coordinates.

I ended up asking Elie Zananiri, who was sitting across from me one day; he recommended the JMyron library, as he’d worked with it before and had even created an online JMyron help post.  I downloaded and imported the library into a new Processing sketch and then added the “findGlobs” and “trackColor” calls in “setup.”  The “globArray” variable was originally set from “globBoxes” so that I could see, via rectangular bounding boxes, which blobs were being detected (screen shot taken later at my office, for visual reference).

Once I was confident that this was detecting the color I wanted, I switched globArray to “globCenters” so that the center point could be determined and mapped to the x and y coordinates of whatever I wanted to draw (in this photo, I’m drawing a simple ellipse at the blob’s center point).

For my final sketch, I did not want to have the video image displayed, but rather only the effects that I created in Processing.  I wanted the video image to come up initially though so that I could click on the color to be tracked.  Once I clicked, I wanted it to disappear.  This was accomplished by creating a variable for “state” set to “0” initially and then “1” at the end of the “mousePressed” code.  In “draw”, I would then add this pseudocode: if (state == 0){loadPixels through updatePixels in here} else{background(255); my draw code}.  The full code is included in the final sketch on the main “documentation” post.
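To make that concrete, here is a minimal sketch of the click-to-track pattern described above, assuming the JMyron library is installed; the tolerance value and the placeholder ellipse are illustrations, not the final project code.

import JMyron.*;

JMyron m;
int state = 0;   // 0 = show video so a color can be clicked, 1 = effects only

void setup() {
  size(640, 480);
  m = new JMyron();
  m.start(width, height);
  m.findGlobs(1);                  // turn on blob (glob) detection
}

void draw() {
  m.update();
  if (state == 0) {
    // show the raw camera frame so the user can click the color to track
    int[] img = m.image();
    loadPixels();
    arrayCopy(img, pixels);
    updatePixels();
  } else {
    background(255);
    int[][] centers = m.globCenters();   // center points of the detected blobs
    if (centers.length > 0) {
      ellipse(centers[0][0], centers[0][1], 10, 10);   // stand-in for the etching/effects code
    }
  }
}

void mousePressed() {
  // sample the clicked pixel and track that color (tolerance of 30 is a guess)
  color c = get(mouseX, mouseY);
  m.trackColor((int)red(c), (int)green(c), (int)blue(c), 30);
  state = 1;
}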

The original idea or context for this project was to be able to trace a skater’s movement over the ice, assuming a camera could be placed directly over a rink, and visualize jumps and spins in a standard manner that skaters and coaches can understand.  While this has utility, it’s not so interesting visually to non-skaters, which is why I wanted to create effects that had some visual appeal.

I started with a simple circle following the mouse, but that was too basic, so I transformed it into a trail of small circles that lagged slightly behind the mouse to create an interesting movement effect (this was inspired by a sketch I saw on Open Processing).  I then added Perlin noise around the trail, which for me symbolized ice particles coming off a skating blade.  Also, when the skater or mouse is still, the noise continues, which symbolizes a sort of restlessness and should therefore encourage movement.
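A rough sketch of the lagging-trail-plus-noise idea, with the mouse standing in for the tracked position; the trail length, easing factor and noise spread are all illustrative values.

float[] xs = new float[20];   // trail positions; index 0 is the lead circle
float[] ys = new float[20];
float t = 0;                  // noise offset

void setup() {
  size(640, 480);
  background(255);
  noStroke();
}

void draw() {
  // each circle eases toward the one ahead of it, so the trail lags behind the mouse
  xs[0] += (mouseX - xs[0]) * 0.3;
  ys[0] += (mouseY - ys[0]) * 0.3;
  for (int i = 1; i < xs.length; i++) {
    xs[i] += (xs[i-1] - xs[i]) * 0.3;
    ys[i] += (ys[i-1] - ys[i]) * 0.3;
  }
  fill(0, 30);
  for (int i = 0; i < xs.length; i++) {
    ellipse(xs[i], ys[i], 6, 6);
  }
  // Perlin-noise "ice particles" around the lead circle; they keep drifting even when still
  fill(150, 40);
  for (int i = 0; i < 10; i++) {
    ellipse(xs[0] + (noise(t + i) - 0.5) * 40, ys[0] + (noise(t + i + 100) - 0.5) * 40, 2, 2);
  }
  t += 0.05;
}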

For the spin, I wanted to create a spiral from the center point of the mouse position (which would ultimately correspond with the person’s position in relation to the camera), and then vary the speed depending on how fast one turned.  As a test, I made a spiral where the x coordinate of a point was determined by “float x = r * cos(theta);” and the y coordinate by “float y = r * sin(theta);”.  The radius was incremented by a set value, while theta was incremented by a speed determined by the position of the mouse on the screen (left is slow, right is fast).  This spiral Processing sketch worked fine on its own, but when I added it to the “etching” sketch, I was not happy with the results, especially since the spiral did not render fast enough, even at a high speed value.
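For reference, a standalone version of that spiral test might look something like this (the increment values and the map range for speed are guesses):

float r = 0;
float theta = 0;

void setup() {
  size(400, 400);
  background(255);
  stroke(0);
}

void draw() {
  float speed = map(mouseX, 0, width, 0.05, 0.5);  // left = slow, right = fast
  float x = r * cos(theta);
  float y = r * sin(theta);
  point(width/2 + x, height/2 + y);
  r += 0.1;        // radius grows by a fixed step
  theta += speed;  // angle grows by the mouse-controlled speed
  if (r > width/2) {      // start over once the spiral reaches the edge
    r = 0;
    background(255);
  }
}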

I then decided to create a different kind of effect with faster movement to correspond with a left or right spin.  I first created one that utilized arcs in a loop in this manner: “arc(mouseX, mouseY, random(50), random(50), random(PI), random(PI));”.  It created a full, spherical effect, which was interesting.  When I took it to Dan for review, he modified the code so that the arcs rotate around the center, which I really liked (see the code in the “Spin” class in the sketch link below).  I created an alternate spin effect, also using arcs within a loop, in this manner: “arc(mouseX, mouseY, arcW+i, arcH+i, random(PI), random(PI));” (see the code in the “Spin2” class in the sketch link below).
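The actual Spin and Spin2 classes are in the sketch download; below is just a simplified standalone take on the rotating-arc idea, with the group of arcs turning around the center point and the direction flipping for a left vs. right spin (sizes, counts and the direction toggle are illustrative):

float angle = 0;
int dir = 1;   // 1 = spin right, -1 = spin left

void setup() {
  size(640, 480);
  noFill();
  stroke(0, 80);
}

void draw() {
  background(255);
  pushMatrix();
  translate(mouseX, mouseY);
  rotate(angle * dir);          // the whole group of arcs turns around the center
  for (int i = 0; i < 20; i++) {
    float a = random(PI);
    arc(0, 0, 20 + i*4, 20 + i*4, a, a + random(HALF_PI, PI));
  }
  popMatrix();
  angle += 0.2;
}

void keyPressed() {
  dir = -dir;   // flip the spin direction for testing
}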

For the jumps, I wanted the effect to be happily explosive and vertically oriented, yet somewhat random.  Within a loop, I coded lines in this manner: “line(mouseX, mouseY, mouseX-(random(-15,15)), mouseY-(random(-75,75)));”.  I added “noise” within a smaller random rectangular area around the center, which I plan to modify at some point.
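A standalone sketch of that jump burst, again with the mouse standing in for the tracked position (the line count and the size of the noisy patch are illustrative):

void setup() {
  size(640, 480);
  stroke(0, 60);
}

void draw() {
  background(255);
  // vertically biased burst of random lines
  for (int i = 0; i < 40; i++) {
    line(mouseX, mouseY, mouseX - random(-15, 15), mouseY - random(-75, 75));
  }
  // small noisy rectangular patch of points around the center
  for (int i = 0; i < 100; i++) {
    point(mouseX + random(-10, 10), mouseY + random(-10, 10));
  }
}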

Finally, I combined all the classes and code together in one Processing effects-only sketch, whereby the different effects are controlled by keys, which gives the person running the sketch complete control, via the mouse and keys, of the “movement painting.”
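The key handling in the combined sketch follows a pattern like the one below (the specific keys and mode names here are hypothetical, not necessarily the ones in the final sketch):

int mode = 0;   // 0 = etch, 1 = spin left, 2 = spin right, 3 = jump

void setup() {
  size(640, 480);
  background(255);
}

void draw() {
  // the active effect's display code runs here based on mode
}

void keyPressed() {
  if (key == 'e') mode = 0;         // etching trail
  if (key == 's') mode = 1;         // spin left
  if (key == 'd') mode = 2;         // spin right
  if (key == 'j') mode = 3;         // jump burst
  if (key == 'c') background(255);  // clear the canvas
}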

I soldered wires from a 5-pin header to the LilyPad:

  • white: Z value on the accelerometer to d0 on the LilyPad
  • yellow: Y value on the accelerometer to d1 on the LilyPad
  • green: X value on the accelerometer to d2 on the LilyPad
  • red: 3V on the accelerometer to 3.3V on the LilyPad
  • black: ground on the accelerometer to ground on the LilyPad

I then soldered red and black wires between power and ground respectively from the LilyPad to the battery holder.  I placed the accelerometer in the 5-pin header and the XBee sender in the LilyPad.  Once I confirmed that this was working via the Processing sketch, then I sewed it into my jacket pocket (see photo below) so that the orientation of the accelerometer would be relatively constant and generate similar values each time I jump or spin.

[Photo: NZ_091205_XbeeAll_3434aw]

The last thing that I needed to do was to measure the values for jumping and spinning.  I wanted to save a file with all of the readings from the X, Y and Z values for each type of movement.  I accomplished this by adding the following code to the Processing sketch:

At the top (global variable):

int counter = 0;
String[] alldata = new String[5000];

At the end of the parseData function:

alldata[counter] = adcValues[0] + " " + adcValues[1] + " " + adcValues[2];
if (counter < alldata.length - 1) {counter++;} else {counter = 0;}  // wrap before running past the end of alldata

saveStrings("accValues1.txt", alldata); // one quick turn to the right
I then commented the above out and changed the file name for each type of movement.

After I created a bunch of text files corresponding to different movements, I pasted the data into different sheets in an Excel workbook and looked for significant fluctuations in the X, Y and Z values.  I discovered the “normal” Z value averaged around 590, the Y around 453 and the X around 487.  I deduced from the readings that a jump happened when the Z value went above 800, a spin to the right happened when the Y value was between 300 and 400, and a spin to the left happened when the Y value went below 300.  These were factored into conditional statements within “draw” in the final sketch.
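Inside draw, the conditionals end up looking roughly like this (variable names are placeholders; in the real sketch the values come from the parseData function):

int xVal, yVal, zVal;   // latest accelerometer readings, set in serialEvent/parseData

void setup() {
  size(640, 480);
  background(255);
}

void draw() {
  if (zVal > 800) {
    // jump: trigger the vertical line burst
  } else if (yVal >= 300 && yVal <= 400) {
    // spin to the right: trigger the right-rotating arc effect
  } else if (yVal < 300) {
    // spin to the left: trigger the left-rotating arc effect
  } else {
    // values near the resting averages (X ~487, Y ~453, Z ~590): keep etching
  }
}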

After testing the accelerometer on the Arduino, I decided to move forward with the plan to take it wireless, since my project depends on freedom of movement for jumps and spins in order to generate visual effects on a screen. I spoke to a few people who had experience with wireless devices, and it was highly recommended that I go with XBee over BlueSmirf, since the latter had caused some unknown/unexpected problems.

Ted Hayes, aka t3db0t, recommended that I get the following from SparkFun:

I also got a little AA battery holder to connect to the LilyPad.

Once all the items arrived, I downloaded and reviewed all the datasheets and manuals…and started to feel a heart attack coming on, so I went straight to Ted and he helped me cut to the chase and configure the XBees.  I attached the XBee receiver to the USB and wired up the XBee sender to the accelerometer on my Arduino breadboard as shown in the photo (the X value from the accelerometer was wired to the AD0 input on the XBee, the Y to AD1, and the Z to AD2; VREF on the XBee was wired to power; the 3V on the accelerometer was also wired to power; and power came from the Arduino’s 3.3V pin).

[Photo: NZ_091130_Xbee_3389aw]

We then went over to the Windows computer in the CommsLab room.  Ted logged in and assigned me a PAN ID on the in-house wiki.  Next we launched X-CTU to update the firmware and set the pins for the ADC (Analog to Digital Converter) on the XBee sender. The sender was assigned an ID of “1” and the receiver was set to “0”.  I then happily saw a green light, which indicated that the sender and receiver were in communication.  Fabulous.

We then created a Processing sketch and pasted XBee Packet Reader code into it.  This utilized serialEvent and parseData functions and drew a graph based on the accelerometer values.  After testing this and getting graph readings, I felt confident to move to the next stage of getting the XBee sender and accelerometer off the breadboard and soldered to the LilyPad.

Step 1: Testing the accelerometer via Arduino and Processing.

I wired up the triple axis accelerometer (that I purchased from the NYU Computer Store) as shown in the following photo (sorry for the blurriness):

Accelerometer wiring

In the Arduino code, I created three int “sensorValue” variables and three int variables for the X, Y and Z inputs.  I created an “establishContact” function, which is called in setup, and the loop reads the values from each X, Y, Z input via  analogRead and assigns these to each “sensorValue.”

In Processing, I created a standard “serialEvent” function with the “Serial myPort” parameter, that reads the incoming data as a string and creates an array of values corresponding to the X, Y and Z values.  For the first Processing sketch, I mapped the X, Y and Z values separately to Red, Green and Blue variables (code example: “r = map(sensors[0], 340,415,1,255);”).  I used the openGL library and in the draw loop, I created a simple ellipse that rotates around on the Z axis and the fill colors change depending on the position of the accelerometer.  Check out the results in this video:
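A pared-down version of that first test sketch is below; the serial port index, baud rate, comma delimiter and the map ranges for green and blue are assumptions (only the red mapping is quoted from the post), and the rotation/openGL part is left out for brevity.

import processing.serial.*;

Serial myPort;
int[] sensors = new int[3];
float r, g, b;

void setup() {
  size(400, 400);
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil('\n');
}

void draw() {
  background(0);
  fill(r, g, b);
  ellipse(width/2, height/2, 150, 100);
}

void serialEvent(Serial myPort) {
  String inString = myPort.readStringUntil('\n');
  if (inString != null) {
    int[] vals = int(split(trim(inString), ','));   // "x,y,z" format assumed
    if (vals.length >= 3) {
      sensors = vals;
      r = map(sensors[0], 340, 415, 1, 255);   // X -> red (from the post)
      g = map(sensors[1], 340, 415, 1, 255);   // Y -> green (range assumed)
      b = map(sensors[2], 340, 415, 1, 255);   // Z -> blue (range assumed)
    }
  }
}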

Accelerometer testing, version 1 from N Z on Vimeo.

I ended up creating another Processing sketch because I wanted to see more directly how the motions affected an object.  Using openGL again, I simply created a 3D box, rotated slightly on the Y axis, and mapped the X, Y and Z values to the X, Y and Z coordinates of the box.   Here is the sketch:
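The second test reduces to something like this (openGL renderer as in the post; the map ranges and how the values reach the sketch are assumptions, with the serialEvent part omitted):

import processing.opengl.*;

float boxX, boxY, boxZ;   // updated from the accelerometer values in serialEvent

void setup() {
  size(640, 480, OPENGL);
}

void draw() {
  background(255);
  translate(boxX, boxY, boxZ);
  rotateY(radians(20));    // slight rotation on the Y axis
  noFill();
  stroke(0);
  box(100);
}

// in serialEvent, something along these lines:
// boxX = map(sensors[0], 340, 415, 0, width);
// boxY = map(sensors[1], 340, 415, 0, height);
// boxZ = map(sensors[2], 340, 415, -200, 200);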

Accelerometer testing, version 2 from N Z on Vimeo.

I spent the weekend of November 7th agonizing over ideas for a final project.  For ICM, I entertained the idea of doing more with the butterfly movements and morphing the butterfly into other forms (the Pcomp part involved either a balance board or wand), but then I started to think along the lines of producing something that could be applied to one of two projects that I’m working on outside of ITP: a documentary on adult competitive figure skaters and a website related to vegan raw food preparation.  I thought of games and data visualizations (both of which I ruled out for now but might try to do over the winter break).  I ended up settling on an idea related to skating: namely the creation of a choreography tool for figure skaters.

The short and free/long programs that figure skaters have to perform in tests and competitions are judged in a number of ways, one of which involves coverage of the ice (i.e. making the best use of the dimensions of the rink*).   I believe it would be helpful for skaters during the choreography phase of putting a program together to see a visual representation of their choreography and patterns on the ice, with indications of all the required elements (spins, jumps, footwork, etc.).  This could be rendered after a skater is videotaped or as they are skating – the output would be a Processing sketch that could later be played to the music.

For ITP, I will tweak this idea to enable a person to create a “digital painting” based on their movements.  The person will put on a belt or vest containing a wireless accelerometer that will measure movements like spinning and jumping, each of which will be rendered distinctly in a Processing sketch.  They will perform their movements in front of a camera – their position within the space will be mapped to x and y positions of an object in a Processing sketch.   There is more that I’d like to incorporate in terms of sensors (heart-rate monitor) and Processing effects (adding color and style palettes and sound) but I’ll elaborate on those later, if I’m able to get what I’ve already mentioned done in time.

I’ve already begun to look at examples and log Processing effects that I’d like to incorporate for tracings, spins and jumps and have purchased and tested an accelerometer.  I’ll detail progress in subsequent posts.

* samples of required patterns for “moves in the field” tests

[Images: sample-patterns-1, sample-patterns-2]

I created a form in PHP that utilized a simple text box, radio button and checkbox; when the user clicks the “submit” button, it “posts” the data to another PHP form and also simultaneously writes to a “data.txt” file in the “data” folder.  (I did not use “get” because it was drilled into me, when I took my webdev class in the Spring, never to use that because of security problems.)  I wanted to enable the user to create an avatar, and because I’m working on a documentary on figure skaters, I wanted to have the avatar be a skater.  I started to draw a body and outfit and quickly realized that it would take way too long to do for this assignment (plus I’m not good at drawing), so I chose some photos of skaters online instead.  The Processing sketch loads the “data.txt” file and then splits the data into individual strings – this part is working fine now, thanks to help from Dan.  I then created buttons with conditional statements to map the user input, based on gender and whether the user chooses “medal”, to the appropriate image (there are four options).  This is working fine locally but isn’t updating properly in the browser…will aim to troubleshoot this tomorrow.   CLICK HERE to go to the PHP form.
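On the Processing side, the loading-and-splitting step looks roughly like this (the field order and comma delimiter are assumptions; only the “data.txt” file name comes from the post):

String userName = "";
String gender = "";
boolean medal = false;

void setup() {
  size(400, 400);
  textFont(createFont("SansSerif", 16));
  String[] lines = loadStrings("data.txt");                // written by the PHP form
  if (lines != null && lines.length > 0) {
    String[] fields = split(lines[lines.length - 1], ','); // latest submission
    if (fields.length >= 3) {
      userName = fields[0];
      gender = fields[1];
      medal = fields[2].equals("1");
    }
  }
}

void draw() {
  background(255);
  fill(0);
  text(userName + " / " + gender + " / medal: " + medal, 20, 40);
  // conditional statements would pick one of the four skater images here
}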

The media controller project that I worked on with Baowen Wong, aka Bo, was inspired by our mutual interest in working with images and my appreciation of the 3D animated movie, Coraline.  I latched onto the concept of the tunnel from the film as a portal to a virtual world where “reality” was not as it initially appeared.  I wanted users to look through a “tunnel” and see a different image each time of characters from the film looking back at them.  I also wanted users to engage with the tunnel as an object and use it to change what they saw.  In this prototype, if the user moves the tunnel to the right, then the image’s transparency changes to reveal the “lost souls” at the core of the movie.  If the user pushes the tunnel into the screen, then the pixels of the image push back and explode toward the viewer, which simulates an effect towards the end of the movie when Coraline’s alternate dream reality becomes more and more nightmarish and her environment begins to visually dissolve in pieces before her eyes.  Check out a demo that was recorded in the Pcomp lab the day before we presented it in class:

Pcomp Week 9: Media Controller Project from N Z on Vimeo.

How it works:

Our prototype tunnel had just two sides (so that people could see the screen effects), and was made simply of black poster board attached to a computer monitor that Bo checked out from the equipment room (see note below on the tunnel we originally made).  We made a simple frame for the viewer end of the tunnel, covered with gaffer tape, and attached a photocell sensor to the outside of the frame, with the idea that the viewer’s head would block it and thus trigger the image change on the screen.  This didn’t quite work, so the viewer had to cover the photocell with their hands, which is still somewhat natural since one often cups their hands around their eyes to block out light when peering through some sort of tunnel or hole.

We also attached a flex sensor to the left side of the posterboard flap (out of frame but indicated by the red and green wires; the blue/yellow ones were attached to the photocell).  The values were mapped in Processing such that when the flap was moved to the right, the current image’s transparency setting would drop and reveal the “lost soul” photo already loaded underneath.  The flex sensor also measured the change in values when that left flap bent towards the monitor, and that variable affected the “z” value of the “explode pixels” function in Processing.

Coding:
[Photo: NZ_Pcomp_091116_3293aw]
The overall code was based on the serial labs which we had just completed.  The three variables from the two analog sensors were written as such in the “serialEvent” section:

  • flexbendin = map(sensors[0], 260,200,0,width);
  • flexright = map(sensors[0], 290,350,0,100);
  • photocell = map(sensors[1], 170,630,50,100);

The “flexbendin” variable was utilized in the “explode pixels” function in this line:

  • float z = (flexbendin/(float)width) * brightness(images[imageIndex].pixels[loc])- 100.0;

The photocell variable was called in draw, along with a simple function called “changeImage”:

  • if (photocell<50){viewing = true;} else {viewing = false;}
  • if (viewing != lastViewing){if (viewing == false) changeImage();}
  • lastViewing = viewing;

Corey Menscher helped me troubleshoot the boolean button code and how that turned the effects on and off.  The other two variables also utilized their own boolean buttons to turn on and off the transparency/tint effect and the exploding pixels effect.
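Put together, the photocell edge-detection pattern looks like this (the photocell value here is faked with the mouse button so the sketch runs on its own; in the project it came from serialEvent):

boolean viewing = false;
boolean lastViewing = false;
int imageIndex = 0;
int numImages = 4;       // assumed number of images
float photocell = 100;   // mapped sensor reading in the real sketch

void setup() {
  size(400, 400);
  textFont(createFont("SansSerif", 16));
}

void draw() {
  photocell = mousePressed ? 10 : 100;   // stand-in for covering the sensor

  if (photocell < 50) { viewing = true; } else { viewing = false; }
  if (viewing != lastViewing) {
    if (viewing == false) changeImage();  // only change when the viewer pulls away
  }
  lastViewing = viewing;

  background(0);
  fill(255);
  text("image index: " + imageIndex, 20, 40);
}

void changeImage() {
  imageIndex = (imageIndex + 1) % numImages;
}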

Bo worked on a whole sound element using the Minim sound library for Processing.  She chose a variety of sounds from the library that matched the mood of the images, and we ultimately chose to integrate just one haunting sound to go with the appearance of the lost souls.  The “song.play();” and “song.close();” were incorporated into the “flexright” conditional statement.  All of the code was correct, but the sound didn’t play when we presented the video.  I learned afterward that Minim doesn’t work properly with the version of Processing that I’m using (version 0135 – I’m using this because of an unknown Java runtime error that comes up with the most current version of the application on my computer.  Dan Shiffman knows about this but was not able to help troubleshoot it.  I need to upgrade my OS from Tiger to at least Leopard…but I’ll do that during winter break.)
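For reference, a bare-bones Minim setup for a single sound looks like this (the file name is a placeholder, and in the project play/close were triggered by the flexright conditional rather than keys):

import ddf.minim.*;

Minim minim;
AudioPlayer song;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  song = minim.loadFile("haunting.mp3");   // placeholder file name
}

void draw() {
  background(0);
}

void keyPressed() {
  if (key == 'p') song.play();
  if (key == 's') song.pause();
}

void stop() {
  // standard Minim cleanup
  song.close();
  minim.stop();
  super.stop();
}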

The H-Bridge Lab builds on the Transistor Lab by inserting an H-Bridge on the breadboard so that the direction of the motor can be controlled by a switch.  Note the addition of a capacitor.  My little motor seemed to like this wiring better than the transistor-only one.

[Photo: NZ_IPC_HW8b_lab_3315]

Pcomp Week 8 Lab: HBridge from N Z on Vimeo.