Category: Introduction to Computational Media

Today I presented my final project (the figure-skating-inspired movement painting) in ICM (Introduction to Computational Media), and yesterday I presented it in Pcomp (Physical Computing).  The presentation entailed me putting on a jacket that had the accelerometer with XBee sender in a pocket and a green circle sewn onto the front.  I started the Processing sketch, clicked on the green circle (for tracking purposes) and then, once the blank white “canvas” came on screen, I moved around to create an etching along with other effects.  I really enjoyed hearing the commands of “jump” and “spin” from some classmates and the sounds of glee when my line-bows, spirals and spinning arcs appeared in reaction to my movements.  Here’s a video demo:

Pcomp-ICM Final Project demo from N Z on Vimeo.

I approached this project in three main stages. Rather than write one really long post, I’ve listed the project stages below with links to the individual posts (photos and/or videos are included in those posts):

Since the final Processing sketch depends on both camera tracking and the XBee/accelerometer to work, it is available as a zip download rather than as a link to an online sketch.  Of course there is much more that I’d like to do to improve this project, but I must now focus on getting my freelance work done and completing my film on adult competitive figure skaters.

The final test related to my final project was to incorporate camera tracking via colors/blobs in order to map a person’s movement to the etching effect in my Processing sketch.  I spent a fair amount of time reviewing the color tracking, brightness threshold and motion tracking examples in Learning Processing, but none of them did what I needed and I couldn’t quite figure out how best to manipulate the code to get it to accurately track the color I click on and identify the center x/y coordinates.

I ended up asking Elie Zananiri, who was sitting across from me one day; he recommended the JMyron library, as he’d worked with it before and had even created an online JMyron help post.  I downloaded and imported the library into a new Processing sketch and then added the “findGlobs” and “trackColor” calls in “setup.”  The array variable for “globArray” was originally set to “globBoxes” so that I could see, via rectangular bounding boxes, which blobs were being detected (screen shot taken later at my office, for visual reference).

Once I was confident that this was detecting the color I wanted, I switched the globArray to “globCenters” so that the center point could be determined and mapped to the x and y coordinates of whatever I wanted to draw (in this photo, I’m drawing a simple ellipse at the blob’s center point).

For my final sketch, I did not want to have the video image displayed, but rather only the effects that I created in Processing.  I wanted the video image to come up initially though so that I could click on the color to be tracked.  Once I clicked, I wanted it to disappear.  This was accomplished by creating a variable for “state” set to “0” initially and then “1” at the end of the “mousePressed” code.  In “draw”, I would then add this pseudocode: if (state == 0){loadPixels through updatePixels in here} else{background(255); my draw code}.  The full code is included in the final sketch on the main “documentation” post.
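Stripped of the Processing drawing calls, that show-video-then-hide pattern is just a two-state toggle. Here is a plain-Java sketch of the logic (the class and method names are mine, for illustration only):

```java
// Two-state toggle: state 0 shows the camera image so a color can be
// clicked; the first click flips to state 1, which clears to white and
// draws effects only.
public class CanvasState {
    private int state = 0;

    // In Processing this would be the mousePressed() handler: once the
    // tracked color is picked, hide the video for good.
    public void mousePressed() {
        state = 1;
    }

    // Which branch of draw() runs on this frame.
    public String drawBranch() {
        if (state == 0) {
            return "video";   // the loadPixels..updatePixels branch
        } else {
            return "effects"; // the background(255) + effects branch
        }
    }

    public static void main(String[] args) {
        CanvasState c = new CanvasState();
        System.out.println(c.drawBranch()); // "video" before any click
        c.mousePressed();
        System.out.println(c.drawBranch()); // "effects" after the click
    }
}
```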

The original idea or context for this project was to be able to trace a skater’s movement over the ice, assuming a camera could be placed directly over a rink, and visualize jumps and spins in a standard manner that skaters and coaches can understand.  While this has utility, it’s not so interesting visually to non-skaters, which is why I wanted to create effects that had some visual appeal.

I started with a simple circle following the mouse, but that was too basic, so I transformed it into a trail of small circles that lagged slightly behind the mouse, creating an interesting movement effect (this was inspired by a sketch I saw on Open Processing).  I then added Perlin noise around the trail, which for me symbolized ice particles coming off a skating blade.  Also, when the skater or mouse is still, the noise continues, which symbolizes a sort of restlessness and should therefore encourage movement.

For the spin, I wanted to create a spiral from the center point of the mouse position (which would ultimately correspond to the person’s position in relation to the camera), and then vary the speed depending on how fast one turned.  As a test, I made a spiral where the x coordinate of a point was determined by “float x = r * cos(theta);” and the y coordinate by “float y = r * sin(theta);”.  The radius incremented by a set value, but theta incremented by a speed determined by the position of the mouse on the screen (left is slow, right is fast).  This spiral Processing sketch worked fine on its own, but when I added it to the “etching” sketch, I was not happy with the results, especially since the spiral did not render fast enough, even at a high speed value.
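The spiral math above is easy to check outside Processing. This is a plain-Java sketch of the point generation; the starting radius, increments and point count are illustrative values, not the ones from my sketch:

```java
// Generate spiral points via x = r*cos(theta), y = r*sin(theta),
// where r grows by a fixed step per point and theta grows by "speed"
// (a faster speed makes the spiral sweep around more quickly).
public class Spiral {
    public static float[][] points(int n, float dr, float speed) {
        float[][] pts = new float[n][2];
        float r = 0, theta = 0;
        for (int i = 0; i < n; i++) {
            pts[i][0] = r * (float) Math.cos(theta); // x coordinate
            pts[i][1] = r * (float) Math.sin(theta); // y coordinate
            r += dr;        // radius increments by a set value
            theta += speed; // theta increments by the speed value
        }
        return pts;
    }

    public static void main(String[] args) {
        float[][] pts = points(100, 0.5f, 0.2f);
        System.out.println(pts[0][0] + " " + pts[0][1]); // first point at the origin
    }
}
```

In the sketch these offsets would be added to the mouse (or blob-center) position before drawing each point.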

I then decided to create a different kind of effect with faster movement to correspond with a left or right spin.  I first created one that utilized arcs in a loop in this manner: “arc(mouseX, mouseY, random(50), random(50), random(PI), random(PI));”.  It created a full, sphere-like effect, which was interesting.  When I took it to Dan for review, he modified the code so that the arcs rotate around the center, which I really liked (see the code in the “Spin” class in the sketch link below).  I created an alternate spin effect, also using arcs within a loop, in this manner: “arc(mouseX, mouseY, arcW+i, arcH+i, random(PI), random(PI));” (see the code in the “Spin2” class in the sketch link below).

For the jumps, I wanted the effect to be happily explosive and vertically oriented, yet somewhat random.  Within a loop, I coded lines in this manner: “line(mouseX, mouseY, mouseX-(random(-15,15)), mouseY-(random(-75,75)));”.  I added “noise” within a smaller random rectangular area around the center, which I plan to modify at some point.
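The jump lines are just random endpoints in a tall, narrow band around the center, which is what keeps the burst vertical. Here is that endpoint math in plain Java (the ranges are copied from the snippet above; the class and helper names are mine):

```java
import java.util.Random;

// One jump line runs from the center (cx, cy) to a random endpoint at
// most 15 px away horizontally and 75 px away vertically, so the burst
// stays vertically oriented yet somewhat random.
public class JumpLines {
    public static float[] endpoint(float cx, float cy, Random rng) {
        float dx = -15 + rng.nextFloat() * 30;  // like random(-15, 15)
        float dy = -75 + rng.nextFloat() * 150; // like random(-75, 75)
        return new float[] { cx - dx, cy - dy };
    }

    public static void main(String[] args) {
        Random rng = new Random();
        float[] p = endpoint(200, 200, rng);
        System.out.println(p[0] + " " + p[1]);
    }
}
```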

Finally, I combined all the classes and code together in one Processing effects-only sketch, whereby the different effects are controlled by keys, which gives the person running the sketch complete control, via the mouse and keys, of the “movement painting.”

I soldered wires from a 5-pin header to the LilyPad:

  • white: Z value on the accelerometer to d0 on the LilyPad
  • yellow: Y value on the accelerometer to d1 on the LilyPad
  • green: X value on the accelerometer to d2 on the LilyPad
  • red: 3V on the accelerometer to 3.3V on the LilyPad
  • black: ground on the accelerometer to ground on the LilyPad

I then soldered red and black wires between power and ground respectively from the LilyPad to the battery holder.  I placed the accelerometer in the 5-pin header and the XBee sender in the LilyPad.  Once I confirmed that this was working via the Processing sketch, then I sewed it into my jacket pocket (see photo below) so that the orientation of the accelerometer would be relatively constant and generate similar values each time I jump or spin.


The last thing that I needed to do was to measure the values for jumping and spinning.  I wanted to save a file with all of the readings from the X, Y and Z values for each type of movement.  I accomplished this by adding the following code to the Processing sketch:

At the top (global variable):

int counter = 0;
String[] alldata = new String[5000];

At the end of the parseData function:

alldata[counter] = adcValues[0] + " " + adcValues[1] + " " + adcValues[2];
if (counter < 4999) { counter++; } else { counter = 0; }

saveStrings("accValues1.txt", alldata); // one quick turn to the right

I then commented out the saveStrings line and changed the file name for each type of movement.
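To sanity-check the counter wraparound, the buffer logic can be isolated in plain Java (the class and method names are mine; I use 5 slots instead of 5000 to keep it readable):

```java
// Fixed-size log buffer that wraps around once full, mirroring the
// counter logic used to collect the accelerometer readings.
public class ReadingLog {
    private final String[] data;
    private int counter = 0;

    public ReadingLog(int size) {
        data = new String[size];
    }

    public void add(String reading) {
        data[counter] = reading;
        // Increment, wrapping back to 0 before the index runs past the end.
        if (counter < data.length - 1) { counter++; } else { counter = 0; }
    }

    public int position() { return counter; }
    public String get(int i) { return data[i]; }

    public static void main(String[] args) {
        ReadingLog log = new ReadingLog(5);
        for (int i = 0; i < 6; i++) log.add("reading " + i);
        System.out.println(log.get(0)); // slot 0 overwritten by the sixth add
    }
}
```

With 5 slots, the sixth reading lands back in slot 0; the bound has to be length − 1, not length, or the index runs one past the end of the array.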

After I created a bunch of text files corresponding to different movements, I pasted the data into different sheets of an Excel workbook and looked for significant fluctuations in the X, Y and Z values.  The “normal” readings averaged 590 for Z, 453 for Y and 487 for X.  I deduced from the readings that a jump happened when the Z value went above 800, a spin to the right when the Y value was between 300 and 400, and a spin to the left when the Y value dropped below 300.  These thresholds were factored into conditional statements within “draw” in the final sketch.
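Those thresholds translate into simple conditionals. Here is a plain-Java sketch of that logic (the method name, return strings, and the jump-first ordering are my own, for illustration):

```java
// Classify a movement from raw accelerometer readings using the
// thresholds found in the Excel analysis: jump when Z > 800,
// right spin when Y is between 300 and 400, left spin when Y < 300.
public class Movement {
    public static String classify(int x, int y, int z) {
        if (z > 800) return "jump";       // Z spikes well above its ~590 average
        if (y < 300) return "spinLeft";   // Y drops below 300
        if (y <= 400) return "spinRight"; // Y in the 300..400 band
        return "none";                    // near the resting averages
    }

    public static void main(String[] args) {
        // Resting readings (roughly X 487, Y 453, Z 590) trigger nothing.
        System.out.println(classify(487, 453, 590)); // none
    }
}
```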

After testing the accelerometer on the Arduino, I decided to move forward with the plan to take it wireless, since my project depends on freedom of movement for jumps and spins in order to generate visual effects on a screen. I spoke to a few people who had experience with wireless devices, and they highly recommended that I go with XBee over BlueSmirf, since the latter had caused some unknown/unexpected problems.

Ted Hayes, aka t3db0t, recommended that I get the following from SparkFun:

I also got a little AA battery holder to connect to the LilyPad.

Once all the items arrived, I downloaded and reviewed all the datasheets and manuals…and started to feel a heart attack coming on, so I went straight to Ted and he helped me cut to the chase to configure the XBees.  I attached the XBee receiver to the USB and wired up the XBee sender to the accelerometer in my Arduino as shown in the photo (the X value from the accelerometer was wired to AD0 input on the XBee, the Y to AD1, and the Z to AD2; VREF on the XBee was wired to power; the 3V on the accelerometer was also wired to power; the power on the Arduino was on the 3.3V pin).


We then went over to the Windows computer in the CommsLab room.  Ted logged in and assigned me a PAN ID on the in-house wiki.  Next we launched X-CTU to update the firmware and set the pins for the ADC (Analog to Digital Converter) on the XBee sender. The sender was assigned an ID of “1” and the receiver was set to “0”.  I then happily saw a green light, which indicated that the sender and receiver were in communication.  Fabulous.

We then created a Processing sketch and pasted XBee Packet Reader code into it.  This utilized serialEvent and parseData functions and drew a graph based on the accelerometer values.  After testing this and getting graph readings, I felt confident to move to the next stage of getting the XBee sender and accelerometer off the breadboard and soldered to the LilyPad.

Step 1: Testing the accelerometer via Arduino and Processing.

I wired up the triple axis accelerometer (that I purchased from the NYU Computer Store) as shown in the following photo (sorry for the blurriness):

Accelerometer wiring

In the Arduino code, I created three int “sensorValue” variables and three int variables for the X, Y and Z inputs.  I created an “establishContact” function, which is called in setup, and the loop reads the values from each X, Y, Z input via analogRead and assigns these to each “sensorValue.”

In Processing, I created a standard “serialEvent” function with the “Serial myPort” parameter, that reads the incoming data as a string and creates an array of values corresponding to the X, Y and Z values.  For the first Processing sketch, I mapped the X, Y and Z values separately to Red, Green and Blue variables (code example: “r = map(sensors[0], 340,415,1,255);”).  I used the openGL library and in the draw loop, I created a simple ellipse that rotates around on the Z axis and the fill colors change depending on the position of the accelerometer.  Check out the results in this video:
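Processing’s map() is just linear interpolation, so the parse-and-map step can be checked in plain Java (the class and helper names are mine; map() here mirrors Processing’s formula):

```java
// Parse an "x y z" serial line into ints and linearly remap a raw
// reading into a color-channel range, the way Processing's map() does.
public class SerialMap {
    // Linear interpolation from [inLo, inHi] to [outLo, outHi].
    public static float map(float v, float inLo, float inHi,
                            float outLo, float outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // Split a space-separated serial string into the X, Y, Z values.
    public static int[] parse(String line) {
        String[] parts = line.trim().split(" ");
        int[] vals = new int[parts.length];
        for (int i = 0; i < parts.length; i++) {
            vals[i] = Integer.parseInt(parts[i]);
        }
        return vals;
    }

    public static void main(String[] args) {
        int[] sensors = parse("340 453 590");
        // r = map(sensors[0], 340, 415, 1, 255), as in the sketch:
        System.out.println(map(sensors[0], 340, 415, 1, 255)); // 1.0 at the low end
    }
}
```

So an X reading of 340 maps to a red value of 1, and 415 maps to 255; readings in between scale proportionally.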

Accelerometer testing, version 1 from N Z on Vimeo.

I ended up creating another Processing sketch because I wanted to see more directly how the motions affected an object.  Using openGL again, I simply created a 3D box, rotated slightly on the Y axis, and mapped the X, Y and Z values to the X, Y and Z coordinates of the box.  Here is the video:

Accelerometer testing, version 2 from N Z on Vimeo.

I spent the weekend of November 7th agonizing over ideas for a final project.  For ICM, I entertained the idea of doing more with the butterfly movements and morphing the butterfly into other forms (the Pcomp part involved either a balance board or wand), but then I started to think along the lines of producing something that could be applied to one of two projects that I’m working on outside of ITP: a documentary on adult competitive figure skaters and a website related to vegan raw food preparation.  I thought of games and data visualizations (both of which I ruled out for now but might try to do over the winter break).  I ended up settling on an idea related to skating: namely the creation of a choreography tool for figure skaters.

The short and free/long programs that figure skaters have to perform in tests and competitions are judged in a number of ways, one of which involves coverage of the ice (i.e. making the best use of the dimensions of the rink*).  I believe it would be helpful for skaters during the choreography phase of putting a program together to see a visual representation of their choreography and patterns on the ice with indications of all the required elements (spins, jumps, footwork, etc).  This could be rendered after a skater is videotaped or as they are skating – the output would be a Processing sketch that could later be played to the music.

For ITP, I will tweak this idea to enable a person to create a “digital painting” based on their movements.  The person will put on a belt or vest containing a wireless accelerometer that will measure movements like spinning and jumping, each of which will be rendered distinctly in a Processing sketch.  They will perform their movements in front of a camera – their position within the space will be mapped to x and y positions of an object in a Processing sketch.   There is more that I’d like to incorporate in terms of sensors (heart-rate monitor) and Processing effects (adding color and style palettes and sound) but I’ll elaborate on those later, if I’m able to get what I’ve already mentioned done in time.

I’ve already begun to look at examples and log Processing effects that I’d like to incorporate for tracings, spins and jumps and have purchased and tested an accelerometer.  I’ll detail progress in subsequent posts.

* samples of required patterns for “moves in the field” tests


I created a form in PHP that utilized a simple text box, radio button and checkbox; when the user clicks the “submit” button, it “posts” the data to another PHP form and also simultaneously writes to a “data.txt” file in the “data” folder.  (I did not use “get” because it was drilled into me, when I took my webdev class in the Spring, never to use that because of security problems.)

I wanted to enable the user to create an avatar, and because I’m working on a documentary on figure skaters, I wanted the avatar to be a skater.  I started to draw a body and outfit and quickly realized that it would take way too long for this assignment (plus I’m not good at drawing), so I chose some photos of skaters online instead.  The Processing sketch loads the “data.txt” file and then splits the data into individual strings – this part is working fine now, thanks to help from Dan.  I then created buttons with conditional statements to map the user input, based on gender and whether the user chooses “medal”, to the appropriate image (there are four options).  This is working fine locally but isn’t updating properly in the browser…will aim to troubleshoot this tomorrow.  CLICK HERE to go to the PHP form.

I wanted to do more with the accidental butterfly that I made, so I conceived of a harmonious environment built around butterflies that would be disrupted if a user got on a balance board.  (I didn’t realize until after reading the mathematics and translate/rotate sections in the book how similar this idea was to the Lesson Six Project.  There are also two groups in my Pcomp class who just announced that they’re making balance boards, so I might change my physical interface).

Dan suggested that I just work on making one butterfly move in response to the mouse for the Week 7 class.  It wasn’t too hard to make the butterfly move in response to the mouse within the butterfly object class, but the vector curves didn’t move in tandem with the vx, vy coordinates.  I like the effect of the wings lagging behind though (one has to click on the image to initiate the butterfly moving with the mouse).  CLICK HERE to see this sketch.

I really wanted to try making the butterfly move in more complex ways, though, and in a three-dimensional world, so I experimented with openGL, rotations and the perspective function, which I found in the 3D tutorials online.  I added a sun, which is a rotating sphere, and a ground, which is a simple rectangle.  When you click the mouse, the perspective is activated – move the mouse around to see the butterfly – still flapping its wings – in relation to the sun and ground.  I can’t figure out why the color of the wings doesn’t render well in 3D, and I’m not crazy about the perspective on the butterfly, so this sketch certainly needs more work and I need to get some help with these 3D functions.  CLICK HERE to see the 3D version.

This “Serial Duplex” lab makes use of multiple sensors.  In my case, I used a potentiometer and a photocell for the analog inputs and a standard switch for the digital one.

Pcomp week 7 lab wiring

After following the instructions and getting the white circle to move horizontally and vertically based on the sensor values, I changed my Processing code to create an ellipse in openGL that rotates on the Z axis.  I mapped the values from the photocell (“g = map(sensors[0], 50,520,1,255);”) to both the Green value of the ellipse color and the Z value of the “translate” code (for example: “translate(width/2, height/2, g);”).  I mapped the values of the potentiometer (“b = map(sensors[1], 0,1023,1,255);”) to both the Blue value of the ellipse color and the Z value of the actual ellipse (“ellipse(0, 0, 100, b);”).  I didn’t really want to use the switch, so I just set it to the Red value of the ellipse color.  The “theta” value for the Z rotation increased by 0.02.  Here’s the video demo, shot by Matt Richard:

Pcomp Week 7 Lab: Multiple Serial Output from N Z on Vimeo.