Archive for October, 2009

ICM Class Notes – Using Images and Video – October 22, 2009

Friday, October 30th, 2009

Here are my notes from last week’s ICM class, where we learned how to use pictures and video in Processing. It’s taken me some time to get around to posting these notes because I have been focused on developing the media controller for physical computing and working on my mid-term project. Without further ado, here are my notes.

Using Images in Processing
There are two main types of activities related to using and manipulating pictures in Processing:
1. Loading and displaying images
2. Reading and manipulating the pixels

Loading and Displaying Images
In Processing there is a class called PImage that handles images – in many ways this class is similar to the PFont object. For example, to create a new instance of an image object you use the loadImage() function rather than a “new” object declaration. Here is sample code:

PImage cat;
cat = loadImage("cat.jpg");

The image() function is used to draw loaded images onto the screen. It takes from 3 to 5 arguments; here is the complete set: image(PImageVar, xpos, ypos, width [optional], height [optional]). An image can also be passed to the background() function, in which case it is displayed as the background. Here are some useful functions associated with displaying images:

  • The imageMode() function sets the image alignment. Options include CENTER, CORNERS, and CORNER (the default setting).
  • The tint() function changes the color of an image. It does not add color to the picture but rather removes the other colors – if you tint an image red, Processing sets all G and B values to 0. tint() can also set the transparency of an image.
  • The PImage.get(xpos, ypos) function returns the color of the pixel at coordinate (xpos, ypos). get() can also be called without an image to read the color of a pixel on the Processing screen itself.
  • The PImage.set(xpos, ypos, color) function sets the pixel at location (xpos, ypos) to the color passed as the third argument.
  • The red(color), green(color), and blue(color) functions return how much of each color component (red, green, or blue) is present in a given pixel (see the sketch below).
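Putting these pieces together, here is a minimal sketch of the pattern described above. It assumes an image file named cat.jpg in the sketch’s data folder (the file name and the red tint are just illustrations):

PImage cat;

void setup() {
  size(400, 400);
  cat = loadImage("cat.jpg");
}

void draw() {
  tint(255, 0, 0);                 // keep only the red channel (G and B go to 0)
  image(cat, 0, 0, width, height); // draw the image scaled to the window
  color c = get(mouseX, mouseY);   // read the color of the pixel under the mouse
  println(red(c));                 // how much red is in that pixel
}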

Loading and Displaying Videos
Processing handles video in much the same way it handles images (it effectively treats video as a series of standalone images). Live video is accessed through the Capture class; to use it you need to add the video library to your sketch. Note that a different class, Movie, is used for pre-recorded videos. Unfortunately, Processing does a poor job of handling movies, though it works well with live video feeds.

To declare a Capture object you use the standard “new” object declaration: myVideo = new Capture(this, width, height, frameRate). When you initialize a video object, Processing defaults to the system’s default camera. The name of an alternate video source can be passed as an extra argument between the height and frame rate.

Before you display a frame you need to read it using the myVideo.read() function. The same image() function that is used for PImage objects will draw frames from the camera onto the screen. All of the image-processing functions discussed above can also be used to analyze video input.
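Here is a bare-bones sketch of that live-video flow, based on the notes above (the capture size and frame rate are arbitrary choices):

import processing.video.*;

Capture myVideo;

void setup() {
  size(320, 240);
  // with no source name, Processing defaults to the system camera
  myVideo = new Capture(this, width, height, 30);
}

void draw() {
  if (myVideo.available()) {
    myVideo.read();       // read the latest frame before displaying it
  }
  image(myVideo, 0, 0);   // a Capture frame can be drawn just like a PImage
}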


Media Controller Project (and ICM Mid Term) – Phase 3

Friday, October 30th, 2009

During the last several days I have been working on a Processing sketch that can work with my Physical Computing media controller and serve as my mid-term project for the Introduction to Computational Media course. I have been interested in the design and development of media controllers since long before arriving at ITP, and this project provided the opportunity to start some hands-on explorations.

In my previous post I discussed the process of choosing a solution for playing and controlling our audio – we decided to use Processing (and the Arduino) to control Ableton Live. Today I will provide an overview of how I developed the code for this application, along with some of the interface considerations associated with designing software that works across physical and screen-based interfaces.

My longer-term objective is to create MIDI controllers for audio and video applications using touchscreen and gestural interfaces. The interfaces I am designing would ideally evolve to work on multi-touch surfaces; my interest in gestural interaction is something I hope to explore through my current physical computing project and future projects.

Developing the Sketches
Since the physical computing project requires three basic types of controls that are also the foundation of the media interface for my computational media mid-term, I decided to start by writing the code for these three basic elements. I set out to create code that could be easily re-used so that I could add additional elements with little effort. Here is a link to the sketch on openprocessing.org, where you can also view the full code for the controller pictured below (v1.0).

The process I used to create these sketches included the following steps: (1) creating the functionality associated with each element, separately; (2) creating a class for each element; (3) integrating objects of each class in Processing; (4) testing Processing with OSCulator and Ableton; (5) creating the serial protocol to communicate with the Arduino; (6) testing the sensors; (7) writing the final code for the Arduino; (8) testing the serial connection to the Arduino; (9) calibrating the physical computing interface (whenever and wherever we set it up).

I have already made two posts on this subject (go to phase 1 post, go to phase 2 post); today I can attest that I have completed the vast majority of the work. The last Processing sketch that I shared featured a mostly complete Matrix object that included functions for OSC communication, and the serial communication protocol had also been defined.

The main additions to the sketch are the creation of button and slider elements (each in its own class), a control panel that holds the buttons and sliders, and a version of the application that features multiple buttons and sliders. The main updates to existing features are changes to the serial communication protocol (to support additional sliders and matrices) and updates to the OSC communication code to ensure that messages are only sent when values change rather than continuously.

For the slider object I used the mouseDragged() function for the very first time, and I had to debug my code for a while to get the visual slider to work properly. The button was easy to code from a visual perspective; the challenge I faced was in structuring the OSC messages so that I could send two separate and opposing messages for each click. This matters because Ableton Live uses separate buttons for starting and stopping clips, so I had to find a way to make a single button perform both functions.

The serial communication protocol update was easy to implement, so I will not delve into it here. Changing the OSC communication protocol required a bit more work: I created a previous-state variable in each object class to enable checking whether a change had occurred, and implemented the logic as an “if” statement in the OSC message function.
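Here is the shape of that change, condensed and simplified from my actual element classes (the variable names are stand-ins, not the final code):

// inside an element class (button, slider, etc.)
float previousValue = -1;  // holds the last value that was sent out

void sendOscMessage(OscP5 tOscComm, NetAddress tMyRemoteLoc) {
  float currentValue = map(mouseX, 0, width, 0, 1);
  if (currentValue != previousValue) {  // only send when the value has changed
    OscMessage myOscMessage = new OscMessage("/controlGrid/x");
    myOscMessage.add(currentValue);
    tOscComm.send(myOscMessage, tMyRemoteLoc);
    previousValue = currentValue;       // remember what we just sent
  }
}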

Evolving the Controller
Here is an overview of my plans for this project: I intend to expand the current media controller with a few effect grids and the ability to select individual channels to apply effects. To do this I have to create new functions for the matrix class that let me set the X and Y matrix map values. I also want to improve the overall esthetics of the interface (while keeping its minimal feel).

From a sketch-architecture perspective I am considering creating a parent class for all buttons, grids, and sliders. It would hold the attributes and functionality common to all elements: common attributes include location, size, and color; common functionality includes detecting the mouse location relative to the object and OSC communication.
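Here is a rough sketch of what such a parent class might look like (class and member names are placeholders):

class Element {
  float [] loc = new float [2];          // common attribute: location
  float [] elementSize = new float [2];  // common attribute: size
  color fillColor;                       // common attribute: color

  // common functionality: is the mouse over this element?
  boolean isMouseOver() {
    return (mouseX > loc[0] && mouseX < loc[0] + elementSize[0] &&
            mouseY > loc[1] && mouseY < loc[1] + elementSize[1]);
  }

  // common functionality: OSC communication, overridden by each element
  void sendOscMessage(OscP5 tOscComm, NetAddress tMyRemoteLoc) { }
}

// buttons, sliders and grids would then extend it, e.g.:
// class Slider extends Element { ... }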

Questions for Class
Here is a question that came up during my development of this sketch (Dan, I need your help here). Can I use the translate(), pushMatrix(), and popMatrix() commands just to capture the current mouse location? This would be an easier way to check whether the mouse is hovering over an object.


Media Controller Project – Phase 2

Wednesday, October 28th, 2009

This weekend I spent a lot of time working on the issue of how to control and manage the music clips that we want to use in our project. Our requirements are pretty straightforward, which is not to say easy to address.

Requirements for Audio Controls
We need a solution that can handle playback of looped samples and dynamic control of at least two effects to be applied on the sample (such as tempo and pitch). Ideally we would like the solution to be scalable (so we can add multiple sounds) and be able to support quantization and other techniques to ensure that the resulting sound is of good quality.

Since we are creating a prototype to run off of a single computer, we do not need this solution to be easily portable (i.e. it does not need to be easy to run on different computers).

Initial Assumptions
Given the expertise of the team members, we are using a combination of Arduino and Processing to do the heavy lifting in the areas of input sensing and data handling. After researching the options available in Processing for managing sound, we decided to use Ableton Live instead. Processing’s role will be relegated to interpreting the data from the Arduino to control Ableton Live via OSC.

Below I provide a more in-depth overview of my research and the solution I have chosen. I have also posted an updated version of my sketch along with a link to the file I created in Ableton Live for this application. Please note that you will need to set up OSCulator for the sketch to work properly.

Latest Version of the Sketch
Note that I am only sharing the code for the sketch because no updates were made to the look, feel and interaction of the applet. All updates are related to enabling the sketch to communicate with Ableton via OSC.

/* IPC Media Controller Project, October, 2009
* VIRTUAL MATRIX SKETCH
*
* This sketch is the first draft of the Processing portion of our media controller project.
* In its current state, it focuses on reading input from serial ports, processing this input
* to determine a location on the virtual matrix, and then providing these coordinates to
* other objects (such as the music generation object that we will create in the future)
*
*/

import processing.serial.*;
Serial arduino;


import oscP5.*;
import netP5.*;
OscP5 oscComm;
NetAddress myRemoteLoc;

boolean isStarted = false;
// Matrix-Related Variables
Matrix matrix;
final int x = 0; final int y = 1; // variables to use with matrix size array
int [] cellSize = {50, 50};
int [] screenPad = {25,25}; // define padding between grid and screen border
int [] screenSize = new int [2]; // define screen size; note that we only add volSize to the width, since the volume knob will be placed to the right of the screen

// Volume-Control Related Variables
int [] volSize = {0,0};


void setup() {
  // initialize the matrix object
  matrix = new Matrix(screenPad[x], screenPad[y], cellSize[x], cellSize[y]);

  // instantiate the serial variable
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('.');

  // set frame rate
  frameRate(25);
  // start osc communication, listening for incoming messages at port 12000
  oscComm = new OscP5(this, 12000);
  // set destination of our OSC messages (port 8000, which is the OSCulator port)
  myRemoteLoc = new NetAddress("10.0.1.3", 8000);

  // set screen size related variables
  screenSize[x] = int(matrix.getMatrixWidth() + volSize[x] + (screenPad[x] * 2));
  screenSize[y] = int(matrix.getMatrixHeight() + (screenPad[y] * 2));
  size(screenSize[x], screenSize[y]); // size the window to fit the matrix, volume area, and padding
}


void draw() {
  matrix.isCellActiveMouse();
  matrix.isCellActiveSerial();
  matrix.drawMatrix();
  matrix.sendOscMessage(oscComm, myRemoteLoc);
}


void serialEvent(Serial arduino) {
  matrix.readSerialInput(arduino);
}

void oscEvent(OscMessage theOscMessage) {
  /* print the address pattern and the typetag of the received OscMessage */
  print("### received an osc message.");
  print(" addrpattern: " + theOscMessage.addrPattern());
  println(" typetag: " + theOscMessage.typetag());
}


/* CLASS MATRIX
*
* this class holds a virtual matrix that will mimic the real world matrix.
* It contains functions that read input from a serial port or mouse, then use that
* input to determine the location of the object or mouse on the grid
*
*/

class Matrix {

  // general variables used across the class
  final int x = 0; final int y = 1; // variables to use with matrix size arrays
  final int mouseControl = 0; final int serialControl = 1;

  // matrix and cell related variables
  final int [] cellNumber = {5, 5}; // number of cells on each axis of the matrix
  final float [] cellSize = new float [2]; // width and height of each cell of the matrix
  int [] matrixLoc = new int [2]; // location of the overall matrix
  final float [] matrixSize = new float [2]; // the total width and height of the matrix
  float [] xCellLoc = new float [cellNumber[x]]; // location of each cell on the grid
  float [] yCellLoc = new float [cellNumber[y]]; // location of each cell on the grid
  Boolean [][] cellState = new Boolean [cellNumber[x]][cellNumber[y]]; // whether the mouse or serial object is hovering over a cell
  color activeColor = color (255, 0, 0); // holds color of active cells
  color inactiveColor = color (255); // holds color of inactive cells
  int [] previousState = {0, 0}; // holds indices of the previously active cell

  // variables for reading serial input
  int mainControl = mouseControl;
  float [] serialLoc = {0, 0}; // holds the X and Y readings from the serial port

  // Matrix Object Constructor
  Matrix (int XLoc, int YLoc, int cellWidth, int cellHeight) {
    matrixLoc[x] = XLoc; // set X and Y location of the virtual matrix
    matrixLoc[y] = YLoc;
    cellSize[x] = cellWidth; // set the size of each cell on the grid of the virtual matrix
    cellSize[y] = cellHeight;
    matrixSize[x] = cellNumber[x] * cellSize[x]; // calculate width of the matrix
    matrixSize[y] = cellNumber[y] * cellSize[y]; // calculate height of the matrix

    // set the location of each cell on the grid
    for (int xCounter = 0; xCounter < xCellLoc.length; xCounter++) {
      xCellLoc[xCounter] = xCounter * cellSize[x]; }
    for (int yCounter = 0; yCounter < yCellLoc.length; yCounter++) {
      yCellLoc[yCounter] = yCounter * cellSize[y]; }

    // set the status of each cell to false
    for (int xCounter = 0; xCounter < cellState.length; xCounter++) {
      for (int yCounter = 0; yCounter < cellState[xCounter].length; yCounter++) {
        cellState[xCounter][yCounter] = false; }
    }
  } // close the constructor



  // function that returns the height of the matrix
  float getMatrixHeight() {
    return matrixSize[y];
  }


  // function that returns the width of the matrix
  float getMatrixWidth() {
    return matrixSize[x];
  }


  // function that draws the matrix on the screen
  void drawMatrix() {
    for (int xCounter = 0; xCounter < xCellLoc.length; xCounter++) { // loop through each element in the xCellLoc array
      for (int yCounter = 0; yCounter < yCellLoc.length; yCounter++) { // loop through each element in the yCellLoc array
        if (cellState[xCounter][yCounter] == true) { fill(activeColor); } // if the cell is active use the active color
        else { fill(inactiveColor); } // otherwise use the inactive color
        rect(xCellLoc[xCounter]+matrixLoc[x], yCellLoc[yCounter]+matrixLoc[y], cellSize[x], cellSize[y]); // draw the cell
      }
    }
  } // close drawMatrix() function


  // function that reads the input from the serial port
  void readSerialInput (Serial arduino) {
    if (!isStarted) { // if this is the first time we are establishing a connection
      isStarted = true; // set isStarted to true
      arduino.write("n"); // respond to arduino to request more data
    } else { // if this is NOT the first time we have received data from the arduino
      String bufferString = arduino.readString(); // read the buffer into the bufferString variable
      if (bufferString != null) { // if bufferString holds data then process the data
        bufferString = bufferString.substring(0, bufferString.length() - 1); // strip the '.' delimiter off the end
        String[] serialValues = splitTokens(bufferString, " "); // separate the two values from the string
        serialLoc[x] = float(serialValues[x]); // assign value to serialLoc[x]
        serialLoc[y] = float(serialValues[y]); // assign value to serialLoc[y]
      }
      arduino.write("n"); // respond to arduino to request more data
    }
  } // close readSerialInput() function


  // returns an array with the unfiltered x and y locations from the serial port
  // (may need to filter data based on range of serial input and requirements of music objects)
  int[] getSerialXY() {
    return int(serialLoc);
  }

  // TO BE CREATED
  // function for user to set whether main input is serial or mouse based
  void setMainControl(int tControlType) {
    mainControl = tControlType;
  }

  // function that sends OSC messages with input values
  void sendOscMessage(OscP5 tOscComm, NetAddress tMyRemoteLoc) {
    float messageX = 0;
    float messageY = 0;

    // open new OSC messages of type x and type y
    OscMessage myOscXMessage = new OscMessage("/controlGrid/x");
    OscMessage myOscYMessage = new OscMessage("/controlGrid/y");

    // determine whether the readings sent via OSC originate from the serial device or the mouse
    if (mainControl == serialControl) {
      messageX = map(serialLoc[x], 0, width, 0, 1);
      messageY = map(serialLoc[y], 0, height, 0, 1);
    } else if (mainControl == mouseControl) {
      messageX = map(mouseX, 0, width, 0.075, 0.125);
      messageY = map(mouseY, 0, height, 0.3, 0.7);
    }

    myOscXMessage.add("x ");     // add a label string to the osc message
    myOscYMessage.add("y ");     // add a label string to the osc message
    myOscXMessage.add(messageX); // add the x value as a float
    myOscYMessage.add(messageY); // add the y value as a float
    tOscComm.send(myOscXMessage, tMyRemoteLoc);
    tOscComm.send(myOscYMessage, tMyRemoteLoc);

    print("X: " + messageX + " ");
    print("Y: " + messageY + " ");
    println();
  }

  // returns an array with the unfiltered x and y locations from the mouse-based interface
  // (may need to filter data based on requirements of music object)
  int [] getMouseXY() {
    int [] mouseXY = {mouseX, mouseY};
    return mouseXY;
  }


  // check if a cell on the virtual Matrix is active based on the mouse location
  void isCellActiveMouse () {
    int XLocMouse = mouseX - matrixLoc[x]; // adjust for location of Matrix within window
    int YLocMouse = mouseY - matrixLoc[y]; // adjust for location of Matrix within window
    isCellActive(XLocMouse, YLocMouse); // check if a cell is active at the current mouse location
  }


  // check if a cell on the virtual Matrix is active based on the current physical location/state of an external object
  void isCellActiveSerial () {
    int xLocSerial = int(map(serialLoc[x], 0, 1024, 0, matrixSize[x])); // map the serial range onto the matrix width
    int yLocSerial = int(map(serialLoc[y], 0, 1024, 0, matrixSize[y])); // map the serial range onto the matrix height
    isCellActive(xLocSerial, yLocSerial); // check if a cell is active at the current serial object location
  }


  // function that checks whether a specific cell is active
  void isCellActive (int tXloc, int tYloc) {
    int xLoc = tXloc; // the X coordinate where the mouse or serial object is located
    int yLoc = tYloc; // the Y coordinate where the mouse or serial object is located

    for (int xCounter = 0; xCounter < xCellLoc.length; xCounter++) { // loop through each element in the xCellLoc array
      for (int yCounter = 0; yCounter < yCellLoc.length; yCounter++) { // loop through each element in the yCellLoc array

        // check whether the mouse or serial object intersects this cell
        if ( (xLoc > xCellLoc[xCounter] && xLoc < (xCellLoc[xCounter] + cellSize[x])) &&
             (yLoc > yCellLoc[yCounter] && yLoc < (yCellLoc[yCounter] + cellSize[y])) ) {
          cellState[previousState[x]][previousState[y]] = false; // set previous grid element to false
          cellState[xCounter][yCounter] = true; // set current element to active status
          previousState[x] = xCounter; // remember x index of active cell
          previousState[y] = yCounter; // remember y index of active cell
        }
      }
    }
  }
}

Making Some Noise
When we started working on this project we assumed that we would be able to use one of Processing’s existing sound libraries to play and modulate an audio loop. However, after doing extensive research into Minim, ESS, and Sonia, I realized that none of these tools offered the feature set we needed for this project.

The next solution that I investigated was Max/MSP. This programming language/environment is definitely capable of providing the functionality that we are looking for. However, no one on our team has the expertise to use it nor the time to learn it for this project.

The solution I ultimately chose was to use OSC to communicate with an external music application that can provide the features we are looking for. I was happy to find a simple library called oscP5 that makes it easy to communicate from a sketch using OSC. Equally important, I also found an application called OSCulator that routes and translates OSC and MIDI messages.

Having figured out how to get the sketch to communicate via OSC and MIDI we set out to find the right application. This was an easy task in large part because both Michael and I are familiar with Ableton.

I am happy to report that we already have Ableton up and running with the virtual matrix application developed in Processing, though that is not to say the sketch is finished. We still need to add start and stop buttons to the interface, along with a volume control (not to mention other improvements and ideas we have not yet considered).

In the next day or so I will share with you more updates, including details about how the physical elements of the interface are shaping up.


IPC Class Notes, Serial Communication P2 – Oct 21, 2009

Tuesday, October 27th, 2009

Today’s class focused on expanding our understanding of serial communications and holding brief discussions regarding our media controller projects.

Media Controller Discussion – Specific Guidance
For our project, one of the main areas we have not yet solved is how to play and modulate sound. In response to our request for input, we briefly discussed the three sound libraries available in Processing:

  • Minim – sound generation and audio playback support, but limited in functionality.
  • ESS – more functionality than Minim, but still a very basic set of features.
  • Sonia – the most powerful, flexible, and complex of the bunch.

(Since we originally held this discussion we have figured out a solution for our needs. I will post more information about it in the next two days.)

Media Controller Discussion – General Guidance
Tom provided the class with an overview of a helpful process for building prototypes. Here is a description of it, along with some additional thoughts of my own. Once you’ve decided on the idea for your project and are ready to start building mental, virtual, and physical prototypes, it is useful to break the idea down into sensing, data processing, and response activities.

When building out the sensing portion of your project, (1) if you have any doubts about whether your plan will work, create simple models to test your strategy. At this stage the simpler the better, though sometimes there is only so much simplification you can achieve.

Once you know that your overall sensing strategy is sound, (2) work on getting your sensors physically set up and connected properly to the circuit. Test the circuit to ensure it works properly and confirm the range of each sensor.

(3) Only after confirming that the sensors are working properly should you move on to setting up communication between the Arduino and the computer (in our case, Processing). There is nothing wrong with working on this section of the code simultaneously – just don’t try to debug your sensors from across the serial connection.

When working on the data processing part of your project, (1) start by focusing on developing a sketch that is able to process fake data before trying to connect the application to handle live sensor readings. (2) It can be helpful to develop a virtual version of your physical interface. It enables you to test code before the physical prototype is done, and can serve as a debugging tool. (3) Once the data processing is working, set-up and test the connection between the sensor, data processing, and response elements.

Follow a similar process for setting up the response mechanism. (1) Make sure that you can get it working on its own, (2) then connect it to the data processing hub (or directly to the Arduino) and test the two together.

Serial Communication – Part 2
Communicating multiple messages at once via the serial port requires one of the following strategies: (1) the delimitation method; (2) handshaking.

(1) Delimitation (or Punctuation)
A communication protocol that uses punctuation characters to demarcate where one piece of data begins and another ends. Here is an overview of the Processing functions you will need to decode the readings from the Arduino (a short example follows the list):

  • portName.bufferUntil() – sets the character at which the serial buffer calls the serialEvent() callback function.
  • trim() – gets rid of any whitespace at the beginning and end of a string; it does not affect the middle of the string.
  • split(string, delimiter) – splits the string wherever the delimiter character is found.
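Here is a minimal sketch of the delimitation approach, assuming the Arduino sends two space-separated values terminated by a newline (“val1 val2\n”):

import processing.serial.*;

Serial arduino;
float sensorOne, sensorTwo;

void setup() {
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');  // call serialEvent() only when a newline arrives
}

void draw() {
}

void serialEvent(Serial port) {
  String bufferString = port.readString();
  if (bufferString != null) {
    bufferString = trim(bufferString);          // strip whitespace from both ends
    String[] values = split(bufferString, ' '); // split at the space delimiter
    if (values.length == 2) {                   // guard against partial messages
      sensorOne = float(values[0]);
      sensorTwo = float(values[1]);
    }
  }
}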

(2) Handshaking
This protocol determines that the Arduino will not send any messages unless it receives a request from the computer. In order to make this protocol work you need to set-up the following logic to govern the communication cadence:

  • On the Arduino side you need an if statement that confirms data has been received via the serial port (Serial.available() > 0) before it sends out any data of its own. Make sure to clear the buffer every time by using the Serial.read() function. This if statement could also check for specific characters being received through the serial port.
  • On the Processing side you need to add code that sends a message to the Arduino every time Processing is ready to receive a new communication. This can be triggered any time Processing is done reading the current serial data buffer (or by using counters and event-based triggers).

Often it is necessary to create a simple function on both the Arduino and Processing sides to establish the initial communication. Here is an example of how these functions might work together: on the Arduino side, a function added to setup() loops repeatedly, sending one-character messages to Processing until it receives a response, confirming that a connection has been established. On the Processing side, whenever a serial communication is received, a simple function checks whether communication has previously been established; if it has not, the communication state is changed and the current communication discarded. If communication had already been established, the data is processed. A sketch of the Arduino side follows.
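Here is a sketch of what the Arduino half of this pattern might look like (the starter character and pin number are arbitrary):

void setup() {
  Serial.begin(9600);
  // keep sending a one-character message until the computer replies
  while (Serial.available() <= 0) {
    Serial.print('A');
    delay(300);
  }
}

void loop() {
  if (Serial.available() > 0) {  // only send data once a request has arrived
    Serial.read();               // clear the incoming buffer
    Serial.println(analogRead(0), DEC);
  }
}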

Miscellaneous Notes
Every time the serial port is opened, the Arduino restarts the sketch it is running.

  • \n = new line
  • \r = carriage return (goes back to the beginning of the line)

Paradigms for Programming Languages

  • Loop-based – Processing and Arduino are loop-based environments because, at their core, they organize code to run in a repetitive loop.
  • Callback-based (event-based) – JavaScript, on the other hand, is a callback-based language that organizes code to run in response to events.

IR cameras are a great way to use a camera so that it can more easily identify the object you need to locate.

Things to Check Out

  • Check out Dan’s site about “The Rest of You” for information about bio-feedback.
  • Look at Aaron’s thesis on cats.

ICM Class Notes on Text Processing and Serial – Oct 14, 2009

Friday, October 23rd, 2009

Last week’s ICM class focused on parsing strings, picking up where we left off the week before, and ended with a short overview of serial communication. Here is a list of the topics we covered:

  • Processing and XML
  • Additional functions for processing text
  • Serial communications

XML Libraries in Processing
XML has a hierarchical, tree-like structure, which makes XML files easier to parse than other types of content. There are several ways to read XML documents in Processing. Here is an overview of the options:

  1. Standard string parsing functions in Processing, like the ones outlined below and in my post from last week.
  2. Existing Processing XML libraries, such as simpleML, XMLElement, and proXML. These libraries let Processing navigate the structure of an XML file, finding data elements by moving through their child or parent structure.
  3. Application programming interfaces (APIs) from the data source. Examples of sites with APIs include Flickr, Google Maps, etc.

Using Tokens to Parse Text
This week we were introduced to the concept of tokens and splits. Tokens are small chunks of text, separated by delimiters that are each only one character long. Here is a list of functions that leverage tokens or splits (an example follows the list):

  • split(string, delimiter); – This function returns multiple strings: the input string is split at each instance of the delimiter, which is removed from the resulting strings.
  • splitTokens(string, delimiters); – This function works like split(), but accepts multiple delimiters, passed together within one set of quotation marks. For example, using “_.” would split the string wherever either “_” or “.” appears (see the example below).
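A quick illustration of the difference between the two (the input string is made up, but the behavior is as described):

String s = "cat_dog.bird";
String[] a = split(s, '_');         // gives {"cat", "dog.bird"}
String[] b = splitTokens(s, "_.");  // gives {"cat", "dog", "bird"}
println(a.length);                  // prints 2
println(b.length);                  // prints 3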

Comparing Text

  • equals(); – This function compares the text contained in a string to another piece of text. For example, string.equals(“stringData”) is the string equivalent of the syntax intVar == 3. We cannot use the Boolean operator “==” to match strings, which is why the equals() function is needed. To compare words regardless of case, first convert them with the string.toLowerCase() function.
  • Regular Expressions – Regular expressions are special text strings for describing search patterns. These capabilities enable more sophisticated parsing of content than is possible through the standard functionality in Processing. This is the ideal way to perform complex data parsing and cleaning. We did not cover this in class, so research will be needed on this front.

Serial Overview
There are two main approaches to creating protocols for serial communication. The first is called punctuation, and it entails adding tokens to the data being communicated to enable the receiving computer to parse the data once received. The second approach, called handshaking, entails having each computer wait to send a message until they have received data from the other connected device.

I will not review these approaches here in detail because I have covered them in my posts from the Intro to Physical Computing class. The punctuation method is described here. I will soon add a link to the post regarding the handshake method (which is currently a work in progress).

Related notes and concepts:

  • Callbacks refer to methods that are called by other applications when a certain event takes place. mousePressed() and serialEvent() are examples of callback (or event) functions available in Processing.
  • To create a new serial object you must always import the Serial library (it is not a standard Processing library). Then, once you’ve declared the object, instantiate it with the following syntax: portName = new Serial(this, “serialPortName”, baudRate);. The “this” argument tells the serial object to set up communication between the serial port and this specific sketch (see the snippet below).
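For reference, the pattern looks like this in a sketch (using Serial.list()[0] assumes your device is the first port on the list):

import processing.serial.*;  // the Serial library always has to be imported

Serial portName;

void setup() {
  // Serial.list() returns the available ports; pick the index that matches your device
  portName = new Serial(this, Serial.list()[0], 9600);
}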

Serial Communication Lab, Sending Multiple Bits of Data

Thursday, October 22nd, 2009

Late last night I completed this week’s lab for the Introduction to Physical Computing class. Our focus this week was on expanding our ability to leverage serial communication to enable a computer and a microcontroller to send and interpret multiple pieces of data at any given time. The key challenge in this process is setting up a protocol that can be encoded by the sender and interpreted by the receiver. Here is a link to the instructions for this lab.

We specifically explored two different approaches for this type of communication: (1) punctuation protocols; (2) handshake (or polite communication) protocols. I will not delve into these in detail here because I will soon post my full notes from this week’s class, which cover these topics at greater length. Here is the video from this week’s lab:


I am happy to report that I did not encounter any issues with either of these approaches. My previous experience setting up this type of communication was a great help. (Here are some links to the previous projects and labs where I had already explored these types of serial communication: Fangs Controller; Etch-a-Sketch; Media Controller.)

That is not to say that I did not learn from this week’s lab. For example, the “establishContact()” function on the Arduino easily solves a problem I struggled with yesterday when building the first draft of the media controller project “middleware”. Also, though I have used this type of functionality before, I am still a novice in need of the kind of reinforcement this lab provides.


Media Controller Project – Phase I

Thursday, October 22nd, 2009

I recently began working on a media controller project with two colleagues from ITP, Zeven Rodriguez and Michael Martinez-Campos. After much discussion, and our fair share of agreements and disagreements, we decided to develop an interface for music modulation. The device is composed of a square horizontal surface coupled with a physical mouse-like object; by sliding the object across the surface, a user can modulate two attributes of the sound being generated.

Ideally, we would like to make the axes of this surface assignable (e.g. you could choose the effect/modulation associated with each axis). It would also be great to give the user the ability to play multiple sounds simultaneously and to choose whether to control all sounds, or just a single sound, with the surface.

That said, this project is being developed as part of our Introduction to Physical Computing curriculum. For our initial deadline, two weeks from now, we have decided to keep things simple; if we get things done sooner we may try to integrate some of these features.

To help get things done efficiently we have divided our roles and responsibilities. Zeven is taking the lead on creating the physical surface and object. Mike is working on investigating the solutions for the sound generation through Processing. I am leading the development of what I am calling the middleware – the application that gets the data from the sensors and feeds it to the program that will generate the music.

Creating the Connection (and a Virtual Grid/Surface)
Earlier today I finished the initial version of the Processing application responsible for reading data from the serial port, interpreting that data, and sending it on to the sound generator. This is an important step in the evolution of this project, though I know that many updates will have to be made to this app over the coming weeks.

Pictures of Arduino Input for Test

One of the biggest challenges in creating this sketch was setting up the serial communication between the Arduino and Processing – we needed to find a way to send data from two sensors to the computer. We decided to use the handshake protocol because it minimizes response delays.

Since I had just learned how to use this type of protocol, it took me a little while to get it working properly. Below I’ve included a brief overview of the issues I encountered, along with the code I wrote for the Arduino and a link to the Processing application.

Using the Handshake Communication Protocol
To set up the handshake protocol, the first thing I did was create a syntax for the communications. Unfortunately, that syntax did not work, so I had to update it a few times to smooth out all of the kinks. Below I have shared the original and revised protocols.

  • Initial protocol: “valueOne.valueTwo.\n\r” (e.g. “224.200.\n\r”). The newline and carriage return characters were appended by the println() function that I used to send data in my first attempts.
  • Final protocol: “valueOne valueTwo.” (e.g. “224 200.”). After numerous frustrating attempts to get the initial protocol to work, I decided to simplify it as outlined above.

Another challenging issue I encountered along the way was resolving the source of an “array out of bounds” error. After some investigation I realized that the error was generated within the function I created to read serial data: I needed to confirm that a connection had been established with the Arduino before starting to process the data coming in from it.

Once I understood the problem it was easy to fix. The solution was to add an “if” statement to check whether a piece of data received by the computer is the first piece of data in the communication stream. I noticed that the code sample from this week’s labs features a similar solution.
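Here is a condensed version of that fix from my sketch (the “n” character is the request byte from my handshake protocol):

boolean isStarted = false;  // has the Arduino made contact yet?

void serialEvent(Serial arduino) {
  if (!isStarted) {
    isStarted = true;    // the first byte is just the Arduino saying hello
    arduino.write("n");  // reply to request the first real reading
  } else {
    String bufferString = arduino.readString();
    // ...parse the "valueOne valueTwo." protocol here...
    arduino.write("n");  // request the next reading
  }
}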

Processing Sketch

Here is a link to the Processing sketch that I developed (please note that I’ve commented out all serial-communication functionality so that this sketch can run online). Below you will find the code for the Arduino.

Code for the Arduino

int analogPin1 = 0;
int analogPin2 = 1;
int analogValue1 = 0;
int analogValue2 = 0;

void setup()
{
  // start serial port at 9600 bps and send one byte to establish contact:
  Serial.begin(9600);
  Serial.write('.');
}

void loop()
{
  // only send new readings when the computer has requested more data:
  if (Serial.available() > 0) {

    analogValue1 = analogRead(analogPin1);
    Serial.print(analogValue1, DEC);
    Serial.print(' ');

    analogValue2 = analogRead(analogPin2);
    Serial.print(analogValue2, DEC);
    Serial.print('.');

    Serial.read();  // clear the incoming buffer
  }
}

Setting-up an Accelerometer – Success!

Wednesday, October 21st, 2009

Earlier today I met with one of the residents at ITP to discuss the issues I have been encountering with my 3-axis accelerometer (ADXL335). After we met for a mere five minutes, Ithai informed me that the issue was likely caused by the fact that I had not soldered the header pins to the breakout board. My initial instinct NOT to solder the header pins to the board, in case the accelerometer was not working, proved to be overly cautious.

Here are the charts featuring the latest data I collected from the accelerometer

The good news is that the accelerometer is now working, and I only pulled out a few hairs in the process. Now that it is working, I have a few additional lessons and resources to share with anyone hooking up an accelerometer. I hope these help you get up and running without any hair pulling:

  1. Make sure that you have soldered the header pins to your accelerometer breakout board before you start testing.
  2. Use the AREF pin on the Arduino to set the analog reference voltage to 3.3V and improve the sensor readings.
  3. Use a running average of recent readings, or some other stabilization algorithm, to help reduce noise in the accelerometer readings.
  4. Check out the code samples below for different ways to test your new accelerometer.
  5. Whichever axis is in the vertical position will show a different sensor reading due to gravity, even at rest.
  6. The sensor for each axis is only able to alter its resistance by roughly ±15%.

Code Sample 1 – As Simple as You Can Get

int xAxis = 0;
int yAxis = 1;
int zAxis = 2;
int zInput = 0;
int yInput = 0;
int xInput = 0;

void setup () {
  Serial.begin(9600);
}

void loop () {
  xInput = analogRead(xAxis);
  delay (10);
  yInput = analogRead(yAxis);
  delay (10);
  zInput = analogRead(zAxis);
  delay (10);

  Serial.print("Input (xyz): ");
  Serial.print(xInput);
  Serial.print(", ");
  Serial.print(yInput);
  Serial.print(", ");
  Serial.print(zInput);
  Serial.println(".");
}

Code Sample 2 – Capture Base Readings and Then Report Difference from Base
This sample was developed by Andy Davidson, and taken from the Arduino message boards.

/* ADXL335test6
 Test of ADXL335 accelerometer
 Andy Davidson
 */

const boolean debugging = true;    // whether to print debugging to serial output
const boolean showBuffer = false;  // whether to dump details of ring buffer at each read

const int xPin = 0;  // analog: X axis output from accelerometer
const int yPin = 1;  // analog: Y axis output from accelerometer
const int zPin = 2;  // analog: Z axis output from accelerometer
const int led = 13;  // just to blink a heartbeat while running

const int totalAxes = 3;  // for XYZ arrays: 0=x, 1=y, 2=z

const int baseSamples = 1000;  // number of samples to average for establishing zero g base
const int bufferSize = 16;     // number of samples for buffer of data for running average
const int loopBlink = 100;     // number of trips through main loop to blink led

// array of pin numbers for each axis, so the constants above can be changed with impunity
const int pin [totalAxes] = {
  xPin, yPin, zPin};

// base value for each axis - zero g offset (at rest when sketch starts)
int base [totalAxes];

// ring buffer for running average of data, one for each axis, each with bufferSize samples
int buffer [totalAxes] [bufferSize];

// index into ring buffer of next slot to use, for each axis
int next [totalAxes] = {
  0,0,0};

// current values from each axis of accelerometer
int curVal [totalAxes];

// count of trips through main loop, modulo blink rate
int loops = 0;



void setup() {

  long sum [totalAxes] = {  // accumulator for calculating base value of each axis
    0,0,0        };

  Serial.begin   (9600);
  Serial.println ("***");

  // initialize all pins
  pinMode (led, OUTPUT);
  for (int axis=0; axis < totalAxes; axis++)
    pinMode (pin [axis], INPUT);    // not necessary for analog, really

  // read all axes a bunch of times and average the data to establish zero g offset
  // chip should be at rest during this time
  for (int i=0; i < baseSamples; i++)
    for (int axis=0; axis < totalAxes; axis++)
      sum [axis] += analogRead (pin [axis]);
  for (int axis=0; axis < totalAxes; axis++)
    base [axis] = round (sum [axis] / baseSamples);

  // and display them
  Serial.print ("*** base: ");
  for (int axis=0; axis < totalAxes; axis++) {
    Serial.print (base [axis]);
    Serial.print ("\t");
  }
  Serial.println ();
  Serial.println ("***");

  // initialize the ring buffer with these values so the averaging starts off right
  for (int axis=0; axis < totalAxes; axis++)
    for (int i=0; i < bufferSize; i++)
      buffer [axis] [i] = base [axis];

  // light up the led and wait til the user is ready to start (sends anything on serial)
  // so that the base values don't immediately shoot off the top of the serial window
  digitalWrite (led, HIGH);
  while (!Serial.available())
    /* wait for input */    ;
  digitalWrite (led, LOW);

}



void loop() {

  // increment the loop counter and blink the led periodically
  loops = (loops + 1) % loopBlink;
  digitalWrite (led, loops == 0);

  // get new data from each axis by calling a routine that returns
  // the running average, instead of calling analogRead directly
  for (int axis=0; axis < totalAxes; axis++) {
    curVal [axis] = getVal (axis, showBuffer);
    if (debugging) {
      Serial.print (curVal [axis]);
      Serial.print ("\t");
    }
  }
  if (debugging)
    Serial.println ();

  // here we will do all of the real work with curVals

}



int getVal (int axis, boolean show) {

  // returns the current value on the given axis, averaged across the previous bufferSize reads
  // prints details of the ring buffer if show is true

  long sum;  // to hold the total for averaging all values in the buffer

  // read the data into the next slot in the buffer and stall for a short time
  // to make sure the ADC can cleanly finish multiplexing to another pin
  buffer [axis] [next [axis]] = analogRead (pin [axis]);
  delay (10);    // probably not necessary given the stuff below

  // display the buffer if requested
  if (show) {
    for (int i=0; i < bufferSize; i++) {
      if (i == next [axis]) Serial.print ("*");
      Serial.print (buffer [axis] [i]);
      Serial.print (" ");
    }
    Serial.println ();
  }

  // bump up the index of the next available slot, wrapping around
  next [axis] = (next [axis] + 1) % bufferSize;

  // add up all the values and return the average,
  // taking into account the offset for zero g base
  sum = 0;
  for (int i=0; i < bufferSize; i++)
    sum += buffer [axis] [i];

  return (round (sum / bufferSize) - base [axis]);

}

Linda Stone – Continuous Partial Attention and More

Wednesday, October 21st, 2009

Earlier today I had the opportunity to hear Linda Stone speak at an Applications of Interactive Telecommunications Technology class. Linda has worked in the technology industry for over 20 years, having spent time at some of the sector’s biggest and most innovative organizations, such as Microsoft and Apple. Most recently, her attention has been focused on the phenomenon of “Continuous Partial Attention”.

Continuous partial attention refers to an artificial state of crisis that we create (that’s right, we have to take responsibility here) through our attempts to never miss anything and to be connected, always on, anytime, anywhere. It is distinct from multi-tasking, which usually connotes a focus on productivity (not the case with continuous partial attention). There is more information about this concept on Linda’s blog.

Below I’ve compiled a brief overview of my notes from today’s event. My focus here has been to capture high-level ideas that may serve to inspire my future projects and research at ITP.

Top Three Ideas

  • Our current always-on state of being is unhealthy and unsustainable
  • Society’s focus is trending from thinking and doing toward sensing and feeling
  • Opportunity to bring the body back into our interactions with computers

More Detailed Overview

The condition of continuous partial attention keeps people in a constant, low-level state of fight or flight. This state is neither healthy nor sustainable.
  • Physiologically, the chemical impact of remaining in this state for prolonged periods of time has a negative effect on our mental and physical wellbeing.
  • Medical research shows that being in a chronic state of fight or flight has negative physiological and psychological impacts (e.g. depression).
  • Breathing exercises and meditation are among the many tools we can use to manage our state of mind (and up-regulate the parasympathetic nervous system).

To date, our interactions with technology have for the most part ignored physicality, which has had a negative impact on our lives.

  • When we engage with computers we often have bad posture and even neglect to breathe.
  • Breathing is linked to attention and emotion, so physical ways of engaging with computational devices can help us on these levels as well.

A new era is beginning, where people are going to be looking for technologies that provide quality of life (not just simplicity).

  • Opportunities to explore how to use ambient or environmental technologies to create contexts that help people relax by stimulating/engaging our parasympathetic nervous system.

Technology can be used to change people’s behavior without the need for incentives and punishments.

  • The Prius demonstrates how giving individuals the ability to self-regulate is often sufficient to change behavior.
  • The Fun Theory campaign from VW shows how making new interactions fun can also change people’s behavior. I’ve embedded one of the videos below.

Book Recommendations


Interactions – Self Checkout

Wednesday, October 21st, 2009

This week’s assignment for my Physical Computing class was to observe the use of an interactive technology in public (link to full description). For this assignment I chose to focus on the self-checkout machines at Home Depot. This type of device, which began to appear in retail locations a few years back, is still fraught with interaction challenges and obstacles, so I thought it would be an ideal subject.

My Expectations
Here is an overview of my assumptions about how the self-checkout machines should work: the shopper approaches the machine with a shopping cart and/or basket. The touchscreen monitor on the self-checkout kiosk displays a prompt for the consumer to press a large button on the screen to initiate the checkout process. This is coupled with an audio prompt instructing the user to take this action – “touch the screen to start your checkout”.

Once the user has chosen to move forward with the checkout process, the machine asks him/her to scan the first item and place it in a plastic bag that sits on a scale built into the surface next to the scanner. The machine uses the weight of the bag to verify that the shopper is not adding unscanned items to (or forgetting to add scanned items to) the shopping bags.

Once all of the goods have been scanned, the machine offers the shopper standard payment options such as cash, debit, and credit card. Paying with cash involves an interaction similar to buying a soda from a vending machine; paying with a debit or credit card is navigated through the touchscreen.

Throughout this process, one or two store clerks are responsible for monitoring several self-checkout kiosks. Their main role is to ensure that customer issues are quickly resolved (rather than to police the customers and prevent theft).

Real-World Observations
On average, self-checkout at Home Depot takes between 60 seconds and 4 minutes per person. The process seems simple for the most part, with some notable exceptions that cause a lot of user frustration. Now let’s briefly examine each step of the process.

For the most part, users did not have any issues initiating the purchase process. They were able to walk up to the self-checkout machines and get started without a hitch.

The process of scanning items was by far the biggest source of issues. The scanning itself was not a problem for any shopper; however, as described in my expectations above, the self-checkout kiosks feature a scale in the area where customers bag their purchases, and this was the source of the confusion and issues that arose.

Many customers encountered one of two issues here: either they placed an item in the bag that the scale was not able to detect, or they removed an item from the bag at a moment when the machine was not expecting it – in either case a store clerk had to help the shopper finish the transaction. Here is a video from YouTube that shows customers having this type of issue at a supermarket self-checkout line.

Once shoppers were able to get all of their items scanned, the payment process seemed to run smoothly.