Finger Paint Mandala is a multitouch application that enables people to create musical paintings that gradually fade away. The project was inspired by the sand mandalas created for meditation in Buddhist traditions.
Here is a brief excerpt from Wikipedia regarding sand mandalas: “As a meditation on impermanence (a central teaching of Buddhism), after days or weeks of creating the intricate pattern of a sand mandala, the sand is brushed together and placed in a body of running water to spread the blessings of the mandala.” [link to Wikipedia article]
Here is how Finger Paint Mandala works: the artist uses a finger to paint on a multitouch display. They can select a color for each stroke, and each color plays a different sound when a new paint stroke is created. For the first build of this project (which I hope to display at the Spring show), the paint strokes and sounds will fade gradually over 30 seconds once the artist lifts their fingers.
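As a rough sketch of the fade behavior described above (the function name and the linear fade curve are my assumptions, not the project's actual code), a stroke's opacity could be computed from the time since the artist lifted their finger:

```python
FADE_SECONDS = 30.0  # strokes fade out over 30 seconds, per the first build

def stroke_alpha(released_at: float, now: float) -> float:
    """Return stroke opacity in [0.0, 1.0]: fully opaque at release,
    fully transparent once FADE_SECONDS have elapsed (linear fade)."""
    elapsed = now - released_at
    return max(0.0, 1.0 - elapsed / FADE_SECONDS)
```

The same value could also scale the volume of the stroke's note, so the paint and its sound fade together.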
In the second build I plan to give artists the freedom to create a full drawing, or mandala, over a period of several minutes. I also plan to allow artists to destroy their own creations by blowing on the surface of the display (I will use piezo sensors in the display case to sense the blowing). This action would also generate sound, as if the notes that had been painted were being blown away.
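A minimal sketch of how the piezo blow detection might work (the threshold and sample counts are hypothetical; real sensor readings would need calibration): require a burst of loud readings within a sampling window, so a single spike from a tap doesn't destroy the painting:

```python
BLOW_THRESHOLD = 0.6   # hypothetical normalized piezo amplitude
MIN_LOUD_SAMPLES = 3   # how many loud readings count as a deliberate blow

def is_blow(window):
    """Return True if enough samples in the window exceed the threshold,
    filtering out isolated spikes from taps or noise."""
    loud = sum(1 for s in window if s >= BLOW_THRESHOLD)
    return loud >= MIN_LOUD_SAMPLES
```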
Building the Multitouch Display
To create this multitouch screen I will use Diffused Surface Illumination (DSI) with an LCD monitor (rather than a projector). A post on the Multitouch Development Blog provides a good comparison of the different approaches that exist. For a camera I plan to use the PS3 Eye, which comes highly recommended by colleagues and online resources.
So far I have converted the camera to an IR camera, and I have also built the IR light panel using the special EndLight acrylic, whose reflective properties make DSI possible. I am still working on dismantling the LCD monitor and building an appropriate case. Below are some pictures of the camera work and the process of taking apart the LCD.
Creating the Software
I have decided to use a combination of Puredata and Processing to create the application. Puredata will serve as the audio synthesizer and controller, while Processing will be used primarily for graphics generation. For video capture and touch tracking I plan to use reacTIVision.
I have already built initial versions of the Processing and Puredata sketches. That said, much work remains to be done. Thus far I have only created applications using a mouse paradigm (a single point of input). Over the coming week I will have to convert them to work with multitouch input (multiple points of input).
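The core of that conversion is bookkeeping: instead of one in-progress stroke, the application has to track one stroke per touch point. A sketch of that structure (class and method names are mine, not from the project), keyed by the session ID each touch event carries:

```python
class StrokeTracker:
    """Track one in-progress stroke per touch point, keyed by session ID."""

    def __init__(self):
        self.active = {}    # session_id -> list of (x, y) points
        self.finished = []  # completed strokes, ready to start fading

    def touch_down(self, sid, x, y):
        self.active[sid] = [(x, y)]

    def touch_move(self, sid, x, y):
        self.active.setdefault(sid, []).append((x, y))

    def touch_up(self, sid):
        # Move the stroke to the finished list; the 30-second fade starts here.
        self.finished.append(self.active.pop(sid, []))
```

With this in place, two fingers drawing at once simply become two entries in `active`, and the old mouse code maps onto a single fixed session ID.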
I have been reading up on the TUIO protocol to prepare for this transition. I have also been investigating the Puredata and Processing libraries that can handle multitouch input. The NUI Group forum has been especially helpful, along with the Peau Productions website.
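For context, TUIO delivers touch points as OSC messages on the /tuio/2Dcur profile: "set" messages carry a session ID and normalized coordinates (followed by velocity and acceleration), and "alive" messages list the session IDs still on the surface. A sketch of dispatching already-decoded messages (the surrounding OSC receiver is assumed; in practice a TUIO client library would handle this):

```python
def handle_2dcur(args, cursors):
    """Update `cursors` (session_id -> (x, y)) from one decoded
    /tuio/2Dcur OSC message, per the TUIO cursor profile."""
    if args[0] == "set":
        # set: session_id, x, y, then velocity/acceleration (ignored here)
        sid, x, y = args[1], args[2], args[3]
        cursors[sid] = (x, y)
    elif args[0] == "alive":
        # alive: lists the session IDs still touching; drop everything else
        live = set(args[1:])
        for sid in list(cursors):
            if sid not in live:
                del cursors[sid]
```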