Grainita is a new musical instrument for novices, drawing inspiration from sand art, ambient music, and stochastic composition. It explores new forms of musical interaction, combining audio-visual art paradigms, computer vision, and machine improvisation through algorithmic composition.
‘Sand art’ is the name given to the practice of drawing or painting with sand, as in the following video.
Notice that this art form has a strong temporal dimension: the process of creating the final image is as much a part of the art as the image itself. It is generally abstract in nature, relying on the inferences and assumptions of the viewer. In these properties, sand art bears a strong resemblance to ambient music. Moreover, sand art is usually accompanied by unrelated music. This inspired me to create a unified art paradigm: a new musical instrument for novices with a unique interaction scheme.
An important goal while designing Grainita was a DIY ethos and low material cost. The initial setup consisted of a ‘light box’ made from a hollowed-out IKEA table costing approximately 7 USD. After the interior was spray-painted a glossy white for increased reflection, a strip of LED lights was installed. The top of the table was replaced with a translucent acrylic sheet. (Thanks to Prof. Tripp Edwards and the power tools of the Design Shop in the College of Architecture, Georgia Tech, for helping out with this!)
Grainita uses computer vision to capture the sand on the surface and correct for errors, using a USB webcam connected to Max/MSP. A flaw was found in the aforementioned design: the camera was positioned overhead and captured the hands of the artist as they performed with the sand. There were some potential workarounds, but their overhead was too large, and a redesign was necessary.
The modification was to place the camera underneath the acrylic surface, held up by the four legs of the original IKEA table. This proved to be a simple and functional solution, leading to ‘version 1.1’.
The video feed from the camera was processed and tessellated into a 4×4 grid on the computer. The presence of sand in each sub-frame was detected, and if it crossed a certain threshold, a MIDI note was generated. The exact note depends on the scale/mode and root note, which the user can set through a graphical slider interface.
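The grid-to-note mapping described above can be sketched outside Max/MSP roughly as follows. This is a minimal illustration, assuming the frame arrives as a 2D brightness array; the cell count, threshold value, and scale table are stand-ins for the actual patch parameters.

```python
MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def cells_to_notes(frame, rows=4, cols=4, threshold=0.3, root=60, scale=MAJOR):
    """Return MIDI notes for grid cells whose mean brightness crosses threshold."""
    h = len(frame) // rows
    w = len(frame[0]) // cols
    notes = []
    for r in range(rows):
        for c in range(cols):
            # mean brightness of this sub-frame stands in for "presence of sand"
            cell = [frame[y][x]
                    for y in range(r * h, (r + 1) * h)
                    for x in range(c * w, (c + 1) * w)]
            if sum(cell) / len(cell) > threshold:
                idx = r * cols + c
                # map the cell index onto the chosen scale, wrapping by octaves
                notes.append(root + 12 * (idx // len(scale)) + scale[idx % len(scale)])
    return notes
```

With a 4×4 grid and a 7-note scale, the 16 cells span a little over two octaves above the root.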
Another goal was to facilitate the modification of control parameters without having to stop a performance and reach over to a computer; an Arduino connected to a pushbutton matrix was used to accomplish this.
Grainita also ‘learns’ from a user’s inputs and can accompany the human performance. It uses a second-order Markov process to store the note information; the duration information is computed by an independent first-order Markov process. The user can turn the accompaniment on and off, as well as change the tempo at which it plays back, using the Arduino and the keypad matrix. The computer improvisation module generates MIDI as well, which is routed to a DAW on a separate channel, allowing the user to create multiple layered sounds.
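The note side of the improvisation idea can be sketched as a second-order Markov chain: each pair of consecutive notes indexes a table of observed continuations (durations would use an analogous first-order table). The class and method names here are illustrative, not taken from the actual patch.

```python
import random
from collections import defaultdict

class NoteMarkov2:
    """Second-order Markov model over MIDI note sequences."""

    def __init__(self):
        # (note_two_back, note_one_back) -> list of observed next notes
        self.table = defaultdict(list)

    def train(self, notes):
        """Record every (prev2, prev1) -> next transition in the sequence."""
        for a, b, c in zip(notes, notes[1:], notes[2:]):
            self.table[(a, b)].append(c)

    def generate(self, seed, length):
        """Walk the chain from a two-note seed, sampling continuations."""
        a, b = seed
        out = [a, b]
        for _ in range(length):
            nexts = self.table.get((a, b))
            if not nexts:
                break  # unseen context: stop rather than guess
            a, b = b, random.choice(nexts)
            out.append(b)
        return out
```

Sampling from the observed continuations (rather than always taking the most frequent one) is what makes the accompaniment an improvisation rather than a replay.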
Grainita also has a built-in looper function, which ‘records’ user-played material and plays it back in an infinite loop. The performer can turn this on or off, and, like the computer improvisation module, it is routed to a different MIDI channel, resulting in a total of three channels a user can harness.
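Conceptually, the looper just replays its recorded events with a period equal to the loop length. A minimal, non-real-time sketch of that behavior (names and the event representation are illustrative):

```python
def loop_events(events, loop_len, until):
    """Replay recorded (time, note) events cyclically up to time `until`.

    events: list of (onset_time, midi_note) captured during recording
    loop_len: length of one loop cycle, in the same time units
    """
    out = []
    cycle = 0
    while cycle * loop_len < until:
        for t, note in events:
            t_abs = cycle * loop_len + t  # shift each event into this cycle
            if t_abs < until:
                out.append((t_abs, note))
        cycle += 1
    return out
```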
Sand is an interesting medium for visual art: it is tangible, yet can be shaped quickly and easily. There is something that attracts people to it; many acquaintances reached out just to touch the sand I was working with. Working with sand on a rectangular surface provided a two-dimensional spatial parameter, which was used to generate MIDI control data.
For this, the video feed was divided into two halves, allowing for manipulation of two output MIDI channels. In each half, the position of the centroid of the largest ‘blob’ was calculated and scaled into a 7-bit range for MIDI. These data can be used to control software synthesizer parameters, independently of the other modules.
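The centroid-to-controller mapping amounts to averaging the blob’s pixel coordinates and rescaling into 0–127. A minimal sketch, assuming the blob is given as a list of (x, y) pixel coordinates (the function name is illustrative):

```python
def centroid_to_cc(blob_pixels, width, height):
    """Scale a blob's centroid into two 7-bit MIDI controller values."""
    n = len(blob_pixels)
    cx = sum(x for x, _ in blob_pixels) / n  # centroid x, in pixels
    cy = sum(y for _, y in blob_pixels) / n  # centroid y, in pixels
    # normalize to [0, 1), then quantize to the 7-bit MIDI range 0-127
    cc_x = min(127, int(cx / width * 128))
    cc_y = min(127, int(cy / height * 128))
    return cc_x, cc_y
```

The two values could then be sent as MIDI control-change messages on the channel assigned to that half of the surface.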
There are a few modifications and improvements I had in mind, including discarding the Arduino and keypad interface in favor of TouchOSC. There are many more parameters that could be extracted from the sand patterns, and these could control music in unique ways. Watch this page, there is more to come!

Tags: 2013, Arduino, Computer Vision, Georgia Tech, Max, Music, NIME, Stochastic Composition