I have seen the Fnords!

Live performance is an integral part of music. It is my belief that the sophisticated sounds that can be created with computers need to be controlled by an equally sophisticated physical instrument. Keyboard and woodwind interfaces have been created to control digital sound, but these offer poor control of a complex synthesizer and none of the tactile feedback that is so important in performance. Performers and luthiers of the past refined specialized instruments, such as violins and clarinets, into quality devices that a performer used to create and control sound. Because of the open and reconfigurable nature of sound today, performers need an equally open and reconfigurable instrument, one that uses haptic feedback to keep the player in touch with the sound. This piece is a step in that direction.

Phantom Omni

I have seen the Fnords! is the second in a series of experiments featuring the SensAble Phantom Omni haptic device. For this work, I have used the OpenHaptics API to create an interface for controlling live spatialized sound. The software I have written creates a virtual concert hall that is felt by the user, so that sound can be spatialized in real time under the control of the performer. To accompany this software, I have created a quasi-improvised performance using larger-scale gestures to both shape the sound and spatialize it in a real space. Three 'instruments' are used in the piece, with parameters of each controlled by the haptic device.

A fnord is that which is present but not seen, and the source of confusion. To see the fnords means to be unaffected by the supposed hypnotic power of the word.

haptic environment

A major component of the piece is the haptic design. I wanted to create a simplified layout of the concert hall so that my gestures could control the spatialization of one of the instruments directly. To do this, I created a cube of rigid walls to represent the performance space. Inside it I placed a force gradient so that the most 'restful' areas lie on the ambisonic circle, while the most unstable area is right in the center. In fact, the force curves are so steep that the device seems to come alive and resist the performer's motion. Complete control is difficult, and this off-balance interaction has a great influence on the gestures the performer generates.
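The gradient described above can be sketched as a radial spring pulling the stylus toward a ring, with the center as an unstable equilibrium. This is a rough Python sketch, not the actual OpenHaptics code; the ring radius and stiffness constants are illustrative, not measured values from the piece.

```python
import math

def restoring_force(x, z, ring_radius=0.5, stiffness=8.0):
    """Radial spring force toward a ring of radius ring_radius.

    Points inside the ring are pushed outward, points outside are pulled
    inward, so the ring is the only 'restful' region. The exact center
    returns zero force but any nudge grows: an unstable equilibrium,
    matching the feeling that the device resists settling in the middle.
    """
    d = math.hypot(x, z)
    if d == 0.0:
        return (0.0, 0.0)                    # unstable equilibrium at center
    mag = stiffness * (ring_radius - d)      # > 0 inside ring, < 0 outside
    return (mag * x / d, mag * z / d)        # force along the radial direction
```

Steepening the `stiffness` parameter produces the "come alive" quality: small displacements off the ring generate large corrective forces.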

To control the other sound elements, I track the stylus pitch, yaw, roll, and 3D location. This information is sent via OSC to a SuperCollider server, which serves as the synthesizer.
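An OSC message for this kind of per-frame stylus update is simple to assemble by hand: a padded address string, a type-tag string, and big-endian 32-bit floats. The sketch below is illustrative Python; the address '/omni/stylus' and the argument order are assumptions for the example, not the addresses used in the actual patch.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message whose arguments are all float32."""
    tags = "," + "f" * len(floats)           # e.g. ",ffffff" for six floats
    msg = _pad(address.encode()) + _pad(tags.encode())
    for f in floats:
        msg += struct.pack(">f", f)          # big-endian float32, per the OSC spec
    return msg

# one packet per haptic frame: pitch, yaw, roll, x, y, z
packet = osc_message("/omni/stylus", 0.1, -0.4, 0.25, 10.0, 20.0, 30.0)
```

The resulting bytes would be sent over UDP to the SuperCollider server's listening port; in practice a library such as liblo or python-osc handles this encoding.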

sound environment

The sound component of the piece is created from three instruments, each externally controlled via OSC.

  1. barrel - the 'drums' of the band
    This instrument consists of a prerecorded track featuring FM-synthesized booming sounds. I use the vertical position of the Omni to control the cutoff frequency of a low-pass filter, as well as trigger the track to start with one of the buttons.
  2. chant - the 'bass' of the band
    This instrument is the simplest and just consists of prerecorded bass sounds that are triggered to start by the performer using one of the Omni's buttons.
  3. glitch - the 'lead' of the band
    This instrument is the most complex and requires the most Omni control. In honor of the unofficial pop-music theme, I started with a heavily distorted and time stretched version of "La Vida Loca" by Ricky Martin, and then convolved that with a Dust signal to granularize it. I then ran that through my custom BarkDelay algorithm, which delays each Bark band independently. After that, I apply a favorite convolution phase effect.
    The Omni controls are as follows:
    pen pitch => grain frequency
    pen yaw => BarkDelay time
    pen roll => phase effect cross fade
    X,Y positions => location within the performance space for ambisonic encoding
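The mappings above amount to scaling each stylus axis into a parameter range, plus converting the X,Y position into an angle for the ambisonic encoder. This Python sketch shows the idea; every range here is an illustrative guess, not a measured Omni limit or the values used in the SuperCollider patch.

```python
import math

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def stylus_to_params(pitch, yaw, roll, x, y):
    """Map stylus pose to glitch-instrument parameters (ranges hypothetical)."""
    return {
        "grain_freq":  scale(pitch, -1.0, 1.0, 2.0, 60.0),   # grains per second
        "delay_time":  scale(yaw,   -1.0, 1.0, 0.01, 1.0),   # BarkDelay time, s
        "phase_xfade": scale(roll,  -1.0, 1.0, 0.0, 1.0),    # dry/wet crossfade
        "azimuth":     math.atan2(y, x),  # angle in the hall for ambisonic encoding
    }
```

Each value in the returned dictionary would be sent as its own OSC control message to the corresponding synth parameter.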

The organization of instruments and use of materials was roughly inspired by a simple pop-music format. The organization of the music itself, however, had nothing to do with pop. I created a small library of gestures and organized a quasi-improvised performance that pitted me against the machine. The exact control of each instrument emerged from the push and pull between myself and the Omni.

The SuperCollider3 code can be found here:


I have seen the Fnords! was performed October 25, 2005 at UW Meany Hall, and February 10, 2006 at DXARTS.

further plans

I think the technology of haptics will make a profound impact in the art world, and that advances in the devices used to create force feedback will enable a new medium in art – force itself.

Further plans include creating a more general-purpose way of creating haptic spaces so that new interfaces can be created from modular components without having to use the OpenHaptics API directly.