The I-Cube System: moving towards sensor technology for artists
Simon Fraser University
Infusion Systems Ltd.
This text (without graphics) is published in the ISEA 95 proceedings.
© Copyright 1995 Axel Mulder. All rights reserved.
Art can be called interactive if it responds intelligently (with changing lights, sounds, images, moving objects etc.) to an action by a performer or visitor, or to a changing environment. To add such interactive capabilities to their art or performances, artists have to engage in a costly and difficult dialogue with highly skilled technical specialists. A data acquisition and processing system based on MIDI and Opcode's Max is proposed to facilitate, for artists, the design and creation of interactive art.
Many artists include some form of interaction in their creations (Atkins (1994), Crawford (1994), Demers (1993), Schiphorst (1992), Malina in Leopoldseder (1990)). An interactive art installation may respond to an action of a visitor, or, in a performance, the artist may control or interact with one or more media. Sensing devices are required to detect the actions of the visitor or performer. In addition, it may be of interest to capture environmental variables such as room temperature or wind speed. Until now, artists have had to fall back on existing, commercially available controllers or sensing devices designed for specific applications, i.e. with little flexibility, to include such interaction.
Before examining existing sensing devices, it is important to distinguish the levels of abstraction that can be used in describing events and changes in the environment and human behaviour. For example, the description of an event or change can be:
physical (light level in lux is represented as a voltage)
signal (rate of increase of light level)
gestural or environmental (a hand moves away from a light sensor, or the lights are coming up)
emotional or multimedia (tension increases in the currently playing sequence of sounds, lights, images etc.)
These distinctions are important because the aim is to interpret the events or changes in a given context so that they can be used to generate other events or changes. They therefore need to be expressed in a representation similar to that of the context. This can be achieved by analysing the events and extracting features, information, meaning etc. For instance, if the system describes touch only as the amount of pressure exerted on a surface by a finger, it is not apparent from the data, without further analysis, whether someone is hitting the surface or stroking it. Transducers describe an event or change at only one level of abstraction, i.e. in physical terms. They are devices that generate an electrical signal (voltage, current, charge, ...) as a result of an event. Sensors and detectors, however, address a variety of levels of abstraction. Sensors, transducers and detectors are all sensing devices. These distinctions are also very useful in the dialogue between artists and technologists, since they often communicate at different levels of abstraction.
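The hit-versus-stroke example can be sketched as a small feature-extraction step over raw pressure readings. This is an illustrative sketch only; the function name, threshold and sample data are assumptions, not taken from any system described here.

```python
def classify_touch(samples, hit_rise=50):
    """Classify a series of pressure readings (arbitrary units) as a
    'hit' or a 'stroke'.

    A hit shows a steep rise between consecutive samples; a stroke a
    slow, sustained change. The threshold is an illustrative guess.
    """
    rises = [b - a for a, b in zip(samples, samples[1:])]
    return "hit" if max(rises, default=0) > hit_rise else "stroke"

hit_data = [0, 0, 90, 40, 5, 0]            # sharp impact: fast rise, fast decay
stroke_data = [0, 5, 10, 15, 18, 20, 18]   # gradual pressure change
```

The physical-level data (pressure values) only become gesturally meaningful after an analysis step of this kind.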
Existing sensing devices and the interactive art design process
A variety of devices that output MIDI data exist (MIDI keyboard, MIDI guitar, Yamaha EWI (a MIDI wind instrument), Zeta violin). However, they are only suited for very specific gestures in musical performance practice. Computer input devices (keyboard, mouse, trackball, joystick, tablet) require know-how to access the sensor data, and they too are designed for specific gestures, mostly for work with computer displays, i.e. involving the visuo-motor system. Scientific research instrumentation (general purpose data acquisition systems, motion capture installations, instrumented gloves) requires a lot of know-how to operate and is also very expensive.
In many cases the artist's interaction needs call for instrumentation with a different shape, configuration or capabilities than is available, or the artist cannot afford, or does not need, the extra capabilities of the systems that would cover those needs. Also, the process towards a final art piece may require a lot of experimentation with different sensing capabilities, which implies purchasing a number of different specialized sensing devices. On the other hand, designing and building sensing devices from individual hardware components is currently technically challenging: few artists can experiment with transducers and build sensing devices that suit their needs within a reasonable amount of time and money. Usually, the artist falls back on very simple systems, such as alarm installation components, that were designed for the consumer market but can be hacked quickly into an art installation or performance. Very little flexibility and reliability normally results.
While it can be argued that the resulting technical solution is an integral part of the art piece, it is hard to believe that the point of the art piece is to convey to a visitor or an audience the technical solution itself. Such art would be indistinguishable from a demo of a technological research result. However, it is clear that the boundary between art and technology can be as thin as a silicon wafer.
Design of a sensing device development system for artists
With the above considerations in mind, the design of a solution to the varying sensing needs of an artist has a somewhat surrealistic feel. On the one hand, such a design would "improve" the day-to-day life of artists, since they would not need to delve as deeply into technical knowledge and could put an interactive art piece together more easily. On the other hand, many artists will say they still need to expend as much effort as before to come up with an interesting piece. This ambiguity has consequences for the design of such a system for the development of sensing devices, since it is not obvious which capabilities of the system are important to the users.
In interactive artworks, four types of systems can be identified (fig. 1-4):
Interactive Installations that respond to Natural, environmental phenomena (NII). Example: a sculpture that changes shape depending on windspeed and temperature.
Interactive Installations that respond to actions of an audience, consisting of one or more persons (HII). Example: a puppet with temperature transducers and piezo elements that can be touched, hugged, hit etc.
Non-immersive Interactive Performance Systems (NIPS) - systems that interact with a human performer, who performs for an audience. The performer perceives the system as separate from the natural environment and his/her body. Example: a performance space with pressure transducers on the floor and light beams that illuminate light-intensity transducers. The performer can step on the pressure transducers and interrupt the light beams.
Immersive Interactive Performance Systems (IIPS) - as NIPS, but the performer perceives the system as integrated with the natural environment and his/her body. Example: a glove or suit with pressure, flex and myoelectric transducers. The performer can move or gesture and affect a virtual environment.
While NIPS and HII appear to have similar characteristics, a distinction is made because in NIPS (as well as IIPS) the performer and the system have learned to interact with each other with greater refinement than is the case with a visitor in an HII. One can also say that interactive art consists of interactive devices, from small, possibly wearable ones to big ones, or of interactive spaces. Interaction takes place between humans and/or nature and the system, where the level of mutual familiarity of the human and the system plays an important role.
As discussed above, the variety of interactive artworks and performances is large. Also, an individual artist may want to experiment with a variety of sensing devices. It is therefore sensible to make a system that allows artists to design their own sensing devices. This is feasible for non-engineers in the case of HII, NII and NIPS. In the case of IIPS, however, sensing devices designed to be worn by a human performer more often require very specialized engineering and transducers (Mulder, 1994).
In general, the systems that enable the artworks or performances discussed above implement the following functions in order:
Transduction of physical phenomena into voltage or current through transducers.
Low level signal conditioning and processing.
Feature extraction, data management and analysis.
Mapping functions and setup management.
Generation of sound, light, image, motion etc.
Furthermore, multi-channel analog-to-digital conversion will be necessary at an early stage in the signal/data path to reduce noise and interference. Data transmission by cable or wireless link, using a communications protocol, will also be necessary, since not all functions will be implemented in a single physical device.
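The chain of functions above can be sketched in miniature. All concrete values below (the useful ADC range, MIDI controller number 7, the trivial rate-of-change feature) are illustrative assumptions, not the I-Cube System's actual parameters; stage 5 (generation) is left to the sound or light engine.

```python
def condition(raw, lo=100, hi=900):
    """Stage 2: clip to the useful part of a 10-bit ADC range, normalise to 0..1."""
    return (min(max(raw, lo), hi) - lo) / (hi - lo)

def extract_feature(prev, cur):
    """Stage 3: a trivial feature - the change between two conditioned samples."""
    return cur - prev

def map_to_midi(value):
    """Stage 4: map a 0..1 value onto a 7-bit MIDI controller message."""
    return ("control_change", 7, round(value * 127))

raw_samples = [120, 300, 640, 880]              # stage 1: simulated transducer readings
levels = [condition(r) for r in raw_samples]    # stage 2
trend = extract_feature(levels[0], levels[-1])  # stage 3
message = map_to_midi(levels[-1])               # stage 4; stage 5 renders the output
```

In the system described here, stages 1-2 live in hardware, while stages 3-4 are the kind of processing a Max patch would perform.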
Max, an object oriented graphical programming language, by Opcode Systems, is used in many interactive artworks, especially music compositions and performances, as a prototyping and performance tool for mapping and setup management, because of its ease of use and expandability. Max is also suitable for implementation of feature extraction and data analysis.
Fig. 1. Interactive installation responding to natural phenomena (NII).
Fig. 2. Interactive installation responding to human behaviour (HII).
Fig. 3. Non-immersive interactive performance system (NIPS).
Fig. 4. Immersive interactive performance system (IIPS).
Earlier work and products
Commercially available products that implement the first four functions listed above are hard to find. A number of manufacturers have marketed control-voltage-to-MIDI converters. However, these devices usually convert only up to 8 channels from analog to digital with only 7 bits, which does not allow for any ranging ("zooming in"), i.e. signal conditioning hardware is needed. They lack a power supply for transducers that need to be powered, or only allow resistive transducers. Also, although they can interface with Max because they output MIDI data, an interface that allows for easy configuring of a sensing device and its setup needs to be programmed. In fact, they were not designed for transducer interfacing, but for converting signals from "ancient" analog synthesizers into MIDI. STEIM in the Netherlands has implemented most functions in their hardware design called SensorLab (Anderton, 1994). Although it converts with only 8-bit resolution, it does include signal conditioning hardware which allows "zooming in" on a particular part of the voltage input range. The data transmission protocol is MIDI, while mapping is implemented in software called Spider. The Spider software environment is not as user friendly as Max, since it is a text-based, C-like programming language. Some command-line addicts will no doubt disagree. The SensorLab is quite expensive (about US$2500) and therefore not used by many artists with small budgets. Wired, serially connected data acquisition systems marketed to industry have no mapping software suitable for the current application, too few channels, and no MIDI interface. They are also very expensive, since they comply with industrial standards. Other efforts in the desired direction remain in R&D stages. Curtin (1994) worked on a system called the SoundLab, named after STEIM's SensorLab, that included a lot of mapping functionality. His design mainly addressed electronic musical instrument design problems.
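The "zooming in" (ranging) issue can be made concrete with a small calculation. The voltages and bit depth below are illustrative: with a fixed 7-bit converter over 0-5 V, a transducer that only swings between 2.0 and 2.5 V uses about 13 of the 128 available codes, while ranging the conditioning hardware to that sub-range recovers the converter's full resolution.

```python
def quantize(volts, v_lo, v_hi, bits=7):
    """Convert a voltage to a digital code, given the converter's input range."""
    steps = 2 ** bits - 1
    frac = (volts - v_lo) / (v_hi - v_lo)
    return round(min(max(frac, 0.0), 1.0) * steps)

# The same 2.25 V reading, without and with ranging:
fixed = quantize(2.25, 0.0, 5.0)   # full-range converter: one of ~13 usable codes
ranged = quantize(2.25, 2.0, 2.5)  # "zoomed in" to 2.0-2.5 V: mid-scale code
```

Without ranging, the whole 0.5 V swing collapses into `quantize(2.5, 0.0, 5.0) - quantize(2.0, 0.0, 5.0)` = 13 steps, which is why signal conditioning hardware matters even before resolution is considered.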
The I-Cube project
Since the system design criteria outlined above were not met by available systems, the I-Cube project was started to realise the desired system, gain experience in the field and possibly commercialize the result. An investigation into cheap transducers useful for artists was conducted and a system was built with the following properties (fig. 5):
The digitizer unit, small and wearable, is based on a 68HC11 microcontroller. It converts the analog signals of up to 24 transducers to digital with 12-bit resolution, and of up to 8 transducers with 8-bit resolution. It also has 8 binary outputs (switchable between 0 and 5 Volt). It normally communicates with Max via MIDI system exclusive messages.
The digitizer plugs into the patchbay, a 19 inch rackmount unit, which, when needed, allows easy access to individual analog inputs.
The iCube and oCube Max objects decode the MIDI messages and prepare the signals for processing and mapping (fig. 6).
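Since MIDI data bytes carry only 7 bits each, a 12-bit sample must be split across at least two bytes of a system exclusive message before an object like iCube can reassemble it. The sketch below shows the general packing technique; the I-Cube System's actual byte layout is not documented here, so the MSB-first 5+7 split is an assumption.

```python
def encode_sample(value):
    """Pack a 12-bit ADC reading (0..4095) into two 7-bit MIDI data bytes."""
    assert 0 <= value < 4096
    return [(value >> 7) & 0x7F, value & 0x7F]  # high 5 bits, then low 7 bits

def decode_sample(msb, lsb):
    """Recover the 12-bit reading from the two data bytes."""
    return (msb << 7) | lsb

packed = encode_sample(3000)   # both bytes stay below 0x80, as sysex data must
```

Keeping every data byte below 0x80 is what distinguishes sysex payload bytes from MIDI status bytes, so some such split is unavoidable for any resolution above 7 bits.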
Figure 5. The I-Cube System diagram.
Figure 6. A Max patch showing the iCube object and its commandset.
In order to obtain user feedback, local artists explored the following HII, NIPS and IIPS works with the system.
"The space in between"
In the Western Front, an artist-run center in Vancouver, the author worked with Grant Gregson on a NIPS application of the system. The piece explored the idea of capturing gestures of a musician (in this case a pianist), particularly of the upper body, that do not normally result directly in sounds. The captured data then controlled lights as well as the actions of a piano, a Yamaha Disklavier.
Two computers were used: one for developing patches for controlling the lights and the piano, and one for pre-processing the transducer data. The transducers were 10 light dependent resistors (LDRs) and 2 force sensing resistors (FSRs). The LDRs were positioned on the piano and used to detect changes in the light as the pianist moved his upper body, head or arms. The FSRs were positioned on the pianist's seat so that rocking to the left or right would change their values. Six lights were installed at the ceiling to light the area around the piano.
To test the I-Cube System for interactive installation applications, Carlos Vela-Martinez used it in a sculpture of human proportions. The project aimed to incorporate the I-Cube System into the operation of an interactive object, about the size of two human trunks. As output media the object contained speakers, driven by a sound module, and a small television. It sensed visitor input through force sensing resistors (FSRs), which sense touch, placed at various locations on the outside surface of the object, acoustic transducers (electret microphones) placed on the inside, and light dependent resistors (LDRs). The acoustic transducer signals were processed so that only an acoustic impact (a handclap, or a stomp on the sculpture's surface) was detected and stored. An LDR was used to detect the proximity of the visitor.
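The impact detection in this installation can be sketched as a threshold on the signal envelope with a hold-off against re-triggering, so a single clap or stomp registers once rather than for every sample above the threshold. The threshold and hold-off values below are illustrative assumptions, not the values used in the piece.

```python
def detect_impacts(envelope, threshold=0.6, holdoff=3):
    """Return indices where the envelope crosses the threshold, ignoring
    re-triggers within `holdoff` samples of the previous impact."""
    hits, last = [], -holdoff
    for i, level in enumerate(envelope):
        if level >= threshold and i - last >= holdoff:
            hits.append(i)
            last = i
    return hits

env = [0.1, 0.2, 0.9, 0.8, 0.2, 0.1, 0.7, 0.1]  # two distinct impacts
```

This is the kind of low-level analysis that turns a physical-level signal (microphone voltage) into a gestural-level event (someone struck the sculpture).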
"The virtual drum set"
For wearable interactive performance systems, the author built on Rolf Wilkinson's (finger) drumming experience and used the I-Cube System to make a touch glove for creating a virtual drum set. This project aimed at using the I-Cube System with a sensing device worn on the human body, i.e. an IIPS application. The transducers (FSRs) were placed in gloves for the right and left hands, and their signals were processed in a Max patch to detect whether, and how hard, the fingertips and palms were touching a surface. This data was then used to control a drum synthesizer, creating the perception that the musician is playing a virtual drum set. An important requirement was that whenever the musician hit a surface, the sounds had to coincide with the tactile sense of hitting the surface, so that the musician would perceive the virtual drums as one "gestalt". This was the most demanding of the test projects, since a high timing resolution was needed, i.e. a high sampling rate and low processing and transmission latencies. Also, capturing body signals required special attention to the design of the gloves, e.g. the placement of transducers and the feel of the glove. The cotton glove currently in use is deemed reasonable.
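The glove's hit-to-sound mapping can be sketched as converting an FSR peak into the velocity of a MIDI note-on. The finger-to-drum note numbers, the FSR scale and the linear velocity curve below are illustrative assumptions, not the project's actual mapping.

```python
# Hypothetical finger-to-drum assignment (General MIDI-style note numbers).
FINGER_NOTES = {"right_index": 38, "right_middle": 42}

def hit_to_note_on(finger, fsr_peak, fsr_max=255, channel=9):
    """Map a finger's FSR peak reading to a MIDI note-on tuple
    (status byte, note number, velocity)."""
    velocity = max(1, round(fsr_peak / fsr_max * 127))  # velocity 0 would mean note-off
    return (0x90 | channel, FINGER_NOTES[finger], velocity)
```

The timing constraint in the text applies to everything around this mapping: the FSR must be sampled, transmitted and mapped fast enough that the synthesized drum sound still coincides with the felt impact.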
Conclusions and outlook
The I-Cube project has realised an affordable and flexible environment for design of sensing devices. While development of newer versions is ongoing, a number of conclusions can be drawn from the I-Cube project:
Although the I-Cube System makes sensor technology more accessible for artists, the system still requires a fair amount of technical knowledge (understanding of calibration, linearization, ranging, positioning etc. of transducers, as well as processing and analysis of their signals). Therefore, the supply of information on transducers and how to use them to design sensing devices is crucial.
Tools for data reduction and sensor fusion are needed to make the use of many (up to 32!) transducer signals practical and more efficient. Such tools are especially useful for applications with low latency requirements (less than ca. 10 ms) that tax the MIDI bandwidth as well as the computer's capabilities, particularly if signal processing and analysis are required.
For less technically interested artists to use the system, transducer assemblies that plug directly into the digitizer, and stand-alone Max (version 3.0) patches specific to these assemblies, need to be developed, as well as information about the system and/or sensing devices in terminology that more artists can relate to.
The project mainly aimed at sensing problems. However, many interactive artworks appeared to require a few binary outputs too, e.g. to drive video signal switches and to turn small motors and lights on and off. These were added to the design.
Although the current system works with the graphical programming environment Max, it uses symbolic command language messages to control the digitizer. Future work aims to control sensors with a graphical command language.
While the current system can be made wireless through the use of a wireless MIDI system, other, cheaper avenues to wireless sensing are being explored.
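The MIDI bandwidth concern raised in these conclusions can be checked with a back-of-the-envelope calculation: MIDI runs at 31250 baud with 10 bits on the wire per byte (start bit, 8 data bits, stop bit), so each byte takes 320 microseconds. Assuming a plain 3-byte channel message per transducer reading (the framing of the actual sysex protocol may differ), one full frame of 32 readings already exceeds a 10 ms latency budget.

```python
BAUD = 31250                       # MIDI serial rate, bits per second
US_PER_BYTE = 10 / BAUD * 1e6      # 320 us: start + 8 data + stop bits

def frame_time_ms(channels, bytes_per_msg=3):
    """Time to transmit one reading for every channel, in milliseconds."""
    return channels * bytes_per_msg * US_PER_BYTE / 1000

full_frame = frame_time_ms(32)     # about 30.7 ms for 32 channels
```

This is why data reduction and sensor fusion on the computer side, or a faster link, are needed before all 32 channels can serve a low-latency performance.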
The I-Cube System is commercially available. Contact Infusion Systems (see above) for detailed, up-to-date information.
The development of the I-Cube System was partially funded by a grant from the Centre for Image and Sound Research in Vancouver.
Besides the author, the following people were involved in the development of the I-Cube System: Darek Garncarz, Thomas Sternberg, Andrey Gleener, Carlos Vela-Martinez, Rolf Wilkinson and Grant Gregson.
Furthermore, the author wishes to thank the Graphics & Multimedia Research Lab, the Human Motor Systems Lab, Graham Hunter, Mario Vela-Martinez, Steven Haworth, Mel Frank, Sang Mah, Tom Calvert, Christie Mackenzie, Ron Marteniuk, Parveen Bawa and many other people around the globe for their support and feedback.
Anderton, C. (1994). STEIM: In the land of alternate controllers. Keyboard (August), p. 54-62.
Anderton, C.; B. Moses, G. Bartlett (1994). Digital projects for musicians. New York, NY USA: AMSCO publications.
Atkins, S. (1994). Interactive media - virtual reality and the actual performer. Canadian Theatre Review 81 (winter 1994), p 16-19.
Crawford, J. et al (1994). Merging Media: Opportunities and innovation in computer-human interface. Symposium held in July 1994, Simon Fraser University Theatre, Burnaby, BC, Canada.
Curtin, S. (1994). The SoundLab: A wearable computer music instrument. Proceedings International Computer Music Conference, Aarhus, Denmark, p200-201. San Francisco CA, USA: International Computer Music Association.
Demers, L.-P. (1993). Interactive and live accompaniment lighting for dance. Presented at the conference on dance and technology, held in July 1993, Simon Fraser University Theatre, Burnaby, BC, Canada.
Gilbert, R. (1994). Computers and theatre - Mimesis, simulation and interconnectivity. Canadian Theatre Review 81 (winter 1994), p 10-15.
Huntington, J. (1994). Control systems for live entertainment. Newton, MA, USA: Focal press.
Leopoldseder, H. (1990). Der Prix Ars Electronica - International compendium of the computer arts. Linz, Austria: Veritas verlag.
MacLeod, D. (1994). Information theatre. Canadian Theatre Review 81 (winter 1994), p 5-9.
Mulder, A.G.E. (1994). Build a better powerglove. PCVR 16 pp 10-14. Stoughton, WI, USA: PCVR Magazine. Available through the WWW, URL http://xspasm.com/x/sfu/vmi/PCVR.html
Mulder, A.G.E. (1994). Human Movement Tracking Technology. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University. Available through the WWW, use URL http://xspasm.com/x/sfu/vmi/HMTT.pub.html
Mulder, A.G.E. (1994). Virtual Musical Instruments: Accessing the sound synthesis universe as a performer. Proceedings of the first Brazilian Symposium on Computer Music, held in Caxambu, Minas Gerais, Brazil, August 2-4 1994, during the XIV annual congress of the Brazilian computer science society pp 243-250. Belo Horizonte, M.G., Brazil: Universidade Federal de Minas Gerais. Available through the WWW, use URL http://xspasm.com/x/sfu/vmi/BSCM1.rev.html
Schiphorst, T. et al (1992). The shadow project: An exploratory workshop in performance and technology. Workshop held in August 1992, Simon Fraser University, Burnaby, BC, Canada.