Putting a Face on Theatre
Before you read any further, watch this video:
While you pick your jaw up off the floor, it is worth emphasizing a couple of points: 1) the hip hop performer was, indeed, controlling the audio playback--loops, drum pad, filters, volume, etc.--using his body; 2) the motion capture was achieved using an Xbox Kinect (well, actually, two Kinects).
This project is amazing on a number of levels: first, the production of the video is stellar--the shot of the bystander, mouth agape, huge smile, hands on his head in amazement, perfectly captures what it must have felt like to see this event live; second, the audio track is pretty hot (even if you aren't usually into that kind of music); third, the visual effects are crazy.
The two biggest reasons I was amazed by this project? First, it is a perfect example of what a control system is all about--getting data from one device to another in a way that is efficient and effective; second, it shows that the collision of technology with performance can create amazing things and change the nature of performance in fundamental ways.
The folks who built the system have a great series of blog posts on the design and development process. They discuss the difficulties they had using the Kinect devices, the software hurdles they had to jump over, and the development of the overall system (with a great, hand-drawn example of a functional block diagram!). Underneath all of this, however, is a constant thread: how do we get data from the performer--i.e., information about the performer's body position and movements--to the music software and the visual effects software? Beyond all the spectacular visuals, this project at its base is about creating a control surface for Ableton Live, the audio playback software, albeit an invisible, "air" interface. (Maybe that's what makes this project so compelling: it's like playing air guitar in our bedrooms when we were fifteen!) In fact, the system is really two interfaces--one for gesture controls, controlling loop playback, filters, volume, and the like; and one that is a drum pad/keyboard.
What an awesome project! And it is all about data! What do I mean by that? Well, first you need to know how the Kinect passes on information about motion tracking--in other words, when my hand is *here*, what data does the Kinect send out? (Along with this comes a bunch of other detail questions, such as, "what is the refresh rate," "how is depth information passed on," "in what format is the data transmitted," "by what transmission method is it sent?") Then, once you know what data the Kinect is sending out and how it is sending it, you need to find some way to make that information something Ableton can understand. As soon as you know that, you find or create a device that interprets the Kinect data into something Ableton understands. (For this project, it seems that they used a PC running Processing, which is a lot like Max/MSP; both are powerful programming applications that allow for the creation of complex algorithms that manipulate data in a multitude of ways.)
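To make the "interpreter" idea concrete: one common way to get gesture data into audio software is to translate a tracked body position into a MIDI continuous-controller value, since MIDI CC values (0-127) are something Ableton understands natively. The sketch below is purely illustrative--it is not the project's actual Processing code, and the function name and the 0.0-1.0 hand-height convention are my own assumptions--but it shows the shape of the translation step.

```python
def hand_to_cc(hand_y, min_y=0.0, max_y=1.0):
    """Map a normalized hand height (0.0 = lowest tracked point,
    1.0 = highest) to a MIDI continuous-controller value (0-127)."""
    # Clamp to the tracked range so a stray sensor reading
    # can't produce an out-of-range controller value.
    clamped = max(min_y, min(max_y, hand_y))
    span = max_y - min_y
    return round((clamped - min_y) / span * 127)

# Raise your hand, open the filter; lower it, close the filter.
print(hand_to_cc(1.0))  # 127 -> filter fully open
print(hand_to_cc(0.0))  # 0   -> filter fully closed
```

In a real system a function like this would run on every tracking frame, with the resulting CC messages sent to Ableton over a virtual MIDI port; the point is simply that the "invisible interface" boils down to a small, well-defined data translation.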
What makes this so exciting is that data is easy to manipulate. Once you realize that all we are talking about is sequences of bits and bytes, you can do anything you want with it. There is no reason this same interface couldn't be used for, say, complex video playback. Or manipulating a robotic arm. Or controlling motorized scenery. Which is just a little bit crazy, but a lot awesome!
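That "do anything you want with it" claim is easy to demonstrate: once a gesture has been reduced to a number, fanning it out to several unrelated devices is trivial. The sketch below is a toy, with made-up handler names standing in for whatever video, robotics, or scenery controllers you might actually use.

```python
# Hypothetical endpoints: each one turns the same normalized
# gesture value (0.0-1.0) into a command for a different device.
def set_video_speed(v):
    return f"video: playback at {v:.2f}x"

def set_arm_angle(v):
    return f"robot arm: rotate to {v * 90:.0f} degrees"

def set_scenery_position(v):
    return f"scenery: move wagon to {v * 100:.0f}% of track"

def broadcast(value, handlers):
    # One stream of gesture data, routed to every subscribed device.
    return [handler(value) for handler in handlers]

for command in broadcast(0.5, [set_video_speed, set_arm_angle,
                               set_scenery_position]):
    print(command)
```

The interface never has to know or care what is on the receiving end--which is exactly why the same "air" controller could drive loops one night and motorized scenery the next.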
Which brings me to the other reason this project is so cool: it is a game-changer. It opens up new possibilities for performance. It blurs the boundaries between "technician" and "performer"; suddenly the DJ *is* the performance. And it creates new avenues of artistic exploration--a performance piece where the performer's movements control the audio, the lighting, or the video: this is a whole new way of storytelling. Sure, it allows for better synchronization of onstage action and lighting and sound cues; more importantly, however, it opens the door to new ways of thinking about the relationship between action and other theatrical elements: effective and stunning production values can become even less linked to strict, repeatable narrative, allowing for more improvisation and interactivity, for example.
This kind of work represents the best collision of technology and performance, where the technology not only allows for a particular effect, but opens up possibilities for better ways of sharing a narrative.