As a choreographer and director of live performance events, I was recently invited to a panel on “Art and Tech” at the Fresh Festival in San Francisco, where one of the presenters opened with an observation about the “new” relationship between the arts and technology.
When my turn came to speak, I felt obliged to point out that artists, and particularly performing artists, have always been quick to incorporate new advances in technology into their work, often developing those technologies and their potential applications themselves. Indoor gas lighting turned theater from an often raucous outdoor afternoon event into a much more focused cultural experience and facilitated the development of what Richard Wagner called the “total artwork” (Gesamtkunstwerk) that inspired what we think of as opera now.
At the beginning of the 20th century, choreographer Loie Fuller was one of the first to explore the new electric lighting, which enabled her to embed lighting in the dancing surface and create innovative (for the time), magical visuals with movement and fabric. The advent of computerized lighting opened a whole new territory of possibilities for theater artists.
The San Francisco Bay Area has historically been a center of technological innovation in the performing arts, with groups like George Coates Performance Works, Soon 3, the Paul Dresher Ensemble and Chris Hardman’s Antenna Theater developing performance styles that both incorporated and invented advances in computerized lighting, digital projection and many electronic musical possibilities.
Dresher invented a looping device 20 years before that technology became commercially available (and is now de rigueur for almost any electrified musician). Antenna used the Sony Walkman to create individual, interactive performance experiences for each audience member. As another speaker on that Art and Tech panel put it, we notice innovative technologies the first few times they get used in performance; after that, we just call those technologies “tools.”
In my current dance project, “Performance Research Experiment #2.2,” we are working with technology in a slightly different way. Inspired by experiences of self-quantification with heart-rate monitors, GPS and cycling power meters in my personal training as an amateur competitive cyclist, and also motivated by doing a PhD in performance studies, I became curious about what we might be able to demonstrate about how our bodies are involved in the process of experiencing a performance.
As a result, in this new show we are wiring up volunteers to measure their heart rates and skin conductance responses to see how our performance affects them. Our volunteer “subjects” sit in the front row, and we simultaneously project the data we are capturing from their physiological responses onto the side of the performance space so that their data also become a part of the performance.
Trying to make “art” out of data may seem a little antithetical, but our success touring a recent work, “Performance Research Experiment #1,” convinced us of the value of the friction between the intuitive and the analytical. This most recent “software update” removes some dramaturgical glitches, has stylish new graphics and sound and runs much faster, for an enhanced user experience.
As a choreographer, I have often heard audiences confused or frustrated by their attempt to “understand,” “decode” or “read” the “meaning” of a dance performance. I like to ask my composition students to think more about what their work does, and less about what it might mean. Recent research and theory show that bodies moving in space literally move, and virtually touch, those who watch them in a variety of ways, even though the viewer might be sitting relatively still in a dark theater.
In our project, the whole audience can follow these two biometric parameters as the show presents a variety of performance actions, ranging from standing still to dancing, playing music and some images/actions borrowed from performance art and S&M practice.
My main interest in using the technology is to point people’s attention toward various physical phenomena that constitute the experience of attending a performance, and how they are all in communication, affecting each other. My favorite reports from audience members after the show (both those who have been monitored and those just observing) are about how they became much more aware of their own bodies and the details of their physical experience because of the tech we use.
A spike in one of the skin-conductance graphics immediately drew the focus of their attention to the feeling inside their own bodies, as they compared the visualized data to their felt experience and to their visual perception of the subjects in front of them who were producing the data. The data gave them a tool to open up a sensitivity to a new level of embodied experience.
I find it satisfying how often our audiences speak about the experience in choreographic, spatial and directional terms. They talk about the literal movement of their attention between the data projection, the stage, the bodies being measured in front of them, and the inside of their own bodies. For me, this movement is another layer of the choreography, and has been a useful way to think about what we are doing as we continue to refine it. How are we choreographing and optimizing their attention? When do we demand attention, in which directions, and how much room do we leave for the audience/user to “improvise” as they choose where their attention goes? At what point do our attempts to insist on their experiencing one thing or another become coercive? When we direct sub-bass-frequency vibrations into the room, every body is penetrated by that force, and in a real sense is “forced” to have that experience, whether they like it or not, regardless of what they are looking at or listening to.
The project has also made me much more aware of the complex weave of the aesthetics and functionality of the visual presentation of the data. Yoann Trellu, my video collaborator, has spent weeks fine-tuning different ways of visually translating a seemingly endless stream of numbers into images that, on the one hand, will tell our audience something interesting about what is going on in our test-subjects’ bodies, and on the other hand take those images and create compellingly aesthetic visual landscapes for us to perform in.
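To make the translation step concrete: one common way to turn a jittery stream of sensor readings into something an audience can watch is to smooth it and map it into a bounded range that a visual layer can use. The sketch below is purely illustrative, not Trellu's actual software; the value ranges and names are hypothetical, and it assumes skin-conductance readings arriving as a simple list of numbers.

```python
# Illustrative sketch only (not the production system): smooth a raw stream
# of skin-conductance readings with an exponential moving average, then
# normalize each smoothed value into 0..1, a range a visual layer could use
# to drive brightness, scale, or line weight. The lo/hi bounds are assumed.

def smooth_and_normalize(samples, alpha=0.3, lo=1.0, hi=20.0):
    """Return one 0..1 level per raw sample, smoothed to damp sensor jitter."""
    levels = []
    ema = None
    for s in samples:
        # Exponential moving average: new readings nudge the running value.
        ema = s if ema is None else alpha * s + (1 - alpha) * ema
        # Map the smoothed reading into 0..1 and clamp out-of-range values.
        norm = (ema - lo) / (hi - lo)
        levels.append(max(0.0, min(1.0, norm)))
    return levels

readings = [2.0, 2.1, 9.5, 9.7, 3.0]  # a sudden "spike" mid-stream
levels = smooth_and_normalize(readings)
```

The smoothing constant `alpha` is the aesthetic dial here: a small value yields slow, wave-like visuals that lag the body, while a value near 1 passes every spike straight through to the projection.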
Given the complex nature of collecting data during a public live performance, there are too many uncontrollable variables to get results that could provide “clean” enough correlations for publishable science, but as one of my data consultants, Weidong Yang, said the other day, this is hypothesis-generating work.
At this point, it’s not so much about answering questions as suggesting what the questions might be as we move into a future where this kind of biometric data is available to anyone with a smartphone. For me, this project is about using the digital data we capture as another performer in the space.
How this will play out, I don’t know, but, as we layer, bend, amplify, copy and mimic the lines, sounds and graphic shapes produced by the audience’s data and bring it back into the performance space to play with us, we are learning a lot about the dance (and occasional tug of war) between choreographing, programming and engineering that we think will produce inspiring and thought-provoking immersive experiences for both performers and audiences.
As artistic director of Jess Curtis/Gravity, Jess Curtis makes, watches, teaches and writes about body-based performance in San Francisco, Berlin and internationally. He is a PhD candidate in Performance Studies at UC Davis, studying the dynamics of embodiment in live performance. “Performance Research Experiment #2.2” will be performed Jan. 30-Feb. 1, 8 p.m., at Joe Goode Annex, 401 Alabama Street, San Francisco, Calif.
This article originally appeared on Recode.net.