For Death and the Powers, a team of faculty, staff and graduate and undergraduate students at the MIT Media Lab has brought a host of innovative technologies to the stage. From robots to visuals to sound-producing Hyperinstruments like the giant Chandelier, more than 40 computers are required to run the production, all backed by extensive wired and wireless networks. These computers run a broad range of distributed control systems that we developed for the production, in which each component can share information with any other in order to create a synchronized and unified presence of Simon in The System.
The Chorus of Operabots and three large bookshelf periaktoi are centrally controlled using software we developed specifically for choreographing robots onstage. This software includes a 3D visualization for monitoring and authoring the animation of robotic movement and lighting. If need be, puppeteers above the stage can assume manual control of any parameter of a robot using a typical video game controller. An absolute position tracking system monitors the locations of robots and actors onstage, helping the robots navigate as well as shaping sound and visuals.
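The override scheme described above can be sketched as a simple per-parameter merge: when a puppeteer's controller supplies a value for a parameter, it supersedes the scripted choreography for that parameter alone, and everything else stays automated. This is an illustrative sketch; the function and parameter names are assumptions, not the production's actual code.

```python
def resolve_parameters(scripted, manual):
    """Merge scripted animation values with live puppeteer overrides.

    scripted -- dict of parameter name -> choreographed value for this frame
    manual   -- dict of parameter name -> controller value (absent = no override)
    """
    # A manual value, when present, wins for that one parameter only.
    return {name: manual.get(name, value) for name, value in scripted.items()}

# Example: the puppeteer grabs only the heading; position and lamp stay scripted.
frame = resolve_parameters(
    scripted={"x": 1.2, "y": 0.4, "heading": 90.0, "lamp": 0.8},
    manual={"heading": 135.0},
)
```

Merging per parameter, rather than switching a whole robot to manual, lets a puppeteer correct one degree of freedom while the choreography continues to drive the rest.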
After Simon Powers enters The System, the singer portraying him, James Maddalena, exits the stage, though he continues to sing and act as if he were onstage. In a technique we call Disembodied Performance, gestural and physiological sensors, along with voice analysis, capture his offstage performance, which is then used to generate in real time the visual representation of Simon Powers in the bookshelf displays and other aspects of the production. We created mapping software that connects sound, robots and visuals to the singer's performance, and a custom graphics environment allows these live performance parameters to generate expressive graphic representations of Simon in The System.
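At its simplest, the kind of mapping layer described above normalizes each live performance feature and routes it to a visual parameter. The sketch below uses linear maps and entirely hypothetical feature and parameter names (vocal level, breath rate, gesture energy); the production's mapping software supports far richer connections than this.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def map_performance(features):
    """Route hypothetical sensor and voice features to visual parameters."""
    return {
        # Louder singing -> brighter glow (assumed -60..0 dBFS input range).
        "glow_intensity": scale(features["vocal_level_db"], -60.0, 0.0, 0.0, 1.0),
        # Faster breathing -> faster ripples in the display graphics.
        "ripple_speed": scale(features["breath_rate_hz"], 0.1, 1.0, 0.2, 4.0),
        # Gesture energy pushes the palette warmer or cooler.
        "hue_shift": scale(features["gesture_energy"], 0.0, 1.0, -30.0, 30.0),
    }

visuals = map_performance(
    {"vocal_level_db": -12.0, "breath_rate_hz": 0.55, "gesture_energy": 0.5}
)
```

Keeping each connection as an independent source-to-destination map makes it easy to rehearse, retune, or reroute mappings without touching the sensing or the graphics code on either side.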
Another method of representing The System’s omnipresence is through sound. Over 140 speakers are used to create a unique sonic environment. Two formats of surround sound are used in the production. Wave Field Synthesis uses an array of tiny speakers across the front of the stage to create the impression of a sound emanating from any point in the space. Ambisonics is used to move sound all around the audience. Software and plug-ins for common audio packages were engineered to process the hundreds of audio streams in real time.
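The core idea behind Wave Field Synthesis can be illustrated with a delay-and-attenuate sketch: each speaker in the front array plays the signal delayed by its distance from the virtual source point, so the individual wavefronts sum as if the sound originated there. This is a deliberately simplified illustration of the principle, not the production's driving functions, and the geometry below is made up.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

def wfs_delays(speaker_xs, source, array_y=0.0):
    """Per-speaker (delay_s, gain) for a virtual source behind a line array.

    speaker_xs -- x positions (m) of speakers on a line at y = array_y
    source     -- (x, y) position (m) of the virtual source
    """
    sx, sy = source
    out = []
    for x in speaker_xs:
        r = math.hypot(x - sx, array_y - sy)
        # Farther speakers fire later and quieter, mimicking a point source.
        out.append((r / SPEED_OF_SOUND, 1.0 / max(r, 1e-6)))
    return out

# Eight speakers spaced 0.25 m apart; virtual source 2 m behind the array center.
delays = wfs_delays([i * 0.25 for i in range(8)], source=(0.875, -2.0))
```

With the source centered behind the array, the delays are symmetric and smallest at the middle speakers, which is exactly the curved wavefront a listener would hear from a real point source at that position.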
All of this technology, although complex, is mostly meant to work invisibly "behind the scenes," helping to draw audiences into the unusual, mysterious world of Simon and The System.