A Flying Quadrotor Light Show Spectacular


Now in its 22nd year, the Saatchi & Saatchi New Directors’ Showcase returned to the Cannes Lions Festival – the festival of creativity – unveiling another presentation of new directorial talent.

Marshmallow Laser Feast (Robin McNicholas, Memo Akten and Barnaby Steel) were the creative and technical directors of the production, which included a theatrical performance by 16 flying robots reflecting light beams across the stage.

Memo describes the goal as creating something simple, beautiful and mysterious: “to push the experience to that of watching an abstract virtuoso being made of light, playing a bizarre, imaginary musical instrument.” The performing quadrotors are not the *stars* of the show; the light forms they generate are. Their role, as Memo describes it, is to manipulate the space by sculpting light, creating a ballet of anthropomorphic light forms. As the audience anticipates a performance on the stage, a buzzing fills the dark auditorium, causing confusion – no one knows what is about to happen. Sixteen quadrotors take the stage, hovering above a pyramid, with light beams aimed at each one and reflected back onto the stage. They move, reassemble, and reshape the space.

The team knew that the hardware was going to impose constraints and that they would have to stay within the confines of the physically possible – after all, these are flying machines, and gravity is at play. They also knew they were not going to get exactly the same motion they had in their animations and simulations. What they didn’t expect was that the flying robots would add such a distinctly charming character of movement; every member of the team fell in love with the machines the instant they saw them fly the trajectories.

The quadrotors themselves are the brainchild of Alex Kushleyev and Daniel Mellinger of KMel Robotics. University of Pennsylvania graduates, Alex and Daniel are experts in hardware design and high-performance control. Their quadrotors push the limits of experimental robotics, and the ones performing at the NDS were built and programmed specifically for the event.

The MLF team started working on the project back in January this year. They brought KMel Robotics onboard to collaborate on the robots and asked them to build a fleet of quadrotors, each fitted with a polycarbonate mirror on a servo and super-bright LEDs. MLF animated the robots (trajectories, mirrors, LEDs) in Cinema 4D and developed a simulation environment (using XPresso + COFFEE, C4D’s node-based and scripting tools) to track the virtual vehicles with virtual spotlights, animate the mirrors, bounce the lights off the mirrors, and so on. They could simulate everything accurately (minus the airflow dynamics) – including warnings for impossible or dangerous manoeuvres (excessive acceleration, velocity, proximity etc.). The generated data (trajectories, mirrors, LEDs) was then exported, using a custom Python exporter, into a custom data format they could feed straight into KMel’s system. So C4D wasn’t used just for previz; its output ended up being fed straight into the flying robots.
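
As an illustration of what such safety warnings can involve, here is a minimal sketch (not MLF’s actual tooling; the limit values are placeholder assumptions) that finite-differences a sampled trajectory to flag excessive speed, excessive acceleration, and dangerously close vehicles:

```python
import numpy as np

# Placeholder limits - the real thresholds would come from the vehicle specs.
MAX_SPEED = 3.0        # m/s
MAX_ACCEL = 6.0        # m/s^2
MIN_SEPARATION = 1.0   # m, centre-to-centre distance between vehicles

def check_trajectories(positions, dt):
    """positions: array of shape (frames, vehicles, 3), sampled every dt seconds.
    Returns a list of human-readable warnings."""
    warnings = []
    vel = np.diff(positions, axis=0) / dt   # (frames-1, vehicles, 3)
    acc = np.diff(vel, axis=0) / dt         # (frames-2, vehicles, 3)

    speed = np.linalg.norm(vel, axis=2)
    for f, v in zip(*np.where(speed > MAX_SPEED)):
        warnings.append(f"frame {f}: vehicle {v} speed {speed[f, v]:.2f} m/s")

    accel = np.linalg.norm(acc, axis=2)
    for f, v in zip(*np.where(accel > MAX_ACCEL)):
        warnings.append(f"frame {f}: vehicle {v} accel {accel[f, v]:.2f} m/s^2")

    # Pairwise distances per frame to catch near-collisions.
    for f in range(positions.shape[0]):
        p = positions[f]
        d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=2)
        np.fill_diagonal(d, np.inf)
        for a, b in zip(*np.where(d < MIN_SEPARATION)):
            if a < b:
                warnings.append(f"frame {f}: vehicles {a} and {b} only {d[a, b]:.2f} m apart")
    return warnings
```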

The quadrotors were tracked by a VICON mocap rig. The setup included about 20 cameras mounted on a truss 7.5m high, covering a 9m x 4.3m area. KMel wrote the software that uses the VICON tracking data to control their vehicles. Each vehicle knows where it wants to be (based on the exported trajectories) and where it is (based on the VICON data); it then makes the motor adjustments necessary to get where it wants to be – a much more complicated process than it sounds. The VICON tracking data also feeds into an openFrameworks app Memo created. The moving-head light animations are all realtime and based on the tracking data: VICON reports “quadrotor #3 is at (x, y, z)”, and the oF app responds by adjusting Sharpy #3’s pan/tilt (via DMX) so that its beam hits quadrotor #3. The oF app also sends the data to turn lights on and off (DMX) and to launch other lighting presets (gobos etc.) at particular times.
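
The core of that beam-tracking step is simple vector geometry. Here is a minimal sketch of the idea (in Python rather than the team’s openFrameworks/C++ app, with an assumed Z-up coordinate system and fixture zero conventions):

```python
import math

def pan_tilt(fixture_pos, target_pos):
    """Compute pan/tilt (degrees) to aim a fixture at a target point.

    Assumes a Z-up world, pan measured around Z from the +X axis, and
    tilt measured up from the horizontal plane. Real fixtures each need
    a per-device calibration on top of this (see below).
    """
    dx = target_pos[0] - fixture_pos[0]
    dy = target_pos[1] - fixture_pos[1]
    dz = target_pos[2] - fixture_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# e.g. a floor-mounted Sharpy aiming at a quadrotor hovering at 4m:
print(pan_tilt((0.0, 0.0, 0.0), (2.0, 1.0, 4.0)))
```

In the real rig those angles would then be mapped onto the fixture’s DMX pan/tilt channel values and resent every frame as the tracking data updates.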

The team also used the VICON rig to calibrate the VICON space (the quadrotor coordinate system) against stage space (the coordinate system they used for their animations) – by placing tracking markers on all of the lights on the floor, so the software knew where every light was in the world. Likewise, the VICON rig was used to calibrate each individual Sharpy’s orientation motors: when they sent instructions to set pan/tilt to, say, 137°/39°, to Memo’s dismay the fixtures were always considerably off (even though they have very precise motors), so he had to map his desired angles to real-world angles, specific to each device.
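
One standard way to build that per-device mapping (a generic sketch, not necessarily Memo’s exact method) is to command a set of known angles, measure where the fixture actually points using the mocap rig, and fit a correction curve per axis, per device:

```python
import numpy as np

# Hypothetical measurements for one fixture's pan axis: angles commanded
# vs the angles the VICON rig says the beam actually reached (degrees).
commanded = np.array([0.0, 45.0, 90.0, 135.0, 180.0])
measured  = np.array([2.1, 48.3, 94.0, 140.2, 186.5])

# Fit commanded as a function of measured, i.e. the inverse mapping: to
# land on a desired real-world angle, look up which command produces it.
# A low-order polynomial per axis is usually enough for smooth error.
coeffs = np.polyfit(measured, commanded, deg=2)

def command_for(desired_angle):
    """Return the pan value to send so the beam lands on desired_angle."""
    return float(np.polyval(coeffs, desired_angle))

print(command_for(137.0))  # the value to actually send for a true 137 deg
```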

Most of the testing, playing, calibrating and setting up was done on an iPad, using a Lemur configuration Memo built for the setup. For the actual show everything was preprogrammed; nothing was performed live. The music was created by Oneohtrix Point Never, with whom the team worked closely and iteratively to develop a bespoke piece for the performance: they would animate and send him the simulation, he would add music and send it back, they would animate again, and so on. This went on until a few days before the performance.
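
Lemur is an iPad control-surface app that sends its widget values over the network as OSC (or MIDI). As a purely hypothetical sketch of how such a surface can drive calibration parameters (using the python-osc library here; the team’s actual app was built in openFrameworks):

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Hypothetical state tweaked live from the iPad during calibration.
params = {"pan_offset": 0.0, "tilt_offset": 0.0}

def on_fader(address, value):
    # e.g. address "/calib/pan_offset" with a float from a Lemur fader
    key = address.rsplit("/", 1)[-1]
    if key in params:
        params[key] = float(value)
        print(f"{key} -> {params[key]:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/calib/*", on_fader)

# Lemur would be pointed at this machine's IP address and port 8000.
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```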

Source: Creative Applications Network
