Pure Data Composition, 2014
A short music piece composed specifically for the Parrot AR.Drone 2.0 quadcopter. It combines sound synthesis driven by the drone's movement through space with the sound of its engines. When the tracking system detects the drone flying in a defined area, it generates data which is then interpreted by a pre-defined algorithm in Pure Data, an open-source visual programming language. In this way the drone shapes and structures the musical material and becomes an instrument.
The drone is controlled via the FreeFlight application running on an iPad. The depth image from a Kinect sensor is evaluated by a vvvv patch that detects the drone's location in space (more precisely, its relative position inside a bounding box). vvvv sends this data to Pure Data, which passes it on to the music system. When the quadcopter flies into a defined area, it triggers musical pattern sequences based on the pre-defined algorithm. In this way the drone takes control over the sound synthesis.
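The mapping described above – a tracked position inside a bounding box turned into pattern data – can be sketched in a few lines. This is a hypothetical illustration, not the actual vvvv/Pure Data patches (which are not published here); the clamping, the four pattern slots and the octave transposition range are all assumptions.

```python
def relative_position(point, box_min, box_max):
    """Map an absolute Kinect-space point to clamped 0..1 coordinates
    inside the tracking bounding box."""
    return tuple(
        min(1.0, max(0.0, (p - lo) / (hi - lo)))
        for p, lo, hi in zip(point, box_min, box_max)
    )

def pattern_for(rel):
    """Derive musical parameters from the relative position:
    x chooses one of four pattern slots, y sets a transposition
    of up to one octave (assumed mapping)."""
    x, y, z = rel
    sequence = int(x * 3.999)   # pattern slot 0..3
    transpose = int(y * 12)     # semitones, 0..12
    return sequence, transpose

# Drone at (0.5, 1.2, 2.0) metres inside a 2 x 2.4 x 2 m box:
rel = relative_position((0.5, 1.2, 2.0), (0.0, 0.0, 1.0), (2.0, 2.4, 3.0))
print(pattern_for(rel))  # → (0, 6)
```

In the installation these values would travel from vvvv to Pure Data as messages rather than function calls, but the shape of the mapping is the same.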
Music composition: Mária Júdová
Tracking system: Andrej Boleslavský
Camera and editing: Mária Júdová
Sound: Petr Zábrodský
Thanks to: Institute Of Intermedia
Presented at DEAF (Dutch Electronic Art Festival), 2014
see more at mariajudova.net
Light and shadows.
EPILOG is an interactive room installation using light, sound and haze. Immersing yourself in a 25-minute, constantly transforming world of moving images and sounds, you are invited to interact by physically responding to the moving patterns – either on your own or together with others.
An interactive floor projection reacting to your movements in real time gets visitors moving and, through that movement, creates three-dimensional spaces made of light.
Tracking cameras, fog and a seamless double projection spawn searchlights, reactive light walls and rays that bounce off your body.
Each of the eleven scenes follows a unique principle: you may be chased by a simulated, abstract school of fish, or your movements may burn into the fog and leave behind a swath of light. Sound generated in real time illustrates the movements and intensifies the immersive experience.
Concept / Design / Code:
Sebastian Huber, Johannes Timpernagel, Michael Burk – http://www.schnellebuntebilder.de
Moritz Haberkorn – http://www.morast.at/
büroberlin – http://www.bueroberlin.net/
Prof. Dr. Axel Buether
Pure Data – http://puredata.info/
tanz! Wie wir uns und die Welt bewegen (dance! How we move ourselves and the world)
October 2013 – July 2014
Rosenpictures – http://www.rosenpictures.com/
Excerpts from the live AV performance by Cristian Vogel and Sune Petersen at the third installment of Champs Magnetique, techno and visuals at the Le Balzac cinema in Paris.
Both the music and the visuals are generated live on stage.
The music is improvised in real time using a computational approach to rhythmic electronic music. I use my own sequencers and synthesis techniques in a music language called Kyma, which I have been programming in since 2006. I call my performance software "The Never Engine" (also the name of my 2006 album on Tresor Records, where you can hear the first prototypes).
The visuals are likewise improvised in real time, using a simple geometric shape and a video feedback loop to create a live visual instrument with a very organic feel. I call the instrument "The Schmidder".
The motion of the geometry is affected by control data sent from Cristian's Kyma system.
The Schmidder is patched in http://vvvv.org
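The core of a video feedback loop like the one described above can be sketched numerically: each new frame blends the freshly drawn shape with a decayed copy of the previous frame, which is what produces the organic trails. This is a minimal sketch under assumed parameters; the actual instrument is a vvvv patch, and the 0.9 feedback gain is an illustration, not Sune Petersen's setting.

```python
def feedback_step(prev_frame, shape_frame, gain=0.9):
    """One frame of the loop: decay the previous frame by `gain`
    and add the newly rendered geometric shape on top."""
    return [gain * p + s for p, s in zip(prev_frame, shape_frame)]

frame = [0.0] * 4                 # tiny stand-in for a frame buffer
shape = [0.0, 1.0, 0.0, 0.0]      # the "simple geometric shape"
for _ in range(3):                # run the loop for three frames
    frame = feedback_step(frame, shape)
print([round(v, 2) for v in frame])  # → [0.0, 2.71, 0.0, 0.0]
```

The same structure holds in the real patch, only with full video frames in place of the list and with the shape's motion modulated by the incoming Kyma control data.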