Date/time: Thursday, 30 July 2015, 7pm
Location: Goldsmiths, University of London, Professor Stuart Hall Building, Room LG01
On 30 July, we presented audiovisual projects developed during the second Gen.AV Hackathon on Generative Audiovisuals (25-26 July). The projects were performed by their creators. All projects are open source and can be downloaded from https://github.com/avuis.
About the projects
Butterfly
A sort of general-purpose scope viewer that can be mapped arbitrarily to any internal signals of SuperCollider synths. It is intended as an interactive instrument that can be manipulated in real time for generative audiovisual performances. The visual tool can be used with any SuperCollider synth definitions, including your own, as long as they follow some naming conventions.
Authors: Fiore Martin, Patrick Hartono
Project link: https://github.com/AVUIs/Butterfly
cantor-dust
Author: Louis Pilfold
Project link: https://github.com/AVUIs/cantor-dust
Esoterion Universe Gestenkrach
Esoterion Universe Gestenkrach is a fork of Esoterion Universe that adds support for the SuperCollider sound engine, with a much wider variety of sounds, and LeapMotion sensor integration for more playful universe navigation and planet sculpting. The UI was also adapted and made more directly responsive to LeapMotion input. Original description: under Gen.AV 1
Authors: Borut Kumperščak, Jens Meisner
Previous contributors: Coralie Diatkine, Matthias Moos, Will Gallia
Project link: https://github.com/AVUIs/EsoterionUniverseGestenkrach
OnTheTap
A tap-reactive audiovisual system. The system plays with the tactile, analog feel of tapping surfaces as a digital input device. This input and its gestures in turn drive sound and visuals expressively.
Authors: Alois Yang, George Profenza, Sabba Keynejad
Project link: https://github.com/AVUIs/OnTheTap
residUUm
residUUm is an attempt to sonify a particle system whose inhabitants exchange and discard their sonic characteristics as they collide, leaving remnants that contribute to a din of noise as their larger bodies fade. The sound engine and graphics are done, but the exchange of characteristics has yet to be implemented. The project uses Processing to send visual characteristics of particle bodies to Pd for sonification.
Authors: Ireti Olowe, Giulio Moro
Project link: https://github.com/AVUIs/residUUm
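The collision-exchange mechanism described above is not yet implemented in the repository, but it could be sketched roughly as follows. This is only an illustration in Python (standing in for the project's Processing/Pd pair); the `Particle` fields, the one-characteristic swap, and the remnant list are all our assumptions, not the authors' design:

```python
import math
import random
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    vx: float
    vy: float
    radius: float
    timbre: dict  # sonic characteristics, e.g. {"freq": 440.0, "amp": 0.5}

def step(particles, dt=1.0):
    """Advance the system one frame; return remnants shed by collisions."""
    # move every particle
    for p in particles:
        p.x += p.vx * dt
        p.y += p.vy * dt
    remnants = []
    # check each pair for overlap
    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            a, b = particles[i], particles[j]
            if math.hypot(a.x - b.x, a.y - b.y) < a.radius + b.radius:
                # colliding bodies trade one sonic characteristic...
                key = random.choice(sorted(a.timbre))
                a.timbre[key], b.timbre[key] = b.timbre[key], a.timbre[key]
                # ...and discard a remnant that feeds the background din
                remnants.append({key: a.timbre[key]})
    return remnants
```

In a full version, each frame's remnants would be sent (e.g. over OSC) to the Pd patch for sonification while the particle positions drive the visuals.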
Wat (An Audio-Visual Exploration of Chaotic 2-Dimensional Dynamical Systems)
We would like to develop a 2D or 3D visualization based on Continuous Cellular Automata with various evolving rulesets, and sonify the result in a musical way. The core principle involves applying a matrix of mathematical operations (generally non-linear functions) to an image specifying the starting conditions. We apply the operation matrix by sliding it across the image (as in convolution) and applying the operation in each element of the matrix to the corresponding element of the image matrix. The result is complex, evolving, unpredictable moving textures that can be sonified with the right method. Furthermore, the rules of the system can be 'performed' by varying the operation set and coefficients in real time through some form of input.
Authors: Alessia Milo, Bogdan Vera, Christian Heinrichs
Project link: https://github.com/AVUIs/wat
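The sliding-operation-matrix update described above can be sketched in a few lines of Python. Note that the toroidal wrap-around, the averaging of results, the tanh squashing step, and the sample `ops` ruleset are our assumptions for illustration, not details taken from the project:

```python
import math

def step(grid, ops):
    """One update of a continuous cellular automaton.

    `ops` is a k x k matrix of unary functions. It is slid across the
    grid (as in convolution); each function is applied to the
    corresponding cell in the neighbourhood, and the results are
    averaged and squashed back into [0, 1].
    """
    h, w = len(grid), len(grid[0])
    k = len(ops)
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(k):
                for dx in range(k):
                    yy = (y + dy - r) % h  # wrap-around (toroidal) boundary
                    xx = (x + dx - r) % w
                    acc += ops[dy][dx](grid[yy][xx])
            v = acc / (k * k)
            out[y][x] = 0.5 * (math.tanh(v) + 1.0)  # squash to keep values bounded
    return out

# A 'performable' ruleset: swapping these functions live changes the texture.
ops = [
    [math.sin, lambda v: v * v, math.cos],
    [lambda v: 1 - v, lambda v: v, lambda v: -v],
    [math.cos, lambda v: v * v, math.sin],
]
```

Iterating `step` on a noisy starting image yields the kind of evolving texture described, and the `ops` matrix is the hook for real-time performance.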
Thank you to London Music Hackspace for hosting the Hackathon.
Thank you to Peter Mackenzie for the technical support.
Thank you to all hackathon participants.
Gen.AV 2 is co-organized by the EAVI (Embodied AudioVisual Interaction) group at Goldsmiths, University of London, in collaboration with London Music Hackspace. It is part of the Enabling AVUIs research project conducted at EAVI and supported by the European Union (Marie Curie programme). More information: http://avuis.goldsmithsdigital.com