Saturday 27th September & Sunday 28th September // 1000-1800
Synchrocities is a multi-channel audio installation in which a series of open microphone streams is analysed, in pairs, through FFT-based spectral analysis. The governing system, built in Max/MSP, performs a specific process whenever it detects a synchronous event: a period of time in which simultaneous activity exists above a certain amplitude threshold and within a pre-defined frequency range (FFT bin). Four streams are analysed in a bi-focal system. When a simultaneous event occurs, two processes intervene. In the first instance, the governing system replays the specific FFT bin in which the synchronous event transpires; this may be called a ‘frozen’ event. In the second instance, the system replays the sonorous activity through a convolution technique: the synchronous events from each stream are convolved with each other and then replayed through the space, accentuating the interrelations between the two remote places.
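The detection and processing described above can be sketched as follows. This is a minimal illustration in Python/NumPy, not the actual Max/MSP patch; the frame length, FFT size, threshold value, and all function names are illustrative assumptions.

```python
import numpy as np

def detect_synchronous_bins(frame_a, frame_b, fft_size=1024, amp_threshold=1.0):
    """Return the FFT bin indices in which both streams are simultaneously
    active above the amplitude threshold -- a 'synchronous event'.
    frame_a / frame_b are mono audio frames; threshold is on the raw
    (unnormalised) magnitude spectrum. All values here are assumptions."""
    spec_a = np.fft.rfft(frame_a, n=fft_size)
    spec_b = np.fft.rfft(frame_b, n=fft_size)
    active = (np.abs(spec_a) > amp_threshold) & (np.abs(spec_b) > amp_threshold)
    return np.nonzero(active)[0]

def freeze_bin(frame, bin_index, fft_size=1024):
    """'Frozen' event: zero every bin except the one in which the
    synchronous event transpired, then resynthesise only that bin."""
    spec = np.fft.rfft(frame, n=fft_size)
    frozen = np.zeros_like(spec)
    frozen[bin_index] = spec[bin_index]
    return np.fft.irfft(frozen, fft_size)

def convolve_events(frame_a, frame_b):
    """Convolve the two streams' event audio via spectral multiplication,
    equivalent to linear convolution of the two frames."""
    n = len(frame_a) + len(frame_b) - 1
    return np.fft.irfft(np.fft.rfft(frame_a, n) * np.fft.rfft(frame_b, n), n)
```

In this sketch a bin is "synchronous" only when it crosses the threshold in both spectra at once, so a tone present in a single stream triggers nothing; the convolution step then emphasises exactly the spectral material the two sites share.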
The installation also contains a visual element: a map displayed on the front wall. The map remains hidden until a synchronous moment emerges, at which point the event reveals the specific locations. The interplay between the visual representation and the sonic events allows the listener to forge an understanding of the spectral relationship between the paired sites. Synchrocities would ideally be presented within a quadraphonic array. Two pairings will be made from streams sourced through the Locus Sonus open microphone platform.
The pairings will be decided on an arbitrary basis. As the ISSTC 2014 theme is site-specific, one of the microphone streams will be located in Maynooth for the period of the conference, streaming from a Raspberry Pi and microphone set-up. A custom Max/MSP patch will govern the installation, performing the FFT analysis and distinguishing the synchronous events. The patch will also govern the replaying of the streams, the frozen FFT bins, and the convolved synchronous events.
Robin Renwick is a PhD candidate at the Sonic Arts Research Centre at Queen’s University Belfast. His specific topic, under the guidance of Prof. Pedro Rebelo, is a network-sourced approach to network music: the concepts, theories, abstractions, and implications of such a strategy. He began his formal academic career at Trinity College Dublin, completing his undergraduate degree in Business and Economics. Following this he completed a BETAC course in Sound Engineering at the Sound Training Centre, Dublin, followed by a two-year FETAC course in Theatre and Drama at the Further Education College in Cork. Upon completing these, he decided, somewhat foolhardily, that a more rigorous educational path was called for, and went on to complete his MSc in Music & Technology at Cork’s School of Music before applying, and being accepted, at Queen’s to pursue his current path.