In LoverDre84, we will use the first version of the MIMIC platform for a collaborative coding performance in the electro style of early-80s West Coast hip-hop.
The recent bout of isolation has allowed us to catch up on our favourite streaming shows, one such being Netflix’s informative nostalgia-fest Hip-hop Evolution. In the first season, we see a young Dr. Dre, pre-NWA, cutting his teeth DJing West Coast electro. Whilst this shouldn’t be too surprising, it was still a revelation to the two performers of this piece. This proposal is a tribute to a young Dre, the shiny silk shirts, the flat tops, the breathy samples.
For some more context: in the early 80s, Afrika Bambaataa in NYC, and Dr. Dre and Egyptian Lover in Los Angeles, began using the Roland TR-808 drum machine to make beats unlike anything heard before. They also began sampling and blending other musicians' work into their compositions to create unlikely mash-ups. Their beats, groove, and sound, and their playful but intellectual approach to music-making, are still relevant and still groove the dance floor almost 40 years later.
What would happen if these artists had an environment for collaborative music-making? In the LoverDre84 performance, we are going to use the MIMIC platform, an interactive web-based coding environment, to explore electro music with current technologies. In particular, we are going to use the MaxiInstrument library, an AudioWorklet-backed library with synthesizer and sampler capabilities, to deconstruct and play around with electro-style beats and gimmicks.
Louis McCallum is an experienced software developer, researcher, artist, and musician. He is currently an associate lecturer and researcher in the Embodied AudioVisual Interaction Group at Goldsmiths, University of London. His recent work includes developing the IML tool Wekinator with Dr Rebecca Fiebrink, and working as a researcher on the AHRC-funded MIMIC project (https://mimicproject.com), building web-based tools for musicians to use machine learning. His work, both individual and collaborative, has been widely exhibited in London, Dublin, New York, and Austria.
Gabriel Vigliensoni is a Montréal-based musician, producer, and researcher from Chile. His artistic work is informed by formal musical training and extensive studies in sound recording, music production, new musical interfaces, and music information retrieval. He is currently a postdoctoral research fellow at Goldsmiths, University of London, doing practice-based research into the creative capabilities and affordances of the deep learning paradigm, applying it to assist musical composition.