This project began with a chat with a scientist at the Babraham Institute in Cambridge, Mikhail Spivakov. His work is related to my old area of research, and he had the idea that maybe I could use some of their data to create a piece of music. The EP began to unfold from there...
So I travelled up to Cambridge to meet him, the rest of his lab group and their collaborators, to talk about what they are doing there and think about how it could translate to a music/visual project. We decided to focus on their research into chromosome conformation capture, which experimentally detects points of contact between chromosomes (long strings of DNA), and then employs computational models of folding to predict how the chromosomes are packaged up into a complex tangled bundle of strings. This simulated folding, creating our best guess at real chromosome structure, is a beautiful process, so that beauty became the focus of the project.
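As a loose illustration of that pipeline (not the Babraham group's actual code), a toy bead-chain model can turn contact data into a folded shape: beads on a chain are relaxed so neighbours sit about one unit apart, while experimentally "contacting" bead pairs are pulled close together. All parameter names and values here are invented for the sketch.

```python
import math
import random

def fold(n_beads, contacts, steps=500, k_chain=0.1, k_contact=0.05):
    """Toy polymer folding: relax bead positions so chain neighbours sit
    ~1 unit apart and 'contacting' pairs are pulled to ~0.5 units.
    Purely illustrative, not a real conformation-capture pipeline."""
    rng = random.Random(0)
    pos = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(n_beads)]
    # Springs: (bead_i, bead_j, rest_length, stiffness)
    springs = [(i, i + 1, 1.0, k_chain) for i in range(n_beads - 1)]
    springs += [(i, j, 0.5, k_contact) for i, j in contacts]
    for _ in range(steps):
        for i, j, rest, k in springs:
            d = [pos[j][a] - pos[i][a] for a in range(3)]
            dist = math.sqrt(sum(c * c for c in d)) or 1e-9
            f = k * (dist - rest) / dist  # move towards rest length
            for a in range(3):
                pos[i][a] += f * d[a]
                pos[j][a] -= f * d[a]
    return pos

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# One detected contact between the two chain ends folds the chain back on itself.
folded = fold(20, contacts=[(0, 19)])
```

After relaxation the two "contacting" ends sit close together even though they are far apart along the chain, which is the basic inference behind predicting 3D structure from contact maps.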
Luckily for us, Andy Lomas was interested in getting involved, and being the genius he is, he built an interface from scratch for mapping the raw data into the Unreal gaming engine, which could be used for creating video sequences as well as a VR experience. Once we had the ideas and video aesthetic, it was just a matter of me figuring out how to make the music fit. The raw data wouldn't map well explicitly to music: music is of a very particular form, and raw data not suited to that form just yields indiscernible noise. So I settled on two techniques for musical representation of the video and ideas, one based on live instrumentation, the other on generative computational approaches.
Chromos is the first of these two pieces, using live instrumentation and accompanied by the simplest visual form of the project, showing a single chromosomal aggregation process. We wanted to show the data in this unadulterated form, so you can see the real science in action, and the real chromosome structure, along with glowing red areas appearing to show where genes are most highly activated. It's a glimpse into the complexity and form of one of the most important molecular structures in all of life. So musically I wanted to try and capture some of this grandeur, along with the complex messiness involved, which still yields a coherent, functional outcome (the living being it codes for and creates!). To try and achieve this I turned to one of my favourite instruments, the sansula, and I played lots of tempo-free melodies, which I layered to mimic the complexity, and then used to define the underlying chord sequence as the coherent centre and function. I decided to keep it free of percussion and other elements, to focus on the beauty and peace of the visual process. I just added a classic wash of real Roland RE-201 spring reverb and some work on widening the psycho-acoustic space with simulated binaural effects.
Coils of Living Synthesis
The second track on the EP is part of the collaborative project with Andy Lomas and researchers from the Babraham Institute in Cambridge - Mikhail Spivakov, Peter Fraser and Csilla Varnai.
Andy built a system in Unreal to map their data on DNA structure to a visual simulation, and Jennifer Tividad used the output footage to create this video.
The first of the two videos from the project, titled Chromos, shows unadulterated footage of a single simulated chromosomal folding process. This video, by contrast, uses the data visualisations more freely to create pleasing visual forms, rather than being so focused on delivering the raw science. You still see our best predictions of real chromosome structure in there of course, just with layering, inversions and various treatments. But what you see is still an indication of the types of molecules that control every cell in our bodies. It always amazes me how such a messy tangle of strings with a code running along them can direct and control something as complex as a human being!
The red areas which appear show regions where genes are highly activated, and later in the video the green and blue colours mark out separate chromosomes. The bright lightning-like strikes just show fragments of DNA which travel far from the centre of the simulated folding process - that's not something biological, it's an artefact of the simulation, but one with a nice visual effect. Another artificial property is all the straight lines - we had a big set of spatial data points, and these were joined by straight lines for simplicity, but in reality it would presumably be all curves and flowing lines rather than the jagged effect we see here. Although I do like the aesthetic it creates.
One of the main scientific messages of these video projects is the relationship between structure and function. DNA folds up in very complex, and seemingly messy, ways, as you can see in the video, but it is thought that this complex folding is directed, in order to expose, or hide, specific parts of the DNA molecules. The parts which are locked away between other strands of DNA cannot be accessed, and therefore the genes at those parts cannot be switched on. So the complex structure you see in the video has a function; it's not entirely random, and it can be controlled from one generation to the next in order to influence how genes are switched on and off. This area of research is called epigenetics, and it caused some arguments when it was first proposed, because it adds an extra layer of heritable changes in gene function which don't involve changes to the DNA sequence itself, and so don't operate through the usual mechanisms of evolution described by Charles Darwin. It was like a religious heresy to question such a long-established and well-supported scientific dogma as Darwinian evolution, but the evidence was there, so everyone had to accept the ideas eventually.
Getting back to the music, I wanted to try and capture this messy complexity in the soundtrack, but it was too messy and too complex for me to achieve by hand, so I went for a partly generative computational approach. I set up more than 100 layers of sound, many with their own sound synthesis path, where each synthesiser was being constantly randomised to yield a mangled output. It was a right mess. But that was a good starting point, in line with what I saw from the data. The task then became a long process of finding reasonably musical parts in the mess, and arranging them in as many layers as I could while still having them combine into a consistent musical whole (mirroring the amazingly functional role of the messy bundle of DNA molecules).
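The generate-then-select idea can be sketched as a rough analogy: produce many randomised layers, then keep the ones that score highest against some notion of musicality. The "musicality" score below (closeness of random partials to a pentatonic-ish scale) is entirely made up as a stand-in for listening by ear; it's not how the actual track was made.

```python
import math
import random

# A minor-pentatonic-ish reference scale starting at A3 (220 Hz).
SCALE = [220.0 * 2 ** (s / 12) for s in (0, 3, 5, 7, 10)]

def random_layer(n_partials=4, rng=random):
    """One 'layer': a handful of random sine partials as (freq_hz, amp)."""
    return [(rng.uniform(100, 1000), rng.uniform(0.1, 1.0))
            for _ in range(n_partials)]

def musicality(layer):
    """Crude score: amplitude-weighted closeness (in semitones) of each
    partial to the nearest scale tone. Higher = more 'in tune'."""
    score = 0.0
    for freq, amp in layer:
        semitones = min(abs(12 * math.log2(freq / s)) for s in SCALE)
        score += amp / (1.0 + semitones)
    return score

rng = random.Random(1)
layers = [random_layer(rng=rng) for _ in range(100)]  # the randomised mess
keepers = sorted(layers, key=musicality, reverse=True)[:10]  # curated layers
```

The interesting part in practice is exactly what this sketch fakes: the selection criterion, which in the track was a human ear rather than a formula.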
I used a lot of simulated binaural processing on several layers, so that when you listen with headphones you can hear certain layers of sound pop out around your head and be clearly heard even though there's a lot of other layers playing simultaneously. Because of the randomised synthesis approach, most of the audio is digitally generated, but I also added some old analogue synths for the underlying chords, and I also applied many layers of partially randomised processing to the percussion to try and give it all a constantly morphing organic feel in line with the video content.
Molten Landscapes
This music and video project came from a collaboration with a mathematical artist who works under the name Cornus Ammonis. He creates simulations of pattern formation, using ideas originally developed by Alan Turing in his attempts to explain and model the mathematics of living systems. These systems create beautiful warping, interacting layers of simulated substances, and we thought this visual effect could lend itself well to an imagined alien planetary surface shaped by some unknown form of geological process - molten landscapes.
It's a slowly evolving, hypnotic visual process, so the music has the same form, with a repetitive syncopated two-note motif playing under a gradually developing chord structure which becomes ever more layered. It's all about atmosphere and sitting back to let it wash over you, I think; it's not a sharply defined piece, and it needs to be given some time to develop, but hopefully the result is worth the time you give it.
After Cornus Ammonis had created a load of stock video sequences using his models, I passed the footage on to Morgan Beringer (who created the awesome Unbounded video from Emergence). Morgan applied his trademark processing style to the previously raw footage to blend and layer it, giving the whole visual much smoother movement and a rich organic feel. Sorry that's not a very technical description of what he did, but hopefully you'll know what I mean when you see it!
Cornus Ammonis said about the project:
"Two systems are at work here. The first simulates a grid of points connected to their neighbors by nonlinear springs—that is, springs that break (exert much less force) once they extend past a certain point. Each point has both a position and velocity, which is updated by summing spring forces and applying reverse advection. The collective action of the spring system produces constantly-shifting hills and valleys of points that become oriented in nearly the same direction. Straddling these valleys are groups of springs that have "fractured" in opposite directions, in a process similar to the forces governing the patterns that form in the cracked earth of a dry riverbed.
The second system is a simulation that rides along the "surface" of the spring grid. This system produces Turing pattern-like stripes and blobs that split and merge as they propagate across the underlying spring system vector field, yielding an overall effect analogous to seafoam on an ocean wave. Like a fluid simulation, this system propagates quantities over a vector field, but no divergence minimization or advection process is used—the process is mediated only by the interplay of diffusion and various quantities computed on the vector field, namely, curl (vorticity), divergence, and the average distance between neighboring vectors. The equations governing the interactions between these quantities were chosen so that the system remains constantly in a state of flux, rather than settling into a stable configuration—they were not chosen so that the system models some specific physical process.
The output of these two systems is a vector field. In order to turn this vector field into a terrain, a solver integrates over the field to produce a height at every point (i.e. a heightmap). It is not possible to perfectly recover a heightmap from these systems, so the output of the solver is a continually-updated approximation. The terrain is rendered using raymarching, so that the entire process, from simulation to rendering, is computed entirely on the GPU."
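Two of the ingredients he describes can be sketched in miniature: a nonlinear spring that exerts much less force once extended past its break point, and (in 1D, where it is exact) the idea of integrating a slope/vector field back into heights. These are illustrative toys with invented parameters, not Cornus Ammonis's actual shader code.

```python
def spring_force(extension, k=1.0, break_point=1.5, broken_scale=0.1):
    """Nonlinear spring: Hookean (-k * x) up to the break point, then
    'fractured', exerting only a small fraction of the force.
    Parameter names and values are illustrative."""
    if abs(extension) <= break_point:
        return -k * extension
    return -broken_scale * k * extension

def integrate_heights(slopes, dx=1.0):
    """1D analogue of the heightmap solver: accumulate slopes into
    heights. In 2D the field can be non-conservative (nonzero curl), so
    a real solver can only produce a continually-updated approximation."""
    h = [0.0]
    for s in slopes:
        h.append(h[-1] + s * dx)
    return h
```

The jump in `spring_force` at the break point is what lets neighbouring springs "fracture" in opposite directions and form the dry-riverbed-style crack patterns described above.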
Four Tone Reflections
This track is a bit of a reaction to our constant information bombardment and instant-gratification music culture - you just need to step back from everything for a while and let it wash over you for this one to work.
The whole thing is hinged on a sequence of 4 notes playing at different rates forwards and in reverse using Alexander Randon's Fugue Machine. It's a simple pattern but it grabbed me and made me want to strip the track back to the minimum possible number of elements to focus on the pattern.
I think we pick up on the fact that there's an interesting pattern hiding in there, despite the simplicity. The fast riff that opens up is a classic Juno-6 sound too, which also made me want to put it out there unadulterated. I gave in eventually and added one percussive element and some extra chord layers, but you have to wait about 8 minutes for that.
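The multi-playhead idea behind this kind of pattern (one short note sequence read at different rates and directions, as in Fugue Machine) can be sketched like this. It's a loose sketch of the concept only, not Fugue Machine's actual engine, and the motif notes are just an example.

```python
def playhead_events(notes, rate, reverse=False, total_beats=8):
    """Generate (time_in_beats, note) events for one playhead looping
    over `notes` at `rate` steps per beat, optionally reversed."""
    seq = list(reversed(notes)) if reverse else list(notes)
    events, t, i = [], 0.0, 0
    step = 1.0 / rate
    while t < total_beats:
        events.append((round(t, 6), seq[i % len(seq)]))
        t += step
        i += 1
    return events

motif = ["A3", "C4", "E4", "G4"]  # hypothetical four-note sequence
voices = [
    playhead_events(motif, rate=1),                  # base speed, forwards
    playhead_events(motif, rate=2),                  # double speed, forwards
    playhead_events(motif, rate=0.5, reverse=True),  # half speed, reversed
]
```

Because the playheads loop at different rates, their alignments drift in and out of phase, which is where the "interesting pattern hiding in the simplicity" comes from.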
It's certainly not for everyone, but I hope it works for you...it gets pretty intense too btw, but I guess you've had that before from me.
Thanks for giving some of your time for a listen!
Chromos [Cosmin TRG's Mutation]
Cosmin turned in a great remix for this, one that's been picked up by a lot of DJs and works so well in the darker moments on the dancefloor. It really is a Mutation!