Hi, everyone –
I’m a newbie to this space – referred by a dear friend. So far, I’m intrigued by the interesting ML/AI functionality of the site (seems in service of the good) and what seems to be a bunch of great people.
I’m here to learn more about the amazing technology with which we’re surrounded, other people’s workflows, processes, and musical output.
It so happens that I’m on the cusp of releasing some music that I might as well share here by way of introduction.
This music has an interesting backstory – at least to me: Tim is one of my oldest friends and musical soulmates. We released a record of improvised ambient-ish acoustic piano/electric guitar duets in 1992. Since then, we’ve often talked about doing another one, but finding the time proved an elusive quest. Enter pandemic lockdown and no live music, and voilà – some time presents itself. So this is a file-sharing process with a little bit of a twist: Tim doesn’t have a computer or home studio setup. So I gave him a Zoom H6 with a card loaded with sonic ideas for him to react to with his guitar. (Funny aside: I don’t have a card reader, so I thought I could just drag the files off my computer and onto the Zoom, but it turns out you can’t do that, despite the USB connectivity. So I had to dub down the audio. First time using the record button on a hardware machine in ages.)
For my part, I decided to do a deep dive into Ableton and really try to discover the magic that I know resides in it and animates the faces of so many of my friends. I’d used Live over the years for playing VSTs on stage with my controllers, but tended to favor traditional linear DAW recording, usually via Logic. (Logic also sounds better, to my ears, and has higher-fidelity preloaded sounds.) I got the 10 Suite (just upgraded to 11 the other day and am excited to check it out), gathered my hardware synths and the hippest plug-ins I had, with the idea of creating electronic soundscapes while avoiding conventional keyboard playing. I was going to use arpeggiation and real-time parameter manipulation to make the oscillators come to life without hijacking them with a tempered keyboard to serve the master of diatonic harmony.

I haven’t yet gone down the modular rabbit hole, but I have several hardware synths and some cool plugs and Ableton packs. I was going to use the various flavors of the Live arpeggiator, as well as the absolutely killer internal arpeggiator in the Novation Launchkey Mini 25-key controller. My I/O sitch is the RME Babyface Pro, which sounds fantastic. I had MIDI control info coming out of Live and into some synths, with audio signal returning. You’ll hear the DSI Prophet ’08 and Mopho creating a lot of bass and textural content, with Behringer’s Model D knockoff outputting some arpeggiator-driven licks (triggered mostly by the Novation). The Live arpeggiator spoke mostly to the Mopho, but also to the Prophet and a Behringer DeepMind 12 (I also used the internal arpeggiators of those instruments). The Live arpeggiator worked best in terms of clock sync, but warping makes everything external work out in its own way. Anyway, I love arpeggiators, and was swimming in them…
My aural vision was to create a bed of babbling circuits for Tim to lay down his typically gorgeous space guitar sounds. I envisioned that the process would involve him throwing down ideas, sending the box back to me, where I would pull his tracks into the Live session, and from which I would create guitar clips. I thought we might send parts back and forth, but ultimately, I’d “compose” the music by recording and saving a few clip playback performances to evaluate. Or such was the plan, at least. In order for this sound collage idea to work, it was imperative that Tim run a direct-in signal from his guitar and pedal rig into the Zoom. That way, I’d get sonic information I could edit and loop as I saw fit.
Of course, as guitar players are – quite understandably – wont to do, he preferred to play through his amp and record the sounds in the air. (Had I known, I would have given him a good mic for that purpose, rather than having him use one of the mics included with the H6.) He was very happy with the results of the experiment when he called to tell me that the Zoom was winging its way back to me in the mail. He’s generally not effusive and is very critical, so I was way psyched. But when I uploaded his audio into the session, I was immediately disappointed to realize that I wouldn’t be able to edit the audio and that I’d gotten such a room imprint on my cosmic ethereal tracks.
But then I listened. The guitar was absolutely beautiful. So soulful and so dialed into the environments I sent him. He heard a narrative arc in the sequence of sonic vignettes. There was indeed a flow, but born more from my natural aesthetic sense than out of intent. I began to understand not only that it was fine to toss my original concept out the window, but that the result was turning out much better than anything I had imagined. I was hearing a complete 26-minute composition unfold before me. So I ended up not editing a note of this linear, 26-minute flow. We did end up (after deciding to release it publicly) breaking the piece into 3 parts (tracks 2-4 at the link above) so that we could make a vinyl record.
But now I was faced with a new problem: I had no record of the music in the Live session since sending him that random clip performance a couple of weeks earlier. All I had was the dubbed-down audio track from the Zoom, and that just didn’t sound good enough, nor would it afford me any mix flexibility in the end. So I had to listen to that audio carefully in order to recreate what I had sent him. At this point, I switched from my Session View orientation to Arrangement View; it would no longer be a clip-based project. And I needed to sync up the tracks – a much more painstaking process than I had envisioned. Long story longer, I got it all synced after much listening and much back and forth with Tim, who listened down to every mix attempt I posted to a private SoundCloud link. While the 3-movement piece is definitely still driven by synth-based rhythms and textures, I ended up layering in some harmonic and melodic content.
For the rest of the record, I took his alternate takes and used that guitar material as the basis for my own reactions. So we sort of flipped the concept, adding 5 more pieces to the work. Here I added some more traditional keyboard sounds & playing, some samples, and even a backbeat-driven sampled drum part on a tune where I was able to loop a rhythm guitar part and create a funk piece, more or less. The album closes with a generative Max for Live patch that works swimmingly with the ethereal cry of Tim’s Tele.
Thanks for reading, and I hope you enjoy. I’ll be diving into the material posted here.