Disquiet Junto Project 0589: Auto Play

Welcome to the latest Disquiet Junto project. These instructions popped up at disquiet.com/0589 (thanks, powers of automation!) shortly after 12:10am Pacific Time on Thursday, April 13. (I was deep asleep at the time.) The email containing those instructions went out via tinyletter.com/disquiet-junto later in the morning (after I woke up), and then I posted them here, on the Junto Slack, on my Mastodon account, on Instagram, etc. (I’m taking a Twitter break at the moment.) And if you’re on a platform, like Mastodon or Instagram, that uses hashtags, please use the #DisquietJunto tag. Much appreciated.

Disquiet Junto Project 0589: Auto Play

The Assignment: Automate something manual — or vice versa

Step 1: Think about things you’re used to doing when you make music, habits you take for granted.

Step 2: Record a track in which you automate something you usually do manually, or manually do something you usually automate — or both.

This project was inspired by recent conversation in the Disquiet Junto Slack.

Eight Important Steps When Your Track Is Done:

Step 1: Include “disquiet0589” (no spaces or quotation marks) in the name of your tracks.

Step 2: If your audio-hosting platform allows for tags, be sure to also include the project tag “disquiet0589” (no spaces or quotation marks). If you’re posting on SoundCloud in particular, this is essential for locating your track later when the project playlist is assembled.

Step 3: Upload your tracks. It is helpful but not essential that you use SoundCloud to host your tracks.

Step 4: Post your track in the following discussion thread at llllllll.co:

https://llllllll.co/t/disquiet-junto-project-0589-auto-play/

Step 5: Annotate your track with a brief explanation of your approach and process.

Step 6: If posting on social media, please consider using the hashtag #DisquietJunto so fellow participants are more likely to locate your communication.

Step 7: Then listen to and comment on tracks uploaded by your fellow Disquiet Junto participants.

Step 8: Also join in the discussion on the Disquiet Junto Slack. Send your email address to marc@disquiet.com for Slack inclusion.

Note: Please post one track for this weekly Junto project. If you choose to post more than one, and do so on SoundCloud, please let me know which you’d like added to the playlist. Thanks.

Additional Details:

Length: The length is up to you.

Deadline: This project’s deadline is the end of the day Monday, April 17, 2023, at 11:59pm (that is, just before midnight) wherever you are. It was posted on Thursday, April 13, 2023.

Upload: When participating in this project, be sure to include a description of your process in planning, composing, and recording it. This description is an essential element of the communicative process inherent in the Disquiet Junto. Photos, video, and lists of equipment are always appreciated.

Download: It is always best to set your track as downloadable and to allow for attributed remixing (i.e., a Creative Commons license permitting non-commercial sharing with attribution, allowing for derivatives).

For context, when posting the track online, please be sure to include the following information:

More on this 589th weekly Disquiet Junto project, Auto Play (The Assignment: Automate something manual — or vice versa), at: https://disquiet.com/0589/

About the Disquiet Junto: https://disquiet.com/junto/

Subscribe to project announcements: https://tinyletter.com/disquiet-junto/

Project discussion takes place on llllllll.co: https://llllllll.co/t/disquiet-junto-project-0589-auto-play/


The Junto directive to “Automate something manual” led me to apply arpeggiators on the MIDI tracks I’d manually recorded using the computer keyboard in Ableton Live.

Except for the drums – which got the Automaton effect!

Then I went looking for a film about automation in the public domain.
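For anyone curious what the “automate something manual” move looks like in the arpeggiator case, here is a rough sketch in Python — purely illustrative, not the poster’s Ableton Live setup; the note numbers, rate, and pattern are made up. The idea is simply that a chord you would otherwise play by hand gets expanded into a timed sequence of single notes.

```python
# Minimal illustration of what an arpeggiator automates: a held chord
# becomes a timed sequence of single notes. Note numbers and timing are
# invented for the example, not taken from the track described above.

def arpeggiate(chord, rate=0.25, cycles=2, pattern="up"):
    """Expand held MIDI notes into a list of (start_time, note) events."""
    order = sorted(chord) if pattern == "up" else sorted(chord, reverse=True)
    events = []
    t = 0.0
    for _ in range(cycles):
        for note in order:
            events.append((round(t, 3), note))
            t += rate
    return events

# A held C minor triad, arpeggiated at sixteenth-note spacing (120 BPM).
print(arpeggiate([60, 63, 67], rate=0.125, cycles=2))
```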


And the project is now live

Making nice loops requires skills and some manual work. Or not.
Intro and outro: a sine tone recorded without listening.
Main part, “the loop”: with a turntable, sewing thread, and tape you can automate the loop-making.
Well, you don’t need the tape, but it makes it simpler…
EP from 1966 with Country Four (Label: Amigo AMEP 501). Guess the track :smiley:


And the playlist is now rolling:



This is just the sound of a self-oscillating Erbe-Verb which has got some of the parameters linked to Mordax DATA. These have been set on a number of slow, interrelated cycles to give each of them some sort of continuous evolving movement, but I also ‘play’ them in real-time by raising the tempo of the cycles to get to the extremes and then pulling back to hold them there. I also manipulated the unlinked parameters on the Erbe-Verb manually.

This is just a short section of a longer recording that I quite liked - and think has some structure.
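The “slow, interrelated cycles” idea can be pictured with a small sketch: several LFOs with slightly different periods, each driving one parameter, so the combined motion keeps evolving and rarely lines up the same way twice. This is only an illustration — the parameter names, periods, and ranges below are invented, not taken from the Erbe-Verb/Mordax DATA patch described above.

```python
# Several slow sine cycles with different periods, each mapped to one
# parameter in the 0..1 range. Because the periods differ, the combined
# state drifts continuously rather than repeating on a short cycle.
import math

LFOS = {"size": 47.0, "decay": 61.0, "tilt": 89.0}  # parameter -> period in seconds

def parameter_values(t):
    """Return a 0..1 value per parameter at time t (seconds)."""
    return {name: 0.5 + 0.5 * math.sin(2 * math.pi * t / period)
            for name, period in LFOS.items()}

for t in (0, 30, 60, 90):
    print(t, {k: round(v, 2) for k, v in parameter_values(t).items()})
```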


I am exploring repetition as a narrative and form-building device. For this reason I often abuse the copy-and-paste features available in modern music production tools. To answer this week’s challenge, I played all occurrences of the repeated motives. I enjoyed doing this so much that I’ll stick to it.

Made with NI Maschine+ and ASM Hydrasynth.



I took either a literal or lateral view of this week’s challenge and used randomness as a form of automation. I used the Korg Opsix’s random patch generation to “automate” sound and note generation via the inbuilt sequencer, though I did manipulate the operator ratios as I recorded it into Ableton. I then processed the recording through Max4Live’s Xformer (for gates and accents), Sonic Charge Permut8 (for further glitches/pitch shifting), and Valhalla Supermassive (for reverb/delay). I then automated key parameters of each of these plug-ins using Max4Live LFO Random. The drums (which were programmed in Maschine) were also processed through Xformer and automated in the same way with LFO Random.
The result is a piece that I had relatively little to do with and that comes out differently every time I press play.
Have a great weekend everyone,
Rupert
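For a concrete picture of this kind of random parameter automation, here is a small sketch — not the actual Max4Live device, just an illustration under assumed step counts and smoothing: a smoothed random walk that wanders a parameter between 0 and 1, one value per step.

```python
# Smoothed random-walk automation: each step picks a new random target and
# glides part of the way toward it, giving a wandering but non-jumpy curve.
# Step count, smoothing amount, and the seed are invented for the example.
import random

def random_automation(steps=16, smoothing=0.3, seed=None):
    """Generate a list of 0..1 automation values, one per step."""
    rng = random.Random(seed)
    value = rng.random()
    curve = []
    for _ in range(steps):
        target = rng.random()                  # pick a new random destination
        value += smoothing * (target - value)  # glide part of the way toward it
        curve.append(round(value, 2))
    return curve

# One bar of sixteenth-note automation for, say, a gate-length parameter.
print(random_automation(steps=16, seed=589))
```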


Usually when I record something, I do everything by hand - especially mixing and recording. So for this project, I decided to automate mixing and recording in a semi-random manner (with Drambo and AUM) and do the rest manually.

First I tried this with singing but it turned into too much of a mess. So in the end I jammed with some percussion instruments with the help of my five-year-old daughter.

We played into several loopers, with automated recording and overdubbing of the loops. For recording we simultaneously used four mics, or rather a mic, a Zoom audio recorder (via line out), the built-in Organelle mic with the Cannabits effect, and the Microfreak Vocoder. The looper channels went into a master channel with four effects that were randomly turned on and off (same pattern as the loops).

We recorded one messy session and then overdubbed it with a second take. The result is one automated manual percussion mess (with some spoken instructions).
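The semi-random mixing idea — effects in the master chain switching on and off from cycle to cycle — can be sketched roughly as follows. This is only an illustration; the effect names, probability, and cycle count are invented, not the actual Drambo/AUM setup.

```python
# Each loop cycle, every effect is independently switched on or off with a
# fixed probability, producing a different "mix" on every pass.
import random

EFFECTS = ["delay", "reverb", "bitcrush", "filter"]

def effect_states(cycles=8, probability=0.5, seed=None):
    """For each loop cycle, decide which effects are active."""
    rng = random.Random(seed)
    return [{fx: rng.random() < probability for fx in EFFECTS}
            for _ in range(cycles)]

for i, states in enumerate(effect_states(cycles=4, seed=1), start=1):
    active = [fx for fx, on in states.items() if on]
    print(f"loop cycle {i}: {', '.join(active) or 'all effects off'}")
```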


Funny, I went completely overboard by automating almost everything I normally do manually: volume fades, arpeggiating, turning devices on and off, reverb amounts, stereo width. I did enter the main notes and manually adjusted where tracks start and stop.


I spent all day re-organising my modular, and realised that the best way was to create a ‘side instrument’ in the smaller box, which I thought of as something like a VCS3, with joystick, poorly tracking oscillators, and spring reverb, but without the pin matrix. The idea is to get me out of my comfort zone and force a different workflow. I used this box to create a tuned-by-ear drone, which I then played hands-on for filter sweeps, levels, and feedback control - things I normally automate (I usually patch the modular to play itself as much as possible, and then play other keyboards over it).

The result was quite different to my usual sound, and actually maybe retracing some of the direction I started out in when I first went modular, which could be a good thing.


Hey All,
I decided to go with an automated vocal. I chose a Mark E. Smith song to go over the tracks.
I also went in and manually cut tracks out in places. Hope all are well.

Peace, Hugh


hello.

this one was quite challenging. right from the outset i had an idea to compose the track using one of the ai tools i have, thus making it ‘automatic’, but this proved harder than i expected. after much trial and error i managed to put something together using orb composer s which i found quite difficult and unintuitive to use.

i exported the midi and brought it into bitwig, assigning instruments (chromaphone 3, pigments 4, analog lab v) and effects to the various tracks that orb composer had created. after listening through it a few times i felt it was quite pleasant but missing something. i had an idea and decided to search on freesound for the words ‘automatic’ and ‘manual’. curiously ‘manual’ yields only 518 results whereas ‘automatic’ gives 2626, more than five times as many.

i downloaded five sounds each from the searches for ‘automatic’ and ‘manual’, and brought them into the bitwig project. i added various effects to make the sounds into idm-ish clicks and cuts. i also had to automate the volume on those tracks due to the effects being a bit unruly and continuing to output sound long after the samples had finished.

overall i’m pretty happy with how this piece came out, even if i might not have adhered exactly to the prompt. as usual, title from word association and artwork from nightcafe studio.

i feel like both the orb composer output and the processed automatic/manual samples could form the basis of their own tracks, so i might explore that at some point. or not.


In March 2020, Korg temporarily made their iKaossilator app free to download. Although I acquired the app during that time frame, I never used it on a recording until now. Within the app, there is an option to toggle the auto-play feature; this fulfilled the automation portion of the prompt. To supplement the recording, I played two other apps in real-time, namely Synth One and SpaceCraft. The title has no particular meaning and was simply the first thing that came to mind.

Instruments: iKaossilator App, Synth One App, SpaceCraft App
Plugins: 2-C Kaleidoscope, CromoVerb, Valhalla Uber Mod


I’m currently in the process of learning how to use a Torso T-1 generative sequencer, so I figured this project would be a good one to practice on. I usually like to utilize a combination of automatic and manual processes when creating music. But in this case, I let the hardware and software automatically generate almost all of the notes, patterns, parameters, etc.

I used the T-1 to generate the Ableton Live wavetable mallet melody, and used two instances of Max for Live Patter devices with Slate+Ash Choreographs to create additional sequences. I can get a little too granular and manipulative on many of my tracks, so it was refreshing to go on autopilot for most of this journey.


Manual filter control. Normally, I would automate the filter frequency. In this track there is a Shakti feedback module generating feedback via a Moog-style ladder filter and a reverb. Then the signal is split into several paths, one of which is passed through a Doepfer Wasp filter. The frequency is modulated by an ultrasound detector and actuated by my hand moving nearer and further away. There is another ultrasonic filter feeding into a patch that provides an off-kilter clock driving some envelopes that modulate the other signal paths via a VCA and an LPG. All this was then recorded into Ableton and further manipulated.


I had a sound installation last week and I recorded some of it. So I took some of the recordings and ran them through Paulstretch, then put them in FL Studio and ran them through delays and loops. Then I put the WAV file in Audacity, slowed it down a bit, and ran it backwards. Et voilà!
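The last two steps of that chain — slow the file down a bit, then reverse it — can be sketched in a few lines of Python. This assumes the `soundfile` package and a hypothetical file name; it is only an illustration of the slow-down-and-reverse move, not the poster’s actual Paulstretch/FL Studio/Audacity workflow.

```python
# Tape-style slow-down plus reversal of a WAV file. File name, slow-down
# factor, and the soundfile-based approach are assumptions for illustration.
import soundfile as sf

data, rate = sf.read("installation.wav")

# Writing the same samples at a lower sample rate makes playback slower
# (and lower in pitch), like slowing down a tape.
slow_rate = int(rate * 0.8)

# Reverse by flipping the sample array end to front.
reversed_data = data[::-1]

sf.write("installation_slow_reversed.wav", reversed_data, slow_rate)
```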


Disquiet0589
Autonomic Rollick
• Key: Bb Lydian / F Lydian / Eb Lydian
• BPM: 121
• Time signature: 4/4
• DAW: Reaper
• Instruments: Drums, Bass, Synths, Guitars, Cello
• Plug-ins: iZotope, Captain Plugins Epic
• IDEA: Automate the entire song
• PROCESS: Used Captain Plugins Epic to compose all the parts
• RESULT: 4 tracks of instruments, with all sounds and composing done with and by The Captain, with some final decisions made by me


When thinking of things I usually automate when I make music, the first thing that came to mind is volume. I often fade sounds in and out and do so on a straight line. I know it is not that interesting, but it usually works for what I am trying to do.

While thinking about this, I was listening to a rock song with a specific drum sound I love - straight eighth notes on the kick and two crashes. Then, I thought I would create a track for this prompt that is just a build on the kick and crashes with distorted sounds over top. The layers I created are nothing sophisticated and I tried to limit myself to just a handful of guitar-like sounds to mimic a rock band. I used Ableton’s audio-to-MIDI function to create a single-note bass line that matches the kick.

Then, rather than fade the volume in using automation, I drew in the automation using a straight line - which might not technically be “automation” but it is something like that and I usually do not do it. I did the same thing for the lead guitar/synth sound and then also automated the send to the delay return track - another thing I usually do not automate. I still faded out all of the sounds at the end, but this was an interesting experiment that gave me some ideas I can incorporate if I am ever composing a build-up like this in a song.
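The straight-line fade being described is, in signal terms, just a linear gain ramp multiplied into the audio over the build-up. Here is a small sketch of that idea — the fade length, sample rate, and the NumPy-based approach are assumptions for illustration, not the Ableton automation itself.

```python
# Apply a straight-line (linear) fade-in to the start of an audio signal.
import numpy as np

def linear_fade_in(audio, sample_rate, fade_seconds):
    """Ramp the gain from 0 to 1 over the first fade_seconds of audio."""
    fade_samples = min(int(fade_seconds * sample_rate), len(audio))
    gain = np.ones(len(audio))
    gain[:fade_samples] = np.linspace(0.0, 1.0, fade_samples)
    return audio * gain

# Example: fade a 4-second test tone in over its first 3 seconds.
rate = 44100
tone = np.sin(2 * np.pi * 110 * np.arange(4 * rate) / rate)
faded = linear_fade_in(tone, rate, fade_seconds=3.0)
```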
