Disquiet Junto Project 0477: Flying Blind

I was looking forward to this Disquiet Junto Project since I really enjoyed taking part for the first time last week.
My approach for Flying Blind was

  1. Choose the sounds blind
  2. Play the music blind

For the selection of sounds I chose the following method: in Cubase’s MediaBay I searched for all content containing the word “flying” or the word “blind” in its name. Since nothing contained the word “blind”, I shortened the search term to “blin”.
This gave 9 synth sounds, 2 samples and 6 effects channels (4 Amp Racks for Flying V…). Surprisingly, they were all electronic sounds or noise.
In the track I used all but 1 synth sound, 1 sample and 2 effect channels.
I only changed two sounds: For the “bass-drum” I shortened the attack, and for the “snare” I reduced release and delay.
No other sounds were used.

First I played the pad harmonies (without looking at the keys), then the lead melody and then the additional sounds (all without looking at the keyboard or the screen).
Then I added the “drums” and additional sounds, doubled the lead sound, added effects, EQs, automation…
I rearranged the resulting score adding an intro, repeating the first beats and copying the first part to the end.

Choosing sounds blind was a successful approach for quickly finding inspiration!


In 1997 under cover of the college darkroom darkness I first held my now-wife’s hand while others were oblivious. In 1997 I also spent a good deal of time with friends making loud noises that occasionally touched on music. Those were good times. Today, in 2021, I took a random cassette from that era, plugged the boombox into a darkroom timer in my now much-too-small basement closet darkroom, turned off the lights, and flipped the switch. These are still good times!


Yes those were good times! I was trying to figure out if I could recognize the sounds from a session I was at, but I don’t think I was there… it certainly brought me back!


Boy, this was a fun assignment! Don’t know about you, but it felt great to close my eyes and just start to play. But first I learnt that it’s impossible to patch and play a three-row Modular blindfolded. There’s too much tactile information, too many knobs, and you get tangled in the cables.

So I turned to the Cocoquantus for a gritty base pad sound. Finding knobs and patchpoints was much easier with this instrument.

Next step was an Easel Command, which arrived just two weeks ago. I was quite nervous while playing it (I still have a lot to learn), but actually, it was easier to wiggle than I thought. And it just felt great.

Last step was the Moog Matriarch and its lovely delay. Again, I patched with my eyes closed, corrected a misplaced connection and started playing a bass, some chords and just some weird delay and filter sounds.

Threw everything together in Ableton, pre-mastered in Ozone. Time spent: 3 hours.


I used the app Monovista on my iPad Air, running it into my Empress Echosystem on “blue tape”, and then into my Empress Reverb on “green ghost”. I clicked record on my Tascam DR-05 and monitored using Grado headphones.

Turning around in my desk chair, I blindly reached backwards and swiped the iPad’s screen 5 or 6 times during the recording.

What can I say? That turned out excellent.

Got real lucky at the end with the rhythm part doing a bit of a ‘double time’.


I have numerous unmarked tapes containing similar sounds that were way more enjoyable to make than to listen to now (although listening now is still pretty fun!). If not in this particular minute of memory, your magnetic legacy definitely lives on somewhere in the ‘archive’.


It took me a while to figure out how to do this… Any instrument that I can play, I can play in the dark. The modular has too many blinkin’ lights.
I decided on pedal steel, since it’s played mostly by ear with some visual cues. I played 3 tracks in the dark. I still had to go with a blindfold because of the EBow light. The only thing to respond to was the previous track.

The Nuclear Winter title comes from the dark, the cold, and the fretboard symbols on the 1974 Emmons Push Pull steel.


That was a lovely blue vibe!


This is by far the most interesting Disquiet Junto experience I’ve had so far: the prompt made me take my most creative approach yet, while the resulting track is my least favorite submission to a prompt. My takeaway is a reminder to focus on the journey, not the destination.

I had no idea how to approach this blindly, since I usually work from my modular, so I decided to make the blindness occur in the DAW in two steps. Step 1: I recorded three different looping melodies into three tracks with no monitoring, via Grid/Kria/Ansible into Mangrove. For each track I randomly changed the notes of the sequence prior to recording, and while recording I played the volume, fading the track in and out. Step 2: I ran each track through Virta and blindly chose a preset, followed by a random preset in a Valhalla reverb (Shimmer and Supermassive) with knob-tweaking.

I adjusted the panning, bounced it to SoundCloud, and didn’t listen to the track until it posted… I’m not really feeling it, because one of those presets was an incessant drone… It felt like it wouldn’t be in the spirit of how I approached the prompt to take it down and change that preset, so I have a somewhat grating drone with a few small peeks at something other than a wall of drone… Enjoy?


I made a video explanation of my track “Swipe Right Monovista (disquiet0477)”. I duff a few things in this video: I didn’t make 0476 yesterday, but last week, and “aplomb” is spelled without an “e”. I was barely awake when I made this video:


Drone structure. Real-time recording made in Pure Data, with partially random sequences.
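The Pd patch itself isn’t shown, but the idea of a partially random sequence can be sketched outside Pure Data. This Python analogue (the function name, note pool and probabilities are my own assumptions, not the author’s patch) fixes the scale and length while randomizing each step, either holding the previous note or jumping to a new one:

```python
import random

# Illustrative sketch only -- not the author's actual Pd patch.
SCALE = [57, 60, 62, 64, 67, 69]  # hypothetical MIDI note pool

def partially_random_sequence(length, hold_prob=0.6, seed=None):
    """Build a sequence where each step either holds or jumps randomly."""
    rng = random.Random(seed)
    seq = [rng.choice(SCALE)]
    for _ in range(length - 1):
        if rng.random() < hold_prob:
            seq.append(seq[-1])            # hold the previous note
        else:
            seq.append(rng.choice(SCALE))  # jump to a random scale degree
    return seq
```

The `hold_prob` parameter is what makes it “partially” rather than fully random: structure (scale, length, tendency to repeat) is fixed, while the note choices drift.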


I was doing something else on my iPad, sitting outside in the sun this afternoon, when the nearby bells belled. I suddenly remembered a sentence I had read just yesterday: that natural idiophones were the best basis for granular synthesis. I managed to record the already fading bells with the iPad and loaded copies of the samples into the Samplr app. It’s three tracks with the original sample (I mixed the unchanged version into the beginning of the track), two of them very normal, one very high and unrecognisable as such (the “stabs”). These settings and the recording in AudioShare I did by looking; then I flew blind, which is not impossible, as Samplr gives you a defined area to touch (though you need to look to stop the machine, or to stop tracks with two fingers).

But as the iPad surface doesn’t give any haptic feedback, a lot of things went wrong anyway. This track uses around 5 minutes that sounded pretty good to me, but maybe that’s just because I like church bells… Delays and reverbs are from Samplr; mastering was done in Reaper, with eyes open.


I set up a stereo pair of mics and a few instruments on a rolling desk in front of me, set up a looper in my DAW, hit record, covered my face by wearing a hoodie backwards, and started playing these instruments with my hands while listening to the loop build up in my headphones. The instruments I used were:
-steel tongue drum
-empty gallon-sized plastic jug
-children’s metallophone

You can also hear my children playing and talking, which was not entirely intentional, but kind of neat. They get worked into the loops and it turns into a different experience entirely. After recording, I chopped the performance into four parts (one for each instrument) and processed and arranged them separately. This is the result.


I’ve actually been working on a ‘blind’ setup this last year for my use of the laptop. If we ever get to play live gigs again, I’d like to use pure data, emissions control etc but with the laptop off to one side, with my midi controller as the only interface. I want to rely on the sound, and my own knowledge of the workings of the patch, rather than looking at the screen. Anyway, although this lines up with this week’s prompt in some ways, it didn’t seem in the spirit of the thing as I wouldn’t be doing something different to usual.

Instead I went to my folders of phone recordings and picked out three samples with my eyes closed (one each from 2018, 2019 and 2020). Then I fed them into Joo Won Park’s mouse-controlled granulator, and ‘performed’ the samples with mouse gestures without listening. I took that recording, cut it into three parts, and then fed that into another of Park’s systems, which automatically layered the parts. All pretty blind, in the same way that when rummaging around in a bucket of Lego you don’t know what bricks you’ll pull out.

I only ‘cheated’ by trimming out sudden volume spikes, and by running the last stage a few times with different parameters to get a result I liked better. Not totally rigorous then, but mostly flying very much blind: I’m still not sure what the samples are, apart from the one labelled woodpecker.wav…
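The Lego-bucket sampling step above could be sketched in code. This is an illustration only, not the actual workflow or folder layout (the function name, year folders and `.wav` extension are assumptions): pick one recording per year without previewing any of them.

```python
import random
from pathlib import Path

# Illustrative sketch only: blindly pick one recording from each
# year's folder, without listening to or previewing the candidates.
def pick_blind(recordings_dir, years=("2018", "2019", "2020"), seed=None):
    rng = random.Random(seed)
    picks = []
    for year in years:
        candidates = sorted(Path(recordings_dir, year).glob("*.wav"))
        picks.append(rng.choice(candidates))  # the "eyes closed" step
    return picks
```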


I sat in the dark with my eyes closed whilst stroking the touch plate of my Plinky synth, with the output unmonitored. The audio, pitch and touch outputs were sent into a migraine-inducing spaghetti nest of switches, dividers, envelopes and shift registers triggering randomly assigned voice outputs.
I ran the resulting bounced audio file into delay and reverb to smooth out some of the jarring modulations and slowed the playback down a bit to make it less fidgety.


For this one I queued up a rumbly LFO > amplitude drone (Ripplemaker), a field recording of a walk through an empty parking garage near an office I used to work at, and a Borderlands session with a short meditation I had previously recorded. I held the grid in my left hand to do a blind grid drums jam (with Mark Eats Sequencer on laptop > AUM) along with the walk recording, adding touches of gravity-enabled grains in Borderlands throughout with my right hand. I had the iPad balanced on my leg so the grain generators would naturally fall offscreen while I moved around.

I recorded about 6–7 takes of this weird device-balancing act with my eyes closed before getting one I liked, but I think the last one came alive and captured the consistent uneasiness I would feel while walking through this particular cricket-filled Beverly Hills parking structure to work at that point in time (late 2018).

This was definitely an interesting challenge! I’m always looking at a screen – even when making sound – so it was a nice reminder to consider different kinds of gestures.


I wanted to explore the concept of improvisation as my “flying blind” interpretation for this week. For this reason, I programmed a sequence in Ornament & Crime, then modulated a couple of the characteristics (sequence length & direction) using Arc Cycles. This meant that I wouldn’t be able to second-guess what the sequence would do, although I knew the melodic content would be in A Pentatonic Major.

This sequence was sent to a sample & hold module, with a trigger being sent once per bar to allow a bassline to follow the melody at this trigger point, adding a further element of unpredictability to what I would be playing over.

The improvised element played blind on top is electric guitar. I sent both this (& the main sequence, for which I used a Mutable Instruments Plaits in granular formant mode) into my new Mutable Beads module, which processed the instruments in “scorched tape” mode.

Bassline provided by a Moog Mother 32, folded by Instruo tanh3.

Rhythmic sequence is an old 606 loop I made processed by Qu Bit’s Data Bender.


I was excited to do this in EarSketch, which I had only heard about last night. EarSketch programs are written in Python, and I was pretty sure I couldn’t code with my eyes closed – I would probably get a lot of errors. (Although I do actually know a blind programmer.) I decided the next best thing would be to make parts of it random. In particular, I started with the drums and made them random:

# kick, tom, snare and cymbal are sound constants chosen earlier;
# shuffleString() randomly reorders each 16-step beat string
# ('0' = play the sound, '-' = rest)
for i in range(1, 65):
    kick_beat   = shuffleString("00000000--------")
    tom_beat    = shuffleString("0000------------")
    snare_beat  = shuffleString("000-------------")
    cymbal_beat = shuffleString("0---------------")
    makeBeat(kick, 1, i, kick_beat)
    makeBeat(tom, 2, i, tom_beat)
    makeBeat(snare, 3, i, snare_beat)
    makeBeat(cymbal, 4, i, cymbal_beat)
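For readers without EarSketch: `shuffleString` randomly reorders the characters of a beat string, so each bar keeps the same number of hits and rests but in a new pattern. A plain-Python stand-in (my own naming, assuming that behavior) would be:

```python
import random

# Plain-Python stand-in for EarSketch's shuffleString(): randomly
# permute the characters of a 16-step beat string, so the count of
# hits ('0') and rests ('-') stays the same while their order changes.
def shuffle_string(beat):
    steps = list(beat)
    random.shuffle(steps)
    return "".join(steps)
```

Because only the order changes, the kick stays twice as dense as the tom every bar, even though the placement is different on every pass through the loop.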

You can see my EarSketch here and even remix it if you like.

I used some Conet Project numbers stations transmissions.


infinitedigits: This is great! Can you tell us more about your signal chain? What electric piano, and what processing? It’s an interesting sound for sure.


Flying Blind. On soundcloud:

and on YouTube:

It’s the end of winter, 2021. We are all flying blind, here at the end (¿) of the covid times. What is next is unknowable; what is past cannot be recalled to mind. We have each created an inharmonious equilibrium from which we seek to escape. Yes; that’s it.

I first created a generative arpeggio in G on the modular, using Sloths for chaos, Rings (of course) for sound, and the 2hp Arp for notes. The arpeggio was being modulated by Sloths… you can see the patch in the video. I clocked it from the DrumBrute, which also provided the beat, further augmented with the SSF Entity Bass Drum module. The DrumBrute passed through an 1176 on the way in. Having got that noodling away generatively to my liking, I then added the main event: a fingerstyle electric guitar ditty on my Gretsch/Mesa V:25/H9 (tape delay algo) in the loop.

The modular and drums were one stereo track which got some limiting and drum bus treatment in Live; the guitar got pretty much nothing.

I composed this in 20 minutes or so while blindfolded, and then did one take while a kind person held the camera. Then I struggled EYES VERY MUCH OPEN to align the audio and video in iMovie.

Several clams in there, but so it goes. Being blindfolded, listening to the modular make its way into the darkness while finding a fit, was really nice. Recording blind was fussy, though.


Hi all :slight_smile:

Finding this project motivated me to make an account and a track to share. Gear used: Nord Drum, Deluge, Peak.

My initial thought when I read the prompt was to make a track as fast as possible – I don’t know why I interpreted blind as very quickly. I made this track in the time it took to warm up a muffin and make some tea.

I wanted to use audio tracks only (MIDI is too tempting to edit) and to perform each track as a blind improv. I started with the Nord Drum as I just got it (!!), then an FM track on the Deluge, and finally the noise/lead from the Peak. It was a struggle to keep my eyes closed during each take – when I caught myself cheating too much, I would start over. I think going blind helped me use fewer notes and take a more varied approach to the composition.

Lastly, I “bounced” the song into a master audio track on the Deluge, blindly performing automation on each track along the way (that accounts for the FM craziness about 35 seconds in).

Happy to have my creativity pushed and to join this music community :blush: Some of the other posts are very inspiring in their process; looking forward to listening and learning.

Cheers :honeybee: