The project is now live.
The Gibson/Zorn Band is
Detritus Tabu3 - Trombone
Hans Zimmer - Sound FX
Manny Manischewitz - Photography
a duet (a brawl?) between piano loops and processed noise. Who wins?
Coded and performed in ChucK; I’ll post the code on my GitHub later.
This is a very nice idea. I’ve been obsessed with decomposing and deconstructing a melody into distinct mini fragments for a year or two now, but one second? Great.
Looking forward to the other Junto tracks.
The kind of challenge and limitation that kick-starts the imagination. Hope I can find some time on a busy weekend…
Earlier today I’d recorded my son playing the drums and had attempted to match up a couple of loops.
One is in 6/8 while the other is in 4/4, and one had higher-fidelity sound because I’d plugged in the VideoMic.
So I turned to these recordings and it’s a thrill to collaborate with a family member on a Junto, which doesn’t happen very often.
I added distortion, as well as different EQs and compressors to each part.
The parts are each a bit longer than a second and share the same fill, but I liked the constraint of using drums for the composition.
And I also liked not having to worry about recording because it’s scorching hot today.
Inspired by the Feedback February thread, and especially @simondemeule’s earlier post here, I’d wanted to try no-input feedback. After thinking about this prompt a bit, I wanted to see whether I could create distinct A and B lines from a fairly simple feedback loop, with an LFO shuttling it between two different dynamical regimes. I’m not sure I succeeded, but I started to feel like I might be onto something before I ran out of time/energy/interest, so here it is.
I can’t seem to upload a screenshot (if anyone cared) but: The main effects are EQ and Flanger. There’s a square-wave LFO that inverts the polarity of the Flanger’s internal feedback and also flips the 198 Hz and 2.10 kHz EQ bands between boost/cut and cut/boost. There’s an Envelope Follower that negatively controls both the Flanger’s feedback and the return track’s send level back into itself. Finally, the Utility reduces the stereo width slightly and monos it completely below 98 Hz; this keeps the system out of the regime where one channel is silent.
There’s also a Limiter on the Master track. The only automation is the Flanger’s Dry/Wet control. A single-sample impulse sets the thing in motion.
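For anyone curious about the general mechanism rather than the Live rack itself, here is a minimal Python sketch of the core idea: a short feedback delay whose gain an envelope follower pulls down as the level rises, with a square LFO flipping the feedback polarity. All the numbers (LFO rate, follower smoothing, delay length, gain) are my own guesses, not the actual device settings.

```python
import math

SR = 44100
LFO_HZ = 0.5            # assumed square-LFO rate (not stated in the post)
FOLLOW = 0.999          # one-pole envelope-follower smoothing coefficient
BASE_FB = 1.02          # slightly unstable, so the follower must rein it in

def process(n_samples):
    """Toy mono model: a single-sample impulse feeds a feedback delay
    whose gain is ducked as the envelope follower's level rises, while
    a square LFO inverts the feedback polarity."""
    delay = [0.0] * 64          # stand-in for the Flanger's short delay line
    idx = 0
    env = 0.0
    out = []
    x = 1.0                     # the single-sample impulse that starts it
    for n in range(n_samples):
        # square LFO: +1 or -1, flipping the feedback polarity
        lfo = 1.0 if math.sin(2 * math.pi * LFO_HZ * n / SR) >= 0 else -1.0
        # envelope follower *negatively* controls the feedback amount
        fb_gain = lfo * BASE_FB * max(0.0, 1.0 - env)
        y = x + fb_gain * delay[idx]
        delay[idx] = y
        idx = (idx + 1) % len(delay)
        env = FOLLOW * env + (1 - FOLLOW) * abs(y)
        out.append(y)
        x = 0.0                 # impulse only on the first sample
    return out

sig = process(SR // 10)
```

The self-regulating part is `max(0.0, 1.0 - env)`: when the level grows, the follower chokes the loop gain below unity, so the system hovers near the edge of runaway instead of blowing up, which is roughly what the send-level ducking in the post accomplishes.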
Eclipse is my contribution to this week’s Junto. I have been exploring different parameters of the algorithm I used to produce Moire in Sound for Junto 0409, and I thought one such experiment would work well for this challenge. So, again using SuperCollider, I created two sequences, A and B, which you can hear separately at the beginning of the piece (representing the two sigils), and steadily decreased the time between the initiation of each sequence so that they begin to overlap and, by the end of the piece, are essentially intercalated. It is approximately the length of a Gibson chapter. Still, I am slow and have to catch up, having only just finished The Peripheral.
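The poster worked in SuperCollider; as a language-agnostic illustration of the scheduling idea, here is a tiny Python sketch in which B’s start time trails A’s by a gap that shrinks linearly to zero, so the two sequences go from fully separate to sounding together. The repetition count, sequence length, and starting gap are invented numbers, not the actual parameters.

```python
def eclipse_onsets(n_reps=10, seq_len=4.0, start_gap=4.0):
    """Start times (in seconds) for two sequences, A and B. B trails A
    by a gap that shrinks linearly from start_gap to zero, so the
    sequences begin fully separate and end up overlapping."""
    onsets = []
    t = 0.0
    for i in range(n_reps):
        gap = start_gap * (1 - i / (n_reps - 1))  # start_gap down to 0
        onsets.append(("A", round(t, 3)))
        onsets.append(("B", round(t + gap, 3)))
        t += seq_len
    return onsets

onsets = eclipse_onsets()
```

With `start_gap` equal to `seq_len`, the first B begins exactly where the first A ends (heard separately), and by the final repetition the two start simultaneously.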
This is the sort of challenge I could spend years exploring. There is so much one could do… which is perhaps why I chose the simple option of a short jam with an EHX Space Drum, guitar, and some FX pedals, all sampled and mangled with the Octatrack and then mixed in Ableton.
The tempo is set to 60 bpm, which gave me two parts of one second each. For the final composition I turned different layers of the two parts on and off.
Been making 15-second videos; using this chance to make a long-form song (a minute).
orca, Ardour, Andes, u-he TyrellN6, Yoshimi, Laura’s vocals, Linux, jack_capture
This project comes at a great time for me. I’d been reading a book from Ableton, and the chapter on finding micro-rhythms in small snippets of sound interested me; I’d already made a mental note to try it out. I also learnt the word pareidolia from the book: humans’ innate tendency to find patterns in nature.
And just this morning a friend linked me some music from Oval, which to my ears sounds like lots of tiny snippets of music put together to create a wall of sound that finds a macro-rhythm out of the various micro bits.
So this was a great opportunity for me to try out this glitchy and destroyed pop music. As today is the day the UK is leaving the EU (8 minutes!) I decided to use as my source materials some music from the current singles chart as I figured that by some metric this must be the music most people are soundtracking / hearing in their final day as part of the bloc.
The songs I used were:
Before You Go - Lewis Capaldi
Blinding Lights - The Weeknd
The Box - Roddy Ricch
Don’t Start Now - Dua Lipa
Everything I Wanted - Billie Eilish
I sliced everything up in Ableton, created a drum rack and then used my (criminally underused) MPC Studio Black to bash in the midi notes in a single take.
The description of this project basically matches the way I’ve been playing for the last couple of weeks, after discovering Spacecraft on iOS, which is essentially a granular synth with two voices; I made this track entirely with it.
I made two different sequences, changing the length and speed of both of them, and used a heavy dose of reverb and filtering to make the track sound lo-fi.
I will keep on experimenting with the app on my youtube channel at this link:
The challenge, to make very short collections of noises in a relatively prescribed A/B pattern, immediately clicked in my head with an idea. I would write three very short pieces of music: two patterns and one loop. Then I would “play” the loop, with a radically sped-up pattern serving as each note of the loop.
So, some constraints. For the fastest rendition of the patterns to work, they could not have repeating notes. So one pattern is a sort of opposing pair of melodies and the other is three rising chords. The loop has one repeated theme of 7 notes and another of 5, so the item ends up 140 “notes” long.
And then I had to execute it. About half the Disquiets have me writing some custom code, and I did so here: some Python to take a performed version of each theme as MIDI and radically speed it up. And once I was in code, I could do things like speed it up a bit less as time went on, offset the themes at the start and phase them towards not being offset at the end, and some other tricks. It’s all on my GitHub if you want.
Then I took the MIDI files into Logic, set up some Pianoteq and some reverbs and delays and such, and voilà.
In some ways I really like this, and I think it was super successful, especially as I let it open up. But in other ways it sounds exactly like the sort of mechanical formal toneless music that makes people not like “modern” music. I hope you enjoy listening and making your own decision, and I’m glad to share “3 Ideas; 5040 Notes (disquiet 0422)”
Thank you Marc for another brilliant prompt.
I am a big fan of John Zorn, including his Naked City, and have been listening to their spiritual descendants Fantomas a lot recently. This track uses short samples from my recent bandcamp album which explores ways of converting philosophical texts into sound. Funnily enough, I had titled most of the tracks “Chapter 1.1” etc. which fits nicely with the name for this week’s challenge. I played live guitar in short bursts and used “Convert Drums to New MIDI Track” to translate those bursts into the bases for some beats. I’ll definitely be using this process in my upcoming work on Beowulf.
Have a great weekend everyone!!
I picked two instruments and played two related, but distinct, lines. They were each performed as a continuous line, but then chopped into tiny bits and alternated. I then took two longer ambient recordings, a rainforest storm and a subway platform, and paired them with the other two parts, matching the cuts.
I did “extend” the idea of alternating chapters to include something somewhat like the parallel chapters down the page in Samuel R. Delany’s “Dhalgren”. In the end I’m not sure how “aesthetically distinct” the two sides are… perhaps more like “two sides of the same coin”.
Composed with Ableton Live, performed on Launchpad Pro Mk3.
Ambient samples from: https://www.freetousesounds.com/
I feel like I missed the mark on the intent here. Looking at the prompt I think the emphasis should have been placed more on tiny than on the cuts between them. I wasn’t familiar with any of the referenced material and didn’t want to look at it prematurely either. In any case, the result is short and moderately interesting.
My idea was to have two parts in polymeter and cut between them in a third meter. The core was two parts, one in 3/4 and one in 4/4, both monosynth lines. They were both 4 bars long and repeating. There is track-length modulation on both, and some delay and reverb to stitch things together a little better. The cuts were built around a 7/8 meter but change in length. It felt a bit thin, so I added a bass drone and a pad, which sort of merged into one.
The synths were a MicroFreak and a Bass Station 2 for the main lines, a Peak for the pad, and TubeSynth (a soft synth in the standalone MPC) for the drone. The sequences and audio-chop automation (adjusting levels) were done on the MPC. Delay (Tera Echo algorithm), reverb (SRV algorithm), and chorus sends were my usual pedals. Nothing very special, but I didn’t think I’d have time to look at this over the weekend, so I wanted to get something done early(ish) this week.
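As an illustration of the cutting idea (in Python rather than on the MPC; the cut count and the fixed 7/8 cut length are my simplification, since the poster’s cuts also varied in length), alternating two looping parts on a 7/8 grid could be sketched as:

```python
def cut_schedule(n_cuts=8, cut_beats=3.5):
    """Alternate between parts A and B at boundaries spaced 7 eighth
    notes (3.5 quarter-note beats) apart, while each part keeps
    looping in its own meter (3/4 and 4/4) underneath.
    Returns (part, start_beat, end_beat) tuples."""
    schedule = []
    t = 0.0
    part = "A"
    for _ in range(n_cuts):
        schedule.append((part, t, t + cut_beats))
        t += cut_beats
        part = "B" if part == "A" else "A"
    return schedule

schedule = cut_schedule()
```

Because 3.5 beats divides evenly into neither a 3/4 nor a 4/4 bar, each cut lands at a different point in both loops, which is what makes the polymeter audible.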
Love this! Any recommended resources on coding music/audio in python? I know the basics of python.
Hey all, a little remix of some of the excellent tracks from this week. Thanks to Bacon Paul, Duckpow and Non-sense Mediated Decay.
Just beep and fart noises …
Alternate gating of two distinct voices on my diy modular synthesizer. In steeee-----reoooooo. Added some reverb in Audacity.