Disquiet Junto Project 0424: Fluctuating Rhythm

Oops! Strayed quite a way from the brief on this one!

OK … the original idea was to use weather data to ‘conduct’ the tempo on a piece … but I ended up using weather data from https://rp5.ru in TwoTone [app.twotone.io] to generate some ‘music’ which I then layered, morphed and messed with …

Have a great week! h u 🙂

3 Likes

Non-submission (with acknowledgment to @DetritusTabuIII!), as I’ve applied the technique shared by Brian Crabtree in project 223 and layered all the takes – including the interruption at the end.

4 Likes

I tidied this up into a slightly more formalized statement. Thanks for the prompt to write about the prompts.

https://disquiet.com/2020/02/14/reverse-engineering-musical-composition-prompts/

3 Likes

Hey All, I started by using some nature sounds from Freesound (including one from the marvelous uploader klankbeeld), since I don’t really have the technology to record outside, plus it is freaking cold out there with the wind blowing. I focused on the performance aspect, improvised the parts, and tried to accompany the field recording tracks while keeping it kinda flowing, tempo-wise. There is a great freedom in getting away from tempo and the notes of a melody, but I hope it retains something.

Peace, Hugh

4 Likes

After reading “Philosophy Is a Public Service” by Jonathon Keats, I decided I had to do something with trees. So I wrote a program in ChucK to generate tones from a fractal tree. These trees grow exponentially, so my processor could only handle 3 generations of tree growth.

I’m too tired to explain in plain English how I did this, so I’ll just include the code in-line. It’s quite short and could even be a little shorter if I spent more time on it. Full code is here.

// Bracketed L-system: 0 and 1 are notes to play; -2 and -1 play the role of the '[' and ']' branch symbols.
[0] @=> int axiom[];

// Rewrite one generation: 0 -> 1 -2 0 -1 0, 1 -> 1 1; other symbols are copied through unchanged.
fun int[] apply_rules(int tree[]) {
    get_next_gen_length(tree) => int next_gen_length;
    int next_gen[next_gen_length];
    0 => int j;
    for (0 => int i; i < tree.cap(); i++) {
        tree[i] => int symbol;
        if (symbol == 0) {
            1 => next_gen[j];
            -2 => next_gen[j + 1];
            0 => next_gen[j + 2];
            -1 => next_gen[j + 3];
            0 => next_gen[j + 4];
            j + 5 => j;
        } else if (symbol == 1) {
            1 => next_gen[j];
            1 => next_gen[j + 1];
            j + 2 => j;
        } else {
            symbol => next_gen[j];
            j + 1 => j;
        }
    }
    
    return next_gen;
}

// Pre-compute the length of the next generation so its array can be sized up front.
fun int get_next_gen_length(int tree[]) {
    0 => int len;
    for (0 => int i; i < tree.cap(); i++) {
        tree[i] => int symbol;
        if (symbol == 0) {
            len + 5 => len;
        } else if (symbol == 1) {
            len + 2 => len;
        } else {
            len + 1 => len;
        }
    }
    return len;
}
    
// Play the string: 0 sounds the current note, 1 an octave above it; -2 nudges the pitch and filter down, -1 nudges them back up.
fun void sound_notes(int tree[]) {
    60 => int note;
    .5 / tree.cap() => float gain;
    .5 => float param;
    for (0 => int i; i < tree.cap(); i++) {
        if (tree[i] == 0) {
            Moog moog => dac;
            note => Std.mtof => moog.freq;
            0.8 => moog.noteOn;
            gain => moog.volume;
            param => moog.filterQ;
            <<< "volume:", moog.volume() >>>;
            <<< "filterQ", moog.filterQ() >>>;
        } else if (tree[i] == 1) {
            Moog moog => dac;
            note + 12 => Std.mtof => moog.freq;
            0.8 => moog.noteOn;
            gain => moog.volume;
            param => moog.filterQ;
            <<< "volume:", moog.volume() >>>;
            <<< "filterQ", moog.filterQ() >>>;
        } else if (tree[i] == -2) {
            note - 1 => note;
            param / 2.0 => param;
        } else {
            note + 1 => note;
            (param + 1.0) / 2.0 => param;
        }
    }

    5::second => now;
}

sound_notes(axiom);

apply_rules(axiom) @=> int tree[];
sound_notes(tree);

apply_rules(tree) @=> tree;
sound_notes(tree);

apply_rules(tree) @=> tree;
sound_notes(tree);

4 Likes

Last week when the Hold Music email arrived, it occurred to me that there have been a disproportionate number of Junto projects about telephones.

1 Like

With apologies to Franz, Peter, & Sviatoslav; Marc & Jonathon; and Listeners Like You (especially if you have perfect pitch), I proudlyish present “Schubert in a Tube”.

Yesterday, I recorded snowmelt flowing through a plastic culvert under the road by my house.

Like so.

I then high-passed & gated this recording to minimize the general rauschen of the water and emphasize the intermittent higher blips and blops, and fed this stream (lol) to two copies of BeatTrack.kr in SuperCollider (one for each stereo channel). I used their estimated tempi to control the speeds of four copies each of Peter Schreier and Sviatoslav Richter performing Franz Schubert’s “Wasserflut” from the song cycle Winterreise:

Take it away, Wikipedia

“Wasserflut” (“Flood”):
The cold snow thirstily sucks up his tears; when the warm winds blow, the snow and ice will melt, and the brook will carry them through the town to where his sweetheart lives.

BeatTrack has “biases to 100-120 bpm”, but the performance, although very fluid (lol) in tempo, is more like 60-70 bpm, so I scaled it by half. Probably BeatTrack2 would be better but I don’t know how to use it (which features, etc). The tempi are then smoothed out using VarLag with various parameters.

Finally, the performances are panned (lol) & mixed back together with the original recording of the snowmelt and also (softly) the high-passed/gated version.

The SuperCollider code is on my GitHub although there are probably better ways to do every aspect of that.

4 Likes

This was a really great project for me. I haven’t played my bass in months, not having much of a reason to. This project came into my inbox and it seemed like something I could do without really worrying about whether the results were good enough for one person or another. Anyway, I went outside hoping to hear the birds that are usually chirping, but they were silent. Instead I turned from the tree and saw the slowly moving clouds and synchronized with them. The clouds soon revealed the sun; maybe you can hear that part. Eventually the clouds covered the sun back up and the birds I was expecting began to join in.

6 Likes

This is true. An obsession.

1 Like

Judging by your Twitter posts, you spend some time on conference calls.

When I interviewed you last, I’d been thinking about how many Junto projects capture a variety of everyday activities and give insight into the diversity of the community.

Those projects reveal participants’ environments and it can be surprising what we learn about each other.

The other thing is I sometimes look back on my recordings and get a sense of how different times of the year will lead to different moods and instrumentation.

1 Like

For this track I went down to the river, sampled the water sounds into the SpaceCraft Granular app and played with it for a bit. I left the river sounds recorded with my iPhone in the background, and I occasionally mixed them in and out of the track as I saw fit.

I also made a video for it on my YouTube channel, which you can watch here: https://www.youtube.com/watch?v=rxVQEZ3mbe0

3 Likes

@BennDeMole I like how you’ve added your voice and it’s a really atmospheric piece.

@tristan_louth_robins The environment and stereo field work well together.

@DetritusTabuIII Love the warbling but I’m sober at the moment. Can imagine this would be more visceral if I were drunk!

@tatecarson It’s great hearing the detail of the strings and the description of the birds being silent reminds me how there’s something up when that happens.

@WhiteNoise Beaut results and it’s a good effect the way the sounds from the environment sit within it. Enjoyed watching it unfold too.

3 Likes

I set up a series of sine waves running at frequencies suggested by the Fibonacci sequence. Some are running at audio frequency and some are acting as LFO modulation sources. It ended up sounding more like nature in a metallic windstorm… but I’m reasonably happy with the outcome. I have a short video that inspired this, which I’ll add later.
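
For anyone curious, the basic idea might look something like this as a ChucK sketch (just an illustration of Fibonacci-tuned sines with the slow ones used as LFOs, not the patch I actually used; the scaling and modulation depths are arbitrary):

[1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377] @=> int fib[];

// audible sines: higher Fibonacci numbers scaled into audio range
SinOsc tones[6];
for (0 => int i; i < tones.cap(); i++) {
    tones[i] => dac;
    fib[i + 8] * 2 => tones[i].freq;   // 68, 110, 178, 288, 466, 754 Hz
    0.1 => tones[i].gain;
}

// slow Fibonacci-rate sines acting as LFOs (run through blackhole so they get computed but not heard)
SinOsc lfos[6];
for (0 => int i; i < lfos.cap(); i++) {
    lfos[i] => blackhole;
    fib[i] / 10.0 => lfos[i].freq;     // 0.1 to 0.8 Hz
}

// let each LFO wobble its partner tone around its centre frequency
while (true) {
    for (0 => int i; i < tones.cap(); i++) {
        fib[i + 8] * 2 => int centre;
        centre + (lfos[i].last() * centre * 0.05) => tones[i].freq;
    }
    10::ms => now;
}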

5 Likes

‘Pipe Sentinels’ consists of recordings captured by my feedback-based installation of the same name in the gardens of a museum in south London. Placed in a small bamboo field, the installation recorded its environment, processed the audio according to shifting environmental data and played it back via speakers into the same surroundings, before being recorded again.

The recordings were edited, layered and turned into this short composition.

6 Likes

I took my trumpet outside to our shed whilst Storm Dennis was starting here in the North of England. I placed my Tascam DR-05 on the floor to record both my trumpet and the external sounds of wind and rain. I had no preconceived ideas of melody and tried to respond to the weather. I wanted it to be as ‘in the moment’ as possible and do it in one take - so excuse the noodling! I left the wind-noise distortion in towards the end (about 1 min 50 secs) - so be prepared!

5 Likes

Went a slightly different way than instructed… I took a recording of water sloshing irregularly from a drain pipe. In Ableton I then extracted the groove from it and applied it to a drum track full of artificial noises (metal scraping and banging). I played two piano melodies and applied convolution reverb to one, with the water slosh as the reverb space. Then I chopped up the water recording and had it play back via random MIDI notes, but in the same groove as the original. Finally, I added some LFOs that took their timing from the peaks of the water waveform and applied them to the reverb parameters of the other piano section.

3 Likes

I live in NYC and wasn’t sure I was going to draw much inspiration from nature in the middle of winter. However, I woke up this morning and heard some birds chirping outside my window. So, I decided to record the birds and the sound of the quiet morning chill. I made two separate recordings outside the windows of my apartment, one facing south and one facing north. The window facing the backyard caught some chatty birds, while the other side picked up some sounds of the city in the distance.

I then decided to play them over each other in Ableton. One panned left, one right, and each with their own respective “fluctuating rhythms”. I used those recordings as the bed for a simple, spacious acoustic guitar. It was my attempt to will Spring into being.

1 Like

I know it’s a cliché, but how many more times do we have to see things like this?

A patch on the Sub 37 CV, with the CV outs modifying the VCO 1 shape and cutoff on the Minilogue XD. Aftertouch mapped to LFO 1 rate on the Sub 37 and to LFO interval on the XD. The string cheese is from Klevgrand’s Hillman.

1 Like

I’ve deviated slightly from the prompt: this wasn’t performed outdoors and the composition was created as a result of the process rather than beforehand.

I videoed a nikau palm in the work carpark, then took 1024 frames and motion-tracked a frond and also a distant tree swaying in what was a fairly stiff breeze. The next step was to take the motion-tracking information and convert it into Ableton Live automation envelopes. This was more involved than the drag-and-drop I’d hoped for: it meant writing a Go program to extract the datapoints from Apple Motion’s XML format, then a Max4Live patch to allow the resultant data to be recorded as automation. I did this for both the x and y coordinates of the palm frond and the tree.

These envelopes were then used in various ways. I set up an instance of HATEFISh RhyGenerator with 32 steps and used the automation envelopes to control the number of beats within the 32 steps, the speed, and the MIDI notes sent. The MIDI information was consumed by two Ableton Tension devices, one of which was arpeggiated after being scaled to my usual Cm.

At the same time, a Quanta granulator was being modulated by the same envelopes, using a recording of cicadas I made on Saturday at our local beach yoga session.

3 Likes

I taped myself having a little jam in the backyard. It’s based on a little tune I’ve had stuck in my head. I’d like to think I came up with it but more likely it’s just a mash up of ideas I’ve heard before. The wind and the pigeons and the plant life made the moment quite tranquil. So tranquil that I forgot to tune my guitar.

Also it’s about time I picked those chillis!

3 Likes