Latest tracks + videos

llllllll

#2131

Recorded a short record this weekend. It’s the first time I’ve gotten round to recording w/ w/. All songs feature w/ in some form. Sometimes as a delay, sometimes as a playback device.

The songs work best when they mix with your surroundings. Allow them to be part of the atmosphere and they will try to complement it.


#2132

Really beautiful! Great selection of sound sources. Very calming and peaceful :herb:


#2133

Listened to this earlier today. Had to come back for a second helping. Really lovely sounds.


#2134

Spent some time diving into Just Type for the first time today - so, so great. Here’s a little thing – two layers of Just Type, with a bit of background work from Mangrove.


#2136

Thanks for the kind words @plym @waynerowand !

@Olivier that is beautiful work. What is making the mallet type tones (they almost sound polyphonic)?

This forum has opened my eyes to Teletype (esp. integration with Mannequins). Did you find it easy to get to grips with?


#2137

Short cassette tape loop track. Dual Chronoblob and Sisters with Rings and Elements.

Thanks to you all for the endless inspiration x


#2138

Thanks, @Ithacus! In the piece above, I’m using an alternative mode for Just Friends (accessible only via TT) called Synthesis – it turns the module into a polyphonic synthesizer where you can address each individual channel. Incredible.

Teletype has been a huge eye-opener for me. I’ve only scratched the surface, but it’s been easy to get the hang of it so far.
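The appeal of that Synthesis mode is the two ways you can talk to the voices: fire notes at the bank and let it allocate them, or pin a pitch to one specific channel. Here's a toy Python model of that idea; the channel count mirrors Just Friends' six voices, and the method names nod at the Teletype ops, but none of this is the real Teletype/JF interface:

```python
# Hypothetical sketch: round-robin note allocation vs. addressing one
# channel directly. Not the actual Teletype ops, just the concept.

class VoiceBank:
    def __init__(self, channels=6):
        self.channels = [None] * channels  # pitch held by each channel
        self._next = 0                     # round-robin pointer

    def note(self, pitch):
        """Allocate the next channel round-robin (in the spirit of JF.NOTE)."""
        ch = self._next
        self.channels[ch] = pitch
        self._next = (self._next + 1) % len(self.channels)
        return ch

    def vox(self, ch, pitch):
        """Address one channel directly (in the spirit of JF.VOX)."""
        self.channels[ch] = pitch

bank = VoiceBank()
bank.note(60)    # lands on channel 0
bank.note(64)    # channel 1
bank.vox(5, 48)  # pin a bass note to the last channel
```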


#2139

Flute and modular sesh, performed for the dust bunnies in my bedroom and recorded ‘live’. Primarily morphagene and jupiter storm. It changes up a bit in the middle.


#2140

Warm noise, echoes, hidden sounds and sweet bass.


#2141

First rehearsal session recording for a set to be performed in collaboration with dancers Amanda Maraist and Kara Brody on 28 April 2018 at High Concept Labs. Amanda’s partner also filmed throughout the rehearsal session so there may be a video edit to post soon.

https://caelmore.bandcamp.com/album/rehearsal-15042018


#2142

My new drone album, which almost turned into dark techno but went back to lurking, is now released. :slight_smile:


#2145

Finally got things sorted using the i2c bus with Telex, Ansibles, and the ER-301.


#2146

newly designed monome looper running on organelle, fender rhodes, 0-coast, wine glass processed by morphagene, + 30 second cassette loop.


#2147

Wow… Really good…20 characters


#2148

thanks for listening!


#2149

Found this on a memory card. An unintentional Gas rip-off. I like it :slight_smile:


#2150

Experimenting with Just Type in polysynth mode, with three-note chords randomly selected from three per-note patterns. Scripts can change individual notes in the chord or the full chord.
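For anyone curious what that chord logic looks like, here's a rough Python model of it: one pattern bank per chord voice, a full chord drawn one value from each bank, and a way to re-roll a single voice. The pattern contents are made up for illustration; on the hardware this would live in Teletype pattern ops.

```python
import random

# Toy model of the chord selection described above. Pattern values are
# invented (semitone offsets), one pattern per chord voice.
patterns = [
    [0, 2, 4],     # candidate pitches for voice 1
    [7, 9, 11],    # voice 2
    [12, 14, 16],  # voice 3
]

def new_chord():
    """Pick a full three-note chord, one value from each pattern."""
    return [random.choice(p) for p in patterns]

def change_note(chord, voice):
    """Re-roll a single voice of an existing chord."""
    chord[voice] = random.choice(patterns[voice])
    return chord

chord = new_chord()
chord = change_note(chord, 1)  # only the middle voice changes
```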

Peace :slight_smile:


#2151

Not a proper video, but a quick test I made with my Sensory Percussion trigger sending MIDI over to Max and then using a generative synth patch (based on the ciat-lonbarde Fourses, specifically this gen-based implementation of it) along with some BEAP modules for Filter/ADR/VCA.

Position on the drum (center-to-edge) controls filter cutoff, dynamics controls amplitude, and hitting the rim of the drum generates a new/random synth setting.

I have to say that it works pretty well, very sensitive/dynamic. The “Timbre” controller is a bit more crude than I thought (in terms of determining position on the drum), but it’s still crazy good that it can do that through audio alone.

In doing some further tests I also determined that using the SP triggering takes (on average) 11ms longer than my native onset detection algorithm (if I don’t care about velocity), which makes sense. It takes actual time to do the fancy-pants machine learning stuff it’s doing. That being said, it is then 4ms faster than my onset detection that can determine velocity of the strike too, so that’s good.

And in doing all that I also made a native Max UI for the MIDI data that matches the Sensory Percussion (as much as possible without circles/curves).

edit:
Not worth making another post, but did a test with sample playback too:

Position on the drum (center-to-edge) controls the position in the sample to play and dynamics controls both amplitude and playback duration.
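Same idea for the sample-playback version, again with invented units rather than the real patch: position picks a start point in the sample, and one velocity value drives both loudness and grain length.

```python
def strike_to_playback(pos, vel, sample_len_ms=2000.0, max_dur_ms=500.0):
    """Map position (0..1) to a start point in the sample and velocity
    (0..127) to both amplitude and playback duration."""
    start = min(max(pos, 0.0), 1.0) * sample_len_ms
    amp = min(max(vel, 0), 127) / 127.0
    dur = amp * max_dur_ms
    return start, amp, dur
```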


#2152

Cool! I feel like this sensor would be very neat paired with a granular processor processing the audio of the drum. I saw Son Lux in concert and I thought Ian Chang was doing this at some point but there’s no way of knowing for sure. I know he was triggering samples with it though!


#2153

He definitely uses the same trigger/system, though through their native sampler (which interests me less).

I plan on doing some stuff where I work with audio sampling/processing based on the triggering stuff too, this was just an early test of communication between the apps and stuff.