Latest tracks + videos



Recorded a short record this weekend. It’s the first time I’ve gotten round to recording w/ W/. All songs feature W/ in some form: sometimes as a delay, sometimes as a playback device.

The songs work best when they mix with your surroundings. Allow them to be part of the atmosphere and they will try to complement it.


Really beautiful! Great selection of sound sources. Very calming and peaceful :herb:


Listened to this earlier today. Had to come back for a second helping. Really lovely sounds.


Spent some time diving into Just Type for the first time today - so, so great. Here’s a little thing – two layers of Just Type, with a bit of background work from Mangrove.


Thanks for the kind words @plym @waynerowand !

@Olivier that is beautiful work. What is making the mallet type tones (they almost sound polyphonic)?

This forum has opened my eyes to Teletype (esp. integration with Mannequins). Did you find it easy to get to grips with?


Short cassette tape loop track. Dual Chronoblob and Sisters with Rings and Elements.

Thanks to you all for the endless inspiration x


Thanks, @Ithacus! In the piece above, I’m using an alternative mode for Just Friends (accessible only via TT) called Synthesis – it turns the module into a polyphonic synthesizer where you can address each individual channel. Incredible.

Teletype has been a huge eye-opener for me. I’ve only scratched the surface, but it’s been easy to get the hang of it so far.


Flute and modular sesh, performed for the dust bunnies in my bedroom and recorded ‘live’. Primarily Morphagene and Jupiter Storm. It changes up a bit in the middle.


Warm noise, echoes, hidden sounds and sweet bass.


First rehearsal session recording for a set to be performed in collaboration with dancers Amanda Maraist and Kara Brody on 28 April 2018 at High Concept Labs. Amanda’s partner also filmed throughout the rehearsal session, so there may be a video edit to post soon.


My new drone album, which almost turned into dark techno but went back to lurking, is now released. :slight_smile:


Finally got things sorted using the i2c bus with Telex, Ansibles, and the ER-301.


newly designed monome looper running on organelle, fender rhodes, 0-coast, wine glass processed by morphagene, + 30 second cassette loop.


Wow… Really good…


thanks for listening!


Found this one on a memory card. An unintentional Gas rip-off. I like it :slight_smile:


Experimenting with Just Type in polysynth mode, with three-note chords randomly assembled from three per-note patterns. Scripts can change individual notes in the chord or the whole chord at once.
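In the abstract, the note selection works something like this rough Python sketch (the pattern contents and note numbers are invented for illustration, not taken from my actual scripts):

```python
import random

# Three hypothetical per-note patterns (MIDI note numbers).
# Each voice of the chord draws its pitch from its own pattern.
patterns = [
    [48, 50, 53, 55],  # pattern for voice 1
    [60, 62, 64, 67],  # pattern for voice 2
    [72, 74, 76, 79],  # pattern for voice 3
]

def new_chord():
    """Pick one note at random from each per-note pattern."""
    return [random.choice(p) for p in patterns]

def change_note(chord, voice):
    """Re-pick a single voice of the chord from its own pattern."""
    chord = list(chord)
    chord[voice] = random.choice(patterns[voice])
    return chord

chord = new_chord()            # a fresh three-note chord
chord = change_note(chord, 1)  # re-roll only the middle voice
```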

Peace :slight_smile:


Not a proper video, but a quick test I made with my Sensory Percussion trigger sending MIDI over to Max, then using a generative synth patch (based on a gen-based implementation of the ciat-lonbarde Fourses) along with some BEAP modules for Filter/ADR/VCA.

Position on the drum (center-to-edge) controls filter cutoff, dynamics controls amplitude, and hitting the rim of the drum generates a new/random synth setting.
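As a rough sketch of that mapping (the CC ranges, the cutoff range, and the dict standing in for the patch state are all assumptions, not the actual Max patch):

```python
import random

def scale(value, lo, hi):
    """Map a 0-127 MIDI value onto the range [lo, hi]."""
    return lo + (value / 127.0) * (hi - lo)

def handle_hit(position, velocity, is_rim, synth):
    """Apply the drum-to-synth mapping described above.

    position and velocity are 0-127 MIDI values; synth is a plain
    dict standing in for the Max patch state.
    """
    if is_rim:
        # Rim hit: generate a new random synth setting.
        synth["seed"] = random.random()
    else:
        # Centre-to-edge position -> filter cutoff (Hz range assumed).
        synth["cutoff_hz"] = scale(position, 200.0, 8000.0)
        # Dynamics -> amplitude (0..1).
        synth["amp"] = velocity / 127.0
    return synth
```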

I have to say that it works pretty well, very sensitive/dynamic. The “Timbre” controller is a bit cruder than I expected (in terms of determining position on the drum), but it’s still crazy good that it can do that through audio alone.

In some further tests I also determined that SP triggering takes (on average) 11ms longer than my native onset-detection algorithm (if I don’t care about velocity), which makes sense: it takes actual time to do the fancy-pants machine learning stuff. That said, it’s 4ms faster than the version of my onset detection that also determines the velocity of the strike, so that’s good.

And in doing all that I also made a native Max UI for the MIDI data that matches the Sensory Percussion (as much as possible without circles/curves).

Not worth making another post, but did a test with sample playback too:

Position on the drum (center-to-edge) controls the position in the sample to play and dynamics controls both amplitude and playback duration.
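Roughly, that mapping looks like this (the duration range and units are assumptions for illustration):

```python
def sample_hit(position, velocity, sample_len_s):
    """Map a hit to sample-playback parameters.

    position and velocity are 0-127 MIDI values; sample_len_s is the
    loaded sample's length in seconds.
    """
    start_s = (position / 127.0) * sample_len_s  # centre-to-edge -> playback position
    amp = velocity / 127.0                       # dynamics -> amplitude
    dur_s = 0.05 + amp * 0.95                    # dynamics -> duration (range assumed)
    return start_s, amp, dur_s
```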

MIDI that goes thwack

Cool! I feel like this sensor would be very neat paired with a granular processor processing the audio of the drum. I saw Son Lux in concert and I thought Ian Chang was doing this at some point but there’s no way of knowing for sure. I know he was triggering samples with it though!


He definitely uses the same trigger/system, though with their native sampler (which interests me less).

I plan on doing some stuff where I work with audio sampling/processing based on the triggering too; this was just an early test of communication between the apps.