Probabilistic performance drum loop slicer

I’ve always loved MPC-style beatslicing where a loop is sliced into parts, mapped across pads and finger-drummed into new sequences. I’ve never had the skill to pull it off, so I made something to do it for me.

Load up a directory of drum loops. Set loop points. Set parameters such as the probabilities of stutter, reverse, jump-forward and jump-back. Whatever happens on a step, it’ll always return to the start of the loop on the “one” of the bar, and all changes are quantized to the beat.
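The per-step behaviour described above can be sketched roughly like this. This is an illustrative Lua sketch, not Beets’ actual code; the function name, the `probs` table and its fields are all hypothetical:

```lua
-- Illustrative sketch (not Beets' actual API) of per-beat slice selection.
-- Each beat we roll once against the event probabilities; regardless of
-- what happened, the playhead snaps back to slice 1 on the "one" of the bar.
local function next_slice(current, beat_in_bar, num_slices, probs)
  if beat_in_bar == 1 then
    return 1  -- always return to the start of the loop on the downbeat
  end
  local r = math.random()
  if r < probs.jump_back then
    return math.max(1, current - 1)          -- jump back a slice
  elseif r < probs.jump_back + probs.jump_forward then
    return math.min(num_slices, current + 1) -- jump forward a slice
  elseif r < probs.jump_back + probs.jump_forward + probs.stutter then
    return current                           -- stutter: repeat this slice
  end
  return (current % num_slices) + 1          -- default: play through in order
end
```

Because every decision is made once per beat, all changes land quantized to the beat automatically.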


MIDI controller (optional), Crow (optional)


This is an early version that I’m putting out for comments while I work through the todo list. It surely has bugs and is certainly incomplete, but I’m actively using it for performances and building it as I go.

Here’s how to use it:

  • put sets of loops (each one bar long) in folders in the Norns audio/beets folder
  • set BPM and select the folder name in params, and load the loops
  • map a MIDI controller such as a 16n or Launchpad to the parameters

Beets is built as a library and should be easy to reuse. It only uses Softcut and so could be combined with the engine of your choice in a more complex setup. If Crow is attached then it fires a trigger on each beat and at the start of each bar.



0.1: not-quite-ready-for-primetime version


  • Grid UI
  • More softcut-based FX
  • Multiple concurrent loops
  • Waveform visualization

20 characters of VERY NICE. Eagerly awaiting the grid UI :>

this is ace. I can see some fun things to be done, for sure; extracting the wav-file selection for each ‘slot’ (similar to takt’s mental model) feels like it’d be a nice thing to pull out to the params menu.

Love the code, too, with Beets as a library and a nice example of “class-like” Lua code.


Definitely going to steal the decimation engine from @Justmat’s otis for this. Get that lofi crönch going :heart_eyes:


Sorry, I’ve missed a lot of norns developments (mostly because I was happy with my version and there was already too much for me to focus on), but does this mean we now have full compatibility with bi-directional MIDI for devices like Launchpads?

In other news: this app looks fantastic, thank you so much @mattbiddulph for the work

Thanks for the kind words! The Launchpad in the demo video isn’t bi-directional with norns. I just used the Launchpad mappings on the MK3 to put eight columns of the pad into MIDI fader mode, and mapped their CCs to params in the usual Norns way.

Ah alright, looks nice enough though :slight_smile:

Great concept, and looks like a lot of fun. Inching closer and closer to getting a norns…

I haven’t done a ton of norns scripting yet and am a bit confused why this isn’t working

I can hear the bit reduction occurring, insofar as it adds some fuzzy noise behind the sample, but it doesn’t actually seem to be processing the samples themselves, so to speak.

I’m guessing I have to do something to point the softcut buffer at the engine, but it’s not clear to me from looking at the otis code what I’m missing :slight_smile:

(note: you’ll need to have otis installed or some other lib that depends on

Also: if I press “Load loops” 2-3 times in fairly quick succession I can reliably crash my norns.

This is already great fun though, thank you!

This will be a cool thing for my almost-retired V1 Maschine controller. Am curious if that will plug well into the norns. Once we have grid controls, it’ll be great for my monobright legacy grid!

The problem is that right now you can’t route Softcut output into the Engine. I know that @zebra has been thinking about whether this will be possible in a future version.

(thanks for the bug report)


I think you’d want the Engine --> softcut, in which case you would need to add this: audio.level_eng_cut(1) (see the routing study)
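For context, a minimal norns script fragment using that routing level might look like the following. `audio.level_eng_cut` and `softcut.level_input_cut` are real norns API calls, but the voice and channel numbers here are illustrative:

```lua
-- norns script fragment: feed engine output into softcut's input
-- so softcut voices can record/process what the engine plays.
audio.level_eng_cut(1.0)            -- engine -> softcut input level
softcut.level_input_cut(1, 1, 1.0)  -- input channel 1 -> voice 1
softcut.level_input_cut(2, 1, 1.0)  -- input channel 2 -> voice 1
```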

edit: I forgot Otis isn’t sample based when saying that so I’m wrong, sorry


Will probably add that route yes.

But for bitcrushing in particular it makes more sense to me to just add it to softcut, rather than tying up the whole supercollider stack for a single math operation.

Plus its an opportunity to do more things: depth control per voice, compander laws, interaction with filters, &c …


IIUC the ask is for the other direction. It’s not a technical problem. I have misgivings about adding feedback paths that can only be controlled by menu diving and could kind of arise by accident. [*] But I’m the only one who has expressed this so far, so I submit.

[*] e.g., someone releases a script+engine designed to send supercollider to softcut. eng->cut is turned up under script control. The user has set cut->eng separately, or maybe another script sets it; either way it is persisted to system.pset. That user gets horrible noise that nobody else does, and issue reports all around. Just something to be aware of.

Plus if we have enough routes it will impact performance and we’ll have to do fancier dynamic route management or something, then we’re making another digital patching environment instead of just a mixer.

Sorry to derail. @mattbiddulph Haven’t tried the script yet but it looks rad and a good reason to dust off the mpd controller I have sitting around


For scripts like this, could it be possible to have a binary switch that either puts Softcut-after-Engine or Engine-after-Softcut within the existing routing network? No arbitrary routing, no feedback loop, just the option to use Softcut to source the sounds and Engine to process them.

sure, the use case makes sense. will probably just add the mix route and not worry about it excessively. after all, one reason it’s been requested is to explicitly enable feedback structures, which is fine.

i’m currently thinking more about e.g. whether it might be a good idea to save mixer state before script launch and restore it after script cleanup, to avoid unintended stateful interactions between scripts. (That’s not exactly right though… You still want to save things touched by UI… Hm…)


avoiding statefulness in script lifecycle (beyond default paramsets for a given script) seems ideal. As you mentioned before, can imagine a lot of confusion and headaches down the road if such a thing existed.

(apologies again to @mattbiddulph for hijacking the beets thread!)

Here’s what I’m working on next, currently tidying up and getting ready for a new release in the next few days:

  • Ability to run two voices of Beets concurrently, each with their own set of loops
  • Grid UI (with two side-by-side copies of the UI)
  • An editor to mark certain slices as kick or snare, in order to generate Crow triggers from them to sync modular synth events (e.g. fire additional synthesized percussion whenever there’s a kick)
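The kick/snare-tagging idea could be wired up roughly like this. A hedged sketch only, assuming hypothetical names (`slice_tags`, `on_slice_played`) that are not Beets’ actual API; the Crow calls (`crow.output[n].action`, `crow.output[n]()`) are real norns API:

```lua
-- Illustrative sketch: fire a Crow trigger whenever a slice tagged
-- "kick" plays, to sync modular percussion to the sliced loop.
local slice_tags = { [1] = "kick", [5] = "snare" }  -- per-slice markers

local function on_slice_played(index)
  if slice_tags[index] == "kick" then
    crow.output[1].action = "pulse(0.005, 5)"  -- 5 ms, 5 V trigger
    crow.output[1]()                            -- fire the pulse
  end
end
```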

I hooked my Norns up to Crow and the modular, grabbed a few neat samples from a sample library and threw together a quick and rough demo jam in case anyone wants to see the work in progress:

Bass and 303 are modular, driven by Crow-clocked Ansible. The drums and vocals are run by Beets. 303 accents are fired by the kick output.


this looks so fun, excited to play with the grid ui :heart_eyes:

… and since I’m sharing demos, they just posted the live set I did at Resonant Frequencies in Oakland, CA with version 0.1 of Beets. Thanks to @mzero for putting the video together, with Sabina Luu on visuals.