MIDI that goes thwack

Surface feels totally fine. Haven’t tried it with the cymbal mount yet.

I wonder if that will impact the (minimal) crosstalk between quadrants, as I can imagine a hard surface may induce more ‘bleed’.

there is only one way to find that out :slight_smile: thanks for the feedback!


a friend of mine complained that the crosstalk between quads got worse and eventually made it borderline unusable live, huge bummer.

Hmm, I’ll have an eye out for that then. That would be a bummer indeed.

hasn’t happened to mine yet, but I don’t use it nearly as much as he does. keep us posted!


A bit disappointed that it does not send out independent notes from the sectional areas of each pad. I understand there is a Max patch and an Ableton project that sort of offer a workaround (although the Ableton project did not really work for me, probably because I was doing something wrong), but I was planning to use it with an iOS app, so I’m going to have to figure that out or change plans :slight_smile:

Apart from that, it feels solid


Finally got around to testing my Sensory Percussion trigger and sending MIDI to Max.

Posted all the info/video here:

(worth reposting in this thread for relevance)


It’s pretty impressive! Too bad I’m not in a band anymore, so I can’t force my drummer into using this while he mumbles into his drummer beard that this was all much simpler in the ’70s!



I’ve begun to work on incorporating both a BopPad and a Jambé into my live rig:

I’ve been playing with the BopPad controlling either four channels of the Digitakt, or the four percussion notes of the Circuit. In both cases, a really fun thing has been to map radius to the decay of the sound, so you can get long and short hits from a single pad/sound. Another fun mapping is to reverb send. Current hurdle is range mapping: The full range on the synth is too wide to be musically useful. The BopPad editor lets you restrict… but it is finicky, buggy, and I’m not yet totally happy…
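The range-restriction hurdle can also be handled in software, by rescaling the incoming radius/decay value before it reaches the synth. A minimal sketch, not the BopPad editor’s actual behaviour (the 20–70 decay range is just an illustrative choice):

```python
# Hypothetical sketch: rescale a full-range 0-127 MIDI CC value into a
# restricted, musically useful range before it hits the synth.

def rescale_cc(value, out_lo, out_hi):
    """Map a 0-127 CC value into [out_lo, out_hi]."""
    return out_lo + round(value / 127 * (out_hi - out_lo))

# e.g. keep a decay parameter between 20 and 70 instead of 0-127
print(rescale_cc(0, 20, 70))    # 20
print(rescale_cc(127, 20, 70))  # 70
print(rescale_cc(64, 20, 70))   # 45
```

This kind of scaling layer (in Max, a plugin, or a small script) sidesteps the editor’s finicky range settings entirely.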

I’m finding two possible directions for incorporating the Jambé: The official iOS-based synth for it is sample based, but very rich: LFOs, envelopes, continuous pressure control, external controls, multiple stacks of samples, a modulation matrix - all per pad (there are 10)… and multiple strikes on the same pad can launch multiple voices (up to something like 24 voices total…). This makes playing things like the Taiko and Udu drum kits really rich and full, and a good complement to the very sample-based percussion I play on the Digitakt.

In the other direction - I’ve worked out how to instead use the Jambé to drive some physical models in SuperCollider. So far, this is just proof of concept, and these models are far from tuned, or made musical or interesting… but the promise is there. Demo video here:


A bit of a bump here.

So I’ve got a cool thing I’ve built, with a demo and explanation, and an associated (hardware) problem and limitation.

One of the things I’ve been wanting to make for ages is an “automatic library parser”, where I can feed an arbitrarily large sample library into a process and then have it be playable without having to go through and map everything manually.

In this case I have a sample library of 3094 samples (Metal Resonance by @timp) which I’ve pre-analyzed for loads of audio descriptors (loudness, pitch, spectral centroid, etc…), and then I’m using my playing on the BopPad to “navigate” it: the velocity and radius from the BopPad map to a normalized version of the loudness and spectral centroid, respectively.
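The matching idea described above could be sketched roughly like this; the corpus values, ranges, and function names here are all illustrative, not the actual patch:

```python
# Toy sketch of descriptor matching: each sample has a pre-analysed
# loudness (dB) and spectral centroid (Hz), and an incoming hit's
# velocity and radius (each 0-127) are normalised and matched to the
# nearest sample in the normalised descriptor space.

def normalize(value, lo, hi):
    """Scale a value into 0..1 within the corpus range."""
    return (value - lo) / (hi - lo)

# made-up "pre-analysed" corpus: (name, loudness_dB, centroid_Hz)
corpus = [
    ("hit_001", -30.0,  800.0),
    ("hit_002", -18.0, 2400.0),
    ("hit_003",  -6.0, 5200.0),
    ("hit_004", -12.0, 1200.0),
]

loud_lo, loud_hi = min(s[1] for s in corpus), max(s[1] for s in corpus)
cent_lo, cent_hi = min(s[2] for s in corpus), max(s[2] for s in corpus)

def match_sample(velocity, radius):
    """velocity -> loudness, radius -> centroid; nearest corpus entry."""
    target = (velocity / 127.0, radius / 127.0)

    def dist(s):
        pos = (normalize(s[1], loud_lo, loud_hi),
               normalize(s[2], cent_lo, cent_hi))
        return (pos[0] - target[0]) ** 2 + (pos[1] - target[1]) ** 2

    return min(corpus, key=dist)

print(match_sample(127, 127)[0])  # hardest, most central hit -> hit_003
print(match_sample(0, 0)[0])      # softest, edge hit -> hit_001
```

With 3k+ samples the same nearest-neighbour lookup works unchanged, just with a real analysis pass in front of it.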

The video explanation:

A quick performance demo:

Even at these early stages it feels super promising. It’s quite expressive to play, as things map in a manner that makes conceptual sense without you having to individually make those decisions for each of the 3k+ samples.

I also do some compensation in the code to further extend the dynamic expressivity, by adjusting the matched sample’s gain toward the queried value (if I need a sample that’s -12dB and the nearest one is -15dB, I will boost the sample by 3dB on playback).
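That compensation step is just a dB difference converted to a linear gain; a minimal sketch (the function name is mine):

```python
# Sketch of loudness compensation: if the query asks for -12 dB and the
# nearest sample measures -15 dB, boost playback gain by +3 dB.

def compensation_gain(target_db, sample_db):
    """Linear gain factor that makes sample_db play back at target_db."""
    diff_db = target_db - sample_db       # +3 dB in the example above
    return 10 ** (diff_db / 20.0)         # dB -> linear amplitude

print(round(compensation_gain(-12.0, -15.0), 3))  # 1.413, i.e. +3 dB
```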

The spectral centroid thing is quite intuitive too, although I am experiencing a problem with this (more below).

In the long-term I want to implement a way more sophisticated version of this where instead of 1-to-1 mappings for each parameter/descriptor, I use some fancy machine learning tools (by our own @tremblap) to map dozens of audio descriptors and their changes over time within the samples to a dimensionally reduced self-organizing map. So the clustering and organization on the BopPad (or whatever other controller) will be able to query the samples in a more sophisticated way.
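That longer-term idea could be prototyped with a tiny self-organizing map. This is a generic textbook SOM sketch, not the actual @tremblap/FluCoMa tooling, and the descriptor vectors are made up:

```python
# Minimal 1-D self-organising map: descriptor vectors are clustered onto
# a small line of nodes, giving a dimensionally reduced layout that a
# controller could then query.

import random

random.seed(1)  # deterministic for the example

def train_som(data, n_nodes=8, epochs=200, lr=0.3, radius=2):
    dim = len(data[0])
    nodes = [[random.random() for _ in range(dim)] for _ in range(n_nodes)]
    for _ in range(epochs):
        x = random.choice(data)
        # find the best-matching unit (closest node)
        bmu = min(range(n_nodes),
                  key=lambda n: sum((nodes[n][d] - x[d]) ** 2
                                    for d in range(dim)))
        # pull the BMU and its neighbours toward the sample
        for n in range(n_nodes):
            if abs(n - bmu) <= radius:
                for d in range(dim):
                    nodes[n][d] += lr * (x[d] - nodes[n][d])
    return nodes

# toy descriptor vectors (e.g. normalised loudness, centroid, flatness)
data = [[0.1, 0.2, 0.1], [0.9, 0.8, 0.7], [0.5, 0.5, 0.4]]
nodes = train_som(data)
print(len(nodes))  # 8 nodes spanning the reduced space
```

A real version would use many more descriptors, time-varying features, and proper neighbourhood decay, but the clustering principle is the same.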

But for now I just wanted to post the quick proof of concept idea.


Now onto some more technical hardware things.

For some reason my BopPad is spitting out really shitty values for the Radius. Check it out:

I basically get values of [1, 32, 64, 95, 127] for around 95% of my hits, giving me effectively 5 “regions” for the radius, instead of 127 values.

I wrote KMI a while back asking if it would be possible to get 14-bit MIDI output from the BopPad as even 127 values would be fairly shitty resolution for what I’m trying to do, but 5 steps is…um, pretty bad…

That led me to find the Mandala Drum, which I had seen before, but wasn’t too excited by. They tout themselves as being an “HD position sensitive” instrument (which just means 7-bit MIDI), but I was curious to see if anyone had any experience with them. And if so, how they felt, how smooth the changes of values were etc…

Lastly, I was curious if anyone knew of any other drum pads that offered position/velocity information? Or even DIY projects.

I’d absolutely love something that did 16-bit output over OSC, for example…


Very impressive project! Sounding amazing so far.

I wonder if Sensel Morph would be in your wheelhouse? My understanding was that it’s “true MPE”, unlike the BopPad, which has quietly been described by KMI as “MPE-compatible” but seems to be capable of 7-bit MIDI at best.

Sensel Morph is largely marketed as an MPE multi-touch controller, but there’s a drum overlay, and some of their videos show people using drumsticks on it with (seemingly) great sensitivity.

This might be of use for getting Sensel Morph to interact with OSC-compatible software (found via this thread)

“Smart Fabric” is KMI’s piezoresistive sensor system, and it measures force, not location. To achieve a radial measure, multiple sensors are placed along the radius, and the location interpolated from the forces sensed. Since the top material is hard rubber, it is likely that in the middle of a sensor, not enough force is registered on adjacent sensors to accurately interpolate a value.
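The interpolation described above can be sketched as a force-weighted centroid. This is my assumption about the scheme, not KMI’s actual algorithm:

```python
# Sketch: estimate a radial position from a few discrete force sensors
# by taking the force-weighted centroid of the sensor positions.  With a
# hard top surface, a hit centred on one sensor puts almost no force on
# its neighbours, so the estimate snaps to that sensor's position --
# producing the quantised "dead zone" values described in the thread.

def radial_position(sensor_positions, forces):
    """Force-weighted centroid; positions are 0..1 along the radius."""
    total = sum(forces)
    if total == 0:
        return None  # no hit registered
    return sum(p * f for p, f in zip(sensor_positions, forces)) / total

sensors = [0.0, 0.25, 0.5, 0.75, 1.0]   # five bands along the radius

print(radial_position(sensors, [0, 1, 1, 0, 0]))  # between bands: 0.375
print(radial_position(sensors, [0, 0, 5, 0, 0]))  # snaps to band: 0.5
```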

I have a BopPad and did some experiments with it. If you press with a stick and drag it radially, you’ll see that you can get pretty much all 127 values, but that there are “dead zones” where you get just 1, 32, 64, 95, or 127, just like you see with hits.

This tells me that there are probably only five sensors radially. This image, from a KMI video, shows ten bands, but I suspect they are grouped in pairs to make five sensors:

To achieve 127 discrete values would require sensor bands less than ½mm wide, and scanning 25× faster. Probably out of range for a $200 device.

14-bit resolution, or 16k values, is clearly out of the question: with the sensors it has, it can’t even interpolate to 7-bit resolution well. With discrete sensors, they’d need to be roughly 6µm wide - clearly not going to happen! This also shows that 14-bit resolution over a 10cm surface is asking for micrometre accuracy, which is almost certainly out of the range of affordability in an instrument - not to mention playability.
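A quick back-of-envelope check of the 14-bit arithmetic, assuming the 10cm radial surface quoted above (the calculation is mine):

```python
# Band width needed for 14-bit (16384-step) resolution over 10 cm.

surface_mm = 100.0          # assumed 10 cm radial playing surface
steps_14bit = 2 ** 14       # 16384 discrete values

band_um = surface_mm / steps_14bit * 1000   # width per band, in microns
print(round(band_um, 1))  # 6.1 microns -- micrometre-scale accuracy
```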


You couldn’t drum on a Sensel Morph: it can only take up to 5kg, which is under what a drum stick can generate. I don’t know if the Sensel surface breaks at 5kg, or just loses all precision.

The Sensel surface is a set of discrete force sensors - so again, you get position resolution (XY or radial) by interpolating the force sensed at adjacent sensors. They list 20k sensors over 33k mm² - so about 1 sensor per 1.3 linear mm. Given the dimensions, there are fewer than 200 sensors along the longest edge - and hence at best 8 or 9 bits of position accuracy (with interpolation). Mind you, that is still roughly mm accuracy over 24cm, which for drum hits is probably the most you could ask for…

EDIT: They claim 6502dpi tracking - which is about 256 positions per mm. If the calculations above are right, and there is roughly 1 sensor per mm, then they are estimating position by interpolation down to about 8 bits between sensors. The force sensors claim 32k (15-bit) resolution… so this somewhat checks out (though the surface material needs to spread the force over more than one sensor for this to work - at one sensor per mm, that’s reasonable).
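The arithmetic in that edit can be checked directly (the figures are as quoted above; the calculation is mine):

```python
# Back-of-envelope check of the quoted Sensel Morph numbers.

import math

sensors = 20000
area_mm2 = 33000.0
pitch_mm = 1.0 / math.sqrt(sensors / area_mm2)   # spacing between sensors

dpi = 6502
steps_per_mm = dpi / 25.4                        # reported positions per mm

# interpolated position steps between adjacent physical sensors
interp_per_sensor = steps_per_mm * pitch_mm

print(round(pitch_mm, 2))                       # ~1.28 mm sensor pitch
print(round(steps_per_mm))                      # ~256 positions per mm
print(round(math.log2(interp_per_sensor), 1))   # ~8.4 bits of interpolation
```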


That makes total sense, and probably explains the behavior I’m seeing. Makes me curious as to what’s inside the Mandala.

That was my concern. Not to mention the form factor is a bit weird for drumming on.

That would indeed be a dream.


I’ve had moderately decent results using drumsticks on the Morph, and Peter Nyboer has confirmed to me a few times that the “Drums” overlay is intended to be usable with drum sticks.

In my experience drumming on it with sticks, the main problem is calibration. The Morph app (last I checked) only lets you set “sensitivity”, which is basically a threshold value for how much pressure is required to trigger a note. Setting the sensitivity higher both a) loses you some dynamic range (or seems to), and b) actually introduces some latency, since there’s physically more time between when your stick hits the pad and when the pressure crosses that threshold.

It’s entirely a software / scaling problem, and something they should fix by switching from a threshold approach to a velocity curve approach.
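The threshold-vs-curve distinction can be illustrated with a toy sketch; the names, pressure frames, and scaling are all made up, not the Morph’s actual firmware:

```python
# A fixed-threshold trigger fires only once pressure crosses a set level
# (adding latency for soft hits); a slope-based approach fires on the
# first rising frame and derives velocity from the rate of increase.

def threshold_trigger(pressure_frames, threshold):
    """Return (frame_index, velocity) when pressure crosses threshold."""
    for i, p in enumerate(pressure_frames):
        if p >= threshold:
            return i, min(127, int(p))
    return None

def slope_trigger(pressure_frames):
    """Fire on the first rise; derive velocity from the initial slope."""
    for i in range(1, len(pressure_frames)):
        slope = pressure_frames[i] - pressure_frames[i - 1]
        if slope > 0:
            return i, min(127, int(slope * 4))  # arbitrary scaling
    return None

hit = [0, 20, 60, 110, 90, 40, 0]  # one simulated stick hit

print(threshold_trigger(hit, 100))  # fires late: (3, 110)
print(slope_trigger(hit))           # fires early: (1, 80)
```

The slope approach fires two frames earlier on the same hit, which is exactly the latency the threshold scheme costs you.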

The drum overlay also activates a different scan mode (low latency vs. high dpi), but as mentioned, losing dpi resolution is the opposite of what you want when tracking position within the sensor.

(Also, really neat idea Rodrigo!)


I’ll have to see if I can find someone with a Morph to try it out in person.

Definitely don’t like the “Drum” overlay though, so I’d aim/hope for something that’s more unified across the entire surface, with mapping coming from X/Y/Z coordinates, rather than virtual little drum pads.

Definitely. Some kind of differential rather than an absolute threshold to exceed.

I know very little about the Morph, but I guess the different overlays can feed information back to the sensor itself? The crazy dpi is probably overkill for sticks, but the more accuracy the better - without sacrificing latency, of course.


I’ve written the Mandala people to ask some questions. I’ll report back what I hear.


really inspiring, thanks for sharing!


Would you happen to know of any DIY projects/solutions/approaches to this kind of thing?

I wouldn’t be looking for MPE or “pressure” really, just accurate-ish “velocity” and “position”, and it can be monophonic.

What about 4-channel location, with 4 mics being compared for loudness and delay? I think I’ve seen stuff like this at NIME, or along those lines. Now, I don’t know the speed of sound in the material, but a dedicated small processor doing some sort of cross-correlation between the mics could tell you the distance, right?
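The core of that idea is estimating the arrival-time difference between mic pairs by cross-correlation; with the speed of sound in the surface material, that delay becomes a distance difference, and several mic pairs pin down a position. A toy sketch with made-up signals:

```python
# Estimate the delay (in samples) between two contact-mic signals by
# brute-force cross-correlation: slide one signal against the other and
# keep the lag with the highest correlation score.

def estimate_delay(a, b, max_lag):
    """Lag (in samples) by which signal b leads signal a."""
    def score(lag):
        return sum(a[i] * b[i - lag]
                   for i in range(len(a)) if 0 <= i - lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=score)

impulse = [0, 0, 1, 3, 2, 1, 0, 0, 0, 0]
mic_a = impulse
mic_b = impulse[2:] + [0, 0]   # same hit, arriving 2 samples earlier

print(estimate_delay(mic_a, mic_b, 4))  # 2
```

A real implementation would run this per mic pair at audio rate (probably via FFT-based correlation for speed) and then triangulate, but the principle is this simple.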
