Ansible MPE (in the future)?

Is there any chance of an MPE implementation for the Ansible firmware? It would be brilliant to be able to hook a Roli Seaboard or a LinnStrument straight to your modular.

With the limits of 4 trig and 4 CV, of course. But it’d be great to at least use, for example, Strike, Press, Slide and Glide to generate control voltage.

I guess @tehn would know if it’s possible and perhaps on the roadmap.



This sounds like an interesting idea. The thought of MPE did cross my mind when adding MIDI support to earthsea.

How would you envision it working? Like a mono voice with the various CCs mapped to CV outs? I haven’t looked at the MPE spec in detail - I wonder if there are pseudo-standard mappings from these increasingly common control axes to specific CCs.

I have a Madrona Labs Soundplane and come to think of it I seem to recall @TheTechnobear might have added MPE output to the Soundplane client…

1 Like

I’m eyeing the Seaboard Rise and haven’t actually tried it hands-on yet. But I’m intrigued.

Looking at C74’s description and implementation of MPE, I’m probably thinking a (somewhat limited) monophonic application adapting to the 4 CV outs of Ansible.

Strike is transmitted as Velocity
Press is transmitted as Aftertouch
Glide is transmitted as Pitch Bend
Slide is transmitted as CC 74
Lift as Release Velocity

In my scenario - maybe mapping the first four to the CV outs. I don’t know anything about how the programming of the Monome modules works, so this is just wishful thinking on my part. :slight_smile:

…if I were to get a Seaboard Rise.
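For what it’s worth, that mono-voice idea could be sketched roughly like this in C. To be clear, `set_cv()` and the 12-bit scaling are purely invented for illustration, not the actual Ansible API:

```c
/* Hypothetical sketch: route the four dimensions of one MPE voice
   (Strike/Press/Glide/Slide) to four CV outputs.
   set_cv() and the 12-bit DAC range are assumptions, not the real firmware API. */
#include <stdint.h>

enum { CV_VELOCITY, CV_PRESSURE, CV_BEND, CV_SLIDE };

static uint16_t cv[4]; /* stand-in for the four DAC outputs */

static void set_cv(int ch, uint16_t value) { cv[ch] = value; }

/* scale a 7-bit MIDI value to a 12-bit DAC value */
static uint16_t from7(uint8_t v) { return (uint16_t)v << 5; }

/* scale a 14-bit value (e.g. pitch bend) down to 12 bits */
static uint16_t from14(uint16_t v) { return v >> 2; }

void mpe_note_on(uint8_t note, uint8_t vel) {
    (void)note; /* note number handling is out of scope for this sketch */
    set_cv(CV_VELOCITY, from7(vel));       /* Strike = velocity */
}
void mpe_channel_pressure(uint8_t p)  { set_cv(CV_PRESSURE, from7(p)); }   /* Press */
void mpe_pitch_bend(uint16_t bend14)  { set_cv(CV_BEND, from14(bend14)); } /* Glide */
void mpe_cc(uint8_t cc, uint8_t val) {
    if (cc == 74) set_cv(CV_SLIDE, from7(val)); /* Slide = CC 74 */
}
```

Each handler just rescales the incoming MIDI value to the DAC range; the real firmware would obviously need the USB MIDI parsing in front of this.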

I did initially add MPE to the Soundplane client, but after that, Randy added it to the official client.

MPE is now supported by lots of controllers - the Rise, LinnStrument, Eigenharps, Soundplane and Continuum… and I’m sure the new ones being planned will support it too.

these are kind of marketing terms by Roli, as the concepts already existed on all the other controllers…

Strike = (note on) Velocity
Glide = X axis implemented as Pitchbend
Slide = Y axis implemented as CC 74
Press = Z axis implemented as (channel) Pressure
Lift = release Velocity (the most commonly left-out attribute, but nice when it exists)

(I really wish Roli would avoid trying to ‘take over’ MPE… or pretend it’s their invention. It’s not: it’s basically a variant of what Haken have used from the beginning, and it was Roger Linn’s idea to extend it into a standard)

spec is here, but honestly the spec makes it sound more complicated than it really is… currently most controllers/synths only use it in the ‘basic’ form…
(I think most developers are ‘dipping their toes in the water’ at this stage)

In its basic form, MPE puts each touch on a separate MIDI channel, 2-16 (assuming no splits, which is the most widely supported configuration). MIDI channel 1 is reserved for global messages, e.g. a mod wheel (CC 1) or breath controller (CC 2).
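a minimal sketch of that channel convention (the function name is mine, nothing official):

```c
/* Sketch of basic MPE channel routing: MIDI channel 1 carries global
   messages, channels 2-16 carry one touch each. */
#include <stdint.h>

#define MPE_GLOBAL_CH 0 /* MIDI channel 1, zero-based */

/* returns the zero-based voice index for a member channel,
   or -1 if the message is on the global channel */
int mpe_voice_for_channel(uint8_t status_byte) {
    uint8_t ch = status_byte & 0x0F;    /* low nibble = zero-based channel */
    if (ch == MPE_GLOBAL_CH) return -1; /* global: mod wheel, breath, etc. */
    return ch - 1;                      /* channels 2-16 -> voices 0-14 */
}
```

so e.g. a note-on status byte of 0x91 (channel 2) lands on voice 0, and anything on channel 1 is treated as global.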

It has a couple of NRPNs which a client can use to ‘enter MPE mode’ and to set the pitch bend range (the default should be 48 semitones). But whilst these are sent by all the controllers, many clients (synths) simply ignore them, finding it easier just to have a mode switch and an independent PB range control.

(actually the NRPN approach is flawed, since it assumes the controller will be switched to MPE mode after the synth has started. But of course, a musician might already have the controller in MPE mode and then start the synth… so the synth would never see the NRPNs)
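for reference, turning a 14-bit bend into semitones with that default ±48 range is just a linear scale (a hypothetical sketch, not anyone’s actual code):

```c
/* Sketch: convert a raw 14-bit pitch bend value (0..16383, center 0x2000)
   into a semitone offset, given a bend range in semitones
   (the MPE default being 48). */
#include <stdint.h>

double bend_to_semitones(uint16_t bend14, double range) {
    return ((double)bend14 - 8192.0) / 8192.0 * range;
}
```

the resulting semitone offset would then be scaled to volts (1V/octave = 1/12 V per semitone) and added to the note’s base pitch CV.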

I implemented MPE on Axoloti (and Reaktor/Max), and have used it in various synth patches I’ve created… the mapping is pretty trivial, assuming the parameters you want to map are controllable per voice.

1 Like

Yeah. Sure.

But wouldn’t it be nice if these (pretty standard midi mappings) were hosted within for example Ansible, so you wouldn’t need a computer?

1 Like

Being a continuum owner I have a good sense of history…

The continuum + cvc combo is still without peer in my book. That said I could see some benefit in MPE support. Truth be told I’ve pestered the Oval sound guys about the possibility of MPE support - if they support it I’ll certainly have motivation to tackle the module side of things here!

1 Like

Sure… I’d like to see anything that supports MIDI support MPE (where it makes sense). I tend to post the above info to all developers I see doing interesting things :slight_smile:

that said, whilst I’ve been very tempted to get into Eurorack, the main thing preventing me is that it gets expensive for multiple voices, which is (for me) where MPE really shines. and as you pointed out, Ansible would only give you one voice… (MI Yarns could easily be hacked to do the same btw)

that said… even with one voice, having it all under your fingertips is really liberating!
( I often use my Soundplane monophonically, even with soft synths)

I’m sure I will be lured into Eurorack, it’s almost inevitable once @randy releases his Soundplane-to-CV module :slight_smile:


Though not with MPE, I’ve been using my LinnStrument with my Earthsea module. I manually mapped pressure to the mod wheel and that seems to work nicely (though I added some slew limiting to the output, as it was steppy).
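(the slew limiting I mean is just a one-pole lag on the CV value, something like the sketch below - the coefficient is arbitrary and the function is mine, not from the earthsea code:)

```c
/* One-pole slew limiter: smooths steppy 7-bit CC data before it hits
   the CV output. Call once per update tick. */
float slew(float current, float target, float coeff) {
    /* coeff in (0,1]: higher = faster response, lower = more smoothing */
    return current + coeff * (target - current);
}
```

each tick the output moves a fixed fraction of the remaining distance toward the target, so big 7-bit steps get rounded off instead of landing instantly.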

However, I get stuck notes frequently, and it really doesn’t like me sending pitch-bends AND modwheel at the same time - I can do one or the other (I should try a normal controller, hmm). I’m thinking either the processor just can’t keep up or there are some bugs in the code, or both.

I haven’t received my Ansible yet, but I’m hoping it’s more capable. I’ve been poking at the Earthsea code but haven’t spent enough time to really work out what’s going on. That said I don’t think it’d be hard to add the MPE side of things, I just don’t know if it’ll be able to handle all the data.


@Ycros, did you try this with the latest LinnStrument firmware?
we had exactly the same issues with the LinnStrument and Axoloti initially (see here), but Geert fixed this in the LinnStrument firmware.

there’s not a lot of processing required - Axoloti only uses an STM32F4 and copes happily. I can get 10+ voices (which includes sound generation, obviously dependent on patch complexity), so you would have thought a single voice would be easy.

(there are also options on all the controllers to ‘decimate’ the data, if the receiver can’t cope)

I’m already running in the slowest data rate mode, but I’ll check because I’m probably not on the latest firmware.

1 Like

Nope, latest version doesn’t seem to change anything.

The biggest problem with these as a “standard” mapping IMO is that Aftertouch, Pitch Bend, and CC values have different bit depths.

Pitch bend is the only one that defaults to being 14-bit. The spec allows for the CCs to be ganged into 14-bit messages (i.e. send CC 102 as the LSB and CC 70 as the MSB), but there’s no such capability for Aftertouch to magically grow 7 more bits of resolution.

And then you also have to make sure that synth builders handle the MSB/LSB ordering correctly (from what I’ve found, it’s supposed to be MSB-then-LSB)

MPE is a good idea for ansible extension. i don’t own any of these devices and i’m already overstretched, but this is the whole point of open sourcing the firmware.

if someone wants to extend the firmware i’ll happily pull it into the main repo.

that said! i’ll get the firmware posted public by the end of the week.


the latest MPE spec actually has 14-bit options for x/y and z (see here in the implementation section)

(the Continuum can also use 21 bits for PB, by using PB + a CC msg; this is really needed for the full-size Continuum, as 14 bits is a bit low for 96 semitones)

I will say though, that on the Eigenharp and Soundplane, I tended to find it’s often hard to ‘feel’ the difference between 7 and 14 bit… I suspect this is because synths tend to smooth anyway.

yeah, it’s vague in 14-bit MIDI generally, not just MPE.
I’m pretty sure (like 99%, without checking back through emails etc.) Haken defined that it should be LSB then MSB, and that the synth should ‘react’ on the MSB only. the reason for this is to avoid stepping, whilst also not having to ‘care’ whether the controller is sending 7 or 14 bit.

(the issue is you only know the real value once you have seen both CCs.
e.g. imagine sending MSB first: 00 FF, 01 00… in this case, if you reacted on the MSB and it came first, you would go from 00FF to 01FF (nearly 0200!) and only then to 0100)

so with the Haken approach this doesn’t happen, AND it’s compatible with 7 bit (as you just assume the LSB… that’s another topic ;)) - this is the cool thing about the Continuum: since it’s been around such a long time, Lippold has thought this all through and refined it… so whilst MPE is ‘new’, it’s built on something that’s been around for quite a while!
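to make that concrete, a sketch of the LSB-then-MSB scheme: store the LSB silently, and only react when the MSB arrives (the CC numbers follow the mapping discussed above; the code itself is just my illustration):

```c
/* Sketch of LSB-then-MSB 14-bit CC handling: the LSB CC (e.g. CC 102)
   is stored without triggering anything; the MSB CC (e.g. CC 70) combines
   and applies the value. This avoids the transient glitch you get if you
   react to the MSB before the matching LSB has arrived. */
#include <stdint.h>

static uint8_t  y_lsb   = 0; /* last received LSB */
static uint16_t y_value = 0; /* current combined 14-bit value */

void on_cc_lsb(uint8_t v) { y_lsb = v & 0x7F; } /* store only, don't react */

void on_cc_msb(uint8_t v) {
    y_value = ((uint16_t)(v & 0x7F) << 7) | y_lsb; /* react here */
}
```

a 7-bit-only controller that never sends the LSB CC still works with this scheme - the LSB just stays at its last value.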

1 Like

Right, but the “solution” for the Z axis is to send pressure data as a CC, not as a Channel Pressure message, which adds implementation and configuration complexity, which translates to UI complexity. Pitch bend doesn’t have this problem because it’s always 14 bits.

Your description of how the Continuum handles it makes some sense, particularly around “not having to care” if the data is 7 or 14 bits, and it also matches with the byte ordering of proper pitch bend messages. What I’d seen that said MSB-then-LSB was from another message board… It’s unfortunate how poorly documented that feature of MIDI is at this stage :frowning:

1 Like

of course the ‘real solution’ would be MIDI HD, but that is not going to arrive any time soon :wink:
(the MMA have been discussing it for years, and we still see no implementations or public spec)

as it happens, I also don’t ‘agree’ with this implementation - I preferred the original approach, which was to use a CC to hold the LSB alongside the Ch Press message, but for some reason Roli changed it.

but I don’t think it introduces UI complexity or much implementation effort… I’ve implemented this, and just have an MPE mode and an MPE extended mode on the controller; the synth can respond to either (just treat Ch Press and CC 70 as the same MSB, and CC 102 as the LSB)
(which reminds me, I do need to update the Axoloti objects for this)

In practice, I’ve found little support for 14-bit MPE - probably not surprising given the lack of support for 14-bit MIDI in general. And it has to be remembered that turning on 14 bit adds a lot more data; at the update rates of these controllers (~500-1000 messages/second per touch) this can be significant.

fortunately, as I said, 14 bit generally doesn’t seem to add that much, and others I know using these controllers echo a similar experience.

anyway, personally I’m very happy with MPE… it works great with both my Soundplane and Eigenharp, makes configuration a lot simpler… and it’s getting wider support all the time.


There was an issue with the MIDI message parsing code which I discovered and fixed a few months back. Dense MIDI streams (multiple messages per USB read) almost always triggered the problem. The fix hasn’t been rolled into a formal release yet (one can pick up the fix if the firmware is built from source).

It may be possible to increase the rate of MIDI processing (I’ll start another thread to collect bug reports).

1 Like

Thanks for all the input. So it seems doable - probably not by Tehn, but possibly by someone else who has the skills to do some firmware modifications.

I’ve only done some Max programming, and have no clue how to do firmware work for the Monome modules. But I’m probably getting an Ansible to interface my Grid (and soon Arc) with my Eurorack. Maybe I should look into this type of coding. :confused: :smirk:

1 Like

I’m thinking about diving into alt-firmware dev as well in the coming months. A little intimidated, but @scanner_darkly has been very encouraging about it.


The MIDI handling code has continued to evolve since the earthsea implementation, and I suspect it would be fairly straightforward to support MPE. The biggest hurdle might just be deciding how to expose control for switching in and out of MPE mode.

I have one or two irons in the fire right now that I need to tend to before I’d be able to look into MPE. If someone is interested in tackling MPE (particularly on earthsea) I can help point the way - all the needed infrastructure is there.