I’ve had one for a few months and love it. Creating maps is easy and seems to be getting easier with new updates to their studio app. It’s super responsive too, and nice to play with sticks, which was the big question mark.
I created a setup to control a Misha pretty easily. This is just some noodling.
Turns out they haven’t put out the example Max patch from the video above yet.
From the API spec I’ve managed to piece together how to put it in API mode and draw pixels/shapes, and I’ve managed to format the incoming finger stream, but I’m a bit stuck on how to turn the “7-bitized 3 floats (X, Y, Z) position” part of the incoming sysex data into actual floats.
As far as I can tell this part of the data comes as 14 bytes, with an example as follows: 17 60 110 102 65 81 30 0 16 65 100 115 57 63
There’s some example Python/C++ at the bottom of the API spec pdf, but I’m not handy enough at reading either language to make sense of how to turn that into Max-friendly code.
I will admit that trying to parse the exact algorithm of this C++ code at 11PM is a bit beyond my mental capacity (though I get why they went that way: MIDI uses the most significant bit to differentiate between data and status bytes). But maybe you could use the provided code as an external in Max/MSP? Quick googling shows me that both C++ and Python can be used to build external objects.
To be fair, that’s some silly Python code. Yes, it’s very neat that they’ve written the conversion in one line, but easily followable it is not. As a toolbox function I suppose it’s okay, but as example code it’s next to useless.
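For what it’s worth, here’s an unrolled sketch of what I think the packing scheme is, pieced together from the spec’s description and the example bytes above (so treat it as an educated guess, not gospel): each group of up to 8 sysex bytes starts with a prefix byte whose bits 6..0 carry the stripped MSBs of the following (up to 7) data bytes, and the resulting 12 raw bytes are three little-endian 32-bit floats. The bit order (prefix bit 6 → first data byte) is my assumption, but it yields plausible values for the example packet (X/Y in LED coordinates, Z between 0 and 1):

```python
import struct

def unbitize7(data):
    """Undo MIDI 7-bitization: each group of up to 8 sysex bytes starts
    with a prefix byte whose bit 6 is the MSB of the first following
    byte, bit 5 of the second, and so on (assumed bit order)."""
    out = []
    i = 0
    while i < len(data):
        msbs = data[i]
        chunk = data[i + 1:i + 8]
        for j, b in enumerate(chunk):
            out.append(b | (((msbs >> (6 - j)) & 1) << 7))
        i += 8
    return out

# The 14-byte example from above: 2 prefix bytes + 12 data bytes
packet = [17, 60, 110, 102, 65, 81, 30, 0, 16, 65, 100, 115, 57, 63]
raw = unbitize7(packet)                      # 12 bytes = 3 x float32
x, y, z = struct.unpack('<3f', bytes(raw))   # little-endian floats
print(x, y, z)  # roughly 28.8, 16.0, 0.73 for this packet
```

In Max terms you’d do the same reshuffling with `zl` objects or a `js` object: peel off every 8th byte as the MSB mask, OR the high bits back in, then reassemble the four bytes of each float.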
Ok, after an afternoon/evening of coding I’ve gotten some very promising results.
I explain it in the video, but basically I’m taking a corpus of samples (1715 in this example), projecting that onto the Touch (using FluCoMa and SP-Tools), then navigating that sample space using the Touch (via the API controls).
I need to organize the code and tighten things up, but the way I built it, it’s easily generalizable (using jit.matrix objects to handle the LED stuff, so I can easily up/downsample depending on corpus size, etc.).
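The down/upsampling idea is straightforward even outside of jitter. As a plain-Python illustration (not the actual patch; the grid size and brightness scaling are my assumptions), you can bin the corpus’s normalized 2D projection into per-LED counts and light each LED according to how many samples land in its cell:

```python
import random

LED_W, LED_H = 42, 24  # assumed LED grid size, matching the matrix dims mentioned below

# Stand-in corpus: N samples with normalized 2D projection coords
# (in the real patch these come from the FluCoMa analysis/projection)
random.seed(1)
points = [(random.random(), random.random()) for _ in range(1715)]

# Downsample the point cloud to per-LED occupancy counts
grid = [[0] * LED_W for _ in range(LED_H)]
for x, y in points:
    col = min(int(x * LED_W), LED_W - 1)
    row = min(int(y * LED_H), LED_H - 1)
    grid[row][col] += 1

# LED brightness could then just be the scaled count per cell
peak = max(max(row) for row in grid)
brightness = [[round(127 * c / peak) for c in row] for row in grid]
```

With a jit.matrix doing the same job, changing corpus size just means re-binning into the same 42x24 plane.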
I’ve been using the FluCoMa tools in Max with the Sensel Morph to trigger single-shot samples. The problem is that it creates “clouds” of points across the 2D space rather than a filled grid like the one in your video. As a result, the space isn’t equally distributed: there are large areas that trigger only a couple of samples and others with a huge number of them.
May I ask what kind of message or object you used to make that ordered grid within Max?
It’s what I’m using for the Erae Touch, but I coded it to be generic in terms of the input it takes. As long as you send an XY pair (between 0 and 1), it will find the nearest match in the sample space.
Also, I’ve smartened up the UI for the Erae Touch bit too (also in the next SP-Tools update):
How are y’all finding it for MIDI control while playing? I’ve been enjoying my Launchpad a lot, but I’m considering one of these for the added expressiveness/flexibility in control layout. It really bums me out that the Launchpad doesn’t have knobs for controlling select parameters or some such thing.
How do you vouch for the Erae hardware more broadly now that you’ve used it more? (Say, in comparison to the BopPad, which I know you’ve also used.)
I’m very curious whether I should add it to my list of “MIDI controllers I could bang on with sticks,” because I really would welcome a great one with tweakable visual feedback.
Yup, still holding up great and working well. The BopPad was super disappointing all around, so it sets a pretty low bar (for me), but I definitely recommend the Erae Touch. They just put out a firmware update too, so I’ll update that when I’m back.
Agreed, but it was sturdy, transportable, and pretty, so it had its qualities despite not being a good instrument! I was wondering if the Erae fulfilled the missed promises of the KMI device. I’ll definitely add it to the wishlist now; thanks for the quick breakdown!
Hi There,
I looked deeply into your sp.eraetouch patch but couldn’t figure out where the conversion is made; I only found the drawToTouch patcher that handles the sysex messages.
I’m basically trying to feed a jitter matrix like you did, but to draw sample waveforms:
(jit.matrix 4 char 42 24)
I’m trying to make a simple granular synth
I’m not sure whether you’re talking about the input (i.e. data coming from the Touch). That’s sorted here:
(it’s not red in the actual patch)
Once you’ve created an API zone on the Touch and put it in API mode (by sending the message below), you’ll get a stream of numbers out of the bottom of p eraeTouchInput.
That’s in the drawToTouch subpatch you found already.
For the output (i.e. creating the matrices), I’m doing something a bit specific, going from dicts to matrices, but upstream in the patch there are some 420x240 matrices:
Put whatever you want in there and you’re good to go. You can likely skip the switch 4 1 thing, as that’s me handling “picture in picture”-type stuff. There’s a bunch of other stuff you won’t need either, like the white puck and black dot overlays for position when playing, etc.