A bit of a bump here.
So I’ve got a cool thing I’ve built, with a demo and explanation, and an associated (hardware) problem and limitation.
One of the things I’ve been wanting to make for ages is an “automatic library parser”: something where I can feed an arbitrarily large sample library into a process and then have it be playable without having to go through and map everything manually.
In this case I have a sample library of 3094 samples (Metal Resonance by @timp), which I’ve pre-analyzed for loads of audio descriptors (loudness, pitch, spectral centroid, etc.). I then use my playing on the BopPad to “navigate” the library: velocity maps to a normalized version of the loudness, and radius maps to a normalized version of the spectral centroid.
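For the curious, the lookup is basically a nearest-neighbour search in normalized descriptor space. Here’s a simplified Python sketch of the idea (not my actual code; the random data is just a stand-in for the real analysis):

```python
import numpy as np

# Stand-in for the pre-analyzed library: one row per sample,
# columns = [loudness (dB), spectral centroid (Hz)].
rng = np.random.default_rng(0)
descriptors = np.column_stack([
    rng.uniform(-40.0, 0.0, 3094),     # loudness in dB
    rng.uniform(200.0, 8000.0, 3094),  # spectral centroid in Hz
])

# Normalize each descriptor to 0..1 so the two axes are comparable.
mins, maxs = descriptors.min(axis=0), descriptors.max(axis=0)
normalized = (descriptors - mins) / (maxs - mins)

def nearest_sample(velocity, radius):
    """Map BopPad velocity/radius (0..127) into normalized
    descriptor space and return the index of the closest sample."""
    query = np.array([velocity / 127.0, radius / 127.0])
    return int(np.argmin(np.linalg.norm(normalized - query, axis=1)))

print(nearest_sample(velocity=118, radius=120))  # hard hit near the rim -> loud, bright sample
```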
The video explanation:
A quick performance demo:
Even at these early stages it feels super promising. It’s quite expressive to play, as things map in a manner that makes conceptual sense without you having to individually make those decisions for each of the 3k+ samples.
I also do some gain compensation in the code to further extend the dynamic expressivity, adjusting the matched sample toward the queried value (if I need a sample that’s -12dB and the nearest one is -15dB, I boost the sample by 3dB on playback).
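The math there is trivial: just the dB difference converted to a linear gain. A minimal sketch:

```python
def compensation_gain(target_db, matched_db):
    """Linear gain that nudges the matched sample's loudness to the
    queried loudness, e.g. target -12dB vs. matched -15dB -> +3dB."""
    return 10.0 ** ((target_db - matched_db) / 20.0)

gain = compensation_gain(-12.0, -15.0)  # ~1.413, i.e. +3dB boost
```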
The spectral centroid thing is quite intuitive too, although I am experiencing a problem with this (more below).
In the long term I want to implement a much more sophisticated version of this. Instead of 1-to-1 mappings for each parameter/descriptor, I’d use some fancy machine learning tools (by our own @tremblap) to map dozens of audio descriptors, and their changes over time within the samples, onto a dimensionally reduced self-organizing map. That way the clustering and organization on the BopPad (or whatever other controller) would be able to query the samples in a more sophisticated way.
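For a rough flavour of that idea (nothing to do with @tremblap’s actual tools; just the self-organizing map concept sketched with the off-the-shelf minisom Python package and fake data):

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

# Pretend each sample is summarized by a dozen descriptors
# (and stats of their changes over time) instead of just two.
rng = np.random.default_rng(0)
features = rng.normal(size=(3094, 12))

# Train a 10x10 self-organizing map: every sample lands on a grid
# cell, and similar-sounding samples land on nearby cells.
som = MiniSom(10, 10, input_len=12, sigma=1.5, learning_rate=0.5)
som.train_random(features, 5000)

# A controller position could then query whichever samples
# mapped to that neighbourhood of the grid.
print(som.winner(features[0]))  # e.g. (3, 7)
```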
But for now I just wanted to post the quick proof of concept idea.
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Now onto some more technical hardware things.
For some reason my BopPad is spitting out really shitty values for the Radius. Check it out:
I basically get values of [1, 32, 64, 95, 127] for around 95% of my hits, giving me effectively 5 “regions” for the radius, instead of 127 values.
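(If anyone wants to reproduce the test, something like this quick Python/mido script is all it takes to count the incoming values. The CC number is a placeholder; set it to whatever your BopPad preset sends radius on.)

```python
import mido  # pip install mido python-rtmidi
from collections import Counter

RADIUS_CC = 16  # placeholder: whichever CC your BopPad preset sends radius on

counts = Counter()
with mido.open_input() as port:  # default MIDI input
    for msg in port:
        if msg.type == 'control_change' and msg.control == RADIUS_CC:
            counts[msg.value] += 1
            print(dict(counts))  # watch how few distinct values show up
```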
I wrote KMI a while back asking if it would be possible to get 14-bit MIDI output from the BopPad, as even 127 values would be fairly shitty resolution for what I’m trying to do, but 5 steps is…um, pretty bad…
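(For reference, 14-bit MIDI is just a pair of 7-bit CCs, with the fine value sent on control number + 32, which would give 16384 steps instead of 128. Combining the pair is one line:)

```python
def combine_14bit(msb, lsb):
    """Combine a 7-bit MSB/LSB CC pair (e.g. CC16 + CC48)
    into a single 14-bit value in the range 0..16383."""
    return (msb << 7) | lsb

print(combine_14bit(95, 64))  # 12224 out of 16383
```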
That led me to the Mandala Drum, which I had seen before but wasn’t too excited by. They tout it as an “HD position sensitive” instrument (which just means 7-bit MIDI), but I was curious to see if anyone had any experience with them, and if so, how they felt, how smooth the value changes were, etc.
Lastly, I was curious if anyone knew of any other drum pads that offered position/velocity information? Or even DIY projects.
I’d absolutely love something that did 16-bit output over OSC, for example…