Here’s my attempt to create a semi-generative piano system in Ableton Live, built mostly from its stock MIDI effects plus a few Max for Live and audio devices.
I tried to achieve a kind of “call and response” feeling between human and machine: to turn the performance into a dialogue in which the machine reacts to what the human is playing, and the human, in turn, reacts to the machine’s response. It was also important to be able to easily morph the patch from subtle echoes to a chaotic avalanche of notes.
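The patch itself lives in Ableton devices, but the core “echo that morphs into an avalanche” idea can be sketched in plain Python. Everything below is a hypothetical illustration, not part of the actual patch: the function names, the single `density` knob, and the fixed transposition are all assumptions made for the sketch.

```python
import random

def respond(notes, density=0.3, transpose=7, delay=0.25, rng=None):
    """Toy 'machine response' to a human phrase (illustration only).

    notes:     list of (time, pitch) pairs played by the human.
    density:   0.0 = subtle single echo per note,
               1.0 = a dense avalanche of several echoes per note.
    transpose: semitone offset applied to echoed pitches.
    delay:     seconds between a note and each successive echo.
    """
    rng = rng or random.Random(0)  # seeded so the sketch is reproducible
    out = []
    for t, pitch in notes:
        # Always answer with at least one echo; density adds extras,
        # each of which fires only probabilistically.
        echoes = 1 + int(density * 4)
        for i in range(1, echoes + 1):
            if i == 1 or rng.random() < density:
                out.append((t + i * delay, pitch + transpose))
    return sorted(out)

# At density 0.0 a single note gets one quiet echo a fifth up;
# at density 1.0 the same note triggers a burst of five.
print(respond([(0.0, 60)], density=0.0))  # → [(0.25, 67)]
print(len(respond([(0.0, 60)], density=1.0)))  # → 5
```

Mapping one macro to a single density-like parameter is what makes the subtle-to-chaotic morph playable with one hand while the other keeps performing.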
It’s a work in progress, actually. The patch is still a bit ungovernable and has a machine-like feel in the wrong way. I’m not 100% comfortable playing with it yet, so I’m going to develop it further, but here’s what I have at the moment.
I’d be really, really glad to hear any feedback from you, you guys are such a beautiful community!