@zebra - First, let me make clear that I respect the amazing engineering, and appreciate the “launch and iterate” approach. Your interest in the area, sparking this thread, and your acknowledgment that it has been on your mind make me like norns even more. If you’re going to play a unique instrument, it helps to know where your luthier is going!
Given the engineering trade-offs involved, Lua does seem like an understandable choice. Since the language has only minimal abstraction mechanisms and an unfortunate propensity for globals, it is easy enough to build a plate of spaghetti. But, with careful thought, we should be able to assemble something more layered (lasagna!).
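To make “lasagna” a little more concrete: the usual Lua antidote to accidental globals is to keep all state `local` and expose one table per layer. A minimal sketch (the `Filter` name and fields here are hypothetical, not norns API):

```lua
-- one "layer" of the lasagna: a local table with metatable-based methods.
-- nothing here leaks into the global table _G.
local Filter = {}
Filter.__index = Filter

function Filter.new(cutoff)
  -- all state lives in the instance, not in globals
  return setmetatable({ cutoff = cutoff }, Filter)
end

function Filter:set_cutoff(hz)
  self.cutoff = hz
end

-- usage
local f = Filter.new(1200)
f:set_cutoff(800)
```

In a real script each layer would live in its own file and be pulled in with `require`, so dependencies between layers stay explicit.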
But enough of Lua - it is what it is…
Indeed, @jasonw22, composition of control has been a vexing problem in music as well as in general programming.
Of the music systems I’ve used and written over the last 40 years, most have completely given up when it comes to control. Think of all the power of ChucK… and it gives you three-byte MIDI messages for control. Max and Pd are much the same. The only system that comes to mind that has done anything more ambitious is Kyma.
Outside of music, the most applicable work I’ve seen comes from the area of functional reactive programming (FRP), where event streams can be combined in useful ways that you can reason about.
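As a taste of what that can look like, here is a toy event-stream sketch in Lua - entirely hypothetical, not any real FRP library or norns API - with `map` (transform each event) and `merge` (take the union of two streams):

```lua
-- a miniature push-based event stream: subscribers plus two combinators.
local Stream = {}
Stream.__index = Stream

function Stream.new()
  return setmetatable({ subs = {} }, Stream)
end

function Stream:subscribe(fn)
  table.insert(self.subs, fn)
end

function Stream:emit(v)
  for _, fn in ipairs(self.subs) do fn(v) end
end

-- map: a derived stream whose events are f(v)
function Stream:map(f)
  local out = Stream.new()
  self:subscribe(function(v) out:emit(f(v)) end)
  return out
end

-- merge: a derived stream carrying the union of both sources' events
function Stream.merge(a, b)
  local out = Stream.new()
  a:subscribe(function(v) out:emit(v) end)
  b:subscribe(function(v) out:emit(v) end)
  return out
end

-- usage: a knob scaled to Hz, merged with raw MIDI CC values
local knob, midi_cc = Stream.new(), Stream.new()
local cutoff = Stream.merge(knob:map(function(v) return v * 100 end), midi_cc)

local last
cutoff:subscribe(function(v) last = v end)
knob:emit(8)      -- last is now 800
midi_cc:emit(42)  -- last is now 42
```

The appeal is that each combinator has a small, local meaning, so a whole control graph built from them remains something you can reason about.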
I don’t have specific ideas for norns yet. But here are some things I think about:
- The examples in my earlier comment were chosen carefully to demonstrate several kinds of control composition: Union, Wrapping, Layering, Multiplexing, and Hierarchy.
- There was mention of a parameter system: when the controls connected to parameters are composed, how do the parameters themselves compose?
- How do engine parameters fit in?
- The user will need to map at some “outer” layer: one user has a keyboard controller with faders and buttons to map; another has separate keyboard and knob boxes. How does this layer of mapping differ from other control compositions?
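To pick one of those kinds of composition, multiplexing - one physical encoder serving several parameters via a page button - might look like this toy sketch (all names hypothetical, not norns API):

```lua
-- route one encoder to whichever parameter the current "page" selects.
local Mux = {}
Mux.__index = Mux

-- targets: an ordered list of parameter-like tables with a .value field
function Mux.new(targets)
  return setmetatable({ targets = targets, page = 1 }, Mux)
end

-- cycle to the next target (wraps around)
function Mux:next_page()
  self.page = self.page % #self.targets + 1
end

-- apply an encoder delta to the currently selected target
function Mux:turn(delta)
  local t = self.targets[self.page]
  t.value = t.value + delta
end

-- usage: one encoder, two parameters
local cutoff = { value = 1000 }
local res    = { value = 20 }
local enc = Mux.new({ cutoff, res })
enc:turn(50)    -- cutoff.value is now 1050
enc:next_page()
enc:turn(1)     -- res.value is now 21
```

The interesting design question is whether this outer mapping layer is just another composition like the others, or something qualitatively different because it is owned by the user rather than the script.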