Historical note: Functional Reactive Programming was pioneered and described by Conal Elliott and Paul Hudak in the 90s, long before the current hype. Interestingly, their domain was interactive animation.

The essence of FRP is that it is a cohesive way of combining both continuous and event-based streams over time, in a way that can be reasoned about formally. It enables you to express what you want your model (instrument) to do, rather than only being able to code the details of how.

Also note, Elm isn’t FRP and doesn’t claim to be. (Aside, I love Elm!) Most (all?) of the hip JavaScript stuff isn’t FRP, either: Attaching event handlers does not make something FRP, despite some appropriating the term “reactive”.


While there is an interesting discussion forming here about interactive performance and music structure… my initial concerns are far more prosaic:

Here are some situations a framework for control composition should be able to address:

  • Independently, I have two voice engines in SC, and each has a Lua script that maps some controls to it. Combining the voices in SC is easy: a Mix UGen. How do I combine the Lua scripts? …So I can play both at once? …So I can play them in parallel? …So I can flip between them?

  • I have a MIDI looper in Lua: It has controls for its operations - and it reads other control input (notes & CCs) to record and play back. How can I place control effects after the sequencer output (perhaps an arpeggiator), or before its input (like some algorithmic melody generator, or perhaps a “corrector”)? These seem like common, approachable cases. Now what if I want to place a “conductor” script before the controls of the sequencer?

  • I have a complex effects chain built up. It is (of course) a single large engine, but chaining and mixing the parts up via UGens is clear and easy. Assuming that the first example has been tackled, there is now a nice combined mapping so that I have control over these effects from a giant MIDI knob box or two. So far, so good. Now I want to go on the road, and use a small MIDI knob box. How do I replace one with the other? What if the MIDI controller doesn’t support banking, so the banking needs to be done in Lua?

In each of these situations, it is clear that with some amount of editing of the original Lua scripts the combined or altered situation would be possible. The quest is to see how non-invasive it can be. Can those original scripts be written in such a way that, while being “natural” and working standalone, they are suitable for being combined - in the various ways - by another control script?
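
For example, here is a minimal Lua sketch, assuming (purely hypothetically) that each voice script exposes its mapping as a table of handlers rather than installing them globally. The handler names and engine commands are made up for illustration:

-- Each voice script returns its control mapping as a table of handlers.
-- (engine.trig_a / engine.trig_b are hypothetical engine commands.)
local voice_a = {
  note_on = function (num, vel) engine.trig_a(num, vel) end,
}
local voice_b = {
  note_on = function (num, vel) engine.trig_b(num, vel) end,
}

-- Combinator: send every event to both mappings ("play both at once").
local function both(a, b)
  return {
    note_on = function (...) a.note_on(...); b.note_on(...) end,
  }
end

-- Combinator: route events to one mapping at a time ("flip between them").
local function switch(a, b)
  local current = a
  return {
    flip    = function () current = (current == a) and b or a end,
    note_on = function (...) current.note_on(...) end,
  }
end

local combined = both(voice_a, voice_b)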


Tiny thought: I like the idiomatic way Python modules can both provide an API for other modules to use, and operate as standalone programs. See §29.4, __main__.
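
A rough Lua analogue is possible too. This is just a sketch; it relies on the fact that require() passes the module name to the chunk as its vararg, while running the file directly with no arguments passes nothing:

-- mymodule.lua (hypothetical)
local M = {}

function M.hello()
  print("hello from the API")
end

if ... == nil then
  -- Run directly (`lua mymodule.lua`, no arguments): act as a program.
  M.hello()
end

-- Loaded via require("mymodule"): act as a library.
return M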

7 Likes

Huh? Not anymore, but its creator certainly claimed Elm to be FRP when it was the new thing: http://elm-lang.org/assets/papers/concurrent-frp.pdf

I’d say your use cases are fairly well supported by sclang with its declarative Streams & Patterns classes and focus on metadata, despite it being an OOP language. The idiomatic approach to control in the norns world, though, is Lua. At this stage that revolves mostly around how rather than what. I’m all for declarative approaches, though, as I think they map well to music.

Btw - I’m still interested in hearing about Kyma.

1 Like

More fodder for the horse:

  • When talking about control surfaces, remember that they are often bidirectional: some encoders, pads, grids, and even keyboards display control-setting feedback. In the case of banked encoders or knobs, this is more than just output: It sets the point from which input will pick up.

  • Control might loosely fall into three categories: note controls (ex.: keyboards), parameter controls (ex.: knob for cutoff freq.), state controls (ex.: looper rec. arm). For the first, it is fairly easy to see how control routing might work. The other two get progressively more complex. And instrument control might mix these: Imagine a voice that takes notes and two knobs to control… and a second voice that takes notes and three knobs and a button. If I compose these, the options for notes are clear: split, or layer, or toggle between. But what happens to the controls? If my controller has eight knobs and buttons, can I have all controls for both voices active all the time?

  • The display is part of the control system. Two layered instruments vie for control of the display. Is there a separate whole-display toggle? From the examples given, display output is scripted as a global: “Position here, write this text.” - But that doesn’t give us the option of splitting the screen… though I don’t know if norns will need that level of complexity… Even without that, it will need a way to enable composed systems to manage the screen: In our layered-voice example, if I move a knob for voice B, I expect it to take over the display… twiddle voice A and it takes over. (A tiny sketch of this follows the list.)

  • There was mention of a parameter system… perhaps for a uniform patch saving concept. If so… that’ll need to be composed as well.
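
On that display point, here is a minimal Lua sketch of “last touched layer owns the screen” (all names hypothetical):

-- Each layer supplies its own redraw function; whichever layer last
-- received input owns the screen.
local focused = nil

local function make_layer(name, redraw)
  local layer = { name = name, redraw = redraw }
  layer.touched = function ()
    focused = layer     -- this layer takes over the display
    layer.redraw()
  end
  return layer
end

local va = make_layer("voice A", function () print("drawing A") end)
local vb = make_layer("voice B", function () print("drawing B") end)

vb.touched()  -- twiddle voice B: it takes over the display
va.touched()  -- twiddle voice A: it takes back over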

1 Like

Yes - the Elm of that paper is a distant relative of Elm, the web programming system. The latter retains a hint of FRP only in its top-level construction (essentially Event -> Model -> (Model, Action)), but offers no library or mechanisms for using FRP to build up that function.

I’m going to have to dig to find any specifics on Kyma: I was an alpha tester (had one of the very first Capybara units) and user back in the 90s!

@mzero are you at a point where you could write pseudocode or diagram an architecture yet? Or do we need to discuss goals in more detail, or gather some other kind of information first?

@jasonw22: I think I’d like to wait and see the norns source first. There are a number of systems whose architecture was only hinted at: parameters, interaction with the engine (via @zebra’s magic UGen?); and the controller processing pathway is still unseen. Also, getting something like this to work well depends deeply on doing the “right” integration with the language runtime and core libraries. I’ll need more experience with Lua and the code base to get a feel for direction.


One more complex, but very real scenario:

  • Imagine I’ve got two different grid applications running. The buttons at the top right switch between them. I’m playing on the first app, which perhaps lets me play looped samples as long as I hold a pad down. So I start to play a “chord” of samples with my left hand, holding down the pads… Now I use my right hand to switch to the other app (or page of the same app) and press pads to do things - all the while holding down the pads with my left hand. Now here’s the hard part: I release the pads with my left hand. Those pad up events have to go to the first app, not the second, even though the second app is now “in focus”.

Many controllers get this wrong. Take a keyboard controller that has octave-shift buttons: Hold down a chord, then shift octaves, then release the chord: Do the notes keep playing? I’m always amazed at how many fail this test! Novation is a notable exception: Every device of theirs I’ve ever used has paid very careful attention to this detail and gets it spot on. Korg does too. The Ableton remote script framework in Python also does this right - though if I remember correctly, it is a bit of a snarly mess inside.
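
Here is a minimal Lua sketch of grid routing that gets this right (names hypothetical; assume two app tables, each with a key(x, y, z) handler). Each key-up is sent to whichever app received the matching key-down, regardless of which app has focus now:

local app_a = { key = function (x, y, z) print("A", x, y, z) end }
local app_b = { key = function (x, y, z) print("B", x, y, z) end }

local focused = app_a   -- switched by the top-right buttons
local owner = {}        -- owner[x .. "," .. y] = app that saw the press

local function grid_key(x, y, z)
  local id = x .. "," .. y
  if z == 1 then
    owner[id] = focused            -- remember who got the key-down
    focused.key(x, y, 1)
  else
    local app = owner[id] or focused
    owner[id] = nil
    app.key(x, y, 0)               -- key-up goes back to the original app
  end
end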

6 Likes

(sorry to post so much… hope this is all still interesting!)

A quick look at SuperCollider’s MIDIFunc provides a good example of an API with the wrong affordance, which makes composition of control hard:

An example of its use (from SC’s help):

notes = Array.newClear(128);    // array has one slot per possible MIDI note

on = MIDIFunc.noteOn({ |veloc, num, chan, src|
    notes[num] = Synth(\default, [\freq, num.midicps,
        \amp, veloc * 0.00315]);
});

off = MIDIFunc.noteOff({ |veloc, num, chan, src|
    notes[num].release;
});

Notice that the user of MIDIFunc has to do the matching of noteOff to noteOn. To compose with this interface (imagine we have two such users), the composition has to do a lot of work, and somewhat invasively.

Had the API been defined to be used like this, it would have been much easier:

on = MIDIFunc.note({ |velocOn, num, chan, src|
    var note = Synth(\default, [\freq, num.midicps,
        \amp, velocOn * 0.00315]);

    // and return the function to call when the note is off
    { |velocOff| note.release; }
});

This is less code for the client, removes the client’s reliance on a global(-ish) array, and is much easier to compose.
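
For a sense of how that composes, here is a hypothetical Lua rendering of the same affordance: a note-on handler returns the function to call on the matching note-off, so a combinator can fan one note stream out to two clients with no bookkeeping beyond the pairing itself.

-- a and b are note-on handlers that each return a note-off continuation.
local function combine(a, b)
  return function (num, vel)
    local off_a = a(num, vel)
    local off_b = b(num, vel)
    return function (vel_off)
      off_a(vel_off)
      off_b(vel_off)
    end
  end
end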

2 Likes

This is a really common problem in javascript (e.g. handling a mouseup event whose associated mousedown event is attached to an object that has been transitioned out of view).

When I get home tonight, I’ll study the Lua event model and try to write some sample code that demonstrates how this would work.

I’m going to ignore MIDI and Supercollider for now and focus on the Lua event model with special attention to object hierarchy and association.

4 Likes

For SC you might wanna check out Modality (http://www.3dmin.org/wp-content/uploads/2014/03/Baalman_2014.pdf) and FPLib (referenced in the Modality PDF and iirc maintained here: https://github.com/miguel-negrao/FPLib/blob/master/README.md)
(FRP hype alert - haven’t used any of these libs myself)

I made the grid-based SuperCollider UI lib Grrr (http://github.com/antonhornquist/Grrr-sc), which handles widgets on grids the way the regular SC GUI framework handles its widgets. Composable? To an extent - not its main focus, but it was conceived with multiple-modes-per-grid key up/down tracking in mind.

After dabbling with C#/WPF at work I also toyed with an MVVM framework in SuperCollider, potentially applicable to both SC GUI and Grrr: https://github.com/antonhornquist/DeclarativeViewBuilder-sc

The last one is at the crude-hack stage. I lost motivation after realizing the number of anonymous functions that had to be spawned and got worried about performance.

This is just for inspiration. Again: idiomatic norns control is Lua. I’ve not had time to port any of my SC control libs to Lua since getting involved in the project. And, frankly, I’m not sure it’s worth it. :slight_smile:

3 Likes

Also, while I thoroughly enjoy reading ideas on pure FRP, I’ve yet to see any of it in practical use.

Though, again, the reactive programming inherent in dataflow languages is IMO a good example of mapping asynchronous data flows in a manner very much like FRP.

1 Like

in my day we called it “event-based programming”.

I’m being cheeky, but I guess I don’t see a huge difference between what I see being described as “reactive” and what we always had to do in JavaScript (when we were writing JavaScript well). Maybe someone can enlighten me.

3 Likes

If I understand FRP correctly, in practice it means operating on continuous or discrete streams of events (immutable data - values rather than mutable objects in the OOP sense) with conventional functional programming mechanisms: select / map / etc. There’s more to it than this of course, but that is part of it.
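
A toy Lua sketch of that part (names hypothetical; not a real FRP library, just map/filter over an event stream):

-- A minimal event stream with map and filter.
local Stream = {}
Stream.__index = Stream

function Stream.new()
  return setmetatable({ sinks = {} }, Stream)
end

function Stream:emit(v)
  for _, f in ipairs(self.sinks) do f(v) end
end

function Stream:map(f)
  local out = Stream.new()
  table.insert(self.sinks, function (v) out:emit(f(v)) end)
  return out
end

function Stream:filter(p)
  local out = Stream.new()
  table.insert(self.sinks, function (v) if p(v) then out:emit(v) end end)
  return out
end

-- e.g. a transposed stream of note events, declared once as a graph:
local midi_in = Stream.new()
local notes   = midi_in:filter(function (ev) return ev.type == "note" end)
local up5     = notes:map(function (ev) return { type = "note", num = ev.num + 7 } end)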

I reckon there’s an overlap here with dataflow languages, in which messages are sent around in streams between boxes that combine, filter, and join streams.

I mean - how often in Pd do you attach a callback handler? It’s rather a spec of a graph to execute - as is FRP, to a certain extent.

1 Like

Cool, that makes the distinction clearer. That being said, I never found attaching a callback handler to be quite the unnatural thing that so many others appear to see it as. :man_shrugging:

The core of why FRP is not event handlers is in the Conal Elliott & Paul Hudak paper I referenced above, in the section “The Essence of Modeling”, in particular the part named “3. Declarative reactivity.” I know, the paper takes some time to unpack…

To expand: Attaching a callback handler gets complex as you compose meaning onto the events: The key handler has to “know” that in one state the Y key should insert the letter ‘y’, whereas in another state it should choose the yes option. As you build higher constructs (the grid is now a sequencer, now a tape strip, now a preset memory grid…) you keep having to push more into the handler… or worse, globally manage installing and removing handlers. FRP provides tools for safely and cleanly combining just behavior over events - and building up higher event streams. Sure, you could build a library of event handler combinator functions… but then you’re building FRP over your event handling system! :slight_smile:

1 Like

I’ll look into that tonight! I have a sneaking suspicion that I just got used to attaching callback handlers - and also that such a thing may happen again in the context of Lua for norns.

But all the more reason to read and absorb the paper.

I’m glad this was mentioned, handling event chains is particularly hellish in graphical environments such as Logic or Max/MSP. It was one reason I stopped working with Max primitives and started wrapping all of the interesting stuff in externals.

A fundamental composition problem, for me, is that one may apply a mapping f(x) to a note-on event (“x”), and then change the function to g(x) by the time the corresponding note-off arrives. The result is a stuck note, because f(x-“On”) should be canceled by f(x-“Off”), not g(x-“Off”).

This will occur basically any time the mapping changes dynamically. So the generic mapping f(x) first needs to be composed with a simple utility function that repeats the input, [x f(x)], and another function that stores the result in a table and then retrieves f(x) the next time it receives an x-“Off” message. (A sketch of this follows.)
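
In Lua, that utility might look something like this (a sketch; the event shape is made up):

-- Wrap a swappable mapping so the mapping active at note-on time also
-- handles the matching note-off, avoiding stuck notes.
local function latched(get_mapping)
  local held = {}                 -- held[note] = mapping that saw the note-on
  return function (ev)            -- ev = { type = "on"|"off", note = n, vel = v }
    if ev.type == "on" then
      local f = get_mapping()
      held[ev.note] = f
      return f(ev)
    else
      local f = held[ev.note] or get_mapping()
      held[ev.note] = nil
      return f(ev)
    end
  end
end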

[Note: This has been permanently broken in the Logic environment since day 1, up to and including Logic 9. I have no idea if it was fixed in Logic X. But the idea is, never change “transformations” on the fly or you will get stuck notes.]

A second issue is that the mapping itself, f(x), can at best only be pre-selected from a menu, and configured with parameters. To pre-select from a menu, the patch has to already have implementations of all of the possible mappings. There’s no hope in a live situation of someone making up a new mapping and simply passing it to a “composition engine” that defends against stuck notes (because nobody wants to think about, or is really capable of, that kind of “programming” in a live situation).

All this would have been so much easier with the idea of a patcher that takes not just data-events as inputs, but other patchers (with known I/O configurations), and contains therein meta-rules for a) hooking them up (the essence of “composition”) and b) routing incoming data events. Patchers would be dynamically fed to other patchers, as a special type of event – just like audio and Jitter buffers are special types of events. All patchers would thus have this basic capability, to be operators as well as functions. One would then specify the “actual function” and let everything else take care of itself.

This is why I gave up on Max, except as an externals wrapper, when I created a real-time transformation tool that also plays with the timings of events (simple things, like reversing/looping/double-timing buffers). My solution was still kludgy, but at least possible in C++ [and I got people annoyed on the Max forum for even using C++ and not doing an external in straight C, which it seems the community prefers…]

Anyway, I have zero experience with Lua, but its idea of “closures” seems to allow for at least some of this: functions can take other functions as input, and data to be routed into these functions is scoped appropriately.

Reference: https://www.lua.org/pil/6.1.html
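
For instance, a minimal sketch in the spirit of PIL §6.1:

-- A closure: the returned function captures `semitones` and can be
-- passed around, stored, or composed like any other value.
local function make_transpose(semitones)
  return function (note) return note + semitones end
end

local up_fifth = make_transpose(7)
print(up_fifth(60))  --> 67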

Of course, none of this is a new idea, and the FRP idea probably both contains this and goes way beyond it (which I need to review more; I merely skimmed without a deep understanding). But it may be a hint as to what is possible within Lua itself, to at least avoid the unmaintainable, kludgey horrors of getting the most basic stream processing to work in Max.

I wanted mostly to argue that the simple idea of closure is both badly needed and possible in graphical programming environments and would do wonders for their renewed use in “live coding” situations. I also suggested a way this could be done.

2 Likes

Three cheers for closures!

don’t you have to do that in FRP as well? a declared handler still has to know what to do. but if i understand correctly it makes it easy to declare new events, so instead of a general handler that has to maintain the overall state of UI you can have handlers be closer to the events they should be reacting to and knowing only the minimal scope they need to know. but you could implement it in a procedural language as well. but yeah, it’s nice to have the language itself provide some scaffolding (it feels like C# has been trying to employ some FRP techniques as well…)

1 Like

:slight_smile: this is fantastic!

it’s so cool (feeling grateful) to be looking in
on this professional level
programming theory discussion/seminar, please continue…

'the poetry of code

it makes me think of this (l.wittgenstein)
and this (reggio emilia poem)
everything is a language

computer programmers get to define/accept the definition(s) of a language
and can expect the computer to execute the code exactly the same way, every time
(it’s from brazil, is there a portuguese flavor of Lua?
is it slightly different in angola?)

spoken languages develop culturally over time
music shows happen in a cultural context
even in the same language (english, say)
america, england, new zealand
words/musical constructs, have slightly different meanings/connotations

music is a language
performance is a language
I’ll start another thread :slight_smile:

2 Likes

This seems relevant to our interests:

1 Like