Supercollider tips, Q/A


For sysrt, you need to give index = 10; that's the 3rd argument.

However, it's probably easier if you just use the convenience method:

MIDIdef.start(key, func, srcID, dispatcher)
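For example, a sketch of both approaches (based on the signatures mentioned above; the setup lines assume a connected MIDI device):

```supercollider
// Both of these react to a MIDI Start (0xFA) real-time message.
// Index 10 = 0xFA - 0xF0, passed as the 3rd argument of MIDIdef.sysrt.
MIDIClient.init;
MIDIIn.connectAll;

MIDIdef.sysrt(\viaSysrt, { |src, index| "MIDI start (sysrt)".postln }, 10);

// the convenience method, as above
MIDIdef.start(\viaStart, { |src| "MIDI start".postln });
```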


Got a reply from MH:

We try to run SuperCollider classes pretty regularly throughout the year. There’ll be a single class with Shelly Knotts on Sept 6th (will be announced in the next couple of weeks).
As for another course of three workshops with Joanne, I’ve not spoken to her about this but they’ve gone very well so far and there seems to be a lot of demand, so I expect we’ll run a similar thing again if she’s up for it!


Thanks for pointing it out! Can't believe I simply misplaced the arguments, and yes, MIDIdef.start is more convenient, as you said.


cool, did you get it working?

I've limited experience with SC, so any chance to dive in and improve my understanding is good for me.

At the time of your post I was playing around with SC on a Bela Salt, trying 'live coding' on a modular… so it was a good opportunity to look at MIDI in that context.

Gotta say, the SC IDE on a laptop, remoted into an SC eurorack module, is so much fun.
I think it's the mixing of the physical and virtual worlds.
( oh, and I've always liked programming LEDs to flash :wink: )


wow, JITLib combined with modular is starting to blow my mind…
( that said, the SC docs are great; just follow through the concepts tutorials and it's pretty clear, though there's quite a bit to take in )

It seems to make SC what I hoped it would be: the missing piece of the jigsaw.
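For reference, here's the kind of minimal ProxySpace sketch I mean, along the lines of the concepts tutorials: the sound keeps running while you redefine it.

```supercollider
// Boot the server and push a ProxySpace so ~names become node proxies.
p = ProxySpace.push(s.boot);

~out = {[220, 222]) * 0.1 };
~out.play;

// Redefine on the fly; JITLib crossfades to the new definition.
~out.fadeTime = 2;
~out = {*, 0.3)) };
```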

I'm also now going through @theseanco's awesome articles on using this for live coding. Thank you @theseanco, it's a great resource, helping bridge the concepts to practice.

Are there any other good resources for JITLib and using it for live coding?

I think once my head stops expanding/exploding, I might do a video to introduce others to its wonders.


how are you attaching a screen to the salt? doesn’t look like there are any front panel connections for video out?


I’m not using a screen, I’m using a network connection and a laptop :slightly_smiling_face:

Bela is quite nice: you can either plug a WiFi card in, or use USB networking, so you just attach a USB cable to your laptop and you're done.

The network connection also means you have access to the Bela web server.

So I'm using Salt in two ways: stand-alone, where I can switch between pre-made 'patches' (Pd or SC), or connected, like here, for live coding.

It's nice to use a laptop as it means Salt's CPU isn't 'wasted' on a UI. Also, as sclang is local, you're able to interact with anything on the laptop, so it's kind of like a CV bridge.

Another thing I'm thinking of trying is Norns:
running matron on Norns/rPi and crone/SC on the Bela Salt :slight_smile:
( I could run the whole of Norns on Salt, with a Push 2 as a display, but this splitting of roles interests me a bit more :))


Has anyone managed to sync SuperCollider to a CV clock?


Interesting question…

For Bela Salt, its trig inputs are kr or ar UGens, so you can easily use them to schedule things.
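For example, a sketch of triggering something from a trig input, server-side (this assumes Bela's DigitalIn UGen; the pin number is a placeholder for however Salt maps its trig inputs):

```supercollider
(
{
    var trig, env;
    trig =;     // trig input as an audio-rate gate (pin is a guess)
    // Env.perc is non-sustaining, so each rising edge retriggers it.
    env =, 0.2),, doneAction: 0);
    ( * env * 0.2) ! 2
}.play;
)
```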
However, if you wanted to be compatible with the SuperCollider clock system, I guess you'd have to subclass Clock?
I noticed there is a 3rd-party MIDI clock implementation (MIDISyncClock); this looks like it would be pretty simple to change to use a CV/trigger input instead.

is there an easier route?

(I might have a go, once I've determined whether the clock interface is a requirement, or whether it's just as easy to do the triggering directly. I'm thinking a clock might be a requirement for Pdef - correct?)

It would be trivial, of course, to use Bela Salt as a CV clock master: just attach it as a clock sink in SC, and have it send triggers on its trig outputs.
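A sketch of that clock-master direction (the output bus number is a placeholder for a Salt trig-out channel; check your channel mapping):

```supercollider
(
// A short 5 ms gate, retriggered by a t_ control.
x = { |t_trig = 0|, 0.001, 0.005), t_trig))
}.play;

// TempoClock fires the gate once per beat and reschedules itself.
TempoClock.default.sched(0, {
    x.set(\t_trig, 1);  // fire a pulse on every beat
    1                   // reschedule one beat later
});
)
```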

(I think for now that's the route I'll go, just for simplicity… but I can definitely see the advantage of taking a CV clock as input.)

Note: I'm saying Bela Salt, but I'm guessing this would be similar for anything you're running that interfaces with a eurorack system. As far as I can see, the question is how you hook the CV input (or whatever!) into the SC clock system.


is DigitalIn part of a bela-specific library? it's not showing up in my documentation :confused:

No, I'm not using Bela, although I have been watching that space very closely, with keen interest. At the moment I'm running SuperCollider on my laptop -> ES-3. However, there's an ES-8 in the mail and I want to be able to use one of its audio inputs to slave SuperCollider's TempoClock.


Yeah, DigitalIn is a Bela-specific UGen.

For the ES-8, you'll just use an audio-in UGen (, but it's the same idea:
you'll then be able to trigger things manually if you want.
But the question becomes: do you need it to be a clock? If so, I'm guessing you have to follow the route described above.

As I'm pretty new to SC, I guess the thing I haven't worked out is how important it is for other SC functionality to be tied to a Clock (esp. the new JITLib stuff I'm looking at). Is it a necessity or a nice-to-have?

(Hmm, actually I also need to look at/think about which bits of this puzzle are client-side (sclang) and which are server-side (scsynth)… for my stuff, so far everything is server-side.)

btw: I think if I did a CVSync for Salt, it'd only be a one-line change to switch it from a digital input to an audio input… so I'd add that too; it's easy for me to test.
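To illustrate the one-line swap (pin/channel numbers and the threshold are placeholders):

```supercollider
(
{
    var clk, trig;
    clk =;                 // Bela Salt: digital trig input
    // clk =, 0) > 0.1;      // ES-8: the one-line change
    trig =, SampleDur.ir);    // one-sample pulse per rising edge
    // use `trig` however you like; here it pings a resonator, 880, 0.2) * 0.3
}.play;
)
```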

(A quick Google search didn't bring up any hits on doing this… just some on sending clock out, which as I mentioned is pretty simple.)


on the teletype, evaluating the code:


will assign T the number of milliseconds between evaluations. This is useful because you can then multiply it, subdivide it, etc.

The TempoClock provides tempo-based functionality in SuperCollider, with built-in integration with patterns and ProxySpace. I'm also just reading @theseanco's guide to live coding, and trying to move to a more uninterrupted, JITLib-style approach. I want to be able to slave SuperCollider to MIDI and CV to make the laptop-modular setup more immediate and collaboration-ready.

I think it should be possible to build a CV-slaved clock, but I'd need to dive into TempoClock and tinker around a bit, maybe make a subclass.
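One possible sketch, without subclassing: measure the edge-to-edge time of the incoming clock on an audio input and set TempoClock's tempo from it (the input channel and threshold are assumptions):

```supercollider
(
x = {
    var clk =, 0) > 0.1;       // external clock as a gate
    var trig =, SampleDur.ir);   // one-sample edge pulse
    // Report the edge time (seconds since synth start) back to sclang., 0, );;
}.play;

~last = nil;
OSCdef(\cvClock, { |msg|
    var t = msg[3];   // the Sweep value we sent
    if(~last.notNil and: { (t - ~last) > 0.01 }) {
        TempoClock.default.tempo = 1 / (t - ~last);
    };
    ~last = t;
}, '/tr');
)
```

This nudges the tempo on every pulse rather than phase-locking, so patterns stay roughly in step but aren't sample-accurate to the CV clock.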

I believe the dev team are trying to implement an Ableton Link subclass of TempoClock for 3.10, which would be super handy tbh.


Cool, post here as you make progress.
I'd love to hear more, as it all sounds very relevant to the direction I'm heading too.


Don’t want to rain on anyone’s parade but I went to one of Shelly’s workshops a while ago and felt like I came out knowing less than when I went in. I’ve been to half a dozen or so things at Music Hackspace and they’ve been hit and miss, but that one was really awful. Can’t speak for Joanne, obviously, but thought it’d be worth mentioning in case people get their hopes up! I’ve found Eli Fieldsteel’s videos a lot more in-depth and informative, with the added bonus that you can take them on at your own pace, pause where necessary to play around with little snippets of code, etc etc.


Ok, got this to work today…

It was very simple:

Build crone on Bela Salt (as we only want the SC bit), which means very few dependencies in the build.

edit -> norns.local

then on norns, edit
change : -> bela.local

Voila! Norns (matron) connects to and controls the SC instance (crone) on Salt in the eurorack.

It worked, but seemed to have a bit of latency, and also under more load I started seeing "node not found" errors…
I think I just need to tweak it a bit, perhaps:

  • WiFi might not be great;
    try replacing it with a USB network.
  • Crone is using a lot of CPU (it was only reporting 40-60%);
    I think this is because I was using a 64-sample buffer, rather than the 256 on Norns…
  • Might need to adjust the latency parameter in SC, to reflect the network latency?!
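On the last point, the server message latency can be raised from sclang; the value here is just an assumed starting point to cover the network round-trip:

```supercollider
// Schedule OSC bundles further ahead so they still arrive in time
// despite the network delay between matron and the remote scsynth.
s.latency = 0.2;  // default is 0.2 on some setups; tune to taste
```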

The next interesting question would be how to integrate CV in/out into Norns engines :slight_smile:


I have this recollection from when I first learned SuperCollider that there was a way to ‘test’ a UGen, whereby it would only play it for a few seconds before stopping.

E.g. instead of

{ SinOsc.ar(440, 0, 1) }.play;  // plays indefinitely

you could do something like:

{ SinOsc.ar(440, 0, 1) }.testSound;  // plays for 5 seconds, then frees

I’ve tried a lot of Googling (several times), but I can’t find anything. Have I just imagined this?


I don't know of anything like this. You can do

Routine({ x = { SinOsc.ar(440, 0, 0.1) }.play; 2.wait; x.release }).play;

but that’s hardly as convenient.


Finally, after a few months of trying to remember, I have it :man_facepalming:t5:

It was from Overtone rather than SuperCollider proper.

user=> (doc demo)
([& body])
  Listen to an anonymous synth definition for a fixed period of time.
  Useful for experimentation.  If the root node is not an out ugen, then
  it will add one automatically.  You can specify a timeout in seconds
  as the first argument otherwise it defaults to *demo-time* ms. See
  #'run for a version of demo that does not add an out ugen.

  (demo (sin-osc 440))      ;=> plays a sine wave for *demo-time* ms
  (demo 0.5 (sin-osc 440))  ;=> plays a sine wave for half a second


(demo (sin-osc))

It's a useful function. It would be nice to have in sclang.

~demo = { |ugen, rate ...args|
    x = { perform(ugen, rate, *args) }.play;
    SystemClock.sched(5, { }); // free after a fixed demo time
};
~demo.(SinOsc, \ar, 440, 0, 0.1);
~demo.(Saw, \ar, 440, 0.1);

You could of course wrap this into a class so that it’s always available…
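For instance, a hypothetical class sketch (the name Demo is made up; the file would need to live in your Extensions folder, followed by a class-library recompile):

```supercollider
// Demo.sc - a made-up class wrapping the ~demo idea above.
Demo {
    *ar { |ugen, dur = 5 ...args|
        // build and play the UGen at audio rate, e.g. SinOsc.ar(440, 0, 0.1)
        var synth = { ugen.performList(\ar, args) }.play;
        SystemClock.sched(dur, { });  // free after dur seconds
        ^synth
    }
}

// usage: Demo.ar(SinOsc, 5, 440, 0, 0.1);
```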


That’s really cool! Is there a particular reason why one wouldn’t do this:

~demo = { |function, waitTime = 1|
    var def = function.asSynthDef.add;
    fork {
        0.01.wait; // apparently necessary to give the server time to know the SynthDef
        x = Synth(;
        waitTime.wait;;
    };
};


as that seems even more comfortable? I'm still very new to SuperCollider, so I might just be missing something.