Couldn’t you use Buses and change the values of the buses using MIDI and then use the Bus.index.asMap functionality to link the buses to the Pbind? That way you’d get continuous changes.

I’m a beginner at SC, and trying to figure BeatTrack out.
I want to track the bpm of an incoming signal in realtime. I’ve got this modified example code to work (a bit wonky, but still it works):

(
a = SynthDef(\help_beattrack, { |out, vol=0.0, beepvol=1.0, lock=0|
    var in, fft, resample;
    var trackb, trackh, trackq, tempo;
    var beeper;
    
    in = SoundIn.ar(0);
    fft = FFT(LocalBuf(1024), in);

    #trackb, trackh, trackq, tempo = BeatTrack.kr(fft, lock);

    beeper = SinOsc.ar(423, 1.0, Decay.kr(trackb, 0.15));

    Out.ar(out, Pan2.ar((vol * in) + (beepvol * beeper), 0.0))

}).play
)

Now, my question is: how do I use the bpm extracted from the input in a sequence, using Pbind for instance? trackb, trackh and trackq represent quarter notes, eighth notes and sixteenth notes respectively. But I don’t understand how to use their values in a sequence. Is there a way to simply convert trackb etc. to a float or integer, and then insert it into TempoClock?

Yes, the best (simplest?) way to continuously change parameters of running synths in response to incoming MIDI (or other) data is to change the value of a control bus in response to the data, and map (.asMap) the control bus to whatever parameter you want to modulate.

Using events you can only change the value on the evaluation of the event, not continuously, so even if you use a Pfunc it’s still only updated at the start of an event.
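For example, a minimal sketch of the bus-mapping idea (assumes the server is booted and MIDI is initialized; the CC number 1 is arbitrary):

    (
    ~bus = Bus.control(s, 1);
    ~bus.set(0.5);

    MIDIClient.init;
    MIDIIn.connectAll;

    // write incoming mod-wheel values to the bus, scaled to 0..1
    MIDIdef.cc(\modToBus, { |val| ~bus.set(val / 127) }, 1);

    // mapping the bus means each synth's amp control follows
    // the bus continuously for as long as the note sounds
    Pbind(
        \degree, Pwhite(0, 7, inf),
        \amp, ~bus.asMap,
        \dur, 0.25
    ).play;
    )
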

For this you’ll want to use SendTrig to pass a value from the server to the client.

I haven’t used BeatTrack in a long time, but I think this should work for what you want to do:

(

t = TempoClock(120/60);

a = SynthDef(\help_beattrack, { |out, vol=0.0, beepvol=1.0, lock=0|
    var in, fft, resample;
    var trackb, trackh, trackq, tempo;
    var beeper;
    
    in = SoundIn.ar(0);
    fft = FFT(LocalBuf(1024), in);

    #trackb, trackh, trackq, tempo = BeatTrack.kr(fft, lock);

    beeper = SinOsc.ar(423, 1.0, Decay.kr(trackb, 0.15));

    SendTrig.kr(trackb, 0, tempo);

    Out.ar(out, Pan2.ar((vol * in) + (beepvol * beeper), 0.0))

}).play;


o = OSCFunc({ arg msg, time;
    [time, msg].postln;
	t.tempo = msg[3];
},'/tr', s.addr);


)

o.free

t.tempo.postln // post current tempo

This will set the tempo of TempoClock ‘t’ to whatever the currently detected tempo is on every quarter note tick of ‘trackb’.

If you aren’t familiar with the client/server relationship (it caused me, and every other beginner I know, lots of grief), check out Client vs. Server reference in SCDocs.

[edit]

Oh also, I recommend using OSCdef for managing your OSC responders in the client.

OSCFunc is just there in the example because I cribbed the code from the SendTrig help file, but I’ve had orphan OSC responders lying around causing me grief in the past when using OSCFunc directly.
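The same responder written with OSCdef would look something like this (same ‘/tr’ path and message layout as the OSCFunc version; msg[3] is the value sent by SendTrig):

    (
    OSCdef(\trackTempo, { |msg, time|
        [time, msg].postln;
        t.tempo = msg[3];
    }, '/tr', s.addr);
    )

    OSCdef(\trackTempo).free; // clean up by name, no orphan responders

The advantage is that re-evaluating the OSCdef replaces the old responder instead of stacking a new one on top of it.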

Thanks a lot, this was super helpful!
I got it working as I wanted it with a pbind-sequence by changing your code to:

t = TempoClock.default.tempo = 120/60;

I’m still new to both programming in general and supercollider in particular, and I’m sure there can be better ways than to change the default tempo, but hey it works!
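One alternative to overwriting the default clock: keep t as its own TempoClock and hand it to the pattern’s play call, so the tempo tracking only affects that pattern. A small sketch:

    (
    t = TempoClock(120/60);

    Pbind(
        \degree, Pseq((0..7), inf),
        \dur, 1
    ).play(t); // this pattern follows t; other patterns keep their own clock
    )
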

Installed SuperCollider on my Chromebook (Samsung Chromebook Plus V1, with ARM processor). The application installed fine through the CLI, but I can’t seem to get the server running. I’ve tried a few things from googling but have had no luck so far. Curious if anyone with a more thorough understanding of the Linux implementation in ChromeOS and SuperCollider’s operating requirements could help me figure out if this is possible!

hey SC users – sorry if this is a bit off topic, but some of us SC devs have put together a SuperCollider user survey:

would be appreciated if you could fill it out to help us decide on the best directions of development :slight_smile:

(let me know if this isn’t the right place to post and/or overall inappropriate for this forum)


SC runs fine on my chromebook, running Ubuntu 14.04, and I don’t remember any server hiccups in installation… What distro are you running? If Ubuntu, maybe this is helpful:

I’ve been trying to run it in ChromeOS. I’m not sure what the limitations are on Linux applications in ChromeOS at this point, but I would imagine it has something to do with that or the ARM processor.

Which Chromebook do you have? Crouton or Crostini? Built-in audio?

I’m Chromebook shopping but I’m a bit hesitant to commit as it seems that Crostini has not yet implemented the AC97 codec required for onboard audio. This might be a non-issue with external USB audio, though.

Acer c720 with crouton. I use either the 1/8" headphone output or an external USB audio box like a focusrite if that’s what you mean.

How can I use the envelope of my synth to control a transform action (RTT, for example) in the Ambisonic Toolkit?
For example:

SynthDef.new(\waveGenerator, {

	| out = 0, t_trig = 0, attack = 3, decay = 2, amp = 0.8, freqfactor = 0.75, doneAction = 2 |
	var sig;

	~waveEnv = EnvGen.ar(Env.perc(attack, decay), t_trig, doneAction: doneAction);

	// Mix pink and brown noise for wave signal
	sig = (PinkNoise.ar(1) * freqfactor) + (BrownNoise.ar(EnvGen.kr(Env.new([0.2, 1, 0], [attack, decay], [1, -1]))) * (1 - freqfactor));

	// Apply envelope
	sig = amp * sig * ~waveEnv;
	sig = sig * 0.3;

	Out.ar(out, sig);

}).add;

I tried to pass ~waveEnv to the transform action (FoaRTT(sig, ~waveEnv, 0, 0)), which didn’t work out.
You can find my code repository on GitHub.
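A note on the ~waveEnv problem: inside a SynthDef function, ~waveEnv is an environment variable that only exists while the def is being built, so it can’t be passed to a transform afterwards. One possible approach is to keep the envelope in a local var and use it as the transform angle inside the same SynthDef. A rough, untested sketch, assuming the ATK classes FoaEncode, FoaRTT and FoaDecode with a stereo decode:

    (
    SynthDef(\waveRTT, { |out = 0, t_trig = 0, attack = 3, decay = 2|
        var env, sig, foa;

        // local var instead of ~waveEnv
        env = EnvGen.ar(Env.perc(attack, decay), t_trig);

        sig = PinkNoise.ar(0.3) * env;

        // mono -> B-format, then let the envelope drive the rotate angle
        foa = FoaEncode.ar(sig, FoaEncoderMatrix.newOmni);
        foa = FoaRTT.ar(foa, env * pi, 0, 0);

        Out.ar(out, FoaDecode.ar(foa, FoaDecoderMatrix.newStereo));
    }).add;
    )

The encoder/decoder choices here are placeholders; the point is only that the envelope is an ordinary UGen signal within the def, so it can feed both the amplitude and the transform angle.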

Is there a supercollider speech/voice synthesis code out there anywhere? (ideally text to speech)

I did some searching, but it seems my keywords (supercollider, voice, speech synthesis) are too generic.

In sc3-plugins, do a search for “LPC” (that’s Linear Predictive Coding). That’s the foundation of a lot of speech synthesis.

Also in sc3-plugins: FormantTable. “Returns a set of frequencies+resonances+amplitudes for a set of 5 bandpass filters, useful for emulating the main 5 formants of a vocal tract.”
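As a starting point without plugins, a crude vowel can be faked with plain bandpass filters; the formant frequencies below are rough ballpark values for an “ah”, not taken from FormantTable:

    (
    {
        var src = Impulse.ar(110);          // crude glottal pulse source
        var freqs = [700, 1220, 2600];      // rough F1-F3 for "ah"
        var amps  = [1, 0.5, 0.25];
        var bws   = [130, 70, 160];         // bandwidths in Hz
        // BPF takes rq = bandwidth / centre frequency
        Mix(BPF.ar(src, freqs, bws / freqs, amps)) * 8 ! 2
    }.play;
    )

Swapping in the frequency/bandwidth/amplitude sets from FormantTable should give more convincing vowels.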


Lately I’ve been trying to learn the MIDI loopback technique on the Digitakt. There are a few circumstances that cause the device to freeze and panic. I used MIDI Monitor on Mac and found that it’s caused by certain MIDI messages. For convenience, I want a Raspberry Pi to do the task as a MIDI filter/blocker.
First I want it to filter out a Control Change message: CC 120 on MIDI channels 1 to 15. I have read the MIDI guides, but I still can’t find a concrete example for this task.

MIDIdef.cc(\ccFiltering, { |val, num, chan|
    // which function to block/filter this CC message?
}, 120, (0..15));

Do you guys know which func I should use?

Thank you.
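For what it’s worth, a sketch of the pass-through idea: receive everything, and only forward what you don’t want blocked. Untested; the MIDIOut port index and the set of message types you actually need to forward are assumptions, and note that routing through SC adds some latency:

    (
    MIDIClient.init;
    MIDIIn.connectAll;
    ~out = MIDIOut(0); // pick the port that goes back to the Digitakt

    // forward every CC except CC 120
    MIDIdef.cc(\ccForward, { |val, num, chan|
        if(num != 120) { ~out.control(chan, num, val) };
    });

    // notes pass through untouched
    MIDIdef.noteOn(\noteOnFwd, { |vel, num, chan| ~out.noteOn(chan, num, vel) });
    MIDIdef.noteOff(\noteOffFwd, { |vel, num, chan| ~out.noteOff(chan, num, vel) });
    )
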

I just ran this on my computer and it’s incredible. Really encouraging, as I’ve barely started learning SuperCollider, and this makes me excited for what’s possible once I’ve put in the effort!


I’m not sure where this reply belongs, feel free to moderate it however is needed.

Now I’m getting to grips with Lua and have got my drum sounds nailed in SC (when I say “nailed”, they are ready for MVP testing :wink: ) I’m very keen to get SC engines built. I’ve replicated the simplest engines but my own engines cause Norns to throw an error after I reset, but I can’t figure out what I’ve done wrong. Is anyone aware of any super basic SC to norns tutorials? Or a way to get feedback on what exactly has gone wrong? Cheers all - the journey continues :slight_smile:

A general query: I’m primarily interested in learning SC (having no prior coding experience) for sending data to an external MIDI device (or softsynth). So far, I’ve been copying and rewriting others’ code and teaching myself that way, along with sundry video tutorials and the ever-present help documentation.

For MIDI, lots of people suggested Max to me, though I’m not a so-called “visual person” and already suffer from constant moderate-to-severe tendonitis and RSI. So, I figured mouse-dragging ‘cables’ to objects and the like on a repeated basis would probably not be in my best interest. Also SC is so powerful and generally seems more amenable to my compositional dispositions.

What other methods and materials would one suggest for one teaching themselves SC for primarily MIDI implementation? Are there any expansive didactic materials (with examples) that focus primarily on MIDIOut methods?

not sure what you’re looking for, MIDI output is really simple… you push bytes to the device and that’s it. the MIDIOut helpfile and Pattern Guide 08: Event Types and Parameters pretty much cover everything there is to know. (pattern event types include MIDI stuff, for convenience.)

everything else about patterns and event generation in SC applies equally well to MIDI events as to internal synth commands.
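A minimal sketch of driving a MIDI device from a pattern with the \midi event type (assumes the first MIDI destination is the one you want; on Linux the port may also need connecting):

    (
    MIDIClient.init;
    m = MIDIOut(0);

    Pbind(
        \type, \midi,        // send MIDI instead of playing a synth
        \midiout, m,
        \chan, 0,
        \midinote, Pseq([60, 64, 67], inf),
        \amp, 0.8,           // converted to velocity by the event
        \dur, 0.5
    ).play;
    )

All the usual pattern machinery (Pseq, Pwhite, Pbindf, etc.) works unchanged; only the event type differs.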


relative to dealing mostly with e.g. audio synthesis? I’m mostly drawn to MIDI insofar as I’m not so much interested in audio synthesis and more interested in, well, event types and parameters (‘morphology’). MIDI is thus a seemingly more expedient way to ‘explore’ this (e.g. generation of lists, structures, patterns, array-forms, etc., with primary focus on ‘pitch’, dynamic (velocity), and temporal parameters), given that the ‘audio objects’ (timbral forces or whatever one might say to maintain generality) would be ‘pre-synthesized’…?