People use SuperCollider for live coding. It can even do graphics. Tidal is a frontend to SuperCollider, by the way.

2 Likes

@theseanco posted his GitHub repo/tutorial a few replies upwards and it’s full of gold.

1 Like

Yeah, thanks, I am aware of that. I have tried out both, but I've mostly been playing with SuperCollider (on a quite basic level, I must say!). I was more asking about syntax/workflow differences and which would be more suitable for live jamming/improvisation. For example, I really like how Tidal handles timing and patterns for rhythmic sequences (very immediate), and I struggled to achieve something similar in SuperCollider without writing 4-5 lines of code for a Pbind. On the other hand, I found it much easier to explore synthesis in general and probability-based sequencing through SuperCollider. In general Tidal felt more immediate (performative?), while with SuperCollider I felt like I had more control over everything. I guess it ends up being very personal, and it obviously also comes down to how well you know the syntax.
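
For reference, here is roughly the kind of Pbind I mean, as a minimal sketch using the stock \default synth (the pattern values are just for illustration):

    (
    Pbind(
        \instrument, \default,                        // the built-in default SynthDef
        \degree, Pseq([0, 2, 4, 7], inf),             // a simple repeating melodic pattern
        \dur, Pwrand([0.25, 0.5], [0.7, 0.3], inf),   // probability-weighted rhythm
        \amp, 0.2
    ).play;
    )

In Tidal the same rhythmic idea fits in a single line, which is exactly the immediacy I was getting at.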

1 Like

I think you’re seeing what each is good at. Tidal takes care of some of the details for you; SC doesn’t.

1 Like

You might look into using Tidal with custom SynthDefs rather than SuperDirt’s built-in samples and synths. I haven’t tried it yet, so I don’t know all the ins and outs, but I know it’s possible.
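
Untested on my end, but the SynthDef side would presumably look something like this minimal sketch; the argument names (freq, sustain, pan) and the output convention are my assumption based on SuperDirt’s bundled example synths, so check those for the exact details:

    (
    SynthDef(\mysaw, { |out, freq = 440, sustain = 1, pan = 0|
        var env = EnvGen.kr(Env.perc(0.01, sustain), doneAction: 2); // free the synth when done
        var sig = Saw.ar(freq) * env * 0.2;
        Out.ar(out, Pan2.ar(sig, pan));
    }).add;
    )

Once it has been added on the server SuperDirt is running on, Tidal should be able to trigger it by name, e.g. `sound "mysaw"`.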

1 Like

I would love it if SuperCollider had robust options for programming visuals. It seems like there’s not much beyond GUI stuff and the Pen class.
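
For context, this is roughly the extent of it: drawing into a window with the Pen class (a minimal sketch):

    (
    w = Window("pen demo", Rect(100, 100, 400, 400));
    w.drawFunc = {
        50.do {
            Pen.fillColor = Color.rand;                                // random fill per shape
            Pen.fillOval(Rect(400.rand, 400.rand, 60.rand, 60.rand));  // scatter ovals around the view
        };
    };
    w.front;
    )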

Recommended by @trickyflemming in the SuperCollider thread:
https://smile.amazon.com/Mapping-Visualization-SuperCollider-Marinos-Koutsomichalis-ebook/dp/B00GX67V5W/ref=sr_1_6?ie=UTF8&qid=1526231943&sr=8-6&keywords=supercollider

1 Like

Very interesting. From reading the description, it seems that book is more about some of the built-in visualization tools; I was thinking more of abstract visuals.

Check out the table of contents. Quite a few mechanisms for abstract visuals.

That being said, I think if you wanted to do a lot of this sort of thing in a performance context, it probably makes more sense to use OSC to send variables from SuperCollider to a more capable visualization environment such as TouchDesigner, Processing, or Unreal.
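
As a rough sketch of what the SuperCollider side of that routing could look like (the address and port are hypothetical, whatever your visualizer listens on):

    (
    // forward an amplitude envelope from SC to an external visualizer over OSC
    // assumes the server is booted and something is playing on bus 0
    ~visuals = NetAddr("127.0.0.1", 12000); // hypothetical: Processing/TouchDesigner listening here

    SynthDef(\ampTrack, { |in = 0|
        var amp = Amplitude.kr(In.ar(in, 1));
        SendReply.kr(Impulse.kr(30), '/amp', amp); // 30 updates per second back to sclang
    }).add;

    OSCdef(\ampForward, { |msg|
        ~visuals.sendMsg("/amp", msg[3]); // msg[3] is the tracked amplitude value
    }, '/amp');
    )

    Synth.tail(nil, \ampTrack); // run after the SynthDef has been added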

2 Likes

Yes I agree. I have done that in the past (sent OSC to Processing). I was just dreaming of being able to dynamically live code visuals in the same environment as my audio.

1 Like

https://redfrik.github.io/udk00-Audiovisual_Programming/
…?

Another way to go is to use Overtone (a Clojure-based live coding frontend for SuperCollider) with Quil (a Clojure-based live coding frontend for Processing).

http://overtone.github.io/
http://quil.info/

3 Likes

If you’re looking for some guidance on live coding in SuperCollider, the tutorial post above by @theseanco is excellent. Getting into ProxySpace is a big step towards making SC practical for live coding, and it gets discussed in the material above as well as in the JITLib guides in the SC docs.
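
For anyone who hasn’t tried it yet, the basic ProxySpace workflow only takes a few lines. A minimal sketch, with each line meant to be evaluated on its own as you go:

    p = ProxySpace.push(s.boot);   // replace the interpreter environment with a ProxySpace

    ~drone = { Saw.ar([55, 55.3], 0.1) };
    ~drone.play;                   // monitor it on the main outs

    ~drone.fadeTime = 2;           // crossfade time for redefinitions
    ~drone = { Saw.ar([55, 55.3], 0.1) * LFNoise2.kr(0.3).range(0.2, 1) }; // redefine live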

2 Likes

I found that FoxDot is a little less taxing to get started with if Haskell doesn’t interest you.

I’ve been experimenting with the Processing REPL mode for live coding visuals. It works pretty well, but you can’t do things like add global variables or functions. You get an error like this:

    Exception in thread "Thread-2629" java.lang.NoSuchFieldError: editor
        at jm.mode.replmode.REPLRunner.exceptionEvent(Unknown Source)
        at processing.mode.java.runner.Runner$2.run(Runner.java:599)

I don’t really know what that means, but I’m guessing this is normal. It stops the sketch from running, which makes the REPL scary to use (at least for now) in a live situation.

I think Java is not a great language for REPL live coding because it is overly verbose. I did not know about this REPL mode in Processing before, so I tried it, and after using Quil with a Clojure REPL I would still recommend Quil (as @jasonw22 suggested).
I have not used many live coding languages, but I used TidalCycles (https://tidalcycles.org) to play some stuff live and to record an album (https://firmanty.bandcamp.com/album/nieludzka). My only gripe with it was that it is quite sample-focused, but that was mostly a personal problem: I was not used to working with samples and did not have a good sample library. After sampling some of my own synths, it became much easier to create something with Tidal. One thing that helped a lot when playing live with Tidal was running my computer’s output through an analog filter with a distortion unit, so I could add filter sweeps and so on; it also really helped to glue everything together.
I also made some shaders with Veda (https://veda.gl) and the experience was very pleasant, so I would like to explore Veda further.

I’ve been gigging and recording using ChucK and I love it. I use an ETC video synth for visuals. I have a few videos here:

and have a stash of my code here:

3 Likes

What’s an ETC video synth?

https://www.critterandguitari.com/etc Check it out.

1 Like

TOPLAP have a 15th anniversary live stream running until the 17th (it started on the 14th).

Line up here.

Stream it on YouTube.

(I especially like the channel name…)

1 Like