The vscode remote development bits work surprisingly well, but I found it sluggish compared to just running vscode on my laptop/desktop and using sshfs to mount the norns filesystem locally. Naturally I use emacs keybindings in vscode to make it easy to switch between it and emacs at a moment's notice, depending on need…


Hi friends,

Just got into norns (which has upended my understanding of reality), and I’ve already started working on a binaural spatializer / spatialization sequencer / sampler. I’ve been making spatialization instruments for years now in supercollider so this is going very smoothly.

My question specifically regards ATK (part of sc3-plugins) and how to make this instrument available to the community once it's ready. ATK requires configuration beyond simply compiling the plugins package: namely, installing the HRTFs and other data from the ATK website, as well as installing the ATK quark. I can deal with this by making a script that one can run after installing the package, but then the instrument becomes inaccessible to those who have never ssh'd into their norns. It also seems to go against the philosophy of instrument management on norns (self-containment of instruments within dust, as I understand it). I'm curious about y'all's input on this.

BTW, sc3-plugins/ATK was empty except for a README.md before I did my own installation of it. The other plugins appear to have been compiled correctly. I’m on fates, though I don’t imagine that has anything to do with it.


thanks for the heads up. i didn’t realize ATK needed an extra step, but looking now:

sc3-plugins/ATK was empty except for a README.md

indeed, this seems to be the case after simply building and installing sc3-plugins from source. perhaps i am missing some needed configuration.

for norns, we build all the supercollider components and host them on http://package.monome.org/, including sc3-plugins, and that is where they are pulled from when building norns-image.

i suppose we should add some documentation on the SC component builds to that repo, and perhaps that repo’s issue list would be a good place to help us understand what else might be required to make ATK functional.

(i know how to build the components but @artfwo is the master of the packages, and @simonvanderveldt is the keeper of the image, and @tehn is of course our gracious host)

Thanks for taking a look and giving some background on the ecosystem. I’ll spend some more time looking into it and will likely open an issue on the repo.

Having binaural encoding in a box excites me greatly so I’d love to see it functioning without any extra configuration on the user side.

Hi! I also tried VS Code remote and what @ngwese mentioned, mounting the file system, but how do you replicate the "play" button functionality and the REPL? Have you sorted that out, or are you keeping a window with maiden open?

Honestly, for what I’m doing, I don’t need much beyond maiden in a browser tab for error reporting and the occasional debug print. I usually restart the script from the fates hardware.

Although, everything could probably be done from VS Code’s ssh terminal pane running the (recently deployed) maiden REPL.


Yes, that's what I'm also doing at the moment. I looked into the code quickly and saw that on the web side redux just sends the command as-is over the socket connection, which is norns.script.load("YOUR_SCRIPT_LOCATION"). I also noticed that a websocket connection is being made to ws://norns.local:5555/ for matron; however, I didn't manage to connect a simple websocket application to that port, and I'm not sure whether that would be the answer or if I should issue a GET or POST message to the server instead…

If anyone has any tips, that would be very cool; in the meantime I will use the web button and REPL as well.

PS: I didn't know there was a recent deployment of the maiden REPL; that could also be nice.

See Norns execution through bash? for some recent info on websockets etc.

“FIX integrate maiden-repl” from Norns: update 200218, but IIRC it's not super useful unless you soft-link/copy/move it to somewhere that's in your PATH. In any case, it needs to be launched manually in a terminal, and keep in mind that it's essentially a mirror of the maiden web REPL… I can't seem to find the recent lines discussion about it :frowning:


all right, thanks! I'll have a look at those links and see where they get me. if I manage to make it work I will report back :sweat_smile: :pray:

hi, i had a friend who wrote a very cool program called KeyKit in the mid-1990s at AT&T, and he asked:

“Does Lua talk to SC to do MIDI I/O? Or does it do it directly?”

what is the proper response?

pp

“directly,” if you will

this doesn’t preclude doing MIDI I/O in SC, but it will not be plumbed into the UI device selection &c
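from the script side it looks like this; a minimal sketch using the norns lua midi module (the device slot and note numbers are just placeholders):

```lua
-- minimal sketch: MIDI from a norns script (device slot 1 assumed)
local m = midi.connect(1)        -- slot as assigned in SYSTEM > DEVICES

-- incoming messages arrive as raw byte tables; midi.to_msg parses them
m.event = function(data)
  local msg = midi.to_msg(data)
  if msg.type == "note_on" then
    print("note on " .. msg.note .. " vel " .. msg.vel)
  end
end

-- outgoing messages: note, velocity, channel
m:note_on(60, 100, 1)
m:note_off(60, 0, 1)
```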

"what API?" is his next question, i assume. is that the logical next question?
Tim wrote KeyKit and i am trying to recruit him :slight_smile:

it uses the ALSA snd_rawmidi API

it’s possible that this will get yanked out in favor of JackMidi or something in a future rev.


Completely off-topic, but: O.M.F.G.! I used KeyKit in the 90s! So cool… The world is indeed a tiny place…


i LOVED keykit. it rules, so the prospect of KeyKit-ish sequencing on norns is staggering

Is it possible for other programs to be "plumbed into the UI device selection &c"? If so, how would that be done? I'm brand new to this and haven't looked into it yet, but from the device_midi.c file it looks like MIDI support would be easy. So I'm now wanting to find out how the UI on the screen is done, and how that might be done from C code. Thanks for any pointers!

…Tim… (author of KeyKit)


I'm scanning the previous development conversations here; it looks like a lot of my questions (including how the screen can be accessed) will be answered once I get through them all.

…Tim…


right now, not very directly. there is a program called matron which runs the lua interpreter and interfaces with screen, GPIO, and USB devices. its components aren’t really set up to be used by other C programs, either as libraries or as IPC endpoints.
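(for what it's worth, the supported path to the screen is from lua: a script defines redraw() and draws with the screen module. a minimal sketch, with arbitrary values:)

```lua
-- minimal sketch: drawing to the norns screen from a script
function redraw()
  screen.clear()
  screen.level(15)      -- brightness, 0-15
  screen.move(10, 20)   -- x, y on the 128x64 display
  screen.text("hello")
  screen.update()
end
```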

@PaulBatchelor has made some C snippets that might be helpful if you just want to replace matron with something else.

the norns lua API provides “script” writers with high-level interfaces to MIDI and screen and everything. one thing that we don’t have (hasn’t come up yet) is any modular way to extend the lua API with components written in C. (such extensions have to be baked into matron.) but there is nothing stopping you from (say) launching an executable in a lua script and then talking to it with sockets or OSC or something.
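roughly like this; a sketch only: the helper path and port are made up, but osc.send and the osc.event callback are the real lua-side calls:

```lua
-- sketch: launch an external program from a script and talk to it over OSC
-- (the helper binary path and port 10000 are placeholders, not part of the norns API)
os.execute("/home/we/bin/my-helper &")

-- send a message; osc.send takes {host, port}, a path, and an argument table
osc.send({"localhost", 10000}, "/helper/start", {120, "hello"})

-- incoming OSC that matron doesn't handle itself is passed to the script
osc.event = function(path, args, from)
  print("osc from " .. from[1] .. ": " .. path)
end
```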

the “audio backend” on norns consists of a mixer program called crone, alongside an instance of sclang+scsynth and a buffer-cutter called softcut. communication with these components is done via OSC.

norns v3 is under development and will likely alter and/or extend some of this stuff.


Thanks much for the pointer to the C snippets! Using OSC to talk to Lua from another executable should be pretty easy for most things that matron manages. …Tim…


This would be super useful. A few days ago I “accidentally” recorded over an hour of me going through various scripts and noodling around (intended to record a couple of minutes but forgot to stop recording). There were a few bits that ended up being quite useful! Not a huge deal to pull it over to my computer and cut out loops, but it would be convenient to just load the audio into a script on norns, cut and save loops, and repeat.
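Something roughly along these lines should already be possible from a script using softcut's buffer read/write calls. A rough sketch only (the file names, times, and the my-loops folder are made up, and would need adjusting):

```lua
-- rough sketch: pull a region of a tape recording into softcut, then save a trimmed loop
-- (file names, times, and the my-loops folder are placeholders; the folder must exist)
local src = _path.audio .. "tape/0001.wav"
local dst = _path.audio .. "my-loops/loop1.wav"

softcut.buffer_clear()
-- read 8 seconds starting 60s into the source file, into buffer 1 at position 0
softcut.buffer_read_mono(src, 60, 0, 8, 1, 1)
-- ...audition and adjust loop points with the usual softcut voice params here...
-- write 4 seconds starting at buffer position 2 out to a new file
softcut.buffer_write_mono(dst, 2, 4, 1)
```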