fair enough, it just seemed like a common enough use case that it would make sense to abstract it from the script author in the params system.

I haven’t actually spent the time to try rolling my own solution though, and I’m sure it’s not that bad.

so I’m thinking about a script that would need a stereo “send” bus: a separate stereo mix of softcut voice outputs routed to the engine input (using the engine as an fx send rather than as a softcut input)

something like: softcut.level_cut_engine(voice, ch, level)

how much of an ask is this ? (I’m reading up on similar requests upthread)

it sounds like that would need 2 extra outputs / jack ports for softcut, but could get away with the same number of ports for engine


no you would need a lot more cause you just said separate stereo mix…

so you need six more level smoothing and mix points, in addition to the jack ports…

you of course need command glue for all of this…

when you’re done you have the near-guaranteed result of accidental feedback loops through supercollider. is this worth it?

the feedback loops are part of the goal ! so something like voice 1 > modulated supercollider fx > rec into voice 2. it’s a technique I use in max/ableton already & enjoy - it just dawned on me today that the norns routing doesn’t presently support it.

is the issue with adding things more about CPU headroom, the amount of work that would need to go in, or both? they could be less-fancy outputs if there are options to compromise. I’d mostly be treating them like per-voice toggles for heading into sc.

(as per usual I don’t mind contributing as opposed to just asking for things, but I get that collaboration can be messy what with different experience levels)

Why don’t you give it a try and see what obstacles you encounter after studying the crone sources.


thinking about tilt functionality for norns…

Questions

  • are older grids the only devices with a tilt sensor?
  • in that case, would grid.lua be the best place to implement this?
  • what sort of values are expected for x,y,z? (ranges? value types - int, float?)

I’m imagining this might be a cool thing to have after vb_driver upgrades happen.

Stretch goal - A DIY tilt sensor add on


serial protocol is documented.

generally tilt is in a minority of grids at this point, and the implementation/calibration is not ideal. supporting a generalized tilt will be difficult. not to mention that we discontinued tilt in 2014.

I would expect better success and adoption with a new DIY handheld tilt device. that way there would be a focused group exploration

Yup. I have been looking at it and was unclear on what “normal” 16-bit values were for x,y,z

0x81 tilt
bytes 8
structure: [0x80, n, xh, xl, yh, yl, zh, zl]
n = sensor number (support for multiple accelerometers)
description: 16-bit tilt input for x, y, z axis
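To make the message layout above concrete, here is a minimal Python sketch that unpacks one 8-byte tilt message. It assumes the high byte comes first (per the xh/xl naming) and treats the values as raw unsigned 16-bit numbers; as noted elsewhere in the thread, the actual ranges and centering vary between devices, so any scaling on top of this is up to you.

```python
import struct

def decode_tilt(msg: bytes):
    """Unpack one 8-byte tilt message: [id, n, xh, xl, yh, yl, zh, zl].
    Assumes big-endian unsigned 16-bit axes; ranges vary per device."""
    if len(msg) != 8:
        raise ValueError("tilt message is 8 bytes")
    msg_id, n = msg[0], msg[1]
    x, y, z = struct.unpack(">HHH", msg[2:8])
    return msg_id, n, x, y, z

# example: sensor 0 with all three axes at mid-scale (0x8000)
print(decode_tilt(bytes([0x80, 0, 0x80, 0x00, 0x80, 0x00, 0x80, 0x00])))
# (128, 0, 32768, 32768, 32768)
```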

Totally understand if there’s a minority of grids with tilt. Just a crazy idea while playing with accelerometers today.

Thought it might be worthwhile to use the existing serial protocol since tilt is in there. I’ve already got a demo throwing numbers to monome home in max, but wasn’t sure how to tweak the values into something usable.

Or then I suppose I should ask - if not monome serial - what would make the most sense for an external tilt sensor device? OSC?


this is what I mean by not ideal. ranges are not the same between devices.

if you’re going to make a new tilt device, perhaps just make it midi?

Eventually going to make a wireless device with an ESP32 so OSC makes the most sense for that (assuming latency isn’t too bad).

Guess I’ll get thinking on some kinda OSC library to make that go.
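Rolling a small OSC sender doesn’t need a library at all; an OSC message is just a padded address string, a padded typetag string, and big-endian arguments. Here is a hedged stdlib-only Python sketch; the `/tilt` address is made up for illustration, and 10111 is matron’s OSC listening port (verify against your setup).

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded out to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: int) -> bytes:
    """Build a minimal OSC message with int32 arguments only."""
    typetags = "," + "i" * len(args)
    msg = _pad(address.encode()) + _pad(typetags.encode())
    for a in args:
        msg += struct.pack(">i", a)  # int32, big-endian
    return msg

def send_tilt(host: str, x: int, y: int, z: int, port: int = 10111) -> None:
    # one UDP datagram per reading; /tilt is a hypothetical address
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/tilt", x, y, z), (host, port))
    sock.close()
```

On the ESP32 side the same framing applies, so whatever you build there can target this handler shape.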


Any plan for a gen~ engine?

I’m a new norns hacker so apologies in advance if this has been covered already.

Do you guys clone from git directly onto your norns into /home/we/norns?
If yes, how do you manage working on your own fork in github?

After building the code at /home/we/norns, what is the correct way to launch it? Is there a process I can kill to get everything to restart?

yes. i do something like
git remote add catfact http://github.com/catfact/norns.git

now i have two remotes, the default origin which points at the upstream monome/norns.git, and catfact which points at my fork.

workflow is then something like

git checkout -b feature-branch
git push --set-upstream catfact feature-branch

…hack…

git commit blabla
git push

…repeat…

before opening a PR it’s important to sync from upstream master to fix conflicts pre-emptively (i’m always forgetting this, like a jerk)

git fetch --all
git merge origin/master
git push

when ready, PR from catfact/feature-branch to monome/master with the github web UI.

we have so far stuck to merge PRs versus rebasing, with no serious ill effects.

if/when PR is merged, sync my fork’s master

git fetch --all
git checkout master
git merge origin/master
git push catfact master

options.

to relaunch the whole stack, reconnecting with maiden websockets:
systemctl restart norns-*

but often during development it’s nice to just run the processes in the shell without redirecting stdio.

i’ll typically have 4 ssh sessions, plus emacs tramp-mode or VS Code remote editing.

  • one session for git and whatever
  • one each for crone, sclang, and matron

(from ~/norns/):
stop processes: ./stop.sh
start processes (in separate shells, and in this order):

  • ./build/crone/crone
  • sclang
  • ./build/matron/matron

you can also use systemctl to just restart individual services (norns-crone, norns-matron etc) or kill/launch them directly. just be aware that unless you restart all three processes in the right order, matron and sclang will be out of sync and some things won’t work right. (crone is pretty much independent.)


Perfect! Thanks a bunch

The separate shells idea looks like the go for debugging.

FWIW, I can’t say enough good things about how well VS Code’s Remote Development features work. In spite of running the bulk of the language-specific features on the remote (i.e. fates RPi 3b+) side, it performs admirably.

This allows me to keep all code, and my VS Code workspace and other config files on the fates side, and connect to it from whichever machine I happen to be at (personal laptop, work laptop, etc.) - all I need is a fairly basic VS Code install.


That sounds very modern. I am a caveman with vim and ssh.


vim+ssh is a timeless, infinitely powerful combination.


yeah, vim 4 life (literally)


The vscode remote development bits work surprisingly well, but I found it sluggish compared to just running vscode on my laptop/desktop and using sshfs to mount the norns filesystem locally. Naturally I use emacs keybindings in vscode to make it easy to switch between it and emacs at a moment’s notice depending on need…


Hi friends,

Just got into norns (which has upended my understanding of reality), and I’ve already started working on a binaural spatializer / spatialization sequencer / sampler. I’ve been making spatialization instruments for years now in supercollider so this is going very smoothly.

My question specifically regards ATK (part of sc3-plugins) and how to make this instrument available to the community once it’s ready. ATK requires configuration beyond simply compiling the plugins package: installing the HRTFs and other data from the ATK website, as well as installing the ATK quark. I can deal with this by making a script that one can run after installing the package, but then it becomes inaccessible to those who have never ssh’d into their norns. It also seems against the philosophy of instrument management on norns (self-containment of instruments within dust, as I understand it). I’m curious about y’all’s input on this.

BTW, sc3-plugins/ATK was empty except for a README.md before I did my own installation of it. The other plugins appear to have been compiled correctly. I’m on fates, though I don’t imagine that has anything to do with it.
