Norns: development

i don’t think any C programming is required to support other keyboard layouts.

a tiny amount may be required to detect when a HID device is in fact a keyboard (i.e. it supports the “keyboard” interface protocol.)

i found the open issue for international kbd support. this doesn’t cover the creation of a higher-level keyboard abstraction in lua to glue all these things together.

[i’m gonna move these last few posts to “development” topic, and move further details to the issue comment thread.]


I was thinking about the possibility of typing the name of (for example) the recording tape with a keyboard (HID)…
it would be cool and easier to be able to enter the name with the letter keys of a computer keyboard.

What’s the best way to debug core C on norns? I’m planning to try a few things in screen.c around the frame buffer and don’t know how to configure norns to get messages back.


ok, i guess we are always building with debug symbols, but they won’t be useful with -O3 (the default.)

unless i’m forgetting something (possible, b/c my waf-fu is weak), we don’t have a debug configuration set up that changes the optimization flags for matron.

so for now i would just edit norns/matron/wscript line 79 to change -O3 to -O0 (and don’t forget to change it back later!)

then just

pidof matron | xargs kill
cd ~/norns
./waf clean
./waf build
gdb build/matron/matron

(gdb) b main
(gdb) r

et cetera


Would it be possible to rewrite routings to use a class-compliant audio interface with Norns, to expand I/O and add multitrack recording?

Of course I can imagine not only Norns but the engines would have to change to accept more I/O.

Would it be possible with the current hardware, though?

I’ve been curious about the same thing. I plugged in an M-Audio Fast Track and I get this:

~/dust/code/sines $ aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: sndrpimonome [snd_rpi_monome], device 0: monome cs4270 cs4270-hifi-0 [monome cs4270 cs4270-hifi-0]
  Subdevices: 0/1
  Subdevice #0: subdevice #0
card 1: Pro [FastTrack Pro], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: Pro [FastTrack Pro], device 1: USB Audio [USB Audio #1]
  Subdevices: 1/1
  Subdevice #0: subdevice #0

~/dust/code/sines $ jack_lsp

No problems on the hardware front. JACK doesn’t pick up the new device, though on my Linux desktop I can do USB audio hot-swapping without restarting jackd.

So I’d say, yes this is possible and yes it would take a lot of work to hook it into the larger Norns platform.


After calling screen.arc() to draw an arc, I’d find it useful in scripts if I could get the current x, y position on the screen.

What do we think about using cairo_get_current_point in screen.c, and adding get_current_point() to the script-level screen lib?

we can consider this after a pending PR gets merged in which cairo gets heavily reworked (it has its own thread); that rework complicates get methods (they need callbacks). but once it’s merged there will be a pattern that can be followed to add this.


Hey there, just wanted to submit some thoughts and bugs regarding the clocking system.

  1. Crow clock input: as of now, if I feed in a clock at 125 BPM, Norns will show it as 300 because internally it understands 125 BPM as 125 x 4. The clock speed is still 125 when Norns syncs, but the BPM readout on Norns will show 300. This doesn’t affect any functionality, but it would be nice if it showed correctly. For example, when syncing to MIDI clock it will show 125, so the incorrect reading only happens when clocked from Crow.

  2. General MIDI Start / Stop / Run / Reset messages (also applicable to crow clock sync): currently they aren’t handled, so Norns never stops the sequencers or anything tempo-related. In the end it is very hard to start your other gear with Norns synced to either MIDI or crow clock. Also, for performance reasons, it is important to be able to stop everything at once (and start, too). Fairly basic stuff that should really have been on Norns from the start.

  3. Norns does not understand clock swing when clocked by either MIDI or crow. By clock swing I mean an irregular clock where every second pulse is delayed by a variable amount. The majority of music relies on some amount of swing in the sequence, from the classic MPC 54% clock swing to full triplets at around 67%. For example, the E-RM Multiclock sends a variable swing amount on its 4 MIDI outputs, so every machine can follow this swung clock and stay in the same groove. Monome Ansible, for example, responds to a swung clock exactly how you would expect it to, but Norns completely ignores it. This is not so important for loopers or audio-related stuff, but very important for sequencers: the sequence should advance to the next step not just on a regular clock, but also an irregular one. Would love if this could be implemented :slight_smile:

Hope I don’t sound too needy with these requests, but I really feel like these things are absolutely necessary to make Norns work with other gear and be more performance-friendly. It’s not just for ambient music :smiley:


hey hey! hope all’s well :slight_smile:
some very good topics, which thankfully point to docs gaps more than system troubles, so hopefully this reply helps!

Crow clock input

under PARAMETERS > CLOCK, there’s a crow in div setting which determines how many clock pulses should register as a “tap” for the BPM matching. you’ll need to fine-tune this to match your clock source’s pulses per quarter note.

so, if a clock source sends four pulses every 500ms to represent 120 BPM, then you’ll want to set crow in div to 4 to match the norns clock tempo displayed.

if fiddling with different values doesn’t give expected results, please share what your clock source is and we can investigate the specs :mantelpiece_clock:

i just had a dark night of the soul writing a transport into cheat codes, so i am fresh on the heels. big picture: each script needs to define what should start and stop and reset whenever these messages are received. this is a docs gap which i’m writing to fill, but essentially norns already has two script-definable transport calls: clock.transport.start() and clock.transport.stop(). these are hard-coded to respond to MIDI start/stop messages (if clock source is set to MIDI), as well as Ableton Live’s “start stop sync” (if clock source is set to Link).

so, simply put, the functionality has been present in norns for a year, but very few scripts utilize it because it wasn’t well-documented. i’m hoping that this docs effort will help clarify when to use what and why, but if you have a few scripts which you’re totally dying to have this in, let the author know and i’m totally happy to help however i can :slight_smile: @vicimity’s Initenere comes to mind as a script which makes use of these messages.
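to make that concrete, here’s a minimal sketch of what a script might define (my_sequencer and its fields are hypothetical stand-ins for whatever state your script actually keeps):

```lua
-- sketch: script-definable transport callbacks.
-- my_sequencer is a hypothetical stand-in for your script's own state.
my_sequencer = { playing = false, step = 1 }

-- called when a MIDI start message arrives (clock source: MIDI),
-- or on Link "start stop sync" (clock source: Link)
function clock.transport.start()
  my_sequencer.playing = true
  my_sequencer.step = 1
end

-- called on MIDI stop
function clock.transport.stop()
  my_sequencer.playing = false
end
```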

@artfwo can speak more to the architecture, but from what I understand the clock runs its own transport which can derive a tempo from an external source but it doesn’t wait for ticks (like ansible or other modular gear which require constant pulses to advance [if a patch cable is in their clock input]). apologies if this is a dumb question, but how do DAWs handle incoming clock swing driving their main clock?

as far as outgoing swing, @tyleretters’s lattice provides a really nice framework for scripts to incorporate swing, but this is a new library for script authors. in fact, lattice also has a nice framework for scripts to act more like modular sequencers where a pulse event is required to advance. excited for more people to use this!


well, the clock API only provides the primitives to sync to the source, so swing has to be implemented on the script level.
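for example, a script could implement swing itself in a clock coroutine. a rough sketch, using norns’ clock.sleep / clock.get_beat_sec / clock.run, with play_step() as a hypothetical stand-in for advancing your sequence:

```lua
-- sketch: script-level swing. each 8th note is split into a long
-- and a short 16th according to a swing ratio (0.5 = straight).
local swing = 0.54  -- classic MPC-style 54%

function swung_sixteenths()
  while true do
    local eighth = clock.get_beat_sec() / 2  -- seconds per 8th note
    play_step()                              -- hypothetical: advance your sequence
    clock.sleep(eighth * swing)              -- first 16th, lengthened
    play_step()
    clock.sleep(eighth * (1 - swing))        -- second 16th, shortened
  end
end

-- start it with: clock.run(swung_sixteenths)
```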

in a DAW, incoming swing is best handled on the DAW end, e.g. the DAW gets a steady rhythm at the clock input and shuffles it internally (by delaying odd beats for example). i am not aware of any hardware or software implementing it the other way round.


Thanks a lot for all the explanation, really helpful! Didn’t know start and stop is already in there so this is definitely great news for future app updates!
Will check out lattice as well!


I found some comforting steps to contribute to norns here, but section “api docs for master” provides a link that seems to be broken: [[]]

[mod note: as pointed out, the link above is broken. it will continue to be broken, so don’t use it. the readme has been updated since this post was made.]

Trying to build Norns, Waf can’t find a dependency called “panel”. I can’t figure out where this is supposed to come from? I’m on Void Linux but I think I have all the packages in the list installed…

it’s an ncurses thing. the relevant package listed in the readme is libncurses5-dev. that list of packages assumes something similar to debian/apt. i don’t know void linux but its package management seems very different.

this is only needed to build the ncurses-based REPL (maiden-repl.) if you just want to build the primary components you can do that:

./waf build targets="matron,crone"

maybe worth asking: what are you trying to do? if you want to rebuild/update the software on norns, the typical way i do that is just by building on the device. (i’m sure someone has done cross-compilation or built with docker, but i haven’t bothered with it.)

if you want to explore running the various components on desktop linux, carry on. (but you may have to figure some things out on your own for your particular system.)

Thank you, this was the right question. I do indeed want to build for the actual hardware, so I suppose I’ll just do it on the device.

Has there been any progress on setting up a virtual norns environment? I’ve found many fragments of information but am having a difficult time piecing everything together. I recently learned about norns (yes, loopop) and am hoping to get a taste for what it is like before investing in another project.

I’m a programmer and have previously spent a considerable amount of time contributing to a teensy based synthesizer called TSynth. The embedded synth was a fun learning experience, but not my favorite. Norns seems like an amazing blend of hardware / software.

One challenge seems to be emulating the screen. There is a lot of discussion here and on github, a WIP demo, and even a pull request. Something may even be in the works, but I’m not sure where to get details.

Running the software in a virtual environment (or locally) doesn’t seem to be an issue. One project called norns-dev made it pretty far; I was very happy to see the entire environment start with a single command and was able to start experimenting. Getting audio to work is less straightforward; perhaps it would be worthwhile for the community if I spent some time working on this and updated the documentation?


audio should really be one of the more straightforward bits. the entire audio stack (crone and supercollider bits) is developed on desktop linux and only depends on JACK and liblo… if you run into specific blockers you can always ping me.

[ oh shoot. that sounded wrong. i meant building the audio bits. in my experience - in which i am sometimes very dumb - developing audio stuff under a VM and expecting to hear it properly is a tall order.]

(i also build and run on macOS from time to time. windows is less easy but i’ve considered adding a portaudio option at least for testing softcut on windows. lmk if this would be of interest or if you would like to take on the [modest] task.)

the matron process is really the trickier one, more tightly coupled to POSIX/linux, and the norns hardware in particular. (since v2 clock updates it also has dependencies on jack and on ableton’s Link SDK.)

the open PR to redirect screen to SDL is elegant and functional. but, the prevailing situation for a little while has been: we need to work out one point of confusion around API breakage (we basically have to break screen.text_extents), then we can merge/redo both of the big pending screen changes which step on each other. (screen emulation on one hand, and dedicated screen-events thread on the other.)

i’d think that the SDL screen would provide most of what is wanted from “virtual norns,” since there is OSC emulation for keys and encoders. (right? actually i forget.) it may not be super ergonomic but i wouldn’t worry about that right away.


Yeah, /remote/key and /remote/enc. I have not devised a good keyboard/mouse mapping for this, if that’s even desirable (vs using a midi controller or something). There are maybe a couple other things one would want to switch off in matron like battery monitoring but I think this is already fairly well-trod with the shield hardware etc.


sorry for seeing this rather belatedly… been a weird year i suppose.

i think i see a miscommunication here regarding swung midi clock.

now… what follows is my own understanding. i could well be wrong and if so i hope someone will correct me. (i also apologize in advance for over explaining things that may be obvious; it might be useful anyway to lay this stuff out a bit.)

but my understanding is that @artfwo is correct - the actual pulses coming from, say, an MPC or LM-1[*], are not intentionally disordered in time to create swing. swing is a construct created on top of that pulse.

it’s important to understand that these pulses are pretty dense, typically 48 or 96 per quarter note. this number is not random; it’s chosen precisely to facilitate building (for example) triplet patterns, dotted triplets, etc.

that’s where these “classic” percentages come from: the LM-1 used 48ppq, and implemented swing by delaying the second 16th of each 8th note by some number of pulses. within that time period there are 12 pulses and therefore 12 possible discrete placements of that 16th note. the percentage indicates how much of the 1/8th duration (24 pulses) is occupied by the first 16th after delaying the 2nd by some number of pulses up to 11:

a = Array.fill(12, { arg steps; (12+steps) / 24 }); a.do({ arg ratio; postf("%\\%, ", (ratio * 10000).round * 0.01) });


50.0%, 54.17%, 58.33%, 62.5%, 66.67%, 70.83%, 75.0%, 79.17%, 83.33%, 87.5%, 91.67%, 95.83%

so that’s what i understand about swing and drum machines. i don’t know anything about how any midi module implements it, but (again, as i understand it) what defines the swing is the placement of note events, not the midi clock messages themselves. if your module is producing CV pulses in a swung pattern, then it’s surely doing so by changing their placement relative to MIDI clock pulses - just as the old drum machines did.

[*] ok right, LM-1 was pre-MIDI; it used DIN-SYNC instead but the concepts are exactly the same.

so anyways. the next question i have is why you would perceive other devices as staying “in the groove” when norns does not. if the script in question is responding to midi notes, and the midi source is sending notes in a swung rhythm, norns will play the notes when they arrive (or so we hope.) if it isn’t sending notes, then AFAIK there is no way for us to know if it’s swung or not.

if you’re seeing a beat-to-beat or bar-to-bar drift, then that’s something we should know about. (currently we think the clock code is quite stable when paired with all the clocking devices we’ve tried… but i don’t know if that has included anything like a vintage 48ppq drum machine.)

that said… the “new” clock module (the details of which @artfwo knows better than i) is sort of an “opinionated” follower, which (if i understand correctly) does attempt to smooth out tempo fluctuations and unintentional clock jitter.

if you feel that it is not respecting incoming hardware signals to your liking (perhaps not faithfully enough) it may be worth knowing that the clock messages are totally available directly in the script, along with all other incoming MIDI. so you can simply drive a sequence directly from the midi handler in your script if you like.

(apologies… think i’m too tired to write up the demo right now. it would be perhaps 20-30 lines.)

i do understand that this is not a useful answer if you don’t want to write code… more apologies for that. but if there is an appetite for a “raw clock” mode, maybe we can discuss it.

(if you are interested in rolling your own MIDI-driven sequencer: the lattice library mentioned is a good fit for driving a sequence from a low-resolution clock like MIDI. it is formally very simple: basically, a bank of digital phasors with arbitrary nominal rates, updated at the clock’s sample rate. when a phasor wraps, a user-defined event occurs. because of aliasing, the timing of each phasor can jitter all over the place if it’s not an integer multiple of the clock, but it won’t drift.)
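to illustrate the phasor idea (a sketch of the concept only, not the lattice API - the names here are invented): each incoming clock tick advances a phase accumulator, and an event fires whenever it wraps.

```lua
-- sketch: one "digital phasor" advanced by a low-resolution clock.
-- with 24 MIDI clock ticks per beat and a division of 1/4 beat per event,
-- the phase wraps about every 6 ticks, i.e. four events per beat.
-- individual events may land a tick early or late (aliasing), but the
-- phasor never drifts against the clock.
local ticks_per_beat = 24
local division = 1/4           -- beats per event (16th notes)
local phase = 0.0

function on_clock_tick()
  phase = phase + (1 / ticks_per_beat) / division
  if phase >= 1.0 then
    phase = phase - 1.0
    step_event()               -- hypothetical user-defined event
  end
end
```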

[i had more here on another topic, but… well never mind.]