norns: ideas

I woke up from a dream a few weeks ago thinking about someone making a version of the Antikythera Mechanism for Norns where the strangely looping geocentric orbits could be LFOs or generate melodies.

5 Likes

Amazing, I will await this extremely eagerly! Since both the LFO and the stepped version look amazing, maybe there could be a menu option for selecting the mode: one sends smooth LFOs to crow (and MIDI CCs?), and the other sends quantised notes with selectable scales!
Because the moon idea sounds too fun not to do: in the absence of a 5th output, could it instead be used to modulate its planet? Like, the moon of any planet provides the FM for that output. So a moon is a vibrato to quantised CV, and frequency modulation if it's an LFO?

2 Likes

all this sounds great :slight_smile:

coming along :slight_smile:

19 Likes

Exciting! Have you seen this software called Quadrivium: https://www.giorgiosancristoforo.net/softwares/ ? I was really excited about it but it is attached to my computer =P I have used it exactly 0 times as a result :confused:

The thought of a Norns script using orbital mechanics would be wayyyy better in my opinion =)

2 Likes

Looking great! Just a little UI idea on the stepped/smooth thing: maybe a black planet (unfilled circle) is stepped and a white planet (like in your picture) is smooth… Then you could run stepped and smooth outputs at the same time, making for more useful planets to do the interesting alignment patterns with.
That way you could run one quantised pattern and a few LFOs, rather than having to have all quantised as you would in a dedicated stepped mode…

Edit: reading your original post I am realising that maybe the outputs do not specifically correspond to individual orbital bodies but to the system as a whole, so colour-coded planets won't work? If so, ignore me, I'm just excited :stuck_out_tongue:

I have now! So cool! Tragically Win/Mac only, so I can’t try it out, but I would love to hear your experiences with this way of composition and/or any results you have from working with it.

Yeah, that’s right. The output values are derived entirely from the relative positions of the planets to one another, rather than the absolute positions. I very much envision the software as a pipeline - one page for the model itself, one for a numerical representation of the data. Then a page for configuring and displaying trigger conditions. Those two sets of data together are passed to page 4, for the track-and-hold config, whose output is passed to the 5th page for quantization and range limitation and then finally to the 6th page for output configuration. This way every value ends up musically related as they are all derived from the same set of related values. A bit like this:

[diagram: quintessence-diagram]

(Oops, the second quantization and ranging page should be output selection. Sorry.)
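The stage-by-stage flow above lends itself to a small sketch. Below is a minimal illustration in Python (an actual norns script would be Lua); every function name here is invented, and the track-and-hold and quantizer are simplified stand-ins for the pages described, not anyone's actual implementation:

```python
# Illustrative sketch of the pipeline: relative planetary positions ->
# track-and-hold -> quantization and range limiting. All names hypothetical.

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def relative_angle(a, b):
    """Smallest angular separation between two bodies, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def track_and_hold(sep, value, held, window):
    """Track the input while the pair is inside the window; otherwise hold."""
    return value if sep <= window else held

def quantize(volts, scale=MAJOR):
    """Snap a CV (1 V/octave) to the nearest degree of the scale."""
    octave, step = divmod(round(volts * 12.0), 12)
    nearest = min(scale, key=lambda s: abs(s - step))
    return (octave * 12 + nearest) / 12.0

# e.g. planets at 10 and 130 degrees are 120 degrees apart; scale that
# separation into a 0-2 V range, then snap it to the scale:
sep = relative_angle(10.0, 130.0)
cv = quantize(sep / 360.0 * 2.0)
```

Because every output is derived from the same set of separations, widening or narrowing the tracking window is what opens up or shuts down the portamento on close passes.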

My thought is that individual parameters can be mapped to MIDI controllers; for example, it might be quite musical to modify the track and hold tracking window dynamically, so you could open up or shut down any portamento generated by prolonged close passes. Obviously it would be pretty cool to map the planets’ angular positions to an Arc, too - displayed on the LEDs and editable with the encoders - but I don’t have an Arc, so that will have to be a community addition.

3 Likes

I haven’t used it for the reasons I stated initially - I realized that as much as I love the concept, I don’t want to be tethered to my computer workstation. Here’s a sample of the software in action that may be a good reference =) :

and here:

3 Likes

I’m thinking about making an adapter library so that I can send polyphonic notes to my elektron model:cycles.

All 6 of the available voice machines can be pitched, and even the “chord” machine does polyphony a bit differently.

When sending midi to the m:c each machine uses a channel (1-6).

So my simple idea is to make a m:c library interface thing that routes polyphonic note events to an available voice machine channel.

Are there any other instruments that could use this type of setup? I wouldn’t mind making this tool work with multiple pieces of hardware… I just don’t know if there are other machines out there that could use it.
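For what it's worth, the core of that routing can be sketched quite compactly. A minimal Python illustration (on norns the real thing would be Lua), assuming the m:c convention of one single-voice machine per MIDI channel 1–6; the class and method names are made up:

```python
# Hypothetical sketch of routing polyphonic note events to six
# single-voice MIDI channels (1-6), as on an elektron model:cycles.

class ChannelAllocator:
    def __init__(self, channels=range(1, 7)):
        self.free = list(channels)  # channels with no sounding note
        self.active = {}            # note number -> channel

    def note_on(self, note):
        """Claim a free channel for this note; steal the oldest if full."""
        if not self.free:
            stolen_note, ch = next(iter(self.active.items()))
            del self.active[stolen_note]
        else:
            ch = self.free.pop(0)
        self.active[note] = ch
        return ch

    def note_off(self, note):
        """Release the channel this note was sounding on, if any."""
        ch = self.active.pop(note, None)
        if ch is not None:
            self.free.append(ch)
        return ch

alloc = ChannelAllocator()
chans = [alloc.note_on(n) for n in (60, 64, 67)]  # claims channels 1, 2, 3
alloc.note_off(64)                                # channel 2 is free again
```

A note-off has to release the same channel its note-on claimed, which is why the allocator keeps a note-to-channel map rather than just a free list.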

4 Likes

i think a generic voice allocation algorithm would be great for a bunch of synths! different synths will have different ways of assigning the voice (i.e., not all of them will use 6 different midi channels like the m:c), and different numbers of voices, so it would be awesome to have those things be configurable :raised_hands:

4 Likes

i was just reminded that this voice allocation lib already exists, baked-in.

(if anybody feels excited about this one and wants to help put together a neat lil demo script, that would be raaaaad + will help close the gap on reference completion)

8 Likes

Ah, that’s even better. It’s awesome to know what the thing I want is called, too.

I’ll take a look at it this weekend and put together a demo.

1 Like

Music from celestial harmonies – very Pythagorean :slight_smile: Looking super.

1 Like

I’m looking at the voice allocation lib that @21echoes and @dan_derks pointed me to. It seems fairly close to what I’m looking for, but I’d need to extend its functionality a bit to use it as I originally intended. So I’d like to check back in to make sure this kind of thing would even belong in the voice allocation tool before I go that route.

I’m wanting to allow things like aftertouch and mod wheel/CC param routing.

Would it be better for me to make a library that uses the voice allocation lib to decide how to route those commands or should I try to modify the voice allocation lib to optionally handle message routing to existing voices?
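If it helps frame the question: one way to leave the voice allocation lib untouched is a thin router on top of it that consults the allocator's note-to-channel map. A hedged Python sketch (the message shapes and names here are hypothetical, not the lib's actual API) of the routing rule I have in mind:

```python
# Sketch: per-note messages follow the channel holding that note;
# channel-wide messages fan out to every voice channel. All names invented.

def route_message(msg, note_to_channel, all_channels=range(1, 7)):
    """Return a list of (channel, message) pairs to send for one message."""
    if msg["type"] == "key_pressure":
        # polyphonic aftertouch follows the voice sounding that note
        ch = note_to_channel.get(msg["note"])
        return [(ch, msg)] if ch is not None else []
    if msg["type"] in ("cc", "channel_pressure", "pitchbend"):
        # mod wheel, channel pressure, bend apply to every active voice
        return [(ch, msg) for ch in all_channels]
    return []

# a mod wheel (CC 1) message fans out to all six channels:
out = route_message({"type": "cc", "cc": 1, "val": 64}, {60: 1})
```

Under this split, the allocation lib only ever answers "which channel is note N on?", and everything aftertouch/CC-related lives in the wrapper.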

1 Like

does a 10 band eq script exist? i keep thinking about buying one of the boss eq pedals to have a little more control of my synth sound shaping, but it would be nice to have it in norns. maybe with each band represented on the grid so you can adjust them on 10 columns? even just scrolling through bands with k3 and adjusting each with k2 would be cool
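The grid mapping itself would be simple. Here is a sketch in Python of one possible scheme (on norns this would be Lua, and the gains would feed a filter engine; all names here are invented), where each of 10 columns is a band and the 8 rows span +12 dB at the top to -12 dB at the bottom:

```python
# Hypothetical grid-to-EQ mapping: column x (1-10) picks the band,
# row y (1 = top row) picks the gain, linear from +12 dB down to -12 dB.

ROWS = 8
RANGE_DB = 24.0  # total swing: -12 dB .. +12 dB

def row_to_db(row, rows=ROWS, range_db=RANGE_DB):
    """Map a grid row (1 = top) to a gain in dB, centered around 0."""
    return range_db * (rows - row) / (rows - 1) - range_db / 2.0

gains = [0.0] * 10  # one gain per EQ band, in dB

def grid_key(x, y, z):
    """On key-down (z == 1) in columns 1-10, set that band's gain."""
    if z == 1 and 1 <= x <= 10:
        gains[x - 1] = row_to_db(y)
```

Each press just overwrites one band's gain, so the grid always shows the current EQ curve as a row of lit keys.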

2 Likes

It would be really nice to integrate into Norns itself a way to run a user-defined script prior to any other code; for example, to integrate n16o, passthrough, and other “background” scripts into the core Norns experience for people who want them. I really want to use Norns as the center of my workflow, but having to edit every single script I download is a bit of a pain.

Passthrough itself is a good candidate for corization - it could exist in the SYSTEM=>DEVICES=>MIDI menu, for example - but n16o is not, and I think this would be an acceptable general solution to this class of problem.

8 Likes

I feel that elements of Passthrough could be good candidates, although I wonder if the scale quantization is too opinionated to be part of the midi menu itself. But yea, would love to see it in the core Norns experience!

1 Like

I’ve been in a months-long lull w/r/t making music but yesterday I thought up a norns app to generate workout jams that match up to my intervals (indoor rower). my goal is to equip 5% of Planet Fitnesses nationwide with a norns shield by 2023.

10 Likes

Script idea: a sound morphing script, in the style of the Krotos Audio Reformer Pro. You can import two sounds; the script analyses the audio properties and allows for morphing between them (with potentially additional effects?)

A norns UI redesign idea for parameters: merge Parameters → Edit and Parameters → Map so that one clicks through from the edit menu straight to mapping, rather than backtracking from Edit over to Map.

Status quo flow to map a parameter to a MIDI CC:

  1. K1. Open menu
  2. E1. Select parameters page
  3. E2. Select Edit
  4. K3. Enter Edit
  5. E2. Select parameter
  6. (Perhaps K3. to open a submenu and jump to 5.)
  7. E3. Wiggle parameter value
  8. K2. backtrack to Parameter page
  9. E2. Select Map
  10. K3. Enter parameter list
  11. E2. Select parameter after making an informed guess
  12. K3. Enter mapping page
  13. K3. Start learning
  14. Wiggle MIDI controller
  15. K1. Exit menu

Suggested flow

  1. K1. Open menu
  2. E2. Select parameter
  3. (Perhaps K3. to open a submenu and jump to 2.)
  4. E3. Wiggle parameter value
  5. K3. Enter (moved) mapping page
  6. K3. Start learning
  7. Wiggle MIDI controller
  8. K1. Exit menu

:+1:/:-1:? Ideas?

Status quo menu structure
Parameters
`-EDIT
 `-parameter 1
 `-parameter 2
`-PSET
`-MAP
 `-parameter 1
  `-mapping 1
 `-parameter 2
  `-mapping 2
Suggested menu structure
Parameters
`-PSET
`-parameter 1
 `-mapping 1
`-parameter 2
 `-mapping 2
1 Like