Or, you know, a chess app that makes sounds… https://github.com/acarabott/chessMusic
worst case scenario i might learn from reverse engineering your code
A musical Go game occurred to me, although I’m not sure how it would map to musical events. All for anything that does interesting things with networked norns.
would be cool to scan sections of a go game’s state for sequence info or just to seed random stuff!
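Here’s a rough sketch of one way that scan could work. Everything here is hypothetical (the board encoding, the scale, the colour-to-octave mapping are all my own choices, not from any real Go library): each row of the board becomes a sequence where column index picks a scale degree and stone colour picks the octave.

```python
# Hypothetical sketch: derive a note sequence from one row of a Go board.
# Encoding assumed here: 0 = empty, 1 = black stone, 2 = white stone.

MINOR_PENT = [0, 3, 5, 7, 10]  # semitone offsets of a minor pentatonic scale

def row_to_notes(row, root=60):
    """Scan one row of the board; each stone becomes a MIDI note.

    Column index picks a scale degree, stone colour picks the octave,
    empty points become rests (None).
    """
    notes = []
    for col, stone in enumerate(row):
        if stone == 0:
            notes.append(None)  # rest
        else:
            degree = MINOR_PENT[col % len(MINOR_PENT)]
            octave = 12 if stone == 2 else 0  # white stones an octave up
            notes.append(root + degree + octave)
    return notes

board_row = [0, 1, 1, 0, 2, 0, 1, 2, 0]
print(row_to_notes(board_row))  # [None, 63, 65, None, 82, None, 63, 77, None]
```

The same idea could just as well hash a whole board position into a PRNG seed for the “seed random stuff” option.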
Ah, but composer Haskell Small does: see his composition A Game of Go for two pianos, 1987. Video:
all this reminds me that the “new thing” thread originated as a request for a modern 256 grid batch…
I wonder if norns could revitalize some interest for a larger grid again?
very interesting! i did an interactive piece for Go a long time back, details here: http://chailight.com/2011/07/16/invisible-territory/
My approach at the time was kind of literal and I feel like there was a lot more that could be done with records of Go games - specific patterns triggering different musical events. I could probably dig up the code I wrote for reading Go game records into Max if anyone is interested in playing with this kind of thing.
Back to norns though, it does make me think of whether it would be feasible to connect a camera and do something with that other form of CV (as in computer vision, not control voltage).
it’s long been a goal of mine to have a piece that you can compose by playing a physical game. I’m not across SuperCollider yet, but I’m guessing there might already be some support for camera input, like there is in Max/MSP? Something to investigate …
i know i’m definitely excited to try out the new and improved mlr with my 256—i imagine others might be as well
This is relevant to my interests.
You forgot to add a top shot of the grid
So… the more I think about this the more I think I can build one myself.
Either by taking advantage of the BetaBlocker UGen, or by writing my own simple stack only VM.
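For a sense of scale, a “simple stack only VM” really is small. This is my own toy sketch (not the BetaBlocker UGen, and not any actual norns code): programs are lists where integers push themselves and strings name opcodes.

```python
# A minimal stack-only VM sketch (a toy of my own, purely illustrative).
# Programs are lists of opcodes; integer literals push themselves.

def run(program):
    """Execute a tiny Forth-like program and return the final stack."""
    stack = []
    for op in program:
        if isinstance(op, int):
            stack.append(op)  # literals push themselves
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "DUP":
            stack.append(stack[-1])
        elif op == "SWAP":
            stack[-1], stack[-2] = stack[-2], stack[-1]
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack

# (2 + 3) * 4, written stack-style: 2 3 ADD 4 MUL
print(run([2, 3, "ADD", 4, "MUL"]))  # [20]
```

A real version would add comparison/branch opcodes and a cycle budget so a bad program can’t hang the audio thread, but the dispatch loop stays this shape.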
The language parsing to opcodes can be done in sclang, probably as Polish notation (a la Forth / Teletype) to start with. AFAIK infix notation to Polish notation is a mechanical translation.
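That mechanical translation is essentially Dijkstra’s shunting-yard algorithm. A minimal sketch (in Python rather than sclang, and with an operator set and precedence table I’ve made up for the example) converting infix tokens to postfix (reverse Polish, i.e. the Forth-style ordering):

```python
# Sketch of the mechanical infix-to-Polish translation via the
# shunting-yard algorithm. Operators and precedences are assumptions
# for the example, not from any existing language.

PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def infix_to_postfix(tokens):
    """Reorder infix tokens into postfix (reverse Polish) order."""
    out, ops = [], []
    for tok in tokens:
        if tok in PREC:
            # pop operators of higher-or-equal precedence first
            while ops and ops[-1] != "(" and PREC[ops[-1]] >= PREC[tok]:
                out.append(ops.pop())
            ops.append(tok)
        elif tok == "(":
            ops.append(tok)
        elif tok == ")":
            while ops[-1] != "(":
                out.append(ops.pop())
            ops.pop()  # discard the "("
        else:
            out.append(tok)  # operand goes straight to output
    while ops:
        out.append(ops.pop())
    return out

print(infix_to_postfix(["1", "+", "2", "*", "3"]))  # ['1', '2', '3', '*', '+']
```

The postfix output feeds a stack VM directly; emitting prefix (Teletype-style) instead is the same algorithm run over the token stream in reverse.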
In a lot of respects, the problems that need solving are the exact same ones I was working with on the Teletype firmware.
I’ll ruminate some more on it, but if I’m still minded to take it on I’ll start a new thread to brainstorm.
IIRC, I think I might have been the one to mention BitWiz / bytebeats to @tehn first (in the context of using a Teletype and a TelexO to make them).
It’s open source, and might give users some interesting things out of the box.
It would be quite easy to change the Organelle display over to the Norns oled/encoder.
Its parameter system is broadcast over OSC, so perhaps it links up with the Lua layer. It already runs on the rPI and just needs Pd to run, really.
Sorry, I’ve not been following this thread in detail (been a bit busy), but technically I’d envisage it as fairly straightforward.
also i think it can be a symbiotic relationship…
modules could be exchanged/developed between norns and other platforms
rPI development on Orac could be boosted, e.g. I’ve plans on how to make it multi-core, and ‘norns developers’ could help if they wanted.
I can’t concretely port Orac to Norns myself (can’t afford one), but I’d be willing to help/support the effort if it’s contributed back into the Orac code base.
anyway, lots of ideas around Norns for now, so just one to chuck into the pot for perhaps future consideration.
Rhodia pads are the best!
really great project! fulfills a hugely missing feature for embedded linux sound instruments.
re: running on norns, probably wouldn’t be complicated. i don’t know much about organelle, but if the screen uses the native linux framebuffer (norns does) it’d be pretty simple to have the screen interface “just work”. norns encoders/etc are linux events (we tried to respect the existing linux model and not make many weird hacks) but can be easily routed via OSC. pd/etc has already been discussed. we’ll be publishing the OSC spec for interaction with matron for custom DSP backends, but there are also tentative plans for more robust JACK integration which suggest similar chaining opportunities.
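To make the “routed via OSC” part concrete, here’s a hand-rolled OSC 1.0 message encoder in plain Python. The `/enc` address and argument layout are made up for illustration (the actual norns OSC namespace hadn’t been published at this point); the byte layout follows the OSC spec: null-padded address, a type-tag string starting with `,`, then big-endian int32 arguments.

```python
# Hypothetical sketch of packaging an encoder event as an OSC message.
# The address "/enc" is an assumption, not the real norns namespace.
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *ints) -> bytes:
    """Build an OSC message whose arguments are all int32."""
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "i" * len(ints)).encode())
    for n in ints:
        msg += struct.pack(">i", n)  # big-endian int32
    return msg

# e.g. encoder 2 turned one detent clockwise; the resulting packet
# would be sent over UDP to whatever port the DSP backend listens on.
packet = osc_message("/enc", 2, 1)
print(packet)
```

Anything that can emit packets like this (a Linux evdev reader, a Pd patch, sclang) can sit on either end of the chain.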
of course once we get the code all published it’ll become more clear where there is crossover potential
I love the Organelle, its portability, its flexibility…
However, one thing I wished was that I could run multiple patches at once, combine them in different ways…
or in-house: Orac : open source instrument thing ;)
just had a thought… imagine a patch system a la orac which addresses this:
compose building blocks to quickly build monome (or non monome) logic on-the-fly (and on device, with much more screen real estate). no idea how well this would work but it’s fun to think about.
I think both projects already have enough to explore even without the crossover potential. but holy moly, exciting times!
didn’t realize the whole category where he posted that was on mute