Yes: the Pisound sits right atop a raspi. Once you run their installer script, the Pisound just shows up as an audio capture and playback device in ALSA. Its MIDI jacks show up in ALSA, too.

The Pisound itself doesn’t run anything on its own - it is just amps, DACs/ADCs, MIDI, some lights and “THE BUTTON”. Out o’the box (“out o’the installer?”) it runs Pure Data patches from the button, but it is easy enough to use whatever Linux audio software you want. I use Pure Data and it rocks, but others report using SuperCollider, Sonic Pi, MOD, and JACK-based software.

The button can be customized: it just launches shell scripts in /usr/local/etc/pisound/, so it’s easily scriptable to do whatever you want.
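As a rough sketch of what that can look like (the exact hook filenames depend on the installer version, so treat single_click.sh below as an assumption - check which scripts your installer actually created), a button action is just an executable script dropped into that directory:

```sh
#!/bin/sh
# Hypothetical single-click handler, e.g. /usr/local/etc/pisound/single_click.sh
# (the filename is an assumption, not confirmed by this thread).
# Toggles a Pure Data patch each time the button is clicked.

if pgrep -x pd > /dev/null; then
    pkill -x pd                                  # patch already running: stop it
else
    pd -nogui -alsa /home/pi/patches/main.pd &   # start the patch headless on ALSA
fi
```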

2 Likes

Yeah, the Pisound is great: it has standard 1/4" jacks and MIDI DIN, the gain pots are useful, and it just appears as a normal audio/MIDI interface.

only small drawbacks (no show stoppers) are:

  • if you fit it behind an LCD panel, the MIDI DIN ports make the SD-card slot difficult to access (you need strong tweezers or small long-nose pliers)
  • the pots limit how you can mount the board in an enclosure.
  • it uses a lot of GPIO pins, so not many are left to use afterwards
  • no breakout for the remaining GPIO

I’ve currently got mine mounted behind a 7" panel, and fortunately the pots don’t interfere with the stand, though I’m probably going to build a small enclosure for the panel/Pi so that the pots are a bit more accessible. (not sure if I can make ‘the button’ more accessible)

I think another strong point of the Pisound is that it’s getting a community behind it which is all about music applications on the Pi… something I’ve not seen elsewhere.

anyway, back to Organelle…
The reason I’m doing Organelle first is that it’s more ‘supportable’… it’s a bit like iOS (Organelle) vs Windows (rPi):

  • a closed platform means delivery is easy: there is a prescribed way to deliver applications, and you know the setup you’re targeting
  • an open platform means supporting users with different setups, different distros, different levels of expertise… it’s a nightmare by comparison… and I think that is partly what is holding the Pi back for non-Linux-hackers.

(On Linux we’ve partly seen this with Axoloti: it seemed every Linux user we had was using a different distro that needed different support, e.g. Debian, Linux Mint, Arch Linux… then multiply that by different architectures and it’s a royal pain… compared to Windows/macOS, which get one image each :wink: )

Which LCD panel? Picture of this setup? For my current use, headless is fine, but I have another project forming that could use a small one.

Linguistic confusion brought on by poor app naming… :persevere:

Aftertouch as in the midi feature, or the app? Drop me a line if there’s anything I can do to make the app better :wink:

Oh no, I meant the midi parameter! Unfortunately I don’t own a device with force touch or whatever it’s called so I haven’t been able to try your app. I’m sure it’s great!

1 Like

@mzero here you go, a picture from today, whilst I’ve been testing the setup with the Soundplane and Axoloti.


I could do with tidying up the cables a bit, but whilst testing, I’m constantly pulling things in and out :wink:

so in this setup I have…
rPi3 + Waveshare 7" touchscreen + Pisound (you can see it on the back in the 2nd pic). The Pi 3 is then connected to a ML Soundplane and an Axoloti via a small 7-port USB hub, plus a small wireless QWERTY keyboard.
The Pisound has the MIDI ports on the left and the audio jacks at the top, so both are unobstructed; the Pisound’s pots are on the back, so they’re accessible for setting up, but you wouldn’t want to reach for them whilst playing :slight_smile:

It’s not a bad display (1024x600), and it comes with a little stand, which is fine for now.

Oh… you will need a good power supply; I had lots of power issues until I bought the official Pi 2.5 A supply.
Even so, I’m still finding some USB devices seem to prefer being attached to a powered USB hub.
(and it’s a bit easier to get to the USB ports when unplugging things)
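(If anyone is chasing similar power problems: assuming a stock Raspbian image with the vcgencmd tool available, the firmware can tell you whether it has ever seen the supply dip:)

```sh
# Reports the Pi's throttling flags: 0x0 means no under-voltage or throttling
# has been detected; non-zero bits mean the supply dipped or the CPU was
# throttled at some point since boot.
vcgencmd get_throttled
```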

5 Likes

phew!

what’s the upcoming Eigen controller you were mentioning?

I dunno, it was @TheTechnobear that teased us!

The Pico above? It’s pretty old, you can get one right now :slight_smile: http://www.eigenlabs.com/product/pico/

Sorry, my bad… I meant there are more controllers to come for (i.e. supported by) MEC, and so Organelle; one in particular I have special plans for.

Cool! I completely misunderstood though. :joy_cat:

This thread motivated me to buy an Organelle! Looking forward to playing with all the patches posted above :slightly_smiling_face:

2 Likes

You will not be sorry; between the custom OS and cool patches, we are building a pretty full-featured PD design tool.

Here is my latest audio mangler, groover, stretcher sampler tool

1 Like

More progress on Organelle :slight_smile:

some details of what I’m up to here:

Also, perhaps an important note… this means you can run Organelle patches unchanged on an rPi… using the Push 2 as your control/display device.

2 Likes

Apologies for asking before clicking through, but is the Push 2 bit simply MIDI, in which case I could substitute an SL MkII?

No, the Push 2 has a high-res RGB display which you can access via USB (proprietary protocol, detailed here) - it makes it a fantastic device for custom applications :slight_smile:

However, part of oKontrol is also going to allow for MIDI learn, so you can assign a CC to each Organelle parameter too. I want to do this as a ‘live mapping’ rather than hard-coding CCs.

What’s going on here is that the Organelle is using my oKontrol external, which communicates with my ‘MEC’ application via OSC, which in turn renders on the Push 2. All of this is running on the Organelle, but it could be distributed across macOS, rPi and Organelle (basically, at this stage it’s a ‘distributed parameter system’).
(oKontrol is all C++, so it’s the same code used in the external and within MEC, just with different callbacks etc.)

Of course, as it’s sending out OSC messages, potentially I can do this with anything that can process OSC.
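(If you want to see what that traffic looks like on the wire, one quick option, assuming you have liblo’s command-line tools installed (typically packaged as liblo-tools), is to point oscdump at whatever port the sender is using; the port below is just a placeholder, not the actual oKontrol/MEC port:)

```sh
# Print every OSC message arriving on UDP port 9000 (address pattern, type tags, args).
# 9000 is only a placeholder - substitute the port your OSC sender actually uses.
oscdump 9000
```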

Anyway… for non-techies the idea is simple: write PD patches once (I can add other language support), then run them on different platforms, using different controllers to control the patch. Also, if you read the post on C&G you’ll see that it’s really easy to add these parameters to patches.

My personal goal is pretty simple: I want to write generic patches and then run them on Organelle, rPi+Pisound or Bela depending on their needs… (oh, and test/develop on Mac/Linux)…
(Also, I was frustrated at the effort I was spending on the display side when writing PD patches… so I wanted it to be simple to do, and to test.)

3 Likes

And the hits just keep coming in organelle country

[looks at bob corrigan] ahem

1 Like

Interesting - I’d be very interested in your approach. I’m hoping to integrate a Lemur patch with the Organelle PD stuff, and I want to find an approach that will work without having to modify every Organelle patch. I’m hoping to add CV inputs as well.

I’m hacking up an Organelle clone on an rPi. At the moment it does everything an Organelle does, but I’m looking to extend it to take advantage of the extra cores on the Pi. I’m using a colour TFT screen, an Arduino over USB serial handling the knobs and buttons, and a 6-in/8-out AudioInjector sound card.

I’ve added an OSC message to stop/start a PD patch inside mother.pd so it doesn’t need to exit and reload the pd application each time the patch changes. I am running 3 instances of pd~ inside mother.pd, each running a different Organelle patch. Each pd~ runs a modified mother.pd with a different OSC address. I’ve modified the mother C++ application to have an array of menu and app objects so it can maintain the state of 4 different Organelles and switch between them in response to a button press. I’m having a bit of trouble getting around the way the Organelle uses the tmp directory for running patches, but I’ll get there. I’m thinking I might need to write a script to parse the pd files in the patch directory and replace references to /tmp/patch with /tmp/patchx, where x is the pd~ instance it is being loaded on.
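(For what it’s worth, a minimal sketch of that kind of rewrite script, assuming a plain textual search-and-replace of the path is safe for your patches; the script name and usage here are just an illustration:)

```sh
#!/bin/sh
# Usage: ./retarget_tmp.sh /path/to/patchdir 2
# Rewrites references to /tmp/patch in a patch's .pd files so they point at
# /tmp/patch2 (or whichever pd~ instance number is given). One-shot only:
# running it twice on the same files would mangle already-rewritten paths.
PATCH_DIR="$1"
INSTANCE="$2"

for f in "$PATCH_DIR"/*.pd; do
    sed -i "s|/tmp/patch|/tmp/patch${INSTANCE}|g" "$f"
done
```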

Next step is CV/gate inputs and outputs. I’m thinking I could make these appear as MIDI sources in PD and extend the Organelle MIDI channel menu to allow the selection of a CV/gate source/destination instead of a MIDI input/output.

Edit… found your GitHub page. Nice!

4 Likes

Sounds like a cool project @widdly
p.s. /tmp/patch… be aware, I’m talking with Owen about extending/changing this… basically I’d like to separate data/preset storage from the patch directory.

Interesting comments on the multi-core aspect. It’s not something I’m tackling at the moment, though I’ve thought a little about how I’d tackle it… one thought was to stop the mother host launching the PD application, and instead have the mother host use libpd, which would enable it to track multiple patches correctly… basically acting as a PD host (something like the way a DAW hosts VSTs)… but that’s a non-trivial task.

Generally, though, I think I’m trying to solve a similar issue to you, but in a different way…

In no way am I trying to clone an Organelle on other platforms (no need, I have 2 :wink: );
rather, I see the problem as one of hardware abstraction in patches.

The issue I have is that I don’t think PD patches should be written so that they hard-code the control interface… imagine if VSTs assumed a certain display size, or having 4 knobs… no, patches should abstract this, and then it is the ‘controller’ that should have code that can ‘render’ that patch in the most appropriate way.
Again the VST example is very relevant… whilst they can render their own UI, they also present a unified parameter list… and this is how things like automation are possible.
(Amusingly, MEC, which oKontrol is part of, also runs as a VST, so theoretically I can let you control oKontrol parameters on your Pi/Organelle from the MEC VST running in a DAW on your PC/Mac :wink: )

… It’s also why I think OSC is useful: not only is there a PD external and a C++ API for interfacing, but also a ‘wire interface’.
(btw: I’m happy to talk to other developers about the OSC messages and how they might be improved.)

why is this only coming up now?
I don’t know; I was genuinely surprised when I first looked at PD (2 months ago?) that there was not some kind of standardised parameter/preset system that all patches used.
But I think the reason is that most PD patches were written for PC/Mac with screens, whereas here we are interested in headless patches, patches controlled either remotely or from MIDI devices, etc. Perhaps this use-case is only now becoming relevant to more people, on different ‘micro’ platforms.

(As a side note, any talk I’ve made of an oKontrol touchscreen interface for the Pi has actually had a lukewarm reception; it’s a nice-to-have, but musicians seem more interested in getting physical control integration.)

Of course, my approach requires patch developers to adapt and to ‘buy in’, which I will only be able to convince them to do if they see big advantages, e.g. easy to implement and a nice interface for users… so that’s my first target.

Fortunately, as I developed a whole bunch of Mutable Instruments patches which seem quite popular, I can use those as guinea pigs… to let developers and users alike see the advantages (or not).
If they think it looks cool and they like it, they can use it… if not, then perhaps it will inspire something else; this is what open source is all about (for me).

Anyway, I’m excited by the response so far… it seems others also think this is an important piece of the puzzle for these kinds of hardware setups :slight_smile:

Anyway, less talk, more coding!

p.s. Another side of this, for me, is that it’s not just about PD!
e.g. after the next Axoloti firmware release, I plan to build an oKontrol interface into Axoloti, so that Axoloti’s parameters are exposed on the same infrastructure… then ‘controllers’ built to use oKontrol to control PD patches will be able to control Axoloti patches as well…
As I said, it’s all about abstraction :slight_smile:

1 Like

:musical_note: It’s starting to look a lot like Christmas :musical_note:

1 Like