Yup, it’s super super light. To better illustrate what’s going on, I reorganized the VCVRack plugin code so it’s in a separate repository. https://github.com/Dewb/monome-rack

Here are the only changes that have to be made to the whitewhale code: a few lines in main.c, and none in libavr32. One #ifdef to hide the flash NVRAM section attribute, and a slight change to how main() works, so the module init code can be called in isolation without entering the check_events() infinite loop. With the right inlining options, this might not result in any changes in the resulting firmware build.
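To make the shape of those two changes concrete, here's a minimal sketch. All names here (`FIRMWARE_BUILD`, `FLASH_SECTION`, `module_init`, the `nvram` variable) are illustrative stand-ins, not the actual whitewhale symbols:

```cpp
// Sketch only: one #ifdef hides the flash section attribute when building
// off-target, and main() is split so a host can run init in isolation
// without entering the check_events() infinite loop.
#include <cstdint>

#ifdef FIRMWARE_BUILD
#define FLASH_SECTION __attribute__((section(".flash_nvram")))
#else
#define FLASH_SECTION /* plain RAM when hosted in VCV Rack */
#endif

static uint8_t nvram[64] FLASH_SECTION;

// Formerly the setup code at the top of main(); now callable on its own.
void module_init(void) {
    nvram[0] = 1;  // stand-in for restoring state from NVRAM
}

#ifdef FIRMWARE_BUILD
int main(void) {
    module_init();
    while (1) { /* check_events(); */ }
}
#endif
```

On the hardware build, `module_init()` is a candidate for inlining back into `main()`, which is why this might not change the resulting firmware binary at all.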


Yeah, I totally get what you’re saying. Let me explain more where I’m coming from. I use white whale with a 256 (and @scanner_darkly’s 256 mods) and I perform with that setup every couple of months. Two of the three open issues for feature requests on WW are saving patterns to USB and treating the clock knob as a divider when the clock input jack is used. I would love both of those features and use them in performance immediately. I think I could tackle both, but I’m too lazy to do it if I have to debug blind with the hardware-load-test cycle, or take the module out of my rack to plug in some kind of ISP debugging interface. Now, though…

Making it easy to test these sorts of enhancements, and leveraging VCV Rack users as beta testers before things go to hardware, seems like an unambiguous win to me. The VCV Rack ecosystem is already pretty huge; if someone sees white whale for the first time and wants something from it that it's not philosophically equipped to do, they'll just use another sequencer. I don't think there's much risk of a flood of orthogonal feature requests.


This sounds awesome! I’m excited to start poking around in the teletype code. I started with white whale because I was already sort of familiar with it, but teletype is obviously much juicier.


One more quote-response!

Oh certainly, an abstraction layer that’s much higher up (as @scanner_darkly describes) would be preferable. Hopefully things that are being actively developed can trend that way.

The way the Mutable Instruments port was done is interesting. Most of the DSP and UI components are reusable C++, but for the VCV Rack modules, the main loop of each module was rewritten inside the VCV Rack class interface. That gets you about 90% of the way there: changes to the base firmware DSP can be easily adopted in updates to the Rack plugin, but changes to the main loop have to be manually analyzed and brought over.
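The pattern looks roughly like this. The `Module` base here is a one-method stub standing in for Rack's actual module class (real plugins derive from Rack's `Module` and override its per-sample callback), and `firmware_poll` is a hypothetical name for the firmware's main-loop body:

```cpp
// Stub of the host interface, reduced so this compiles standalone.
struct Module {
    virtual ~Module() {}
    virtual void step() = 0;  // called once per audio sample by the host
};

// Stand-in for the work the firmware's infinite main loop used to do.
static int firmware_ticks = 0;
static void firmware_poll() { ++firmware_ticks; }

struct WhiteWhaleModule : Module {
    void step() override {
        // The main-loop body is re-driven from the host's per-sample
        // callback instead of a while(1) loop on the hardware.
        firmware_poll();
    }
};
```

The cost of this approach is exactly what's described above: the loop body now lives in two places, so main-loop changes in the firmware have to be ported over by hand.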

C++ is totally viable on the AVR32. I’ve done a bunch of C++ on 8-bit AVR! C++ does give you a lot more dangerous toys to get into trouble with, but if you’re careful, it’s as efficient as straight C.

Olivier from MI also has two separate AVR abstraction layers using C++ templates:


Unfortunately, apart from the name, the AVR32 has little to do with an AVR8. I think the name was chosen for purely marketing reasons by Atmel.

The biggest issue with using C++ is that we’re stuck on GCC 4.4.7, and since C++ is a much more actively evolving language than C, that makes interoperability with code targeting C++11 onwards really hard. Also, and this is only anecdotal, but I think the AVR32 ASF code is not entirely bug-free when compiled as C++. @zebra and @rick_monster might know more…


Ah, my mistake. Didn’t know that about the ASF and gcc limitations. I thought that all the Mutable stuff was on AVR, but on closer inspection the AVR ones are all Atmega644; the 32-bit modules use ARM Cortexes.


jumping in here cause i was tagged, and trying to follow what’s going on… :slight_smile:
so, we are talking about potentially porting monome module code to vcvrack, which is a c++ project targeting win/mac/linux. okey dokey

and it has crept in that we do indeed have one layer of the aleph codebase compiling under c++ - namely the BEES application logic.

i’m not sure. sounds plausible. but the way we have dealt with it in bees is to avoid the ASF altogether.

originally, the hierarchy of components goes kind of like this:

1. ASF [lowest level drivers and stuff]
2. aleph/avr32_lib [aleph-specific boilerplate and drivers]
3. aleph/apps/bees [application logic]

we made avr32_sim a drop-in replacement for avr32_lib, using the same headers but replacing any calls to ASF functions. mostly it’s just placeholders.
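to illustrate the placeholder idea (names here are made up for the sketch, not the real avr32_sim or ASF symbols): the sim keeps the signature a driver expects, but instead of poking AVR32 registers it just records the request so the host can inspect or display it.

```cpp
// Host-side placeholder: same shape as a hardware GPIO call, but it only
// records what the firmware asked for instead of touching real pins.
static int last_pin = -1;
static int last_level = -1;

void sim_gpio_set_pin(int pin, int level) {
    last_pin = pin;      // on hardware this would go through the ASF
    last_level = level;  // here it's just state the simulator can read back
}
```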

since then, the modules were developed, and monome libavr32 was created from avr32_lib, and we just finished refactoring such that we now have

1. ASF [lowest level drivers and stuff]
2. libavr32 [common boilerplate and drivers for monome modules]
3. aleph/avr32 [aleph-specific stuff that wasn't included in libavr32]
4. aleph/apps/bees [application logic]

and now i think avr32_sim is probably a little bit broken…

but that’s just a limitation of the cross-compiler?

it seems like there are implicitly two different topics here:

  1. literally targeting the AVR32 hardware with c++. i agree that this is problematic for several reasons.
  2. targeting x86 with a c++ project that includes application logic from monome modules. i don’t see why this should be a problem. if the module project structure makes it difficult i think that’s an argument for refactoring the modules to make the application logic more independent and target-agnostic.

thought that all the Mutable stuff was on AVR, but on closer inspection the AVR ones are all Atmega644, the 32-bit modules use ARM Cortexes.

yeah, just to be clear when people say AVR they generally mean AVR8 (atmega), AVR32 is more of an odd duck. kind of a relic of the time before ARM dominated the 32-bit RISC space, and unfortunately atmel has more or less stopped supporting it with new toolchains or libraries.

as sam says, avr and avr32 have very little in common apart from both being atmel products; avr8 continues to have low-cost, low-power applications and has received more support in the last 5 years.

i agree with this. if this were my project i would just focus on porting the core logic of each application…


Thanks! I started poking around avr32_sim; there are a bunch of helpful ideas in there, particularly around the FTDI & HID simulation.

beekeep uses avr32_sim, right? Looks like that still builds.


I moved the firmware code into a separate module and started down this IPC road (with zeromq) but then I realized I could load multiple versions of the library in-process if I just created a new temporary disk copy of the firmware library for every new instance. Not pretty but it works! I definitely want to finish the IPC thing at some point, for the II bus, etc., but this is good enough for now. Loading multiple instances of the module works fine:

Now that the abstraction layer is a little more robust, I also started in on Teletype. Getting it to compile and emulating the screen was pretty straightforward, but the keyboard is going to take some work. Tracking progress on that here: https://github.com/Dewb/monome-rack/issues/11


@Dewb, I should probably just wait till you get a package made, but I tried to build this on my Mac following your instructions (using Visual Studio Code to make) and I got:
Makefile:22: ../../arch.mk: No such file or directory
Makefile:31: ../../plugin.mk: No such file or directory
make: *** No rule to make target '../../plugin.mk'. Stop.

any ideas? v happy to just wait it out, thought I’d ask though!


So I had this issue and solved it.

Are you trying to download and build it inside your downloaded copy of VCV Rack? If so, that won’t work. The instructions state this, though it could be made clearer: you have to download the VCV Rack source, compile it yourself, and then compile the monome VCV Rack modules inside the plugins directory of your built version. The files you need to compile against aren’t present in an off-the-shelf version.


That’ll do it, thank you! I am using a downloaded VCV Rack, not one I built myself. I wasn’t sure if I could fudge that part, but that makes sense. Now to free up some storage for Xcode (smh).


think you can get away with xcode command-line tools, rather than the full shebang.


Well, this is seriously cool. Got it compiling on the first try. Crazy to see the Grid light up after the right-click connection.

I’m having two issues that aren’t on Github. I can open issues there if you want.

  1. There’s a crash when deleting the virtual Grid if it’s connected to a module. If I switch WW to the hardware Grid and delete the virtual Grid, no crash occurs. I figure this is probably known, and related to the crash that happens if the hardware Grid is disconnected.
  2. A lot of WW is working incorrectly on both my hardware and virtual Grid. When I place a trigger, it doesn’t place it under my finger, but rather on the same row and wherever the playhead column is. So, if I place a trigger on trigger row 2, column 2, it will place it on trigger row 2, column x (where x is the currently active column under the playhead). It also seems to have issues switching between the various pages (trigger, CV).


Yep, sounds like the first one is one of the known disconnect issues.

The second issue sounds like you may have held down the alt or meta key on the virtual grid and then switched to the hardware grid, with the module thinking alt/meta are still held. Can you reproduce it in a session with no virtual grid? I need to automatically cancel held keys on disconnect (or properly simulate an FTDI disconnect message).
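The "cancel held keys" fix could look something like this sketch (assumed names throughout, not the actual monome-rack code): track which grid keys are down, and synthesize key-up events for all of them before tearing down the connection, so the firmware never sees a stuck alt/meta.

```cpp
#include <set>
#include <utility>

static std::set<std::pair<int, int>> held;  // (x, y) grid keys currently down

void key_event(int x, int y, bool down) {
    if (down) held.insert({x, y});
    else      held.erase({x, y});
    // ... forward the event to the firmware's monome key handler ...
}

void on_disconnect() {
    // Release everything still held; iterate over a copy since key_event
    // mutates the set while we walk it.
    std::set<std::pair<int, int>> snapshot(held);
    for (const auto& k : snapshot)
        key_event(k.first, k.second, false);
}
```

Properly simulating an FTDI disconnect message would be the more faithful fix, since it lets the firmware's own disconnect path do the cleanup.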


That fixed it. Thanks for the fast response! Part of it is also my Kria-addled brain; it’s been a while since I used WW. :slight_smile:


I was able to guess what the problem was because I’ve done the same thing! I theoretically am a heavy-ish white whale user, but a fun thing about this process has been realizing that I don’t know it nearly as well as I thought I did, and that it has a lot of awesome features that I’m underutilizing.


Oops, I was trying to stay under the radar, but a couple of days ago the VCV Rack Facebook group found out about this, and it got mentioned in CDM: :open_mouth:


Hey @Dewb, you also got mentioned on Rekkerd: https://rekkerd.org/vcv-console-mixer-released-hot-bunny-vult-modules-stellare-modular-more/

The CDM article has already doubled my Github traffic this morning vs. yesterday.


The work you guys are doing is great. :slight_smile: