Aleph as Blackfin development environment?

Heya, a nice friend lent me an Aleph. I’ve been working on my own DSP projects using a realtime Linux kernel on commodity SoC units like the Raspberry Pi and Allwinner A20 dev boards (Cubietruck). I’ve been intimidated by the Analog Devices ecosystem for a number of reasons, one of which is the development toolchain. When I discovered that the Aleph literally loads a compiled ELF binary, built with a GCC toolchain, onto a Blackfin, I was quite excited.

So…my question is, what is the difference between a fancy Analog Devices development kit with their VisualDSP++ IDE and an Aleph with a properly built “bare metal” GCC toolchain on a typical Linux computer?

  • the aleph is more fun to play!

  • it has 10v in/out and other stuff

  • we’ve already done a lot of the hard work setting up low-level stuff on the blackfin (like SDRAM driver, codec drivers and so on.)

  • aleph has the avr32 for controlling the blackfin. which is also the processor for monome euro modules. so there is some potential for cross-pollination there.

  • as you note, it’s set up to boot arbitrary firmwares over SPI, while the unit is running (blackfin is reset by avr32.)

  • VisualDSP++ costs tons of $ and is closed-source.

basically i’d say the aleph is very useful as a blackfin dev platform. i guess you could even use .elfs compiled with VisualDSP++ if you want.

but, if your goal is to use the blackfin for some radically different purpose than what we use it for, then obviously it would not be that useful.

we use it to drive the AD1939 audio codec with 4 ins and 4 outs, and also to drive the CV outputs via an AD5686R. we don’t use it for UART or I2C or anything else. on aleph, any interface with the outside world that is not happening at audio rate goes through the AVR32. (this includes the 10v input, which is not an immediately obvious decision.)

oh, i should also mention that the codec is hardwired to be in ‘standalone’ mode - which means that the codec generates the audio frame clock, and the bfin gets an interrupt each sample.
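
To make that arrangement concrete, here is a minimal sketch of per-sample processing with the codec acting as clock master. The names here (audio_frame_isr, in, out, the fract32 typedef) are made up for illustration and are not the actual aleph sources:

```c
/* hypothetical sketch of per-sample processing with the codec as clock master;
   names are invented for illustration, not taken from the aleph sources. */
#include <stdint.h>

typedef int32_t fract32;            /* 1.31 fixed-point audio sample */
#define NUM_CHANNELS 4

volatile fract32 in[NUM_CHANNELS];  /* filled from the serial port each frame */
volatile fract32 out[NUM_CHANNELS]; /* written back out to the codec */

/* runs once per audio frame, when the codec raises its interrupt */
void audio_frame_isr(void)
{
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        out[ch] = in[ch];           /* trivial pass-through; real DSP goes here */
    }
}
```

The point is just that all the audio work happens one frame at a time, inside the interrupt, rather than in buffered blocks.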


also if you haven’t yet, see https://github.com/monome/aleph

Heh, thanks for the response. I haven’t unscrewed it yet but I read the schematic on github and noticed there are even JTAG headers!

Thanks for the confirmation. I think I understand this project better now.

Also hi @tehn and @zebra ! my name is Lee. I work in software development and have been doing hobby hardware stuff for the last two years out of a hacker space in SF called Noisebridge. I’m long time friends with Alfred (Daedelus) who is responsible for me having this cool sound computer in my possession.


nice! noisebridge is awesome. i’ve played a couple of noise pancakes since they moved it there.


Nice! Sadly, the last noisepancakes show for an indefinite time was two weeks ago.

Since we’re on the subject, I have a followup. I understand if it’s contentious but it was the first place my intuition took me.

Why did you choose to run firmware with no OS on the Blackfin? I ask because when I saw there was a GCC toolchain, my intuition was like “cool now I can cross-compile Puredata and run my own patches on this rad DSP computer, though I might have to convert all 32bit floats to fixed point.”

I don’t even know if that’s a good idea but it seemed like a rational one.


because the computational overhead involved in running ucLinux would not allow for much single-sample audio processing. even doing everything in float would be prohibitive. even with fast floats (non-IEEE) a float operation might be 10 instructions.
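
For a sense of what the fixed-point alternative looks like, here is a generic 1.31 sketch. The helper names are illustrative only; the blackfin toolchain has its own fractional-math support that maps onto the hardware, and the aleph sources use their own conventions:

```c
/* generic 1.31 fixed-point helpers - a sketch, not the aleph's own math code.
   on blackfin this kind of multiply maps onto the hardware's fractional MAC;
   in plain C it still builds anywhere, which is handy for desktop testing. */
#include <stdint.h>

typedef int32_t fract32;                      /* signed 1.31: range [-1.0, 1.0) */

static inline fract32 fr32_mul(fract32 a, fract32 b)
{
    /* widen, multiply, shift back down; no saturation handling here */
    return (fract32)(((int64_t)a * (int64_t)b) >> 31);
}

static inline fract32 float_to_fr32(float x)  /* offline test code only, no clamping */
{
    return (fract32)(x * 2147483647.0f);
}

static inline float fr32_to_float(fract32 x)
{
    return (float)x / 2147483648.0f;
}
```

Everything stays in integer registers, so a multiply is one widening multiply and a shift rather than a multi-instruction software float routine.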

and because in many ways it is kind of less complicated to develop this very simple straightforward functionality on the metal.

you could maybe get something going with libpd and uclinux on the bf533. it would be a pain in the ass because you would have to make a kernel module for the codec and so on. and a big limit is flash size. we didn’t add any flash memory to the blackfin, so the entire firmware has to fit in 64K (one bank of SRAM.) good luck getting a useful linux with audio in there. but, it’s fast.

at the end of that, you’d be able to make a sinewave or something, with lots of latency.

[ed] so i guess i’d say: if your main interest is linux on blackfin, aleph is the wrong platform. simply because you would want more flash.

[i’m not exactly sure what people use uclinux on blackfin for. i can imagine it being valuable for instrumentation and data logging applications, but suspect its existence is more a “because we can” thing.]

if you want to play with bare metal gcc programming for audio processing on blackfin, aleph is probably the perfect platform.

our own @rick_monster has even extended the tooling substantially, so you can compile blackfin audio programs as pd externals, and you can load blackfin .elfs to a running aleph unit from the command line. (since we are doing this in hobby/hacker mode, you have to dig a bit and play around, but it totally works.)


Sadly, the last noisepancakes show for an indefinite time was two weeks ago.

fuuuk. SF, you’re breaking my heart… again


our own @rick_monster has even extended the tooling substantially, so you can compile blackfin audio programs as pd externals, and you can load blackfin .elfs to a running aleph unit from the command line. (since we are doing this in hobby/hacker mode, you have to dig a bit and play around, but it totally works.)

Oh that’s rad. Like totally rad.

More irrelevant details: I discovered the Blackfin range of DSP chips when I disassembled a silly USB mic called the Blue Tiki, and it turned out it had a Blackfin doing noise cancellation, built by some company that now seems to do some kind of hardware SIP licensing for their previous VST plugin business. I thought it would be lulz to say “my microphone runs Linux, because I can.” So I’m not coming from an EE background. It’s mostly hacker, though I’m trying to get better at actual EE stuff like drawing, math and breathing solder fumes while remaining calm.

I reached a certain point of algorithm complexity where I couldn’t debug my fixed-point code using just the device! So being able to cross-compile on a desktop computer is pretty valuable. I haven’t used the pd external much yet - there’s a wrapper to compile blackfin audio programs as linux console apps.

One technique I like a lot is to set up some kind of desktop test harness for a new tricky thing, such as cubic interpolation, inside the blackfin audio program I’m working on. Then simply print the input and output of your algorithm as a comma-separated-value file from the console app. This enables use of a simple graphing tool like gnuplot to visualise what the code is actually doing, as opposed to what one thinks it should be doing…
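
As an illustration of that workflow (not code from the aleph repo - cubic_interp here is a generic 4-point interpolator and the sine test signal is arbitrary), a harness might look like this:

```c
/* desktop harness sketch: run the routine under test over a known signal and
   dump CSV for gnuplot. cubic_interp is a generic 4-point cubic written for
   illustration, not lifted from the aleph sources. */
#include <stdio.h>
#include <math.h>

#define TWO_PI 6.28318530717958647692f

/* interpolate between y1 and y2 with frac in [0,1) */
static float cubic_interp(float y0, float y1, float y2, float y3, float frac)
{
    float a = y3 - y2 - y0 + y1;
    float b = y0 - y1 - a;
    float c = y2 - y0;
    float d = y1;
    return ((a * frac + b) * frac + c) * frac + d;
}

int main(void)
{
    /* test signal: one cycle of a sine, coarsely sampled */
    float table[64];
    for (int i = 0; i < 64; i++)
        table[i] = sinf(TWO_PI * i / 64.0f);

    /* columns: read position, interpolated value, ideal value */
    for (int n = 0; n < 61 * 8; n++) {
        float pos   = n / 8.0f;
        int   i     = (int)pos;
        float frac  = pos - i;
        float y     = cubic_interp(table[i], table[i + 1], table[i + 2], table[i + 3], frac);
        float ideal = sinf(TWO_PI * (pos + 1.0f) / 64.0f);
        printf("%f,%f,%f\n", pos + 1.0f, y, ideal);
    }
    return 0;
}
```

Then something like `./harness > dump.csv` followed by `gnuplot -e "set datafile separator ','; plot 'dump.csv' using 1:2 with lines, '' using 1:3 with lines; pause -1"` puts the interpolated and ideal curves on top of each other, so any error is immediately visible.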

It’s just great if you want to hack on digital guitar effects. I had an itch to scratch when I bought the thing: I wanted to really understand how pitch shift, chorus and those types of delay-based effects work. When you hack on them for the aleph you get the bonus that they’re in a very low-latency package, so the ‘feel’ is as good as a digital effect can be.
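
For anyone curious what “delay-based” means concretely, here is a minimal chorus-style sketch in float - purely illustrative, not how the aleph modules (which are fixed point) implement it:

```c
/* minimal modulated-delay (chorus-style) sketch in float, showing the shared
   idea behind chorus and pitch shift: a delay line whose read tap moves.
   illustrative only; the aleph modules do this in fixed point. */
#include <math.h>

#define TWO_PI    6.28318530717958647692f
#define SR        48000.0f
#define DELAY_LEN 4800            /* 100 ms of delay line at 48k */

static float delay_line[DELAY_LEN];
static int   write_pos = 0;
static float lfo_phase = 0.0f;

/* process one sample: write the input, read back from a slowly moving tap */
float chorus_process(float in, float depth_samples, float rate_hz)
{
    delay_line[write_pos] = in;

    /* read tap oscillates around a 20 ms base delay */
    float base  = 0.020f * SR;
    float delay = base + depth_samples * sinf(TWO_PI * lfo_phase);

    /* linear interpolation between the two nearest delayed samples */
    float rp   = (float)write_pos - delay;
    while (rp < 0.0f) rp += DELAY_LEN;
    int   i0   = (int)rp;
    float frac = rp - i0;
    int   i1   = (i0 + 1) % DELAY_LEN;
    float wet  = delay_line[i0] * (1.0f - frac) + delay_line[i1] * frac;

    write_pos = (write_pos + 1) % DELAY_LEN;
    lfo_phase += rate_hz / SR;
    if (lfo_phase >= 1.0f) lfo_phase -= 1.0f;

    return 0.5f * (in + wet);     /* equal mix of dry and wet */
}
```

A delay-line pitch shifter is the same structure with the read tap moving at a constant rate instead of a sine, plus a second tap and a crossfade to hide the jump when a tap wraps around.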

And also (though I still struggle to get my head round musical uses of this) your pedal idea will automatically hook into a moderately capable algorithmic control ‘ecosystem’. So for example you could turn a pitch shift pedal into a guitar arpeggiator, program & recall your own ‘presets’, etc… (hah - I guess for someone already versed in pd this is old hat, but BEES is significantly different!)

This is officially off topic now but hey! @rick_monster have you seen the MOD Devices Duo guitar effects pedal? I got an evaluation unit and I love it. It uses the LV2 plugin standard, which for me was a huge win due to the abundance of plugins and source code. I did a teardown video. It runs a (sort of) normal Linux distro with RT preempt and a custom sound card based on a Cirrus Logic IC. JACK handles the audio graph in userland and round trip latency is < 6ms.

https://youtu.be/MibmdZtAuOM

6ms is impressive! but 0.02ms (about one sample at 48k) is another kind of thing.


Well, yeah. I can talk all day about the diminishing returns on imperceptible latency. Before I knew about the Aleph and the Duo I prototyped a guitar pedal with a Raspberry Pi, a USB audio interface and JACK. Got it down to 5ms, not including ADC/DAC. We live in interesting times when a $35 project board and a $60 Craigslist audio interface running free software can be a guitar amplifier, cabinet and effects chain.

diminishing returns if you go source -> process -> sink. not for applications with feedback… that’s what i mean, it’s qualitatively different. with single-sample latency the aleph can be a delay block with arbitrary processing, in any circuit.

of course it’s cool that there is cheap hackable processing and embedded OSs everywhere now. even into 2nd, 3rd, nth generations, so really you can do stuff with no money whatsoever (discarded android phone.)

bf533 is old and weird. it won’t win awards for raw power or flexibility. but it’s a beautifully made architecture and generally fun to work with at the bit/sample level. at the float / block level, linux on ARM makes more sense i think.


/me nods

I get that. I got some ideas for a delay with a pre-feedback send.

Kind of cool! I might even bite if it were in a klunky multi-fx format with a big rack of finger-dials, many foot buttons & a foot-driven pot or 2. Those things were the naffest thing ever when all they do is cycle through crappy digital versions of a marshall stack or whatever & a nasty sounding wah pedal, but I have this feeling with a programmable one you could do a really neat live show with a guitar & some tapdancing - obv incorporating a full-featured looper. But then again you can do really neat stuff with just a looper, it just takes a lot of time & practice!

I pretty much lived inside linux RT for the last year or so - love that ecosystem! I really think it’s state of the art for a lot of things, especially rapid prototyping… Think how much you’d have to invest in proprietary software to get legal versions of tools like ardour/lv2, jaaa, the faust compiler, pd, gcc. At some point I’ll probably get a little pedal-board linux box like the organelle or whatever… But it’s all more stuff and I have enough of that already.

also, when you have 0.02ms of system latency, you can use algorithms with ~6ms of inherent latency without ending up at ~12ms total.