Very different things, related by virtue of being very “playable”. I wish the Orphion sound engine were configurable, but it does output MIDI.
Have you tried this?
Gauss is great. I was hesitant because I’m a huge fan of FieldScaper and the Morphagene, but being an AU means you can layer instances of Gauss, which is a really great feature.
And with Heinbach and Bram Bos it’s a no-brainer!
I’m pairing it with Koala and Koala FX for a really robust, small setup.
+1 to TC-11 for playable by hand.
I’ve heard good things, but haven’t taken the plunge. An embarrassment of riches on iOS.
Can anyone suggest a method for achieving this within iOS? (There’s a video embedded in the post which demonstrates the theory within the first 30 seconds or so)
I’m thinking of two possible options:
using a sample and hold LFO to trigger samples somehow, maybe in a round robin style but with a couple of dozen hits
using some method of randomly generated FM synthesis to create hits on the fly (maybe from Ruismaker FM or Noir?) instead of samples and using some kind of mutating sequencer to trigger them
I’ve got vague notions of using a number of Bram Bos apps to get me there but wondered if anyone had any more concrete suggestions.
@junklight could Cality help, perhaps? Or maybe I could load up a bank of single hit samples within, say, Koala, and then input them all into Ioniarics to sequence them? (Is there a maximum to how many notes Ioniarics will accept?)
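For what it’s worth, the “round robin but with a couple of dozen hits” idea from the first option can be sketched independently of any particular app. This is a minimal, hypothetical sketch (the class name and the jump-probability parameter are my own, not from any of the apps mentioned): each trigger either steps through the bank in order or jumps to a random slot.

```python
import random

# Hypothetical sketch: each incoming trigger (e.g. a sample-and-hold LFO
# step) picks the next hit from a bank, with an optional probability of
# jumping to a random slot instead of stepping in order.

class RoundRobinPicker:
    def __init__(self, num_hits, jump_probability=0.3, seed=None):
        self.num_hits = num_hits              # size of the sample bank
        self.jump_probability = jump_probability
        self.index = 0
        self.rng = random.Random(seed)

    def next_hit(self):
        """Return the bank slot to trigger for this step."""
        if self.rng.random() < self.jump_probability:
            # occasionally break the cycle, like a randomised sequencer
            self.index = self.rng.randrange(self.num_hits)
        else:
            # plain round robin: step through the bank in order
            self.index = (self.index + 1) % self.num_hits
        return self.index

picker = RoundRobinPicker(num_hits=24, jump_probability=0.25, seed=1)
pattern = [picker.next_hit() for _ in range(8)]  # mostly sequential, with jumps
```

A mutating sequencer pointed at a bank of single hits is essentially doing this selection step for you.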
Loading custom samples into Patterning 2 with lots of randomization would get really close, I would think.
That’s a good call and much simpler than what I was coming up with! I use Patterning all the time and yet it never crossed my mind to do this!
If I’m on the right page, I do something similar with Fractal Bits, using any number of apps to trigger the samples (including Mozaic/other Bos apps) and an LFO to randomly change the tuning of the samples (I use the built-in LFOs in ApeMatrix for this).
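The sample-and-hold LFO retuning described here is a simple mechanism to sketch. Assuming the LFO samples a random semitone offset on each clock trigger and holds it until the next one (the class name and ranges below are illustrative, not from ApeMatrix):

```python
import random

# Hypothetical sketch of a sample-and-hold LFO used for retuning: on each
# clock trigger, sample a new random value and hold it until the next
# trigger, treating it as a pitch offset in semitones.

class SampleAndHold:
    def __init__(self, lo=-12.0, hi=12.0, seed=None):
        self.lo, self.hi = lo, hi
        self.rng = random.Random(seed)
        self.value = 0.0  # held output between triggers

    def trigger(self):
        """Sample and hold a new tuning offset (in semitones)."""
        self.value = self.rng.uniform(self.lo, self.hi)
        return self.value

lfo = SampleAndHold(lo=-7, hi=7, seed=42)
offsets = [lfo.trigger() for _ in range(4)]  # one new held tuning per trigger
```

Between triggers the held value stays constant, which is what gives the stepped, per-hit detuning character rather than a continuous sweep.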
I can imagine that yields some very intriguing results! Using an LFO to control the tuning is an excellent call. The randomness options within Patterning basically do a very similar thing (albeit via an algorithm rather than an LFO).
Just to mix it all up a bit more, Sector still works fine despite no updates in a while, Glitchcore is cheap and neat, and Filterstep is an easy way to have random filtered variations pulsing.
Ah, I’m embarrassed to admit that I have all three of those and yet can’t remember the last time I actually used any of them! I used to love Sector - I can’t imagine why it has been left to gather virtual dust.
I’ve used similar methods to this, often pointing Rozeta Cells or Rhythm at Fractal Bits w/ Cality in between manipulating the MIDI. So I’ll have a relatively simple stream of events fed into Cality, and use its “chord” section to somewhat generatively add notes that then go to FB. This allows a lot of control over how many notes get added and over their distribution. Also, the transpose portion of Cality can be played manually or through an LFO to vary the voices played in FB.
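The generative note-adding stage described here can be sketched in the abstract. This is a loose approximation, not Cality’s actual behaviour: for each incoming note, extra notes at fixed intervals are added with some probability, and a transpose offset (which an LFO could modulate) shifts everything.

```python
import random

# Hypothetical sketch of a "chord"-style generative stage in a MIDI chain:
# each incoming note probabilistically gains extra notes at fixed intervals,
# plus a transpose that could be driven manually or by an LFO.

CHORD_INTERVALS = [3, 7, 12]  # assumed intervals; the real app's set differs

def expand_note(note, add_probability=0.5, transpose=0, rng=random):
    """Return the list of MIDI note numbers to emit for one incoming note."""
    out = [note + transpose]
    for interval in CHORD_INTERVALS:
        if rng.random() < add_probability:
            out.append(note + transpose + interval)
    # keep only notes within the valid MIDI range 0..127
    return [n for n in out if 0 <= n <= 127]

rng = random.Random(7)
stream = [36, 38, 43]  # a relatively simple incoming stream of events
expanded = [expand_note(n, add_probability=0.6, transpose=2, rng=rng)
            for n in stream]
```

Raising `add_probability` thickens the output; modulating `transpose` per step varies which voices in the drum synth get hit.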
Inevitably some of them will gather dust. I’ve got way too many bloody apps, but occasionally I see one that I haven’t cracked open in a long while and magic happens. Replicant is good too, in case you have it and forgot it or haven’t tried it yet.
The Audio Damage one? I do have it, although I tend to use Bram Bos’ Scatterbrain mostly which I think essentially covers much of the same territory.
Yep, the Audio Damage one. Sometimes I use a combo of Scatterbrain and Perforator. All told, I seem to have a lot of apps (chiefly) meant for rhythm but don’t really make rhythmic music, so ¯\_(ツ)_/¯ .
Obviously the iOS app sales are in full swing. I’ll refrain from putting up everything that’s reduced on here, but would like to highlight LK from Imaginando which appears to resolve my workflow issues at last.
I had previously thought that it was designed as an Ableton Live controller for the iPad (which isn’t something I especially need) but I realise that I’ve disregarded it unfairly and that it’s actually a far more capable AUM MIDI sequencer than I currently have at my disposal.
My current method of working within iOS is essentially centred around playing AUM like a mixing desk. This often involves MIDI and audio, combining the two using simple AUM file players to loop audio sequences I’ve rendered from elsewhere (e.g. Koala). I then have the controls of the channels mapped to an external controller (e.g. a Launchpad XL) which allows me to make changes to specific elements on each channel and essentially capture a performance in a style which is very much inspired by the likes of King Tubby (although it’s not remotely anything like Tubby’s output).
What I have always wanted to do is be able to cycle between different elements so that I could add variety (say, for example, a multitude of drum patterns, a variety of basslines etc) rather than being restricted to them essentially being either on or off.
Also, I’m thinking that I could maybe use something I already have to replace the file players (EG Pulse, for example, if that would allow me to load loops and latch them indefinitely until they are replaced by something else within the same choke group). In this setup the loops would then be controllable via MIDI, opening up the potential of utilising MIDI control/sequencing functions.
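The choke-group latching behaviour described here boils down to a small piece of state. A minimal sketch, assuming the semantics hoped for above (the class and loop names are hypothetical, not from EG Pulse): triggering a loop latches it and stops whatever else was playing in the same group.

```python
# Hypothetical sketch of choke-group latching: starting a loop stops any
# other loop playing in the same group, and a loop stays latched until
# something else in the group replaces it.

class ChokeGroup:
    def __init__(self):
        self.playing = None  # name of the currently latched loop, if any

    def trigger(self, loop_name):
        """Latch loop_name, choking whatever was playing in this group."""
        choked = self.playing
        self.playing = loop_name
        return choked  # the loop that just got stopped (or None)

drums = ChokeGroup()
first = drums.trigger("drum_loop_A")   # nothing was playing yet
second = drums.trigger("drum_loop_B")  # "drum_loop_A" gets choked
```

One group per musical role (drums, bassline, etc.) gives exactly the cycle-between-variations behaviour rather than plain on/off toggling.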
Looking at the video above, it seems as though LK can help me achieve precisely this!
[Also, all Imaginando apps (and IAPs, as I understand it) are 50% off throughout November]
So for those using AUM as their main mixing hub, how are you handling multi-track recording? This is the one main aspect of iOS I’m not sold on. I prefer working on the Mac for mixing; it feels cumbersome to mix on iOS. Thus far all I’ve been doing is recording a 2-track mix of my jams, which is fine in AUM, however I’d like to get back into multitracking and overdubbing.

Ideally I’d use Live or Logic as my DAW and route stuff through AUM to process on a few plugins, but that doesn’t seem like an option without using two interfaces. I got my Mac to recognize my iPad as an interface through Audio MIDI Setup, but I was bummed to see that you can only send audio from the iPad to the Mac, not the other way around. I’d love to be able to send audio from a DAW to process through a few iPad plugins that have some tactile control, like Borderlands etc., but I don’t see a good way to do this without running audio out of my one interface back into the headphone jack of the iPad. It just seems silly to add another layer of D/A/D conversion. Any solutions to this?
You could use AUM to record all of the stems and then export/import them into either an iOS or a desktop DAW for mixing/mastering?
This is how I do it if I go down that route. I’ll also occasionally use the quantized loop record function so that I generate loops of a specific length. To be honest, I’m more interested in sending my iPad tracks through my old Roland PA120 desk and dub mixing them live using the spring reverb and delay, only capturing a stereo master on my PC.