here is volume i of the goodrick books. (i would normally refrain from posting material like this in a public setting, but there’s absolutely nowhere you can purchase this anymore and i know of no current plans to re-print. if that changes, i’ll happily take it down and pay for a paper copy.)
@kisielk
yeah i like that idea about weighted choices for melodic intervals. it’d be easy to implement. that gets nicely at the distinction i was trying to make in the article about filters vs. sorts.
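in case it's useful, here's roughly what i mean in python (just a sketch - the intervals and weights are placeholders, not anything from the actual code):

```python
import random

# placeholder weights: favor stepwise motion, allow a few leaps
# intervals are in semitones, negative = downward
interval_weights = {
    -2: 5, -1: 5, 1: 5, 2: 5,   # steps
    -4: 2, -3: 2, 3: 2, 4: 2,   # thirds
    -7: 1, 7: 1,                # fifths
}

def next_pitch(current_pitch):
    """pick the next pitch via a weighted random choice of melodic interval."""
    intervals = list(interval_weights)
    weights = [interval_weights[i] for i in intervals]
    return current_pitch + random.choices(intervals, weights=weights)[0]

# e.g. start on middle C (MIDI 60) and walk an 8-note line
line = [60]
for _ in range(8):
    line.append(next_pitch(line[-1]))
print(line)
```

the nice part is that setting a weight to zero turns it back into a hard filter, so the two approaches live on a continuum.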
the other thing i’d like to look at is longer-range melodic rules. it’s still pretty crude as currently implemented, given that the algorithm can only compare the current chord and the next one. therefore, there’s no long-range melodic development possible. i’ve been looking at the fux counterpoint book to try to develop some strategies for this. i’d also like to move into territory beyond first species in the near future, maybe experiment with suspensions over the barline for more dissonance.
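one rough direction i've been sketching (emphatically not what's implemented yet - the rules and numbers here are arbitrary) is to score each candidate against a window of recent notes rather than just the previous one:

```python
def window_score(history, candidate, window=8):
    """score a candidate pitch against the last few notes of the line,
    not just the immediately preceding one."""
    recent = history[-window:]
    span = recent + [candidate]
    score = 0.0

    # keep the line inside a reasonable ambitus over the window
    if max(span) - min(span) > 12:
        score -= 2.0

    # after a leap, reward stepwise motion back the other way
    if len(recent) >= 2:
        leap = recent[-1] - recent[-2]
        step = candidate - recent[-1]
        if abs(leap) > 4 and 0 < abs(step) <= 2 and step * leap < 0:
            score += 1.0

    # discourage immediate repetition
    if recent and candidate == recent[-1]:
        score -= 0.5

    return score
```

candidates could then be sorted by this score rather than filtered outright, which dovetails with the weighted-choice idea above.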
Sounds really interesting. I hope you keep us updated on your progress.
I know way more about programming and algorithms than I do about music theory, since I never formally studied music; I've just started studying it in earnest in the last couple of years. I still feel I have a lot of catching up to do before I start getting results I'm satisfied with.
last article in my series is out today. it’s a roundtable conversation on computer algorithms in acoustic music with pianist dan tepfer, composer kenneth kirschner (12k), bassist/composer florent ghys, and jeff snyder of snyderphonics and the princeton laptop orchestra. we touch on topics including the role of algorithms in one’s compositional process, realtime notation and its performance challenges, and computer networking using max/msp, ableton link, and LANdini.
i find it interesting how these guys are approaching algorithmic composition in completely different ways, from the pre-compositional use of algorithms, to the realtime generation of graphic or traditionally notated scores, to digitally controlled acoustic instruments and musical data visualizations.
still a lot of wide open territory to explore here…
We had Essl as our Composer in Residence at the 1999 UF Electro-Acoustic Festival, and he woke a TON of people up to using the computer for compositional math instead of just brute-force speed stuff.
Yesterday was my first experience of bellringing - having read several of @mzero’s posts on the topic of ringing changes, I jumped at the chance to join a local bellringing society for an introduction to the basic techniques.
The group I have met seem focused on the practice of ‘method ringing’ - here is a resource for learning about this ancient tradition:
I know nothing about bell ringing, nor do I own an arc, but just reading the description I imagined an arc-based app with bell physics and bell position shown via visual feedback?
I am excited to announce that I am NOT releasing my algorithmic post-technoish album just yet. But I want to release it very soon. It consists mostly of studies on Mark Fell's pattern synthesis and "regularly irregular" rhythm-producing techniques, with a personal touch. I'd rather do it with a very small label than self-release on Bandcamp, but I'm currently not in touch with any labels that seem appropriate to me. Does anyone have a suggestion for a label I could work with?
This is it in its current state:
[EDIT: replaced soundcloud with the real thing, now released!]
Any suggestions/help/feedback is appreciated. Thanks!
hey @ParanormalPatroler, thanks! It's all Max/MSP, plus some live interaction with me tweaking parameters in realtime and some final MIDI touches in a DAW. Almost all of the tracks are variations on the methods Mark Fell employed on his Multistability album. There's some discussion of his methods here:
Hi @ParanormalPatroler, now that the album has been released, I discuss a bit more of the whole process here – in case you want to know more feel free to ask there!
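In case it helps to see the general shape of it, here's a very loose Python sketch of the "regularly irregular" idea (purely an illustration - nothing like the actual Max patches, and every value is arbitrary). A short, uneven cell of durations is repeated steadily, with occasional mutations so the pattern drifts over time:

```python
import random

def regularly_irregular(repeats=16, cell_len=3, mutate_prob=0.25):
    """Repeat a short, irregular cell of durations (in ms), occasionally
    swapping one value so the regular repetition slowly drifts."""
    pool = [90, 120, 150, 210, 330]                      # arbitrary duration pool
    cell = [random.choice(pool) for _ in range(cell_len)]
    onsets, t = [], 0
    for _ in range(repeats):
        if random.random() < mutate_prob:                # occasional mutation
            cell[random.randrange(cell_len)] = random.choice(pool)
        for dur in cell:                                 # regular repetition
            onsets.append(t)
            t += dur
    return onsets

print(regularly_irregular())
```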
I made a little video where I use an envelope follower to make a generative patch react to live piano improv: machine duet 1 (VCV Rack + piano improv) - YouTube. Anyone else doing anything like this? I would love to hear other techniques for combining live music with algorithmic music.
Patch notes: Orca's Heart provides the notes of the generative melody, feeding into the Vult Basal oscillator. Random voltage from the Turing Machine gets quantized by the Instruo harmonàig, which produces the chords on the Instruo troika oscillator. An envelope follower tracks the audio output from the piano (the Fazioli Experience from Pianobook.co.uk). Output from the envelope follower feeds into a variety of modulation sources, including an envelope generator controlling the density of Supercell and a delayed envelope that opens the filter on the melody.
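If anyone wants to try the envelope-follower part outside a modular setup, the core of it is just rectification plus asymmetric smoothing. A bare-bones Python sketch (the coefficients here are arbitrary):

```python
def envelope_follower(samples, attack=0.05, release=0.0005):
    """Bare-bones envelope follower: full-wave rectify the input,
    then smooth with separate attack and release coefficients."""
    env, out = 0.0, []
    for x in samples:
        rectified = abs(x)
        coeff = attack if rectified > env else release
        env += coeff * (rectified - env)   # one-pole smoothing
        out.append(env)
    return out

# the resulting envelope can be scaled into any modulation target,
# e.g. a grain-density amount or a filter-cutoff offset
```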
I made something in that direction some years ago, with the iPad generating melodies and bass notes triggered by my upright piano. I need to dig out the AUM project and get back to that!