Procedural / Generative / LLM Text, Poetry, etc.

I’m planning on putting out a zine of art / text / poetry etc. in the near future, and it’s got me thinking about methods of procedural / aleatoric writing. In general (as with my music and other art things) I prefer such methods - if I sit down with a set goal in mind, it generally gets muddled or feels too on the nose, especially with something as cut and dried as text.

My favorite method has always been the Burroughs cut-up method - I love how Burroughs ended up making entire works out of it - “The Cut-Ups” (1966), William S. Burroughs - YouTube
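For anyone who wants to try the cut-up method in software rather than with scissors, the core move is simple: chop source texts into short fragments and splice them back together at random. A minimal sketch in Python (fragment size, fragment count, and the sample texts are all arbitrary choices, not anything Burroughs prescribed):

```python
import random

def cut_up(texts, fragment_len=4, n_fragments=12, seed=None):
    """Burroughs-style cut-up: chop source texts into short word
    fragments, shuffle them, and splice a selection back together."""
    rng = random.Random(seed)
    # Pool all the words from every source text.
    words = [w for t in texts for w in t.split()]
    # Chop the combined word stream into fixed-size fragments.
    fragments = [words[i:i + fragment_len]
                 for i in range(0, len(words), fragment_len)]
    rng.shuffle(fragments)
    chosen = fragments[:n_fragments]
    return " ".join(" ".join(f) for f in chosen)

print(cut_up(["the soft machine hums in the night",
              "a ticket that exploded over the city"], seed=1))
```

A seed makes a given cut-up reproducible; leave it off for a fresh shuffle every run.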

On the software side of things, one of my favorite monome apps has always been the (very) old dunefarm - collected/dunefarm at master · monome-community/collected · GitHub

I’ve dabbled in making “poem” apps in Max too - basically just a few coll objects calling up lists of text -


What are your favorite or preferred methods of making text?


I sometimes go through old notebooks and select random snippets from what I’ve written. It’s nice if you can find unrelated segments with similar language or rhythm.


as with audio, OS 9 was full of generative text apps. i’d love to see what you came up with in max!

i’ve always found the trad cut-up method to be tried and true, and i’ve used a number of procedural ways to extract poetry from text. i “think” argeiphontes lyre has or had some great generative text tools.


Been using JanusNode most of my life - since the classic Mac era, when it started out as ‘McPoet’ - now it’s freeware on OS X and Windows.

used it for many Cementimental track/album titles, some cut-up writing, and also to write surrealist object-generation rules, which years ago I used to design an art installation - generating text descriptions of a crazy golf course and then building it IRL. Those rules (TextDNA, as JanusNode calls them) are now included in the program :slight_smile: I think I’m the world’s #1 user of this software - hope some of you will check it out too :slight_smile:


I spent a lot of time watching this in action when I was working at a museum hosting his exhibition a few years ago.


Although it’s crazy complex, it’s worth checking out Christian Bök, specifically his Xenotext project.

A good friend is working on a “completism” book (art that deals with combinations/permutations/etc…), and he was telling me about this oldschool 60s/70s writer who was using very early permutation and computation algorithms to write poetry. I’ll see if I can find out his name, but it’s definitely an interesting idea.


Most basic and most effective for me was writing a small context-free grammar for the specific use case (for example, character description generation) and using that grammar to generate text. But I have heard that people have gotten great results using recurrent neural networks with LSTMs, so it would probably be nice to check that out in the near future.
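The grammar approach is compact enough to sketch in full. Here is a toy example in Python (the rules and vocabulary below are invented for illustration, not the poster’s actual grammar): each nonterminal maps to a list of alternative expansions, and generation is just recursive random expansion.

```python
import random

# A toy context-free grammar for character descriptions.
# Nonterminals map to lists of alternatives; bare strings that
# don't appear as keys are terminals.
GRAMMAR = {
    "DESC":  [["ART", "ADJ", "NOUN", "who", "VERBS", "OBJ"]],
    "ART":   [["a"], ["the"]],
    "ADJ":   [["weathered"], ["quiet"], ["restless"]],
    "NOUN":  [["cartographer"], ["lighthouse keeper"], ["archivist"]],
    "VERBS": [["collects"], ["distrusts"], ["dreams of"]],
    "OBJ":   [["broken clocks"], ["tidal charts"], ["other people's letters"]],
}

def expand(symbol, grammar, rng):
    """Recursively expand a symbol; terminals are returned as-is."""
    if symbol not in grammar:
        return symbol
    production = rng.choice(grammar[symbol])
    return " ".join(expand(s, grammar, rng) for s in production)

rng = random.Random(42)
print(expand("DESC", GRAMMAR, rng))
```

Swapping in a deeper grammar (clauses, optional modifiers, recursion with a depth limit) gets surprisingly far before you need anything statistical.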

Chiming in just to say everything Allison Parrish does in this sphere is wonderful!


Raymond Queneau - 100,000,000,000,000 Sonnets

Alison Knowles and James Tenney - House of Dust
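Queneau’s book works by offering ten interchangeable versions of each of a sonnet’s fourteen lines, which is where the 10^14 count comes from. Addressing any one poem is just reading a 14-digit number digit by digit. A sketch of that indexing scheme (the line variants below are placeholders, obviously not Queneau’s text):

```python
# Ten variants for each of 14 lines gives 10**14 distinct sonnets.
# Poem number n is addressed by its 14 decimal digits: digit i
# picks the variant used for line i. Placeholder lines stand in
# for the real strips of the book.
VARIANTS = [
    [f"line {i} variant {j}" for j in range(10)]
    for i in range(14)
]

def sonnet(index):
    """Return poem number `index` (0 <= index < 10**14) as a
    list of 14 lines."""
    digits = f"{index:014d}"   # zero-padded, one digit per line
    return [VARIANTS[i][int(d)] for i, d in enumerate(digits)]

total = 10 ** len(VARIANTS)    # 100,000,000,000,000
poem = sonnet(31_415_926_535_897)
```

Queneau estimated it would take roughly 200 million years of continuous reading to exhaust the book, which is the joke and the point.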

I’ve generated text for pieces using Word’s AutoSummarize a few too many times. It seems to have disappeared in recent versions.


I also have used a Max patch for creating duplets and triplets of word groups or pairings. My normal process involves inputting a series of individual words, sometimes word pairs - noun and verb. These are all my “favorite” words at the time of input. When I have enough input, I turn on the metro and it randomly pairs the words. Most of it is rubbish, but some things jump out. Then the writing begins, trying to place it in a structure. Mostly I use it for titling pieces, or to inspire visual ideas. I have an Instagram post of it running for a Disquiet project.
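Outside Max, the same pairing idea can be sketched with two word lists and random draws (the word lists here are invented stand-ins for the “favorite words”, and a plain loop stands in for the metro):

```python
import random

# Stand-ins for the "favorite words" fed into the patch.
NOUNS = ["ember", "static", "harbor", "glass", "thicket"]
VERBS = ["drifts", "fractures", "hums", "unravels", "settles"]

def pairings(n, rng=None):
    """Emit n random noun/verb duplets - like a metro firing into
    two coll objects and joining whatever comes out."""
    rng = rng or random.Random()
    return [f"{rng.choice(NOUNS)} {rng.choice(VERBS)}" for _ in range(n)]

for title in pairings(5, random.Random(3)):
    print(title)
```

Most output is rubbish, as the poster says; the value is in skimming a long run of candidates for the few that jump out.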


I have used YouTube auto-captions with the cut-up approach to make Dadaist poetry. Done for fun rather than as a deeply serious artistic statement, but I was very pleased with how it worked out.


Years ago I did something similar with YouTube auto-captions, mashing up modular and speech sounds into text…

Also did a series of works which took the entirety of the Snowden archives, ran them through OCR, and then generated simple Markov transition models for every word/symbol that appeared and used those to generate text strings. Unfortunately I just found out moments ago that the Twitter bot that archived a lot of these messages for easy viewing online has been suspended for violating Twitter’s rules, but a lot of the text is in a book I made as part of the project (Vol. 2, white pages, or the edges of the black pages - warning, the PDF is large)…


interesting to see this topic here… a few days ago i had begun thinking about the possibilities for creating synthetic words using basic letter combinations.

@mdg wow, just checking out some of allison parrish’s work. this video on vectorizing words based on distributional meaning & phoneme content is pretty mind-blowing.


I’ve been using computer language translators for a million years now, embracing their mistakes and baffling word choices. For one band, I write lyrics in English and translate them into German with Google Translate; the results are bad but not horrid, and the lyrics suit the fact that inadequate technology is to blame.
My long-running favorite for hilarity, though, is passing text back and forth between Korean and English - i.e., start in English, translate to Korean, copy and paste the translation and translate back to English, and cycle until something interesting comes out.

The newest and so far strangest method is the same as above, but English to Amharic instead.

Penetrative Injury to the Face Resulting in Delayed Death After Rupture of a Cavernous Sinus Aneurysm on the Contralateral Side.
Amharic translation:
ዘግይቶ በመሞቱ ምክንያት መሞቱን አጣጥሎ የመሞቱ አጋጣሚ በካንደሬሲን ዝንጀሮ ከተሰረቀ በኋላ ማዕከላዊው ጎን ለጎን አለመስጠት.

Results of back and forth cycles:
1st cycle. Because of his death, he died at the center of the cocaine after having been stolen by a Kandorsi monkey.

2nd. After a Kondary monkey’s death, he died in a cocaine middle of the night.

3rd. After the death of the baboon, the Kannada died at midnight.

4th. Canada died in the middle of the night after the baboon was dead.

5th. After the baboon died, Canada died in the evening.
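The whole back-and-forth procedure above is just a loop around a translate call. A sketch of that loop, translationparty-style, stopping when the text stabilizes or after a fixed number of cycles - note that `translate` here is a deliberately toy stub (it just drops a word per round trip to mimic lossiness), since a real version would call an actual translation API:

```python
def translate(text, src, dst):
    """Placeholder for a real translation call (e.g. a web API).
    Toy behavior: translating *out* of English loses the last
    word; translating back is the identity."""
    words = text.split()
    return " ".join(words[:-1]) if dst != "en" else text

def cycle(text, other_lang="ko", max_cycles=10):
    """English -> other language -> English, repeated until the
    text reaches equilibrium or the cycle budget runs out."""
    for _ in range(max_cycles):
        foreign = translate(text, "en", other_lang)
        back = translate(foreign, other_lang, "en")
        if back == text:          # stabilized, nothing left to mutate
            return back
        text = back
    return text
```

With a real translator plugged in, the interesting outputs are usually the intermediate cycles, not the fixed point - as the baboon sequence above shows.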


I also used that technique to death; it was “better” when the technology was bad.
a few years ago there was translationparty.com, which automatically iterated between english and japanese, but IIRC at some point it stopped working because of some issue with google translate.

that back-and-forth english-amharic is hilarious.


I loved translationparty for saving me time. I wished they had made a version that let you choose the language - I think someone else did at some point, but I can’t find it now.


I love this guy’s stuff. Nice thread, btw.

Thank you for sharing this! I may yoink your colls for bringing into my scripts :stuck_out_tongue:

I do something similar to this for prefixing the name of each sample within a folder of samples - it creates nondescriptive but mnemonic sample names while also giving a random sort order (for anything that sorts alphanumerically). For the names to be mnemonic, it helps if they’re pronounceable.

This ends up being a huge flow helper when auditioning or hotswapping different samples in Patterning 2 and ElasticDrums (I use both on iPad a LOT).

I don’t do it for all my sample folders, but I do it for ones where the list otherwise becomes monotonous/lifeless. E.g. I had a folder with keychain shakes, misc chimes, shakers, and all manner of rustlings and “junk” percussion. I found I’d end up “preset surfing” for the perfect chime or something, and I’d deceive my ears into thinking nothing worked. Using this method to put the list on “shuffle” helps counteract the perfectionism, as well as leading to surprises.

And yet because of memorable names I still remember some distinctive ones that have worked well repeatedly, and know to try those often/first.

Will post my very simple CLI script for this later :slight_smile:
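In the same vein, a minimal sketch of this kind of naming scheme (alternating consonant-vowel syllables keep the names pronounceable; this is a generic illustration, not the poster’s actual script, and the renaming is shown on plain strings rather than on disk):

```python
import random

CONSONANTS = "bdfgklmnprstvz"
VOWELS = "aeiou"

def mnemonic_name(n_syllables=3, rng=None):
    """Random pronounceable name built from consonant-vowel
    syllables, e.g. 'tupeka'. Used as a prefix, it shuffles a
    folder's alphanumeric sort order while staying memorable."""
    rng = rng or random.Random()
    return "".join(rng.choice(CONSONANTS) + rng.choice(VOWELS)
                   for _ in range(n_syllables))

def prefix_samples(filenames, rng=None):
    """Prefix each filename with a fresh mnemonic name. A real
    CLI version would walk a folder and call os.rename."""
    rng = rng or random.Random()
    return [f"{mnemonic_name(rng=rng)}_{name}" for name in filenames]

print(prefix_samples(["chime01.wav", "keys_shake.wav"], random.Random(5)))
```

Because the prefixes are random, re-running the script re-shuffles the folder; because they are pronounceable, the good ones stick in memory anyway.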


I wrote a little Python lib for text generation with Markov chains.

It can be used from Python, but it also has a little Flask app for locally iterating in a browser (a similar flow to playing with one of the translator web apps described above: I’ll selectively copy-paste the results back in, combined with other results, for further mutations).

It’s nothing amazing or unique; to be honest it was just for fun, and to create a code sample for job hunting :sweat_smile:
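For flavor, a word-level Markov generator of this general shape fits in a few lines (a generic illustration, not the library mentioned above): build a table mapping each state of `order` words to the words observed after it, then walk the table.

```python
import random
from collections import defaultdict

def build_model(text, order=1):
    """Map each `order`-word state to the list of words that
    were observed following it in the corpus."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        model[state].append(words[i + order])
    return model

def generate(model, length=20, rng=None):
    """Random-walk the transition table from a random start."""
    rng = rng or random.Random()
    state = rng.choice(list(model))
    out = list(state)
    for _ in range(length - len(state)):
        choices = model.get(state)
        if not choices:           # dead end: no observed successor
            break
        nxt = rng.choice(choices)
        out.append(nxt)
        state = state[1:] + (nxt,)
    return " ".join(out)

corpus = "the rain falls and the rain rises and the night falls"
print(generate(build_model(corpus), length=8, rng=random.Random(2)))
```

Higher `order` tracks the source more faithfully; `order=1` on a small corpus gives the loosest, weirdest output, which for this thread is usually the goal.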

If you haven’t seen it, check out the National Novel Generation Month project:

Participants have the month of November to make a 50,000 word novel, generated by a program.

I particularly love this entry by Liza Daly, which is a James Bond novel where Q keeps showing James ordinary-looking household objects that are actually deadly weapons, and James makes bad puns.
