I wrote a little Python lib for text generation with Markov chains.

It can be used from Python, but it also has a little Flask app for iterating locally in a browser (a similar flow to playing with one of the translator web apps described above: I selectively copy-paste the results back in, combined with other results, for further mutation).

It’s nothing amazing or unique; honestly it was just for fun and to create a code sample for job hunting :sweat_smile:
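Not the actual library code, just a minimal sketch of the kind of word-level Markov generation I mean (the corpus filename is a placeholder):

import random
from collections import defaultdict

def build_chain(words, order=2):
    # map each run of `order` consecutive words to the words seen after it
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=60):
    # walk the chain from a random starting state
    state = random.choice(list(chain.keys()))
    out = list(state)
    for _ in range(length):
        followers = chain.get(state)
        if not followers:                          # dead end: jump to a random state
            state = random.choice(list(chain.keys()))
            followers = chain[state]
        out.append(random.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)

words = open("corpus.txt").read().split()          # "corpus.txt" is a placeholder path
print(generate(build_chain(words, order=2)))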

If you haven’t seen it, check out the National Novel Generation Month project: https://nanogenmo.github.io/

Participants have the month of November to make a 50,000 word novel, generated by a program.

I particularly love this entry by Liza Daly, which is a James Bond novel where Q keeps showing James ordinary-looking household objects that are actually deadly weapons, and James makes bad puns.

I found a few very old Max patches, mostly done by / done with / copied from a friend. Here are a couple of screenshots:


thanks for all the cool works, folks!

@jzed, House of Dust is the closest to the kind of thing I want to do. I think my goal is more to inject my own bits of language (either as specific as words or as vague as a style of writing or mood) and then have something scramble it.

The Max patch I posted is so simple that I barely consider it a generative process or app: instead of creating a completely new piece by itself, I give it the raw materials and hand it the job of formatting. It feels kinda lazy, but I’m not sure if it really is (it probably is). Here’s a link to the patch plus the .txt file full of raw materials: https://www.dropbox.com/s/8ryq37yqgvcmg4b/poem.zip?dl=0

Edit: the write-to-text-file function is kinda broken for whatever reason; it outputs a file, but the file isn’t natively recognized as .txt.

Check out this recent Ben Vida piece: https://technosphere-magazine.hkw.de/p/4-Heteroglossic-Riot-htw88uV5pNadqVLmj77mT9 That whole issue about “Machine Listening” has some interesting work; the C. Spencer Yeh piece is great and uses speech, but it is not generative, AFAIK.


Just joining in to follow along. I’m glad the long history of the topic is being exposed, and somebody has beaten me to mentioning both Queneau and Allison Parrish. I’ve played a lot with generative prose, both from Markov chains and from slightly more sophisticated methods such as grammars and copying the form of other texts (a toy sketch of the grammar idea is below). I also remixed a microstory by Jeff Noon into a physical artefact that generates ‘more’ of the books it’s fed, which you might enjoy.
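By “grammars” I just mean small hand-written expansion rules; a toy sketch in Python (the rules here are throwaway examples, not my actual generators):

import random

RULES = {
    "<line>": ["the <noun> <verb> <adverb>", "<adj> <noun> under <adj> <noun>"],
    "<noun>": ["tide", "machine", "archive", "letter"],
    "<verb>": ["hums", "unfolds", "dissolves"],
    "<adverb>": ["slowly", "backwards", "again"],
    "<adj>": ["purple", "hollow", "patient"],
}

def expand(text):
    # replace every <symbol> with a random expansion, recursively, until only plain words remain
    out = []
    for token in text.split():
        if token in RULES:
            out.append(expand(random.choice(RULES[token])))
        else:
            out.append(token)
    return " ".join(out)

for _ in range(3):
    print(expand("<line>"))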

Tight corpuses with a clear style make for good bodies to feed into statistical processes like Markov chains; the bot I run generating endless descriptions of chocolates works well precisely because chocolate-box descriptions are so purple.

I love this stuff.


Nothing useful, really, just wanted to say this is the best thread ever!

Cheers,
Lukasz

def! cosign 100x (and 20 char)

oh wow love this Ben Vida piece!


So much great stuff already posted!

If you want to see this sort of stuff applied to humour, with deep research, check out http://aiweirdness.com

Especially posts like this machine learning ice cream flavour generator: http://aiweirdness.com/post/173990761332/generated-ice-cream-flavors-now-its-my-turn


I forgot that I made this out of the Amharic translation cycles. Using slightly different options produced totally different translations. All the results are on the right side.

(Gotta open in new page to see large, I think. Sorry.)


“death is death and death”, that’s excellent; thanks for sharing.
What translator did you use for it?

Just Google, actually. It works totally differently depending on how you paste the translation back in (whether using the other alphabet or the English pronunciation and so on), though it looks like they have changed its method a bit since I last tried.


Oh

It’s relevant for NYC folks especially, but I’m not in NYC and I still subscribe to the newsletter of event notices, just because interesting projects get mentioned, then those people have interesting portfolios, and so on.


Yesterday I was toying with GPT-2 (https://github.com/openai/gpt-2). Unfortunately the smaller model doesn’t perform as well as the model they show in the blog post, but for an AI-related question it still responded with an answer that sounds like something an AI could say:

Model prompt >>> Will the AI overlords treat us well?
======================================== SAMPLE 1 ========================================
What do we regret, for instance? Is our children even allowed to go to you?
Do we have to navigate over ourselves in Holodomor? 
When will our children be able continue their studies, become teachers,
godlings and so much more? 
In hunger, thirst and economy,-what do we feed them?
Only--harvest them for our idleness."
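Not exactly what I ran (I used the scripts in the openai/gpt-2 repo), but if anyone would rather drive the small model from Python, the Hugging Face transformers port of GPT-2 does roughly the same thing; a rough sketch:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # "gpt2" is the small checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Will the AI overlords treat us well?"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# sample a continuation; top_k=40 is a common sampling choice for this model
output = model.generate(
    input_ids,
    max_length=120,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))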

I like ekphrasis (and I also like using that word to describe the creative process), particularly when writing haiku to describe the feelings evoked by an image. The syllable constraint makes that work for me, but I’d guess I could try another set of rules, like not using pronouns or something.

I’ve had a lot of interesting times making odd generative art with Argeiphontes Lyre:

http://akirarabelais.com/o/software/al.html


Amazing thread! I’m co-curating an exhibition that much of this relates to. If anyone is in NYC in March, please join us. Would be great to see you there!

Darling Green (@darling_green_studio) on Instagram: “Please join us for the opening of Double Negative at ChaShaMa 320 W 23rd St, NYC, curated by…”

Hey all, love this topic. I’ve always been into generative music and writing (thanks, Eno). I’m trying to find an old program I used to love but lost: Random Verse Lab. It was a simple list-and-template sort of poem generator, but it was very usable and fun, and all the links mentioning it online are dead. Anyone have an old rvl32.exe lying around?
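For anyone curious, a rough guess in Python at what a list-and-template generator like that does (this has nothing to do with RVL’s actual internals):

import random

NOUNS      = ["winter", "engine", "letter", "shoreline"]
VERBS      = ["remembers", "refuses", "repeats"]
ADJECTIVES = ["quiet", "borrowed", "electric"]
TEMPLATE   = "the {adj} {noun} {verb} the {noun2}"

def line():
    # fill the template with random picks from each word list
    return TEMPLATE.format(
        adj=random.choice(ADJECTIVES),
        noun=random.choice(NOUNS),
        verb=random.choice(VERBS),
        noun2=random.choice(NOUNS),
    )

for _ in range(4):
    print(line())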

This is pretty astonishing / creepy: https://talktotransformer.com/