11 Comments
May 7, 2022 · Liked by Roger’s Bacon

I'm currently working on a piece about the limitations of AI visual art. It seems to me that DALL-E, etc. are much more restricted in their abilities than the AI story generators. Which only means we will have to be more judicious and careful in our consumption of news and information . . .

author

Interesting, I’ve gotten that sense too from looking at DALL-E art but I’m not sure I can put my finger on what feels “off”. Do send the article over when it’s done!

May 16, 2022 · edited May 16, 2022 · Liked by Roger’s Bacon

Here it is: https://www.ruins.blog/p/ai-art?sd=fs


One can also imagine a bleak future where stories are regulated to stay beneath a quantifiable level of impact, regardless of whether they were human-originated or not, simply because it's no longer possible to tell the source of a story.

author

Ooof that is bleak. Someone needs to write a dystopian story about this…

May 9, 2022 · Liked by Roger’s Bacon

Stories as technologies aligns with Ursula Franklin's notion of technology as the way things are done around here, or simply, technology as practice. She pinched the idea from Kenneth Boulding, who pinched it from ..... Franklin, U. M. (2004). The Real World of Technology (Kindle ed.). Anansi.

May 11, 2022 · edited May 11, 2022 · Liked by Roger’s Bacon

Question: if stories are technology, then what would be the difference between a "story developer/analyst", a "story scientist", and a "story engineer"?

Speculation: 1. Archive of Our Own or FanFiction (copy/derivative), Wikia/Fandom or TVTropes (dissective), deviantArt or WebToons (innovative). 2. Also I would predict that this will follow a 70-20-10 distribution. 3. The 10x storytellers (10x the productivity and 25x the rigor between the top 4% and the bottom 4%) will be in demand in the near future.


I like the broad context and implications of this piece. We are absurdly vulnerable, given how our minds really want everything to be part of a story. I ran across some vignettes about malevolent story-telling outcomes at the AI Vignettes Project. That inspired me to write a long-winded, dark story of my own (Artificial Persuasion) wherein persuasive AIs pursue their unbounded goals and we don't even know what hit us. On the other hand, I've been trying to get GPT-3 to write an intro for a friend's photography book, and it's very hard to get anything out of it that isn't either irrelevant or trite. But, as you point out, we haven't (presumably, as far as we know =:-0 ) trained the models for max(persuasion). And somebody will probably do that, either to gain power or just to see what will happen.

author

Absurdly vulnerable to stories is a good way to put it lol. Have you read Scott Alexander's story about Shiri's scissor? Not exactly about AI persuasion but very related to what you said: https://slatestarcodex.com/2018/10/30/sort-by-controversial/


Just read SA's wonderful story. Training a system for maximal controversy is very like training for persuasiveness; it's just persuasiveness X 2. We already had training for maximum attention-grabbing in 2018. And all three tasks are pretty closely related. There will be blood. It's funny that he (SA) is such a strong advocate for real discussion and listening to those who disagree with you, yet he writes a story in which an algorithm can find, for any group, a controversial statement that will destroy that group. I did not know of the story before, but I would guess it is one of his more famous posts.

author

Haha yup very ironic. I only came across it recently but I think it's pretty well known and you can see why.
