When you say “By carefully crafting prompts you can put words on a page that look or feel to us like a story, but to GPT there is no story,” I think your issue is with how it’s doing it? But the results are the results. It doesn’t matter how the cake is made if customers are happy to eat it. The same “forgery” happens with human-written work; writers will pop back to the beginning of a book to slip some foreshadowing in later.
The fact the software doesn’t comprehend the whole story at once is largely irrelevant? Humans forget huge chunks of a story too. I’ve written a lot of books and I’m frequently amazed at what I wrote when I go back and read them. I’ve slipped in ‘artificial’ foreshadowing at later times, and I’ve only been able to keep track of the whole structure by referring to my outlines. I can get an AI to do the same thing. You can get it to keep the “outline” in mind.
And with your first point, “what did character A mean when he said this?” That would depend on the context. If it was referencing something that happened outside of the context window, then it would indeed just be guessing. But if it had the story outline as well, it would probably get it right. A forgetful human would suffer the same problem.
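The “keep the outline in mind” idea above can be sketched concretely. This is a minimal illustration under assumed names and numbers (the function, the labels, and the character budget are all invented for the example, not any real library’s API): it just shows how you might pin a story outline at the top of every prompt and fill the remaining context budget with the most recent prose, so a question about something outside the window can still be answered against the outline.

```python
def build_prompt(outline: str, manuscript: str, question: str,
                 budget_chars: int = 2000) -> str:
    """Assemble a prompt: the full outline plus as much recent prose as fits.

    The outline and question are always included in full; the manuscript is
    truncated from the front so only the most recent text survives.
    (Characters stand in for tokens purely to keep the sketch self-contained.)
    """
    fixed = len(outline) + len(question)
    room = max(0, budget_chars - fixed)
    recent = manuscript[-room:] if room else ""
    return (f"STORY OUTLINE:\n{outline}\n\n"
            f"RECENT TEXT:\n{recent}\n\n"
            f"QUESTION:\n{question}")
```

With something like this, “what did character A mean?” arrives at the model alongside the outline even when the scene being referenced has scrolled out of the recent text, which is all the argument above requires.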
I think (correct me if I’m wrong) that you think there’s something deeper to story; that it has some kind of… I don’t know… soul? Heart? that needs a human to comprehend it in order to create it. But I don’t believe this is necessarily the case.
About a year ago I was very confident that we were at least a decade away from AI creating whole novels. But things happened a lot faster than I was expecting. Less than a year later, I know people publishing novels entirely created by AI! The fact the AI doesn’t “understand” the stories is moot, to my mind. Humans do a lot of things without understanding what they’re doing as well. (Walking, running, jumping, etc.)
'It doesn’t matter how the cake is made if customers are happy to eat it.' - this seems rather shortsighted. The Romans used to sweeten their wine with lead. Yum yum, and the effects of heavy metal poisoning are slow and subtle. The whole idea of AI is to deceive ourselves; that is the heart of the Turing Test, after all. We are not creating intelligence but simulating it.
'Writers will pop back to the beginning of a book to slip some foreshadowing in later.' - this isn't the forgery; this is exactly what GPT can't and doesn't do. That's the right way to do it: the story evolves and grows in the writer's mind such that the later events exert a ripple effect changing the earlier events. There is an actual 'something' there in the writer's mind that is 'the story'. The proof of this is that the writer knows that the words may or may not tell the story well, or accurately. When the author changes the words he feels that they must be changed in a way that makes them more faithful to 'the story'. He is trying to communicate something to the reader which is not words on a page.
The story has an independent existence. If I ask you to think of the story of Hercules, it is unlikely that you are thinking of a particular telling of it; perhaps some images that you have seen pop into your mind, but neither the images nor the words that you have read are the story itself. And in judging any telling of the story you judge the words and the story separately. You might find one telling has particularly apt words and is well written, both technically and artistically, but criticise it because it does not match the Hercules story that you feel to be most real, on whatever grounds you make that judgment call.
A great example of this is The Hunchback of Notre Dame. The story of this outcast, this 'half-made' (Quasimodo) man, his complicated relationship with Frollo, his unrequited, even unperceived, love for Esmeralda, and all of its interweavings is fascinating. But Victor Hugo is a terrible storyteller. Most of the book is almost unreadable minutiae about the cathedral and the streets of Paris. To Hugo, the story was actually about the building. The actual title is Notre-Dame de Paris; the characters and story were just something that happened at a terrifically interesting building, to him. But, having received the story, I can disagree with the author, because it has an objective reality. I can find elements of the story jarring and incongruous, because the author has failed to make a particular element match the rest of the story. I can say, 'Quasimodo hates Frollo,' even though Hugo perceives him to love Frollo. I can rewrite the story with a happy ending, as several people have, and it remains the same story. Anyway, my point is that words and stories are different and separate things. If they aren't, then the writing is flat and stupid like an AI's. And no increase in RAM or streamlining of algorithms or dumping more training data in will change that.