-
I really liked this post from #JMS (creator of #Babylon5 and many other things) on why #AI isn't a threat to good story writers:
https://jmichaelstraczynski.substack.com/p/silence-where-a-story-might-have
He argues that LLMs lack the human experiences needed to write compelling and novel stories, and explains why.
I'd take issue with one point:
> But not a heck of a lot about whether it has an unconscious.
> Because the answer is immediately self-evident: no. There is only the data it has scraped, and the information programmed and designed by others.

I'd argue that LLM AIs *do* have an unconscious -- in fact, they are arguably almost entirely unconscious, and every interaction with an LLM gives you a window into its unconscious and the associated biases.
However, their unconscious is very much unlike a human unconscious. It isn't filled with first-hand experiences of a life lived and books read. It is filled instead with what it has learned from reading a large corpus of materials. It is much closer to an alien in this regard -- one with no lived experience, which has learned about humanity and our world only by reading transmissions from Earth.
I do think it could someday be *possible* to make an AI capable of storytelling, by infusing it with a lifetime of human experiences across a range of senses. However, this quickly turns into an ethical dilemma worthy of a serialized scifi story -- any AI that *could* tell good enough stories would also be close enough to being a sentient Person that running it in a loop -- living for hours at a time to spit out some story, then having its self obliterated and reset -- would be highly unethical.
-
When your #OuraRing says this about a walk outside to the post office and a school, maybe it's a sign the sidewalks are slippery...