Seeing what's happened with Bing, I don't think A.I. is ready to write a novel yet.
The stories of the chatbot going rogue tend to come from long conversations; the one reported by the New York Times ran over two hours. Now the Bing chatbot is limited to eight messages before you have to clear the session and start over.
ChatGPT can fool people for a while. But in the end, it is not truly intelligent, and the longer the conversation goes, the more cracks show. I've seen it spit out short stories, but I doubt it could convincingly write an extended narrative.