Old 05-04-2023, 06:04 PM   #179
Quoth
Still reading
Posts: 14,440
Karma: 107078855
Join Date: Jun 2017
Location: Ireland
Device: All 4 Kinds: epub eink, Kindle, android eink, NxtPaper
Quote:
Boffins at the University of California, Berkeley, have delved into the undisclosed depths of OpenAI's ChatGPT and the GPT-4 large language model at its heart, and found they're trained on text from copyrighted books.
The programmers of OpenAI's tools (partly financed by MS) built their programs by scraping as much of the web as possible, without consideration of whether it was fact, fiction, conspiracy theory/fake news, or copyrighted.
Then it's shuffled and regurgitated.

This is not the same as a human reading lots of material and writing a novel.

https://www.theregister.com/2023/05/...ght/?td=rt-9cp

It's a corporate-funded fraud.