Quote:
Originally Posted by Solitaire1
A question to ponder is: Does an AI require emotions to become safely sentient? In fiction, it often seems that making a sentient AI without emotions leads to disaster. The following compilation of scenes from Green Lantern - The Animated Series provides an example (beginning at 6:49, Aya has turned off her emotions), and inspired this post:
Ye gods, let's hope not, although I fear that's how most people would adjudicate "sentience," unfortunately. That's all we need: a neurotic or paranoid computer. Mine is cranky enough without a midlife crisis, or what-have-you.
Quote:
New topic: Is it possible that, in the future, movies will no longer be finished products but works in progress?
An example of this is the original concept for the Disney movie Fantasia. Originally, it was supposed to have segments added and removed each time it was released, but the idea was put to the side. It was revived a bit with the release of Fantasia 2000, where all of the segments but The Sorcerer's Apprentice were removed and replaced.
In a related way, could movies become more like choose-your-own-adventure books? You buy a movie, and the movie you actually see depends on the choices you make while watching it.
I dunno...seems bad enough that we are seeing hundreds of thousands of self-pubbed books that any sane person would term "works in progress" rather than finished products. How's that working out for all of us?
Hitch