
The summer cyber cinema

We can’t build artificial intelligence if we don’t know how the natural kind works.

by Nelson King

You know the gambit: “Did ya see the movie …?” It’s one of those stock lines people use with friends and acquaintances to provide five to 10 minutes of conversation and a chance to share opinions and recommendations. I’m going to start this column the same way, hopefully to keep the content from becoming too heavy: Did ya see Spielberg’s “A.I. Artificial Intelligence” or Hironobu Sakaguchi’s “Final Fantasy: The Spirits Within”?

They were the two big cyber movies of the summer. Much as I’d like to review them, this isn’t the venue. I’ll just say that neither movie is going to be in many (or any) of the critics’ top 10 lists, and judging from the relatively mediocre ticket sales, they weren’t big hits with the public either. I wouldn’t be surprised if many COMPUTERUSER readers haven’t seen either movie even though both have considerable significance with respect to computing.

The Spielberg opus is unusually dark, which is the code word for a movie that seems pessimistic and unhappy–it’s a downer. As a rule, Spielberg doesn’t do dark movies, although he likes to cast some fairly deep shadows into his stories. Even his serious movies end on an uplifting note despite graphic inhumanity (“Schindler’s List,” “Saving Private Ryan”). This time, however, Spielberg developed much of the story (and certainly its themes) with Stanley Kubrick, and no doubt the creator of the movie “2001: A Space Odyssey” brought much of the darker perspective to the project that even a Spielbergian sentimental ending can’t lighten.

“A.I.” diddles around with a story much like “Pinocchio.” It tells the story of David, a little android boy (or “mecha”) who wants to be a real boy. David shares a similar intention with the android character Data (from “Star Trek: The Next Generation”), who wants to become more human, but the environments are vastly different.

In the “Star Trek” series, the humans around Data discuss, encourage, and guide Data’s desire to become more human. It’s a positive support group in a social setting that accomplishes incredible feats of technology and has come close to endowing the android with the capacity to be fully human. No such luck for David in “A.I.” The humans in the story are an unappealing lot. It’s no wonder, because the milieu is bleak–a post-global warming world where the great coastal cities have been flooded and people are hanging on to the shambles of a technological economy.

It appears that the level of cyber sophistication is nearly as high as that in 23rd-century “Star Trek,” since the android boy is quite human in behavior (though played with a wonderfully spooky abstractness by Haley Joel Osment). After bonding with his mother, David even develops complex emotions, something that didn’t appear until the late episodes of “Star Trek: The Next Generation.”

Unlike in “Star Trek,” neither the computer technology nor the economic background that underlie the story in “A.I.” is convincing. The movie simply presumes whatever level of technology the story requires, a tip-off that the storyteller isn’t interested in the technology itself. The result is a picture of how we (a.k.a. Steven Spielberg or Stanley Kubrick) currently react to very human-like androids, not of how the technology evolved or how people that far in the future might actually grow into using androids. This isn’t quite a criticism, since a lot of science fiction makes similarly abrupt presumptions. However, there is still a requirement of plausibility, and there are many scenes in “A.I.,” such as the encounter with David’s creator, Prof. Hobby, that ring false.

The movie asks us to believe that mankind is encountering the issue of emotion-capable androids for the first time. An early scene takes place in a quasi-corporate setting, where Prof. Hobby and his assistants discuss the need for emotion-true androids as if it were a new topic raising a few open-ended metaphysical questions. I guess these folks never watch “Star Trek” reruns or read any science fiction books.

It’s ironic–no, anachronistic–that Spielberg spotlights artificial intelligence at a time when most of the workers in the field have given up grandiose schemes for reproducing human intelligence. We’re learning a great deal about intelligence, and we do know that it isn’t massive logic; but we can hardly build a computerized artificial intelligence if we, as yet, don’t know how the natural one works.

Never mind. Spielberg is into making a fable, albeit a dark and difficult fable. The story uses David as the focus of a very human ambiguity: He looks and behaves like he wants love. We cotton to creatures that appear to love us, like puppies or kittens. Yet we know David is not human; we do not know what he really is or wants. The uncertainty repels us because he looks so human, but he is also strange. He could be a threat. In the story, most humans, including the family that adopts David, demonstrate this ambiguity. The natural son schemes to destroy David as a rival. The father plainly dislikes him. The mother suffers the agony of conflicting feelings, but elects to abandon David. On his own, David must escape from other humans who destroy androids for entertainment.

By the end it’s clear that David should be acceptable as human, except we know he isn’t. In fact, the audience has seen him dive from skyscrapers, live underwater, and survive 2000 years. The latter part of the movie wanders into a bewildering muddle of icons, aliens, and metaphysics. I guess we’re supposed to believe that David carries the essence of humanity into a time when humans are extinct. The aliens learn from him, and then he too passes into some kind of oblivion. Somehow this isn’t very comforting or inspirational if you’re human.

If the story of “A.I. Artificial Intelligence” doesn’t quite get at the issues of computational capability, you could try “Final Fantasy: The Spirits Within”–not because of its story, but because of how it was made.

For starters, “Final Fantasy” is based on a very popular video-game series, so it has computing in its genes, so to speak. It’s also no surprise that the plot isn’t sophisticated: Dr. Aki Ross and Dr. Sid are trying to save the world from the phantoms who arrived via comet from some other planet and are now sucking the life essence from every human not protected by a city-shield. Besides the killer phantoms, the enemy includes the military led by a misguided and ultimately evil Gen. Hein. Naturally, Dr. Aki is in love with a military guy who helps her triumph in the end.

Like the story, the characters are cartoonish, with about as much depth as a sheet of plastic wrap. “Final Fantasy” sounds like a cartoon, but it’s not; it’s the first movie to attempt entirely computer-generated, photorealistic human characters. Unlike (most) actors, they can’t think independently, but these computer creations sometimes look like real people. It’s the appearance of reality that is striking: the little hand and eye motions, the flow of hair. In the representation of humanity, the movie is definitely a huge step up from cartoons, claymation, or any other technique.

In “A.I.,” David is played by a real actor, who does a masterful job of giving the character reactions and mannerisms believable for an android. The characters in “Final Fantasy” give tiny but constant signals that they aren’t actors. Yet this movie shows that someday, in the probably not-too-distant future, digital creations will look like real actors. Maybe David could be played even more convincingly that way. Then the question becomes: How will we know when we’re looking at computer-generated actors? In a sense this is like the mother in “A.I.,” who doesn’t know if she can trust android emotions. Will we be able to trust our perceptions? Will it matter?

The Screen Actors Guild thinks it does; it tried to organize a boycott of “Final Fantasy.” The concern is not wholly premature. While it may be decades, if ever, before a digital character can give a performance worthy of a Streep or a Hanks, moviemakers are already inserting computer-generated crowds and extras. Even famous people and actors have been computer-mimicked, if only briefly.

Coming from different directions, “A.I.” and “Final Fantasy” raise issues that go back to some of the oldest philosophical questions: In life, what is real and what is illusion? However, since so many people don’t care a whoop about philosophy, how about putting actors out of work? Do we care about that? What about replacing your unruly kids with some nicely programmed androids? (Forget the messy genetic cloning stuff.) Do you want to hug an android? The folks on the starship Enterprise don’t mind; should you?

