The Other Worlds Shrine

Your place for discussion about RPGs, gaming, music, movies, anime, computers, sports, and any other stuff we care to talk about... 

  • James Cameron's "Terminator 3" Announced, and it sounds great

  • Your favorite band sucks, and you have terrible taste in movies.
 #170290  by Julius Seeker
 Wed Nov 01, 2017 11:35 am
James Cameron and Deadpool director Tim Miller are developing an ambitious arc of three new Terminator films.

To give some background: Cameron didn't like the story of the original Terminator 3, and he now has control of the license again. The new series will begin 30 years after Terminator 2 and will ignore the later films. At the core of the concept is the loss of control of AI. In Terminator 3 it occurred because the military developing it was stupid, hooking Skynet up to all sorts of things without thought, and Cameron didn't like that. In Cameron's view, the loss of control occurs when AI technology being developed by optimistic scientists is taken over by military organizations for the purpose of weaponizing it against all potential enemies. In other words, it's not Skynet that thought up ways to build robots and kill humans, but rather humans themselves who developed Skynet that way. He doesn't want to treat this as just a plot point in the last half hour of one turn-off-your-brain action spectacle, but rather develop it across three films. The moment of AI self-awareness doesn't seem to be the core of the conflict in his new vision, as it was in T3.

Below is the video, and further below I try to break it down a little more.



This has happened before with technology: atomic energy is the example he brings up. The original scientists were incredibly enthusiastic about powering the world with an extremely efficient new technology. The first manifestation of it revealed to the public was the deaths of hundreds of thousands of people, followed by decades lived under the threat of nuclear terror.

This has occurred with things other than technology, too. Take eugenics, the sort of thing modern nations would consider taboo. In the late 19th and early 20th centuries it began to become generally accepted as a way of improving the human genome, making us healthier and more productive as a species. That acceptance became weaponized in a way, and began to manifest itself as genocide against a variety of different nationalities and cultures/ideologies, particularly in demagogue-led nations (Khmer Rouge Cambodia, the USSR, Maoist China, Hutu-led Rwanda, and Nazi Germany).

This is a theme that really interests me. We've read Asimov's vision: AI development that took us to the hyperdrive and galaxy-wide colonization. But even in Asimov's vision, human nature brought it all down; the difference is that he didn't involve weaponized AI... at least not until Gaia and Galaxia, which are like a benevolent Borg. He never really entertains the notion that AI could be weaponized.

What I can glean from the video is that he didn't like what Rise of the Machines did. In that story, a government organization happens to have set up some early prototypes and haphazardly hooked Skynet into all of their systems, even giving it access to all the nukes - because, you know, why not? As I explained above, Cameron seems to believe this is a silly story, and that the better story is about scientists with good intentions and a military with the intention to weaponize. The AI being "out of control" didn't begin when Skynet became self-aware; it began when those well-intentioned scientists lost control to military organizations (and perhaps hacker groups and the like) with weaponization and genocidal goals.

All of this seems much closer to the original vision. The first film establishes the technology from the future, and the second has Sarah Connor raging against the scientists who built the AI. So...

MY THOUGHTS ON THE THIRD FILM'S STORY
based on James Cameron's interview.

Thirty years later, AI is again reaching the level of (Terminator-style) learning computers. They look like the next big optimistic step. Sarah is against it, but most people see her as a bit of a kook who will hold back the progress of civilization - and written well, that should really appear to be the case. The problem is NOT machines becoming self-aware and conspiring amongst themselves to overthrow humanity, but rather hacker groups taking advantage of this AI technology for malicious purposes. The major issue comes when rival nations begin developing it for use against each other in a sort of AI arms race, each government making things bigger and more dangerous, building armies of machines - AI-controlled drones, tanks, and classic T-800-style assassins.

Will there be a Skynet this time around? I don't think so. Skynet was effectively dead at the end of T2, and making it re-manifest itself 30 years later seems a little lazy and uninspired. I also don't think there will be a Judgment Day looming as an impending disaster the way T1 and T2 had it - Cameron noted he is less concerned about nukes this time around. I think he wants to explore new ground rather than retread it the way the non-Cameron Terminator films did. The series doesn't need to shed everything, because T-800-style assassins and HK-style drones are still applicable today. The idea of an AI-controlled nuclear launch system definitely is not, though; if anything, it would be something more like an advanced SDI to block nuclear attacks.