53: The 1995 Anime “Ghost in the Shell” Is More Relevant than Ever in Today’s Technologically Complex Society (Maynard)
Sarah Wangler and Tina Ulrich
Andrew D. Maynard
#review, #descriptive, #reportinginformation, #argument, #currentevents, #kairos, #technology
When the anime movie Ghost in the Shell was released in 1995, the world wide web was still little more than a novelty, Microsoft was just beginning to find its GUI-feet, and artificial intelligence research was in the doldrums.
Against this background, Ghost was remarkably prescient for its time. Twenty-three years later, it’s even more relevant as we come to grips with advances in human augmentation, AI, and what it means to be human in a technologically advanced future.
Ghost in the Shell is one of twelve science fiction movies that feature in a new book that grapples with the complex intersection between emerging technologies and social responsibility. In Films from the Future: The Technology and Morality of Sci-Fi Movies (from Mango Publishing), I set out to explore the emerging landscape around transformative trends in technology innovation, and the social challenges and opportunities they present.
The movies in the book were initially selected to help tell a story of technological convergence and socially responsible innovation. But to my surprise, they ended up opening up much deeper insights into the nature of our relationship with technology.
Identity-hacking
Ghost in the Shell opens with cyborg special-operative Major Kusanagi helping track down a talented hacker—aka the “Puppet Master”—who’s re-writing people’s “ghost”, or what makes them uniquely “them”, using implanted brain-machine interfaces.
Kusanagi, who is herself almost entirely machine, inhabits a world where human augmentation is commonplace. This technological augmentation provides her and others with super-human abilities. But it also makes them vulnerable—especially to hackers who can effectively re-code their memories.
This seems to be the modus operandi of the Puppet Master. Yet as the narrative unfolds, we learn that this is not a person, but an AI developed by US security services that has escaped the leash of its handlers.
The Puppet Master (or “Project 2501”, as it’s formally designated) is seeking asylum from its US masters. But it’s also looking for meaning and purpose as a self-aware entity.
Through the ensuing story, Ghost touches on a number of deeply philosophical questions that lie at the heart of society’s relationship with increasingly powerful technologies. These include what it means to be human, the value of diversity, and even the nature of death. As Emily Yoshida so aptly put it in their Beginner’s Guide to the Ghost in the Shell Universe, Ghost is a “meditation on consciousness and the philosophy of the self”.
This is where the film comes into its own as it jolts viewers out of the ruts of conventional thinking, and leads them to reflect more deeply on the potential social impacts of technologies like AI, human augmentation, and computer-brain interfaces.
Navigating responsible brain-machine augmentation
In 2016 Elon Musk established the company Neuralink to develop science fiction-like wireless brain-machine interfaces. Inspired by the neural laces of Iain M. Banks’ Culture novels, and echoing Ghost, Musk announced on Twitter that, in his opinion, “Creating a neural lace is the thing that really matters for humanity to achieve symbiosis with machines.”
Yet as Ghost in the Shell so presciently illustrates, where you have read-write brain connections, you’re likely to have brain-hackers.
It’s by no means clear how successful Neuralink will be (the company is still largely flying under the radar). But its launch coincides with intense efforts to better understand and control the human brain, and with breakthroughs in optogenetics that could one day enable wireless machine-mind networks.
Given these and similar developments, it’s not beyond the realms of possibility that someone will try to fit a person with an internet-connected brain interface that can write as well as read what’s going on inside their head, and that someone else will attempt to hack into it.
Developing such capabilities responsibly will require great care as scientists and others tread the fine line between “could” and “should”. And it’ll demand novel ways of thinking creatively about what could possibly go wrong, and how to avoid it.
This is where films like Ghost are remarkably helpful in illuminating the risk-landscape around such technologies—not because they get the tech right, but because they reveal often-hidden aspects of how people and technologies interact.
But Ghost’s insights go far beyond unpacking the problems of hackable brain implants.
Who owns and controls your augmented self?
Throughout Ghost, Major Kusanagi is plagued by doubts about who she is. Do her cybernetic augmentations make her less human, or of less worth? Is her sense of self—her “ghost”—simply an illusion of her machine programming? And what autonomy does she have when she malfunctions, or needs an upgrade?
These are questions that are already beginning to tax developers and others in the real world. And as robotic and cyber technologies become increasingly advanced, they are only going to become harder to navigate.
In 2012, the South African athlete Oscar Pistorius made history by becoming the first runner to compete in the Olympic Games on two prosthetic legs. His iconic racing blades came to represent the promise of technological enhancements to overcome human limitations. Yet fears that they gave him an unfair advantage had led to him being barred from competing in the previous Olympics.
The same year that Pistorius successfully competed in the Olympics, the Canadian researcher Steve Mann was allegedly assaulted because his computer-augmented eye extension offended someone. And in 2015, patient-advocate Hugo Campos discovered he didn’t legally have access to the data generated by the implanted defibrillator that kept him alive.
These are all relatively small examples of the tension that’s growing between conventional thinking and human augmentation. But they illustrate how the angst that Kusanagi feels about her augmented body, and how it defines her, is already part of today’s society. And we’ve barely touched the tip of this particular iceberg.
Again, this is where Ghost forms a powerful canvas on which to explore challenges that often transcend conventional thinking, and play out at the borders of our moral and ethical understanding. Watched in the right way, it can help reveal hidden truths around our relationship with the technologies we’re building, and guide us toward more socially responsible ways of developing and using them.
This, to me, is a power inherent in science fiction movies. And it isn’t limited to Ghost—in Films from the Future, I draw on films ranging from Never Let Me Go and Minority Report to Ex Machina to tease out insights into the moral and ethical challenges and opportunities that increasingly powerful technologies present.
Having immersed myself in these movies and the technologies that inspire them, it’s clear that, if we want to ensure these trends don’t cause more problems than they resolve, we desperately need the perspectives that movies like Ghost in the Shell and others reveal.
The alternative is risking losing our own “ghosts” in the drive to innovate bigger and better, without thinking about the consequences.
_____________________
Dr. Andrew Maynard is the author of Films from the Future: The Technology and Morality of Sci-Fi Movies (Mango Publishing, 2018), a physicist, and a leading expert on the socially responsible development of emerging and converging technologies in the School for the Future of Innovation in Society at Arizona State University. He can be found on Twitter at @2020science.