Great sci-fi films are both fantastical and prescient. They can whisk viewers to a galaxy far, far away or exaggerate real scenarios on a fictionalized version of our planet.
But the genre is at its best when it holds up a “fun house mirror to our present” and reveals something about the world in which we live, said Lisa Yaszek, professor of science fiction studies at Georgia Tech.
“As audiences, we love both optimistic and pessimistic science-fiction films like these precisely because they are virtual laboratories where we can imaginatively experience the best and worst our technologies have to offer in a safe and fun environment,” Yaszek told CNN.
Films such as “Gattaca,” “Her” and even the horror comedy “M3GAN” have predicted what our future might look like if developments in gene editing and artificial intelligence accelerate. Meanwhile, pandemic thrillers such as “Contagion” seem even more realistic than they did upon their release after Covid-19 drastically upended the world in 2020.
Here’s what some notable films get right about science and tech — and what’s still the stuff of science fiction.
“Gattaca” drew inspiration from real events leading up to its 1997 release — including the Human Genome Project launch in 1990 and the successful cloning of Dolly the sheep, Yaszek said — and the film imagines a society obsessed with and dictated by genetic perfection. It seems to “eerily anticipate our own society’s current fascination with at-home genetic tests like 23andMe,” Yaszek noted, as well as recent advances in gene editing that hold promise for human health.
In the movie, genetics decide social class. Gene editing becomes the norm, and characters who are born without it are considered “in-valids” with a greater potential for hereditary disorders than “valids,” humans genetically engineered to avoid those illnesses. Vincent Freeman (Ethan Hawke), an “in-valid” cleaner at an aerospace facility, uses genetic material such as fingernails and urine from paraplegic former Olympian Jerome Morrow (Jude Law) to fraudulently join an interplanetary mission reserved for “valids.”
“Gattaca” came out about 15 years before the introduction of CRISPR-Cas9 as a tool used to make precise edits to human DNA. Though it’s mostly been used for research purposes, CRISPR-Cas9 has appeared to make a notable difference in genetic disorder treatment: A woman named Victoria Gray said her sickle cell disease symptoms were significantly alleviated after scientists treated her using CRISPR, CNN reported in March. Scientists removed premature cells from Gray’s bone marrow and modified them. The gene-edited cells, once returned to Gray’s body, appeared to have produced fetal hemoglobin, a type of hemoglobin that makes it harder for cells to sickle and stick together.
Current gene therapy trials — including the sickle cell trial Gray was a part of — involve altering nonreproductive cells in what’s known as somatic gene editing.
But the process of preemptively manipulating the genes of human sperm, eggs or embryos in a way that evokes “Gattaca” — called heritable gene editing — has raised serious ethical concerns. In 2018, Chinese doctor He Jiankui said that he had modified two human embryos using CRISPR-Cas9 and that the modifications would make them resistant to HIV. The scientific community swiftly condemned his work, and he received a three-year prison sentence in 2019.
Society’s fascination with artificial intelligence has resulted in no shortage of films that depict both its potential to facilitate a more advanced way of life and the hypothetical horror of AI overtaking humanity.
“These movies tend to reflect both our hopes and fears about our increasing reliance on digital companions,” Yaszek said.
In Spike Jonze’s “Her,” Joaquin Phoenix’s Theodore falls in love with Samantha, an advanced AI operating system who says she returns his affections. Siri, she is not: Samantha speaks with a human affect and has opinions and feelings, or at least is programmed to. It’s the rare sci-fi film that doesn’t villainize AI that’s able to mimic — or even genuinely feel — human emotion.
Samantha — who could even view the physical world through a camera lens and comment on it — does not yet have a perfect equivalent in our own world, but there are some realistic AI-powered virtual assistants. Popular chatbots such as ChatGPT can closely imitate human speech and have been used to write extensive essays and answer complex questions posed by users, though they aren’t perfect. The tech outlet CNET published several articles generated by AI that contained major errors. And AI experts told CNN this year that they fear chatbots could be used to perpetuate disinformation since they’re programmed to give users more of what they’re seeking and hold their attention.
While “Her” humanized AI, the 2022 horror hit “M3GAN” tapped into viewers’ fears. M3GAN is a humanoid doll and caretaker for the young Cady, who loses her parents in a car accident, and the two form a sisterly bond. But M3GAN takes her duties as android big sister dangerously seriously, murdering anyone who threatens Cady or Cady’s trust in her.
Yaszek noted that robot caretaking tools are already in use: Nursing homes in Japan have for years employed robots to entertain and engage residents. Studies on whether the quality of elder care has improved in the country are ongoing, but several senior care facilities in Minnesota last year took a cue from Japan and started to incorporate robots built by University of Minnesota Duluth experts into residents’ care routines.
There are independent robots that deliver food, perform stunts at Disney’s California Adventure and dispose of bombs on behalf of police departments. Commercial robots are nowhere near as lifelike as M3GAN. But her AI capabilities — known as artificial general intelligence, which describes a bot’s ability to learn anything a human can — are closer to being a reality, said Shelly Palmer, a Syracuse University professor of advanced media and an expert in emerging tech, in an interview with CNN in January.
“We may be both grateful for these tools but also a bit worried,” Yaszek said. “What happens if these marvelous new technologies break down and leave our loved ones more vulnerable than ever before?”
During the early days of the Covid-19 pandemic, many turned to Steven Soderbergh’s “Contagion,” a 2011 film that depicts the stunningly quick spread of a deadly virus across the world. Upon its release, a scenario in which the world could change so drastically in a matter of days or weeks seemed unlikely. But when Covid-19 sent much of society into isolation in 2020, “Contagion” seemed like a prescient example of what a pandemic response could look like.
Even before Covid-19, experts at the Argonne National Laboratory, operated by the US Department of Energy, praised the film in 2012 for accurately portraying the rate at which a society would experience shortages of resources and the collective effort it takes to address a rapidly spreading virus.
Kelly McGuire, associate professor of English at Trent University in Ontario, wrote in 2021 that “Contagion” presents the development of a vaccine as the “end point of the arc of pandemic,” when, in our Covid-19 reality, the virus may never be eradicated despite the widespread availability of Covid-19 vaccines and boosters.
Though the Covid-19 vaccine has prevented more than 3 million deaths, according to one 2022 study, hundreds of thousands of Americans continue to be infected with the virus and thousands die monthly, according to the US Centers for Disease Control and Prevention. Immunocompromised people and those who haven’t received the vaccine remain at a higher risk of serious illness and death.
Reality has often surpassed the bounds of sci-fi, said Melissa Monique Littlefield, a professor at the University of Illinois Urbana-Champaign who teaches courses on sci-fi and speculative fiction. Even when our reality feels stranger than fiction, though, stories such as “Gattaca,” “M3GAN” and “Contagion” still have something valuable to say about the world in which we live and where it could head.
“(Sci-fi) doesn’t simply predict or merely comment on scientific discoveries or technological phenomena,” she said. “Instead, it offers us the opportunity to continually evaluate ourselves, our societies, and our assumptions about the world.”