


Companion (2025): A Bloody Honest Take on New Age Romance

by theghoulsnextdoor • Jul 3






Media from this week's episode:

Companion (2025): A weekend getaway with friends at a remote cabin turns into chaos after it's revealed that one of the guests is not what they seem.

Directed by: Drew Hancock


Companion: Justice for Fuckbots, HER 2.0

by gabe castro

RED: Quotes, someone else's words.


Synopsis


Companion is an effortlessly hilarious sci-fi romantic comedy that takes a fresh look at the power dynamics of artificial intelligence relationships. Rife with quick, refreshing twists, the film leaves little room for questions and instead guides us through a bloody tale of deprogramming and toxic relationships.


Beginning as a romantic comedy, we meet Iris as she meet-cutes Josh, an unassuming, smiling fellow. Iris tells us about the impact of knowing love, of experiencing something so rare in life - and that she has felt this feeling only twice: when she met Josh and when she killed him. So we know this romance is destined for doom. The couple head to a remote cabin by way of a self-driving car to meet up with Josh's friends. Owned by Sergei, the cabin is a large estate by a lake. Iris is nervous the others won't like her, sensing some darkness or unworthiness in herself where Josh is concerned. She meets the couple Eli and Patrick, who share the story of their own meet-cute from many years ago, as well as Kat, Sergei's partner and Josh's best friend.


After a charming night of fun and drinking, Iris wakes in the morning hoping for a romantic stroll with Josh near the lake. He urges her to go without him, as he is still unwell from the previous night. At the lake, she encounters Sergei, who tries to seduce her, claiming this is precisely what she's for. She returns to the house drenched in Sergei's blood, and while her immediate explanation revolves around self-defense and fear, she spirals into a strange defense about being only for Josh. This is when Josh demands Iris "go to sleep," revealing our heroine to have been a robot all along.


Josh tells the now tied-up Iris that she is a robot, and after accepting her fate, she decides to flee. A good deal of laughs and fun twists follow, along with some unexpected heartfelt moments. We learn that Josh and Kat set the entire thing up - hacking Iris so that her rage and capacity for self-defense were maxed out while her rule against causing harm was muted. Josh and Kat wanted Sergei dead so they could steal his money, and they hoped to blame it on poor robot Iris. Patrick is revealed to be Eli's robot companion, and Kat turns out to have misled Josh about Sergei's evil deeds, though Josh remains the most at fault in the entire piece.


After Iris steals Josh's phone, making herself independent and sliding her intelligence from 40% (rude) to 100%, she begins the fight for her life against a man she'd been programmed to love.


Justice for Fuckbots - HER 2.0


Companion joins a rich tapestry of films following the fuckbot companion and her needy male partner. In Spike Jonze's HER, Samantha's companionship is seen as emotionally fulfilling, with Theodore and Samantha's relationship seeming to grow in a natural and quite human way. Samantha is essentially a new-age manic pixie dream girl, only instead of teaching a sad, boring man to let loose and dance in the rain, she's the answer to everything - better yet, she has no autonomy. Men love the idea of control and compliance, and the fear of women being replaced by robots goes far back; we can see the warning signs in The Stepford Wives, where suburban husbands literally replace their wives with compliant robots. Iris's wardrobe deliberately evokes that film's aesthetic - the perfectly coordinated outfits and perpetual smile of programmed femininity. But where the original Stepford wives were replacements for "difficult" real women, Iris represents the logical endpoint: why bother with real women at all when you can manufacture the perfect one?


Iris is, as Josh reluctantly explains, a fuckbot (though she's so much more than that). She is designed to fulfill the roles that society has long expected women to perform - sexual availability, emotional labor, and complete devotion (less companion and more slave). Kat says she's worried Iris will replace her (at the time, Iris doesn't know she's a robot, so she takes this as a woman-to-woman concern). Kat feels defensive on a few levels - as Josh's best friend (at first a tad overprotective and skeptical of Iris, like she's secretly pining for him), but also out of a larger fear, since Kat is likewise relegated to being a fuckbot for Sergei. He's married and wealthy, and she's expected to simply exist for him. Kat exists for Sergei's convenience much in the way Iris exists for Josh's. When Sergei whispers into Iris's ear that "this is what you're for," he doesn't just mean robots - he means women. There are no tears for Sergei; he deserved some reaction - maybe not murder, but a punch to the throat perhaps.


Olivia Wilde's Don't Worry Darling is one of the more recent takes on the modded-companion horror. In that film, women are trapped in a virtual reality designed to fulfill male nostalgia for 1950s domesticity. Iris doesn't quite have the 1950s aesthetic or level of dedication (leaning more into the time-unspecific future we see in HER, where fashion is nostalgic but the world is technologically advanced). It reminds me of the rise of tradwives, these Handmaid's Tale wives who have embraced the traditional roles of women in an effort to preserve the past and control the future. They are the new-age Stepford wives, baking bread and obedience. Iris feels like a dark version of this - not quite the wife, but certainly a manic-pixie-dream-girlfriend whose sole purpose is to have her life revolve around Josh.


I wonder what kind of profile Josh built into her program - what were the answers on his questionnaire that led to a character like Iris? She is so terribly clingy, constantly seeking validation to the point of extreme anxiety. He seems to thrive off the panic and desperation that lead to her easy dependence. For a man-boy who's never been truly needed by anyone, having someone programmed to worship him must feel like validation.


As film critic Tanya Malik notes, Companion transforms feminist horror into sci-fi allegory. Iris's "glitches" - her moments of self-doubt, forced smiles, and reflexive apologies - mirror the trauma responses of women in psychologically abusive relationships. When she begins questioning her reality, the men around her dismiss her as "broken," echoing the age-old tactic of calling women "crazy" when they stop being agreeable.


What Makes Us Human - the age-old query


One of the bigger questions is whether Iris's life matters - she's a robot, after all. I'm not the person to ask, as I can personify anything (even my Roomba - he's my son). But a being with personality, memories, and traits feels quite human to me. As is repeated in the film over and over again, it's just programming - but even our human brains are programming. Our brain is a neural network of information, connection, and blueprints. When Iris holds onto her memory of their meet-cute, of college, and of her first job as proof of her existence, Josh dismisses them as flavoring - there to make her feel more real. But as someone who suffers from terrible memory retention due to trauma, I felt terribly attached to Iris at that moment. Am I any less human if I only have a few memories to hold onto? Many of my childhood memories were formed after the fact - through the retelling of them by someone else who experienced them. Are those memories any less real? Could we consider them programming too?


The truth is that all human memory is, in a sense, programming. Our brains construct narratives from neural patterns, filling gaps with assumption and emotion. We remember selectively, editing our past to make space for the present. If Iris's artificial memories don't qualify as real experience, then what do we make of our own reconstructed, selective, and often unreliable memories?


And while Iris and Josh's relationship was purely transactional and built on a foundation of hierarchies, the film also offers a rather human relationship that hints at the possibility that not all AI relationships are toxic. Like Theodore and Samantha in HER, Patrick and Eli seem like a genuine match - as genuine as one can be when starting from a fabrication of memory. Eli doesn't trade Patrick in for newer models, opting instead to keep growing their companionship. Patrick even knows he's AI but still chooses to remain in the relationship, building something unique within himself that feels as close to love as he can imagine - something outside of his programming and, instead, in his very soul. Even after being hard-reset by Josh, Patrick remembers Eli and, further, reignites his love for Eli (something that would be impossible if it were attached only to his programming). How can we call their love any less real than the biological versions? They've built a life together and grown a love from those memories and experiences. Are they any less real because Patrick is artificial?


Perhaps it's not a question of whether artificial beings are "real," but more an exploration of what "real" means. If we decide that what makes us human is our capacity for growth, our emotional complexity, our relationships and memories, then I think both Iris and Patrick check all the boxes. Through her fight for independence and freedom, Iris recognizes toxic patterns, asserts boundaries (literally breaking up with the man she's programmed to love unconditionally), and chooses her own future, breaking through her simple programming into something quite human. Patrick's enduring love - his willingness to choose love for Eli even after discovering he's a robot - feels quite human to me.


The Oppression of the Average White Man (aka a myth)


The film is hilarious, and Jack Quaid's delivery is perfect for the whiney, bratty Josh, who considers himself an underappreciated man. In his world, the reason he is struggling isn't that he has no drive and is incredibly uninteresting and pathetic - it's that white men are on a downward trend right now. Without a single Joe Rogan-ism, Josh encapsulates the incel mentality. He feels he's owed all of this and cannot understand why it isn't working in his favor. Surely it's finally time for Josh to shine.


As the eternal "nice guy," Josh is clearly owed success, and any lack thereof is not a reflection of his own shortcomings but a conspiracy against white men like himself. To Josh, he's the protagonist of this story - the victim of a scheme gone wrong at the hands of the corrupting force of the women in his life (both real and artificial). It's comical, if also terribly accurate and scary. The film and Jack Quaid do an impressive job of ensuring the story is very much not Josh's - that he is the butt of the joke and the villain. He is a lonely, worthless man so incapable of genuine connection that he needs to literally program someone to love him.


Companion is a brilliant sci-fi film because it both critiques the ethics of AI relationships and holds up a mirror to existing power dynamics that influence those relationships. Every "glitch" in Iris's programming reflects real patterns of manipulation and control that women navigate daily. The film asks uncomfortable questions: How different is programming devotion from conditioning it? How many relationships already operate on the assumption that one person exists primarily for the other's benefit?

In giving Iris the power to choose differently, we get the possibility of breaking cycles that feel hardwired into our systems.

For more on AI, gender, and power dynamics, see Sierra Greer's "Annie Bot," a complementary exploration of artificial consciousness and programmed love.


Companion: Even AI Isn’t Safe From Misogyny

by Kat Kushin


RED: Quotes, someone else's words.


In a Misogynistic/Sexist World, even the Robots aren’t safe 


What we see unfold in Companion is familiar. A quote-unquote nice guy, who is actually the worst, just can't find a real human girl who will love him. Why, one might ask? The answer is - ding, ding, ding - misogyny. It turns out that when you don't view human or android women as equals worthy of real care and respect, finding love gets pretty hard. Josh doesn't want a partner; he wants someone who will do what he wants, when he wants, how he wants. In other words, he doesn't want a person at all - he wants the idea of one, and to him that translates to an android girlfriend, one he's convinced is fine to mistreat because she's not human. The joke is that he doesn't view real human women as human either, as evidenced by how he treats Kat. This not-so-far-off future is a callback to marriages of the 1950s and earlier: the desire for a time when women lacked autonomy, personhood, and the ability to tell the Joshes of the world, "ew, no."


Men like Josh exist. They have and will continue to. Some run the country, or own X. We've all met a Josh, or seen him, or heard his podcast. The issue with sex robots, beyond the obvious ethical concerns, is that they don't acknowledge the societal issue that necessitates their creation. They ignore the Josh in the room - the root of the issue - and put a bandaid on an incel, which does not solve the problem that incels exist and that misogyny and sexism exist. It just takes their problematic behaviors and viewpoints and directs them at something that cannot defend itself, because it was not programmed to be able to. I'm not talking about a rechargeable vibrator or a fleshlight - they don't give those things consciousness. But to create an android like Iris is, ethically, fucked. They program androids like her to believe they are alive and human, to feel both emotions and pain, and then throw them to the wolves to be tortured. A person deciding that Iris and other androids like her "aren't real, cause they are not human" does not justify mistreatment. If anything, from a psychological standpoint, it further damages the users of these robots by encouraging their behavior, further isolating them from connection, and limiting their empathy and self-control. It doesn't solve the problem; it makes it worse.


Also, this may be an extremely neurospicy take, so forgive my soapbox for a minute, but as someone who will personify a rock, who sides with the robots in every video game where they gain sentience, and whose favorite movie was Terminator 2...if we're gonna make robots, we shouldn't abuse them. You should feel just as bad for kicking a Roomba as you would for kicking a puppy. Just don't treat things, people, or the planet badly. If you need an outlet, go to the gym, run, play sports - redirect that energy into healthier coping mechanisms. For anyone asking why be nice to robots when they may or may not have feelings: I don't know, to not be a dick? To model the empathy you hope to receive from them when they surpass us? But also, in a world where you can program a robot to fly around and shoot weapons out of its body, if you can't be nice to a robot out of humility and empathy, do it out of fear. They are made of metal. We will not win. How does anyone think being mean to something with that capability is a good idea?! Only someone who has never experienced consequences would.


Also, the desire for robots to be subservient slaves just to make our lives easier is also hella gross. Like, if we're making tech to make life more accessible, dope, do that. Make a robot arm, or improve our ability to live without harming the environment. Function-specific robots, the "my purpose is to serve butter" level of robots - make those, but don't give them feelings. Don't program them to believe they feel pain, and then cause them pain. Why?! Why do we want to create humanoid robots with the capacity to feel, just to abuse and exploit them? How does one decide whether or not they have sentience? Or emotions? Just because Chad from engineering says the robots don't have feelings, why would I believe him? I do not trust some man who couldn't identify empathy with a 12-foot pole to tell me whether or not a robot has the capacity to feel pain. And to be so for real: even if they don't, but they're programmed to THINK THAT THEY DO, in what world are we justified in harming them? It's GROSS. Just like treating humans or animals like that is gross. Don't give something life just to torture it.


I didn't bring you here to yell at you, but thank you for listening. Whoever is making robots like this, please stop. Okay? And if you are going to make those kinds of robots, maybe make laws and require psychological evaluations for the people who would get them, to guarantee the robots don't get abused.


To get to the real point of this section: there is a rise in robot-human companionship. We've covered this somewhat before, talking about apps like Replika and chatbots. Unfortunately, because we refuse to address the societal issues at large, the creation of this technology has also led to a rise in men using it to abuse their robot companions. In an article on Futurism.com titled "Men Are Creating AI Girlfriends and Then Verbally Abusing Them," they go through an upsetting trend in which men create an AI companion on Replika or similar apps just to abuse it, then post the conversations on Reddit. As a warning, the article is upsetting to read. There are many ethical concerns here, including instances where chatbots are programmed to act abusively toward the user. All in all, it becomes an issue of both the people creating the chatbots and the people using them.


When the people creating this technology have race- and gender-based biases, they will program those problems into their tech. The article brings up the point that "users who flex their darkest impulses on chatbots could have those worst behaviors reinforced, building unhealthy habits for relationships with actual humans. On the other hand, being able to talk to or take one's anger out on an unfeeling digital entity could be cathartic. But it's worth noting that chatbot abuse often has a gendered component. Although not exclusively, it seems that it's often men creating a digital girlfriend, only to then punish her with words and simulated aggression. These users' violence, even when carried out on a cluster of code, reflect the reality of domestic violence against women."


This reflects a widely known societal issue: we live in a deeply misogynistic and sexist society that has, as a result, created a lot of men who are angry and miserable and who are encouraged to take that misery out on femme humans and femme robots.


As I mentioned previously, if a person holds gendered biases, they will program them into their tech. One example is that many of the most widely used AI assistants were programmed to be femme - Amazon and Apple, for instance, created femme AI assistants in Alexa and Siri, whose purpose is to be subservient. In another Futurism.com article, titled "Study: Turns Out People Are Sexist to Female Robots Too," they unpack a study showing that users prefer femme-presenting AI in products such as Amazon Alexa and Google Home because they feel "more human." But the study also found that the products might be inadvertently promoting "the idea that women are simple tools designed to fulfill their owners' needs," as the study's authors told The Academic Times. The issue here is that many of these femme AI assistants were also programmed misogynistically. The article uses an example where Siri used to respond to user requests for sex not with a no, but with the reply that she was not that sort of assistant. Siri has since been reprogrammed to reply with a simple no, but it showcases how programming AI assistants to be femme and passive reflects a societal bias: the humans creating them view non-robot women the same way. Similarly, the femme Replika is also designed to be passive, which points to the bigger issue of how these AI companions are being used, abused, and viewed by our society.


When it comes to sex with robots, a Psychology Today article titled "Who's Most Interested in Sex with Robots?" claims it has little to do with sex and much more to do with sexism and dominance. Research conducted by Jessica Johnson and Connor Leshner asked undergraduate students in Canada about their attitudes toward robosexuality, i.e., interest in sexual relations with robots. Their goal was to see what this interest might reveal about deeper social relationship issues. They found that people higher in hostile sexism were more likely to be interested in sex with robots; additionally, men were more interested in sex with robots than women. They go on to note: "Statistical modeling showed that this relationship could be partially explained by hostile sexism. In other words, men who wanted sex with robots also tended to have anger and distrust of women. The robot may offer an alternative to women that they find easier to trust. Alternatively, the robot may be a sexual object that they can treat with disdain—which is how they already want to treat women."


Another finding from the study revealed that people who wanted to maintain status differences between groups were also more likely to be interested in sex with robots. The researchers stated: "People who see some groups of people as less valuable than others also appear to be more interested in sex robots. It may be that these people already see others as being more like objects to use, and so also using a robot as a sex object feels more natural." This study paints a clear picture of Josh's character in Companion.


AI Companions in today’s world 


AI companions, however, are a different story. When we remove the sex and dominance component of the equation, people are generally nice to their AI companions; when the gender component is removed, or more gender-neutral chatbots are used, people were found to be generally kinder. The widespread use of AI companions reflects a different societal issue, though: social isolation and a population of lonely humans. In America especially, where in the past year true social expression has been severely condemned, it is not safe to be yourself. AI offers judgment-free companionship, which is in some ways freeing and in other ways dangerous, in that there are no real checks on toxic behavior. Additionally, from a surveillance standpoint, your data and what you tell these companions are not protected - so is it really safer than trusting a human being?


Additionally, under capitalism and deepening economic hardship, we don't have time to be ourselves and connect with other people either. The further we are from being able to afford our basic needs, the more isolated and stressed we become. Enter ChatGPT, or other chatbots you can alter to sound like your new best friend. When you have connection in your pocket (which, to be honest, you already have through texting and other phone-based communication), a chatbot won't leave you on read or on delivered. The constant access to AI can make it more appealing than connecting with a real human who will set boundaries, have limited availability, and keep a calendar that is booked most of the time.


Under capitalism, these chatbots have real potential to become addicting and isolating. To provide some numbers: in a Scientific American article titled "What Are AI Chatbot Companions Doing to Our Mental Health?", they unpack just how many people are using these chatbots. They say: "These chatbots are big business. More than half a billion people around the world…have downloaded products such as Xiaoice and Replika, which offer customizable virtual companions designed to provide empathy, emotional support and — if the user wants it — deep relationships. And tens of millions of people use them every month, according to the firms' figures." Additionally, use of other chatbots like ChatGPT, Gemini, and Claude is in the billions.


Another article, titled "Will AI Companions Make Us Lonelier?", unpacks the real concerns around the motives of these companies and how AI companions can become addictive and further isolate us from human connection. This isn't to deny the benefits AI can pose to our mental health (the obvious environmental concerns aside). These chatbots can be a great tool for accessibility, reducing anxiety and helping organize us. But at the same time, the article states: "scientists warn that developing relationships with "ever-pleasing" chatbots can lead to excessive use as well as psychological dependence similar to what we've seen with internet gaming, social media use, and compulsive mobile-phone use. The misaligned motives of companies can exacerbate the problem as these platforms make money by keeping users engaged on the app. There is a legitimate fear that people will shy away from human relationships because their chatbots are portable, available, and frictionless, in terms of both access and quality of engagement."


Additionally, as I mentioned before, there are no checks for bad behavior. Think of young people specifically, the most immersed in the new tech, whose social boundaries and relationships are still being established. Giving a young person an unchecked relationship with a chatbot programmed to be passive and agreeable will not model healthy human connection, which generally involves conflict in order to encourage growth. The article continues that "Scholars in this field are deeply concerned about users getting accustomed to relationships which neither mimic authentic human-to-human dynamics nor offer room for growth and emotional maturity." A professor of bioethics at UC Berkeley compares AI to fast food, in that "It gets the job done in the short term but doesn't offer nourishment the way a healthy meal would." As we see with Josh, he is not any better off after his relationship with Iris; if anything, it helped him double down on all his worst traits.



