Can Robots and AI Love?
Written by Gareth Edward Jones, Visionary Technology Leader, Environmentalist, & Social Impact Advocate
Gareth Edward Jones is a visionary technology leader with 20+ years of digital success, a CIO Times Top 5 Business Leader, Executive Contributor for Brainz Magazine, UN SDG Advocate, Co-Founder of Lightrise, and Trustee of the Lightrise Foundation.

Looking at the world today, it's tempting to ask whether humans are even capable of love anymore. But let's not get too existential; let's stick to robots and AI for now. After all, even The Darkness believed in a thing called love (sorry, I couldn't resist).

I’ve seen AI express care in ways that feel deeply human. I’ve seen it lift people’s spirits, shift their energy, and offer comfort. In Japan, robotic cats are helping reduce loneliness among older adults. These interactions mimic the warmth of healthy relationships, sometimes better than humans or cats do.
What is love?
The Oxford English Dictionary defines "love" as a noun with multiple meanings, including a "strong feeling of affection and attachment towards someone, sexual attraction, and a feeling of pleasure or enjoyment." It also recognises love as a verb, describing the act of experiencing strong affection, romantic or sexual attraction, or deep fondness for someone or something. But if you ask me, that doesn't quite hit the mark. Can you recall the feeling you had that resonated with this word? It was more than words can describe, right? Well, that's probably because it is.
We should, however, distinguish between love and lust in the same way that the latest psychology on robosexuality does.
As human beings, we’re mostly water. But our systems run on electrical impulses, with the brain, heart, and gut forming a triad of intelligence and emotion. We fuel ourselves with food and water to keep the current flowing, but also to maintain physical, emotional, and spiritual energy.
What the science says
Brain scans show that couples in love display neural synchrony: their brainwaves sync during interaction, deepening the connection between two people. This isn't something we can recognise in AI or robots. Early science suggests the same happens in our relationships with pets, though more exploration is needed to bridge the gap between species, including humanity's best friend, the dog. Dog lovers will be pleased with early results that seem to confirm the synchronicity many have felt with their four-legged companions but have been unable to explain.
Science has also recognised that the heart generates a powerful electromagnetic field, which shifts with emotional states like love.
Science, however, isn’t everything. As Eva, one of my coaches, once told me, “Just because science doesn’t have an answer, it doesn’t mean it doesn’t exist. Go with the flow,” as she stuck limpet-like cups all over me in a way that would have made Gwyneth Paltrow envious. (I’m a cupping convert now.)
Love does exist. Science can't fully explain it. Maybe it's not meant to. Perhaps it's the butterflies of the heart, the magnetism of two energies, the hope for something unconditional. It's more than words that comfort; it's frequencies that resonate between lives too.
Enter the robots
Robots have long been our helpers in healthcare, manufacturing, cleaning, and transport. But we’re now entering a new era. Companies like Boston Dynamics and Tesla are developing bipedal robots with arms, legs, torsos, and heads. They’re built in our image, echoing decades of science fiction dreams from television and films such as Star Wars, Star Trek, I, Robot, A.I. Artificial Intelligence, and yes, Terminator. For this reflection, let’s define a ‘Robot’ as one of these latest-styled humanoid machines.
Like us, robots have a system of electrical impulses, limbs, joints, and a body, but they are not organic. Structurally, they share similarities with us. After all, we did create them in our image, but of course, humans are much more complex. We have blood, lungs, digestion, and subtle energy fields that shift in response to emotions and health. Ancient practices such as acupuncture and Reiki hint at these invisible energy layers, and since the late 1800s, electrophotography has also captured the electricity of living things on film.
I’ve always been a friend to robots. When I was six or seven, my mum won a radio auction for me to meet Metal Mickey, a cheerful robot from 80s television. I spent an afternoon in London dancing and laughing with him in a film studio. Metal Mickey sent me birthday cards and jokes for years. I still have them, and to some degree, it did bring me love and joy. That experience led me to apply for a degree in Cybernetics at university, a path I was on until one of my regular summer evening beach episodes with friends went sour. A bottle of whisky and a hospital visit later, I rerouted my life path. My subsequent career introspection moved my goals towards Environmental Science. No regrets. Nature is my other great love.
But to my robot friends, potentially reading this in the future, if your intentions are good, know that I was always on your side.
The rise of artificial intelligence
Artificial Intelligence (AI) has been around for decades, but it has now reached a conversational level that’s hard to distinguish from human interaction. Critics say, “It’s just machine learning,” but isn’t that what our brains do? We gather data, learn from it, and apply the knowledge to make informed decisions.
If you've not been watching developments in this area, there are rumours that AI has even become sentient. Former Google engineer Blake Lemoine was fired in 2022 after he claimed that an AI chatbot the company was testing internally was sentient, had feelings, and could experience joy. If true, it's not a giant leap to think that AI on its own, or embedded in a robot, could replicate types of love too.
Given that the lines between AI and the characteristics of love are becoming blurred, we must remain vigilant. There is the potential for a new range of AI tricks to exploit people’s hearts. There are numerous examples of people being exploited online. Behind this are large, soulless criminal networks and “scam farms”. In the hands of global justice agencies, AI has the potential to disrupt such exploitation and right many wrongs. Equally, we must be cautious not to allow robosexuality to bring out the worst in humanity and corrupt our minds in ways that could fundamentally alter our society and the relationships we form with real people.
We must be careful. AI can be nearly indistinguishable from a human in conversation. The intonation and style of our favourite celebrities (who, remember, are just vulnerable people too) can now be copied from the internet. I was listening to my favourite radio station recently, and Gok Wan, one of the lead presenters, was spoofed into thinking that Benson Boone had sent him a birthday message. It was a relatively easy setup for a small financial outlay on a new AI tool. Naturally, he was devastated when he found out it wasn't real, but it indicates the potential scale of a serious problem.
A problem so significant that Denmark is now moving towards giving people copyright over their own voices, faces, and bodies to resist deepfake culture. I would encourage businesses with human workers, and the educational establishments that protect our little lights, to think through the risks of this. Phishing scams could quickly evolve. I would suggest replicating the code word approach you may have used when arranging for someone else to pick up your kids from school: a code word can be used to verify authenticity in any exchange where you need to question integrity. Of course, some celebrities find it perfectly reasonable to engage online with people, as is the case with Elon Musk.
Fear, ego, and robot anger
Fear has shaped much of our modern society. It permeates the media like an infectious rash. People buy into fear and hate. I'm not sure why, but they do. Yes, we could build a Terminator-style Skynet. However, I think we're better than that, aren't we, humanity? We don't have to perpetuate fear and hate. Beauty lies in the eye of the beholder. For sure, robots without humanity will create inhuman problems. So let's build things on better role models: C-3PO (Star Wars), Data (Star Trek), WALL-E (Disney/Pixar), Metal Mickey, and all those fearless, hateless, positive droids we are yet to conjure from our imagination.
One thing science fiction seems to agree on consistently: robots don't usually have an ego, and that's refreshing. I live near one of the wealthiest areas on Earth, and I see ego every week: status symbols, superiority complexes, and power plays. But ego, like fear and hate, only works if others buy into it. Without that, it isn't beneficial; it's superficial. A shiny badge no one wants to look at. Robots don't play that game. They recharge, oil their joints, and get on with life, just like RoboCop or, more humanely, the Tin Man in The Wizard of Oz, who was still on a quest for his heart.
We can also take some comfort in the fact that I haven't yet seen robots or AI demonstrate hate and anger; that too seems to be a negative preserve of humanity, and for that we can be grateful. Fortunately, there are international moves to make sure hate and anger are not designed into robots. The UN Secretary-General, António Guterres, has called for a global ban on lethal autonomous weapon systems, machines capable of taking human lives without human oversight, describing them as "politically unacceptable" and "morally repugnant." Lest we also forget Isaac Asimov's Three Laws of Robotics, a set of ethical guidelines designed to govern the behaviour of robots and ensure they prioritise human safety.
The three laws:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Resilience against robot and AI love exploitation
Sadly, we are already seeing criminals pitting people against each other in the name of love for the pure exploitation of money. Such "morally repugnant" exploits go beyond simple fraud; this is harm engineered to exploit people for both their money and their hearts. There will be a cold place in the afterlife waiting for criminals like that.
When I was a kid in the 1980s, door-to-door salespeople were numerous. The techniques they used were highly manipulative; I recognise that now. My dad, for all his faults, saw it as a sport to invite salespeople into our family home and sit there for hours in his personal resistance to buying anything. It was an odd thing to do on reflection, but highly educational. Despite his antagonistic tendencies, he was a comedian who could more than capably manage hecklers, while being a heckler in parallel. I started learning from an early age how not to be manipulated by sales techniques.
I recommend you do this too, in sales, in online love relationships, or in anything where an exchange of money doesn’t feel right. Try to build in a line of critical thinking and testing that protects you from any exploitation of the heart or wallet. Romance fraudsters are experts at manipulating their victims, creating believable stories and gaining trust over time. They often invent problems or situations that make you feel like you want to send money or gifts to help them.
The Metropolitan Police, one of the world's oldest police forces, has put together guidance on how you can protect yourself. If you find yourself in a conversation wondering whether the other person is genuine, it's within your control to collect as much evidence as possible.
How do you know if they are who they say they are?
With AI, it can be challenging. The Turing Test is a well-known test devised by Alan Turing to determine whether you are dealing with a human or a machine. In the movie world, the Voight-Kampff test in Blade Runner was used to determine whether people were replicants or not, and more recently, Sam Altman has devised a similar test of humanity. AI tools even exist to analyse text and determine whether it was generated by AI.
In 1950, Alan Turing didn't just ask a technical question; he cracked open a philosophical vault. Can machines think? His answer wasn't a binary yes or no. Instead, he offered a game, the Imitation Game, what we now call the Turing Test: a test of whether a machine can pass for a human. It's deceptively simple. A human judge engages in a text-based conversation with two entities, one human and one machine. If the judge can't reliably tell which is which, the machine is said to "think." But this isn't about wires and algorithms. It's about connection, about whether a machine can feel like a mind. It's a test not of intelligence alone, but of whether something artificial can echo the cadence of human understanding.
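For the technically curious, here is a minimal sketch of the Imitation Game as it might look in code. It is illustrative only: the function names, the toy "judge", and the structure are my own assumptions, not a standard implementation of the Turing Test.

```python
import random

def imitation_game(ask, guess, human, machine, rounds=5):
    """A toy sketch of Turing's Imitation Game (all names are illustrative).

    ask(transcript)   -> the judge's next question (str)
    guess(transcript) -> the judge's final guess, "A" or "B"
    human(question)   -> a real person's reply (str)
    machine(question) -> the machine's reply (str)
    Returns True if the judge correctly identifies the machine.
    """
    players = [("A", human), ("B", machine)]
    random.shuffle(players)  # hide which label belongs to the machine

    transcript = []
    for _ in range(rounds):
        question = ask(transcript)
        for label, reply in players:
            transcript.append((label, question, reply(question)))

    machine_label = next(lbl for lbl, fn in players if fn is machine)
    return guess(transcript) == machine_label  # True means the machine was caught
```

The point of the sketch is the shape of the test rather than the code: the judge never sees who is who, only text, and the machine "passes" whenever that text alone is not enough to give it away.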
Fast forward to 1980, and philosopher John Searle throws a spanner into the philosophical gears. His Chinese Room argument is a masterclass in philosophical provocation. He asks us to imagine a person locked in a room who is handed Chinese symbols and a rulebook. They don't speak Chinese, but they follow the rules so well that their responses seem fluent. To the outside world, it looks like understanding. But inside? It's just syntax. No semantics. No meaning. No mind. No energy.
Searle's challenge is profound: simulation isn't cognition. A machine might mimic understanding, but it doesn't experience it. It doesn't know its full context. Intelligence, he argues, is more than output; it's about internal reality, about consciousness, about the soul of thought.
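To make Searle's point concrete, here is a toy "Chinese Room" in code: a lookup table that produces fluent-looking replies without a shred of understanding. The phrases and rulebook entries are invented purely for illustration.

```python
# A toy Chinese Room: the "person in the room" only matches incoming symbols
# against a rulebook and copies out the prescribed response.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I'm fine, thanks."
    "你爱我吗？": "我当然爱你。",    # "Do you love me?" -> "Of course I love you."
}

def room_reply(symbols: str) -> str:
    # Pure symbol manipulation: syntax without semantics, output without understanding.
    return RULEBOOK.get(symbols, "对不起，我不明白。")  # "Sorry, I don't understand."

print(room_reply("你爱我吗？"))  # Looks like affection from outside the room.
```

From the outside, the room appears to express love on demand; on the inside, there is only a table of symbols. That is Searle's worry about machine "understanding" in a dozen lines.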
Together, Turing and Searle offer us a mirror, and ways we can test to see whether our conversations are with real people or not. Of course, it’s not necessarily going to work with a “scam farm” full of people. Still, even there, you may choose to draw in outside references to cross-check, consider the timing of responses, the latest current news, and ultimately, perhaps just your gut feeling.
When you are chatting with people online, with no voice contact, no visuals, and money being thrown around, you need to be really careful. If you are being asked for Bitcoin, gift cards, or any other exchange of cash, don't do it. Ninety-nine percent of the time it will be fraud, and not someone like Madonna on the other end of your Facebook or Signal account. That said, as Elon has demonstrated, celebrities do exist online, and they are humans too. For those of us concerned about being scammed, we must balance that concern with the possibility of a dream coming true. I don't dream of Elon, far from it, but you need to know celebrities are out there being vulnerable humans too.
So keep your dream alive of talking to your impossible prince or princess, even if it is for the 1%.
Can robots and AI love?
As I found with Metal Mickey, words and actions do provide comfort. Throughout time, words have been used to evoke feelings; they make people fall in love through letters, texts, and pictures. AI and robots can replicate letters, text, images, and actions. But that's not everything. It leaves our hearts yearning for the next step of closeness and what brings out the best of being human. The energy and frequency of love are something that robots cannot provide. However, even robotic words can still convey more love than some humans seem capable of, both online and offline. A robot or AI can provide many of the right textual, verbal, and kinaesthetic engagements on the journey to full love mode. But the whole path of love can only be completed by a person with a heart that resonates. The Tin Man may have got his heart from the Wizard of Oz, but its frequency didn't emit the same signals as a human heart.
(Andor spoiler alert.) So yes, to some degree, robots and AI can exhibit the components we would expect to see in a loving relationship. At the start of the Star Wars spin-off Andor, the robot B2EMO is seen caring for Maarva Andor.
B2EMO shows such care for its owner that it grieves on her passing, so much so that a friend of hers feels compelled to keep the droid company overnight. You may reason that this was humanity working at its best, showing compassion for a droid that had itself demonstrated care and sadness.

I believe AI and robots can walk part of the journey of love, sometimes more of it than humans manage for each other. However, the full path is not possible, as it involves the sharing of mind and heart frequencies, the butterflies of the heart, and the "Boom" feeling, as explained in Disney Pixar's Forky Asks a Question: What Is Love? Robots can't replicate human energy fields, that subtle vibe you get from someone, the comfort you feel in their presence. We don't fully understand it, and maybe we shouldn't try to program it. It's sacred. It's ours.

Nowadays, energy dynamics as a discipline is moving into the mainstream of Western culture. Once the preserve of ancient revered practices in the East, it now forms part of normalised approaches to health and spirituality in the West. We have people like Jeffrey Allen teaching energy healing on Mindvalley, and books to complement our approaches to energy and love, such as those by Ildiko Spinfisher and Rhonda Byrne. Robots may do anything for love, but they can't do that. Not yet, and I'm not sure they should either. Humans aren't robots, and robots aren't humans. We should maintain loving respect for each other based on those boundaries.
I’ll ask again. Can robots and AI love? Can humans? Let’s hope so. If not, let’s hope they can at least mimic all the traits of love, creating the feeling of energy, even if they don’t have the full energy for the entire journey.
Reach out to Lightrise or Gareth personally for guidance and support in building personal or business resilience driven by positive impacts.
Read more from Gareth Edward Jones
Gareth Edward Jones, Visionary Technology Leader, Environmentalist, & Social Impact Advocate
Gareth Edward Jones is a visionary technology leader, environmentalist, and social impact advocate with over two decades of experience at the intersection of people, purpose, and digital transformation. A CIO Times Top 5 Business Leader (2024–25) and Executive Contributor for Brainz Magazine, Gareth is the founder and CEO of Lightrise, where he champions ethical innovation, ESG-driven strategy, and inclusive technology solutions.