How Online Gaming Companies Are Destroying Our Children's Lives
Written by John Comerford, Author/Motivational Speaker
John Comerford is the author of Tarzan Loves Jane and Battle Armour (25 Tools for Men's Mental Health). John is also one of the authors of the number one Amazon best-selling book series, Start Over.
There’s a word we carry quietly, locked behind the armour we build to survive: lonely. For men, especially those stumbling through the long shadow of childhood trauma, loneliness is not just a passing mood. It’s an epidemic, gnawing at self-worth, sabotaging connection, and deepening wounds that never seem to heal.

In 2025, loneliness in men isn’t just trending on social media. It’s a public health crisis, woven through the stories we refuse to tell and the pain we rarely admit. This article, written in solidarity with every survivor who has ever felt unseen, explores how predators exploit that same isolation inside children’s games, how the gaming industry has failed to stop them, and what must change to protect the next generation.
The crime scene hiding inside your child's gaming console
Every single second, ten children somewhere in the world are sexually exploited or abused online. One in twelve children globally, approximately 300 million a year, are victims of online sexual exploitation. In Australia alone, 82,764 reports of child exploitation were made to authorities in 2024/25, a 41% jump from the previous year. Online enticement reports to the US National Center for Missing & Exploited Children nearly doubled in the first half of 2025, surging from 292,951 to 518,720.
Much of this happens inside children's games. In these environments, predators can start grooming a child in just 19 seconds, the time it takes to pour a cup of coffee. Most parents do not realise that the device they gave their child could connect them directly to organised paedophile networks worldwide. Parents should watch for warning signs such as excessive secrecy about online activities, sudden changes in behaviour, reluctance to talk about new online friends, and long hours spent on social media or gaming platforms. Recognising these signs early can help parents intervene before harm occurs.
A single predator used one gaming console to victimise 459 children
This is a global issue. Here in Australia, a 27-year-old man was arrested in Maryborough as part of Operation X-ray Wick. Over 12 months, specialist investigators conducted forensic examinations of his electronic devices and uncovered over 23,000 videos and images of offending against 459 identified victims across Australia and overseas. He has been charged with 596 offences, including producing child abuse material and engaging in sexual activity with a child using a carriage service.
Police say this man targeted children on social media and gaming platforms from 2018 to 2025, using many profiles to groom or coerce children, mostly aged 7 to 15. He carefully saved and organised the material into named folders, creating a disturbing collection. Among the devices police took from his home were gaming consoles. One man, over seven years, harmed 459 children. Investigations are still ongoing. Crime Command Detective Acting Chief Superintendent Denzil Clark was blunt. "We are seeing an increasing prevalence of children being groomed, coerced, or threatened into taking and sending sexual images of themselves, often through popular apps, games, and social media sites. The trauma that this causes a child is significant". These stories are not just warnings. They are evidence of crimes where children's games are the tools.
The uncomfortable truth: games built for children have become tools for exploitation, and their makers have looked the other way
The gaming industry has built colourful, immersive digital worlds for children. These worlds include chat, voice, and messaging features that let strangers contact kids directly, and oversight has not kept pace with how complex the platforms have become. Roblox has over 85 million daily active players, the vast majority of them children. Researchers who tested the platform described the risks as "deeply disturbing", pointing to a "troubling disconnect between child‑friendly appearance and reality". Test avatars overheard players discussing sexual activities and heard sounds associated with sex acts during voice chat. In one test, an adult avatar successfully requested the Snapchat details of a five‑year‑old's avatar using language vague enough to bypass moderation. An adult contacted a five-year-old's avatar on a platform designed for children. A former Roblox employee was quoted in court filings as saying, "You have to make a decision, right? You can keep your players safe, but then there would be fewer of them on the platform. Or you just let them do what they want. And then the numbers all look good, and investors will be happy".
Every boardroom in Silicon Valley should remember that quote. Fortnite is no safer. Predators use Fortnite Creative maps to post vague prompts like "Join our Discord" that funnel children off the platform and into unsupervised servers with no age gates, no safeguards, and no way of knowing who's on the other end. When concerned users report these tactics, they're met with automated responses and silence. A parent trying to flag the same issue to Epic Games was passed from inbox to inbox until the complaint quietly died. The pattern is the same across every platform: predators initiate contact through in‑game chat, build trust, then push children to move conversations to third‑party apps like Discord, Snapchat, or WhatsApp, where monitoring is weaker and abuse escalates. A UNICRI report warned that as gaming and metaverse platforms expand, developers have focused on "rapid user growth, which can come at the cost of effective community moderation and management". The UN is saying these companies chose growth over children's safety, and the industry has barely reacted.
The lawsuits are piling up, and they paint a devastating picture
Legal action has started, and the findings are serious. Louisiana sued Roblox, with Attorney General Liz Murrill accusing the platform of being "overrun with harmful content and child predators" and alleging it prioritises "growth, revenue, and profits over child safety". Kentucky Attorney General Russell Coleman called Roblox a "playground for predators" and told parents directly, "Get your kids off Roblox". His 68‑page complaint alleges Roblox knowingly enabled the sexual exploitation and abuse of children across the United States, and that the platform's in‑game currency, Robux, is used by predators to lure kids into dangerous situations. Law firms in several states have filed lawsuits for families whose children were exploited on the platform. One lawsuit called Roblox "a digital and real-life nightmare for kids." By December 2025, more than 80 lawsuits had been consolidated into multidistrict litigation in California under Judge Richard Seeborg, with more expected to follow. Several cases allege severe abuse and, in some instances, suicide following prolonged grooming on the platform.
Roblox's sign-up process only asks for a username, password, and birthdate, with no age verification. For years, anyone could send direct messages to children. The company only started blocking adults from messaging users under 13 in 2024, long after the problem was known.
In Australia, the eSafety Commissioner required Roblox to make accounts for users under 16 private by default. These protections, which many expected from the beginning, were only introduced after the government stepped in because the company had not acted on its own. Roblox responded by calling the lawsuits "sensationalised, outdated and out-of-context". Tell that to the 459 children in Queensland.
The AI accelerant: When technology becomes a predator's best friend
On top of the gaming industry's negligence, artificial intelligence has made the problem even worse:
- AI-generated child sexual abuse material increased by 1,325% from 2023 to 2024, according to industry tracking organisations.
- AI-generated abuse videos rose by 26,362% in a single year between 2023 and 2024, as tracked by online safety organisations.
- Sixty-five percent of AI-generated material is classified as Category A, the most extreme level, showing rape and torture of children, according to online child protection agencies.
- US reports of generative AI linked to child sexual exploitation rose from 6,835 to 440,419 in one year, as documented by national reporting centres.
Predators now use AI to create sexually explicit images of children who are just playing games like Roblox, Fortnite, and Call of Duty. They alter a child's face and body to make realistic abuse material from a game screenshot. The Internet Watch Foundation calls this a "child sexual abuse machine" and says it has never seen a crisis like this in 25 years of tracking online abuse. So far, gaming platforms that host children's images and avatars have done little to stop them from being taken and misused.
The grooming playbook: Six steps that take 19 seconds
Australian Federal Police investigators have identified a six‑step grooming process that plays out with chilling consistency inside games and on social platforms:
Fake account: The predator creates a profile pretending to be a child, using stolen photos and teenage slang. In games, they pose as a skilled player offering help.
Trust building: The chat starts innocently: "What games do you play?" "I hate school too." The predator mirrors the child's world.
Off-platforming: The conversation moves to encrypted apps like WhatsApp, Telegram, or Kik, where no one is watching. The predator starts asking personal questions: "Are you happy at home?"
Sexual conversation: Sexual topics are introduced gradually, framed as normal curiosity. Boundaries are tested.
Sexual images: Requests escalate from selfies to swimwear to nudity. Some predators pose as talent scouts, others share stolen images to create false trust.
Blackmail and exploitation: Once compromising material is obtained, the trap snaps shut: "Send more or I'll tell your parents." Terrified and ashamed, children comply as demands escalate.
On some gaming platforms, it takes only 19 seconds to go from the first message to a sexual conversation. What once took months now happens in the time it takes a parent to load the dishwasher.
What parents need to know right now
Predators don't pick children at random. They target the lonely, the struggling, the ones seeking validation, and they weaponise three tools:
Fear: "I know where you live." "I'll send these photos to everyone at school." "I'll hurt your family."
Flattery: "You're more mature than other kids." "No one understands you like I do." "You're beautiful."
Fake offers: "I'm a talent scout." "I can make you famous on TikTok." "I'll send you Robux if you send me a picture."
A smartphone is not just a harmless toy; it is a portal. A gaming console is not a babysitter; it can be a crime scene. Parents need to know which apps their children use, which games they play, and who they talk to online. They should also create an environment where kids feel safe to speak up if something feels wrong. To foster that trust and openness, consider starting conversations with questions like "Have you come across anything online that made you uncomfortable?" or "What do you like most about the games you play, and are there parts you don't enjoy?" You can also encourage kids to talk about new online friends by asking "Who did you play with today?" or "What's your favourite thing about talking to your gaming friends?" These conversation starters help parents stay informed while making children more comfortable sharing their online experiences.
What must change, and who must be held accountable?
The gaming industry must face legal consequences
Self-regulation has failed badly. Companies like Roblox and Epic Games, which build worlds for children, must implement age verification, real-time AI-powered grooming detection, and strict enforcement backed by law, not just public statements. If you build a digital playground and invite millions of children in, you have a duty of care. This is not a suggestion; it is a legal, moral, and ethical responsibility. Parents can play a critical role in advocating for change. They should consider contacting local lawmakers to demand stronger regulation and accountability from gaming companies. Joining or supporting organisations that focus on online safety and child protection can amplify their voice. By participating in campaigns that call for legislative reform, parents can help protect all children online.
Governments must act more quickly
Australian law enforcement needs more funding and resources. The eSafety Commissioner's action with Roblox shows that regulation works, but it came years too late for many children. Stronger penalties for AI-generated abuse material and platform negligence must be made law now.
Schools must teach digital survival
Every school in Australia should have thorough online safety programs, starting in primary school, that show children how grooming works, what warning signs to watch for, and where to get help.
Survivors need support
Millions of children have already been victimised. They need mental health services that understand trauma, legal help, and communities that believe them.
The battle armour mission: Breaking the silence
Through my Battle Armour podcast and advocacy, I have heard from survivors of childhood sexual abuse. Many of us have been burdened for years by shame, self-blame, and the belief that we were at fault. Silence protects predators, not children. The 300 million children being exploited online today could become tomorrow's traumatised adults if we do not act now. Many will not survive. Financial sextortion alone has led boys and young men to take their own lives. For 40 years, I thought I was alone and that speaking up would ruin me. I was wrong. Speaking up saved my life. Breaking the silence gave me purpose. Helping other survivors showed me I am not alone, and neither are they.
The children being groomed on Roblox right now, the teenagers being blackmailed on Discord, and the young gamers being led from Fortnite into predator-run servers, they are all of us. They are me at 11 years old, confused and hurt, carrying a secret that affected me for forty years. We need to hold the people who built these platforms accountable for what happens inside them.
The reckoning
The gaming industry created these dangerous spaces. They allowed children to be harmed. When the lawsuits came, more than 80 so far, they called the evidence "sensationalised". One person, one screen, and one gaming console reportedly ruined 459 childhoods. The platforms that allowed this are worth billions. The question isn't whether the gaming industry can protect children. It's whether they ever intended to. The 300 million children who depend on us deserve an answer.
If you or someone you know has experienced childhood sexual abuse, support is available through local government agencies.
Read more from John Comerford
John Comerford, Author/Motivational Speaker
John Comerford is a leading advocate for men’s mental health and trauma recovery. A survivor of childhood sexual assault, he spent 40 years suffering in silence. After a suicide attempt, John began the journey to confront his past and rebuild. His book "Tarzan Loves Jane", a dark romantic comedy, is based on his true story. He later created "Battle Armour (25 Tools for Men’s Mental Health)" to give back. Today, he speaks, writes, and leads with one clear message to all men: Speak up.
His mission states, "No man suffers in silence".