
Oops, AI Just Snatched Your Voice, Face, and Cat Pics and Might Be Using Them Better Than You

  • Writer: Brainz Magazine
  • Oct 22, 2025
  • 4 min read

AI isn't just a nosy roommate anymore; it's more like a con artist wearing your hoodie, your face, and maybe even your LinkedIn profile. From apps quietly stockpiling your selfies to bots absorbing every rant you've ever posted at 2 a.m., your digital DNA is being cloned without so much as a "thanks."



The fallout? Deepfakes that can nuke reputations, stolen data feeding shady algorithms, and a professional identity crisis where your AI twin outperforms you in job interviews. Health-wise, constant digital exposure and the anxiety of not knowing where your data goes can chip away at mental well-being.


The fix is to audit your apps like you're catching a bad ex in a lie, call out sketchy terms of service, watermark your work, and maybe stop yelling into the internet (just kidding, kind of). Protect your digital self before AI makes a deepfake version that sings karaoke.


The free-for-all of your digital DNA


Remember when you posted that one picture of your cat wearing sunglasses? Cute. A global news organisation reported that photos of actual Australian children were swept into massive AI training datasets without their parents' knowledge or permission.[1] If kids' faces aren't safe, what makes you think your tabby's Instagram career isn't being hijacked for machine learning glory?


Then there's Reddit! You thought your 2 a.m. existential rant about cereal mascots went unnoticed? Nope. Reddit is suing AI company Anthropic for allegedly scraping user comments to train its chatbot Claude.[2] Which means, congratulations, your most unhinged post might already be living rent-free inside a chatbot's brain.


"No one knows how AI is going to evolve tomorrow. Personal data is not legally protected, and therefore not protected from misuse by any actor or any type of technology."[1]


The serious fallout (yes, even beyond cat pics)


Jokes aside, the consequences aren't just embarrassing. We're talking about:


  • Professional harm: Imagine a deepfake version of you nailing a job interview better than the real you.

  • Identity theft: Your voice and face could be cloned into scams or fake endorsements you'd never sign off on.

  • Mental health stress: Constant uncertainty about where your data ends up can leave you spiraling harder than a YouTube rabbit hole at 3 a.m.


Top 10 internet AI self-defense moves (2025 edition)


Protecting your data in an AI-driven world demands active defense.[3] Think of it as digital karate, except your opponent is invisible, tireless, and really into scraping memes.

Here are ten steps to fight back:


  1. Audit app permissions: Like you're catching a bad ex in a lie. Revoke unnecessary access to your camera, mic, or contacts.

  2. Scrutinize terms of service: If it's longer than War and Peace and full of vague "may use your data" clauses, beware.

  3. Watermark or cloak your images: Subtle tech like Fawkes makes tiny, imperceptible changes to your photos that scramble facial recognition systems.[4]

  4. Use privacy tools: VPNs, tracker-blocking browser extensions, and encrypted messaging apps are your new digital sunscreen.

  5. Set up Google Alerts for your name: So you'll know if your "digital twin" shows up somewhere it shouldn't be.

  6. Use strong authentication: Two-factor, passkeys, and unique passwords for every account. Think of it as adding multiple locks on your digital front door.

  7. Keep everything updated: Outdated apps and systems are hacker candy. Enable automatic updates.

  8. Be mindful about oversharing: Sure, post your latte art, but maybe skip the geo-tagged vacation pics until you're back home.

  9. Support stronger AI regulation: Systemic change matters. Laws should make your data yours, not free fuel for algorithms.[3]

  10. Practice digital minimalism: Less data shared = less data to exploit. Ask yourself before posting, "Does future-me want this living online forever?"
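Step 6's "unique passwords for every account" is far easier with a generator than with memory. Here's a minimal sketch using Python's standard-library `secrets` module, which is designed for security-sensitive randomness (the function name and default length are just illustrative choices, not a standard):

```python
import secrets
import string

def make_password(length: int = 20) -> str:
    """Generate a cryptographically strong random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Every call produces a fresh, unpredictable password.
print(make_password())
```

Of course, a reputable password manager does the same job and remembers the results for you, which is the practical option for most people.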


The bottom line


Protecting your digital self doesn't mean going off the grid and raising goats in the mountains (though tempting). It means fighting smarter, locking down permissions, watermarking your creations, and supporting policies that force companies to treat your data as your property.


Because here's the deal: if you don't protect your digital self, AI will. And unlike your roommate, it won't even leave a passive-aggressive Post-it on the fridge.


Follow me on Instagram for more info!

Read more from Maranda Sloan

Maranda Sloan, Special Guest Writer and Executive Contributor

Maranda is a passionate advocate for holistic health, with a personal journey that led to the creation of Haloblujuices, where life breeds life. Through research and hands-on experience, she’s uncovered innovative, sustainable solutions that go beyond conventional wellness. Maranda empowers others to make meaningful improvements to their health, offering a refreshing approach to living well.

References:

[1] The Guardian. (2024, July 3). Photos of Australian children used in dataset to train AI, human rights group says.

[2] AP News. (2025, September 18). Reddit sues AI company Anthropic for allegedly 'scraping' user comments to train chatbot Claude.

[3] Stanford HAI. (2024, January 16). Privacy in an AI era: How do we protect our personal information?

[4] Shan, S., Wenger, E., Zhang, J., Li, H., Zheng, H., & Zhao, B. Y. (2020). Fawkes: Protecting privacy against unauthorized deep learning models. In Proceedings of the 29th USENIX Security Symposium (USENIX Security '20) (pp. 1589-1604). USENIX Association.

This article is published in collaboration with Brainz Magazine’s network of global experts, carefully selected to share real, valuable insights.

