The Tea Isn’t Spilled—It’s Stored, Mined, and Monetized
The untold story behind the Tea leak
I wasn’t planning to write about the Tea app.
Honestly, it felt too ridiculous. But then I saw the screenshots.
Moods. Journals. Session times. All just sitting out in the open, like someone took a therapy session and left the notes posted on the bulletin board at the local coffee shop.
It’s gone semi-viral, and for good reason.
People trusted a pastel-colored journaling app with their innermost thoughts. Meanwhile, the backend was stitched together with duct tape, Firebase defaults, and what I assume was a forgotten Slack message that said, “Add encryption?”
Naturally, I had to jump in.
Not because this leak shocked me (it didn’t), but because it fits perfectly into what I’ve been hammering all week—in the newsletter and on the podcast.
Behavioral surveillance is getting softer. And sneakier.
“We’re pouring our emotional lives into apps that were built like marketing tools, not safe spaces.”
This Tea situation? It's not an isolated screw-up. I see it as a sign of where we're headed:
Wellness tech as emotional extraction engines.
Calming interfaces hiding aggressive data practices.
“Private” mood logs that end up training ad and AI models.
So yeah, let’s talk about the Tea app.
Let’s talk about soft surveillance.
And let’s talk about why “mental health tech” is becoming one of the most dangerous categories on your phone—and no one seems to care.
Before using an app like this, take the opportunity to find out exactly what data it collects, who gets it, and where it goes.
Use my latest tool, Clarity.
And if you haven’t yet upgraded to the full Beyond The Firewall experience, now’s a good time.
Wellness Apps Are the New Trojan Horse
This is starting to make my blood boil.
People will scream about privacy violations if a government agency reads their text messages—but they’ll happily log their depression cycles, their 2AM spiral journaling, and their relationship stress into an app made by a VC-backed startup using a $9/month Firebase tier.
Because the user interface looks soft. The notifications are friendly. The branding says “self-care.”
That’s the Trojan horse. Wellness apps don’t look like surveillance tech. They look like help.
“The cleaner the interface, the dirtier the backend usually is.”
This is the sinister game they play.
These apps promise calm, but they’re often built with the same infrastructure as ad-tech:
Firebase Analytics tracking user behavior in real time
Third-party SDKs (software development kits) for A/B testing, attribution, and monetization
Session logs and mood entries stored in plaintext or weakly encrypted databases (or, in this case, no encryption at all; a quick sketch of what that exposes follows this list)
Minimal security hardening—no threat modeling, no proper authorization flows, default access tokens still active
No local-only mode—everything gets piped back to the cloud by design
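To make that concrete, here’s a minimal sketch in Python of what “no proper authorization flows” and default cloud settings can mean in practice. The project URL and paths below are hypothetical, not Tea’s actual backend; the point is that a Firebase Realtime Database left in test mode will hand its contents to any unauthenticated REST request.

```python
# Minimal sketch, assuming a hypothetical Firebase project left with open
# ("test mode") rules. This is NOT Tea's backend -- the URL and paths are
# placeholders to show what a default, unauthenticated read looks like.
import requests

DB_URL = "https://example-wellness-app-default-rtdb.firebaseio.com"  # hypothetical

def dump_path(path: str):
    """Try to read a database path with no auth token at all."""
    resp = requests.get(f"{DB_URL}/{path}.json", timeout=10)
    if resp.status_code == 200 and resp.json() is not None:
        return resp.json()  # world-readable: the rules never required auth
    return None  # locked down, or the path is empty

if __name__ == "__main__":
    for path in ("users", "mood_entries", "sessions"):
        data = dump_path(path)
        print(path, "->", "EXPOSED" if data else "not readable without auth")
```

If a read like that succeeds, every item on the list above stops being hypothetical.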
Most of these apps weren’t built by people who understand privacy. They were built by product teams who understand shipping to end users as fast as possible, and shipping fast usually means cutting corners.
They use your mood data to personalize your “experience,” sure—but they also use it to shape retention flows, upsell premium tiers, and experiment with engagement nudges that prey on emotional lows.
If you log “feeling anxious” for three days in a row, that becomes a trigger. A push notification might suddenly remind you to come journal. Or upgrade. Or share feedback.
Sounds helpful.
It’s not.
It’s behavioral nudging based on vulnerability.
“You’re not being supported. You’re being segmented.”
What Actually Got Leaked—And Why It Matters
The headlines said “data leak.” The reality? It was an open window into the most intimate parts of people’s lives.
From what we know, the Tea app leak exposed:
User session logs – every time the app was opened or closed, and how long sessions lasted.
Mood entries – raw emotional check-ins like “anxious,” “unmotivated,” “hopeful.”
Emotional state changes – time-stamped, with precise session timing and ordering.
Device metadata – including model, OS version, and sometimes IP-linked details.
Behavioral flows – how users moved through different features in the app.
On paper, this sounds like generic analytics. But here’s what people miss:
“In emotion tech, context is identity.”
If someone logs “anxious” at 3:14AM five days in a row, on the same IP, with the same emotional pattern, that’s no longer anonymous. That’s a behavioral fingerprint.
This isn’t like leaking an email address. This is how people feel when they’re at their most exposed—insomniac, overwhelmed, reaching for comfort in an app that promised privacy and gave them Firebase logs instead.
And because this data was time-stamped, sessionized, and tied to technical fingerprints, it’s trivially easy to correlate with other app activity, location data, or advertising IDs.
With a little cross-referencing, you could link those mood logs to:
Fitness tracker data
Google Maps history
Instagram scroll patterns
Spotify emotional playlists
Purchases made within hours of “low” mood entries
You see where this goes?
Once you can line up someone’s emotional volatility with their consumer behavior, you don’t just have a user profile. You have a psychographic weapon.
That’s what psychographics means: grouping people by social status, interests, and opinions.
Leaks like this aren’t just embarrassing. They’re exploitable.
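Here’s a toy illustration of how cheap that cross-referencing is. Everything below is made up (no real identities, no leaked records), but the matching logic is the whole trick: two “anonymous” datasets that move at the same times belong to the same person.

```python
# Toy re-identification sketch with entirely fabricated data. It matches a
# pseudonymous mood log against an identified dataset purely by timing.
from datetime import datetime

def fingerprint(timestamps):
    """Reduce activity to a set of (day, hour, minute) slots: a crude behavioral signature."""
    return {(t.day, t.hour, t.minute) for t in timestamps}

# "Anonymous" export: pseudonymous IDs -> when mood entries were logged.
mood_logs = {
    "user_8f3a": [datetime(2025, 7, d, 3, 14) for d in range(1, 6)],  # 3:14 AM, five nights running
    "user_c210": [datetime(2025, 7, d, 21, 5) for d in range(1, 6)],
}

# A second, identified dataset: say, ad-SDK pings tied to an advertising ID.
ad_pings = {
    "jane_doe_device": [datetime(2025, 7, d, 3, 14) for d in range(1, 6)],
}

for anon_id, entries in mood_logs.items():
    for known_id, pings in ad_pings.items():
        overlap = fingerprint(entries) & fingerprint(pings)
        if len(overlap) >= 3:  # arbitrary threshold for the demo
            print(f"{anon_id} is almost certainly {known_id} ({len(overlap)} matching time slots)")
```

Scale that up to millions of rows and the word “anonymized” stops meaning much of anything.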
Want to see the end game of this? Check out my post from last week on the surveillance economy:
The Myth of “Safe” Data in Emotion Tech
There’s this comforting lie baked into the marketing of wellness and mental health apps:
“We don’t collect sensitive data. Just general mood logs, usage metrics, and anonymous behavior trends.”
Sounds nice. But it’s bullshit.
Because emotional data—especially time-stamped, device-linked, repeatedly logged emotional data—is sensitive, whether it contains your name or not.
Let’s break the myth wide open.
❌ “It’s anonymized.”
No, it’s not. Not really.
The Tea app, like most startups, didn’t build true anonymization. They stored logs tied to device IDs, IPs, or session tokens. That’s re-identifiable with basic cross-referencing. Especially if you're logging emotional states from the same phone, on the same WiFi, at the same time every night.
“Anonymity ends the second you behave like yourself.”
❌ “It’s not personal—it’s just moods.”
You know what’s more revealing than a search history?
Your mood history.
Searches show curiosity.
Mood logs show vulnerability.
If a model knows when you feel anxious, burnt out, jealous, or lonely—it doesn’t need to know your name. It knows when you’re weak. It knows when to sell. It knows how to shape the next “nudge.”
The patterns are the payload.
❌ “It’s for your benefit.”
Let’s be real.
You think your three-day sadness streak is triggering a care protocol?
More likely it’s being flagged for engagement optimization:
“Users are more likely to open the app after logging low moods—trigger push notification.”
“High anxiety users spend more time in the premium journal view—offer upsell.”
“Correlate low mood logs with Instagram scroll sessions for ad retargeting.”
This is why emotional data is so lucrative. It doesn’t just describe how you feel—it predicts how you’ll act.
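To be clear about what that looks like in practice, here is a hedged sketch of the kind of rule those quoted lines describe. It is not Tea’s code or any vendor’s real API, just a plausible reconstruction of “three low moods in a row, send a nudge.”

```python
# Hypothetical engagement rule, reconstructed from the pattern described above.
# Not Tea's code; the segment names and actions are invented for illustration.
LOW_MOODS = {"anxious", "sad", "unmotivated", "lonely"}

def should_nudge(mood_history, streak_needed=3):
    """Return a 'campaign' if the most recent entries form a low-mood streak."""
    recent = mood_history[-streak_needed:]
    if len(recent) == streak_needed and all(m in LOW_MOODS for m in recent):
        # In a real analytics stack this would enqueue a push notification
        # or flag the user for a premium-upsell experiment.
        return {"action": "push_notification", "segment": "vulnerable_high_intent"}
    return None

print(should_nudge(["hopeful", "anxious", "anxious", "anxious"]))
# -> {'action': 'push_notification', 'segment': 'vulnerable_high_intent'}
```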
The Real Stats: Which Apps Got the Memo… and Which Didn’t
A study of wellness apps found over 60% collected personal data, but only 30% implemented strong encryption
Another analysis showed 89% of mental health and wellness apps push data online, often without local-only modes or proper controls
And in some app portfolios, nearly 41% had no privacy policy at all—no transparency about what data was tracked, retained, or shared
These aren’t minor oversights. They’re systemic failings—and they’re in apps people use to explore their vulnerabilities.
What You Can Actually Do – Tactical Digital Detox
Alright. So your emotional state is being mined. Your mood logs are up for grabs. And your “calm” app might be quietly working with the same analytics stack as a mid-tier mobile game.
But this isn’t a lost cause. Here’s how to fight back—smartly, practically, and without needing to go off-grid.
Step 1: Assume Emotional Data = Sensitive Data
If an app collects how you feel, treat it like it’s storing your medical records. Because in the wrong hands, it basically is.
Red Flags:
No local-only mode
No clear privacy policy
Vague “data sharing for service improvement”
Built on Firebase or other cloud tools with insecure defaults (yes, you can often spot this in app-store reviews or with a quick search on your favorite search engine)
If you wouldn’t text it to your boss, don’t log it in an app.
Step 2: Break the Tracking Chain
Even if you’re not logging emotions directly, emotional profiling happens via behavioral context: when you open the app, what other apps you use before/after, how fast you scroll, where your thumb hovers, etc.
Your Moves:
Split identities – Use alternate browsers or profiles for different tasks. Don’t journal and shop in the same data session.
Network-level defenses – Use tracker-blocking DNS (like NextDNS) or a hardened VPN (like Mullvad or ProtonVPN) to strip out metadata and calls to third-party ad SDKs; a quick way to verify it’s working is sketched at the end of this step.
Use trusted tools – Local-first, encrypted-by-default, or open-source alternatives. You can always run them sandboxed if you're unsure.
If you’re unsure where to start, I’ll say this clearly:
I recommend Proton’s suite, Brave, Vivaldi, Firefox, Bitdefender, Malwarebytes, Above Phone, Kanary, Incogni, Optery, DeleteMe. (If you want to try something else—ask me first.)
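On the network-level point, here’s a quick sanity check you can run in Python. The hostnames are common analytics endpoints I chose for illustration; your blocklist and resolver behavior may differ, so treat this as a rough signal, not an audit.

```python
# Rough check that tracker-blocking DNS is doing something: try to resolve a
# few well-known analytics hosts. A blocking resolver usually returns nothing
# or a sinkhole address like 0.0.0.0. Hostname list is illustrative, not exhaustive.
import socket

TRACKER_HOSTS = [
    "app-measurement.com",   # Google Analytics for Firebase
    "graph.facebook.com",    # Meta SDK traffic
    "api.mixpanel.com",      # common product-analytics SDK
]

for host in TRACKER_HOSTS:
    try:
        ip = socket.gethostbyname(host)
        status = "BLOCKED (sinkholed)" if ip in ("0.0.0.0", "127.0.0.1") else f"resolves to {ip}"
    except socket.gaierror:
        status = "BLOCKED (no answer)"
    print(f"{host}: {status}")
```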
Step 3: Clean Your Emotional Exhaust
Just like a car leaves behind emissions, you leave behind behavioral exhaust. The goal here isn’t total erasure—it’s noise. Confuse the machine.
Practical moves:
Wipe unused apps. If you haven’t opened an app in 30 days, kill it.
Request data deletions. Use tools like Kanary or Optery to trigger removal requests.
Log manually in secure environments. Pen + paper still wins for true privacy. If it must be digital, try local-only apps like Obsidian or Standard Notes (with no cloud sync); a bare-bones encrypted, local-only sketch follows this list.
Don’t link payment accounts to emotion-based apps. Keep that data island separate. Always.
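If you want the digital route and the apps above don’t fit, here is a bare-bones sketch of what “local-only” can look like in Python, assuming the cryptography package is installed. The file names and flow are my own invention, not a product recommendation; the point is that entries are encrypted at rest and never touch a network.

```python
# Minimal local-only journaling sketch (assumes `pip install cryptography`).
# Entries are encrypted with a key stored on your machine and nothing is synced.
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("journal.key")   # keep this offline and backed up
LOG_FILE = Path("journal.log")

def load_key() -> bytes:
    if not KEY_FILE.exists():
        KEY_FILE.write_bytes(Fernet.generate_key())
    return KEY_FILE.read_bytes()

def add_entry(text: str) -> None:
    token = Fernet(load_key()).encrypt(text.encode())
    with LOG_FILE.open("ab") as f:
        f.write(token + b"\n")

def read_entries():
    if not LOG_FILE.exists():
        return []
    f = Fernet(load_key())
    return [f.decrypt(line).decode() for line in LOG_FILE.read_bytes().splitlines() if line]

if __name__ == "__main__":
    add_entry("Felt anxious at 3AM. Stayed off the apps.")
    print(read_entries())
```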
“If they can’t predict you, they can’t profit from you.”
Ready to Take This Seriously? The Digital Detox Clinic Is Open
Look—I’m not here to scare you for clicks. If you’ve been following my recent posts or listening to the podcast, you already know this isn’t a one-off Tea app drama.
This is systemic.
It’s Google watching what you almost search.
Meta feeling out your emotions mid-scroll.
Amazon guessing when you’ll finally buy.
Now it’s “mental health” apps capturing your mood swings and anxiety logs and storing them like CRM data.
And it’s all stitched together.
If you’re starting to see the pattern, good. That means your defenses are waking up.
The Digital Detox Clinic is the perfect next step.
This isn’t about going full hermit or smashing your phone with a hammer. It’s about reclaiming agency. Taking back leverage from platforms that profit off your mind, your mood, and your moments of weakness.
Inside the Clinic:
I show you how to map your real exposure—not just your settings, but your behavior
We walk through getting that exposure cleaned up (the easy way)
I show you the hidden settings in all your social media that need to be adjusted right now
You’ll learn how to opt-out of data brokers the right way
And you’ll get a proven strategy to stay private long term
“You don’t need to disappear. You just need to become unpredictable.”
→ Click here to enter the Clinic
For the next 24 hours you can get special insider pricing usually reserved for my paid subscribers.
Before You Go: Drop a Comment, Spill the Tea
Let’s crowdsource the real stories.
Have you ever caught an app reacting to your emotions a little too closely?
Do you remember the moment you realized your “private” data wasn’t actually private?
Or are you still trying to figure out where the hell all these creepy ads are coming from?
Leave a comment. Name the app. Share the moment. The more we say it out loud, the harder it is for these systems to hide behind “oops, we care about your privacy.”
Because they don’t.
But we do.
Coming up Monday, join me for a brand-new episode of The Privacy Files.
If you missed the last one, you’ll want to watch it now:
The Privacy Files: Deep Cover Edition
In this special edition of Privacy Files, we're launching Deep Cover—a new format that goes beneath the headlines and straight into the mechanisms of control.
Until next time…