How To Know Trust When You See It (Without Guessing)
An introduction to Privacy-Enhancing Technologies
Most of what I publish focuses on what’s broken.
The manipulation. The surveillance. The quiet, back-end deals where your data becomes currency.
And honestly?
There’s a lot to call out. A lot of rot behind the interfaces we use every day.
But here's something I’ve noticed—especially lately:
Many people, myself included, across many arenas are great at pointing out what’s wrong.
But we rarely offer a clear picture of what right looks like, or what a good solution would be.
For my area of discussion, this means going beyond just “use Signal” or “turn off app tracking” and asking:
What does real digital trust actually look like?
What does it mean for a platform to handle your data well?
How do you know if a tool actually deserves your trust—or if it’s just good at pretending?
This post is my attempt to answer that. Not with platitudes, but with practical signals.
If we can’t define what good looks like, we’ll keep settling for “not as bad.”
So today, I’m going beyond calling out the flaws and mapping an actual blueprint for what, in my opinion, right looks like.
We’ll look at privacy-enhancing technologies (PETs) in plain terms.
I’ll show you how the most trustworthy platforms are built differently.
And I’ll give you a clear checklist to spot the difference between real digital trust... and marketing copy pretending to be it.
So instead of more criticism, this post is about clarity—what to look for, what to demand, and how to recognize when a platform actually respects your privacy.
Before we get into today’s post, be sure to try out my new app, Clarity, to analyze a platform’s Terms of Service or Privacy Policy before you sign up.
Also try the browser extension, currently available for Firefox (all other browsers coming later next week).
And for a limited time, get a discount when upgrading to my paid tier.
What Are PETs—and Why Should You Care?
If you’ve heard the term “privacy-enhancing technologies” thrown around lately, you might assume it’s just another industry buzzword. Something platforms say to sound responsible without changing how they actually operate.
But the truth is, PETs aren’t a slogan. They’re a set of technologies and design choices that, when done right, make it technically impossible for a platform to abuse your data—even if it wanted to.
Here’s what I mean.
Privacy-enhancing technologies are exactly what they sound like:
Tools built to protect you by design, not just by policy.
They don’t ask you to trust a company. They’re structured so you don’t have to.
These aren’t futuristic concepts—they’re already out there in tools you might be using:
End-to-End Encryption → Keeps your messages readable only by you and the intended recipient. No middleman access, not even the platform itself.
Example: Signal, Session, ProtonMail.

Zero-Knowledge Architecture → Your data is stored, but the company has no way of seeing it. Think of it like a locker where you hold the only key.
Example: Proton Drive, Standard Notes, Tutanota.

Local-First Design → Data is processed on your device instead of being shipped off to a cloud server to be analyzed and monetized.
Example: Obsidian, Mullvad VPN, the Above Phone.

Differential Privacy → Allows platforms to collect broad trends without exposing your individual behavior.
Example: Apple uses this when collecting usage statistics across iOS without tying them to your identity.

Decentralization → Removes the single point of control entirely, giving you direct ownership of your identity and data.
Example: Mastodon, Matrix (Element), SimpleX Chat.
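To make differential privacy less abstract, here’s a toy sketch of one classic mechanism behind it, randomized response. The function names and numbers below are my own illustration, not any platform’s actual implementation: each user flips a coin before answering, so no single response can be trusted, yet the aggregate rate is still recoverable.

```python
import random

def randomized_response(truth: bool, rng: random.Random) -> bool:
    """Report the true answer only half the time; otherwise report a coin flip.
    The collector can never be sure whether any single response is real."""
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

def estimate_true_rate(responses: list) -> float:
    """Recover the population rate. Since E[yes] = 0.5*p + 0.25,
    we can solve for p = 2 * (yes_rate - 0.25)."""
    yes_rate = sum(responses) / len(responses)
    return 2 * (yes_rate - 0.25)

rng = random.Random(42)
# 10,000 users, 30% of whom truly have the sensitive attribute
truths = [rng.random() < 0.30 for _ in range(10_000)]
reports = [randomized_response(t, rng) for t in truths]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The platform learns the trend (about 30% of users) while any individual can plausibly deny their reported answer.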
The common thread is that these aren’t just features you have to go in and adjust; they’re structural choices.
Any platform can write a privacy policy full of nice-sounding promises. But if those promises aren’t backed by actual PETs, they’re just words.
The strongest form of digital trust is when a company couldn’t violate your privacy even if it tried.
And that’s the shift we need to start looking for—not companies who say the right things, but companies who build systems that remove the option to do the wrong thing.
Do you have a solid privacy plan based on your unique digital footprint?
If not, you need one now. Get yours for FREE today —> Custom Privacy Plan
What Real Digital Trust Looks Like
The problem with “privacy” is that it’s been reduced to branding. Every company says they care about your data. Every platform has a clean-looking privacy policy. But very few of them are actually building systems where trust isn’t required.
So let’s step back from the slogans and look at what real trust looks like in the wild.
✅ They collect as little data as possible.
Trustworthy platforms don’t default to data hoarding. They ask only for what they need to function—and often give you the option to say no.
Example:
Brave blocks third-party trackers by default and runs ad matching locally, so your browsing history never leaves your device.
Proton doesn’t log IP addresses, and their apps work without tying usage to a phone number or name.
✅ They design with PETs from the start.
This is the key difference between companies that care and companies that comply. PETs aren’t something you bolt on later. They’re a foundation.
Example:
ProtonMail built end-to-end encryption into its architecture. Even if their servers were breached, the emails would be unreadable.
Obsidian lets you write and store notes locally without any cloud sync unless you set it up. The default is your machine, not theirs.
✅ They publish what they’re doing (and invite scrutiny).
Real trust is open-source, auditable, and documented. If a company is serious about privacy, they’ll publish transparency reports, security audits, and even their codebase.
Example:
Signal is open-source from top to bottom—including the protocol itself.
Mullvad publishes detailed updates about everything from app security to warrant requests (and doesn't even require an email address to use their service).
✅ They give you control, not just settings.
A real privacy-respecting tool doesn’t bury your options under layers of UI. It gives you clean, meaningful control over what’s collected, stored, and shared.
Example:
Firefox Containers let you isolate websites and accounts inside separate browser identities, which means you can keep platforms from leaking into each other—even in the same browser session.
✅ They’re not part of the ad ecosystem.
This is non-negotiable. If a platform’s business model is behavioral targeting, it can’t be privacy-first. Full stop.
Example:
Vivaldi makes this clear—they don’t sell data, serve ads, or work with ad-tech partners.
Above Phone is built without Google dependencies, period. No Firebase, no Play Store telemetry, no silent data tradeoffs.
These aren’t just “good vibes.” They’re engineering choices with real trade-offs. In many cases, it costs more to do privacy right. But that’s the point.
When a platform is willing to make things harder for itself in order to do privacy right, it makes things safer for you.
Want to know more about what data brokers are and how to get rid of your data? Check out this post:
5 Signals That a Platform Deserves Your Trust
You don’t need to be a security researcher to spot a surveillance-driven app. You just need to know what to look for and, more importantly, what questions to ask.
Here are five signals that tell you whether a platform actually deserves your trust, or if it’s just hiding behind a privacy-scented marketing campaign.
1. Data Minimization
Are they collecting only what’s necessary—or are they grabbing everything they can?
Look for:
No forced logins (can you use it without creating an account?)
No unnecessary permissions (like camera/mic/GPS when it’s not core to functionality)
The option to skip analytics, personalization, or usage tracking
Red flag: If the app “needs” your location for a journaling feature or can’t run without full access to contacts, it’s likely monetizing that data behind the scenes.
2. Default Privacy Settings
Do they protect your data by default—or make you dig through settings to opt out?
Look for:
Tracking is off by default (you opt in, not out)
Encryption is on from the first launch
Minimal data collection happens unless you explicitly allow it
Red flag: If you have to spend 15 minutes turning everything off, they’ve already gotten what they wanted.
3. Open Source or Auditable Infrastructure
Can third parties actually verify what’s under the hood?
Look for:
Open source code (or at least a public-facing audit)
Regular security reviews with documentation
Transparency reports that show how they handle legal requests
Red flag: No transparency at all. Closed code with vague promises and zero public track record.
4. No Third-Party Trackers or Data Brokers
Are they clean—or just pretending to be?
Look for:
No use of Facebook SDKs, Google Firebase, or third-party analytics platforms
Clear documentation of what data is collected and who it’s shared with
Network traffic that doesn’t quietly ping random ad networks
Pro tip: Use Exodus Privacy or TrackerControl to scan apps for hidden SDKs.
5. Local-First or Zero-Knowledge Architecture
Is your data processed on your device—or uploaded for “improvements”?
Look for:
Local data processing by default
Cloud sync is optional, not required
Passwords, notes, or messages are encrypted in a way they can’t read
Red flag: They “encrypt your data,” but still offer personalized ads or content. That means they’re reading it somewhere.
You don’t need all five boxes checked every time. But if a platform hits none of them, it’s not a tool; it’s a trap.
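To show what signal #5, zero-knowledge architecture, means structurally, here’s a deliberately simplified Python sketch. The passphrase, helper names, and XOR keystream below are my own toy illustration, not production cryptography (real services use vetted ciphers like AES-GCM). The point is that the key is derived on your device, so the server only ever stores ciphertext it cannot read.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Key is derived on YOUR device; the passphrase is never sent anywhere.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def xor_keystream(key: bytes, length: int) -> bytes:
    # Toy stream cipher for illustration only.
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt(note: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(note, xor_keystream(key, len(note))))

decrypt = encrypt  # XOR is its own inverse

# Client side: encrypt BEFORE upload
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
ciphertext = encrypt(b"my private note", key)

# Server side: stores only (salt, ciphertext), unreadable without the passphrase
# Client side again: only the passphrase holder can recover the note
assert decrypt(ciphertext, key) == b"my private note"
```

This mirrors the trade-off zero-knowledge services accept: if you lose the passphrase, even the provider can’t recover your data, and that limitation is exactly what makes the architecture trustworthy.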
Don’t Just Trust—Verify
Every app claims to protect your privacy.
Some even say things like “we don’t sell your data” or “your security is our top priority.”
It sounds good. That’s the point.
But here’s what most people don’t realize:
Privacy claims mean nothing unless you can confirm them.
The good news? You don’t have to be tech-savvy to check the basics.
Below are simple, practical ways to spot whether a platform actually respects your data—or just markets itself like it does.
✅ Step 1: Search the App on Exodus Privacy
It’s a free site that tells you (Android only):
Which trackers are built into an app
What permissions the app asks for (like your location or contacts)
Just type the app’s name into the search bar.
What to look out for:
If the app says it’s “private” but is using Facebook, Google, or ad trackers behind the scenes—walk away.
iOS users can check Apple’s built-in App Privacy Report (under Settings → Privacy & Security) instead.
✅ Step 2: Skim the Privacy Policy (You Only Need 60 Seconds)
Don’t read the whole thing. Just search for a few key phrases:
“...trusted partners”
“...third-party services”
“...used for product improvement”
“...anonymous analytics”
Those usually mean they’re collecting more than they should.
Also check:
Is there a clear way to delete your data?
Do they say how long they keep your info?
Can you turn off tracking or ads?
If none of that is easy to find, that’s your answer.
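If you’d rather not eyeball the policy yourself, that sixty-second skim is easy to automate. Here’s a minimal Python sketch; the phrase list and sample policy text are my own examples, not an exhaustive rule set:

```python
# Red-flag phrases that often signal broad data sharing or collection.
RED_FLAGS = [
    "trusted partners",
    "third-party services",
    "used for product improvement",
    "anonymous analytics",
]

def scan_policy(text: str) -> list:
    """Return the red-flag phrases found in the policy (case-insensitive)."""
    lowered = text.lower()
    return [phrase for phrase in RED_FLAGS if phrase in lowered]

policy = """We may share information with trusted partners and
third-party services to deliver anonymous analytics."""
print(scan_policy(policy))
# ['trusted partners', 'third-party services', 'anonymous analytics']
```

Paste in any policy’s text and the hits tell you where to read closely; an empty result doesn’t prove the policy is clean, just that the obvious phrases are absent.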
Better yet, use my app Clarity to take the guesswork right out of it.
✅ Step 3: Pay Attention to What the App Does
Sometimes your phone tells the truth before the company does.
Did it ask for way too many permissions when you installed it? (Camera, mic, location—even though it doesn’t need them?)
Are you getting creepily specific ads after using it?
Did you try turning off personalization, but nothing changed?
Those are all signs it’s watching more than it says it is.
✅ Step 4: Try This Yourself — Ask for Your Data
Under privacy laws like the GDPR and CCPA, most apps are legally required to let you:
Download the data they have on you
Delete your account entirely
Remove personal info from their systems
If they make this process difficult, ignore your request, or take forever to respond—that’s a red flag.
A company that respects your privacy won’t fight you on this. They’ll make it easy.
Bottom line:
You don’t have to trust companies at face value anymore.
You just have to ask the right questions—and pay attention to the answers.
Final Thought: Privacy Isn’t a Setting—It’s a Standard
Most people still think of privacy as something you “turn on.” A setting buried somewhere. A button you toggle.
But the reality is: privacy is either built into the system or it isn’t.
You either use platforms that respect your boundaries, or you get used by ones that don’t.
This post wasn’t meant to overwhelm you. It was meant to reset your filter.
To give you a new lens for spotting what’s real, what’s fake, and what deserves your trust.
Because the truth is, you’re not powerless. You just haven’t been shown what to look for until now.
We’ve mapped out the real mechanics of digital trust—what signals matter, how to verify claims, and how to spot when you’re being sold safety instead of being given it.
Here’s your next step.
Unlock Smart Privacy Tools—Now at 50% Off
If you’re ready to move beyond theory, I’ve built practical tools that put trust into action. Here are a few of my top sellers:
No BS Guide to Securing Your Network: A clear roadmap for locking down your home tech—no jargon, just steps.
The Incident Response Playbook: A step-by-step plan to handle cyber threats confidently—perfect for freelancers, small businesses, or anyone flying solo.
Digital Privacy Toolkit: A curated collection of templates, workflows, and setup guides I use every day to stay private and secure.
Opting Out of Data Brokers Guide: A quick, hands-on guide to help you reduce your exposure across the major scrapers and data reseller networks.
Everything’s built for people who care about their privacy—but don’t have time to sort through hype.
Use code AUGUST50 at checkout for 50% off everything until 8/31.
Browse the store here
Let’s End with a Real Conversation
Before you go, I’d love your perspective:
What tool or service did you wish was private—but turned out not to be?
Have you ever quit an app because it crossed the line—or maybe you wish you had?
Or, what’s one real win you’ve had by choosing a privacy-first tool?
Drop a comment or reply. These posts aren’t just monologues—they’re blueprints built on real experience. And yours matters.
Until next time…