Verke Editorial

Is my data private with AI therapy? End-to-end encryption explained without jargon


Is my data private with AI therapy? With Verke, yes — your conversations are encrypted on your device and the keys never leave it, which means even Verke staff cannot read them. With other AI coaching products, the answer varies a lot. The honest version is that "your data is private" can mean five different things, and most products use the loosest definition. This article walks through what to look for in any product's privacy policy, what the technical terms actually mean, and the specific choices Verke made — explained in plain language rather than spec sheets.

The reason this matters: AI coaching conversations contain things most people would never tell a friend. Privacy isn't an abstract legal box on these conversations; it's the thing that lets you be fully honest in the first place. A conversation you're editing for an audience is a different conversation, and the technical posture of the product determines whether you're editing or not. Below: what "encrypted" actually means, how Verke handles your conversation specifically, what we still see, what to ask any AI coaching product, and the anonymity options that combine with encryption to give you a strong overall posture.


What "encrypted" actually means

Encryption is the process of scrambling text so it can only be read with a specific key. Without the key, the encrypted version looks like random bytes — useless to anyone who happens across it. The interesting question is never "is it encrypted?" (almost everything is, somewhere). The interesting question is "who holds the key?" That single question separates the products where the company can read your messages from the products where they cannot.

End-to-end encryption (E2EE) means the key lives on the user's device, not the server. The server only ever sees ciphertext — scrambled bytes — and has no way to decrypt them, because the key isn't there. Server-side or at-rest encryption (the more common flavor across SaaS products) means the company holds the key on their end. They've protected the data from outsiders, but the company itself can decrypt and read whenever it chooses to. Both are technically "encryption," but they answer the question of who-can-read very differently. E2EE is the harder thing to build and the stronger posture for the user.
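The "who holds the key" point can be made concrete with a toy sketch. This is throwaway Python using a homemade XOR stream cipher — an illustration of the idea only, nothing like production cryptography and not Verke's actual code. If the key is generated on the device and never uploaded, the server can store the scrambled bytes but has no way to reverse them:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256-derived keystream.
    For illustration only -- real products use AES-GCM, not this."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

message = b"something I would never tell a friend"
key = secrets.token_bytes(32)            # generated on the user's device

# E2EE: the server stores only the ciphertext; the key stays on the device.
ciphertext = keystream_xor(key, message)

# Without the key, the server sees scrambled bytes...
assert ciphertext != message
# ...and only the key holder can reverse the scrambling (XOR is symmetric).
assert keystream_xor(key, ciphertext) == message
```

In the at-rest model, that `key` variable would live on the server alongside the ciphertext — which is exactly why "encrypted" alone tells you so little.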

Want to talk about something private without it being read?

Talk it through with Anna — no signup, no email, no credit card.

Chat with Anna →

How Verke handles your conversation

When you send a message, your device encrypts it before it leaves — using AES-256-GCM, a strong symmetric cipher that's the standard choice for this kind of work. Symmetric means the same key encrypts and decrypts; the cipher is the algorithm that scrambles the bytes. AES-256-GCM is what banks, governments, and serious messaging apps use for the same job. It's well-studied, well-implemented, and not the place anyone breaks first.
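For the curious, here is roughly what that encrypt-then-decrypt round trip looks like in code. This is a minimal sketch using the third-party Python `cryptography` package's AESGCM interface — an illustration of the cipher, not Verke's actual client code:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit symmetric key, made on-device
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, unique per message

# The same key both encrypts and decrypts -- that's what "symmetric" means.
ciphertext = aesgcm.encrypt(nonce, b"private message", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"private message"
```

GCM also authenticates the message: if anyone tampers with the ciphertext in transit or in storage, decryption fails loudly instead of returning garbage.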

To get keys to the right places without anyone in the middle intercepting them, Verke uses RSA-4096 — an asymmetric cipher with a public/private key pair. Asymmetric means encryption uses the public key (which can be shared) and decryption uses the private key (which never leaves your device). The math means a server can hand you keys without ever knowing what those keys decrypt to. The 4096-bit key length is large by today's standards and gives generous headroom against future improvements in cryptanalysis.
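The public/private split can be seen in miniature with textbook RSA and deliberately tiny primes. This Python sketch is for intuition only — real RSA-4096 uses primes thousands of bits long plus padding schemes like OAEP, and nothing here should be used for real data:

```python
# Textbook RSA with tiny primes -- intuition only, never real crypto.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                         # public exponent: can be shared with anyone
d = pow(e, -1, phi)            # private exponent: never leaves the device

message = 42                   # in practice: a symmetric key, encoded as a number
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n)  # only the private-key holder can decrypt
assert recovered == 42
```

Notice the asymmetry: encrypting needs only `e` and `n`, which are public, so a server can relay encrypted keys without ever being able to open them.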

The result, in plain terms: ciphertext lives in the database. The private key lives on your device. Verke staff querying the database see scrambled bytes, not your conversation. There's no backdoor, no master key, no "but if a manager really wanted to" — the architecture is the same architecture used by end-to-end-encrypted messengers. You can read the deeper detail in our privacy policy; the section above is the gist.


What we still see (honestly)

End-to-end encryption protects the message content. It does not protect everything about every interaction. Verke sees metadata — when you logged in, which coach you talked to, how long the session lasted, roughly how many messages were exchanged. We see crash logs (which contain stack traces, not message text — we've checked this carefully). We see aggregate counts across all users — how many sessions, how many users opened the app this week, how many chose Anna versus Judith. None of that requires reading any individual's conversation, and none of it involves doing so.
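To picture the split, here is a hypothetical sketch of what a stored session record might look like. The field names are invented for illustration, not Verke's actual schema — the point is that metadata fields are readable while message bodies are not:

```python
# Hypothetical session record -- invented field names, illustration only.
session_record = {
    "user": "nickname-7f3a",             # pseudonymous account id
    "coach": "anna",                     # which coach was chosen
    "started_at": "2025-01-12T21:04:00Z",  # login / session timestamps
    "duration_minutes": 34,              # metadata the server can see
    "message_count": 18,
    "messages": [
        b"\x9c\x02\xe1\x7f\x44\xab",     # ciphertext only: unreadable server-side
    ],
}
```

Everything above `"messages"` is the metadata the article describes; the message bodies themselves are opaque bytes without the device key.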

What we do NOT see is message content — yours or anyone's. Subpoena, lawful intercept, internal investigation, curious engineer with database access — none of those produce readable conversation text, because the keys to read it aren't on our side. This isn't a promise about what we won't do. It's a description of what we technically can't do. That distinction is the whole point of E2EE.

What to ask any AI coaching product

Whether or not you choose Verke, this is the checklist worth running against any AI coaching product before you trust it with conversation content. Most products will fail at least one of these questions, and the answers are usually buried in privacy policies rather than highlighted up front:

  • Is content end-to-end encrypted, or only encrypted at rest? "Encrypted" alone usually means at-rest, which leaves the company able to read.
  • Can company staff read user conversations? Look for a clear no, not a vague "we follow strict access controls."
  • Is the conversation used to train models? "De-identified" or "aggregated" training data is still training on your content.
  • Where is data stored geographically? Jurisdiction affects what governments can compel and what protections apply.
  • Can I delete my account and have content erased? Look for an actual delete, not a 30-day soft-delete buried in a retention schedule.
  • Is a data-processing agreement (DPA) available? Required for any serious privacy posture, especially under GDPR.

Anonymity options

Verke requires no email, no phone number, and no payment details to start the trial. A nickname is enough. The technical reason is simple: the less identity is tied to the account, the less there is to leak even in the worst case. The product reason matters too: some of the people who most need this kind of conversation are the people most reluctant to attach a real-world identity to it. Asking for an email up front filters them out, and there's no good reason to ask.

Combined with end-to-end encryption, this gives you a fairly strong privacy posture: no identity tied to your conversations on Verke's side, and no readable content even if there were. If you decide later you want to subscribe and pay, that adds a payment identifier — but the conversation content remains as opaque to us as it was during your nickname-only trial. The encryption boundary doesn't move when the billing relationship starts.

When to seek more help

Self-help and AI coaching can do a lot, but they have limits. If you're experiencing severe depression that hasn't lifted, panic attacks that interrupt daily life, thoughts of self-harm, active trauma processing, or substance dependence — those are signals to work with a licensed clinician, not signals to push harder on a coaching tool. You can find low-cost options at opencounseling.com or international helplines via findahelpline.com. There's no prize for waiting longer than you need to.

Work with Anna

Privacy is fundamentally about trust, and trust is what depth work needs. Anna's psychodynamic approach is the closest thing Verke offers to a thinking partner you can be fully unguarded with — the kind of conversation where the things you'd normally edit out can stay in. The encryption described above is what makes that unguardedness not just an emotional permission but a technical one: the conversation is between you and the coach, no audience, no archive anyone else can read. For more on the method, see Psychodynamic Therapy.

Talk it through with Anna — no signup, no email, no credit card.

FAQ

Common questions

Can Verke staff read my conversations?

No. Verke uses end-to-end encryption — AES-256-GCM for messages, RSA-4096 for key exchange. The keys never leave your device, so even with full database access staff would see scrambled bytes, not message text. This is a structural guarantee, not a policy promise: the ability to read conversations doesn’t exist on Verke’s side.

What if Verke gets subpoenaed?

Verke can only hand over what it actually has — encrypted ciphertext (unreadable without your device key), session metadata (timestamps, durations), and account-level data, which is minimal because no email or phone is required. The actual content of your conversation is technically not retrievable. End-to-end encryption is the reason this answer is short.

Is my conversation used to train AI models?

No. Verke does not train its models on user conversation content. The underlying language models from OpenAI and Google have their own data-handling terms; Verke uses the API endpoints whose terms exclude training-on-content. Your conversation is for your conversation — it doesn’t become tomorrow’s training data on anyone’s side.

Can I use Verke completely anonymously?

Yes. No email, phone, or payment is required to start the trial — a nickname is enough. The combination of pseudonymous accounts and end-to-end encryption is one of the strongest privacy postures in the AI coaching space. There’s no identity tied to your conversations on Verke’s side because, structurally, there doesn’t need to be.

What if I lose my device?

Your encryption keys live on your device. Losing the device means losing access to past conversation history; Verke can give you a fresh account, but previous content cannot be decrypted by anyone — including you — without the original device. That’s the trade-off end-to-end encryption asks you to make: nobody else can read your past, including in the case where you wish they could.

Verke provides coaching, not therapy or medical care. Results vary by individual. If you're in crisis, call 988 (US), 116 123 (UK/EU, Samaritans), or your local emergency services. Visit findahelpline.com for international resources.