Kohler’s “Encrypted” Poop Scanner Isn’t What It Claims


According to TheRegister.com, Kohler launched its Dekoda toilet attachment in October 2025, a device with a downward-facing camera that analyzes waste for gut health, hydration, and blood. The company markets the Kohler Health app and device with “privacy-first features,” specifically claiming the system uses end-to-end encryption (E2EE). However, privacy researcher and former FTC advisor Simon Fondrie-Teitler published a blog post on Tuesday detailing how this claim is misleading. His analysis, based on the app and Kohler’s privacy policy, shows the company itself can access user data, meaning its “E2EE” amounts to standard HTTPS in transit plus encryption at rest. Furthermore, Kohler’s policy states it can use anonymized health data to train AI models and may share that de-identified data with third parties. Users can opt out of sharing personal data, but doing so may limit the services provided.


Why the E2EE claim is a problem

Here’s the thing: “End-to-end encryption” has a specific, widely understood meaning. It means data is encrypted on your device and only decrypted by the intended recipient’s device—the service provider in the middle can’t read it. Think Signal or iMessage. What Kohler is describing isn’t that. It’s encryption between your device and their server, where they hold the keys. As Fondrie-Teitler put it, if the user is one “end,” the other “end” is just… Kohler itself. That’s not E2EE; that’s basic web security that’s been standard for ages. Calling it E2EE gives users a false sense of security about who can access their incredibly sensitive health data. It’s a marketing term being misapplied to a system that, by design, allows the company full access.
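The distinction comes down to who holds the decryption key. Here’s a minimal, toy Python sketch of that idea (a one-time-pad XOR stands in for a real AEAD cipher; all names and data are invented for illustration):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad XOR: a toy stand-in for a real cipher like AES-GCM.
    return bytes(k ^ d for k, d in zip(key, data))

# --- True E2EE: the key exists only on the user's devices ---
msg = b"gut-health reading: normal"
user_key = secrets.token_bytes(len(msg))  # generated on-device, never uploaded
ciphertext = xor_cipher(user_key, msg)

# The server stores and relays the ciphertext but holds no key,
# so its "view" of the data is opaque bytes it cannot read.
server_view = ciphertext

# Only a device holding user_key can recover the plaintext:
assert xor_cipher(user_key, server_view) == msg

# --- What Kohler describes: TLS in transit + at-rest encryption ---
# TLS protects the data on the wire, but it is decrypted at the
# server, and the at-rest key belongs to the company. Architecturally,
# the company can read the plaintext whenever it chooses; in true
# E2EE, it cannot, even if it wants to.
```

In the E2EE case, a breach of the server (or a subpoena, or a product decision) exposes only ciphertext; in the TLS-plus-at-rest case, it exposes everything.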

What’s really happening with your data

So if it’s not truly private E2EE, what’s the data used for? The privacy policy is pretty clear. Beyond giving you health insights, Kohler says it can use anonymized health data “to train our AI models and for other machine learning purposes.” They may also disclose this de-identified data to third parties. Now, anonymization is a tricky beast—it’s often harder than companies think to truly strip data of all identifying marks, especially when it’s tied to unique health patterns. Fondrie-Teitler’s ideal scenario? This type of data should never leave the user’s device. Analysis could happen locally, with client-side encryption for any backups. But that doesn’t seem to be the business model here. The data is valuable. Basically, you’re trading detailed biometric information for the service, and that data becomes a resource for Kohler’s AI development. It’s a pattern we see everywhere now, but it hits different when it’s about your poop.

The broader stakes for privacy and IoT

This isn’t just about a weird toilet gadget. It’s a case study in how “privacy-first” and “encryption” are becoming diluted marketing buzzwords for Internet of Things (IoT) devices, especially in health tech. When companies in the industrial technology and manufacturing sectors, like Kohler, move into connected consumer products, they sometimes apply a superficial gloss of security without the underlying architecture. The risk is that consumers, already overwhelmed by complex privacy policies, trust the boldface claims. Misrepresenting encryption standards erodes that trust across the board. And let’s be real: if a company is fuzzy on the details of encryption for something this personal, what else might be poorly implemented? The physical security of the device? The access controls on their servers? It raises a lot of questions.

What should happen next?

Fondrie-Teitler told The Register he hopes Kohler simply updates its language to “more clearly articulate the scope of their privacy protections.” That’s the bare minimum. They should stop using “end-to-end encrypted” entirely for this product. But the bigger issue is choice. The policy says you can opt out of sharing personal data, but services may be limited. That’s a hollow choice for a product you’ve already bought. Shouldn’t the core analysis function without shipping your data to the cloud? This episode is a reminder to be deeply skeptical of privacy claims, even from established brands. Read the fine print. And maybe ask yourself: do I really need a connected device to analyze this? Sometimes, the most secure system is the one that isn’t connected at all.
