I Let AI Guess My Age From a Selfie. The Result Was Brutal.


According to CNET, freelance journalist Amanda Smith tested Noom’s newly launched AI Face Scan feature in October 2025, which uses technology from NuraLogix to estimate health metrics from a selfie. The tool, part of Noom’s health platform, analyzed her photo using remote photoplethysmography (rPPG) to detect blood flow and stress patterns. Despite being 37 years old, the AI estimated her biological age at 44, flagged potential issues with cardiac workload and triglycerides, and then directed her to a page to purchase GLP-1 drugs like Ozempic. Smith, who took the photo without makeup while recovering from an IVF procedure, found the lack of substantive lifestyle advice disappointing. She subsequently fed the raw data into ChatGPT, which generated a detailed 30-day action plan involving cardio, diet, and sleep optimization.


The science is fascinating but fuzzy

Here’s the thing about the rPPG tech Noom uses. It’s trying to do something pretty wild: turn your phone’s camera into a medical sensor. The idea is that tiny, invisible color changes in your skin can reveal your heart rate, stress levels, and blood flow. And in controlled lab settings, this stuff shows promise: one 2021 study even suggested image-based methods could outperform some wearable sensors for pulse rate.

But. And it’s a big but. The real world is messy. Lighting, skin tone, makeup, and even your recent activities can throw it off. A 2023 review pointed out major limitations for estimating things like blood pressure outside a clinic. Noom knows this, of course. That’s why the app buries a disclaimer saying its insights are “for informational purposes only.” So you’re not getting a diagnosis. You’re getting a high-tech, algorithmically generated guess.
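To make the idea concrete, here is a minimal sketch of the core rPPG trick on synthetic data. This is purely illustrative, not Noom’s or NuraLogix’s actual method: it assumes you already have a stack of face-region video frames, averages the green channel (blood absorbs green light most strongly), and reads the pulse rate off the dominant frequency.

```python
import numpy as np

def estimate_pulse_bpm(frames: np.ndarray, fps: float) -> float:
    """Estimate pulse rate from subtle green-channel brightness changes.

    frames: array of shape (n_frames, height, width, 3), RGB.
    fps: capture frame rate in frames per second.
    """
    # 1. Spatially average the green channel -- its mean brightness
    #    pulses slightly with each heartbeat.
    green = frames[:, :, :, 1].mean(axis=(1, 2))
    # 2. Remove the baseline so only the fluctuation remains.
    signal = green - green.mean()
    # 3. Find the dominant frequency in a plausible heart-rate band.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # ~42 to 240 beats per minute
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # Hz -> beats per minute

# Synthetic demo: a 72 bpm "pulse" baked into 10 s of fake 30 fps frames.
fps, bpm = 30.0, 72.0
t = np.arange(int(10 * fps)) / fps
pulse = 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t)
frames = np.full((len(t), 8, 8, 3), 128.0)
frames[:, :, :, 1] += pulse[:, None, None]
print(round(estimate_pulse_bpm(frames, fps)))  # recovers 72
```

On clean synthetic input this recovers the heart rate exactly, which is precisely the point of the caveats above: in a lab the signal is trivial to extract, but flickering lights, compression artifacts, and motion bury that half-unit green-channel wobble in noise.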

The real product is often the upsell

Smith’s experience highlights the awkward dance these health apps perform. They dangle the carrot of personalized, futuristic insight, but the path often leads straight to a checkout page. In her case, after delivering the jarring biological age number, the app’s primary recommendation was to buy prescription weight-loss drugs. Talk about a buzzkill.

It feels like a bait-and-switch. You go in curious about your body’s inner workings, and you come out feeling like you’ve been profiled for a sales funnel. This isn’t unique to Noom—it’s a rampant business model in digital health. The “free” scan is just the lead generator. The real goal is to monetize your anxiety about the result. So, while the tech itself is cool, you can’t ignore the commercial engine driving it.

The power move? ChatGPT as your health interpreter

Now, this is where Smith’s story gets clever. Instead of stewing over the result or buying the suggested product, she did what more of us will probably start doing: she used one AI to explain another. She copy-pasted Noom’s findings into ChatGPT and asked for an actionable plan. And ChatGPT delivered a comprehensive, seemingly reasonable list of lifestyle tweaks—from HRV breathing to Mediterranean diets.

Basically, she used the AI app as a data-gathering tool and a separate, more general AI as an analyst. This is a fascinating workaround. The specialized app (Noom) handled the complex biometric interpretation from an image. The large language model (ChatGPT) synthesized that data into plain English and structured advice. It’s not a replacement for a doctor, but as a way to make sense of these opaque health reports? It’s pretty powerful.

So, should you try it?

Look, if you’re curious and can treat the results as a piece of entertainment or a gentle nudge—not gospel—these tools can be a fun mirror. They might highlight trends you already suspect. Maybe you’ll see a high stress score and finally commit to that meditation app. As Noom itself suggests in a blog post, the value is in sparking positive change.

But you have to go in with serious skepticism. Don’t panic over the number. Don’t buy the supplement or drug they push without talking to a human professional. Think of it like a high-tech fortune cookie—a vague, generated reading that might contain a kernel of truth. The best use case, as Smith found, might just be to take that kernel and use it to start a more informed conversation with your doctor, or even with a disinterested AI that isn’t trying to sell you anything.
