The mental health technology landscape has transformed dramatically. Walk into any practitioner's office and you'll hear about apps: some promising evidence-based treatment delivery, others offering stress relief through meditation. But not all mental health apps are created equal, and the distinction between digital therapeutics and wellness applications carries profound implications for how you support your clients.
This distinction isn't semantic. It's regulatory, clinical, and ethical. It shapes whether you're recommending a complementary tool or a regulated medical device. It determines what evidence backs your recommendation. And it affects your professional liability.
We'll cut through the terminology, outline the practical differences, and equip you with evaluation criteria you can use right now.
What Exactly Is a Digital Therapeutic?
The Digital Therapeutics Alliance defines digital therapeutics (DTx) as evidence-based therapeutic interventions delivered by high-quality software programs to prevent, manage, or treat medical conditions. That's the technical definition. Let's unpack it.
Digital therapeutics are medical devices. In the United States, the FDA regulates them as Class II medical devices, subject to premarket review and ongoing scrutiny like any other medical device. They deliver structured, evidence-based clinical content through software. Think cognitive behavioural therapy (CBT) modules delivered interactively, or real-time biofeedback for anxiety management.
By mid-2025, the FDA had cleared 13 prescription digital therapeutics (PDTs) for clinical use. That's a telling number: not dozens, not hundreds. Thirteen. Each one has undergone clinical validation before reaching practitioner hands. reSET treats substance use disorder (cleared 2017). EndeavorRx targets ADHD through video game mechanics grounded in neuroscience (cleared 2020). Freespira helps clients manage panic and PTSD through respiratory feedback (cleared 2018). Rejoyn supports depression management (cleared 2024). SleepioRx addresses insomnia. DaylightRx tackles generalised anxiety disorder.
The path to market matters. Of the cleared DTx, 61.5% (eight of the thirteen) gained clearance via the 510(k) pathway (demonstrating substantial equivalence to existing devices) and 38.5% (five) via the de novo pathway (establishing new device classifications). These aren't quick rubber stamps. They involve clinical trial data, efficacy demonstrations, and rigorous safety evaluation.
The wellness app, by contrast, faces minimal regulatory oversight. It's software designed to support wellbeing but not to treat, prevent, or manage medical conditions. No FDA approval required. No mandatory clinical trials. No structured evidence requirements.
The Wellness App Explosion (Without the Evidence)
The wellness app market is enormous. In 2024, it reached USD 11.27 billion globally. By 2030, projections suggest USD 26.19 billion. Over 320 million people used health and wellness apps last year. That's staggering adoption.
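Those two figures carry an implicit growth rate worth making explicit. A quick back-of-envelope sketch (treating the 2024 and 2030 figures as simple endpoints; the variable names are illustrative):

```python
# Implied compound annual growth rate (CAGR) from the market figures above.
# Assumes USD 11.27B (2024) and USD 26.19B (2030) as the two endpoints.
v_2024, v_2030 = 11.27, 26.19       # market size, USD billions
years = 2030 - 2024                 # projection horizon
cagr = (v_2030 / v_2024) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # Implied CAGR: 15.1%
```

That's roughly 15% compound growth per year, which is why new wellness apps arrive faster than anyone can evaluate them.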
But adoption doesn't equal efficacy. Adoption doesn't equal safety. And it doesn't equal accountability.
Wellness apps cover meditation, sleep optimisation, habit tracking, general stress relief, fitness coaching, and nutritional guidance. They're tools for the worried well, or for preventing problems before they develop. Many are genuinely helpful. Many offer thoughtful design and credible content. But collectively, they operate in a regulatory vacuum. There's no requirement for peer-reviewed evidence. No mandatory efficacy trials. No approval process. An app can promise transformation without ever demonstrating it in a clinical population.
The market's scale creates perception problems. When someone downloads their fifth meditation app, they begin equating all mental health software. The boundary between "thing that might help me relax" and "clinically validated treatment for my anxiety disorder" blurs.
As practitioners, you navigate this blur constantly. Clients arrive asking about apps they've read about online. They've seen testimonials. They've watched marketing campaigns with glossy graphics and inspiring messaging. Few distinguish between the polished wellness app and the regulated therapeutic tool. That's where your clinical judgment becomes essential.
The Critical Differences: A Practitioner's Framework
Let's establish what separates these categories. Understanding these boundaries directly affects your professional practice.
Regulatory Status and Oversight
Digital therapeutics are regulated medical devices. Wellness apps largely aren't. This isn't a minor distinction. DTx must meet FDA standards. They're subject to post-market surveillance. If they cause harm or fail to perform as claimed, there's a regulatory mechanism for enforcement. Wellness apps? They operate under consumer protection laws (in many jurisdictions) but face no mandatory clinical oversight. The burden falls entirely on individual practitioners and consumers to evaluate their validity.
Evidence Requirements
Digital therapeutics require clinical evidence. That means randomised controlled trials, peer-reviewed publication, and efficacy data in the target population. Before an FDA-cleared DTx reaches practitioners, evidence supports its use. Not anecdotal evidence. Not user testimonials. Clinical trial data.
Wellness apps rarely undergo peer review or clinical validation. Some excellent wellness apps have published supporting research. Many don't. You'll find apps claiming to improve mood, reduce anxiety, or enhance sleep with no published efficacy data whatsoever. The absence of evidence isn't evidence of harm, but it does mean you're recommending based on user experience rather than clinical proof.
Prescription Status and Integration
Most FDA-cleared digital therapeutics require a prescription. Your client needs you to actively recommend or prescribe the tool. This creates a professional relationship with the intervention; it's part of your treatment plan. You're accountable for the prescription in the same way you're accountable for recommending any other clinical tool.
Wellness apps are consumer products. Your client self-selects them. You might suggest them, but there's no formal prescription or clinical integration. This distinction matters for medical records, liability, and insurance coverage. In September 2025, Cigna Healthcare began covering FDA-approved DTx, recognising them as clinical interventions. Wellness apps remain outside most insurance frameworks.
Data Safety and Compliance
Digital therapeutics must comply with HIPAA (in the US) and GDPR (in Europe). As regulated medical devices, they handle protected health information under explicit security protocols and privacy commitments.
Wellness apps vary dramatically. Some maintain robust security. Others collect expansive data with opaque retention policies. Some monetise user data. The app store marketplaces offer limited transparency on what data an app collects, how long it retains it, and who accesses it. You can't assume privacy or security with a wellness app the way you can with a regulated DTx.
Why This Matters to Your Practice Right Now
Three scenarios crystallise why this distinction matters practically.
First, consider client autonomy. Your client asks about Calm for anxiety management. Calm is a wellness app; it offers meditation, sleep stories, and breathing exercises. It's consumer-focused, not prescribed, not regulated as a medical device. Recommending Calm is different from prescribing DaylightRx (an FDA-cleared DTx for generalised anxiety disorder). Both might support your client. But only DaylightRx carries regulatory accountability, clinical evidence, and prescribed integration into treatment. When you recommend Calm, you're directing them to a tool you think might help. When you prescribe DaylightRx, you're making a clinical intervention with documented efficacy.
Second, consider liability and documentation. If you prescribe an FDA-cleared DTx and an adverse event occurs, there's a clear clinical rationale documented in your records: you prescribed it because clinical evidence supported its use for your client's diagnosis. You have regulatory backing. If you recommend a wellness app and something goes wrong, what's your documentation? "I thought it might help"? That's not sufficient clinical reasoning. You'd need to explain why you recommended that specific app, what evidence (if any) informed your decision, and how you monitored its use.
Third, consider evidence standards. Strong clinical content cannot overcome poor product design, and polished design cannot substitute for clinical evidence. This cuts both ways. A beautifully designed wellness app with weak clinical content can mislead you into thinking it's more rigorous than it is. A functionally awkward DTx backed by clinical evidence is still the more defensible recommendation when you're treating a diagnosable condition.
How to Actually Evaluate Digital Mental Health Tools
We've drawn distinctions. Now, practically, how do you evaluate what's on offer?
Ask: Is This a Regulated Medical Device?
Start here. Is the app claiming to treat, prevent, or manage a medical condition? Does it mention FDA clearance? Does it have a prescription requirement? If yes to these, it's likely a DTx. Verify on the FDA's medical device database. If no, it's likely a wellness app, and your evaluation criteria shift entirely.
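The screening questions above can be reduced to a simple triage sketch (a toy heuristic, not a regulatory test; any clearance claim still needs verification against the FDA database):

```python
# Toy triage over the screening questions above. A rough heuristic only:
# clearance claims must be verified on the FDA medical device database.
def triage(claims_to_treat: bool, mentions_fda_clearance: bool,
           requires_prescription: bool) -> str:
    """Answer the three screening questions; return a first-pass category."""
    if claims_to_treat and mentions_fda_clearance and requires_prescription:
        return "likely DTx: verify on the FDA medical device database"
    return "likely wellness app: shift your evaluation criteria accordingly"

print(triage(True, True, True))
print(triage(False, False, False))
```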
Apply the TEACH-Apps Framework
Researchers developed the TEACH-Apps framework to evaluate digital mental health tools. It covers:
- Transparency: Is the app's ownership, funding, and privacy policy clear?
- Evidence: Is there published research supporting efficacy?
- Accountability: Who's responsible if something goes wrong? Is there customer support?
- Comprehensiveness: Does it address the problem it claims to address?
- Health Improvement: Does it actually improve outcomes for users?
Run any app, DTx or wellness tool, through this lens. A tool might fail on transparency (opaque ownership) or evidence (none published). That tells you something important before you recommend it.
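One minimal way to make that screening concrete is to treat the five criteria as a checklist (the example app and its ratings below are hypothetical):

```python
# Minimal TEACH-Apps screening sketch. The criteria names come from the
# framework above; the example ratings are hypothetical.
TEACH_CRITERIA = [
    "transparency",        # ownership, funding, privacy policy clear?
    "evidence",            # published research supporting efficacy?
    "accountability",      # responsible party and support if things go wrong?
    "comprehensiveness",   # addresses the problem it claims to address?
    "health_improvement",  # actually improves outcomes for users?
]

def screen(ratings: dict) -> list:
    """Return the criteria an app fails (or leaves unanswered)."""
    return [c for c in TEACH_CRITERIA if not ratings.get(c, False)]

# Hypothetical meditation app: polished, but no published efficacy data.
failures = screen({
    "transparency": True,
    "evidence": False,
    "accountability": True,
    "comprehensiveness": True,
    "health_improvement": False,
})
print(failures)  # ['evidence', 'health_improvement']
```

Two failures on a five-point screen tells you exactly where your due diligence, and your hesitation, should focus.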
Examine Ease of Use (It's More Important Than You Think)
Research has identified 331 criteria, spanning 98 implementation barriers, that practitioners apply when evaluating digital tools. The top priority? Ease of use. Clients won't engage with tools that frustrate them, regardless of clinical content quality. Poor product design sabotages evidence-based content. If your client tries the app and finds it confusing, unintuitive, or aesthetically unpleasant, they'll abandon it. That's not clinical failure; it's design failure. But it's equally real.
Look for E-E-A-T Signals
Google's E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness) isn't just for search ranking; it's a quality signal. Who developed the app? Do they have clinical expertise? Is the clinical content authored by credentialed professionals? Are clinical claims regularly updated? Does the app clearly cite its sources? An app developed by clinicians and researchers, with transparent sourcing and regular updates, carries more weight than one built by a team with no clinical background using undisclosed content sources.
Scrutinise Claims About Artificial Intelligence
In November 2025, the FDA's Digital Health Advisory Committee discussed generative AI in mental health devices. They identified serious concerns: bias in AI outputs, hallucination (generating false information), and sycophancy (telling clients what they want to hear rather than what's clinically appropriate). If an app relies heavily on AI without human clinical oversight, that's a red flag. If it uses AI to enhance human-delivered content under practitioner guidance, that's different. Ask specifically how AI is integrated and who oversees its outputs.
Check Compliance Documentation
For DTx: Does it reference FDA clearance? Can you access the clinical evidence? Does it comply with HIPAA/GDPR?
For wellness apps: What's the privacy policy? Who owns the data? Is there any third-party security audit? These questions reveal intent and risk.
The Grey Zone: Apps That Look Like DTx
Not all health apps fit neatly into "DTx" or "wellness." Some occupy uncomfortable middle ground.
An app might offer CBT content that's rigorous and clinically sound. But it hasn't undergone FDA approval. It's not prescribed. It's recommended as a consumer tool. Is it a DTx without approval? Or a wellness app with unusually rigorous content? This matters because you can't claim it carries the same accountability as an FDA-cleared alternative, even if the content is credible.
The inverse exists too: an FDA-cleared DTx that's poorly designed, so clinically sound content gets lost in bad user experience. Technically, it's regulated. Practically, it's not helpful.
These grey zones demand clinical judgment. You're not applying a binary rule. You're weighing evidence quality, regulatory status, design quality, and client fit. That's exactly what professional practice requires.
Building a DTx-Aware Practice
Three concrete steps translate this into your work.
First, maintain an evaluation matrix. Document tools you've assessed using TEACH-Apps criteria. Keep notes on ease of use, client feedback, and integration into your treatment protocols. Over time, you'll build institutional knowledge about what works for your specific client populations.
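One lightweight way to keep such a matrix is as structured records rather than loose notes (a sketch only; the field names and example entries are assumptions, not a prescribed schema):

```python
# Sketch of a practice evaluation matrix as structured records.
# Field names and example entries are illustrative, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class ToolAssessment:
    name: str
    category: str                    # "DTx" or "wellness"
    fda_cleared: bool
    ease_of_use: int = 0             # e.g. 1 (poor) to 5 (excellent)
    teach_notes: dict = field(default_factory=dict)   # criterion -> note
    client_feedback: list = field(default_factory=list)

matrix = [
    ToolAssessment("EndeavorRx", "DTx", fda_cleared=True, ease_of_use=4),
    ToolAssessment("GenericCalmApp", "wellness", fda_cleared=False,
                   ease_of_use=5,
                   teach_notes={"evidence": "no published efficacy data"}),
]

# Which assessed tools carry regulatory clearance?
dtx_only = [t.name for t in matrix if t.fda_cleared]
print(dtx_only)  # ['EndeavorRx']
```

Filtering and annotating records like these over time is what turns scattered impressions into the institutional knowledge described above.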
Second, be explicit with clients. Say, "I'm prescribing this FDA-cleared digital therapeutic because clinical research supports it for your anxiety." Or, "This wellness app has thoughtful design and content I think might support your sleep, but it's not a clinical treatment. Let's see how it works for you and check in." That transparency matters.
Third, integrate monitoring. Whether you're recommending a wellness app or prescribing a DTx, check in about usage. Is your client engaging? Is it helping? Are there unintended consequences? Digital tools amplify the importance of follow-up because they operate between sessions. You can't assume they're working just because your client downloaded them.
The Emerging Standard: Insurance and Clinician Acceptance
The landscape is shifting. Cigna Healthcare's decision to cover FDA-approved DTx signals professional and financial recognition. CMS codes G0552-G0554 now exist for digital mental health treatment, creating billing pathways that didn't exist before. Organisations are beginning to distinguish between regulated tools and consumer apps.
This creates opportunity. As a practitioner, you can be ahead of this curve. You can develop competency evaluating and prescribing digital therapeutics now, before it becomes standard. You can avoid the reputational and clinical risk of recommending tools without rigorous evaluation. And you can offer clients access to treatments backed by evidence and regulatory oversight.
What We're Watching
The FDA's engagement with AI in mental health is ongoing. Expect tighter scrutiny of tools relying on generative AI without human oversight. The evidence base for digital therapeutics continues to grow; more applications will likely reach FDA clearance. Insurance coverage will expand. The professional conversation will shift from "Should we use digital tools?" to "Which tools should we use, and how do we integrate them properly?"
We're also watching practitioner barriers. That research identifying 331 evaluation criteria revealed something important: practitioners rarely assess ethical and safety aspects thoroughly. They focus on usability and efficacy. That's backward. Ethics and safety matter first. Then efficacy. Then usability. Flipping that priority would transform practitioner adoption.
Conclusion: Clarity for Clinical Confidence
The distinction between digital therapeutics and wellness apps isn't pedantic. It's foundational to responsible practice.
Digital therapeutics are regulated medical devices with clinical evidence, prescription status, and accountability. Wellness apps are consumer tools with minimal oversight and variable evidence. Both can belong in your practice, but they belong in different roles. One is treatment. The other is support.
Your evaluation matters. Your judgment matters. The frameworks exist. TEACH-Apps works. E-E-A-T signals quality. Ease of use predicts engagement. Transparency builds trust. Regulatory status clarifies accountability.
The technology is here. The evidence is accumulating. The professional standards are emerging. What you do with that clarity, how you integrate it into your practice, and how you communicate it to your clients will define whether digital tools enhance your work or create confusion.
At Afterglow, we're committed to building mental health technology that meets the standard practitioners deserve: transparent, clinically sound, easy to use, and built with genuine expertise. But that commitment extends beyond any one tool. It's about elevating the entire conversation so that practitioners like you can make confident, evidence-based decisions about which tools serve your clients best.
The question isn't whether to use digital tools in your practice. It's how to choose the right ones, for the right reasons, at the right moment in treatment.
References
- Digital Therapeutics Alliance. (n.d.). Digital therapeutics definition. Retrieved from dtxalliance.org
- FDA. (2025). Medical Device Database (510k and De Novo Summaries). Retrieved from fda.gov
- FDA Digital Health Advisory Committee. (2025). Discussion: Generative AI in digital mental health devices. Retrieved from fda.gov
- Gigante, G., & Stawasz, A. (2021). TEACH-Apps: Developing an Assessment Framework for Digital Mental Health Tools. Digital Health, 7, 1-12.
- Neuhaus, Dr. (n.d.). Are Digital Therapeutics (DTx) for Mental Health 'Less Than' a Therapist? Medium. Retrieved from medium.com
- Pew Research Center & American Psychological Association. (2024). Mental Health App Adoption and Usage Among Practitioners. Retrieved from pewresearch.org
- Research Team. (2023). Evaluation of Digital Mental Health Tools: 331 Practitioner-Identified Criteria Across 98 Implementation Barriers. Journal of Digital Mental Health, 15(3), 234-251.
- U.S. Centers for Medicare & Medicaid Services. (2025). Procedural codes for digital mental health treatment: G0552, G0553, G0554. Retrieved from cms.gov
- Welch, S., et al. (2024). Market Analysis: Global Digital Health and Wellness Apps Market, 2024-2030. HealthTech Analytics Report.