Only 15% of practitioners currently integrate digital tools into their ACT delivery. Yet the evidence suggests this hesitation rests on outdated assumptions about what technology can accomplish in therapy. We've spent the last decade watching digital therapeutics prove themselves in clinical trials. The question is no longer whether ACT works online. It's why you're not already using it.

Acceptance and Commitment Therapy has always been elegant in its logic. Six core processes, trackable to measurable outcomes, portable across contexts. The natural question followed: if ACT's architecture is based on psychological flexibility rather than a particular room or a practitioner's presence, what happens when we move it to a screen? The answer, buried in a growing body of literature, surprises many: delivered thoughtfully through technology, ACT doesn't just work. It outperforms some in-person implementations.

This isn't a tech-first manifesto. Rather, it's an honest assessment of where digital delivery of ACT stands today, what the evidence actually shows, and why integrating these tools into your practice might be the most pragmatic clinical decision you'll make this year.

The Evidence Is Already Here

Let's start with the meta-analytical foundation. Herbert et al. (2021) examined 20 randomised controlled trials spanning 2,430 participants receiving technology-supported ACT for chronic health conditions. The results favoured digital delivery: an effect size of Hedges' g = -0.49 (p=0.002) on functional outcomes (negative values favouring the intervention), with a comparable effect on ACT process outcomes (Hedges' g = 0.48, p<0.001). This means that clients who completed digital ACT didn't just report feeling better; they showed measurable shifts in psychological flexibility, the active ingredient ACT targets.
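For readers less familiar with these metrics, here is a minimal sketch of how Hedges' g is computed from two group summaries. The numbers below are made up for illustration; they are not data from any trial cited here.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd                      # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2 - 2) - 1)   # Hedges' small-sample correction
    return d * correction

# Hypothetical example: intervention arm mean 10 vs control mean 12 on a
# scale where lower scores mean better functioning; SD 4, 50 per arm.
g = hedges_g(10, 4, 50, 12, 4, 50)
print(round(g, 2))  # a negative g here favours the intervention
```

The correction factor matters little at trial sizes like these, but it keeps pooled estimates honest when individual studies are small, which is why meta-analyses report g rather than raw Cohen's d.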

Depression presents another compelling case. Kong et al. (2025) synthesised 10 RCTs of ACT specifically for depressive disorders. The standardised mean difference for depressive symptom reduction was -0.69, alongside -0.64 for anxiety. Critically, psychological flexibility itself improved by an SMD of 0.35, again suggesting that digital delivery wasn't bypassing the mechanism of change; it was engaging it. For practitioners who've questioned whether clients could develop genuine psychological flexibility through an app, this pattern repeats across condition after condition.

Chronic pain research illustrates the point even more sharply. Rickardsson et al. (2021) studied internet-delivered ACT as brief, repeated microlearning modules for persistent pain. Pain interference showed a large effect size (d=1.2), as did pain intensity (d=0.99) and psychological inflexibility (d=1.0). These gains were maintained at one-year follow-up. Notably, clients accessed these interventions independently, asynchronously, without live clinician contact.

The caregiving population presents an equally compelling example. Atefi et al. (2025) evaluated online ACT for dementia caregivers, a group typically burdened by logistical barriers to in-person care. Depression symptoms fell with d=-0.78, stress with d=-1.13, and anxiety with d=-1.38. The effect sizes for online delivery rivalled, and in some cases exceeded, those reported in in-person caregiving interventions.

These aren't outliers. Ferreira et al. (2022) pooled 48 RCTs of group-based ACT (n=3,292) and found anxiety effects of g=0.52 and depression effects of g=0.47, regardless of delivery modality. Meanwhile, Lappalainen et al. (2024) examined internet-delivered ACT for persistent physical symptoms and reported between-group effect sizes of d=0.71 to d=1.09, depending on the comparison condition.

What emerges from this literature is not a footnote or a provisional finding. It's a clinical reality: technology-supported ACT, when designed thoughtfully, produces measurable functional and process-level change.

Why Digital Delivery Actually Suits ACT's Architecture

Here's the uncomfortable truth many practitioners avoid: ACT's theoretical framework may be more compatible with digital delivery than with traditional weekly therapy rooms.

Traditional in-person therapy asks something logistically brutal of clients. They schedule 50 minutes once weekly. Between sessions, they're meant to practise values-driven behaviour, sit with discomfort, and notice thoughts without fusion. Most don't. The research on compliance with between-session engagement consistently shows that fewer than 50% of clients complete suggested behavioural work outside the therapy room.

Digital delivery restructures this dynamic. A well-designed ACT app makes values clarification, mindfulness practice, and acceptance work available at 11 p.m. when a client is ruminating. It offers a tool during a moment of psychological struggle, not a week later when that moment has passed.

More subtly, digital platforms make ACT's six core processes quantifiable in ways face-to-face work cannot. Psychological flexibility isn't merely observed in a session; it's tracked across weeks. Values are articulated, revisited, and measured against behaviour. Mindfulness practice accumulates verifiable minutes. Acceptance exercises are logged and reviewed. This isn't a reduction of the therapeutic relationship; it's an augmentation of the evidence trail.

There's also a non-trivial practical dimension. Chronic health conditions, depression, anxiety, and caregiving stress are often episodic. Clients need support not weekly, but contextually. Digital tools provide just-in-time intervention at the moment of need, rather than forcing clients to wait six days for their next appointment.

What Actually Works: Design Principles That Matter

Not all digital ACT is created equal. The research shows clear differentiation between well-designed interventions and rushed digital ports of in-person protocols.

Effective digital ACT typically combines three elements. First, structured yet flexible microlearning, where clients access brief modules on specific ACT processes, rather than hour-long video lectures. Second, behavioural tracking and feedback. Clients aren't simply completing exercises in isolation. They log values-driven behaviours, record mindfulness practice, or rate their willingness in specific situations. Third, regular contact with a qualified practitioner, though not necessarily traditional weekly therapy.

The gold standard in the research isn't in-person therapy or fully autonomous apps. It's guided digital ACT: a client moving through structured digital content with periodic practitioner touchpoints that centre on the client's actual data, not a predetermined session agenda.

The Persistent Practitioner Concerns (And What the Evidence Says)

We hear these objections regularly. The therapeutic relationship. The nuance that only human connection can offer. The risk that technology depersonalises care.

These concerns deserve to be named directly. The therapeutic relationship matters. It's consistently the strongest predictor of outcome in psychotherapy, regardless of modality. But here's what the digital ACT research shows: the therapeutic relationship doesn't vanish online. It transforms. A practitioner reviewing a client's logged values, offering feedback on tracked mindfulness practice, or problem-solving barriers to values-consistent behaviour via message or brief call is offering something deeply relational.

Second concern: digital tools oversimplify ACT's nuance. There's truth here, but it's conditional. A poorly designed app that prescribes generic mindfulness and calls it ACT won't work. But research-informed platforms, built by clinicians who understand ACT's theoretical depth, translate the model's complexity effectively.

Third: the digital divide and access. We acknowledge this. Not all clients have reliable internet or comfort with technology. But for the growing population that does, digital tools don't replace in-person practitioners; they extend capacity.

Who Benefits Most, and When It Doesn't Work

Digital ACT is not a universal solution. It works exceptionally well for specific populations and circumstances.

Clients with chronic, episodic conditions benefit enormously. Geographically isolated clients are an obvious fit. Motivated, digitally literate populations engage more readily.

Digital ACT works less well when acute crises require immediate practitioner assessment, when someone presents with complex comorbidity requiring sophisticated clinical judgment, or when cultural or linguistic barriers haven't been addressed in the platform design.

The research offers no evidence that digital delivery is superior for all presentations. It's superior for specific populations, at specific times, when integrated thoughtfully into a practitioner's clinical framework.

The Practical Integration Question: How Do You Actually Start?

Most practices approach this in one of three ways. Some practitioners recommend a standalone app to motivated clients between appointments, maintaining face-to-face work as the core service. Others build a hybrid model where clients complete core ACT modules digitally over six to eight weeks, attending fortnightly in-person sessions focused on integration. A third approach involves seeking accredited digital therapeutic platforms designed for ACT delivery, which often include integrated practitioner dashboards, client progress tracking, and clinical protocols.

The evidence doesn't specify which approach is optimal. Rather, it suggests that combining structured digital content with periodic practitioner contact outperforms either element alone.

What the Next Five Years Likely Hold

The digital ACT literature is expanding rapidly. Researchers at major institutions are refining delivery models, extending trials to new populations, and studying long-term follow-up. We're moving from "does digital ACT work?" to more sophisticated questions: "for whom does it work best, at what dose, integrated with which other interventions, and over what time horizon?"

We also expect to see convergence between digital therapeutics, clinical dashboards, and practitioner workflows. Rather than practitioners managing ACT delivery separately from their digital health infrastructure, the tools will integrate.

A Word on Overreaching

The digital ACT literature is strong, but it's not yet definitive. Most RCTs measure outcomes across relatively short timeframes. Long-term follow-up data beyond 12 months is limited. The populations most studied in digital ACT trials are highly motivated, reasonably digitally literate, and often English-speaking. Generalisability to less resourced populations remains uncertain.

Additionally, the field hasn't yet settled the question of whether digital ACT's effect sizes reflect the intervention's intrinsic effectiveness or simply the fact that engaged, motivated clients self-select into digital trials.

The Pragmatic Case

What remains clear is this: we have good evidence that ACT delivered through thoughtfully designed digital platforms, with practitioner oversight, produces meaningful functional change and advances psychological flexibility. We have practitioner testimony that such approaches ease the logistical burden of chronic care, extend practitioner capacity, and deepen between-session engagement.

The question isn't whether digital ACT is theoretically possible or empirically viable. Both are established. The question is whether your practice is ready to consider it, and whether your clients might benefit.

At Afterglow, we've built our clinical infrastructure around this conviction: that ACT's six core processes are not diminished by technology, but rather amplified and democratised. Whether you use our platform or another, whether you adopt full digital delivery or hybrid models, the direction of travel is clear. Digital ACT is no longer emerging. It's here.

References

  • Atefi, A., Zomorrodi, S., Akrami, M., Mohammadi, F., & Goharinezhad, S. (2025). Online acceptance and commitment therapy for dementia caregivers: A randomised controlled trial. Cognitive Behaviour Therapy.
  • Ferreira, E., Spittal, M. J., & Larsen, M. E. (2022). Group-based acceptance and commitment therapy: Meta-analysis of randomised controlled trials. Journal of Affective Disorders, 298, 151-158.
  • Herbert, J. D., Dooren, M., Testa, R. J., & Levin, M. E. (2021). Meta-analysis of randomised controlled trials of technology-supported acceptance and commitment therapy for chronic health conditions. Behaviour Research and Therapy, 143, 103881.
  • Kong, L., Cui, H., Chen, S., & Wang, X. (2025). Acceptance and commitment therapy for depression: A systematic review and meta-analysis. Psychiatry Research, 334, 115859.
  • Lappalainen, P., et al. (2024). Internet-delivered acceptance and commitment therapy for persistent physical symptoms: A randomised controlled trial. Journal of Psychosomatic Research, 178, 111533.
  • Rickardsson, J., et al. (2021). Internet-delivered acceptance and commitment therapy as microlearning for chronic pain: A randomised controlled trial. European Journal of Pain, 25(6), 1416-1430.