"Your newsletter felt like someone holding my hand through an unfamiliar field. It had none of the jargon-heavy complexity I brace for in most tech newsletters—just clarity, warmth, and flow."
Share
#20 | How a Clinician in India Is Using AI to Train Therapists
Hello dear reader,
It's an autumn Saturday morning.
This is my first fall in the States; rather, my first fall ever. I'm a tropical girl living in a temperate world, and the changing seasons have been such a joy to witness. 🍂
While we're talking of joys, one of the joys of writing TinT is meeting clinicians who are quietly redefining what it means to be tech-informed.
Today’s story is about one such clinician.
Jai Arora is a counselling psychologist from New Delhi, India, and cofounder of Kirana Counselling.
Jai’s curiosity led him from the therapy room to the world of large language models, and eventually, to building AI-powered client personas to train therapists.
From Counselling to Curating Data
I met Jai over LinkedIn earlier this year. We got chatting about our shared interests and our projects in the AI mental health tech space.
Jai completed his master’s in counselling psychology from Christ University, Bangalore, and cofounded Kirana Counselling, which offers both online and offline therapy services.
In early 2025, when the startup he worked at shut down, he used the unexpected pause to explore AI, a topic that had by then gripped nearly every profession.
He began with a foundational AI program at IIT Delhi and soon after partnered with Behtar Foundation, an NGO using AI to improve access and quality in India’s mental health sector.
That’s where he started building AI for therapy training.
Building AI Client Personas
…or as Jai calls them, “cadavers for practising therapy skills”
At Behtar Foundation, Jai and the team are building AI simulations that behave like clients, presenting emotional cues, symptoms, and conversational dynamics that mirror real therapeutic encounters.
These AI personas are designed to help train young therapists. Students can practice core counselling skills in a safe, feedback-rich environment before entering real sessions.
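To make that idea concrete, here is a minimal sketch of what an AI client persona could look like under the hood, assuming a system-prompted large language model accessed through OpenAI's chat API. The persona details, prompt, and model choice are my own illustration, not Behtar Foundation's actual implementation.

```python
# Minimal sketch: an LLM role-playing a therapy client for skills practice.
# Assumptions: the `openai` Python package and an OPENAI_API_KEY in the
# environment; the persona text below is invented for illustration only.
from openai import OpenAI

llm = OpenAI()

# A hypothetical client persona: background, presenting concerns, and
# behavioural cues the model should stay in character for.
PERSONA_PROMPT = """You are role-playing a therapy client for a trainee counsellor.
Persona: a 24-year-old student who recently moved cities, presenting with low mood
and difficulty sleeping. You are hesitant to open up at first and respond in
short sentences until the counsellor builds rapport. Never give advice or
break character; only respond as the client would."""

def client_reply(conversation: list[dict]) -> str:
    """Return the simulated client's next turn, given the conversation so far."""
    response = llm.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would do here
        messages=[{"role": "system", "content": PERSONA_PROMPT}] + conversation,
    )
    return response.choices[0].message.content

# Example turn: the trainee opens the session, the simulated client answers.
history = [{"role": "user", "content": "Hi, thanks for coming in today. How are you feeling?"}]
print(client_reply(history))
```

In a real training tool, the interesting work sits around a loop like this: richer personas, session memory, and structured feedback on the trainee's responses rather than just the client's replies.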
“Most AI mental health products today try to be the therapist, or adjacent to the therapist,” Jai says.
“We’re using AI to train the therapist instead.”
Images courtesy Jai Arora
Know more about Behtar Foundation’s work on their website. Behtar Foundation is open to collaborators.
Why India Needs AI-Assisted Training for Therapists
Jai’s motivation stems from a systemic gap: most psychology programs in India don’t emphasise practicum-based training.
Through his own research, he found that over 80% of Indian undergraduate programs in clinical psychology lack opportunities for students to practise their skills through role plays, supervised sessions, or micro-skill labs.
The result? Many graduates enter the field underprepared, often unsure of how to translate theory into practice.
This under-preparedness can have ripple effects, from therapist burnout to inconsistent client experiences.
That’s the gap AI clients aim to fill: giving students a place to practice before practice.
Read more about Jai’s work and the EMSFPP cohorts at kirana.counselling on Instagram. Kirana Counselling is open to enquiries about the next EMSFPP cohorts.
But AI Alone Isn’t Enough
Jai acknowledges that while AI can simulate realistic clients, it can’t yet replicate the human nuance of supervision, feedback, and mentorship.
So he’s building a bridge:
AI-supported skill training + human-led learning cohorts.
Under Kirana Counselling, Jai runs Essential Micro-Skills for Psychologists (EMSF-P), a series of cohort-based programs where students learn foundational skills and later use AI clients for practice and reinforcement.
Now heading into its fourth cohort, the program has already helped dozens of early-career therapists gain confidence and readiness to begin practice.
Kirana’s work was recently featured in an orientation talk at Montfort College, one of India’s leading institutes for counselling psychology.
Jai’s Message For Clinicians
I'm always looking for what drives clinicians to experiment with tech, what truly motivates them. I asked Jai exactly this, and here's his answer:
We’re in the middle of a technological revolution. I believe it is every clinician’s responsibility to understand what AI can (and cannot) do in our field.
Perhaps we can start small. Use ChatGPT to make one part of your workflow 1% easier. Watch a video. Take a course. You don’t have to build the next big tool or business, just solve one small problem that matters to you.
AI isn’t here to take our jobs; it’s here to transform them. Therapy will evolve, but the outcome of therapy, which is to help clients heal, doesn’t have to change.
My Thoughts
The AI conversation in the mental health space often feels repetitive, polarising, or just plain exhausting. The pace of change feels disorienting and destabilising for everyone (I include myself in that list!).
And yet, it’s clinicians like Jai, responding creatively, who are helping ground this transition in practice and ethics.
I believe experiments like his are vital. They explore possibilities that engineers or business leaders alone might never see.
Most importantly, they prove that the direction of mental health tech doesn’t have to be dictated solely by big tech or venture capital.
Of course, time will separate the ideas that yield meaningful outcomes from those that don’t.
But if more clinicians tinker, test, and teach with technology, we move closer to a future where AI enhances, not erodes, therapeutic integrity.
Tinkering And Need Help?
On the note of keeping the spirit of exploration alive, I have a question for you:
Are you a clinician exploring AI to improve skills, build tools, or reimagine a problem in your practice? Are you early on in trying an idea out, or deep into building something?
Need help, feedback, or just a sounding board?
I’d love to hear from you. I’m a trained product designer, a former product manager at an Indian unicorn, a tech consultant to therapists, and now a 2x founder in mental health tech. I can help you think through your idea with all these lenses in mind.
Reply to this email. Let’s feature your story next.
PS. I'm working with clinicians and machine learning experts to map the new skills therapists need to make their practice resilient.