#14 | TinT Labs | Dear Clinicians: A Letter from AI PhDs



Dear Clinician: A Letter from AI PhDs

Hello reader,

It's a sunny Sunday in my corner of the world!

Today we continue TinT Labs' five-part special series, co-written by two illustrious PhD researchers who study AI and mental health.

And right now, it's time for part 2.


When we were brainstorming for this piece, my nudge to Aseem (Postdoc) and Vasundhra (PhD) was simple:

Knowing what you know about AI and its evolution, do you have a message for mental healthcare providers?

What we have for you today is a message born of years of research, professional experience, deep reflection, and a vision for a safer future.

Dear Clinician,

AI is being pitched as the next big thing in therapy and mental health care. But before you accept any claim, pause and ask the most important question:

What, exactly, is being automated?

You don’t need to decode every algorithm or read every line of code to evaluate AI-powered tools critically.

What you do need is a socio-technical lens — an understanding of how data, design choices, and cultural assumptions shape the technology being sold to you and your patients.

As a caregiver in an increasingly algorithmic world, here’s how to develop a socio-technical lens:

  • Data practices matter. How do companies offering AI-powered mental health support collect and process information? Whose voices are represented — and whose are excluded?
  • Modeling choices are not neutral. AI tools reflect the social, cultural, and linguistic biases of their designers as much as the data they are trained on.
    What comprises the training data for this product?
    Who helped shape it?
  • Privacy and security are just the beginning. Algorithmic risks go deeper, intertwining with culture and context in ways that technical safeguards alone cannot fix.
    Who is this product being marketed and sold to?
    Who is using it? What is the intended use, and how does it differ from the way the product is realistically being used?

Ask sharper questions — not only whether these tools “replace therapists,” but what they realistically achieve, for whom, and at what cost.

Who do AI systems serve well? Who might they leave behind?

Your patients are already experimenting with AI tools, whether you endorse them or not. The more you understand what these systems do — and how they do it — the better you can contextualize, guide, and protect those in your care.

This awareness stretches from surface-level features like chatbot 'personas' and scripted empathy, to deeper systemic issues like biased outputs, inappropriate responses, or dangerously persuasive and inaccurate advice.

Finally, acknowledge that biases in therapy can get translated into biases in AI models.

Algorithmic bias isn’t magic; it grows from the same social biases that affect humans, compounded by design decisions and modeling frameworks.

In short: stay curious, stay critical, and stay informed. AI may change how care is delivered, but it’s your expertise — not an algorithm — that keeps mental health services in any form humane, contextual, and safe.

At your service,
Researchers Aseem and Vasundhra

Aseem Srivastava investigates how large language models can be engineered not just for accuracy, but also for cultural and psychological sensitivity in real-world counseling interactions. He’s currently a postdoc at MBZUAI in Abu Dhabi. He completed his PhD at IIT-Delhi, India.

Vasundhra Dahiya works in Critical AI studies and algorithmic accountability. Informed by a socio-technical lens, her research focuses on understanding how cultural values, AI design, and AI policy shape each other. She is a doctoral researcher at IIT-Jodhpur, India.


Think of a person who would be interested in AI, therapy, and the future of mental health. Would they like to read this piece?

This newsletter is free, and by subscribing you tell us that you're interested and want to know more!

📬 Support us by subscribing here.

💬 Connect with me, Harshali on LinkedIn.

See you soon,
Harshali
Founder, TinT


The Technology Informed Therapist
