How to Use AI Chatbots Wisely: A DBT-Informed Guide

There's a moment the Mad in America article about AI chatbots describes so perfectly it almost hurts to read: a grad student named Anna, stuck between therapy appointments, turns to ChatGPT to help her figure out what to say to her parents after a painful phone call. She is stunned by how helpful it feels.

Millions of people have had a version of that moment. Anna wasn't wrong to try it. What she discovered, almost by accident, is one of the genuinely useful things AI chatbots can do. She used the tool to help her prepare for a hard conversation, not to replace the deeper work of understanding herself.

That distinction matters enormously, and it's one that DBT skills can help you navigate.

The Chatbot Question No One Is Asking the Right Way

Most of the current debate about AI in mental health falls into two camps: "chatbots are transforming access to care" or "chatbots are dangerous and irresponsible." Both camps have evidence on their side, and both are also missing the point for a lot of people.

Recent data suggests that roughly one in eight adolescents and young adults in the US is already using AI chatbots for mental health advice. They're not waiting for the debate to be resolved. They're using the tools available to them because they're accessible, immediate, free, and don't require a six-week wait for an appointment.

The more useful question isn't whether people should use AI chatbots. It's how to use them in a way that actually helps.

Preparation, Not Processing

When Anna asked ChatGPT for advice about her parents, she was essentially doing something DBT practitioners recognize immediately: DEAR MAN work, a core interpersonal effectiveness skill for expressing needs and navigating difficult conversations clearly.

DEAR MAN stands for Describe, Express, Assert, Reinforce, Mindful, Appear Confident, Negotiate. It's a structured way to think through a hard conversation before you have it, exactly the kind of scaffolding that a patient AI prompt can help you build.

Anna didn't know she was using a DBT skill. She intuitively used the chatbot as a preparation tool, not a therapy replacement. She role-played the conversation, thought through her parents' perspective, gained a little clarity, and was better equipped to actually have the interaction.

That is a legitimate and effective use of the technology.

The Chatbot Advocate Exercise: What It Reveals About AI's Limits

Here's something interesting the Mad in America piece surfaced: when Anna asked ChatGPT to advocate on her behalf, to essentially take her side and argue her case, the response was revealing. The chatbot, trying to be helpful, reflected back her perspective with full validation. There was no pushback, no "have you considered how your parents might be experiencing this?" Just cheerful agreement.

This is where things get instructive. One of the most powerful aspects of DBT, and one that makes it genuinely different from pure venting, is the concept of dialectics: holding two seemingly opposite truths at the same time. Your feelings are valid and the other person also has a valid perspective. Both things can be true simultaneously.

An AI chatbot with no investment in your actual growth will, by design, mirror you back to yourself. It optimizes for your satisfaction, not your development. When you use it purely as a validation machine, you may feel better in the moment while missing the insight that would have helped you grow.

This is why DBT practitioners teach the skill of Checking the Facts: before acting on an emotional interpretation, you ask yourself whether your reading of the situation is actually accurate. Chatbots, if you're not careful, will confirm your interpretations rather than challenge them.

A Practical DBT Framework for Using AI Chatbots

The question is how to get the genuine benefits of AI-assisted reflection without falling into the validation trap. Here is a framework rooted in DBT principles.

  1. Use it for preparation, not processing. AI chatbots are genuinely useful for thinking through what you want to say before a hard conversation. You can ask the chatbot to help you apply DEAR MAN or GIVE, another DBT interpersonal effectiveness tool. Asking it to help you draft a message you want to send is skills rehearsal, and it works.
  2. Ask it to play devil's advocate. After getting the chatbot's initial response, explicitly ask: "What is the other person's likely perspective here?" or "What am I possibly missing?" This turns the chatbot from a mirror into a tool for practicing Checking the Facts and building Wise Mind, the DBT concept of accessing the balance point between emotional and logical thinking.
  3. Use it for psychoeducation, not diagnosis. AI is quite good at explaining concepts like "What is emotion dysregulation?" or "Can you explain the DBT skill of opposite action?" This is genuinely valuable. It becomes problematic when you start asking it to evaluate your mental health or assign meaning to your experiences because that's where the limits of the technology show most clearly.
  4. Notice the urge to loop. One pattern clinicians are watching carefully is the way chatbot conversations can turn into rumination loops, where you go over the same emotional content again and again, feeling understood in the moment, without any new insight or behavioral change. DBT distress tolerance techniques exist precisely to interrupt these loops. If you find yourself returning to the chatbot daily to re-process the same situation, that's a signal worth paying attention to.
  5. Let it be a bridge, not a destination. The mechanism of change in DBT isn't insight or validation; it's skills use. Multiple clinical trials have found that the degree to which participants actually practiced and applied DBT skills mediated reductions in depression, suicidal behavior, and emotional dysregulation. The chatbot can help you think about a skill, but the skill still has to be practiced in real life, with real people, in real situations.

Teach Back: How TheraHive's AI Practice Partner Turns Learning Into Doing

There is one more way to use AI for DBT skills practice that goes beyond anything a general-purpose chatbot can offer, and it's rooted in one of the most well-established principles in learning science: the best way to truly understand a skill is to teach it to someone else.

At TheraHive, we've been exploring exactly this idea. We built an AI Practice Partner named Nikki, a simulated 19-year-old college student who finds herself in a moment of crisis and needs someone to walk her through a DBT skill. That someone is you. Rather than asking AI to validate your feelings or solve your problem, you become the one doing the guiding, walking Nikki through the Pros and Cons distress tolerance skill step by step.

This is a fundamentally different dynamic from the chatbot-as-therapist model. The AI isn't telling you what to do or reflecting your emotions back at you. You are the one who has to recall the skill structure, ask the right questions, and help another person reach a real decision. That act of retrieval and application is precisely what builds durable competence, not just awareness.

It also sidesteps the validation trap entirely. You can't use Nikki as a mirror because she's the one in distress, and you're the one who needs to think clearly. If you've been curious about what it feels like to actually use a DBT skill rather than just learn about it, the AI Practice Partner is a good place to find out.

What AI Can't Give You

There is a warm and understandable appeal to the idea of an always-available, perfectly patient listener. For people who are isolated, overwhelmed, or simply can't access care, that appeal is real and worth honoring.

One of the consistent findings across the growing research on AI therapy tools is that they tend to deepen introspection while weakening connection. The very features that make them convenient (available at 2am, non-judgmental, no need to be vulnerable in front of another human) are also the features that make them limited. Real emotional growth, and the development of genuine interpersonal effectiveness, happens in relationship, when the stakes are real and the other person pushes back.

This is why the group format of online DBT skills programs is so distinctive. Research on group-based DBT skills delivery shows it can produce significant reductions in emotional dysregulation, not just because of the skills themselves, but because of the practice environment: learning alongside others, applying skills in real interactions, and experiencing the difference between understanding something and actually being able to do it.

The Deeper Question the AI Moment Is Asking Us

Anna's story resonates because it points to something real: there is a massive gap between the mental health support people need and what they can actually access. AI chatbots have rushed into that gap, and they're going to stay there. The question now is whether we help people use them as skill-builders (tools for preparation, reflection, and DBT skills practice) or let them become substitutes for the structured, evidence-based learning that actually changes behavior.

At TheraHive, we believe the answer isn't to be anti-technology. It's to be pro-skills. The more fluent you become in DBT emotional regulation skills, mindfulness, and distress tolerance, the more you're able to use any tool, including AI, intentionally rather than reactively. The goal isn't to find the perfect technology. The goal is to build a real skill set that serves you across every context your life presents.

That's what Anna was reaching for when she opened ChatGPT at 2am. She deserves a path to it that actually holds. If you're ready to build that foundation, TheraHive's online DBT skills groups offer a structured, evidence-based environment where the skills you've read about here become things you can actually do.

{{promo-banner-1}}

Sign Up for Our Newsletter