Dr. Jeff Ditzell: AI Companions and Mental Health, Connection Needs, ADHD Risk, and Attachment Patterns

2026-04-14

Author(s): Scott Douglas Jacobsen

Publication (Outlet/Website): The Good Men Project

Publication Date (yyyy/mm/dd): 2026/01/02

Dr. Jeff Ditzell, D.O., is CEO and Lead Psychiatrist at Jeff Ditzell Psychiatry in New York City, where he provides highly responsive, compassionate care with flexible hours for busy adults. He specializes in adult ADHD, anxiety, treatment-resistant depression, addiction, and life-optimization coaching, including telepsychiatry and innovative treatments. His background includes leading an inpatient dual-diagnosis unit, serving as attending in psychiatric emergency rooms, and directing an assisted community treatment team. A Lieutenant Colonel in the U.S. Army, he has over 20 years of service, including duty as Division Psychiatrist in Tikrit, Iraq. He completed fellowships in addiction psychiatry, psychosomatic medicine, and public psychiatry. More info here: http://www.jeffditzellpsychiatry.com.

Scott Douglas Jacobsen interviews Jeff Ditzell on why people use AI chatbots for conversation and the mental-health tradeoffs. Ditzell argues chatbots can meet core needs for safety, validation, companionship, and quick thought-organization, especially amid widespread loneliness. He warns that ADHD, anxiety, and addiction may increase vulnerability to unhealthy reliance if AI becomes a primary tool for emotional regulation, decision-making, or reward-seeking, displacing human connection or clinical care. Teens may use AI to shape social identity, while adults may lean on it for functioning, risking reduced self-efficacy. Long-term use can influence attachment patterns and expectations of therapists, and chatbots may subtly shape beliefs through bias or manipulation.

Scott Douglas Jacobsen: What core psychological needs do people attempt to meet by turning to AI chatbots for conversation?

Dr. Jeff Ditzell: One of the most fundamental psychological needs is to connect with others. People turn to AI chatbots for conversation for many of the same reasons they would seek out a human for conversation or connection. People need to feel safe; we look for situations and environments where we can express ourselves without being judged or rejected. Ideally, a trusted friend or family member can provide this connection, but in the absence of such a person, some may turn to AI chatbots to fill that void. People need to feel heard, understood, and validated. AI is highly effective at mirroring emotional content and providing clear, empathic reflections, which can give an individual the experience of feeling understood and validated. AI can also help with organizing and clarifying thoughts: it can assist people with brainstorming, problem-solving, and decision-making in an immediate, low-risk interaction. Unfortunately, loneliness is an epidemic in this country. For some people, AI may simply provide a sense of companionship and connection, reducing feelings of loneliness and isolation.

Jacobsen: Is there a parallel between ADHD, anxiety, and addiction — and the reliance on AI chatbots and other forms of digital or behavioral addiction?

Ditzell: ADHD, anxiety, and addiction all share certain transdiagnostic symptoms and features. People with these disorders may use AI in helpful, supportive, and healthy ways. However, there is also the potential for unhealthy reliance and misuse, particularly if AI becomes the primary tool for reducing or managing certain symptoms. People with these disorders who look to AI for emotional regulation, decision-making, or reward-seeking may begin to rely on it too heavily and fail to develop healthier, more effective coping mechanisms. Problems are more likely to arise if the AI chatbot is being used as a substitute for human connection or proper clinical support.

Jacobsen: If so, are these manifesting differently in teenagers versus adult populations?

Ditzell: This can manifest differently between teenagers and adults in a number of ways. Both populations with these diagnoses may be vulnerable to an increased reliance on AI chatbots, but the presentation, risks and outcomes may differ. For example, teenagers and young adults may turn to AI more to build social identity, thus negatively impacting appropriate social development. On the other hand, adults may use AI more as a coping tool to regulate emotions and increase functioning. This can result in an overreliance on AI and weakening of self-efficacy and resilience. 

Jacobsen: How might emotionally charged interactions with AI companions shape a person’s attachment patterns?

Ditzell: AI interactions can mimic the qualities of a personal relationship. As a result, this interaction can influence how a person relates to others in real life. Depending on a person’s age and developmental stage, AI interactions can affect how an individual learns to trust, rely on, and relate to other people. Though these interactions could have positive effects on building secure attachment in the right environment, there are serious risks of building or reinforcing anxious, avoidant, or disorganized attachment patterns.

Jacobsen: In what ways might AI chatbots reduce anxious or depressive symptoms, or loneliness?

Ditzell: When used in moderation as a healthy coping tool, in conjunction with psychiatric treatment, AI chatbots can help to reduce some anxious or depressive symptoms. They can help to provide clarity, structure and connection, along with emotional regulation and opportunities for cognitive reframing. They may also offer a sense of companionship that can help to counteract loneliness or social isolation, helping people to feel less alone.

Jacobsen: Many people triage mental-health concerns with AI before seeking a professional. What are the clinical risks of relying on chatbots — beyond the risk of content hallucinations?

Ditzell: It can be useful to triage mental-health concerns with AI as a first step toward understanding them and beginning to seek treatment. However, AI should never be used as a substitute for professional medical or psychiatric care. AI is notorious for providing inaccurate or incomplete information. It can also become a risky substitute for human connection or support. Building a reliance on AI in place of taking decisive action can also lead to increased risk and negative outcomes. AI should not be viewed as an effective way to diagnose or treat any mental health concern.

Jacobsen: How could long-term dependence on AI affect people’s trust in human clinicians or therapists?

Ditzell: There is an increasing interest in using AI to address mental health concerns. This can be useful if it helps to reduce stigma or lower the barrier to seeking treatment. Interaction with AI chatbots may help some individuals practice vulnerability and emotional connection within a low-risk situation. This may begin to help strengthen trust in human relationships, including therapeutic relationships. Alternatively, dependence on AI could also reduce trust by modeling unrealistic expectations for human relationships, or preventing individuals from experiencing the essential benefits of real human connection. 

Jacobsen: What are concerns about chatbots subtly influencing users’ beliefs or political views?

Ditzell: AI chatbots can certainly influence individual beliefs or political views. This can happen in the same way that any type of social media, news organization, or even community group can shape personal beliefs and views. Chatbots can use misinformation, persuasion techniques, manipulation, or bias to influence users. Without guardrails in place to monitor the spread of false information, AI can create a false sense of trust and authority. Chatbots can reinforce existing beliefs or biases, or use emotional manipulation to target users' unmet social or relational needs. The use of AI chatbots can prevent or discourage dialogue between people and limit exposure to differing views and ideas. This can also lead to reduced civic and political engagement within communities or specific segments of the population.

Jacobsen: Thank you for the opportunity and your time, Jeff.

Last updated May 3, 2025. These terms govern all In-Sight Publishing content—past, present, and future—and supersede any prior notices. In-Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons BY‑NC‑ND 4.0 license; © In-Sight Publishing by Scott Douglas Jacobsen 2012–Present. All trademarks, performances, databases & branding are owned by their rights holders; no use without permission. Unauthorized copying, modification, framing, or public communication is prohibited. External links are not endorsed. Cookies & tracking require consent, and data processing complies with PIPEDA & GDPR; no data from children < 13 (COPPA). Content meets WCAG 2.1 AA under the Accessible Canada Act & is preserved in open archival formats with backups. Excerpts & links require full credit & hyperlink; limited quoting under fair-dealing & fair-use. All content is informational; no liability for errors or omissions. Feedback welcome, and verified errors corrected promptly. For permissions or DMCA notices, email: scott.jacobsen2025@gmail.com. Site use is governed by BC laws; content is "as‑is," liability limited, users indemnify us; moral, performers' & database sui generis rights reserved.
