AI-Mediated Mental Health Disclosure

This project explores the design space for AI-mediated disclosure of mental health and wellbeing among adolescents, investigating how user-AI conversations can be transformed into a collaborative tool that empowers users to share their mental health conditions with care providers for timely care and support.

US adolescents are facing a growing mental health crisis [1]. Yet many do not seek timely support due to barriers including difficulty articulating emotional experiences, uncertainty about when and how to ask for help, and social stigma [2]. At the same time, a recent study shows that one in eight adolescents uses generative AI chatbots (e.g., ChatGPT, Gemini) for mental health advice, citing their accessibility, immediacy, and perceived privacy [3, 4]. However, these AI-mediated conversations remain isolated from adolescents’ formal care circles and are vulnerable to hallucinated or potentially harmful guidance [5]. This presents a critical opportunity to design AI as a communication bridge that empowers adolescents to share mental health concerns with care providers and receive timely support.

Current research on AI and mental health highlights its promise for screening, psychoeducation, and structured therapeutic interactions [6]. However, this work has largely treated AI as a single-user system, overlooking interaction design considerations that are essential for collaborative care, such as user roles, disclosure boundaries, interaction modalities, and coordination across multiple stakeholders [7]. As a result, there remains a significant gap in design knowledge around how AI systems might serve as privacy-preserving mental health disclosure mediators that support user agency, shared understanding, and trust while facilitating coordination across care networks, rather than functioning as isolated conversational agents. This project addresses that gap by examining how user-AI conversations can become a collaborative tool for adolescents to disclose their mental health conditions to care providers.

References

  1. Centers for Disease Control and Prevention. Youth mental health: The numbers. November 29, 2024. Accessed December 12, 2025.
  2. Jerica Radez, Tessa Reardon, Cathy Creswell, Faith Orchard, and Polly Waite. Adolescents’ perceived barriers and facilitators to seeking and accessing professional help for anxiety and depressive disorders: A qualitative interview study. European Child & Adolescent Psychiatry, 31(6):891–907, 2022.
  3. Ryan K McBain, Robert Bozick, Melissa Diliberti, Li Ang Zhang, Fang Zhang, Alyssa Burnett, Aaron Kofner, Benjamin Rader, Joshua Breslau, Bradley D Stein, et al. Use of generative AI for mental health advice among US adolescents and young adults. JAMA Network Open, 8(11):e2542281, 2025.
  4. Hannah R Lawrence, Renee A Schneider, Susan B Rubin, Maja J Matarić, Daniel J McDuff, and Megan Jones Bell. The opportunities and risks of large language models in mental health. JMIR Mental Health, 11(1):e59479, 2024.
  5. Scott Monteith, Tasha Glenn, John R Geddes, Peter C Whybrow, Eric Achtyes, and Michael Bauer. Artificial intelligence and increasing misinformation. The British Journal of Psychiatry, 224(2):33–35, 2024.
  6. Raluca Balan and Thomas P Gumpel. ChatGPT clinical use in mental health care: Scoping review of empirical evidence. JMIR Mental Health, 12:e81204, 2025.
  7. Andreas Bucher, Sarah Egger, Inna Vashkite, Wenyuan Wu, and Gerhard Schwabe. “It’s not only attention we need”: Systematic review of large language models in mental health care. JMIR Mental Health, 12(1):e78410, 2025.