AI-Mediated Mental Health Disclosure
This project explores the design space for AI-mediated disclosure of mental health and wellbeing among youth, investigating how user-AI conversations can be transformed into a collaborative tool that empowers users to share their mental health concerns with care providers for timely care and support.

Youth in the United States are facing a growing mental health crisis [1]. However, many adolescents do not seek timely support due to barriers such as difficulty articulating their emotional experiences, uncertainty about when or how to ask for help, and concerns about social stigma [2]. At the same time, recent studies show that one in eight adolescents uses generative AI chatbots (e.g., ChatGPT, Gemini) to seek mental health advice because they are accessible, immediate, and perceived as private [3,4]. Despite this growing use, AI-mediated conversations remain isolated from adolescents’ formal care circle and are vulnerable to hallucinated or potentially harmful guidance [5]. This presents a critical opportunity to design AI systems that act as communication bridges, enabling youth to share mental health concerns with trusted care providers and receive timely support.
Existing research on AI and mental health highlights its potential for screening, psychoeducation, and structured therapeutic interactions [6,7]. However, most work treats AI as a single-user system and pays limited attention to interaction design challenges such as user roles, disclosure boundaries, interaction modalities, and coordination among multiple stakeholders, all of which are critical in collaborative care contexts [8]. As a result, there is limited design knowledge about how AI systems might function as privacy-preserving mediators of mental health disclosure that support user agency, shared understanding, and trust across care networks. To address this gap, this project explores the design space for AI-mediated disclosure of mental health and wellbeing among youth, examining how user-AI conversations can be transformed into a collaborative tool that empowers users to share their mental health concerns with care providers and receive timely care and support.
References
- [1] Centers for Disease Control and Prevention. Youth mental health: The numbers, November 29, 2024. Accessed December 12, 2025.
- [2] Jerica Radez, Tessa Reardon, Cathy Creswell, Faith Orchard, and Polly Waite. Adolescents’ perceived barriers and facilitators to seeking and accessing professional help for anxiety and depressive disorders: a qualitative interview study. European Child & Adolescent Psychiatry, 31(6):891–907, 2022.
- [3] Ryan K McBain, Robert Bozick, Melissa Diliberti, Li Ang Zhang, Fang Zhang, Alyssa Burnett, Aaron Kofner, Benjamin Rader, Joshua Breslau, Bradley D Stein, et al. Use of generative AI for mental health advice among US adolescents and young adults. JAMA Network Open, 8(11):e2542281, 2025.
- [4] Hannah R Lawrence, Renee A Schneider, Susan B Rubin, Maja J Matarić, Daniel J McDuff, and Megan Jones Bell. The opportunities and risks of large language models in mental health. JMIR Mental Health, 11(1):e59479, 2024.
- [5] Scott Monteith, Tasha Glenn, John R Geddes, Peter C Whybrow, Eric Achtyes, and Michael Bauer. Artificial intelligence and increasing misinformation. The British Journal of Psychiatry, 224(2):33–35, 2024.
- [6] Raluca Balan and Thomas P Gumpel. ChatGPT clinical use in mental health care: Scoping review of empirical evidence. JMIR Mental Health, 12:e81204, 2025.
- [7] A. Thieme, M. Hanratty, M. Lyons, J. Palacios, R. F. Marques, C. Morrison, and G. Doherty. Designing human-centered AI for mental health: Developing clinically relevant applications for online CBT treatment. ACM Transactions on Computer-Human Interaction, 30(2):1–50, 2023.
- [8] Andreas Bucher, Sarah Egger, Inna Vashkite, Wenyuan Wu, and Gerhard Schwabe. “It’s not only attention we need”: Systematic review of large language models in mental health care. JMIR Mental Health, 12(1):e78410, 2025.