Solace | Mental Health Crisis Support App

Research, design, and user testing for an AI-assisted mental health support platform

My Role

Designer

Team

Product Manager + Engineer

Duration

5 Months

Solace is a mobile app that provides real-time crisis support, personalized coping exercises, and access to mental health resources. I collaborated with a product team to design the app from concept to a fully interactive prototype. This process involved extensive research, user testing, multiple design iterations, and close collaboration with engineers to ensure a seamless and effective user experience.

Challenges:

Designing a highly sensitive and empathetic product in a deeply personal and complex field while balancing accessibility, privacy, and user trust.

Background

We started the project with a hypothesis that people experiencing a mental health crisis often lack immediate, discreet, and user-friendly support options outside of crisis hotlines. Many individuals hesitate to call hotlines due to stigma, privacy concerns, or anxiety about speaking to a stranger.

Solace solves this by providing a digital, AI-assisted mental health support system that offers real-time crisis guidance, personalized coping exercises, and a structured path to professional resources. The app integrates AI-driven check-ins, guided interventions, and a seamless support network—bridging the gap between crisis moments and professional care.

I also conducted market research to understand the industry:

“Mental health apps have seen rapid adoption, with the global mental wellness app market valued at $4.2 billion in 2021 and projected to grow at a CAGR of 15.9% from 2022 to 2030” (Grand View Research).

“Over 26% of adults in the U.S. experience a mental health disorder each year, yet more than half do not receive adequate support due to barriers like cost, availability, and stigma” (National Institute of Mental Health).

Solace leverages artificial intelligence to analyze user input, detect emotional distress patterns, and provide personalized crisis interventions in real time. The AI-driven system helps users identify the best coping strategies based on their emotional state, past behaviors, and urgency level—offering discreet, guided support when it’s needed most.

Research - Online Survey

To verify our hypothesis and understand the real challenges faced by individuals in crisis, I conducted an online survey to identify key pain points. Due to limited resources, I focused on targeted digital communities where people openly discuss mental health experiences and crisis support needs.

I developed a custom screener survey and distributed it through mental health forums, online support groups, and Reddit communities related to mental wellness, anxiety, and crisis intervention.

The survey included 13 questions, focusing on:

  • Barriers to seeking help (e.g., stigma, cost, lack of trust in existing resources)
  • Preferred support methods (AI chat, breathing exercises, professional guidance)
  • Past experiences with crisis hotlines & mental health apps
  • Features that would make them feel safer and more comfortable using an app

Since Solace aims to provide real-time, AI-driven crisis support, I also included questions about user comfort with AI-based mental health tools and their preferences for privacy, anonymity, and personalization.

From the survey, we discovered that:

  • Fear of judgment (41.2%), lack of immediate crisis options (32.4%), and discomfort with traditional hotlines (23.5%) are the biggest barriers to seeking mental health support.
  • The majority of users (61.8%) prefer a mobile app for crisis support over web-based platforms or messaging services.
  • Anonymity is a key factor, with 50% of users stating they would feel safer using an app without requiring sign-up.
  • People are open to AI-powered mental health support, with 41.2% trusting AI if backed by professionals and research, while 38.2% would consider it with more proof of effectiveness.
  • Users prefer mental health apps that provide immediate access to professional resources (52.9%) and guided crisis intervention exercises (47.1%).
  • Affordability is a concern, as 35.3% of users are willing to pay up to $5 per month for a premium version, while 26.5% would only use a free service.

Additional Insights from Open-Ended Responses:

  • "I want something that doesn’t require me to talk on the phone when I’m in distress."
  • "Anonymity is super important to me—I don’t want to feel like I’m being tracked."
  • "I would trust AI to an extent, but I need to know that real professionals are involved in its development."
  • "Sometimes I don’t feel comfortable calling a hotline, but I still need guidance in the moment."
  • "It would be helpful if the app suggested coping strategies based on my emotions rather than just giving generic advice."
  • "A daily check-in or mood tracker would be useful—not just for crisis moments, but to prevent reaching that point."
  • "I want to see clear privacy policies before I use an AI-assisted mental health app."

The survey validated the need for a mental health crisis support app that prioritizes anonymity, AI-driven emotional support, and immediate access to professional resources. Privacy, accessibility, and personalization emerged as the most crucial factors for users, highlighting the opportunity for Solace to fill a major gap in digital mental health solutions.

Design - Onboarding

Since Solace is designed for anonymous use, there is no login process. Instead, the onboarding flow focuses on immediate access to crisis support and emotional tracking, ensuring users get help quickly without barriers.

The onboarding process consists of two main phases:

Phase 1: Introduction Screens (Before First Use)

Since mental health apps require trust and clarity, Solace’s first-time user experience introduces its purpose across three concise screens.

Phase 2: First Daily Emotion Check-In + Guidance (After Introduction Screens)

Instead of a login screen, users are immediately taken to their first check-in, guiding them through Solace’s key features.

1. How Are You Feeling?

  • Users select their current emotion (e.g., Calm, Stressed, Overwhelmed, Anxious, Panicked).
  • Option to write a short reflection (optional journaling).

2. AI-Guided Support Based on Mood

  • If the user selects mild distress (e.g., "Stressed"), Solace suggests guided breathing or grounding exercises.
  • If the user selects severe distress (e.g., "Panicked"), Solace offers crisis resources immediately:
    • One-tap access to a crisis hotline
    • Trusted contact alert (if enabled)
    • Guided self-regulation exercise (e.g., grounding techniques, emergency meditation)

3. Complete Check-in & Continue

  • Users can return to the homepage or explore more mental health exercises.
  • Solace remembers past mood trends (anonymously) and adjusts future AI support accordingly.
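The mood-based routing described above can be sketched as a small decision function. This is purely an illustrative sketch: the emotion labels come from the check-in screen, but the mild/severe grouping and the exact option strings are my assumptions, not Solace’s production logic.

```python
# Hypothetical sketch of the check-in routing described above.
# The mild/severe mapping and option wording are illustrative assumptions.

MILD_DISTRESS = {"Stressed", "Anxious", "Overwhelmed"}
SEVERE_DISTRESS = {"Panicked"}

def suggest_support(emotion: str) -> list[str]:
    """Return the support options shown after an emotion check-in."""
    if emotion in SEVERE_DISTRESS:
        # Severe distress: surface crisis resources immediately.
        return [
            "One-tap crisis hotline",
            "Trusted contact alert (if enabled)",
            "Guided self-regulation exercise",
        ]
    if emotion in MILD_DISTRESS:
        # Mild distress: suggest calming exercises first.
        return ["Guided breathing exercise", "Grounding exercise"]
    # No distress reported: offer the optional reflection instead.
    return ["Optional journaling / reflection prompt"]
```

The key design point captured here is that severity, not just emotion, drives the response: the same check-in screen can lead either to gentle self-help or straight to crisis resources.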

Design - Guided AI Crisis Support

Guided AI Crisis Support provides real-time, step-by-step assistance when users are experiencing distress. The feature quickly assesses the user’s emotional state and suggests an appropriate coping strategy, such as grounding exercises, breathing techniques, or emergency resource connections.

Before: Initial Guided AI Crisis Support Design

In the first iteration, the AI crisis support feature worked as follows:

  1. User selects “I Need Help Now.”
  2. AI suggests a coping strategy based on the selected distress level.
  3. User engages in a structured exercise (e.g., deep breathing, 5-4-3-2-1 grounding technique).
  4. Session ends, and users are given the option to contact professional help (e.g., crisis hotline).

Initial User Feedback:

  • Users appreciated the step-by-step guidance but wanted more support options after the session beyond just professional help.
  • Users did not understand why certain coping strategies were recommended—the AI suggested exercises without any explanation, which made them feel impersonal.
  • Some users wanted a middle-ground option between self-help exercises and calling a crisis hotline, such as a trusted contact feature or location-based mental health resources.

After: Improved Guided AI Crisis Support Design

Based on user testing and feedback, we refined the Guided AI Crisis Support feature to make it more personalized, transparent, and comprehensive.

  1. More post-session support options: call a crisis hotline (as before), notify a trusted contact, or browse location-based crisis resources.
  2. The AI now explains why each strategy is recommended, so suggestions feel informed rather than impersonal.
  3. Multiple coping strategies are offered, letting users choose between techniques instead of following one fixed path.
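The three refinements above can be sketched together in a minimal data model: each strategy carries its rationale, and the session ends with the expanded set of post-session options. The strategy names, explanation wording, and function names are all hypothetical, chosen only to illustrate the structure.

```python
# Illustrative sketch of the improved crisis-support flow.
# Strategy names, explanations, and option wording are assumptions.

# Each coping strategy is paired with the reason it is recommended,
# addressing the feedback that unexplained suggestions felt impersonal.
STRATEGIES = {
    "Deep breathing": "Slowing your breath lowers your heart rate, easing panic symptoms.",
    "5-4-3-2-1 grounding": "Focusing on your senses interrupts spiraling thoughts.",
    "Emergency meditation": "A short guided audio helps you regain a sense of control.",
}

# Expanded post-session options: a middle ground now exists between
# self-help exercises and calling a hotline.
POST_SESSION_OPTIONS = [
    "Call a crisis hotline",
    "Notify a trusted contact",
    "View location-based crisis resources",
]

def present_strategies() -> dict[str, str]:
    """Offer multiple strategies, each with its rationale, for the user to choose from."""
    return dict(STRATEGIES)

def end_session() -> list[str]:
    """Return the post-session support options shown after an exercise."""
    return list(POST_SESSION_OPTIONS)
```

Modeling the explanation as part of the strategy itself (rather than bolting it on in the UI) makes the "why this suggestion" feedback a first-class requirement of the flow.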

User Testing

To evaluate the overall usability, effectiveness, and clarity, we conducted a comprehensive user testing session covering all major features:

  • Onboarding & First-Time User Experience
  • Daily Emotion Check-in & AI Mood Tracking
  • Guided AI Crisis Support
  • Post-Crisis Support Features (Trusted Contact, Location-Based Resources, Journaling)

The goal was to determine whether users found the platform intuitive, supportive, and effective in both crisis situations and long-term emotional tracking.

Test Plan

We conducted moderated in-person user testing with six participants to evaluate the usability and effectiveness of the platform’s mobile experience. Each participant completed four key tasks on mobile devices to assess the onboarding process, daily emotion check-ins, crisis support flow, and post-session features.

Tasks for Users:

  • Open the app and navigate through the onboarding screens. What do you think this app does?
  • Complete a Daily Emotion Check-in and describe whether the AI’s response feels helpful and relevant.
  • Use Guided AI Crisis Support and follow through with a suggested coping strategy. Did it feel effective and intuitive?
  • Explore the post-session options. Are they clear? Would you use them in a real-life scenario?

Each session lasted 30 minutes, with screen recording and audio feedback collected for analysis to identify usability challenges and areas for improvement.

Participant Recruitment

We recruited six users from mental health communities and digital wellness platforms. Participants were selected based on the following criteria:

  • Have experienced emotional distress in the past
  • Prefer digital mental health support over traditional phone hotlines
  • Comfortable testing an AI-driven mental health tool
  • Diverse in age, gender, and familiarity with mental health apps

Recruitment was conducted via online outreach, with users voluntarily participating to provide feedback.

User Testing Sessions

Findings & Recommendations

Key Strengths
  • Users found onboarding simple and clear, with minimal friction in accessing support.
  • Daily Emotion Check-ins helped users reflect on their mental state, and AI responses felt supportive.
  • Guided AI Crisis Support was highly effective, keeping users engaged through step-by-step coping exercises.
  • Trusted Contact and Location-Based Mental Health Resources were well-received, providing users with post-crisis support options.
Key Areas for Improvement
  • Some users felt AI-generated responses lacked warmth and wanted a more conversational, human-like tone.
  • Explanations for coping strategies needed more clarity, as users were unsure why certain methods were suggested.
  • A few users felt overwhelmed with too many post-crisis options and suggested a simplified flow for immediate relief.
  • Privacy settings were not immediately noticeable—users wanted clearer reassurance that their data was truly anonymous.

Overall Metrics

Conclusion & Learning

Solace is one of the most meaningful projects I’ve worked on, as it gave me the opportunity to design for a deeply personal and sensitive user experience. Building an app focused on mental health crisis support came with unique challenges—ensuring the interface felt calming, the AI responses felt human, and the support options were intuitive and accessible.

One of the biggest learning experiences was balancing AI-driven support with user autonomy. While the AI-guided coping strategies provided structure and guidance, users wanted more control over their experience, leading to iterative improvements in how personalization, pacing, and explanation of strategies were incorporated.

I also learned how crucial privacy and trust are in designing for mental health. Many users expressed concerns about data security and anonymity, and refining the onboarding experience to clearly communicate Solace’s privacy policies was an important takeaway.

It’s rewarding to see that the design is making a real impact. Based on user testing and feedback, the combination of Guided AI Crisis Support, Daily Emotion Check-ins, and Post-Crisis Support Features has been well-received. Users have shared that Solace makes them feel less alone in distressing moments, which is one of the most fulfilling outcomes I could hope for in a design project.

This project has reinforced my passion for designing with empathy, understanding user psychology, and leveraging AI in a way that feels personal rather than mechanical. I’m excited to continue refining Solace, ensuring it becomes an even more effective, accessible, and supportive tool for those who need it most.