Solace | Mental Health Crisis Support App
Research, design, and user testing for an AI-assisted mental health support platform

Solace is a mobile app that provides real-time crisis support, personalized coping exercises, and access to mental health resources. I collaborated with a product team to design the app from concept to a fully interactive prototype. This process involved extensive research, user testing, multiple design iterations, and close collaboration with engineers to ensure a seamless and effective user experience.
Challenges:
Designing a highly sensitive and empathetic product in a deeply personal and complex field while balancing accessibility, privacy, and user trust.
Background
We started the project with a hypothesis that people experiencing a mental health crisis often lack immediate, discreet, and user-friendly support options outside of crisis hotlines. Many individuals hesitate to call hotlines due to stigma, privacy concerns, or anxiety about speaking to a stranger.
Solace solves this by providing a digital, AI-assisted mental health support system that offers real-time crisis guidance, personalized coping exercises, and a structured path to professional resources. The app integrates AI-driven check-ins, guided interventions, and a seamless support network—bridging the gap between crisis moments and professional care.
I also conducted market research to understand the industry:
“Mental health apps have seen rapid adoption, with the global mental wellness app market valued at $4.2 billion in 2021 and projected to grow at a CAGR of 15.9% from 2022 to 2030” (Grand View Research).
“Over 26% of adults in the U.S. experience a mental health disorder each year, yet more than half do not receive adequate support due to barriers like cost, availability, and stigma” (National Institute of Mental Health).
Solace leverages artificial intelligence to analyze user input, detect emotional distress patterns, and provide personalized crisis interventions in real time. The AI-driven system helps users identify the best coping strategies based on their emotional state, past behaviors, and urgency level, offering discreet, guided support when it’s needed most.
Research - Online Survey
To verify our hypothesis and understand the real challenges faced by individuals in crisis, I conducted an online survey to identify key pain points. Due to limited resources, I focused on targeted digital communities where people openly discuss mental health experiences and crisis support needs.
I developed a custom screener survey and distributed it through mental health forums, online support groups, and Reddit communities related to mental wellness, anxiety, and crisis intervention.
The survey included 13 questions, focusing on:
- Barriers to seeking help (e.g., stigma, cost, lack of trust in existing resources)
- Preferred support methods (AI chat, breathing exercises, professional guidance)
- Past experiences with crisis hotlines & mental health apps
- Features that would make them feel safer and more comfortable using an app
Since Solace aims to provide real-time, AI-driven crisis support, I also included questions about user comfort with AI-based mental health tools and their preferences for privacy, anonymity, and personalization.


From the survey, we discovered that:
- Fear of judgment (41.2%), lack of immediate crisis options (32.4%), and discomfort with traditional hotlines (23.5%) are the biggest barriers to seeking mental health support.
- The majority of users (61.8%) prefer a mobile app for crisis support over web-based platforms or messaging services.
- Anonymity is a key factor, with 50% of users stating they would feel safer using an app that does not require sign-up.
- People are open to AI-powered mental health support, with 41.2% trusting AI if backed by professionals and research, while 38.2% would consider it with more proof of effectiveness.
- Users prefer mental health apps that provide immediate access to professional resources (52.9%) and guided crisis intervention exercises (47.1%).
- Affordability is a concern, as 35.3% of users are willing to pay up to $5 per month for a premium version, while 26.5% would only use a free service.
Additional Insights from Open-Ended Responses:
- "I want something that doesn’t require me to talk on the phone when I’m in distress."
- "Anonymity is super important to me—I don’t want to feel like I’m being tracked."
- "I would trust AI to an extent, but I need to know that real professionals are involved in its development."
- "Sometimes I don’t feel comfortable calling a hotline, but I still need guidance in the moment."
- "It would be helpful if the app suggested coping strategies based on my emotions rather than just giving generic advice."
- "A daily check-in or mood tracker would be useful—not just for crisis moments, but to prevent reaching that point."
- "I want to see clear privacy policies before I use an AI-assisted mental health app."
The survey validated the need for a mental health crisis support app that prioritizes anonymity, AI-driven emotional support, and immediate access to professional resources. Privacy, accessibility, and personalization emerged as the most crucial factors for users, highlighting the opportunity for Solace to fill a major gap in digital mental health solutions.
Design Iteration 1 - Quick Prototype and User Interviews
After analyzing the survey results, the PM and I brainstormed key product features around three questions: how to provide real-time crisis support that feels human and responsive, how to ensure user privacy and security in an AI-driven mental health app, and how to differentiate Solace from traditional crisis hotlines and existing mental health apps.
Since Solace is a consumer-facing app, we wanted to involve user insights early to ensure it met real needs. Instead of building a complete user flow, we created a basic prototype showcasing the app’s core functions and conducted user interviews to gather feedback for future iterations.


I began by sketching wireframes on paper, then refined them into high-fidelity designs in Figma, and finally used InVision to build an interactive prototype.

To test the prototype, we reached out to online mental health communities and interviewed users via Zoom and anonymous surveys. Six people (three female, three male) participated in interviews. Most had previously sought mental health support but felt existing crisis options were too slow, impersonal, or intrusive.
Users responded positively to the app’s anonymity feature, saying it made them feel safer and more willing to seek help. The guided AI crisis support was also well-received: users liked having a structured approach rather than having to “figure out what to do” during distressing moments. Finally, users liked the journaling and check-in system, noting that it could help track emotions beyond crisis moments and keep distress from escalating.
The Big Concern:
“Will the AI really understand what I need?”
Many users were hesitant about relying on AI alone, fearing it might not provide relevant or truly helpful advice.
Another key insight from user feedback was that mental health crises don’t happen in isolation—many users said they wanted the app to account for their daily emotional patterns, not just crisis moments.
Users suggested that the app could track emotional trends over time and provide preventative guidance before they reached a crisis. They also wanted an optional "trusted contact" feature, a way to notify a friend or therapist when they needed help without making a direct call. Some users requested location-based mental health resources, such as nearby crisis centers or safe spaces where they could decompress.
Design Iteration 2 - Exploring New Opportunities
User feedback suggested that Solace could enhance user engagement by expanding beyond crisis intervention to serve as a comprehensive mental wellness companion. Rather than solely providing support during moments of distress, users expressed a need for continuous emotional tracking, proactive guidance, and seamless access to mental health resources.
Recognizing that no single mental health app currently combines real-time crisis support, AI-driven check-ins, and professional resources in one platform, I conducted a thorough analysis of user journeys across mental health services, therapy platforms, and self-care apps to pinpoint key gaps and opportunities for improvement.


User Flow
Based on insights from user interviews and brainstorming sessions, I created a user flow diagram to determine the most necessary features for Solace. After receiving feedback from the PM, I made several key revisions to the flow to enhance usability and accessibility.

Change 1: Predictive Crisis Support Instead of Manual Selection
- Before: Users had to manually select their distress level before receiving guidance.
- After: Instead of asking users to identify their emotional state during distress, Solace will use AI-driven mood tracking to preemptively detect emotional shifts and suggest support before users reach a crisis point.
For example, if a user’s journal entries or previous mood logs show a downward trend, Solace can proactively send a check-in notification or offer early-stage coping exercises to prevent escalation. Additionally, if AI detects a crisis situation (e.g., distress signals in typed responses), it will automatically suggest immediate crisis resources instead of requiring users to search manually.
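To make the trend idea concrete, here is a minimal sketch of how such a check might work, assuming check-ins are stored as numeric mood scores. The `MoodEntry` shape, window size, and threshold are illustrative assumptions, not Solace’s actual data model:

```typescript
// Hypothetical check-in record; Solace's real data model may differ.
interface MoodEntry {
  timestamp: number; // Unix ms
  score: number;     // 1 (panicked) .. 5 (calm), mapped from the selected emotion
}

// Flags a sustained downward trend by comparing the average mood of the most
// recent check-ins against the average of the window just before them.
function detectDownwardTrend(
  entries: MoodEntry[],
  windowSize = 3,
  dropThreshold = 1.0
): boolean {
  if (entries.length < windowSize * 2) return false; // not enough history yet

  const sorted = [...entries].sort((a, b) => a.timestamp - b.timestamp);
  const recent = sorted.slice(-windowSize);
  const prior = sorted.slice(-windowSize * 2, -windowSize);
  const avg = (xs: MoodEntry[]) =>
    xs.reduce((sum, e) => sum + e.score, 0) / xs.length;

  // A drop of one full point (on the 1-5 scale) triggers a proactive check-in.
  return avg(prior) - avg(recent) >= dropThreshold;
}
```

When the function returns true, the app could schedule a gentle check-in notification or surface an early-stage coping exercise, as described above.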
Change 2: Immediate Support First, Then Customization
- Before: Users had to navigate multiple options before receiving crisis support, choosing their preferred coping method manually.
- After: The app will immediately provide a recommended coping exercise, with an option for users to adjust or explore different methods later.
This ensures users in distress receive immediate relief before navigating options. Instead of making users choose among guided breathing, grounding techniques, and journaling, Solace will automatically recommend the most relevant method based on AI analysis of previous user behavior.
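As a rough illustration of how that recommendation could be driven by past behavior, the sketch below ranks coping methods by how often the user reported them as helpful. The method names and the `helped` feedback signal are assumptions for illustration only:

```typescript
// Coping methods named in the flow; the feedback signal is an assumption.
type CopingMethod = "breathing" | "grounding" | "journaling";

interface SessionRecord {
  method: CopingMethod;
  helped: boolean; // hypothetical post-session "did this help?" response
}

// Recommends the method with the best historical success rate for this user,
// defaulting to guided breathing when there is no history yet.
function recommendMethod(history: SessionRecord[]): CopingMethod {
  if (history.length === 0) return "breathing";

  const stats = new Map<CopingMethod, { helped: number; total: number }>();
  for (const s of history) {
    const entry = stats.get(s.method) ?? { helped: 0, total: 0 };
    entry.total += 1;
    if (s.helped) entry.helped += 1;
    stats.set(s.method, entry);
  }

  let best: CopingMethod = "breathing";
  let bestRate = -1;
  for (const [method, { helped, total }] of stats) {
    const rate = helped / total;
    if (rate > bestRate) {
      bestRate = rate;
      best = method;
    }
  }
  return best;
}
```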

Wireframing
After finalizing the updated user flow, I sketched wireframes on paper to visualize the core user experience.
Once the paper sketches were refined, I moved to Figma to create high-fidelity wireframes, representing the app’s structure and interaction flow.



Design - Onboarding
Since Solace is designed for anonymous use, there is no login process. Instead, the onboarding flow focuses on immediate access to crisis support and emotional tracking, ensuring users get help quickly without barriers.
The onboarding process consists of two main phases:
Phase 1: Introduction Screens (Before First Use)
Since mental health apps require trust and clarity, Solace’s first-time user experience introduces its purpose in three concise screens:

Phase 2: First Daily Emotion Check-In + Guidance (After Introduction Screens)
Instead of a login screen, users are immediately taken to their first check-in, guiding them through Solace’s key features.
1. How Are You Feeling?
- Users select their current emotion (e.g., Calm, Stressed, Overwhelmed, Anxious, Panicked).
- Option to write a short reflection (optional journaling).
2. AI-Guided Support Based on Mood (routing sketched in code after this list)
- If the user selects mild distress (e.g., "Stressed"), Solace suggests guided breathing or grounding exercises.
- If the user selects severe distress (e.g., "Panicked"), Solace offers crisis resources immediately:
- One-tap access to a crisis hotline
- Trusted contact alert (if enabled)
- Guided self-regulation exercise (e.g., grounding techniques, emergency meditation)
3. Complete Check-in & Continue
- Users can return to the homepage or explore more mental health exercises.
- Solace remembers past mood trends (anonymously) and adjusts future AI support accordingly.
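The mild-versus-severe routing described in this flow can be summarized in a short sketch. The emotion labels come from the check-in screen; the severity mapping and return shape are my own simplifications, not the product’s actual taxonomy:

```typescript
// Emotions from the check-in screen; the severity mapping is an assumption.
type Emotion = "Calm" | "Stressed" | "Overwhelmed" | "Anxious" | "Panicked";

type FirstResponse =
  | { kind: "exercise"; name: string }     // mild distress
  | { kind: "crisis"; actions: string[] }; // severe distress

// Routes a check-in to a gentle exercise or to immediate crisis resources.
function routeCheckIn(emotion: Emotion, trustedContactEnabled: boolean): FirstResponse {
  switch (emotion) {
    case "Calm":
    case "Stressed":
      return { kind: "exercise", name: "guided breathing" };
    case "Overwhelmed":
    case "Anxious":
      return { kind: "exercise", name: "5-4-3-2-1 grounding" };
    case "Panicked": {
      const actions = ["one-tap crisis hotline", "guided self-regulation exercise"];
      if (trustedContactEnabled) actions.push("alert trusted contact");
      return { kind: "crisis", actions };
    }
  }
}
```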

Design - Guided AI Crisis Support
Guided AI Crisis Support provides real-time, step-by-step assistance when users are experiencing distress. The feature quickly assesses the user’s emotional state and suggests an appropriate coping strategy, such as grounding exercises, breathing techniques, or emergency resource connections.
Before: Initial Guided AI Crisis Support Design
In the first iteration, the AI crisis support feature worked as follows (a simplified state sketch follows the list):
- User selects “I Need Help Now.”
- AI suggests a coping strategy based on the selected distress level.
- User engages in a structured exercise (e.g., deep breathing, 5-4-3-2-1 grounding technique).
- Session ends, and users are given the option to contact professional help (e.g., crisis hotline).
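Viewed as a tiny state machine, the flow looks something like the sketch below. The state names and transitions are illustrative, not the shipped implementation:

```typescript
// Session states mirroring the four steps above; a simplified sketch.
type SessionState = "needHelp" | "suggested" | "exercising" | "ended";

interface CrisisSession {
  state: SessionState;
  strategy?: string; // e.g., "deep breathing" or "5-4-3-2-1 grounding"
}

// Advances the session one step at a time, following the flow above.
function advance(session: CrisisSession, distressLevel: "mild" | "severe"): CrisisSession {
  switch (session.state) {
    case "needHelp": // user tapped "I Need Help Now"
      return {
        state: "suggested",
        strategy: distressLevel === "severe" ? "5-4-3-2-1 grounding" : "deep breathing",
      };
    case "suggested": // user accepts the suggested coping strategy
      return { ...session, state: "exercising" };
    case "exercising": // exercise completes; offer professional help next
      return { ...session, state: "ended" };
    case "ended": // session over; user may contact a crisis hotline
      return session;
  }
}
```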
Initial User Feedback:
- Users appreciated the step-by-step guidance but wanted more post-session support options beyond professional help.
- Users did not understand why certain coping strategies were recommended; the AI suggested exercises without explanation, which made the experience feel impersonal.
- Some users wanted a middle ground between self-help exercises and calling a crisis hotline, such as a trusted contact feature or location-based mental health resources.

After: Improved Guided AI Crisis Support Design
Based on user testing and feedback, we refined the Guided AI Crisis Support feature to make it more personalized, transparent, and comprehensive (the revised suggestion shape is sketched after this list):
- Added more post-session support options: call a crisis hotline (as before), notify a trusted contact, or browse location-based crisis resources
- The AI now explains why a strategy is recommended, so users understand the reasoning behind each suggestion
- Expanded the set of coping strategies, allowing users to choose among multiple techniques instead of following one fixed path
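To show how the transparency change might surface in an interface contract, here is a hypothetical suggestion payload that carries a plain-language rationale and the expanded post-session options. All field names are assumptions for illustration:

```typescript
// Hypothetical payload shapes; field names are illustrative assumptions.
interface StrategySuggestion {
  strategy: string;
  rationale: string;      // shown to the user so the pick never feels arbitrary
  alternatives: string[]; // lets users switch instead of following one fixed path
}

interface PostSessionOptions {
  hotline: boolean;         // call a crisis hotline (as before)
  trustedContact: boolean;  // notify a trusted contact
  nearbyResources: boolean; // location-based crisis resources
}

// Builds a suggestion, reusing what helped last time and explaining why.
function buildSuggestion(lastHelpfulStrategy: string | null): StrategySuggestion {
  const all = ["deep breathing", "5-4-3-2-1 grounding", "guided journaling"];
  const strategy = lastHelpfulStrategy ?? "deep breathing";
  return {
    strategy,
    rationale: lastHelpfulStrategy
      ? `Suggested because "${lastHelpfulStrategy}" helped you last time.`
      : "Suggested as a gentle starting point for a first session.",
    alternatives: all.filter((s) => s !== strategy),
  };
}
```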

User Testing

To evaluate the overall usability, effectiveness, and clarity of the design, we conducted a comprehensive user testing session covering all major features:
- Onboarding & First-Time User Experience
- Daily Emotion Check-in & AI Mood Tracking
- Guided AI Crisis Support
- Post-Crisis Support Features (Trusted Contact, Location-Based Resources, Journaling)
The goal was to determine whether users found the platform intuitive, supportive, and effective in both crisis situations and long-term emotional tracking.
Test Plan
We conducted moderated in-person user testing with six participants to evaluate the usability and effectiveness of the platform’s mobile experience. Each participant completed four key tasks on a mobile device to assess the onboarding process, daily emotion check-ins, the crisis support flow, and post-session features.
Tasks for Users:
- Open the platform and navigate through the onboarding screens. What do you think this app does?
- Complete a Daily Emotion Check-in and describe whether the AI’s response feels helpful and relevant.
- Use Guided AI Crisis Support and follow through with a suggested coping strategy. Did it feel effective and intuitive?
- Explore the post-session options. Are they clear? Would you use them in a real-life scenario?
Each session lasted 30 minutes, with screen recording and audio feedback collected for analysis to identify usability challenges and areas for improvement.
Participant Recruitment
We recruited six users from mental health communities and digital wellness platforms. Participants were selected based on the following criteria:
- Have experienced emotional distress in the past
- Prefer digital mental health support over traditional phone hotlines
- Comfortable testing an AI-driven mental health tool
- Diverse in age, gender, and familiarity with mental health apps
Recruitment was conducted via online outreach, with users voluntarily participating to provide feedback.
User Testing Sessions

Findings & Recommendations
Key Strengths
- Users found onboarding simple and clear, with minimal friction in accessing support.
- Daily Emotion Check-ins helped users reflect on their mental state, and AI responses felt supportive.
- Guided AI Crisis Support was highly effective, keeping users engaged through step-by-step coping exercises.
- Trusted Contact and Location-Based Mental Health Resources were well-received, providing users with post-crisis support options.
Key Areas for Improvement
- Some users felt AI-generated responses lacked warmth and wanted a more conversational, human-like tone.
- Explanations for coping strategies needed more clarity, as users were unsure why certain methods were suggested.
- A few users felt overwhelmed with too many post-crisis options and suggested a simplified flow for immediate relief.
- Privacy settings were not immediately noticeable—users wanted clearer reassurance that their data was truly anonymous.
Overall Metrics

Conclusion & Learning
Solace is one of the most meaningful projects I’ve worked on, as it gave me the opportunity to design for a deeply personal and sensitive user experience. Building an app focused on mental health crisis support came with unique challenges—ensuring the interface felt calming, the AI responses felt human, and the support options were intuitive and accessible.
One of the biggest learning experiences was balancing AI-driven support with user autonomy. While the AI-guided coping strategies provided structure and guidance, users wanted more control over their experience, leading to iterative improvements in how personalization, pacing, and explanation of strategies were incorporated.
I also learned how crucial privacy and trust are in designing for mental health. Many users expressed concerns about data security and anonymity, and refining the onboarding experience to clearly communicate Solace’s privacy policies was an important takeaway.
It’s rewarding to see that the design is making a real impact. Based on user testing and feedback, the combination of Guided AI Crisis Support, Daily Emotion Check-ins, and Post-Crisis Support Features has been well-received. Users have shared that Solace makes them feel less alone in distressing moments, which is one of the most fulfilling outcomes I could hope for in a design project.
This project has reinforced my passion for designing with empathy, understanding user psychology, and leveraging AI in a way that feels personal rather than mechanical. I’m excited to continue refining Solace, ensuring it becomes an even more effective, accessible, and supportive tool for those who need it most.