# Project 1: MindEase

Improving Navigation Clarity and AI Trust in a Mental Health Application

Timeline: Jan 2025 – April 2025

Role: Sole UX Researcher (End-to-End Ownership)

Participants: 6 Young Adults (18-35)

Methods: Heuristic Evaluation, Moderated Usability Testing, Thematic Analysis, Accessibility Review


# About the Project

MindEase is a research-led mental wellbeing application redesign focused on improving structural clarity, accessibility, and perceived trust in AI-supported interactions.

Rather than introducing new wellbeing features, this project examined how navigation predictability, system feedback, and interaction transparency influence emotional comfort and sustained engagement.

# 1. Context & Core Problem

During early exploration of existing mental health applications, a consistent pattern emerged:

Users were not overwhelmed by a lack of features — they were overwhelmed by unclear structure.

Across evaluated applications, participants reported:

- Difficulty locating journaling tools
- Uncertainty about AI response reliability
- Hesitation during onboarding
- Confusion about whether actions were successfully completed

Ambiguous icon-only navigation, weak confirmation feedback, and inconsistent accessibility implementation increased cognitive load and reduced trust.

This project reframed the challenge:

User illustration asking a question

How might we improve emotional reassurance by strengthening structural clarity?

# 2. Research & Discovery

The project followed a structured research-through-design model.

**Heuristic Evaluation**

Multiple mental health applications were evaluated against established usability principles and accessibility best practices.

Recurring issues included:

- Poor visibility of system status
- Unclear navigation labelling
- Inconsistent interaction feedback
- Text contrast and scaling inconsistencies

Structural ambiguity was more damaging than missing features.
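
The contrast issues flagged during the accessibility review can be checked objectively against WCAG 2.1, which defines a contrast ratio in terms of relative luminance. A minimal sketch of that check (the colour values below are illustrative, not taken from the audited apps):

```python
# Contrast ratio check using the WCAG 2.1 formulas.
# Example colours are illustrative, not from the audited applications.

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour, per WCAG 2.1."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def channel(c: float) -> float:
        # Linearise each sRGB channel before weighting
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio
print(round(contrast_ratio("#000000", "#FFFFFF"), 1))  # 21.0
# A mid-grey on white (about 2.85, below the 4.5:1 AA threshold)
print(round(contrast_ratio("#999999", "#FFFFFF"), 2))
```

WCAG AA requires at least 4.5:1 for normal body text, so a mid-grey like `#999999` on white fails even though it can look acceptable in isolation, which is how such inconsistencies slip into shipped interfaces.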

Heuristic evaluation post-its

**Moderated Usability Testing**

Six participants completed structured tasks including:

- Onboarding
- Locating journaling features
- Interacting with AI chatbot support

Observed behaviours included:

- Navigation hesitation
- Re-checking tabs
- Extended search time for core tools
- Pauses before submitting AI responses

Participants frequently verbalised uncertainty, even when technically completing tasks correctly.

Usability test notes

**Thematic Analysis**

Interview transcripts revealed three dominant themes:

1. Structural ambiguity increases emotional hesitation
2. Predictable flows reduce anxiety
3. Clear feedback strengthens trust in AI interactions

This insight shifted the redesign focus from feature expansion to interaction clarity.
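
Once transcripts are coded, theme prevalence can be tallied with a short script; counting both excerpt frequency and participant spread helps distinguish shared themes from one-off remarks. The codes and data below are illustrative stand-ins, not the study's actual transcripts:

```python
from collections import Counter

# Illustrative (participant, code) pairs from coded transcript excerpts.
# Stand-in data only; not the study's real coding output.
coded_excerpts = [
    ("P1", "structural ambiguity"), ("P1", "predictable flows"),
    ("P2", "structural ambiguity"), ("P3", "clear feedback"),
    ("P4", "structural ambiguity"), ("P4", "clear feedback"),
    ("P5", "predictable flows"),    ("P6", "structural ambiguity"),
]

# How often each code appears across all excerpts
theme_counts = Counter(code for _, code in coded_excerpts)

# How many distinct participants mentioned each code:
# a theme raised by many participants is shared, not idiosyncratic
participants_per_theme = {
    code: len({p for p, c in coded_excerpts if c == code})
    for code in theme_counts
}

for code, n in theme_counts.most_common():
    print(f"{code}: {n} excerpts, {participants_per_theme[code]} participants")
```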

Thematic analysis mapping

# 3. Key Insight

Users did not need more functionality. They needed reassurance.

Structural ambiguity creates hesitation. Predictable interaction reduces anxiety.

# 4. Design Principles & Strategy

The redesign was guided by four core principles:

**Predictable Navigation**

Users should always know where they are and how to move forward.

**Reduced Cognitive Load**

Limit simultaneous decisions and visual noise.

**Transparent System Feedback**

Every action should produce clear, immediate confirmation of its outcome.

**Integrated Accessibility**

Accessibility embedded at the structural level, not retrofitted later.

# 5. Prototype Evolution

# 5.1 Onboarding Simplification

The onboarding experience was redesigned using progressive disclosure. Each screen introduces a single structured prompt with clear forward progression, reducing early decision fatigue.

Visual improvements include:

- Clear step indicators
- Single-call-to-action screens
- Reduced visual density
- Consistent confirmation transitions

Validation Results:

- 5/6 participants completed onboarding without clarification
- Reduced first-use hesitation
- Improved perceived flow clarity

Phone showing onboarding screens

# 5.2 Navigation Restructure

Icon-only navigation was replaced with clearly labelled bottom tabs. Persistent navigation across all screens improved spatial predictability and reduced search effort.

Hand holding phone

Key structural changes:

- Text-labelled navigation tabs
- Journaling repositioned into primary navigation
- Clear visual hierarchy
- Consistent tab positioning

Observed Improvements:

- Faster journaling discovery
- Reduced navigation re-checking
- Participants described the structure as "clearer" and "straightforward"

# 5.3 AI Transparency Improvements

The chatbot interface was redesigned to prioritise clarity over conversational novelty.

AI Bot Introduction

Rather than simulating therapeutic authority, the design emphasised transparency and structured guidance.

Enhancements include:

- Structured message formatting
- Clear user/system distinction
- Predictable response timing
- Reduced conversational ambiguity

Validation Indicators:

- Reduced hesitation before sending responses
- Improved perceived AI reliability
- Increased conversational confidence

AI Chat Interface

# 5.4 Journaling Accessibility

Journaling was repositioned within primary navigation and redesigned to reduce friction.

These changes directly addressed discoverability and cognitive effort challenges identified in testing.

Journaling flow screens

Enhancements include:

- Larger input fields
- Optional voice entry
- Immediate save confirmation
- Clear timestamp feedback

Observed Improvements:

- Easier access to journaling tools
- Positive responses to the save confirmation cues
- Reduced emotional effort through the voice entry option

# 5.5 Professional Support Integration

The redesign added a therapist search, profile, and booking flow so users can reach professional support directly from the app.

Therapist search and profile

The booking flow includes:

- Therapist profile overview
- Date/time selection
- Clear confirmation screen

This bridges self-guided support with visible professional access, reinforcing legitimacy without overwhelming users.

Booking and video call flow

# 6. Validation & Impact

Prototype validation demonstrated measurable behavioural improvements compared to baseline observations.

> Users were not overwhelmed by a lack of features — they were overwhelmed by unclear structure.

Observed Impact:

- Reduced onboarding hesitation
- Improved feature discoverability
- Increased navigation confidence
- Stronger perceived AI clarity
- Improved overall system trust perception

Participants consistently described the redesigned interface as:

- "Clear"
- "Structured"
- "Easy to follow"

# 7. Key Product Decision

A deliberate decision was made not to introduce additional wellbeing features during redesign.

Although research identified opportunities for expanded AI tracking and gamification, prioritising foundational
usability ensured structural friction was resolved first.

This prevented feature complexity from masking interaction weaknesses and positioned the system for scalable
future enhancement.

# 8. UX Capabilities Demonstrated

- End-to-end research ownership
- Translating qualitative insight into structural design decisions
- Designing AI interactions using transparency principles
- Connecting usability improvements to emotional trust
- Embedding accessibility at system level