
The problem
The gap between data and understanding
Most people do not struggle to get their results. They struggle to understand them. Clinical data arrives in charts and ranges but without meaning. In that gap between data and understanding, trust begins to erode.
Inside Levy Health, results existed as accurate but detached reports. Patients searched the internet and chatbots for clarity, often ending up more confused. Clinics spent hours answering questions that should have been effortless to understand.
The real problem was not the data. It was translation. The system needed a way to speak, to bridge clinical accuracy with human comprehension. That became the foundation for Levy AI.
Research & insights
Turning information into intelligence
I spoke with patients and clinicians and studied how people interact with their health data when left on their own. Most tools explain results, but few create understanding. People do not want to chat with a bot; they want to feel guided. They trust context more than convenience, and clarity more than automation.
The insight was simple but defining. A conversational system should not speak at users but with them. It must layer information, surface what matters, explain what confuses, and cite what it knows.
Competitive Analysis
To ground the design direction, I conducted an in-depth review of conversational and health platforms, mapping how they handle context, tone, safety, and credibility.
• ChatGPT showed linguistic fluency but limited medical reliability. It proved that intelligence without oversight creates risk.
• Claude demonstrated thoughtful tone and comprehension but lacked medical grounding. It highlighted the importance of empathy balanced with expertise.
• Medicus AI offered strong data visualization but limited conversational depth. It showed that accuracy alone does not build trust.
• Flo was emotionally resonant yet deliberately non-diagnostic. It reinforced that emotional safety must coexist with clarity.
• Perplexity excelled at transparency through sourced answers. It inspired visible citations and confidence cues within Levy AI.
• Health portals and patient dashboards were precise but passive. They demonstrated the need for guided understanding instead of static data.
Process
Building meaning before building interaction
The goal was never to create a chatbot. It was to design a system that could think in context and respond with care. Before any interface came to life, experience design work established how Levy AI should reason, speak, and behave.
Framing the system
Prototyping understanding
Defining principles for conversation
Framing the system
Levy AI was imagined as a translator. Every element needed to connect clinical accuracy with emotional intelligence. The design work began by mapping how users move from uncertainty to understanding. Tone, language, and structure were treated as design materials essential to the system’s foundation.
Prototyping understanding
I built low-fidelity dialogue flows, interaction models, and interface prototypes focused on clarity and usability. Each testing round informed both the tone of responses and the structure of screens. Users preferred short, layered content that offered choice and reduced information overload. Collaborating with clinical, UX, and NLP teams, I refined how Levy AI balanced openness, precision, and visual clarity.
Defining principles for conversation
To make the system consistent and safe, I defined a set of design principles that guided every decision.
• Speak with empathy
• Explain to empower
• Be accurate without overwhelming
• Always cite
• Guide through clarity
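Principles like these can be made enforceable rather than aspirational by encoding them in the response model itself. The sketch below is a hypothetical illustration (none of these names come from Levy AI's actual codebase): a layered response object that refuses to exist without at least one citation and keeps its first layer short.

```python
from dataclasses import dataclass, field

@dataclass
class Citation:
    source: str    # e.g. a clinical guideline or lab reference range
    excerpt: str

@dataclass
class LayeredResponse:
    """A conversational answer that starts short and expands on demand."""
    summary: str                                        # first, short layer shown to the user
    detail: str = ""                                    # expandable explanation
    citations: list[Citation] = field(default_factory=list)

    def __post_init__(self):
        # "Always cite": block uncited answers at construction time.
        if not self.citations:
            raise ValueError("every response must cite at least one source")
        # "Be accurate without overwhelming": cap the first layer.
        if len(self.summary) > 280:
            raise ValueError("summary layer exceeds 280 characters")
```

Making the constraint structural means no downstream screen or template can accidentally ship an uncited or overwhelming answer.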
Solution
Designing Conversation as Care
Levy AI turns static medical results into guided dialogue. It transforms data into understanding by meeting users where they are, not where the system is. Each element is designed to feel informed, intentional, and human.
Personalized pathways
Adaptive content logic tailors tone, vocabulary, and recommendations to each user’s diagnostic profile. The system uses modular conversation templates that adjust dynamically to context, maintaining consistency while personalizing guidance. This ensures equity in communication while preserving clinical integrity.
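One way to picture modular templates with adaptive tone is a shared template registry where each diagnostic context carries parallel wording variants, and the user's profile selects among them. This is a minimal hypothetical sketch, not Levy AI's implementation; the context names and fields are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    diagnostic_context: str   # e.g. "thyroid_panel" (hypothetical key)
    plain_language: bool      # prefer simpler vocabulary

# Modular templates: one shared structure, per-context wording variants.
TEMPLATES = {
    "thyroid_panel": {
        "clinical": "Your TSH of {value} {unit} is within the reference range.",
        "plain":    "Your thyroid level ({value} {unit}) looks normal.",
    },
}

def render(profile: Profile, value: float, unit: str) -> str:
    """Pick the template variant that matches the user's profile."""
    variants = TEMPLATES[profile.diagnostic_context]
    key = "plain" if profile.plain_language else "clinical"
    return variants[key].format(value=value, unit=unit)
```

Because every variant fills the same slots, the structure stays consistent while tone and vocabulary adapt, which is the equity-with-integrity trade-off described above.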

Clinically grounded logic
Every response is validated through a controlled content management system connected to clinical data sources. Design and engineering collaborated to implement traceable content mapping, allowing medical experts to audit and approve outputs. This safeguards the conversational model while preserving UX flow integrity.

System for scale
Levy AI’s conversational system extends across the full interaction flow, from active chats to history management. Each view reuses the same modular components and tone models, allowing new medical domains to be added without redesign. The shared framework ensures consistent structure, language, and interaction logic across the product while integrating seamlessly with existing architecture.
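The "new domains without redesign" property can be illustrated with a small component registry: every view, whether an active chat or a history screen, is just a sequence of shared components, and a new medical domain only contributes data. The component names below are invented for the sketch.

```python
# Shared conversational components reused across views (hypothetical names).
COMPONENTS = {
    "result_card": lambda data: f"[{data['test']}] {data['value']} {data['unit']}",
    "explanation": lambda data: f"What this means: {data['text']}",
}

def render_view(blocks: list[tuple[str, dict]]) -> str:
    """Any view (active chat, history) is a sequence of shared components."""
    return "\n".join(COMPONENTS[name](data) for name, data in blocks)

# Adding a domain means supplying new data, not new components or layout.
chat_view = render_view([
    ("result_card", {"test": "AMH", "value": 1.8, "unit": "ng/mL"}),
    ("explanation", {"text": "AMH reflects ovarian reserve."}),
])
```

Because structure lives in the components rather than in the views, language and interaction logic stay consistent wherever a component appears.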
What changed
Usability studies and pilot feedback showed clearer user understanding and stronger engagement with results.
Highlights
• Improved comprehension of results among users
• Reduced time spent by the care team on follow-up explanations
• Higher engagement within the results experience
• Users described the tone as clear, precise, and approachable
What it led to
Outcomes
• Levy AI evolved from a results feature into a communication framework that now informs the product’s interaction model and content architecture
• The project established conversation as a core design pattern for translating complex information into usable insight
• The work aligned design, clinical, and NLP teams around shared principles for comprehension, safety, and language consistency
• The conversational framework was designed for scalability and now supports future use cases such as onboarding for additional testing and treatment guidance
Reflection
This work reaffirmed that design in healthcare is an act of translation. It is not about simplification but about making complexity understandable and usable. Levy AI showed that when a system is structured to think and respond with intent, clarity becomes a form of care.






