Oracle Health · Health Design System · Desktop EHR · 10+ Teams

EHR Data Presentation
Consistency Initiative

Establishing a shared data presentation language across the Oracle Health EHR — from patient overview to individual detail pages — for every clinical role and care venue.

Role
Principal UX Designer
Platform
Desktop EHR
Deliverables
Taxonomy · Components · Governance
EHR data consistency hero image

One visual language,
across every surface clinicians touch.

The Oracle Health EHR had grown across years and acquisitions. Each product team made reasonable decisions in isolation. The result was an environment where the same patient data looked and behaved differently depending on which surface a clinician happened to be working in.

The Health Design System team's mandate was to solve that — to establish a shared data presentation framework that was clinically accurate, technically adoptable, and durable across a large, distributed product organization. As the principal UX designer embedded in the team, the work spanned direct physician collaboration, cross-team governance, and hands-on component specification.

10+
Product teams brought into alignment
14
Card variants consolidated into one system
4
Core data forms defined and governed
3
Charting libraries replaced by a shared standard

Inconsistency isn't a cosmetic problem
in a clinical environment.

A clinician who cannot instantly scan and interpret data is a clinician who pauses, double-checks, and sometimes misses what matters.

A structured audit across all ten product teams surfaced three compounding failures — each individually tolerable, together actively working against the people using the system.

"The same lab value could appear as a chart in one module, a plain list in another, and a dense table in a third. There was no shared grammar."
Pattern audit finding
"Severity was encoded in color in some places and text in others. Clinicians couldn't build a reliable mental model for what danger looked like."
Discovery synthesis
"Components that looked actionable were sometimes read-only. Clinicians were learning module by module which things they could act on."
Cross-team interview finding
Clinical stakes
In a clinical environment, visual inconsistency is not a UX inconvenience. It is a source of error. Cognitive load spent parsing the interface is cognitive load stolen from the patient.

When to use a list, a card, a table, or a chart — and who decides.

Before any component work could begin, the team needed a foundational answer: what are the right ways to present clinical data, and when does each apply?

The taxonomy work ran in two parallel tracks — a bottom-up audit of existing patterns and a top-down physician collaboration process. The two fed each other iteratively. No form definition was stable until both tracks agreed.

Form · When to use · What it communicates
List · Sequential data without comparative value; status scans; low-density contexts · Order, sequence, membership
Card · Discrete entities requiring action or navigation; patient-level summaries · Actionability, identity, priority grouping
Table · Multi-attribute data requiring comparison across rows; detailed records · Relationships between attributes, comparative value
Chart · Temporal trends or values requiring visual pattern recognition · Change over time, deviation from normal, magnitude
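The taxonomy can be read as a decision procedure. A minimal sketch of that procedure, in TypeScript — the type names, fields, and precedence order here are illustrative assumptions, not the team's actual tooling:

```typescript
// Hypothetical sketch of the form-selection taxonomy. Names and the
// precedence order (chart > table > card > list) are illustrative only.
type DataForm = "list" | "card" | "table" | "chart";

interface DataShape {
  temporal: boolean;       // trends over time matter
  multiAttribute: boolean; // rows need comparison across attributes
  actionable: boolean;     // the entity is a target for action or navigation
}

function chooseForm(shape: DataShape): DataForm {
  if (shape.temporal) return "chart";       // change over time, deviation, magnitude
  if (shape.multiAttribute) return "table"; // comparative, multi-attribute records
  if (shape.actionable) return "card";      // discrete, actionable entities
  return "list";                            // order, sequence, membership
}
```

Encoding the taxonomy as an ordered set of tests also makes the governance question explicit: a new pattern request has to argue for a new branch, not just a new look.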

Each data field was assigned a visibility tier — always visible, default visible, or on-demand. The tier framework was then mapped across role and venue combinations to determine default hierarchy configurations for shared components.

Design principle
Tier classification was not a design judgment call. Every tier assignment was validated with physicians before it could be considered stable. Designers proposed; clinicians confirmed or corrected.
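The tier framework, with its role-and-venue defaults and physician sign-off gate, can be sketched as a small data model. This is a hypothetical illustration — the field names, the key format, and the `physicianApproved` flag are all assumptions, not the system's real schema:

```typescript
// Hypothetical sketch of the visibility-tier framework. All names here
// are illustrative assumptions, not the design system's actual schema.
type Tier = "always-visible" | "default-visible" | "on-demand";

interface TierAssignment {
  field: string;              // e.g. a patient-summary data field
  tier: Tier;
  physicianApproved: boolean; // no assignment is stable without sign-off
}

// Defaults keyed by a role + venue combination, mirroring the mapping
// described above (key format is an assumption).
type RoleVenueKey = `${string}:${string}`;

const defaults = new Map<RoleVenueKey, TierAssignment[]>();

function stableAssignments(key: RoleVenueKey): TierAssignment[] {
  // Only physician-approved assignments count as stable configuration.
  return (defaults.get(key) ?? []).filter((a) => a.physicianApproved);
}
```

Modeling approval as data rather than process makes the governance rule enforceable: an unapproved assignment simply never reaches a shared component's defaults.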

What teams could configure —
and what they couldn't.

Building the right components was necessary. Getting them used was the actual job.

The most contentious governance decision was defining the boundary between configurable and fixed. Every request for more configuration was evaluated against a single test: would this undermine cross-surface consistency for clinicians?

Teams could configure
Which columns appeared in tables
Data fields in card secondary rows
Time window for trend charts
Action labels in alert banners
Empty state copy
Teams could not configure
How table headers were styled
Structure and order of the card primary row
Reference range rendering behavior
Alert severity color coding
Component spacing and density defaults

Teams were owners of their data and their copy. The design system team was the owner of the visual and behavioral contract.
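That contract has a natural expression in a component API: team-owned content arrives as props, while system-owned structure is simply never exposed. A minimal sketch, with invented names and a toy renderer standing in for real component code:

```typescript
// Hypothetical sketch of the configurable/fixed boundary for a table
// component. Prop and constant names are illustrative assumptions.
interface TeamConfigurableTableProps {
  columns: string[];      // which columns appear — the team's choice
  emptyStateCopy: string; // team-owned copy
}

// Fixed by the design system and not exposed as props at all.
const FIXED_TABLE_CONTRACT = Object.freeze({
  headerStyle: "ds-table-header", // header styling is not configurable
  density: "comfortable",         // spacing/density defaults are fixed
  referenceRangeRendering: "inline-band",
});

function renderTable(props: TeamConfigurableTableProps): string {
  // Toy renderer: structure comes from the fixed contract,
  // content comes from the team's configuration.
  if (props.columns.length === 0) return props.emptyStateCopy;
  return `[${FIXED_TABLE_CONTRACT.headerStyle}] ${props.columns.join(" | ")}`;
}
```

Keeping fixed values out of the props type entirely, rather than documenting them as "do not change," means the boundary is enforced by the compiler instead of by review.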

On the anti-pattern library
Rather than telling teams they were doing something wrong, the design system team could point to a shared document explaining why a given approach created problems for clinicians — backed by physician input. Clinical stakes land differently than design opinions.

Four principles for getting 10+ teams to adopt components they didn't build.

There is a structural tension in design system work: the teams being asked to adopt shared components are the same teams that built the patterns being replaced.

01
Framing
Make the problem visible before proposing the solution
The pattern audit was shared with each product team before any component work began — framed as a question, not a diagnosis — and every team recognized the problems it described. Every team had stories about clinician confusion and workarounds that had calcified into permanent solutions. The audit created shared acknowledgment of the problem before the design system team had any specific asks.
02
Participation
Involve teams in the work, not just the output
Each product team was invited into working sessions during the taxonomy and hierarchy phases. Teams that helped build the framework were less likely to resist implementing it — and they produced a better taxonomy, surfacing edge cases the design system team would not have encountered on its own.
03
Adoption cost
Reduce migration cost wherever possible
Ready-to-use component code shipped alongside every design specification. Migration office hours, incremental onboarding milestones, and legacy documentation during the transition window were all part of the support model. Every adoption conversation that stalled on engineering cost was a conversation the team would have to have again later.
04
Leverage point
Governance, not police work
The design system team's leverage was at the point of design approval, not code review. Teams were responsible for their own adherence. New pattern requests were reviewed before they were built, not after — keeping the relationship with product teams collaborative rather than adversarial.

Designers proposed.
Clinicians confirmed — or corrected.

Physician input did not refine the system at the margins. It changed it substantively.

A standing physician advisory group spanning multiple specialties and care settings was embedded into every phase of the work — not sequenced as a final validation step. No data tier assignment, no visualization form decision, and no behavioral specification was considered stable until reviewed by at least two physicians with relevant clinical context.

Phase · Input sought · What changed
Discovery · Confirmation that observed inconsistencies created real workflow friction · Elevated scope from cosmetic to clinical-safety initiative
Data hierarchy · Tier assignment review for every data field in scope · 32% of proposed tier assignments were revised
Taxonomy · Validation of form-to-context mapping for clinical workflows · Chart form scoped more narrowly; card permitted in specific medication workflows
Component review · Evaluation of prototype components against real clinical tasks · Alert banner dismiss redesigned with role-based permissions
Anti-pattern library · Review of proposed anti-patterns for clinical accuracy · Three anti-patterns added based on physician-identified failure modes
Physician review session
Photo: a physician advisory session — a clinician reviewing a prototype EHR screen with annotation feedback.

Physician advisors reviewed components against real clinical tasks — not design screenshots.

Tier revision example
Before/after showing a specific data field elevated after physician review — e.g. advance directive status moving from Tier 2 (default visible) to Tier 1 (always visible) in the patient summary card, with the physician's stated rationale annotated alongside the spec change

32% of tier assignments changed after physician review. Some of the most important changes involved fields designers hadn't considered at all.

Credibility as leverage
Several physician advisors subsequently supported the design system team in cross-team conversations where product teams were resistant to adoption. A physician explaining why a component behavior was clinically necessary was more persuasive than a designer making the same argument. That credibility was earned through the rigor of the collaboration process, not assumed.

What worked, and what's next

The most important outcome is not the components. It is the process by which components are now created and validated — a shared language, a governance model, and a physician validation process that can hold as the system grows.

What worked well

Running the pattern audit before proposing any solutions created shared ownership of the problem — teams came to us, not the other way around.
Embedding physicians into every phase meant the work was grounded in clinical reality from the start, not retrofitted at the end.
The configuration boundary — content is yours, structure is ours — gave teams real agency while protecting what needed to be consistent.
The anti-pattern library was the single most effective cross-team communication tool. Clinical stakes as written policy land differently than verbal pushback.

What's next

Role-based personalization at the component level — the tier framework needs to evolve from static defaults to dynamic configuration as EHR personalization matures.
AI-assisted data surfacing will introduce new data types and presentation challenges the governance model needs to account for alongside structured EHR data.
A formal accessibility audit of data presentation components — particularly chart and table patterns at high density — against WCAG 2.1 AA standards.
Expanding physician advisory representation to include nursing and pharmacy at the same depth, not just as secondary reviewers.
Let's talk about your
next big idea.

Whether you're exploring new solutions or improving what you already have, I am here to help.

Get in touch