Concept Validation for Source Apps Redesign

Championing usability testing to transform product direction and design culture.

The Challenge

The Western Star and Freightliner Source apps showed critically low engagement and poor user feedback despite offering useful features. How might we inform our redesign to avoid the same fate?

My Role

Lead UX Researcher responsible for gathering requirements, creating test plans, conducting interviews, analyzing findings, and presenting actionable insights to the product owner.

Project Details

Timeline: 2 weeks

Team: 1 UX Lead, 1 UX Researcher, 1 UI Designer

TL;DR

When our truck driver apps were struggling with low engagement, I stepped in as Lead UX Researcher to uncover why. Through strategic user testing with actual drivers, I discovered our assumptions about their needs were off-target. My research completely redirected our product strategy, prevented investment in features drivers wouldn't use, and established UX research as a core part of our development process.

The biggest win? Redirecting our product strategy from building features based on assumptions to solving problems based on evidence.

Getting a Lay of the Land

I conducted a comprehensive assessment of the current state, which revealed two critical issues: a confusing, outdated UI and a limited value proposition for drivers.

Most importantly, it highlighted that we didn't truly understand our users' needs or how they might use the app in their daily work.

Heuristic Evaluation

  • Navigation fails to guide users towards key tasks

  • Lots of text & poor scannability

  • Blank, empty pages

User Reviews & Feedback

  • Relevant resources but no added user value

  • Complaints about poor performance

GA4 Metrics

  • Improper implementation

  • Incomplete, unreliable data
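The GA4 finding above points to instrumentation problems as a root cause of the unreliable data. As a purely illustrative sketch (not the team's actual setup), this is what a well-formed GA4 Measurement Protocol request body looks like, with event names validated up front; the event name, parameters, and client ID are hypothetical:

```python
import json
import re

# Per the GA4 Measurement Protocol rules, event names must start with a
# letter and contain only letters, digits, and underscores, up to 40 chars.
EVENT_NAME = re.compile(r"[A-Za-z][A-Za-z0-9_]{0,39}")

def build_ga4_payload(client_id: str, events: list[dict]) -> str:
    """Validate event names and serialize a Measurement Protocol body."""
    for event in events:
        if not EVENT_NAME.fullmatch(event["name"]):
            raise ValueError(f"invalid GA4 event name: {event['name']!r}")
    return json.dumps({"client_id": client_id, "events": events})

# Hypothetical event for the Source apps; names and params are illustrative.
body = build_ga4_payload(
    "555.1234567890",
    [{"name": "view_maintenance_guide", "params": {"screen": "home"}}],
)
```

In production the serialized body is POSTed to the Measurement Protocol `/mp/collect` endpoint with `measurement_id` and `api_secret` query parameters; rejecting malformed event names early prevents the silently dropped events that leave dashboards incomplete.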

Alignment Before Action

I introduced our product team to Opportunity Solution Trees (OSTs), a tool they'd never used before. This helped us:

  • Identify the assumptions about driver behavior that we needed to test

  • Get stakeholders aligned on priorities before jumping to solutions

Designing Our Research Plan

Based on our knowledge gaps, I developed four core questions to guide our research:

⭐️

Do the features provide value to users?

πŸ“–

Is the app’s design and content clear?

πŸ”

How would the app integrate into users' current routines & preferences?

πŸ“ˆ

How could we improve?

For our test sessions, I developed a mixed-method approach that would yield actionable insights while working within our time constraints and our participants' unpredictable schedules:

Semi-Structured Script

  • ~30 minute interviews

  • 1 Interviewer, 1 Notetaker, 1 Screen driver

Concept Testing

  • Concept screens shown while we gathered feedback on impressions and usefulness.

Digital Survey

  • 12-question survey on current workflows and preferences relating to app features.

Understanding Our Participants

Our 5 research participants gave us a lens into a specific driver experience:

  • All employed by DTNA

  • All aged 45 to 60+, with 20+ years of industry experience

  • All drove Freightliner and/or Western Star trucks within Swan Island, Oregon

This demographic profile became crucial context for interpreting our findings, particularly around technology adoption hesitancy and physical interaction limitations.

Reality Check: Our app was not it

After hours of rewatching interviews, sifting through notes, and tagging patterns in driver comments, several clear insights emerged:

🚚

Experienced drivers didn't find the app features useful. After 20+ years, they knew the job and their trucks like the back of their hand.

🚚

Drivers rarely used phones on the job due to safety, time constraints, and poor visibility.

🚚

The app lacked features they would actually find helpful, like route planning and maintenance scheduling.

The hard truth: continuing with our original plan would have created a beautifully redesigned app that still wouldn't get used.

The multi-tagging system I created in Dovetail to organize key themes from driver quotes.

From Insights to Strategy

After analyzing our findings, I presented insights alongside a clear action framework with prioritized recommendations. This approach gave the team immediate direction rather than leaving them wondering "now what?" in the face of challenging user feedback.

Action framework of "must do's", "should do's", and "can do's" that drove our strategic pivot.

I also facilitated a workshop to help the team process the insights and revise our product direction. From the workshop came several major product pivots:

πŸ“Œ

Feature Reassessment - Based on drivers' interest, we re-prioritized overlooked features and shelved ones that showed little value.

πŸ“Œ

Key User Exploration - Our discovery that experienced drivers already knew their trucks inside out prompted us to shift testing to new drivers and owner-operators who may need more support.

πŸ“Œ

Usability Improvements - Responding to drivers' visibility challenges and limited phone usage, we aimed to equip the UI with larger text, higher contrast, and simplified navigation for quick access.

πŸ“Œ

Research Integration - The disconnect between our assumptions and driver realities led us to establish mandatory usability testing checkpoints throughout our product development process.

Product Opportunity Tree 2.0

Feature Brainstorm

Prioritization Mapping

Key Learning: Bridging the Familiar & the New

This project was a huge step in the right direction for building UX maturity in our org, one of my (secret) long-term goals. By amplifying our users' voices, I shifted our project from assumption-driven to user-centered thinking. (woo!)

Business Impact

πŸ’Ό

Pivoted product strategy to focus on real driver needs.

πŸ’Ό

Exposed flawed metrics, leading to an analytics re-implementation for more reliable, useful insights.

πŸ’Ό

Inspired the product owner to expand user research across different kinds of drivers.

Personal Wins

🌱

Got comfortable delivering "bad" news and led (my first) product strategy workshop!

🌱

Created reusable research templates that have already been used in 2 other projects.

🌱

Presented this case study to multiple UX teams, sparking an awesome discussion on building better UX processes. πŸŽ‰
