Case Study: Leica RTC360
Context & Challenge
The RTC360 is a core product for Leica Geosystems. With the next hardware generation, the existing interface could no longer support the increasing functional complexity. New capabilities such as cloud collaboration and IoT connectivity introduced additional system demands, while alignment with the design system and the Field360 companion app became essential for a coherent product experience.
The Problem
The existing interface carried significant legacy debt:
- Deep sub-menus and scrolling traps on a 4-inch screen
- Feature-driven navigation that no longer matched real-world workflows
- Unclear system feedback during active scanning
This made it difficult for users to understand what the scanner was doing, especially under time pressure.
The Strategic Goal
Shift from a feature-heavy interface to a task- and state-oriented system that scales with future hardware and cloud capabilities.




Phase 1: Scoping & Problem Definition
Current Workflow Analysis
The existing workflow was documented to understand how users currently operate the scanner during critical tasks.
Key findings:
- Core actions were buried across multiple menu levels
- Users had to repeatedly stop and check the device to confirm system status
- The workflow prioritized configuration over execution
This analysis exposed misalignment between the system structure and real-world usage.
Problem Statement Mapping
Insights from the current workflow were mapped into clear problem statements connecting user friction to system behavior:
- Lack of immediate state visibility increased cognitive load
- Navigation depth slowed down time-critical actions
- System feedback did not reflect mechanical reality
These problem statements framed the strategic direction of the project.




Phase 2: Research Synthesis & Opportunity Definition
Proto Personas
Research findings were synthesized into proto personas representing key user types operating the scanner in high-pressure environments.
The focus was not on demographic detail, but on:
- Responsibility and risk
- Environmental constraints
- Dependency on system feedback
User Journey Mapping
A detailed user journey map was created to visualize how users move through a scanning session, from setup to capture and completion. The map highlighted:
- Moments requiring immediate system feedback
- Gaps between system behavior and user confidence
- Opportunities to reduce interaction during active scanning
This journey map became the backbone for system-level decisions.
Opportunity Areas & Requirements Definition
Opportunity Areas
- Improve at-a-glance understanding of system state
- Reduce interaction during active scanning
- Align device and companion app behavior
Requirements
- Persistent visibility of scan state
- State-driven navigation logic
- Consistent behavior across device, app, and cloud
Happy Path Definition
A clear happy path was defined to represent the ideal scanning flow with minimal interaction and maximum system clarity.




Phase 3: Strategic Framing & Ideation
How-Might-We
Based on the defined opportunity areas and happy path, the project was reframed through How-Might-We questions:
- How might the current scan state be understood instantly, even from a distance?
- How might cloud workflows be integrated without increasing operational complexity?
- How might device and companion app behave as one coherent system?
Scenario Definition
Scenarios were developed to stress-test the happy path across:
- Remote app interaction
- Cloud synchronization
- Edge cases and interruptions
Ideation & Design Studio
A collaborative design studio explored multiple system concepts and interaction models.
Focus areas included:
- Task-first navigation structures
- Alternative state representations
- Reducing interaction during execution
Promising directions were selected based on feasibility and strategic alignment.
Strategy Before Design: System-Level Decisions
Before creating screens, the focus was on defining system behavior:
- Shifted from feature-based navigation to state-driven workflows
- Defined scan phases as primary system states
- Reduced hierarchy depth to eliminate unnecessary scrolling
- Treated visual feedback and motion as operational signals
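The state-driven model above can be sketched in code. This is a minimal, hypothetical illustration — the state names beyond the scan phases named later in this case study (tilting, scanning, capturing) and the transition rules are assumptions, not the product's actual implementation:

```python
from enum import Enum, auto

class ScanPhase(Enum):
    """Primary system states, modeled on the scan phases described
    in this case study; IDLE and COMPLETE are assumed for illustration."""
    IDLE = auto()
    TILTING = auto()
    SCANNING = auto()
    CAPTURING = auto()
    COMPLETE = auto()

# Allowed transitions: the interface derives navigation and feedback from
# the current phase rather than from a fixed menu hierarchy.
TRANSITIONS = {
    ScanPhase.IDLE: {ScanPhase.TILTING},
    ScanPhase.TILTING: {ScanPhase.SCANNING, ScanPhase.IDLE},
    ScanPhase.SCANNING: {ScanPhase.CAPTURING, ScanPhase.IDLE},
    ScanPhase.CAPTURING: {ScanPhase.COMPLETE, ScanPhase.IDLE},
    ScanPhase.COMPLETE: {ScanPhase.IDLE},
}

def advance(current: ScanPhase, target: ScanPhase) -> ScanPhase:
    """Move to `target` only if the transition is legal; otherwise stay put."""
    return target if target in TRANSITIONS.get(current, set()) else current
```

The design consequence is that every screen and signal is a function of one state variable, which is what makes behavior consistent across device, app, and cloud.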




Phase 4: Design, Validation & Implementation
Information Architecture & System Mapping
Wireflows and system maps defined:
- State transitions across scan phases
- Cross-device interaction between scanner and app
- System behavior during sync and error states
Visual Communication as Functional Feedback
A key requirement was that the current scan phase must be recognizable from a distance.
Visual feedback and motion were designed to communicate mechanical system states clearly, making each scan phase (tilting, scanning, and capturing) immediately understandable without interaction.
This reduced the need for close inspection, minimized uncertainty during operation, and improved trust in the system during active scans.
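The principle of one unambiguous, distance-readable cue per phase can be sketched as a simple lookup. The specific colors and motions here are hypothetical placeholders, not the device's actual visual language:

```python
# Hypothetical mapping from scan phase to a glanceable signal.
# Each phase gets exactly one distance-readable cue, so no phase
# requires close inspection to identify.
PHASE_SIGNAL = {
    "tilting":   {"color": "amber", "motion": "sweep"},
    "scanning":  {"color": "blue",  "motion": "rotate"},
    "capturing": {"color": "green", "motion": "pulse"},
}

def signal_for(phase: str) -> dict:
    """Return the display signal for a phase, with a neutral idle fallback."""
    return PHASE_SIGNAL.get(phase, {"color": "white", "motion": "steady"})
```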
High-Fidelity Prototyping, Validation, and Iteration
Complex high-fidelity prototypes simulated real system behavior, including state changes, transitions, and multi-step interactions.
Usability testing evaluated:
- Scan-state recognition
- Interaction clarity
- Confidence while monitoring scans from a distance
Insights from testing informed iterative refinements to system logic and interaction behavior.
Engineering Collaboration & Implementation Reviews
I worked closely with engineering to align system behavior, technical constraints, and edge cases.
I led UX reviews during implementation to ensure the final product reflected the defined system logic and validated behaviors.




Outcome & Impact
Systemic Results
- Clear and glanceable scan-state communication
- Reduced need for close inspection during scanning
- Elimination of menu-diving for core actions
- Stronger alignment between scanner and companion app logic
Long-Term Value
- A scalable system foundation for future hardware and cloud features
- A shared mental model across devices
- A clearer product language for users and engineering teams




