UX Case Study

OpenROV

A mobile controller for the OpenROV Trident — shaped by field research at Monterey Bay and shipped as the production MVP to Trident's early backers.

Role
UX Designer & Researcher
Client
OpenROV
Methods
Field Research · User Testing
Interactive controller demo — directional controls: Forward · Back · Tilt Up · Tilt Down · Left · Right
Shipped as the production MVP controller for the OpenROV Trident — in the hands of 1,300+ Kickstarter backers

Overview

Project Overview

The Problem

Piloting an underwater drone from a phone is disorienting. You can't see the drone, you're dealing with glare, wet hands, and zero spatial feedback — and there was no existing interface to learn from. We needed to design controls that felt intuitive from the very first session, with no onboarding safety net.

Business Goals

Design controls accessible to first-time pilots without sacrificing depth for experienced operators
Ship an MVP architecture extensible enough to grow with a rapidly expanding feature roadmap
Design for Android's varied screen sizes and touch targets — reliable under outdoor conditions, with wet hands and glare

User Goals

Capture and share high-quality underwater footage with minimal friction
Maintain clear spatial awareness of the drone's orientation at all times
Monitor battery life and session time without breaking focus from the live feed

My Role

My design team and I conducted user testing both with and without the physical drone — including engineering stakeholders on-site so they could observe friction points firsthand. I designed and built the MVP prototype used for all subsequent testing. Having devs in the room meant change requests came with shared context, which is what drove the reduction in development time.

Overview Photo 1 — Team at dock
Overview Photo 2 — Monterey Bay

Research

Initial Research

Field Research

"Removing the screen entirely forced users to rely on muscle memory and spatial instinct — revealing which control mappings felt natural versus which ones had to be learned."
75%
of respondents planned to use the drone recreationally, establishing beginner-accessible controls as the design priority
45%
already owned an aerial drone, giving us a strong mental model baseline to design against

Organizing Our Findings

Research — Affinity map whiteboard

Affinity mapping of field observations and pre-order survey responses surfaced consistent behavioral patterns and unmet needs across both user segments.

Creating Personas

Sammy
Age 35 · Recreational

Use: Scout and film potential scuba diving sites

Wants: Simple recording controls she can operate without looking away from the live feed

Jack
Age 42 · Enthusiast

Use: Explore underwater environments with his kids

Wants: A layout familiar enough that his children can pick it up without instruction

Research confirmed two distinct user segments: recreational beginners and experienced enthusiasts. The MVP was scoped around the beginner persona — with Expert Mode identified as a priority for the next iteration.

Wireframes

Wireframes & Testing

V1
Wireframe V1
V2
Wireframe V2
V3 — Selected
Wireframe V3

Three Controller Concepts

Three distinct layout directions explored how joystick placement, directional controls, and HUD hierarchy could be balanced on a small screen.

Simulated Testing Without Hardware

A paper airplane acted as a real-time drone surrogate — letting us evaluate spatial intuition and control feel before a single line of code was written.

V3 Validated as the Strongest Design

V3 emerged as the clear winner across all sessions, consistently matching users' existing mental models from gaming and aerial drone experience.

✓ V3 selected for engineering
Wireframe testing — pool session photo
OpenROV Trident underwater

OpenROV Trident · Monterey Bay

Where the design met the real world.

High Fidelity

Field Testing & Micro Interactions

High Fidelity — phone in hand

The MVP in Users' Hands

With V3 approved, engineering built a functional interface to our prototype spec. Putting a live app in users' hands for the first time revealed friction points no studio test could have predicted — and gave us precise, actionable direction for the next iteration.

MVP App · Live Testing · Field Validated

Record & Star

One-tap starring lets users flag key moments mid-dive without interrupting the recording — dramatically reducing post-session review time.

Side Nav Controls

A transparent directional overlay keeps the live feed unobstructed while giving pilots precise, real-time spatial control.

Hidden Menu

Secondary controls tucked behind a single tap — preserving the clean cockpit view while keeping advanced options accessible within reach.

Live HUD

With a 3-hour dive window, battery visibility was non-negotiable — surfaced at a glance alongside depth and recording status, with visual weight kept deliberately subordinate to the live feed.

Field testing — drone in pool
Field testing — kid with controller

Field Test Summary

Pilots navigated the joystick layout intuitively from the first session — no onboarding required
Control sensitivity was miscalibrated: small inputs produced disproportionately large movements — flagged and escalated to engineering for recalibration
The starring feature saw immediate, unprompted adoption — users cited it as the standout feature post-session
The hidden menu achieved its goal: secondary controls were discoverable within seconds without cluttering the primary view

See the Drone in Action

Footage from our user testing day at Monterey Bay.

Outcomes

What We Delivered

Three rounds of iterative testing yielded a beginner-first controller that shipped as the MVP — giving Trident's early backers an experience that felt immediately familiar, and giving engineering a validated spec that reduced rework and accelerated delivery.

16×
Kickstarter goal
3
Wireframe iterations
2
Field test locations
Reduced dev time

Next Steps

What's Next

Key Takeaway

"The most valuable design decisions came from being in the field — not the studio."

Watching real users pilot the drone in real water — dealing with glare, wet hands, and the chaos of a live environment — surfaced insights that no amount of controlled testing could have replicated.

Run follow-up sessions to measure how each micro-interaction performs and track usability improvements across releases.

Design and validate Expert Mode — layering advanced controls into a separate mode without disrupting the proven beginner-first baseline.

Introduce user-adjustable sensitivity settings so pilots can tune control response to match their experience level and operating environment.

Build a media workflow — in-app clipping, annotation, and sharing — so the value of a dive doesn't end when you surface.

© 2026 Amy Vorchheimer. All rights reserved.
Designed with intention