SoftBank Robotics America · Web Portal + Mobile

"Confirmed Cleans"

PROJECT TYPE
0 → 1 Product
ROLE
Design Lead · Sole Designer
SKILLS
Product design, systems design, strategy
TIMELINE
2018 – 2020
Robots generate data. Data alone doesn't build trust.

As the sole design lead at SoftBank Robotics America, I built Whiz Connect from 0 to 1 — the first web portal and mobile concept for managing fleets of autonomous commercial cleaning robots.

The brief was simple: prove the robots work.

The Problem

We started solving the wrong problem

The initial brief was straightforward: build a dashboard that shows ROI. Give clients data that proves the robots are worth the investment. Coverage area, runtime, routes completed. We built it. It wasn't enough.

What customer interviews actually revealed

Sitting in on customer-facing calls, a different problem kept surfacing. Distributors weren't losing deals because clients doubted the robot's efficiency — they were losing deals because clients had no way to defend the purchase internally. A facilities manager couldn't prove a cleaning happened to a skeptical tenant. A building owner couldn't dispute a false complaint from a lessee.

The gap wasn't ROI. It was verifiability. Cleaning has always been taken on faith — no paper trail, no timestamps, no record. Whiz units generate exactly that data. The design problem shifted: not "show efficiency metrics" but "create a defensible record of every clean."

the constraint

The business problem and the design problem weren't the same thing

The company needed ROI proof. I needed to make the product worth buying. Three operator problems drove every major design decision.
01
design decision

Did the robot run the right route?

Operators had no feedback loop. A completed route didn't mean a good route. This drove the decision to surface completion percentage and map overlays together, so gaps were visible at a glance instead of buried in raw numbers. It also helped the customer success team improve route design and efficiency over time.

Completion % and map overlay gave operators an instant read on coverage gaps without interpreting raw data.

Grouping assists by location surfaced physical obstructions customer success could flag proactively.
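The completion metric above is simple arithmetic over the route plan. A minimal sketch of the idea, assuming a hypothetical route record where planned and covered areas are sets of map grid cells (all names are illustrative, not the actual Whiz Connect data model):

```python
from dataclasses import dataclass

@dataclass
class RouteRun:
    """Hypothetical record of one cleaning run."""
    planned_cells: set   # grid cells the route was supposed to cover
    covered_cells: set   # grid cells the robot actually covered

def completion_pct(run: RouteRun) -> float:
    """Share of the planned route actually covered, as a percentage."""
    if not run.planned_cells:
        return 0.0
    covered = run.planned_cells & run.covered_cells
    return 100.0 * len(covered) / len(run.planned_cells)

def coverage_gaps(run: RouteRun) -> set:
    """Planned cells the robot missed -- what a map overlay would highlight."""
    return run.planned_cells - run.covered_cells

run = RouteRun(planned_cells={(0, 0), (0, 1), (1, 0), (1, 1)},
               covered_cells={(0, 0), (0, 1), (1, 0)})
print(completion_pct(run))  # → 75.0
print(coverage_gaps(run))   # → {(1, 1)}
```

The point of pairing the two views is that the percentage gives the instant read while the gap set drives the overlay, so neither number needs manual interpretation.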

02
design decision

Why did it stop?

Assists were logged but not legible. Initially, customer success teams were troubleshooting blind. This drove the decision to group assists by location and reason, transforming a data dump into a pattern facility managers and the customer success team could act on.

03
design decision

Can we prove it worked?

Facilities managers had data but couldn't defend it upward. This drove the decision to make actual vs. target coverage the lead metric — the one number a facilities manager could bring into a conversation with a skeptical tenant.

Actual vs. target coverage became the metric facilities managers could bring to their own stakeholders.

impact

The data existed. The language to make it meaningful didn't.

What started as a reporting dashboard became the platform that changed how the entire US business understood robot performance.

Fleet managed
200+

Autonomous robots across enterprise accounts with 10+ clients, each with 3–5 locations.

daily operators
20+

Internal sales and customer success team members using Whiz Connect as a daily operational tool.

Operational impact

Repeat assists reduced

Customer success teams identified problem route areas through the assist data layer, turning reactive support into proactive intervention.

Operational impact

Became a live sales tool

Demonstrated at trade shows side by side with Whiz hardware, giving distributors concrete ROI evidence where none existed before.

beyond the product

Design decisions fed directly into hardware conversations

The mobile app surfaced the 30-foot beeper range as a critical operator pain point before it was on the hardware roadmap. Software didn't wait for hardware — it defined what hardware needed to solve next.

The platform established the first shared vocabulary for how Whiz performance was understood across the entire US business.

Mobile Support App · Design Exploration

The gap software could fill while hardware caught up

The mobile app was never shipped — but it was grounded in real operator research and presented to stakeholders as the next step for the ecosystem. It embodies a principle I believe in: software can bridge UX gaps while hardware catches up.

The research finding

Through interviews with building cleaning managers, we discovered operators were averaging 4+ floor changes per shift just to locate robots. The onboard beeper only worked within 30 feet. In a multi-story building, that's useless.

The app solved this with a real-time robot map, full fleet visibility, and a ping feature that triggered the robot's turn signal and chime so operators could locate it from a distance. The pain points it surfaced went straight into hardware conversations.

Design informed product. Product informed hardware.

Final reflection

The robot does the work. The interface decides if anyone believes it.

01
The design came last

I spent time upfront defining what "good" even meant for an autonomous vacuum before touching a single screen.

02
Software moved faster than hardware

The ping feature came directly from operators spending shifts searching floors for a robot that needed help. It triggered the robot's turn signal lights and a chime so operators could locate it from a distance.

03
Vocabulary is infrastructure

The design system wasn't just visual. It established how robot performance was talked about across the entire business.