Project Name
[ UX — 2025 ]
A short description of the project goes here.
[ featured work ]
[ about ]
I'm a UX/UI designer based in Los Angeles, working at the intersection of craft and clarity. Since 2022 I've been creating purposeful, engaging digital products, from interactive builds to brand systems to complex SaaS tools.
[ process ]
Understanding the problem space, users, and constraints before touching a frame.
Synthesizing research into a clear design direction and measurable outcomes.
Iterating through concepts at pace — from rough wireframes to polished interfaces.
Handing off production-ready assets with documentation that developers actually use.
I helped design R-EYE-DER, a hands-free, gaze-driven safety tool for AR glasses that coaches drivers to maintain safer visual attention while navigating hazards. Built on the Python-based SDK for Raven AR glasses, the project explored how real-time visual guidance could reduce panic-driven fixation and improve driver awareness.
Working within a hackathon timeframe, I collaborated with developers to design lightweight, non-distracting interface elements and interaction flows that remained usable while users were in motion. My focus was on minimizing cognitive load and ensuring the system provided guidance without adding visual clutter.
Motorists under stress or fatigue often fixate on hazards such as potholes, vehicles, or pedestrians. This visual fixation can unintentionally steer the vehicle toward the danger.
Existing driver assistance systems
We designed a gaze-driven coaching system that subtly redirects driver attention away from hazards and toward safer navigation paths. The interface uses minimal overlays and lightweight visual cues to reduce cognitive load while maintaining clarity.
The prototype demonstrated that real-time visual attention coaching is technically feasible and comfortable enough to use while moving, while preserving driver focus.
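The coaching loop described above hinges on detecting when a driver's gaze dwells on a hazard for too long. A minimal sketch of that logic, assuming a hypothetical gaze-sample format and illustrative thresholds (none of these names come from the actual Raven SDK):

```python
from dataclasses import dataclass

# Hypothetical sketch of the fixation-coaching check described above.
# GazeSample, DWELL_THRESHOLD, and HAZARD_RADIUS are illustrative names,
# not part of the real Raven SDK.

@dataclass
class GazeSample:
    x: float  # normalized gaze position within the field of view
    y: float
    t: float  # timestamp in seconds

DWELL_THRESHOLD = 0.8  # seconds of sustained fixation before coaching kicks in
HAZARD_RADIUS = 0.1    # normalized distance counted as "on the hazard"

def should_coach(samples, hazard,
                 radius=HAZARD_RADIUS, threshold=DWELL_THRESHOLD):
    """Return True once gaze has dwelled on the hazard long enough to coach."""
    hx, hy = hazard
    dwell_start = None
    for s in samples:
        on_hazard = (s.x - hx) ** 2 + (s.y - hy) ** 2 <= radius ** 2
        if on_hazard:
            if dwell_start is None:
                dwell_start = s.t
            if s.t - dwell_start >= threshold:
                return True
        else:
            dwell_start = None  # gaze broke away; reset the dwell timer
    return False
```

The dwell threshold is the key tuning parameter: too short and the cue itself becomes a distraction, too long and the coaching arrives after the fixation has already steered the vehicle.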
Drivers and riders using AR glasses in real-world navigation scenarios.
Core Needs
Designing for safety required restraint. Adding more features risked distracting the user, so the focus was on simplifying visual language, prioritizing clarity over functionality, and designing subtle coaching rather than explicit instruction.
Pocket Tarot is a gesture-driven AR experience for Snap Spectacles that blends constellation exploration with interactive tarot card reveals. The experience places spatial visuals directly into the user's environment, allowing them to discover constellations and trigger a tarot reading through hand gestures.
Working with a collaborator who focused on tarot ritual design and content research, I led the interaction and spatial UX design, translating narrative concepts into an intuitive, hands-free AR experience. The project explores how reflective storytelling can unfold naturally in spatial computing without relying on traditional menus or screen-based UI.
Many AR experiences rely on floating menus and screen-based metaphors that interrupt immersion. For reflective experiences like tarot, this creates friction and makes interactions feel unintuitive.
Three core UX issues
I designed a spatial, gesture-driven interaction flow that guides users from constellation exploration to tarot card reveal. The experience uses pinch gestures and world-anchored prompts to reduce visual clutter and maintain immersion while still providing clear guidance.
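The flow from exploration to card reveal can be thought of as a small state machine driven entirely by gestures, with no menus in between. A sketch under assumed state and gesture names (these are illustrative, not the Spectacles or Lens Studio API):

```python
# Illustrative state machine for the pinch-driven tarot flow described above.
# State and gesture names are assumptions for this sketch, not real API values.

TRANSITIONS = {
    ("exploring", "pinch_star"): "constellation_focused",
    ("constellation_focused", "pinch_hold"): "card_revealed",
    ("card_revealed", "pinch_release"): "exploring",
}

def next_state(state, gesture):
    """Advance the experience; unrecognized gestures leave the state unchanged."""
    return TRANSITIONS.get((state, gesture), state)
```

Keeping every transition gesture-driven is what lets the experience stay world-anchored: there is no point in the flow where a floating menu has to appear.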
My collaborator focused on tarot ritual structure and narrative content, while I translated the experience into spatial interaction design and implemented the AR prototype.
I helped Samoi Tech streamline their complex multi-projector setup process into a single, easy-to-use application. By simplifying playback workflows for immersive video content across 5 walls, I made the system usable by both technical and non-technical users.
Collaborating closely with a lead engineer and refining designs across several iteration cycles, I used internal feedback to ensure each update balanced user needs and technical constraints.
Samoi Tech's XR projection installations required running multiple disconnected systems and hardware setups. This made playback and wall mapping error-prone, especially under time pressure.
Three core UX issues
I designed a scalable, dual-mode playback system that consolidated multi-screen projection workflows into a single application. It supports 360° panoramic video and wall-specific playback, adapting to different venue types and user needs.
To the left are hi-fi mock-ups of the design.
The app is structured into two main modes: 360° panoramic playback, which maps a single panoramic video across all walls, and wall-specific playback, which assigns individual content to each wall.
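In panoramic mode, the core mapping problem is dividing one wide frame across the five walls. A minimal sketch, assuming equal-width walls and simple column slicing (the real system's mapping may differ):

```python
# Hypothetical sketch of splitting a 360° panoramic frame across walls.
# Equal-width slicing and the wall count default are assumptions for
# illustration, not Samoi Tech's actual mapping logic.

def wall_slices(frame_width, walls=5):
    """Return (start, end) pixel columns of the panoramic frame per wall."""
    step = frame_width // walls
    return [
        (i * step, (i + 1) * step if i < walls - 1 else frame_width)
        for i in range(walls)
    ]
```

For example, a 4000-pixel-wide panorama splits into five 800-pixel columns, with the last slice absorbing any rounding remainder so no columns are dropped.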