Hand-Tracking–Based VR Therapy for Motor Control and Learning for Autistic Individuals
A sensory-adaptive, gamified VR training module designed to improve spatial motor skills in neurodivergent users, using simple, satisfying feedback that enhances focus and motivation.
The Problem We're Solving
💡 Real-World Context:
We're collaborating with HeartShare New York, a nonprofit supporting neurodivergent adults.
🏥 Systemic Barriers:
Traditional therapy environments often face:
Limited funding
Lack of private, quiet spaces
Overstimulating group settings
🖌️ Origin Story:
My partner, Rowynn Dumont, previously taught art to autistic adults. She noticed many struggled to hold a paintbrush due to motor control challenges.
This inspired us to reimagine therapeutic activities as more accessible and engaging.
🧠 User Challenges:
Short attention spans
Difficulty with fine motor tasks
Repetitive tasks perceived as boring or frustrating
🎮 Our Design Goal:
To create a VR experience that:
Uses hand tracking (no controllers needed)
Incorporates gamification to make therapy fun
Provides instant audiovisual feedback (e.g., “snap & quack”)
Helps users stay engaged while practicing motor coordination
Research Insights
We grounded our VR therapeutic design in existing research on motor learning, sensory feedback, and neurodivergent-centered design. Key insights from the literature include:
📊 Why VR Works for Therapy
Immersive virtual environments provide precise control over stimuli, making them ideal for therapeutic use.
VR allows for safe, repeatable, and adjustable simulations, offering real-time feedback and performance tracking (Carpio-Alfsen et al., 2023).
Gamified tasks in VR enhance motivation, especially when tailored to user needs (Irshad et al., 2021).
🔊 The Role of Multimodal Feedback
Auditory, visual, and haptic cues significantly improve engagement and learning in VR (Levac & Galvin, 2013).
Immediate, multisensory feedback supports skill acquisition and motor performance by helping users recognize success or errors in real time.
🧩 The Gap for Neurodivergent Populations
Despite its potential, most VR therapy tools are not designed with neurodivergent users in mind.
Individuals with autism and related conditions often have distinct sensory-motor profiles, including:
Difficulty with spatial motor control (e.g., fast, imprecise movements)
High sensitivity to stimuli (sound, light, clutter)
Greater distractibility in overstimulating environments (Dahlstrom-Hakki et al., 2024)
💡 Design Implications
Neurodivergent users benefit from clear, simple, and consistent feedback, avoiding ambiguous or overwhelming interactions (Parsons & Cobb, 2011).
Participatory research emphasizes the need for user-informed design, particularly for inclusive therapies.
XR UX Principles Applied
Spatial UI
Interactive elements are thoughtfully embedded within the 3D environment, aligning with users’ natural spatial awareness to reduce cognitive load and support intuitive navigation.
Gesture Clarity
Hand gestures are intentionally designed to be clear, minimal, and easy to perform—helping reduce user frustration and build confidence, especially for those with motor challenges.
Immediate Feedback
Every user action is met with real-time visual and auditory cues (like snaps and sounds), reinforcing understanding and creating a satisfying, responsive experience.
Simple Onboarding
The experience begins with gentle, intuitive guidance—no complex menus or verbal instructions—ensuring that users can start interacting comfortably from the very first moment.
Role: UX Design, Prototyping, User Testing
Tools: Unity, Meta SDK, OpenXR
Project type + duration: Research, Collaborative | 6 weeks
The Design
We designed the experience to feel calming, intuitive, and rewarding. Here’s how users progress through it:
🔄 Experience Flow
Onboarding
The user enters a minimal, distraction-free VR space.
Spatial cues (like gentle arrows or glowing targets) guide the first steps—no text, no menus.
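As a rough illustration, a glowing target like this could be a small Unity component that slowly pulses a material's emission and scale. This is a minimal sketch, not the project's actual code: the `GlowPulseCue` name, field values, and the assumption that the material uses the Standard shader's `_EmissionColor` property (with emission enabled) are all ours.

```csharp
using UnityEngine;

// Hypothetical sketch: a text-free onboarding cue that pulses a target's
// emission and scale so the user's eye is drawn to the first interaction.
// Assumes a material with an "_EmissionColor" property (e.g., Unity's
// Standard shader) and emission enabled on it.
public class GlowPulseCue : MonoBehaviour
{
    public Color glowColor = new Color(0.4f, 0.9f, 1f); // calm cyan
    public float pulseSpeed = 1.5f;   // slow pulse to avoid overstimulation
    public float scaleAmount = 0.05f; // subtle size change

    private Material _material;
    private Vector3 _baseScale;

    void Start()
    {
        _material = GetComponent<Renderer>().material;
        _baseScale = transform.localScale;
    }

    void Update()
    {
        // A 0..1 sine wave drives both glow intensity and a gentle scale pulse.
        float t = (Mathf.Sin(Time.time * pulseSpeed) + 1f) * 0.5f;
        _material.SetColor("_EmissionColor", glowColor * t);
        transform.localScale = _baseScale * (1f + scaleAmount * t);
    }
}
```

The slow pulse speed is deliberate: the cue should invite attention without becoming another overstimulating element in the scene.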
Exploration & Movement
Users reach out with open hands to grab floating duck objects.
Gesture-based input removes the need for controllers, lowering barriers for users with motor challenges.
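A minimal Unity sketch of such a controller-free grab might look like the following. It is hypothetical: the `OpenHandGrabber` component, the "Duck" tag, and the `isHandOpen` flag (fed by a separate gesture check, sketched later in the Testing section) are illustrative assumptions, and it presumes a trigger collider on the hand and a Rigidbody on each duck.

```csharp
using UnityEngine;

// Hypothetical sketch of the controller-free grab: a trigger volume on the
// hand picks up any nearby "Duck" while the hand is open, and releases it
// when the hand closes. Requires an isTrigger collider on this object and
// a Rigidbody on the ducks so trigger events fire.
public class OpenHandGrabber : MonoBehaviour
{
    public bool isHandOpen;     // set by a hand-tracking gesture detector
    private Transform _heldDuck;

    void OnTriggerStay(Collider other)
    {
        // Grab: an open hand overlapping a free duck attaches it to the hand.
        if (_heldDuck == null && isHandOpen && other.CompareTag("Duck"))
        {
            _heldDuck = other.transform;
            _heldDuck.SetParent(transform);            // follow the hand
            var body = other.attachedRigidbody;
            if (body != null) body.isKinematic = true; // suspend physics while held
        }
    }

    void Update()
    {
        // Release: closing the hand simply lets the duck go; there is no
        // failure state, the duck just drops.
        if (_heldDuck != null && !isHandOpen)
        {
            var body = _heldDuck.GetComponent<Rigidbody>();
            if (body != null) body.isKinematic = false;
            _heldDuck.SetParent(null);
            _heldDuck = null;
        }
    }
}
```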
Interaction & Placement
Ducks can be placed in four directions (N, S, E, W).
When placed correctly, ducks snap into position and emit a “quack” sound as feedback.
Incorrect placements result in a silent drop—no punishment, just subtle trial-and-error learning.
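A hedged sketch of how a snap zone like this could be wired up in Unity follows. The `DuckSnapZone` and `DuckDirection` components, their field names, and the "Duck" tag are assumptions for illustration; the zone itself would carry an isTrigger collider.

```csharp
using UnityEngine;

// Hypothetical sketch of the "snap & quack" placement zones. Each zone
// accepts one cardinal direction: a correctly placed duck snaps to the
// zone's anchor and plays a quack, while a wrong zone does nothing, so
// the duck simply drops in silence.
public class DuckSnapZone : MonoBehaviour
{
    public string acceptedDirection = "North"; // which ducks this zone accepts
    public Transform snapAnchor;               // exact pose the duck snaps to
    public AudioSource audioSource;
    public AudioClip quackClip;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Duck")) return;

        var duck = other.GetComponent<DuckDirection>();
        if (duck == null || duck.direction != acceptedDirection)
            return; // wrong zone: silent drop, no error sound, no penalty

        // Correct zone: snap into place and reward with a quack.
        other.transform.SetPositionAndRotation(snapAnchor.position, snapAnchor.rotation);
        var body = other.attachedRigidbody;
        if (body != null) body.isKinematic = true; // lock the duck in place
        audioSource.PlayOneShot(quackClip);
    }
}

// Minimal companion component marking each duck's intended direction.
public class DuckDirection : MonoBehaviour
{
    public string direction = "North";
}
```

Note that the wrong-zone branch intentionally does nothing: success is celebrated, but mistakes carry no signal at all, which keeps trial-and-error low-stakes.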
Creative Drawing (Optional Mode)
Users can also draw shapes using ribbon-like trails that follow their fingertip—no UI tools needed.
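One plausible Unity implementation of this mode is a LineRenderer that trails the fingertip, appending a point whenever the finger has moved far enough. This is a sketch under assumptions: the `FingertipRibbon` name, the spacing threshold, and the `fingertip` Transform (which would come from the hand-tracking rig, e.g., a Meta SDK skeleton bone assigned in the Inspector) are all illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch of the UI-free drawing mode: a LineRenderer lays
// down a ribbon of points that trails the user's index fingertip.
[RequireComponent(typeof(LineRenderer))]
public class FingertipRibbon : MonoBehaviour
{
    public Transform fingertip;           // index fingertip from the hand rig
    public float minPointSpacing = 0.01f; // meters between recorded points

    private LineRenderer _line;

    void Start()
    {
        _line = GetComponent<LineRenderer>();
        _line.positionCount = 0;
    }

    void Update()
    {
        if (fingertip == null) return;

        // Only add a point once the finger has moved far enough; this keeps
        // the ribbon smooth instead of flooding it with duplicate points.
        bool farEnough = _line.positionCount == 0 ||
            Vector3.Distance(_line.GetPosition(_line.positionCount - 1),
                             fingertip.position) > minPointSpacing;

        if (farEnough)
        {
            _line.positionCount++;
            _line.SetPosition(_line.positionCount - 1, fingertip.position);
        }
    }
}
```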
Ambient Feedback Loop
Each action gives users visual, spatial, and auditory reinforcement, creating a rhythm of gentle discovery.
Testing & Iteration
We tested early prototypes with users at HeartShare New York, along with feedback from mentors, peers, and therapists familiar with neurodivergent needs. Their input directly shaped the final experience.
👂 What We Heard → 🔧 What We Changed
We heard:
“The default pinching gesture was hard to perform consistently.”
We changed:
Replaced it with a broader open-hand grab, which felt more natural and forgiving for users with motor challenges.
We heard:
“It wasn’t always clear where to place the ducks.”
We changed:
Added gentle directional arrows and snap zones that provide clear visual affordances without cluttering the space.
We heard:
“The experience felt quiet or unsure when something didn’t work.”
We changed:
Introduced audio feedback (a friendly “quack”) and subtle visual confirmation to make successful actions more rewarding.
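To make the first change concrete, here is a hedged sketch of the kind of openness check that could stand in for a strict pinch. The `OpenHandDetector` name, the palm and fingertip Transforms (assumed to be assigned from the tracked hand skeleton), and the threshold are all assumptions, and the threshold would be tuned per user.

```csharp
using UnityEngine;

// Hypothetical sketch of the gesture change described above: instead of a
// precise thumb-to-index pinch, the hand counts as "grabbing" whenever it
// is broadly open near an object.
public class OpenHandDetector : MonoBehaviour
{
    public Transform palm;
    public Transform[] fingertips;      // thumb through pinky tips
    public float openThreshold = 0.07f; // avg tip-to-palm distance (meters)

    // True when the fingers are, on average, extended away from the palm.
    // This is far more forgiving than requiring two specific fingertips to
    // touch, which is what made the default pinch hard to perform.
    public bool IsHandOpen()
    {
        if (palm == null || fingertips.Length == 0) return false;

        float total = 0f;
        foreach (Transform tip in fingertips)
            total += Vector3.Distance(tip.position, palm.position);

        return (total / fingertips.Length) > openThreshold;
    }
}
```

The result of this check is what would drive the `isHandOpen` flag on the grab component sketched earlier, so a single whole-hand pose covers both grabbing and releasing.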
✅ What Improved
Ease of Use: Hand gestures became more accessible and consistent
Attention Span: Multimodal feedback helped users stay focused longer
Enjoyment: Gamified elements like snapping and quacking made the experience feel fun, not clinical
Reflection
Reflection: 2–3 things you learned (about accessibility, iteration, or working in XR)