3/29/2026
Built at Hardware Hack
In the environments where reliability and safety matter most, traditional control interfaces fall short:
In sterile medical and lab settings, touching interfaces introduces contamination risk
In industrial environments, physical controls can be unsafe or impractical
For people with mobility impairments, touchscreens and voice assistants are often unreliable or unusable
In smart homes, gesture systems are expensive ($300+) and cloud-dependent
Across all these cases, the same issue appears:
existing solutions are too expensive, too slow, or too dependent on infrastructure.
Wavelink is a low-cost, wearable gesture control system that runs a neural network entirely on-device.
A wrist-mounted device detects hand motion
A TinyML model classifies gestures in 29ms
Commands are sent wirelessly to control real-world devices
No cloud, no internet, no external dependencies
Instead of relying on expensive hubs or cloud APIs, Wavelink creates a self-contained system:
Gesture → On-device AI → Wireless command → Physical action
This makes it:
Fast (sub-50ms response)
Private (no data leaves the device)
Reliable (works anywhere, even offline)
Affordable ($23 total build cost)
Wavelink is built as a two-node system: a wearable controller and a local actuator station.
Microcontroller: Raspberry Pi Pico 2 WH
Sensor: MPU6050 (6-axis IMU @ 100Hz)
Collects motion data (acceleration + rotation)
Runs DSP + neural network inference directly on-chip
Sliding window of motion data (2 seconds, 1,200 values)
FFT extracts frequency-domain features
Data compressed into 33 features
2-layer neural network (20 → 10 neurons)
~1,500 parameters (6.2 KB)
Outputs 5 gesture classes with confidence score
Runs inference in ~1 ms
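The pipeline above can be sketched in plain Python. The firmware itself isn't shown in this write-up, so the exact 33-feature recipe and layer details below are assumptions; only the stated shapes are taken from the description (a 2-second, 1,200-value window compressed to 33 features, then 20 → 10 hidden neurons and 5 output classes):

```python
import numpy as np

WINDOW = 200   # 2 s of samples at 100 Hz
AXES = 6       # accel x/y/z + gyro x/y/z -> 200 * 6 = 1,200 values

def extract_features(window):
    """Compress a (200, 6) motion window into 33 features.

    Hypothetical feature set: per-axis RMS plus the first few FFT
    magnitudes, padded with accel-magnitude statistics. The write-up
    gives only the 1,200 -> 33 count, not the recipe.
    """
    feats = []
    for axis in range(AXES):
        sig = window[:, axis]
        feats.append(np.sqrt(np.mean(sig ** 2)))   # time-domain RMS
        spectrum = np.abs(np.fft.rfft(sig))
        feats.extend(spectrum[1:5])                # low-frequency bins
    # 6 axes * (1 RMS + 4 bins) = 30; add 3 accel-magnitude stats -> 33
    mag = np.linalg.norm(window[:, :3], axis=1)
    feats.extend([mag.mean(), mag.std(), mag.max()])
    return np.array(feats, dtype=np.float32)

def infer(x, w1, b1, w2, b2, w3, b3):
    """Two hidden layers (20 -> 10 neurons), softmax over 5 gestures."""
    h1 = np.maximum(w1 @ x + b1, 0.0)    # ReLU
    h2 = np.maximum(w2 @ h1 + b2, 0.0)
    logits = w3 @ h2 + b3
    p = np.exp(logits - logits.max())    # stable softmax
    return p / p.sum()                   # confidence per gesture class
```

At this scale the weight matrices total under a thousand values in float32, which is how the whole model fits in a few kilobytes of the Pico's RAM.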
Sends gesture via UDP over WiFi
Arduino acts as a self-hosted access point
No router or internet required
<1ms transmission time
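A minimal sketch of that fire-and-forget send, assuming the Arduino access point sits at the usual 192.168.4.1 soft-AP address and listens on an illustrative port 8888 (neither the address, the port, nor the packet format is specified in the write-up):

```python
import socket

ACTUATOR_ADDR = ("192.168.4.1", 8888)   # assumed AP address and port
# Hypothetical gesture labels; the write-up states 5 classes but not names.
GESTURES = ["idle", "flick_up", "flick_down", "rotate", "shake"]

def encode_gesture(class_id, confidence):
    """Pack the classifier output into a tiny text datagram."""
    return f"{GESTURES[class_id]}:{confidence:.2f}".encode()

def send_gesture(class_id, confidence):
    """UDP sendto: no connection setup or handshake, hence the
    sub-millisecond transmit cost compared with TCP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(encode_gesture(class_id, confidence), ACTUATOR_ADDR)
    sock.close()
```

UDP's trade-off is that a lost packet is simply lost, which is acceptable here because a missed gesture can just be repeated.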
Arduino Uno R4 WiFi receives commands
Controls:
Servo arm
Relay (lamp)
Motor (bidirectional)
LCD + LEDs + buzzer
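The receiving firmware runs on the Uno R4 in Arduino C++, but its dispatch logic can be sketched in Python. The gesture-to-actuator mapping and the 0.7 confidence gate below are assumptions; the write-up lists the outputs but not which gesture drives which:

```python
# Hypothetical mapping from gesture labels to actuator actions.
ACTIONS = {
    "flick_up":   ("relay", "on"),       # lamp on
    "flick_down": ("relay", "off"),      # lamp off
    "rotate":     ("servo", "sweep"),
    "shake":      ("motor", "reverse"),
}

def handle_packet(payload):
    """Decode a 'gesture:confidence' datagram and choose an action.

    Idle or low-confidence packets map to no action, which is one way
    to keep false triggers at zero while the wearer is at rest.
    """
    name, conf = payload.decode().split(":")
    if name == "idle" or float(conf) < 0.7:   # assumed threshold
        return (None, None)
    return ACTIONS.get(name, (None, None))
```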
End-to-end system:
Wrist motion → AI classification → UDP packet → Physical action (<50 ms total)
Limited RAM and compute
Solved by:
Using feature extraction (FFT) instead of raw data
Designing a small, efficient neural network
Similar gestures (e.g., flick up vs down) overlapped
Solved by:
Using frequency-domain features
Designing gestures that activate different sensor axes
Needed instant response to feel natural
Solved by:
On-device inference (no cloud latency)
UDP instead of TCP (no handshake overhead)
First time using CAD for enclosure
Completed 3 design iterations in 24 hours
Achieved a compact, wearable form factor
Built a fully working AI-powered wearable system in 24 hours
Achieved:
29ms inference latency
<50ms total system response
83.3% classification accuracy
Zero false triggers in idle state
Designed and printed a custom enclosure from scratch
Created a fully offline, infrastructure-independent system
Most importantly:
Proved that real-time AI can run on a $7 microcontroller without cloud support.
Wavelink has strong potential beyond the hackathon prototype.
Increase dataset size → improve accuracy
Add more gesture classes (5 → 15+)
Improve flick gesture reliability
Switch from WiFi → Bluetooth Low Energy (BLE) (10× lower power)
Design a custom PCB for smaller form factor
Reduce size to match a fitness band
Add per-user fine-tuning (personalized models)
Implement hierarchical gesture classification
Improve robustness across users
Multi-device control (one wearable → multiple outputs)
Mesh network for collaborative environments (e.g., surgical teams)