The $2 Gyroscope That Solved What the Neural Network Couldn’t

Cochabamba, 21 March 2026

This week at our lab we hit a wall and broke through it with a $2 sensor.

For two weeks we tried to make a single camera track a performer’s full 360° body rotation. We tested nine different approaches: MediaPipe pose estimation (3.5M parameters), depth-silhouette PCA, shoulder-swap detection, face-visibility tracking, and more. Every one of them failed past 180°.


After discussing with Eilif B. Muller, we strapped an MPU-6050 gyroscope to a belt. Full 360°. Continuous. No blind spots. The gyroscope gives the creature what the camera can’t: a sense of proprioception, a felt sense of which way it faces, even when facing away. When the performer turns back toward the camera, MediaPipe face tracking gently corrects the gyroscope’s drift. Two contradictory sensing systems, each filling the other’s gaps.
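That correction loop is essentially a complementary filter: integrate the gyro’s yaw rate for short-term accuracy, then, whenever the face is visible, nudge the estimate toward the camera’s heading to bleed off drift. Here is a minimal sketch of the idea; the class name, the weight, and the angle conventions are illustrative assumptions, not the lab’s actual code.

```python
GYRO_WEIGHT = 0.98  # trust the gyro short-term, the camera long-term (assumed value)

def wrap_angle(a):
    """Wrap an angle in degrees to [-180, 180)."""
    return (a + 180.0) % 360.0 - 180.0

class HeadingEstimator:
    """Hypothetical fusion of MPU-6050 yaw rate with intermittent face heading."""

    def __init__(self):
        self.yaw = 0.0  # degrees, current facing estimate

    def update(self, gyro_yaw_rate, dt, camera_yaw=None):
        """gyro_yaw_rate: deg/s from the gyro's vertical axis.
        camera_yaw: heading inferred from face tracking, or None when
        the performer faces away and the face is lost."""
        # Dead-reckon from the gyro; accurate short-term but drifts.
        self.yaw = wrap_angle(self.yaw + gyro_yaw_rate * dt)
        # When the face is visible, gently pull toward the camera heading.
        if camera_yaw is not None:
            error = wrap_angle(camera_yaw - self.yaw)
            self.yaw = wrap_angle(self.yaw + (1.0 - GYRO_WEIGHT) * error)
        return self.yaw
```

The key property: while the face is hidden, the estimate keeps rotating smoothly on gyro data alone; once the face reappears, the small per-frame correction absorbs accumulated drift without a visible snap.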


This is sensor fusion applied to real-time non-humanoid character animation. Eight creature pipelines are active. Music leads the performance, the performer’s body translates it, the sensors capture that translation, and the creatures embody it live.


The $2 gyroscope solved what the neural network couldn’t. Sometimes the answer isn’t a bigger model, it’s a different sense.