The Monkey Learns to Move: Quaternions and Humanoid Puppeteering

Cochabamba, 9 May 2026

In May 2026, Violeta Ayala, Daniel Fallshaw, Brian Condori, and the koa.xyz computational creativity lab entered a week-long fight against the wrong mathematics. Pipeline 15: Mona, the monkey, became the lab’s first humanoid creature capable of tracking full body movement, spine inclination, face, and individual fingers in real time, but only after surviving three failed rigs, broken IK systems, dependency cycles, mirrored axes, and days trapped inside Euler rotations.


The week began with direct XYZ bone rotation retargeting. Each attempt to fix the monkey’s shoulders created new distortions somewhere else in the body. One arm lifted while the other twisted. A clap collapsed inward. The problem was not calibration, but representation itself. Humanoid joints move through spherical rotation, while Euler angles decompose movement into sequential axes that interfere with one another. Midweek, the lab abandoned Euler logic entirely and rebuilt the system through quaternion bone-direction retargeting: calculating directional vectors from MediaPipe landmarks and rotating each bone by the quaternion that carries its rest-pose direction onto its live direction. The monkey stabilized immediately.
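The core of that approach is the shortest-arc quaternion between two directions. A minimal numpy sketch of the idea (inside Blender the same step can be done with `mathutils.Vector.rotation_difference`; the landmark values here are illustrative, not the lab's data):

```python
import numpy as np

def quat_between(rest_dir, live_dir):
    """Shortest-arc quaternion (w, x, y, z) rotating rest_dir onto live_dir."""
    a = rest_dir / np.linalg.norm(rest_dir)
    b = live_dir / np.linalg.norm(live_dir)
    d = np.dot(a, b)
    if d > 1.0 - 1e-9:                      # already aligned: identity
        return np.array([1.0, 0.0, 0.0, 0.0])
    if d < -1.0 + 1e-9:                     # opposite: 180° about any perpendicular
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return np.array([0.0, *axis])
    axis = np.cross(a, b)                   # rotation axis, |axis| = sin(theta)
    q = np.array([1.0 + d, *axis])          # half-angle trick: w = 1 + cos(theta)
    return q / np.linalg.norm(q)

# Example: upper arm hangs down in rest pose, live landmarks raise it sideways.
rest = np.array([0.0, -1.0, 0.0])
shoulder = np.array([0.0, 1.5, 0.0])        # hypothetical MediaPipe landmarks
elbow = np.array([0.3, 1.5, 0.0])
q = quat_between(rest, elbow - shoulder)    # 90° rotation about the z axis
```

Because each bone is driven by a direction pair rather than three chained angles, there is no axis ordering to interfere with neighboring joints.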


At the same time, the team moved through three separate rigs searching for a workable body. Brian Condori’s original metarig carried detailed anatomy and a 26-bone tail but suffered from broken IK constraints and recursive dependency errors. A regenerated Rigify version introduced functioning IK controllers but failed to propagate movement through the skeleton. Finally, a Mixamo auto-rig became the working body for the quaternion system, sacrificing the tail but allowing stable retargeting and live performance.


The week also produced a hybrid tracking architecture combining MediaPipe Pose and MediaPipe Hands into a single live sender. Twelve body joints, ten finger curl values, face orientation, blinking, brows, mouth movement, and emotional states now travel together through one UDP stream into Blender, allowing the monkey to perform as a continuous full-body creature rather than a disconnected set of systems. A new spine cascade distributed torso bend across three vertebrae with anatomical weighting, while additional gain compensation corrected MediaPipe’s compressed depth axis so forward leaning finally behaved naturally.


In parallel, the llama autonomous mode — broken for weeks by distorted mouth movement and degraded expressions — was repaired through a simple but fundamental realization: the neural model outputs normalized values and the receiver had forgotten to denormalize them. Three missing lines restored the entire behavioral system.
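The fix is the standard inverse of min-max normalization: map each value in [0, 1] back into the channel's native range. A hedged sketch, with entirely hypothetical channel names and ranges (the article does not list the actual channels or the three lines):

```python
# Hypothetical channels: the model emits values in [0, 1]; Blender shape keys
# and bone rotations each expect their own native range and units.
CHANNEL_RANGES = {
    "mouth_open": (0.0, 1.0),     # shape-key value
    "jaw_rotation": (0.0, 0.35),  # radians
    "brow_raise": (-0.2, 0.6),    # shape-key value, can undershoot rest
}

def denormalize(channel, value):
    """Map a normalized model output back into the receiver's native range."""
    lo, hi = CHANNEL_RANGES[channel]
    return lo + value * (hi - lo)
```

Feeding normalized values straight into ranges like these compresses or clips every expression, which matches the distorted mouths and flattened expressions described above.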


By the end of the week, the koa.xyz ecosystem had expanded into fifteen active real-time pipelines translating body movement, heartbeat, music, sensors, voice, emotion, and environmental data into plants, birds, spiders, whales, monkeys, geckos, shaders, and light across Blender and shared VR systems. The monkey became the first creature in the ecosystem to inherit quaternion movement logic — a mathematical shift that will now propagate through future creatures and worlds.


Funded by Abundant Intelligences under Grant CF00159672.