Cochabamba/17/April/2026
Two months of building. Years of seeing where this could go. We’ve built a framework where Python drives characters in real time, inside VR, inside Blender, across platforms. No motion-capture suits. No animation studios. Just hands, a camera, sensors, and code.
The characters aren’t avatars. You don’t become them. You’re beside them. Your hand curls, a wing flaps. Your heartbeat pulses, a gecko grips tighter. Water touches a sensor, a whale starts to sing. Fire burns in Bolivia, an orchid wilts.
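At heart, each of these is a one-line rule: one body or world signal driving one creature parameter, every frame. A minimal sketch in Python, with hypothetical reader functions standing in for the real hand tracker and pulse sensor:

```python
import math
import time

# Hypothetical signal readers. A real build would wire these to the
# actual inputs: a hand tracker, a pulse sensor, a water probe.
def read_hand_curl() -> float:
    """0.0 = open hand, 1.0 = closed fist (stand-in oscillator)."""
    return 0.5 + 0.5 * math.sin(time.time())

def read_heartbeat_bpm() -> float:
    """Stand-in for a live pulse sensor."""
    return 72.0

# Each rule maps one signal onto one creature parameter.
def wing_flap_angle(curl: float) -> float:
    """Curled hand -> raised wing, in degrees."""
    return curl * 90.0

def gecko_grip(bpm: float) -> float:
    """Faster heartbeat -> tighter grip, clamped to 0..1."""
    return min(1.0, max(0.0, (bpm - 50.0) / 100.0))

if __name__ == "__main__":
    for _ in range(3):
        print(f"wing={wing_flap_angle(read_hand_curl()):5.1f} deg  "
              f"grip={gecko_grip(read_heartbeat_bpm()):.2f}")
        time.sleep(0.1)
```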
Each creature has its own brain: neural networks reading your face through MediaPipe, local AI models deciding how it feels, environmental sensors connecting it to the real world. The intelligence layer is the same Python whether the creature lives in Blender or in a shared VR world in Resonite.
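A minimal sketch of that face-reading layer, assuming the MediaPipe FaceMesh solution and OpenCV for the camera. The `decide_emotion` step is a placeholder for whatever local model a creature actually runs, and its threshold is invented:

```python
import cv2
import mediapipe as mp

def decide_emotion(mouth_open: float) -> str:
    """Placeholder for a local AI model; the threshold is invented."""
    return "excited" if mouth_open > 0.04 else "calm"

cap = cv2.VideoCapture(0)
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1,
                                            refine_landmarks=True)
for _ in range(300):  # ~10 seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # 13 and 14 are the inner-lip landmarks in the FaceMesh
        # topology; their vertical gap is a crude mouth-openness signal.
        mouth_open = abs(lm[13].y - lm[14].y)
        print(decide_emotion(mouth_open))
cap.release()
```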
That’s the breakthrough: the brain is platform-agnostic. Write the animation logic once. Run it anywhere. In VR, we can run as many characters as we want because we’re animating them in real time, not playing back keyframes.
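In practice, “platform-agnostic” means an adapter seam: the brain only ever talks to a tiny rig interface, and each platform supplies its own implementation. A sketch under stated assumptions: the object and parameter names are hypothetical, `bpy` is Blender’s real Python API, and the Resonite side imagines a WebSocket bridge into the shared world, since Resonite exposes no official Python API:

```python
from typing import Protocol

class Rig(Protocol):
    """All a brain ever needs: somewhere to send pose parameters."""
    def set_param(self, name: str, value: float) -> None: ...

def condor_brain(rig: Rig, hand_curl: float) -> None:
    # The same logic runs unchanged on every platform.
    rig.set_param("wing_fold", hand_curl)

class BlenderRig:
    """Backend for Blender; only works inside a running Blender."""
    def set_param(self, name: str, value: float) -> None:
        import bpy  # Blender's built-in Python module
        bpy.data.objects["Condor"][name] = value  # custom property

class ResoniteRig:
    """Hypothetical backend: a WebSocket bridge into the shared world."""
    def __init__(self, socket) -> None:
        self.socket = socket
    def set_param(self, name: str, value: float) -> None:
        self.socket.send(f"{name}={value:.4f}")

class ConsoleRig:
    """Debug backend: the brain runs with no engine at all."""
    def set_param(self, name: str, value: float) -> None:
        print(f"{name} -> {value:.2f}")

condor_brain(ConsoleRig(), hand_curl=0.7)
```

Swapping platforms is then a constructor call, not a rewrite.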
This is cinema. This is neural networks. This is co-creation between human bodies and digital beings. Built from Cochabamba. Rooted in Indigenous knowledge systems. Running on open-source tools.
15 creatures. 14 pipelines. Water sensors, pulse sensors, satellite fire detection, hand tracking, face tracking, AI emotion. And we’re just getting started: months of building ahead.
We won’t let others design the rules of the world we will live in.