Cochabamba/10/April/2026
This fortnight: the water lily learned to breathe, the sundew learned to count fingers, and the pipeline moved to WebSocket.
Two weeks ago the tree was shy. Now there are fourteen creatures, and they’re starting to know each other.
The Victoria Regia came first. A waterproof sensor dropped into a glass of water, an ESP32 reading capacitance every 40 milliseconds, and a giant Amazonian water lily that opens when the water is still and closes when you disturb it. Dip your finger in, the leaves contract. Pour a little more water, the whole pad tilts. It took a day to get the baseline calibration right: the sensor drifts with temperature, so we made it re-zero every time the system boots. Pipeline 12. The lily breathes now.
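For the curious, here's a minimal sketch of that boot-time re-zeroing, written in MicroPython since that's what runs on the ESP32. This is not our firmware; the pin number, sample count, and threshold are all placeholder values.

```python
# Minimal sketch of the lily's capacitance loop (illustrative, not the
# production firmware; pin and threshold values are made up).
import time
from machine import Pin, TouchPad

sensor = TouchPad(Pin(32))          # hypothetical pin for the submerged probe

def calibrate(samples=100, interval_ms=40):
    # Re-zero on every boot: average a few seconds of readings while the
    # water is (hopefully) still, so temperature drift since the last run
    # doesn't poison the baseline.
    total = 0
    for _ in range(samples):
        total += sensor.read()
        time.sleep_ms(interval_ms)
    return total / samples

BASELINE = calibrate()
THRESHOLD = 0.15                    # fractional deviation that counts as "disturbed"

while True:
    reading = sensor.read()
    deviation = abs(reading - BASELINE) / BASELINE
    disturbed = deviation > THRESHOLD
    # In the real pipeline this state goes to the Python spine,
    # which drives the lily's open/close animation.
    print("disturbed" if disturbed else "still", deviation)
    time.sleep_ms(40)               # the 40 ms cadence from above
```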
The Drosera was harder. The sundew has thirteen tentacles and each one needs to curl toward a specific fingertip, which meant finally fixing the MediaPipe hand-landmark mapping we’d been half-ignoring for months. Index finger to tentacle 3, thumb to tentacle 1, and so on. When your fingers splay, the sundew opens. When you pinch, it snaps shut around an invisible insect. Brian rigged it so the curl comes from the base, not the tip, which is how real sundews actually move. You can feel the difference immediately.
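A rough sketch of what that fingertip mapping looks like in MediaPipe's Python API. The fingertip landmark indices are MediaPipe's own; only the thumb and index tentacle assignments come from our mapping, the rest of the numbers and the pinch threshold are illustrative.

```python
# Sketch of the fingertip-to-tentacle mapping (simplified; the splay/pinch
# heuristics here are illustrative, not our tuned values).
import math
import cv2
import mediapipe as mp

# MediaPipe hand-landmark indices for the five fingertips.
# landmark -> tentacle (thumb->1, index->3 as in the post; rest assumed)
FINGERTIPS = {4: 1, 8: 3, 12: 5, 16: 7, 20: 9}

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

def distance(a, b):
    return math.hypot(a.x - b.x, a.y - b.y)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Each tentacle curls toward "its" fingertip.
        targets = {t: (lm[i].x, lm[i].y) for i, t in FINGERTIPS.items()}
        # Pinch = thumb tip close to index tip; the sundew snaps shut.
        pinched = distance(lm[4], lm[8]) < 0.05
        # ... targets + pinched would go to the creature pipeline here.
```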
The Colibri got her head back. The old version tracked only the body, so the hummingbird’s head pointed wherever the torso pointed, which is not how hummingbirds work. We moved to MediaPipe FaceMesh, 468 landmarks, and now the head tracks independently, tilts, follows your gaze. She looks at you when you look at her. It’s unsettling the first time.
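For anyone curious how you get a head pose out of a mesh of points, here's a crude sketch. The landmark indices are the commonly used nose-tip and outer eye corners; the yaw and roll math is a simplification, not what actually drives the Colibri.

```python
# Rough sketch of independent head tracking via FaceMesh (the yaw/roll
# estimates here are crude approximations, not the pipeline's version).
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)

NOSE_TIP, LEFT_EYE_OUTER, RIGHT_EYE_OUTER = 1, 33, 263  # FaceMesh landmark ids

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        nose, le, re = lm[NOSE_TIP], lm[LEFT_EYE_OUTER], lm[RIGHT_EYE_OUTER]
        eye_mid_x = (le.x + re.x) / 2
        eye_span = abs(le.x - re.x) or 1e-6
        yaw = (nose.x - eye_mid_x) / eye_span   # head turned left/right
        roll = (le.y - re.y) / eye_span         # head tilted
        # The hummingbird's head bone consumes these instead of the torso pose.
```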
And then the big architectural decision. For weeks we’d been running everything through UDP into Blender, which worked for the studio but wouldn’t survive deployment into Resonite VR. So we rewrote the spine. Python now handles all the external sensor logic (cameras, ESP32s, satellites, proximity), and instead of sending raw bone rotations we send about fifty creature-state values through a ResoniteLink WebSocket. “Tree fear level: 0.7.” “Orchid grief: 0.4.” “Llama stillness: 1.0.” The creatures decide for themselves how to express those states inside VR. It’s the difference between puppeteering and relationship. Dan built a standalone hand-tracking client for his PC that plugs straight into Resonite, no Blender in the middle, so we can finally test in headset without the full studio rig.
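The shape of the new spine, sketched with Python's websockets library. The endpoint URL, tick rate, and key names are placeholders; the actual ResoniteLink message schema differs in its details.

```python
# Sketch of the creature-state stream: named floats pushed over a WebSocket.
# URL, tick rate, and key names are placeholders, not the real schema.
import asyncio
import json
import websockets

STATE = {
    "tree.fear": 0.7,
    "orchid.grief": 0.4,
    "llama.stillness": 1.0,
    # ... ~50 values in the real pipeline
}

async def stream_states(url="ws://localhost:8765"):   # placeholder endpoint
    async with websockets.connect(url) as ws:
        while True:
            # Sensor logic updates STATE elsewhere; we just ship snapshots.
            await ws.send(json.dumps(STATE))
            await asyncio.sleep(1 / 30)               # assumed 30 Hz tick

asyncio.run(stream_states())
```

The point is the payload: named states rather than bone rotations, so how a state becomes movement is decided on the Resonite side.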
On the side, I’ve been getting ComfyUI and Hunyuan3D running locally on the M5 Max, 48GB of unified memory, generating creature assets without sending a single frame to a server. The first test was a moth. It was a terrible moth. The second was better.
Eleven pipelines became fourteen. The tree is still shy. The orchid is still mourning. The llama still stands at rest when you stand at rest. But now the water lily breathes, the sundew counts fingers, and the hummingbird looks you in the eye.
Everything is born from the small. And the small things are beginning to talk to each other.