Somatic Puppeteering Across Plants, Birds, and Heartbeats

Cochabamba, 8 March 2026


In the first two weeks of March 2026, the koa.xyz computational creativity lab entered an intensive build phase across its real-time human-to-non-human expression pipelines. Week one opened with the Bromelias face puppet, a system of 31 plant armatures driven by facial tracking from an OAK-D Pro camera. There we resolved bone influence tuning, linked emotion detection directly to shader color nodes, and established the three-code architecture (Python tracking, Blender shading, Blender bones) that now underpins every pipeline in the lab.
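To make the three-code split concrete, here is a minimal sketch of the Blender-side half, assuming the Python tracking script streams JSON (e.g. {"smile": 0.7, "valence": 0.3}) over local UDP; the object, bone, material, and node names below are placeholders, not the lab's actual identifiers.

```python
import json
import socket

import bpy

# Non-blocking UDP socket; the tracking process is assumed to send here.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 9001))
sock.setblocking(False)

ARMATURE = bpy.data.objects["Bromelia_01"]      # hypothetical object name
LEAF_MAT = bpy.data.materials["BromeliaLeaf"]   # hypothetical material name

def apply_frame():
    try:
        data, _ = sock.recvfrom(1024)
    except BlockingIOError:
        return 1 / 30  # no new packet; poll again next tick
    face = json.loads(data)

    # Facial intensity bends a stem bone (bone influence tuned per bone).
    bone = ARMATURE.pose.bones["stem_01"]
    bone.rotation_mode = "XYZ"
    bone.rotation_euler.x = face["smile"] * 0.6  # radians, scaled

    # Detected emotion valence drives the shader color node directly:
    # warm when positive, cool when negative.
    rgb = (face["valence"], 0.2, 1.0 - face["valence"], 1.0)
    LEAF_MAT.node_tree.nodes["Emission"].inputs["Color"].default_value = rgb
    return 1 / 30

bpy.app.timers.register(apply_frame)
```

Keeping the tracker outside Blender and pushing only small JSON frames across a socket is one way to let the camera loop and the viewport run at independent rates.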


That same week we brought the Cattleya orchid audio-reactive system online: live music is decomposed via FFT into six frequency bands, sub-bass through high, each driving a different quality of plant movement, from deep root sway to fine tip shimmer, while beat detection triggers visible pulses of opening through the bone chain. We also wired an ESP32 microcontroller to a pulse sensor and built a derivative-based cardiac analysis engine that extracts systole, diastole, BPM, HRV, and beat events from raw photoplethysmography (PPG) data, feeding them into a 1,081-bone plant optimized with stride-based rotation to hold real-time frame rates.
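A minimal sketch of the six-band decomposition, assuming mono audio blocks from the live input; the band edges are illustrative, not the pipeline's exact cutoffs.

```python
import numpy as np

SR = 44100  # assumed sample rate
BANDS_HZ = [(20, 60), (60, 250), (250, 1000),
            (1000, 4000), (4000, 10000), (10000, 20000)]

def band_energies(block: np.ndarray) -> list[float]:
    """Return one energy value per band for a block of mono samples."""
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / SR)
    energies = []
    for lo, hi in BANDS_HZ:
        mask = (freqs >= lo) & (freqs < hi)
        energies.append(float(spectrum[mask].mean()) if mask.any() else 0.0)
    return energies

# Each band then maps to a different movement quality, e.g. sub-bass
# energy -> root sway amplitude, the top band -> fine tip shimmer.
```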
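The cardiac engine's derivative approach can be sketched as follows, assuming raw PPG samples arrive from the ESP32 at a fixed rate; the sampling rate, threshold, and refractory period here are assumptions, not the lab's tuned values.

```python
import numpy as np

FS = 100  # assumed PPG sampling rate, Hz

def detect_beats(ppg: np.ndarray) -> np.ndarray:
    """Return sample indices of beats, found as steep systolic upstrokes."""
    d = np.diff(ppg)
    thresh = d.mean() + 2.0 * d.std()
    candidates = np.where(d > thresh)[0]
    beats = []
    for idx in candidates:
        # Refractory period: ignore upstrokes within 300 ms of the last beat.
        if not beats or idx - beats[-1] > 0.3 * FS:
            beats.append(idx)
    return np.array(beats)

def bpm_and_hrv(beats: np.ndarray) -> tuple[float, float]:
    """BPM from mean inter-beat interval; HRV as SDNN in milliseconds."""
    if len(beats) < 3:
        return 0.0, 0.0
    ibi = np.diff(beats) / FS
    return float(60.0 / ibi.mean()), float(np.std(ibi * 1000.0))
```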
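Stride-based rotation presumably keeps the 1,081-bone plant real-time by touching only a subset of bones per tick, cycling the offset so every bone refreshes over a few frames. A guess at the shape of that optimization, with illustrative names:

```python
import bpy

STRIDE = 8   # each tick updates 1/8 of the bones
_offset = 0

def breathe(amount: float):
    """Apply a breathing rotation to a rotating subset of the bone chain."""
    global _offset
    bones = bpy.data.objects["Heartbeat_Plant"].pose.bones  # hypothetical
    for bone in list(bones)[_offset::STRIDE]:
        bone.rotation_mode = "XYZ"
        bone.rotation_euler.x = amount * 0.1
    _offset = (_offset + 1) % STRIDE
```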


Brian Condori rebuilt the Condor rig from scratch, replacing an animator-centered armature with a performer-centered, FK-only architecture designed specifically for somatic tracking: manually reoriented bones, a single-layer world-space structure, and hand-painted weights across feathers, tail, legs, and head. He also generated five new procedural plant species with Blender’s Sapling addon (Acai Palm, Guarana, Cacao, Patuju, Sangre de Drago) and refined the Flor Boca rig with renamed lip bones, new shape keys (alegre, triste, mueca: happy, sad, grimace), and removed constraints so the lips move independently. By the end of week one the Boquita Ch’ixi pipeline was running: a mouth-shaped plant whose lip bones mimic your mouth geometry in real time while its shape keys express the opposite emotion. You smile and the plant opens wide but looks sad; you frown and the plant narrows but looks happy, implementing the Aymara concept of ch’ixi, the coexistence of contradictory states, as a computational system. Blow detection creates simulated wind through the branches, a speech filter prevents involuntary head nods while you talk, and individual fingers control individual branches.
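Sapling generation can be batch-scripted; a rough sketch of how five species might be produced, varying only the random seed and leaving every other operator parameter at its default (the lab's actual per-species settings are not shown here):

```python
import bpy

# Enable the bundled Sapling Tree Gen addon, then generate one curve
# tree per species, renamed after creation.
bpy.ops.preferences.addon_enable(module="add_curve_sapling")

SPECIES = ["Acai_Palm", "Guarana", "Cacao", "Patuju", "Sangre_de_Drago"]
for i, name in enumerate(SPECIES):
    bpy.ops.curve.tree_add(seed=i, levels=3)  # other params at defaults
    bpy.context.object.name = name
```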
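A minimal sketch of the ch’ixi inversion and speech filter, assuming the tracker supplies a mouth openness in [0, 1], an emotion label, and a speaking flag; the shape key names alegre and triste come from the Flor Boca rig, while the object and bone names are placeholders.

```python
import bpy

OPPOSITE = {"alegre": "triste", "triste": "alegre"}

def drive_boquita(mouth_open: float, emotion: str, is_speaking: bool):
    keys = bpy.data.objects["Boquita"].data.shape_keys.key_blocks  # hypothetical
    lips = bpy.data.objects["Boquita_rig"].pose.bones              # hypothetical

    # Geometry mimics you: lip bones open as your mouth opens.
    lips["lip_upper"].location.z = mouth_open * 0.05
    lips["lip_lower"].location.z = -mouth_open * 0.05

    # Expression contradicts you: the shape key of the *opposite*
    # emotion is raised, so a smile reads as open-but-sad.
    for name in OPPOSITE:
        keys[name].value = 0.0
    keys[OPPOSITE.get(emotion, "triste")].value = mouth_open

    # Speech filter: while talking, freeze head rotation so lip
    # chatter doesn't leak into involuntary nodding.
    if not is_speaking:
        head = lips["head"]
        head.rotation_mode = "XYZ"
        head.rotation_euler.x = mouth_open * 0.2
```

Keeping the mimicry in the bones and the contradiction in the shape keys means both layers can run at full rate without fighting each other, which is what lets the ch’ixi effect read clearly in real time.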


Week two turns to integration and documentation: connecting the rebuilt condor to full-body tracking, polishing Boquita for public demonstration, stabilizing the heartbeat-to-plant breathing connection, texturing the new procedural species, expanding the emotion-to-color shader system across all pipelines, and capturing demo videos of each system in action. Together this builds toward a body of work that now encompasses twelve distinct real-time pipelines, translating human face, hands, body, voice, heartbeat, and music into the movement of plants, birds, creatures, and light.