Cochabamba/18/April/2026
Week 8 brought the gecko (Pipeline 14) from a skeletal rig to a sensing, tasting, listening, color-shifting creature with twelve emotional states, 500+ persistent flavor memories, and a brain that chooses its own moods. Brian Condori posed the gecko for each of her twelve moods: chill, spicy, curiosa (curious), coqueta (flirty), celosa (jealous), cansada (tired), enojada (angry), hambrienta (hungry), fria (cold), horny, asustada (scared), juguetona (playful). That work produced a movement vocabulary of full-body target poses, translated into a POSES dictionary; the same architecture proven with the whale in Week 7 now drives a second creature, as sketched below. The gecko's sensory stack reached six channels: MediaPipe face and hand tracking, microphone voice detection, the Moondream 1.8B vision model, an ESP32 optical pulse sensor, typed and spoken narration via local Whisper, and scene color detection. She sees you, hears you in English and Spanish, feels your heartbeat, tastes the color of your clothes, and reads your facial expressions. She responds by choosing a mood, inventing a synaesthetic flavor, generating an inner thought, shifting her skin color to match what she sees, and moving her 218 bones accordingly.
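A minimal sketch of the pose-dictionary idea, assuming Euler-angle targets per bone; the bone names, rotation values, and blending helper here are illustrative, not the production rig data:

```python
# POSES maps each mood to target rotations for a few key bones; the rig
# eases the rest of the 218 bones toward these targets every frame.
# Bone names and angles below are placeholder assumptions.
POSES = {
    "chill":    {"spine_01": (0.0, 0.0, 0.0),  "tail_base": (0.0, 0.1, 0.0)},
    "curiosa":  {"spine_01": (0.2, 0.0, 0.1),  "head":      (0.3, 0.0, 0.0)},
    "asustada": {"spine_01": (-0.4, 0.0, 0.0), "tail_base": (0.0, -0.5, 0.0)},
    # ... one entry per mood, twelve in total
}

def blend_toward(current: dict, mood: str, alpha: float = 0.1) -> dict:
    """Ease each tracked bone's rotation toward the mood's target pose."""
    blended = {}
    for bone, target_rot in POSES[mood].items():
        cur_rot = current.get(bone, (0.0, 0.0, 0.0))
        blended[bone] = tuple(c + alpha * (t - c)
                              for c, t in zip(cur_rot, target_rot))
    return blended
```

Because every mood is just another key in the same dictionary, the whale and the gecko can share the blending code and differ only in their pose data.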
The brain architecture evolved through three stages: Qwen 2.5:7b, which proved too small to vary moods; sensor-driven rule scoring, functional but mechanical; and finally Qwen 3.6 (23 GB, latest generation), smart enough to choose moods contextually, with sensor overrides reserved for reflexes. Qwen 3.6 produced poetic output: "They offer words that taste of dust and unspoken questions; I shall listen with my entire skin."
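The split between contextual mood choice and reflex overrides might look like the following sketch, assuming an Ollama-style local endpoint; the model tag, prompt, sensor keys, and thresholds are all assumptions for illustration:

```python
import json
import requests  # assumes a local Ollama-style HTTP server on port 11434

MOODS = ["chill", "spicy", "curiosa", "coqueta", "celosa", "cansada",
         "enojada", "hambrienta", "fria", "horny", "asustada", "juguetona"]

def choose_mood(sensors: dict) -> str:
    """Ask the local LLM for a mood; hard reflexes bypass the model."""
    # Reflex overrides fire deterministically, without waiting on the LLM.
    if sensors.get("sound_level", 0.0) > 0.9:  # sudden loud noise -> startle
        return "asustada"
    if sensors.get("heart_rate", 70) > 120:    # racing pulse from the ESP32
        return "asustada"

    prompt = ("You are a gecko. Given these sensor readings, answer with "
              f"exactly one mood from {MOODS}.\n"
              f"Sensors: {json.dumps(sensors)}\nMood:")
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "qwen3.6", "prompt": prompt, "stream": False},
        timeout=30,
    )
    word = resp.json()["response"].strip().lower()
    return word if word in MOODS else "chill"  # fall back to a neutral mood
```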
The week's most consequential technical move was the training data recording system. Every brain call now saves a complete sensor→mood→flavor→narration sample to ~/.gecko_recordings/. By session end: 274 training samples. This data will train two models: a LoRA fine-tune of Qwen 3.6 (personality: what she says and feels) and a tiny somatic neural net (body: how she moves, running at 1000 Hz).
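A sketch of what that recording step might look like; the log specifies only the directory and the sample fields, so the on-disk JSON layout and filename scheme here are assumptions:

```python
import json
import time
from pathlib import Path

RECORD_DIR = Path.home() / ".gecko_recordings"
RECORD_DIR.mkdir(exist_ok=True)

def record_sample(sensors: dict, mood: str, flavor: str, narration: str) -> Path:
    """Persist one complete sensor->mood->flavor->narration sample as JSON."""
    sample = {
        "timestamp": time.time(),
        "sensors": sensors,      # raw readings at the moment of the brain call
        "mood": mood,            # the mood the brain chose
        "flavor": flavor,        # the invented synaesthetic flavor
        "narration": narration,  # the generated inner thought
    }
    path = RECORD_DIR / f"sample_{int(time.time() * 1000)}.json"
    path.write_text(json.dumps(sample, ensure_ascii=False, indent=2))
    return path
```

Writing one small file per brain call keeps the pipeline crash-safe and makes the 274 samples trivially easy to load later for both the LoRA fine-tune and the somatic net.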
An ethics conversation with Eilif B. Muller and Yasmeen Hitti on April 13 grounded the week's technical work in the larger questions of affective computing, agentic AI regulation, emotional data sovereignty, and the difference between reading emotions and owning the dataset. A Resonite co-creation session on April 16 with Molly, Dan, and Yasmeen articulated the framework: platform-agnostic character brains, 15 creatures, 14 pipelines, real-time puppeteering across Blender and VR.