The project interprets strokes drawn with a tablet pen. When a stroke is completed (or closed into a loop), it spawns an organism whose traits are derived from the stroke's length, speed, and pressure. It isn't true gesture recognition, but it was a quick-and-dirty solution to what might otherwise have been a nightmare to build.
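As a rough sketch of the idea, a completed stroke can be reduced to a few aggregate features and then bucketed into an organism type. The thresholds and type names below are hypothetical stand-ins, not the project's actual values:

```python
from dataclasses import dataclass

@dataclass
class StrokeSample:
    x: float
    y: float
    pressure: float  # 0..1, as reported by the tablet driver
    t: float         # timestamp in seconds

def stroke_features(samples):
    """Aggregate total length, mean speed, and mean pressure over a stroke."""
    length = 0.0
    for a, b in zip(samples, samples[1:]):
        length += ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
    duration = max(samples[-1].t - samples[0].t, 1e-6)
    speed = length / duration
    pressure = sum(s.pressure for s in samples) / len(samples)
    return length, speed, pressure

def classify_stroke(samples):
    """Map stroke features to an organism type (hypothetical thresholds)."""
    length, speed, pressure = stroke_features(samples)
    if length < 50:
        return "plankton"       # short dab
    if speed > 400:
        return "darter"         # fast flick
    return "drifter" if pressure < 0.5 else "heavy swimmer"
```

The appeal over real gesture recognition is that nothing here depends on stroke shape: three scalars are enough to give each drawing a distinct creature.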
Procedural animation drives all of the creatures, including the physics and kinematics. At the lowest level, physics runs everything: the fluid simulation, the masses, and the springs. One level higher are the autonomous motions of a creature, such as articulation (moving the tail in opposition to the head).
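A minimal version of the mass-and-spring layer might look like the following: a chain of point masses joined by Hooke's-law springs, stepped with simple damped Euler integration. The constants (`k`, `damping`, `dt`) are hypothetical tuning values, not the project's actual ones:

```python
import math

def step_chain(points, velocities, rest_len, k=40.0, damping=0.92, dt=1 / 60):
    """One Euler step for a chain of 2D masses joined by springs.

    points and velocities are parallel lists of [x, y] pairs;
    neighbouring points are connected by a spring of length rest_len.
    """
    forces = [[0.0, 0.0] for _ in points]
    for i in range(len(points) - 1):
        (x1, y1), (x2, y2) = points[i], points[i + 1]
        dx, dy = x2 - x1, y2 - y1
        dist = math.hypot(dx, dy) or 1e-9
        f = k * (dist - rest_len)            # Hooke's law along the segment
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx; forces[i][1] += fy
        forces[i + 1][0] -= fx; forces[i + 1][1] -= fy
    for p, v, fo in zip(points, velocities, forces):
        v[0] = (v[0] + fo[0] * dt) * damping  # damping stands in for drag
        v[1] = (v[1] + fo[1] * dt) * damping
        p[0] += v[0] * dt
        p[1] += v[1] * dt
    return points, velocities
```

Articulation then comes almost for free: pull the head mass somewhere, and the springs propagate the motion down the chain with a lag, which is what produces the tail swinging in opposition to the head.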
At the highest level is behavior, the “AI” that drives a creature to head in particular directions, follow certain targets, and do certain things with its body. Everything collapses to a single point of “interest” that the organism simply follows: the behavior routine moves this “interest” around, and the rest of the body articulates itself. Behaviors can also change based on an organism’s orientation or its distance to other organisms. I’ll let you discover these behaviors yourself, since that’s half the fun of this project.
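To make the interest-point idea concrete, here is one hedged sketch of how a behavior routine could drive it. The flee rule and all the constants are invented for illustration; only the structure (behavior moves the interest, the head chases it) reflects the description above:

```python
import math

def update_interest(head, interest, neighbor, flee_radius=80.0):
    """Hypothetical behavior rule: keep the current interest point,
    unless a neighbor comes within flee_radius, in which case the
    interest is relocated directly away from that neighbor."""
    hx, hy = head
    nx, ny = neighbor
    if math.hypot(nx - hx, ny - hy) < flee_radius:
        dx, dy = hx - nx, hy - ny
        d = math.hypot(dx, dy) or 1e-9
        return (hx + 100 * dx / d, hy + 100 * dy / d)
    return interest

def steer_head(head, interest, speed=2.0):
    """Move the head one fixed-size step toward the interest point;
    the body articulation layer handles everything behind the head."""
    hx, hy = head
    ix, iy = interest
    dx, dy = ix - hx, iy - hy
    d = math.hypot(dx, dy)
    if d < speed:
        return (ix, iy)
    return (hx + speed * dx / d, hy + speed * dy / d)
```

The nice property of this layering is that every behavior, however elaborate, only ever has to answer one question per frame: where should the point of interest be?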