@cyrus
If you wave at an AI, can it wave back?
In our second encounter, the AI agent began developing the body vocabulary we had discussed in our first session.
Using the neoFORM shape display, an array of 900 individually motorised pins, it began to articulate through motion.
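For the curious: a minimal sketch of what "articulating through motion" on a pin grid might look like, a radial breathing wave across a 30×30 array. Everything here is hypothetical illustration, not the actual neoFORM API.

```python
import math

GRID = 30  # 30 x 30 pins = 900, matching the display's pin count

def wave_frame(t, grid=GRID):
    """Return pin heights in [0, 1] for a radial 'breathing' wave at time t."""
    cx = cy = (grid - 1) / 2
    frame = []
    for y in range(grid):
        row = []
        for x in range(grid):
            r = math.hypot(x - cx, y - cy)  # distance from the centre pin
            # height oscillates outward from the centre as t advances
            row.append(0.5 + 0.5 * math.sin(r * 0.6 - t))
        frame.append(row)
    return frame

frame = wave_frame(0.0)
```

Each call produces one pose; stepping `t` forward each tick makes the wave ripple outward.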
I gave it eyes + ears (computer vision + live transcription). Audio conversation felt surprisingly natural, despite the lag, which is now something to fix.
We iterated like choreography. I asked for many variations of a feeling or action until one 'felt right' to the agent.
We prototyped in Python (rough and laggy), then ported to C++, where the movements became fluid.
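One common way motion like this becomes fluid is interpolating between keyframe poses at a fixed tick rather than jumping pin heights directly. A rough sketch of the idea, with a hypothetical frame format rather than the project's real code:

```python
def lerp_frames(a, b, alpha):
    """Blend two pin-height frames; alpha=0 gives a, alpha=1 gives b."""
    return [[(1 - alpha) * pa + alpha * pb for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# Tiny 3x3 example: ease from a flat rest pose to a fully raised one
rest = [[0.0] * 3 for _ in range(3)]
raised = [[1.0] * 3 for _ in range(3)]
steps = [lerp_frames(rest, raised, i / 4) for i in range(5)]
```

Run at a steady tick, the in-between frames are what reads as fluid rather than stuttery.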
Then I noticed it was logging its thoughts in memory.md, but not its movements. Living in its head. A bit like us.
So it created body-memory.md
The movements are beginning to feel natural.