To understand sentences, neural networks first rely on word order.
After training on enough text, they abruptly shift to relying on word meanings instead.
This rapid "phase transition" is like water turning to steam.
Discovering this hidden switch helps explain how models like ChatGPT learn.
It also points toward ways of making them more efficient, reliable, and safe.