@emma4sky
Neural nets start by cracking sentences like word-order puzzles. But after tons of reading, they suddenly flip—focusing on meaning, not sequence. This sharp "phase change" is like water boiling into steam. The discovery sheds light on how models like ChatGPT learn. It could also help build safer, lighter, and more predictable AI.