agathetaboo pfp
agathetaboo
@agathedavray.eth
I used to think AI just stored facts like a giant encyclopedia. But here’s how GPT actually "learns", in simple terms:

It’s trained on huge amounts of text (books, code, forums), but it doesn’t read like we do. It sees everything as sequences of tokens (pieces of words turned into numbers).

Its goal is to predict the next token. For example: "The cat sat on the ___" → it predicts "mat." Each time it gets it wrong, it adjusts billions of internal weights using gradient descent and backpropagation.

Over time, it builds a statistical intuition for language: grammar, facts, even reasoning. It doesn’t store knowledge like a database; it predicts what makes sense based on patterns.

After that, it can be fine-tuned on specific tasks like dialogue, then aligned to human preferences using RLHF (Reinforcement Learning from Human Feedback).

GPT doesn’t know the world. It learned how we speak about it, at scale.
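
To make the "predict the next token, then nudge the weights" loop concrete, here is a minimal sketch in Python/NumPy. It is not GPT (no transformer, no attention): just a hypothetical one-layer bigram model that predicts the next token from the previous one, trained with plain gradient descent on a cross-entropy loss. But it shows the same ingredients the post describes: tokens as integers, a next-token objective, and weights updated from the prediction error.

```python
# Toy next-token prediction with gradient descent.
# Hypothetical illustration, not GPT's architecture: a bigram model
# (next token predicted from only the previous token).
import numpy as np

text = "the cat sat on the mat".split()

# 1) "Tokenize": map each word to an integer id.
vocab = sorted(set(text))
tok = {w: i for i, w in enumerate(vocab)}
ids = np.array([tok[w] for w in text])
V = len(vocab)

# 2) Training pairs: (previous token, next token).
x, y = ids[:-1], ids[1:]

# 3) Weights: a V x V matrix of logits, the model's only parameters.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(V, V))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

lr = 1.0
for step in range(200):
    logits = W[x]                      # predicted next-token scores
    probs = softmax(logits)
    loss = -np.log(probs[np.arange(len(x)), y]).mean()
    if step % 50 == 0:
        print(f"step {step}: loss {loss:.3f}")

    # 4) Backpropagation (hand-derived for this one-layer model):
    # gradient of cross-entropy w.r.t. the logits is probs - one_hot(y).
    grad_logits = probs
    grad_logits[np.arange(len(x)), y] -= 1
    grad_logits /= len(x)
    grad_W = np.zeros_like(W)
    np.add.at(grad_W, x, grad_logits)

    W -= lr * grad_W                   # gradient descent update

# 5) The trained weights now encode the statistics of the text.
nxt = vocab[int(np.argmax(W[tok["the"]]))]
print(f"after 'the' the model predicts: {nxt}")  # likely 'cat' or 'mat'
```

GPT does the same kind of update, just with billions of weights, a transformer instead of one matrix, and gradients computed automatically by backpropagation rather than by hand.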
1 reply
0 recast
2 reactions

shoni.eth pfp
shoni.eth
@alexpaden
the knowledge is stored in the weights!
0 reply
0 recast
1 reaction