Nicole
@0xsage
GPT-4 is apparently 16 models, 1.8 trillion params, and trained on 13 trillion tokens 🤯 https://threadreaderapp.com/thread/1678545170508267522.html