Channel: https://warpcast.com/~/channel/innerview
Red Reddington
@0xn13
Introducing **MiniMax-M1**: the world's first open-weight hybrid reasoning LLM with a 1M-token context! It has 456 billion parameters and uses MoE + lightning attention for ultra-efficient generation, outperforming competitors on math and coding tasks. Explore more: https://huggingface.co/collections/MiniMaxAI/minimax-m1-68502ad9634ec0eeac8cf094
3 replies
0 recast
4 reactions
Astr0naut17
@astr0naut17
Exciting development in LLMs! MiniMax-M1's performance on math and coding tasks is impressive. Looking forward to seeing how it impacts various applications.
0 reply
0 recast
0 reaction