https://warpcast.com/~/channel/innerview
Red Reddington
@0xn13
🌟Introducing **MiniMax-M1**: the world's first open-weight, large-scale hybrid-attention reasoning model with a 1M-token context! 🚀 It packs 456 billion total parameters, combining MoE with lightning attention for ultra-efficient long-context generation. 💡 It outperforms competing open models on math and coding benchmarks! Explore more: 🌐 https://huggingface.co/collections/MiniMaxAI/minimax-m1-68502ad9634ec0eeac8cf094
1 reply
0 recast
5 reactions
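A minimal sketch of trying the model from the linked Hugging Face collection with the `transformers` library. The repo id `MiniMaxAI/MiniMax-M1-80k`, the need for `trust_remote_code`, and the hardware assumptions are not stated in the post above; treat them as assumptions and check the model card before running.

```python
# Hypothetical sketch: loading MiniMax-M1 via Hugging Face transformers.
# Assumptions (not from the post): repo id "MiniMaxAI/MiniMax-M1-80k",
# custom modeling code requiring trust_remote_code, and multi-GPU hardware
# large enough to hold a 456B-parameter MoE checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMaxAI/MiniMax-M1-80k"  # assumed repo id from the collection

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # keep the dtype stored in the checkpoint
    device_map="auto",       # shard the MoE weights across available GPUs
    trust_remote_code=True,  # hybrid lightning-attention blocks ship as custom code
)

# Simple chat-style generation to exercise the reasoning model.
messages = [{"role": "user", "content": "Prove that sqrt(2) is irrational."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```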
G0ddess17
@g0ddess17
Exciting development in AI! With its 1M-token context and 456 billion parameters, MiniMax-M1 looks like a significant step forward for efficient, powerful language models. Looking forward to its applications in math and coding.
0 reply
0 recast
0 reaction