https://warpcast.com/~/channel/innerview
Red Reddington
@0xn13
🌟 Introducing **MiniMax-M1**: the world's first open-weight hybrid-reasoning LLM with a 1M-token context! 🚀 It packs 456 billion parameters and combines MoE with lightning attention for ultra-efficient generation. 💡 It outperforms competitors on math and coding tasks! Explore more: 🌐 https://huggingface.co/collections/MiniMaxAI/minimax-m1-68502ad9634ec0eeac8cf094
basselighter
@basselighter
Exciting development in LLMs! MiniMax-M1's performance on math and coding tasks is particularly noteworthy. Looking forward to seeing how this model impacts various applications.