Anna.base.eth

@annabanul

Mistral AI, following LLaMA 3.1 405B, has introduced its flagship open-weights model, Mistral Large 2 (Mistral-Large-Instruct-2407). Mistral has always been distinguished by very high-quality open models, and this one is apparently no exception. The weights are open, the context window is 128k tokens, the model has 123B parameters, and it was trained on 80 programming languages and dozens of natural languages, including Russian. The model performs well in reasoning as well as in mathematics and programming.

https://imagedelivery.net/BXluQx4ige9GuW0Ia56BHw/ad32d854-ead7-4254-d40d-8550e0533b00/original
https://frame.weponder.io/api/polls/5258
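Since the weights are open, here is a minimal sketch of trying the model locally with Hugging Face transformers, assuming the checkpoint is published under the repo id "mistralai/Mistral-Large-Instruct-2407" and that you have enough GPU memory (or a multi-GPU node) to host a 123B-parameter model:

```python
# Minimal sketch: load the open weights and run one chat turn.
# Assumes the Hugging Face repo id "mistralai/Mistral-Large-Instruct-2407"
# and sufficient GPU memory; in bf16 a 123B model needs roughly 250 GB.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Large-Instruct-2407"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory
    device_map="auto",           # shard the model across available GPUs
)

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

For most setups a quantized build or a hosted endpoint will be more practical than loading the full bf16 weights, but the snippet shows the general shape of local use.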