MistralAI debuts Mixtral 8x22B, one of the most powerful open-source AI models yet

From SiliconANGLE: 2024-04-10 20:11:20

Mistral AI, a Paris-based AI startup, has launched the new Mixtral 8x22B model, aiming to compete with industry giants like OpenAI’s GPT-3.5 and Meta’s Llama 2. With a 65,000-token context window and 176 billion parameters, it’s one of Mistral’s most powerful models yet, and it is available for download and fine-tuning on platforms like Hugging Face and Together AI.

The release lands amid a flurry of rival announcements: OpenAI unveiled GPT-4 Turbo with Vision, Google made its Gemini Pro 1.5 LLM generally available, and Meta announced plans for Llama 3. Mixtral 8x22B is expected to surpass Mistral’s previous Mixtral 8x7B model, using an efficient “mixture-of-experts” architecture to deliver high performance across tasks.
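The “mixture-of-experts” idea mentioned above can be illustrated with a toy sketch: a small gating network scores several expert sub-networks for each token, and only the top-scoring few are actually run, keeping compute low relative to total parameter count. The dimensions, random weights, and function names below are illustrative assumptions, not Mixtral’s actual configuration.

```python
import numpy as np

# Toy sketch of top-k mixture-of-experts routing (illustrative only;
# sizes and weights are made up, not Mixtral 8x22B's real parameters).
rng = np.random.default_rng(0)

N_EXPERTS = 8   # number of expert feed-forward blocks per layer
TOP_K = 2       # experts actually activated per token
D_MODEL = 16    # toy hidden size

# Each "expert" is a random linear map standing in for a feed-forward block.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS))  # gating network weights

def moe_layer(x):
    """Route one token vector x through its top-k experts and mix the results."""
    logits = x @ router                    # one router score per expert
    top = np.argsort(logits)[-TOP_K:]      # indices of the k best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    # Only the chosen experts run; their outputs are combined by gate weight.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D_MODEL)
out = moe_layer(token)
```

The key property this sketch shows is why such models are efficient: although all eight experts’ parameters exist, each token touches only two of them per layer.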

The open-source release has drawn positive reactions from the AI community, but Mistral AI also faces criticism over the potential misuse of its “frontier models,” since the startup has no way to prevent harmful use of technology it releases openly. Still, its advancements in open-source generative AI, exemplified by Mixtral 8x22B, give researchers and developers access to advanced models under a permissive license, raising hopes for applications in customer service, drug discovery, and climate modeling.