LLM·Dex

Mixtral 8×22B

Mistral's largest open-weight MoE, Apache-2.0, still widely deployed.

Quick facts

Released: Apr 2024
Context: 64K tokens
Output / 1M tokens: pricing not published
License: Apache-2.0

About Mixtral 8×22B

Mixtral 8×22B was Mistral's open-weight flagship for much of 2024 and the largest Apache-2.0-licensed model of its era. It remains in production use for teams that need clean licensing terms and strong inference economics.

Benchmarks

Published scores from Mistral's model card or independent leaderboards. We do not publish numbers we cannot source; see our methodology.

HumanEval: not published (Python coding, pass@1)
MMLU: 77.8 (broad academic knowledge)
GPQA: not published (graduate-level reasoning)
SWE-bench: not published (real software-engineering tasks)

Capabilities

Strengths

  • Apache-2.0
  • MoE economics
  • Mature and widely deployed

Tracked weaknesses

  • Older generation
  • 64K context

Pricing

Per-token API pricing has not been published for Mixtral 8×22B. Check the official provider site for current tiers.

Call Mixtral 8×22B from your code

Drop-in snippet using the OpenAI SDK pointed at Mistral's OpenAI-compatible endpoint. Set MISTRAL_API_KEY in your environment and run.

typescript
import OpenAI from "openai";

const client = new OpenAI({
  // Mistral's API speaks the OpenAI chat-completions schema.
  apiKey: process.env.MISTRAL_API_KEY,
  baseURL: "https://api.mistral.ai/v1",
});

const completion = await client.chat.completions.create({
  // Model id from Mistral's catalog; verify against GET /v1/models.
  model: "open-mixtral-8x22b",
  messages: [
    { role: "user", content: "What's the time complexity of quicksort?" },
  ],
});

console.log(completion.choices[0].message.content);

Best for

Tasks where Mixtral 8×22B ranks among LLMDex's top picks.


Frequently asked

  • How much does Mixtral 8×22B cost?
    Mistral has not published per-token API pricing for Mixtral 8×22B at the time of writing. Check the official site for current pricing tiers, or compare against alternative models on LLMDex.
  • What is Mixtral 8×22B's context window?
    Mixtral 8×22B supports a context window of 64K tokens.
  • Is Mixtral 8×22B open source?
    Mixtral 8×22B ships with open weights under the Apache-2.0 license. You can self-host it, fine-tune it, and (subject to the license terms) deploy it commercially.
  • When was Mixtral 8×22B released?
    Mixtral 8×22B was released on Apr 10, 2024 by Mistral.