LLM·Dex
Open weights · Mistral · small · text

Ministral 8B

Mistral's 8B edge model, designed specifically for on-device and on-prem deployment.

Quick facts

Released: Oct 2024
Context: 128K tokens
Output / 1M tokens: pricing not published
License: Mistral Research License

About Ministral 8B

Ministral 8B is purpose-built for edge inference: optimised memory footprint, strong throughput on consumer GPUs, and open weights under the Mistral Research License (commercial deployment requires a separate agreement with Mistral).

For on-device chat and lightweight reasoning, Ministral 8B performs strongly against other 8B-class models in published comparisons, particularly for European languages.

Benchmarks

Published scores from Mistral's model card or independent leaderboards. We do not publish numbers we cannot source; see methodology.

  • HumanEval: Python coding, pass@1
  • MMLU: broad academic knowledge
  • GPQA: graduate-level reasoning
  • SWE-bench: real software-engineering tasks
Benchmark scores not yet available. We only publish numbers we can source from official model cards or independent leaderboards; see methodology.

Capabilities

Strengths

  • Edge-optimized
  • Strong 8B-class quality

Tracked weaknesses

  • Research license restricts unmodified commercial deployment

Pricing

Per-million-token rates as published by Mistral.

Per-token pricing not yet published for Ministral 8B. Check the official provider site for current tiers.

Call Ministral 8B from your code

Drop-in TypeScript snippet using the OpenAI SDK against an OpenAI-compatible endpoint. Set your API key in the environment and run.

typescript
import OpenAI from "openai";

const client = new OpenAI({
  // Mistral's API is OpenAI-compatible; swap the baseURL and key for another provider.
  apiKey: process.env.MISTRAL_API_KEY,
  baseURL: "https://api.mistral.ai/v1",
});

const completion = await client.chat.completions.create({
  // The exact model id varies by provider; check your provider's catalog.
  model: "ministral-8b-latest",
  messages: [
    { role: "user", content: "What's the time complexity of quicksort?" },
  ],
});

console.log(completion.choices[0].message.content);
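Edge and on-prem deployments often sit behind flaky links, so calls like the one above usually want a retry. A minimal sketch with exponential backoff; the attempt count and delays are arbitrary illustrative defaults, not SDK features:

```typescript
// Retry an async call with exponential backoff.
// `attempts` and `baseDelayMs` are illustrative defaults.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Wait baseDelayMs, 2x, 4x, ... before the next attempt.
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastErr;
}
```

Usage: `const completion = await withRetry(() => client.chat.completions.create({ ... }));`. A production version would also filter on which errors are retryable (timeouts and 429s, not auth failures).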

Best for

Tasks where Ministral 8B ranks among LLMDex's top picks.

Compare Ministral 8B with…

Frequently asked

  • How much does Ministral 8B cost?
    Mistral has not published per-token API pricing for Ministral 8B at the time of writing. Check the official site for current pricing tiers, or compare against alternative models on LLMDex.
  • What is Ministral 8B's context window?
    Ministral 8B supports a context window of 128K tokens.
  • Is Ministral 8B open source?
Ministral 8B ships with open weights under the Mistral Research License. You can self-host it and fine-tune it for research and evaluation; commercial deployment requires a separate commercial license from Mistral.
  • When was Ministral 8B released?
    Ministral 8B was released on Oct 16, 2024 by Mistral.
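The 128K-token window mentioned above still needs client-side budgeting for long chat histories. A minimal trimming sketch, assuming a crude 4-characters-per-token heuristic (Mistral's real tokenizer differs; use an exact tokenizer for production counts):

```typescript
// Rough token estimate: ~4 characters per token. This is a heuristic,
// not Mistral's actual tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Drop the oldest messages until the history fits the token budget.
function trimToBudget(
  messages: { role: string; content: string }[],
  budgetTokens: number,
): { role: string; content: string }[] {
  const kept: { role: string; content: string }[] = [];
  let used = 0;
  // Walk newest-to-oldest so recent turns survive.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].content);
    if (used + cost > budgetTokens) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}
```

In practice you would reserve part of the budget for the model's output and pin the system prompt rather than letting it fall out with the oldest turns.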