LLM·Dex
Proprietary · OpenAI · mid · text · vision

o4-mini

Smaller, faster, cheaper member of OpenAI's reasoning-model family, great latency-cost balance for hard tasks.

Quick facts

Released: Apr 2025
Context: 200K tokens
Output: $4.40 / 1M tokens
License: Proprietary

About o4-mini

o4-mini is OpenAI's April 2025 reasoning-tier release, sized between o3-mini and o3 in compute but with significantly improved tool use and multimodal reasoning. It remains one of the most popular reasoning models for production agent loops because it is noticeably cheaper than the full o-series flagships.

Benchmarks

Published scores come from OpenAI's model card or independent leaderboards. We do not publish numbers we cannot source; see our methodology.

Benchmark    Score    Description
HumanEval    —        Python coding pass@1
MMLU         —        Broad academic knowledge
GPQA         81.4     Graduate-level reasoning
SWE-bench    —        Real software-engineering tasks

Capabilities

Strengths

  • Strong reasoning at mid-tier price
  • Fast for a thinking model
  • Solid tool-use

Tracked weaknesses

  • Reasoning depth still trails o3 / o4 on hardest tasks
  • Not the right choice for routine chat

Pricing

Per-million-token rates as published by OpenAI.

Tier      Price                Notes
Input     $1.10 / 1M tokens    Tokens you send to the model
Output    $4.40 / 1M tokens    Tokens the model generates
Context   200K tokens          Max combined input + output
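At these rates, a request costs input tokens × $1.10 per million plus output tokens × $4.40 per million. A small sketch with a hypothetical `estimateCost` helper:

```typescript
// Published o4-mini rates (USD per 1M tokens).
const INPUT_PER_M = 1.10;
const OUTPUT_PER_M = 4.40;

// Hypothetical helper: estimate the dollar cost of one request.
function estimateCost(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_PER_M +
    (outputTokens / 1_000_000) * OUTPUT_PER_M
  );
}

// e.g. a 2,000-token prompt with a 500-token reply:
console.log(estimateCost(2_000, 500).toFixed(4)); // "0.0044" — under half a cent
```

Note that for reasoning models, hidden reasoning tokens are billed as output tokens, so replies typically cost more than their visible text alone suggests.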

Call o4-mini from your code

Drop-in snippet for the OpenAI SDK. Set your API key in the environment and run.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  // Use OPENAI_API_KEY for OpenAI, or your provider's key + baseURL.
  apiKey: process.env.OPENAI_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "o4-mini",
  messages: [
    { role: "user", content: "What's the time complexity of quicksort?" },
  ],
});

console.log(completion.choices[0].message.content);
```

Frequently asked

  • How much does o4-mini cost per million tokens?
    o4-mini is priced at $1.10 / 1M tokens for input tokens and $4.40 / 1M tokens for output tokens via the official OpenAI API at the time of writing.
  • What is o4-mini's context window?
    o4-mini supports a context window of 200K tokens.
  • Is o4-mini open source?
No. o4-mini is a closed-weight model; you can use it via OpenAI's API, but the model weights are not publicly downloadable.
  • When was o4-mini released?
    o4-mini was released on Apr 16, 2025 by OpenAI.
  • What is o4-mini's knowledge cutoff?
    o4-mini's training data has a knowledge cutoff of Jun 2024. For information after that date you'll need a tool-use or web-search wrapper.
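As that last answer notes, anything past the cutoff needs tool use or a web-search wrapper. A minimal sketch of a tool definition you could pass alongside the snippet above — the `search_web` name and its backend are our assumptions, not part of OpenAI's API:

```typescript
// Sketch of a web-search tool definition for the Chat Completions API.
// `search_web` is a hypothetical tool; wire its execution to whatever
// search service you actually use.
type ToolDef = {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>;
  };
};

const searchWebTool: ToolDef = {
  type: "function",
  function: {
    name: "search_web",
    description: "Look up information newer than the model's Jun 2024 cutoff.",
    parameters: {
      type: "object",
      properties: {
        query: { type: "string", description: "The search query to run" },
      },
      required: ["query"],
    },
  },
};

// Pass `tools: [searchWebTool]` in the create() call; when the response
// message contains `tool_calls`, run the search and return its result to
// the model as a `tool`-role message.
```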