LLM·Dex

OLMo 2 13B

Allen AI's fully-open language model, Apache-2.0, with reproducible training pipeline.


Quick facts

Released: Nov 2024
Context: 4.1K tokens
Output / 1M tokens: pricing not published
License: Apache-2.0

About OLMo 2 13B

OLMo 2 is the most reproducible open-weight model available: Allen AI ships the full training stack alongside the weights. The 13B variant lands at competitive quality for its size and is the go-to research model when reproducibility matters.

For academic research, ablation studies, and projects that need to understand the training pipeline end-to-end, OLMo 2 is the clear leader. It's not the right choice for production chat, but it's the right choice for knowing exactly what your model has seen.

Benchmarks

Published scores from Allen AI's model card or independent leaderboards. We do not publish numbers we cannot source; see methodology.

  • HumanEval: Python coding pass@1
  • MMLU: Broad academic knowledge
  • GPQA: Graduate-level reasoning
  • SWE-bench: Real software-engineering tasks

Benchmark scores not yet available. We only publish numbers we can source from official model cards or independent leaderboards; see methodology.

Capabilities

Strengths

  • Fully reproducible
  • Apache-2.0
  • Research-friendly

Tracked weaknesses

  • Short context
  • Quality below frontier 13Bs

Pricing

Per-million-token rates as published by Allen AI.

Per-token pricing not yet published for OLMo 2 13B. Check the official provider site for current tiers.

Call OLMo 2 13B from your code

Drop-in snippet using the OpenAI-compatible SDK. Set your API key in the environment and run.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  // Use OPENAI_API_KEY for OpenAI, or your provider's key + baseURL.
  apiKey: process.env.OPENAI_API_KEY,
  // baseURL: "https://your-provider.example/v1", // set if not calling OpenAI
});

const completion = await client.chat.completions.create({
  model: "olmo-2-13b",
  messages: [
    { role: "user", content: "What's the time complexity of quicksort?" },
  ],
});

console.log(completion.choices[0].message.content);
```
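Since no hosted API pricing is published for this model, the likeliest setup is pointing the same SDK at a server you run yourself. A minimal sketch of that configuration, assuming a local OpenAI-compatible endpoint (the URL and environment-variable names below are placeholders, not part of any official API):

```typescript
// Sketch: config for a self-hosted, OpenAI-compatible endpoint (e.g. a local
// inference server). OLMO_BASE_URL / OLMO_API_KEY are assumed names; adjust
// for your deployment.
const config = {
  baseURL: process.env.OLMO_BASE_URL ?? "http://localhost:8000/v1",
  apiKey: process.env.OLMO_API_KEY ?? "unused", // many local servers ignore the key
};

// Pass this to `new OpenAI(config)` in the snippet above, and use whatever
// model id your server exposes for the OLMo 2 13B weights.
console.log(config.baseURL);
```

The only change from the hosted-API snippet is `baseURL`; the request and response shapes stay the same, which is the point of OpenAI-compatible servers.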


Frequently asked

  • How much does OLMo 2 13B cost?
    Allen AI has not published per-token API pricing for OLMo 2 13B at the time of writing. Check the official site for current pricing tiers, or compare against alternative models on LLMDex.
  • What is OLMo 2 13B's context window?
    OLMo 2 13B supports a context window of 4.1K tokens.
  • Is OLMo 2 13B open source?
    OLMo 2 13B ships with open weights under the Apache-2.0 license. You can self-host it, fine-tune it, and (subject to the license terms) deploy it commercially.
  • When was OLMo 2 13B released?
    OLMo 2 13B was released on Nov 26, 2024 by Allen AI.
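The 4.1K-token context window is a hard budget shared between prompt and completion. A minimal sketch of that arithmetic, assuming the window is 4,096 tokens (token counts here are illustrative; use a real tokenizer for exact numbers):

```typescript
// Context budget for OLMo 2 13B: prompt tokens + output tokens <= window.
const CONTEXT_WINDOW = 4096;

// How many tokens remain for the model's reply, optionally reserving some
// headroom (e.g. for a system prompt added later).
function maxOutputTokens(promptTokens: number, reserve = 0): number {
  return Math.max(CONTEXT_WINDOW - promptTokens - reserve, 0);
}

console.log(maxOutputTokens(3000)); // 1096 tokens left for the reply
console.log(maxOutputTokens(4200)); // 0 — the prompt alone overflows the window
```

Anything past the budget must be truncated or summarized before the call; the model cannot see it.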