LLM·Dex

Command R+ (08-2024) vs Llama 4 405B

A complete head-to-head: pricing, context window, benchmarks, modality coverage, and openness, with a programmatic verdict synthesized from the underlying data.

Verdict by category
  • Price: Command R+ (08-2024)

    Command R+ (08-2024) publishes pricing ($10.00 / 1M output tokens) while Llama 4 405B does not.

  • Context window: Llama 4 405B

    Llama 4 405B accepts 256K tokens vs 128K, 2.0× the room for long documents and codebases.

  • Benchmarks: Tie

    No directly comparable public benchmarks are available for both models; check the spec sheets for individual scores.

  • Modalities: Llama 4 405B

    Llama 4 405B supports 2 modalities (text, vision) vs 1 for Command R+ (08-2024).

  • Openness: Tie

    Both ship open weights, so you can self-host either one.
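
The per-category verdict above can be reproduced mechanically from the spec data. A minimal sketch of that synthesis, with values hard-coded from this page's spec table (the field names and the `verdict` function are illustrative, not the site's actual code):

```python
# Spec values copied from the side-by-side table on this page.
SPECS = {
    "Command R+ (08-2024)": {
        "output_price": 10.00, "context": 128_000,
        "modalities": {"text"}, "open_weights": True,
    },
    "Llama 4 405B": {
        "output_price": None, "context": 256_000,
        "modalities": {"text", "vision"}, "open_weights": True,
    },
}

def verdict(a, b):
    """Count category wins between two models; ties add no wins."""
    sa, sb = SPECS[a], SPECS[b]
    wins = {a: 0, b: 0}
    # Price: a model that publishes pricing beats one that does not.
    if (sa["output_price"] is None) != (sb["output_price"] is None):
        wins[a if sa["output_price"] is not None else b] += 1
    # Context window: the larger window wins.
    if sa["context"] != sb["context"]:
        wins[a if sa["context"] > sb["context"] else b] += 1
    # Modalities: broader coverage wins.
    if len(sa["modalities"]) != len(sb["modalities"]):
        wins[a if len(sa["modalities"]) > len(sb["modalities"]) else b] += 1
    # Benchmarks and openness are ties for this pair, so nothing is added.
    return wins

print(verdict("Command R+ (08-2024)", "Llama 4 405B"))
# → {'Command R+ (08-2024)': 1, 'Llama 4 405B': 2}
```

This matches the 2-to-1 tally in the summary below: Llama 4 405B takes context window and modalities, Command R+ takes price, and the remaining two categories tie.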

On balance Llama 4 405B edges ahead, winning 2 of 5 categories against Command R+ (08-2024)'s 1. Command R+ (08-2024) is the only one of the pair with published pricing ($10.00 / 1M output tokens), while Llama 4 405B offers twice the context window (256K tokens vs 128K), more room for long documents and codebases.

No directly comparable public benchmarks exist for both models; check each spec sheet for individual scores. The two also differ in modality coverage: Command R+ (08-2024) handles text only, while Llama 4 405B handles text and vision, which can be the deciding factor before you even look at benchmarks. Both ship open weights, so you can self-host either one.

Llama 4 405B is the newer of the two, released 7 months after Command R+ (08-2024), which usually means a more recent knowledge cutoff and updated safety post-training. Command R+ (08-2024) is usually picked for RAG and data-extraction workloads, while Llama 4 405B sees more deployments in open-source LLM and fine-tuning use cases. If pricing matters more than every last benchmark point, run the numbers in the calculator below before committing.

Side-by-side specs

Spec | Command R+ (08-2024) | Llama 4 405B
Provider | Cohere | Meta
Released | Aug 2024 | Apr 2025
Modalities | text | text, vision
Context window | 128K tokens | 256K tokens
Max output | — | —
Input · 1M | $2.50 / 1M tokens | Pricing not published
Output · 1M | $10.00 / 1M tokens | Pricing not published
Knowledge cutoff | — | 2024-12
Open weights | Yes (CC-BY-NC 4.0) | Yes (Llama 4 Community License)
API available | Yes | Yes

Pricing at scale

What you'd actually pay per month (30 days) at typical daily workloads. Numbers come from each model's published per-million-token rates.

  • Light usage, 100k in / 50k out per day: $22.50 vs —
  • Heavy usage, 1M in / 500k out per day: $225 vs —
  • RAG workload, 5M in / 200k out per day: $435 vs —

At all three workloads the comparison is one-sided: Llama 4 405B publishes no per-token rates, so only Command R+ (08-2024)'s spend can be computed.
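
The workload figures above follow directly from the daily token volumes and Command R+ (08-2024)'s published rates ($2.50 input / $10.00 output per 1M tokens). A minimal sketch of the arithmetic (the 30-day month is an assumption made to match the table's totals):

```python
def monthly_cost(in_tokens_per_day, out_tokens_per_day,
                 in_rate_per_m, out_rate_per_m, days=30):
    """Monthly spend from daily token volumes and per-1M-token rates."""
    daily = ((in_tokens_per_day / 1e6) * in_rate_per_m
             + (out_tokens_per_day / 1e6) * out_rate_per_m)
    return daily * days

# Command R+ (08-2024): $2.50 in / $10.00 out per 1M tokens
print(monthly_cost(100_000, 50_000, 2.50, 10.00))     # light → 22.5
print(monthly_cost(1_000_000, 500_000, 2.50, 10.00))  # heavy → 225.0
print(monthly_cost(5_000_000, 200_000, 2.50, 10.00))  # RAG   → 435.0
```

Note how output tokens dominate the light workload (four times the input cost) while input dominates the RAG workload, which is why retrieval-heavy pipelines are so sensitive to input pricing.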

Price calculator

Estimated spend for the listed models at your usage. Numbers are derived from each model's published per-million-token rates.

  • Command R+ (08-2024): $0.750
  • Llama 4 405B: Pricing unavailable
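
When one model has no public rates, as with Llama 4 405B here, a calculator has to surface that rather than print a number. A hedged sketch of that logic (the `RATES` table and `estimate` function are illustrative, built from this page's spec data, not the site's actual implementation):

```python
RATES = {
    # (input $/1M tokens, output $/1M tokens); None = no published pricing
    "Command R+ (08-2024)": (2.50, 10.00),
    "Llama 4 405B": None,
}

def estimate(model, in_tokens, out_tokens):
    """Estimated spend in dollars, or None when pricing is unpublished."""
    rates = RATES.get(model)
    if rates is None:
        return None  # caller renders this as "Pricing unavailable"
    in_rate, out_rate = rates
    return (in_tokens / 1e6) * in_rate + (out_tokens / 1e6) * out_rate

print(estimate("Command R+ (08-2024)", 100_000, 50_000))  # → 0.75
print(estimate("Llama 4 405B", 100_000, 50_000))          # → None
```

Returning `None` instead of zero keeps "free" and "unknown" distinct, which matters for open-weight models where self-hosting cost is real but unpublished.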

Benchmarks compared

Only sourced numbers. Where a benchmark is missing for one model we show the available value rather than fabricating the other.

Benchmark | Command R+ (08-2024) | Llama 4 405B
MMLU | 75.7 | —

Pick Command R+ (08-2024) if

Command R+ (08-2024) fits when…

  • RAG-tuned
  • Citation generation
  • Enterprise tooling

Pick Llama 4 405B if

Llama 4 405B fits when…

  • Strongest open-weight model at launch
  • Multimodal (text + vision)
  • Wide hosting availability
  • Long-context tasks: accepts 256K tokens vs 128K for Command R+ (08-2024)
  • Multimodal needs: handles vision in addition to text

Don't want either?

Consider Command R

Cohere's mid-tier RAG-optimized model, affordable and reliable on retrieval workloads.

Frequently asked

  • Is Command R+ (08-2024) or Llama 4 405B cheaper?
    Per-token pricing isn't published for at least one of these models; check each model's spec page for current rates.
  • Which has the larger context window?
    Llama 4 405B accepts 256K tokens vs 128K for Command R+ (08-2024).
  • Is Command R+ (08-2024) or Llama 4 405B better for coding?
    Both Command R+ (08-2024) and Llama 4 405B are competitive on coding benchmarks. See each model's individual spec page for HumanEval and SWE-bench scores where published. For an opinionated pick, consult our Best LLM for Coding ranking.
  • Are either of these models open source?
    Both ship with open weights. Command R+ (08-2024) is licensed under CC-BY-NC 4.0; Llama 4 405B under Llama 4 Community License.
  • When were Command R+ (08-2024) and Llama 4 405B released?
    Command R+ (08-2024) was released by Cohere on 2024-08-30. Llama 4 405B was released by Meta on 2025-04-05.