
SmolLM2 1.7B

HuggingFace's tiny model line, punching above its weight on a strict on-device budget.

Quick facts

Released: Nov 2024
Context: 8.2K tokens
Output / 1M: Pricing not published
License: Apache-2.0

About SmolLM2 1.7B

SmolLM2 1.7B is HuggingFace's contribution to the truly-small-model space. Apache-2.0 licensed and surprisingly capable for its parameter count, with a focused training regime aimed at chat and instruction-following rather than reasoning.

For edge devices where sub-2B parameters is the hard constraint, SmolLM2 is one of the few options that is both fully open and shipped with reasonable safety post-training.
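The "runs on phones" claim can be sanity-checked with back-of-envelope arithmetic. A minimal sketch of the weights-only memory footprint at common precisions; real deployments also need room for the KV cache and runtime overhead:

```typescript
// Weights-only memory footprint for a 1.7B-parameter model.
const PARAMS = 1.7e9;

function weightBytes(bitsPerParam: number): number {
  return (PARAMS * bitsPerParam) / 8;
}

// fp16, int8, and int4 are the precisions most runtimes offer.
for (const bits of [16, 8, 4]) {
  const gib = weightBytes(bits) / 1024 ** 3;
  console.log(`${bits}-bit: ~${gib.toFixed(1)} GiB`);
  // 16-bit: ~3.2 GiB, 8-bit: ~1.6 GiB, 4-bit: ~0.8 GiB
}
```

At 4-bit quantization the weights fit in under a gigabyte, which is what makes phone-class deployment plausible.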

Benchmarks

Published scores from HuggingFace's model card or independent leaderboards. We do not publish numbers we cannot source; see methodology.

  • HumanEval: Python coding pass@1
  • MMLU: Broad academic knowledge
  • GPQA: Graduate-level reasoning
  • SWE-bench: Real software-engineering tasks
Benchmark scores not yet available. We only publish numbers we can source from official model cards or independent leaderboards; see methodology.

Capabilities

Strengths

  • Truly tiny
  • Apache-2.0
  • Runs on phones

Tracked weaknesses

  • Quality very limited
  • No hosted API

Pricing

Per-million-token rates as published by HuggingFace.

Per-token pricing not yet published for SmolLM2 1.7B. Check the official provider site for current tiers.

Call SmolLM2 1.7B from your code

Drop-in snippet using the OpenAI-compatible SDK. SmolLM2 1.7B has no hosted API, so serve it yourself (e.g. with vLLM, llama.cpp, or Ollama) and point the client's base URL at your server.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  // SmolLM2 has no hosted API: point baseURL at a local
  // OpenAI-compatible server, e.g. "http://localhost:8000/v1".
  baseURL: process.env.OPENAI_BASE_URL,
  apiKey: process.env.OPENAI_API_KEY,
});

const completion = await client.chat.completions.create({
  // The model id depends on how your server registered the weights.
  model: "smollm2-1-7b",
  messages: [
    { role: "user", content: "What's the time complexity of quicksort?" },
  ],
});

console.log(completion.choices[0].message.content);
```

Best for

Tasks where SmolLM2 1.7B ranks among LLMDex's top picks.


Frequently asked

  • How much does SmolLM2 1.7B cost?
HuggingFace has not published per-token API pricing for SmolLM2 1.7B at the time of writing. Check the official site for current pricing tiers, or compare against alternative models on LLMDex.
  • What is SmolLM2 1.7B's context window?
    SmolLM2 1.7B supports a context window of 8.2K tokens.
  • Is SmolLM2 1.7B open source?
    SmolLM2 1.7B ships with open weights under the Apache-2.0 license. You can self-host it, fine-tune it, and (subject to the license terms) deploy it commercially.
  • When was SmolLM2 1.7B released?
SmolLM2 1.7B was released on Nov 1, 2024 by HuggingFace.
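The 8.2K-token window turns into a simple budget check before you send a request. A minimal sketch, assuming the rough ~4-characters-per-token heuristic for English text (real counts require the model's tokenizer):

```typescript
// Check that a prompt plus reserved output fits SmolLM2's context window.
const CONTEXT_TOKENS = 8192;

// Heuristic: roughly 4 characters per token for English prose.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function fitsContext(prompt: string, maxOutputTokens: number): boolean {
  return estimateTokens(prompt) + maxOutputTokens <= CONTEXT_TOKENS;
}

console.log(fitsContext("a".repeat(4000), 512)); // 1000 + 512 <= 8192 → true
```

If the check fails, trim the prompt or lower the reserved output budget rather than letting the server truncate silently.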