DeepSeek AI · Open-weight · 66K context · 671B params · MIT

DeepSeek R1 (deepseek/deepseek-r1)

DeepSeek R1 is an open-weight, o1-class reasoning model trained with reinforcement learning. It exposes a visible chain-of-thought and posts top-tier scores on math and competitive-coding benchmarks. MIT licensed.

Cheapest blended: $0.96 / 1M tokens on DeepSeek · 2 providers listed

Pricing across providers

DeepSeek (deepseek-reasoner)
  Input $0.55 /1M · Output $2.19 /1M · Blended $0.96 /1M · Latency p50 920 ms · OpenAI-compatible · Verified 3d ago

Together.ai (deepseek-ai/DeepSeek-R1)
  Input $3.00 /1M · Output $7.00 /1M · Blended $4.00 /1M · Latency p50 350 ms · OpenAI-compatible · Verified 3d ago

Affiliate disclosure: We may earn a commission from qualified signups. Pricing independence is enforced at the data layer — see our Editorial Independence Policy.

Works with

Point any of these clients at a provider's base URL — they all speak at least one of this model's endpoint protocols (OPENAI_COMPATIBLE).

Capabilities

  • reasoning
  • math
  • coding
  • chain_of_thought

Languages: en, zh

Benchmarks

  • MMLU (5-shot · official · source): 90.3%
  • HumanEval (0-shot pass@1 · official · source): 89.1%

Code samples

Example using DeepSeek — the cheapest provider for this model as of the last verification. Swap base_url and model to use a different provider from the matrix above.

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.deepseek.com/v1",
)

response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
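Per DeepSeek's API reference, the deepseek-reasoner endpoint returns the model's chain-of-thought in a separate reasoning_content field alongside the final answer in content. A minimal helper to split the two — sketched here against a plain response-shaped dict for illustration, not a live API call:

```python
def split_reasoning(message: dict) -> tuple[str, str]:
    """Return (chain_of_thought, final_answer) from a chat message dict.

    DeepSeek's reasoner exposes the visible chain-of-thought in
    `reasoning_content` next to the final answer in `content`.
    """
    return message.get("reasoning_content", ""), message.get("content", "")

# Example with a response-shaped dict (stand-in for a real API response):
msg = {"reasoning_content": "First, greet the user...", "content": "Hello!"}
thought, answer = split_reasoning(msg)
```

With the SDK above, the same fields are reachable as response.choices[0].message.reasoning_content and .message.content.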

Technical specs

Context: 66K
Max output: 8K
Parameters: 671B
Release: 2025-01-20
Training cutoff: 2024-10-01
License: MIT

Frequently asked

How much does DeepSeek R1 cost?
The cheapest public provider is $0.96 per 1M blended tokens on DeepSeek. Both providers are listed above with per-input and per-output pricing.
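The blended figures in the matrix are consistent with a 3:1 input-to-output token mix. That ratio is our assumption, not something the page states, but it reproduces both rows exactly:

```python
def blended_price(input_per_1m: float, output_per_1m: float,
                  input_ratio: float = 0.75) -> float:
    """Blended $/1M tokens for a workload that is `input_ratio` input tokens."""
    return input_ratio * input_per_1m + (1 - input_ratio) * output_per_1m

print(round(blended_price(0.55, 2.19), 2))  # DeepSeek row: 0.96
print(round(blended_price(3.00, 7.00), 2))  # Together.ai row: 4.0
```

If your workload is output-heavy (long reasoning traces often are), lower input_ratio accordingly — the real blended cost will sit closer to the output price.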
How do I access DeepSeek R1 from outside China?
All providers listed above support global access. The official API (api.deepseek.com) and international endpoints such as dashscope-intl.aliyuncs.com accept international credit cards and do not require a Chinese mobile number. For privacy-sensitive workloads, third-party aggregators such as Together.ai host the model on US/EU infrastructure.
Is DeepSeek R1 open-source? Can I fine-tune it?
Yes. DeepSeek R1 is open-weight under the MIT license. Weights are available on Hugging Face for local inference, fine-tuning, and commercial use (see the license for specific terms).
Is DeepSeek R1 OpenAI-compatible?
Most listed providers expose an OpenAI-compatible API, so you can point an existing openai SDK client at the provider's base_url and use the provider's model name. See the code samples above for a copy-pasteable example.
What's the maximum context window for DeepSeek R1?
The model supports up to 65,536 tokens of context (input + output combined). Some hosted versions may impose a smaller limit — check each provider's documentation for the limit it enforces.
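Because the 65,536-token window covers input and output combined, a request's max_tokens must leave room for the prompt. A hypothetical budget check — the token counts are whatever your tokenizer reports, and the constants come from the spec sheet above:

```python
CONTEXT_WINDOW = 65_536   # model context, input + output combined
MAX_OUTPUT = 8_192        # the spec sheet's 8K max output

def max_completion_tokens(prompt_tokens: int) -> int:
    """Largest safe max_tokens value for a given prompt size."""
    remaining = CONTEXT_WINDOW - prompt_tokens
    if remaining <= 0:
        raise ValueError("prompt alone exceeds the context window")
    return min(remaining, MAX_OUTPUT)

print(max_completion_tokens(60_000))  # window-bound: 5536 tokens left
print(max_completion_tokens(1_000))   # output-bound: capped at 8192
```

Reasoning models burn window on the chain-of-thought too, so budgeting conservatively here avoids mid-generation truncation.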