Tencent · Open-weight · 33K context · 389B params · Tencent Hunyuan Community License

Hunyuan Large (tencent/hunyuan-large)

Hunyuan Large is Tencent's 389B-parameter mixture-of-experts (MoE) model with 52B parameters active per token, the largest open-weight Chinese MoE as of late 2024. It performs strongly on bilingual (Chinese/English) reasoning and code.
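A quick sanity check on the MoE figures above: only a fraction of the 389B parameters runs for any given token, which is why inference cost tracks closer to a ~52B dense model than a 389B one.

```python
# MoE sizing from the spec above: 389B total parameters, 52B active per token.
total_params = 389e9
active_params = 52e9

# Fraction of the network that actually executes for each token.
active_fraction = active_params / total_params
print(f"{active_fraction:.1%}")  # 13.4%
```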

Pricing across providers

No pricing data yet. Verification in progress — check back soon.

Capabilities

  • chat
  • reasoning
  • code

Languages: zh, en

Code samples
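No verified samples are published for this model yet. The sketch below shows the general shape of a call against an OpenAI-compatible provider using only the Python standard library; the base URL, API key, and exact model name are placeholders, so check your provider's documentation for the real values.

```python
import json
import urllib.request

# Placeholder values -- substitute your provider's endpoint and model id.
BASE_URL = "https://example-provider.com/v1"
MODEL = "tencent/hunyuan-large"

def build_payload(prompt: str, max_tokens: int = 1024) -> dict:
    """Assemble an OpenAI-style chat-completions request body."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,  # Hunyuan Large caps output at 4K tokens
    }

def chat(prompt: str, api_key: str) -> str:
    """POST to the provider's /chat/completions route and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Build the request body locally (no network call needed to inspect it).
payload = build_payload("用中文简要介绍一下你自己")
print(payload["model"])
```

To actually send the request, call `chat("...", api_key="YOUR_API_KEY")` with a key from whichever provider hosts the model.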

Technical specs

Context
33K
Max output
4K
Parameters
389B
Release
Training cutoff
License
Tencent Hunyuan Community License

Similar models

Compare with

  • Hunyuan Large vs Llama 3.1 405B Instruct
    Comparison planned — not yet published
  • Hunyuan Large vs MiniMax-Text-01
    Comparison planned — not yet published
  • Hunyuan Large vs Hunyuan Turbo
    Comparison planned — not yet published

Frequently asked

How much does Hunyuan Large cost?
No public pricing listed yet. Check back — we verify provider pricing on a 14-day cadence.
Is Hunyuan Large open-source? Can I fine-tune it?
Yes. Hunyuan Large is open-weight under the Tencent Hunyuan Community License. Weights are available on Hugging Face for local inference, fine-tuning, and commercial use (see the license for specific terms).
Is Hunyuan Large OpenAI-compatible?
Most listed providers expose an OpenAI-compatible API, so you can point an existing openai SDK client at the provider's base_url and use the provider's model name. See the Code samples above for a copy-pasteable example.
What's the maximum context window for Hunyuan Large?
The model supports up to 32,768 tokens of context (input + output). Some hosted versions may impose a smaller limit — check the "Context" column in the pricing matrix for each provider.
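Because input and output share the window, a simple budget calculation (assuming the 32,768-token window and 4K output cap listed above) shows how much room is left for the prompt:

```python
# Token budget, assuming a 32,768-token shared window and a 4K output cap.
CONTEXT_WINDOW = 32_768
MAX_OUTPUT = 4_096

def max_input_tokens(reserved_output: int = MAX_OUTPUT) -> int:
    """Tokens left for the prompt after reserving room for the reply."""
    return CONTEXT_WINDOW - reserved_output

print(max_input_tokens())      # 28672 when reserving the full 4K output
print(max_input_tokens(1024))  # 31744 when a short reply is enough
```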