
InternLM

Open-source LLM family (书生·浦语) from Shanghai AI Laboratory supporting chat, math, coding and multimodal (InternVL) workloads.

CN · Est. 2023 · Active · AI API / SDK for Developers

Our Verdict

A strong open-weight Chinese LLM family for teams that want to self-host multimodal models.

Pros

  • Fully open weights including multimodal InternVL
  • Competitive on reasoning and code benchmarks
  • Backed by credible research lab

Cons

  • No hosted API: running inference infrastructure is your responsibility
  • English performance lags on some tasks
  • Ecosystem smaller than Qwen or Llama
Best for: Research teams and China-based developers self-hosting open LLMs

Not for: Product teams wanting a managed API like OpenAI or Anthropic

When to Use InternLM

Good fit if you need

  • Running state-of-the-art Chinese open-source LLMs locally
  • Fine-tuning InternLM models for domain-specific tasks
  • Building long-context reasoning chains with 1M token support
  • Benchmarking InternLM2 against GPT-4 on Chinese reasoning tasks
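The local-hosting workflow above can be sketched with Hugging Face transformers. This is a hedged sketch, not InternLM's official quickstart: the model id `internlm/internlm2_5-7b-chat` and the `.chat()` convenience method follow the pattern on InternLM's model cards, but verify both against the card for the checkpoint you actually pull.

```python
# Sketch: self-hosting an InternLM chat checkpoint with transformers.
# Assumptions (check the model card): model id "internlm/internlm2_5-7b-chat",
# and a .chat() helper shipped via the repo's custom code.

def load_internlm(model_id: str = "internlm/internlm2_5-7b-chat"):
    # Imports kept inside the function so the sketch reads standalone;
    # InternLM repos ship custom modeling code, hence trust_remote_code=True.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype="auto",   # pick fp16/bf16 automatically on GPU
        device_map="auto",    # shard across available devices
    )
    return tokenizer, model


def ask(tokenizer, model, prompt: str) -> str:
    # InternLM chat checkpoints expose .chat() through their remote code;
    # the returned history carries multi-turn context if you keep it.
    response, _history = model.chat(tokenizer, prompt, history=[])
    return response
```

For production serving (including the 1M-token long-context variants), InternLM's own LMDeploy toolkit is the more common path than raw transformers.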

Lock-in Assessment

Lock-in Score: 5/5 (Low)

InternLM Pricing

Pricing Model: Free
Free Tier: Yes
Entry Price: —
Enterprise Available: No
Transparency Score: —

Beta: estimates may differ from actual pricing.


Estimated Monthly Cost: $25

Estimated Annual Cost: $300

Estimates are approximate and may not reflect current pricing. Always check the official pricing page.
