LLM Comparison

DeepSeek V3.2 vs MiniCPM

Side-by-side specs, pricing & capabilities · Updated May 2026

| Spec | DeepSeek V3.2 | MiniCPM |
|---|---|---|
| Organization | DeepSeek | OpenBMB |
| OpenTools Score | 53 | 156 |
| Family | DeepSeek | MiniCPM |
| Status | Current | Current |
| Release Date | Dec 2025 | Not listed |
| Context Window | 164K tokens | 128K tokens |
| Input Price | $0.26/M tokens | Free |
| Output Price | $0.42/M tokens | Free |
| Pricing Notes | Cache read: $0.1350/M tokens | Open-weight model family on GitHub and Hugging Face; there is no fixed vendor API price, so runtime cost depends on the host, hardware, or inference provider. |
| Capabilities | text, code | text, code, reasoning, local-inference |
| Training Cutoff | Not publicly specified | Not publicly specified |
| Max Output | 164K tokens | 33K tokens |
| API Identifier | deepseek/deepseek-v3.2 | OpenBMB/MiniCPM |
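DeepSeek V3.2's pricing note lists a cache-read rate ($0.1350/M tokens) well below the standard input rate ($0.26/M tokens), so the effective input cost depends on how much of a workload hits the prompt cache. A minimal sketch of that blended-rate arithmetic (the function name and the 60% hit-rate scenario are illustrative assumptions, not part of the comparison data):

```python
def blended_input_cost(total_input_tokens: float, cache_hit_rate: float,
                       input_price_per_m: float = 0.26,
                       cache_read_price_per_m: float = 0.135) -> float:
    """Blended input cost in USD: cached tokens billed at the cache-read
    rate, uncached tokens at the full input rate (both per million)."""
    cached = total_input_tokens * cache_hit_rate
    uncached = total_input_tokens - cached
    return (cached * cache_read_price_per_m + uncached * input_price_per_m) / 1e6

# 10M input tokens/month with 60% served from cache:
# 6M x $0.135/M + 4M x $0.26/M = $0.81 + $1.04 = $1.85
print(round(blended_input_cost(10e6, 0.6), 2))  # prints 1.85
```

At a 60% hit rate the blended input price works out to roughly $0.185/M tokens, about 29% below the headline rate.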
Benchmarks

| Benchmark | Score | Model (source) |
|---|---|---|
| MMLU-Pro | 85 | DeepSeek V3.2 |
| GPQA Diamond | 82.4 | DeepSeek V3.2 |
| AIME 2025 | 93.1 | DeepSeek V3.2 |
| LiveCodeBench | 83.3 | DeepSeek V3.2 |
| SWE-bench Verified | 73.1 | DeepSeek V3.2 |
| HLE | 25.1 | DeepSeek V3.2 |
| BrowseComp | 51.4 | DeepSeek V3.2 |
| MiniCPM-SALA standard benchmark average | 76.53 | MiniCPM (official GitHub README) |
| MiniCPM-SALA long-context average | 38.97 | MiniCPM (official GitHub README) |
| MiniCPM-SALA 2048K extrapolation score | 81.6 | MiniCPM (official GitHub README) |
| MiniCPM4.1 reasoning decoding speedup | 3x | MiniCPM (official GitHub README) |
| MiniCPM4 Jetson AGX Orin decoding speedup vs Qwen3-8B | 7x | MiniCPM (official GitHub README) |

Cost Calculator

The table below compares estimated monthly costs for a given level of token usage.

| Model | Input | Output | Total / mo | vs Best |
|---|---|---|---|---|
| MiniCPM | $0.00 | $0.00 | $0.00 | Cheapest |
| DeepSeek V3.2 | $0.26 | $0.21 | $0.47 | +0% |
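The DeepSeek V3.2 totals above are consistent with an assumed usage of 1M input and 0.5M output tokens per month at the listed rates (1M x $0.26/M + 0.5M x $0.42/M = $0.47); that assumption, and the function name, are illustrative rather than stated by the comparison page. The calculation itself is just per-million-token arithmetic:

```python
def monthly_cost(input_tokens: float, output_tokens: float,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Total monthly cost in USD from token counts and per-million-token rates."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1e6

# DeepSeek V3.2 at the listed rates, 1M input + 0.5M output tokens/month:
print(round(monthly_cost(1e6, 0.5e6, 0.26, 0.42), 2))  # prints 0.47
# MiniCPM is open-weight with no vendor API price, so both rates are zero here:
print(monthly_cost(1e6, 0.5e6, 0.0, 0.0))  # prints 0.0
```

Note that for a self-hosted open-weight model the vendor API price is $0, but the real cost shifts to hardware and hosting, which this simple per-token formula does not capture.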
