LLM Comparison

MiniCPM vs Trinity Mini

Side-by-side specs, pricing & capabilities · Updated May 2026

| Spec | MiniCPM | Trinity Mini |
|---|---|---|
| Organization | OpenBMB | Arcee AI |
| OpenTools Score | Not listed | Not listed |
| Family | MiniCPM | Trinity |
| Status | Current | Current |
| Release Date | Not listed | Dec 2025 |
| Context Window | 128K tokens | 131K tokens |
| Input Price | Free | $0.04/M tokens |
| Output Price | Free | $0.15/M tokens |
| Capabilities | text, code, reasoning, local-inference | text, code |
| Training Cutoff | Not publicly specified | Not publicly specified |
| Max Output | 33K tokens | 131K tokens |
| API Identifier | OpenBMB/MiniCPM | arcee-ai/trinity-mini |

Pricing notes: MiniCPM is an open-weight model family distributed via GitHub and Hugging Face. It has no fixed vendor API price; runtime cost depends on the host, hardware, or inference provider.
Benchmarks
All figures below are for MiniCPM and are taken from the official GitHub README.

- MiniCPM-SALA standard benchmark average: 76.53
- MiniCPM-SALA long-context average: 38.97
- MiniCPM-SALA 2048K-token extrapolation score: 81.6
- MiniCPM4.1 reasoning decoding speedup: 3x
- MiniCPM4 decoding speedup on Jetson AGX Orin vs Qwen3-8B: 7x

Cost Calculator

Enter your expected monthly token usage to compare costs.

| Model | Input | Output | Total / mo | vs Best |
|---|---|---|---|---|
| MiniCPM (cheapest) | $0.00 | $0.00 | $0.00 | |
| Trinity Mini | $0.05 | $0.08 | $0.12 | +0% |
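The calculator's arithmetic is simple: cost per side = tokens / 1,000,000 × price per million tokens, summed over input and output. A minimal sketch of that calculation, using the rates listed above (the monthly usage figures are hypothetical; MiniCPM is modeled at $0 since it has no fixed vendor API price):

```python
def monthly_cost(input_tokens: int, output_tokens: int,
                 price_in_per_m: float, price_out_per_m: float) -> float:
    """Total monthly cost in USD for the given token usage,
    with prices quoted per million tokens."""
    return (input_tokens * price_in_per_m
            + output_tokens * price_out_per_m) / 1_000_000

# Hypothetical usage: 1M input + 500K output tokens per month.
print(f"Trinity Mini:          ${monthly_cost(1_000_000, 500_000, 0.04, 0.15):.4f}")
print(f"MiniCPM (self-hosted): ${monthly_cost(1_000_000, 500_000, 0.00, 0.00):.4f}")
```

Note that for an open-weight model like MiniCPM the vendor API price is zero, but the real cost shifts to whatever hardware or inference provider hosts it.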
