Text Generation - MiniMax API Docs
Documentation Index
Fetch the complete documentation index at: https://platform.minimax.io/docs/llms.txt
Use this file to discover all available pages before exploring further.
Model Overview
MiniMax offers multiple text models to meet different scenario requirements. MiniMax-M2.7 matches or sets new SOTA results on benchmarks covering programming, tool calling, search, and office productivity, while MiniMax-M2 is built for efficient coding and Agent workflows.
Supported Models
| Model Name | Context Window | Description |
|---|---|---|
| MiniMax-M2.7 | 204,800 | Beginning the journey of recursive self-improvement (output speed approximately 60 tps) |
| MiniMax-M2.7-highspeed | 204,800 | M2.7 Highspeed: Same performance, faster and more agile (output speed approximately 100 tps) |
| MiniMax-M2.5 | 204,800 | Peak Performance. Ultimate Value. Master the Complex (output speed approximately 60 tps) |
| MiniMax-M2.5-highspeed | 204,800 | M2.5 highspeed: Same performance, faster and more agile (output speed approximately 100 tps) |
| MiniMax-M2.1 | 204,800 | Powerful Multi-Language Programming Capabilities with Comprehensively Enhanced Programming Experience (output speed approximately 60 tps) |
| MiniMax-M2.1-highspeed | 204,800 | Faster and More Agile (output speed approximately 100 tps) |
| MiniMax-M2 | 204,800 | Agentic capabilities, Advanced reasoning |
MiniMax M2.7 Key Highlights
URL Configuration
Before calling MiniMax models, prepare the following:
| Field | Value |
|---|---|
| base_url (Anthropic-compatible, recommended) | https://api.minimax.io/anthropic |
| base_url (OpenAI-compatible) | https://api.minimax.io/v1 |
| api_key | Your API key, obtained from the MiniMax platform |
| model | See Supported Models above |
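The fields above can be kept out of your code via environment variables. A minimal setup sketch; the variable names are our own convention, not mandated by the API, and the key value is a placeholder:

```shell
# Illustrative environment setup (variable names are assumptions).
export MINIMAX_API_KEY="<your API key>"                      # from the MiniMax platform
export MINIMAX_BASE_URL="https://api.minimax.io/anthropic"   # or https://api.minimax.io/v1 for OpenAI-style
```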
Calling Example
MiniMax accepts both Anthropic-style and OpenAI-style request formats. The two examples below are equivalent non-streaming calls; set `stream` to `true` to switch to streaming responses.
Anthropic-Compatible (Recommended)
Supports thinking blocks, interleaved thinking, and other advanced features; this is the recommended path.
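The Anthropic-compatible code sample did not survive extraction from this page, so here is a minimal stdlib-only sketch instead. It assumes the Messages endpoint lives at the `base_url` plus the standard Anthropic path `/v1/messages`, and reads the key from a `MINIMAX_API_KEY` environment variable; neither detail is confirmed by this page.

```python
import json
import os
import urllib.request

# Key comes from the environment; an empty string means "build but don't send".
API_KEY = os.environ.get("MINIMAX_API_KEY", "")
BASE_URL = "https://api.minimax.io/anthropic"

def build_request(prompt: str, stream: bool = False) -> dict:
    """Assemble an Anthropic-style Messages payload for MiniMax-M2.7."""
    return {
        "model": "MiniMax-M2.7",
        "max_tokens": 1024,
        "stream": stream,  # set True for streaming responses
        "messages": [{"role": "user", "content": prompt}],
    }

def call_minimax(prompt: str) -> str:
    """POST the payload and return the first text block of the reply."""
    req = urllib.request.Request(
        BASE_URL + "/v1/messages",  # standard Anthropic path; an assumption here
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "x-api-key": API_KEY,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["content"][0]["text"]

if __name__ == "__main__" and API_KEY:
    print(call_minimax("Say hello in one sentence."))
```

If you already use the official `anthropic` SDK, the equivalent is to construct its client with this `base_url` and `api_key` and call `client.messages.create(...)` with the same payload fields.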
OpenAI-Compatible
Already wired up to the OpenAI SDK? Swap base_url and model for the values in the table above and you can keep using your existing client without migrating to a new SDK.
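The OpenAI-compatible sample is likewise missing from this page, so here is a matching stdlib-only sketch. The `/chat/completions` path follows the standard OpenAI convention and the `MINIMAX_API_KEY` variable is our own naming; treat both as assumptions.

```python
import json
import os
import urllib.request

# Key comes from the environment; an empty string means "build but don't send".
API_KEY = os.environ.get("MINIMAX_API_KEY", "")
BASE_URL = "https://api.minimax.io/v1"

def build_request(prompt: str, stream: bool = False) -> dict:
    """Assemble an OpenAI-style chat-completions payload for MiniMax-M2.7."""
    return {
        "model": "MiniMax-M2.7",
        "stream": stream,  # set True for streaming responses
        "messages": [{"role": "user", "content": prompt}],
    }

def call_minimax(prompt: str) -> str:
    """POST the payload and return the assistant message content."""
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",  # standard OpenAI path; an assumption here
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__" and API_KEY:
    print(call_minimax("Say hello in one sentence."))
```

With the official `openai` SDK, the same call is the client's chat-completions method with this `base_url`, `api_key`, and `model`; no other client changes should be needed.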
API Reference
If you encounter any issues while using MiniMax models:
- Contact our technical support team through official channels, such as email: [email protected]
- Submit an Issue on our GitHub repository
- Anthropic SDK Documentation
- OpenAI SDK Documentation