
feat: add MiniMax as a first-class chat model provider#4466

Open
octo-patch wants to merge 1 commit into TabbyML:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class chat model provider with a dedicated minimax/chat kind that handles MiniMax-specific API constraints.

Changes

- Add `minimax/chat` kind to the chat routing in `http-api-bindings`
- Add MiniMax-specific request processing in `ExtendedOpenAIConfig`:
  - Temperature clamping to the valid range (0.0, 1.0] (defaults to 1.0)
- Add 6 unit tests for temperature clamping and model fallback
- Add documentation page with configuration examples (global and China endpoints)
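A minimal sketch of the temperature handling described above, not the actual `ExtendedOpenAIConfig` code. It assumes one plausible reading of "clamping to (0.0, 1.0] (defaults to 1.0)": values above 1.0 are clamped down to 1.0, while non-positive or unset values fall back to the default of 1.0 (the function name `clamp_temperature` is hypothetical):

```rust
/// Hypothetical helper illustrating MiniMax temperature handling.
/// MiniMax accepts temperatures in (0.0, 1.0]; anything outside that
/// range, or an unset value, is mapped into range before the request
/// is forwarded.
fn clamp_temperature(temperature: Option<f32>) -> f32 {
    match temperature {
        // In range: pass through unchanged.
        Some(t) if t > 0.0 && t <= 1.0 => t,
        // Above the upper bound: clamp to 1.0.
        Some(t) if t > 1.0 => 1.0,
        // Zero, negative, NaN, or unset: use the default of 1.0.
        _ => 1.0,
    }
}

fn main() {
    println!("0.7  -> {}", clamp_temperature(Some(0.7)));
    println!("1.5  -> {}", clamp_temperature(Some(1.5)));
    println!("none -> {}", clamp_temperature(None));
}
```

Whether the real implementation clamps out-of-range values or rejects them is not visible from this description; check the PR diff for the exact behavior.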

Supported Models

| Model                  | Context Length | Max Output |
|------------------------|----------------|------------|
| MiniMax-M2.7           | 204K           | 192K       |
| MiniMax-M2.7-highspeed | 204K           | 192K       |

Configuration Example

```toml
[model.chat.http]
kind = "minimax/chat"
model_name = "MiniMax-M2.7"
api_endpoint = "https://api.minimax.io/v1"
api_key = "your-minimax-api-key"
```
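The changes above mention a documentation example for the Mainland China endpoint as well. A sketch of that variant, assuming the China API host is `api.minimaxi.com` (verify against MiniMax's own API documentation before use):

```toml
[model.chat.http]
kind = "minimax/chat"
model_name = "MiniMax-M2.7"
# Assumed Mainland China endpoint; confirm in MiniMax's API docs.
api_endpoint = "https://api.minimaxi.com/v1"
api_key = "your-minimax-api-key"
```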

API Documentation

- Add "minimax/chat" kind with temperature clamping (0.0, 1.0]
- Support MiniMax-M2.7 and MiniMax-M2.7-highspeed models
- Add unit tests for request processing
- Add documentation page with configuration examples
