feat: add MiniMax AI provider support#1802

Open
octo-patch wants to merge 1 commit into CodePhiliaX:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as an AI provider option for intelligent SQL generation in Chat2DB. MiniMax offers OpenAI-compatible API endpoints, enabling seamless integration with the existing provider architecture.

Changes

Backend

  • New provider package (controller/ai/minimax/):
    • MiniMaxAIClient.java - Singleton factory for client management
    • MiniMaxAIStreamClient.java - OkHttp-based streaming client with Builder pattern
    • MiniMaxAIEventSourceListener.java - SSE event listener for streaming responses
    • MiniMaxChatCompletions.java - Response DTO model
  • AiSqlSourceEnum.java - Add MINIMAXAI enum value
  • ChatController.java - Add MINIMAXAI dispatch case and chatWithMiniMaxAi() method
  • ConfigController.java - Add save/load config logic for MiniMax (API key, host, model)
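
The streaming classes above speak the standard OpenAI-compatible SSE protocol. As a rough, protocol-level sketch of what the event listener has to parse (written in TypeScript for brevity rather than the PR's Java; `parseSseData` and the `StreamDelta` shape are illustrative, not the PR's actual code):

```typescript
// Shape of one OpenAI-compatible streaming chunk (illustrative subset).
interface StreamDelta {
  choices?: { delta?: { content?: string } }[];
}

// Extract the text delta from one SSE "data:" line, or return null when
// the stream signals completion with the "[DONE]" sentinel.
function parseSseData(line: string): string | null {
  const payload = line.replace(/^data:\s*/, "").trim();
  if (payload === "[DONE]") return null; // end-of-stream sentinel
  const chunk: StreamDelta = JSON.parse(payload);
  return chunk.choices?.[0]?.delta?.content ?? "";
}
```

The Java `MiniMaxAIEventSourceListener` presumably does the equivalent: accumulate each delta and forward it to the chat response stream.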

Frontend

  • ai.ts - Add MINIMAXAI to AIType enum
  • aiTypeConfig.ts - Add MiniMax display name and form config with default API host and model
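
A minimal sketch of what these two frontend additions amount to; the field names (`displayName`, `apiHost`, `model`) are assumptions based on the description, not the actual `aiTypeConfig.ts` shape:

```typescript
// New provider value added to the AIType enum (ai.ts).
enum AIType {
  MINIMAXAI = "MINIMAXAI",
}

// Assumed shape of a provider's form config entry (aiTypeConfig.ts).
interface AITypeConfig {
  displayName: string;
  apiHost: string;
  model: string;
}

// Defaults taken from the PR description.
const miniMaxConfig: AITypeConfig = {
  displayName: "MiniMax",
  apiHost: "https://api.minimax.io/v1/chat/completions",
  model: "MiniMax-M2.5",
};
```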

Supported Models

Model                  | Context Window | Description
MiniMax-M2.5 (default) | 204,800 tokens | Peak performance, ultimate value
MiniMax-M2.5-highspeed | 204,800 tokens | Same performance, faster and more agile

API Configuration

  • Default API Host: https://api.minimax.io/v1/chat/completions
  • Auth: Bearer token via MINIMAX_API_KEY
  • Compatibility: OpenAI-compatible API format
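
Put together, a request to this endpoint looks like a standard OpenAI-style chat completion. A TypeScript sketch of the payload (the PR builds this server-side in Java with OkHttp; `buildChatRequest` is a hypothetical helper shown only to illustrate the wire format):

```typescript
// Build the request descriptor for an OpenAI-compatible chat completion
// against MiniMax. Only the payload shape matters here; no network call.
function buildChatRequest(apiKey: string, prompt: string) {
  return {
    url: "https://api.minimax.io/v1/chat/completions",
    headers: {
      "Authorization": `Bearer ${apiKey}`, // from MINIMAX_API_KEY
      "Content-Type": "application/json",
    },
    body: {
      model: "MiniMax-M2.5",   // default model from the PR
      stream: true,            // SSE streaming, as in the stream client
      messages: [{ role: "user", content: prompt }],
    },
  };
}
```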

API Documentation

  • OpenAI Compatible: https://platform.minimax.io/docs/api-reference/text-openai-api
