Add MiniMax as a first-class external model provider alongside OpenAI, Ollama, Foundry Local, and Lemonade. MiniMax exposes an OpenAI-compatible API at api.minimax.io/v1 with models M2.7, M2.5, and M2.5-highspeed.

Changes:
- MiniMaxModelProvider: OpenAI-compatible client with a static model list
- MiniMaxPickerView: XAML + code-behind for API key entry and model selection
- HardwareAccelerator.MINIMAX: enum value added in all three locations
- ExternalModelHelper: registered the MiniMax provider in the provider list
- ModelDetailsHelper: added MINIMAX to the IsLanguageModel check
- ModelPickerDefinitions: registered the MiniMax picker tab
- Icon assets: SVG (dark/light) and PNG variants
- Unit tests: 10 tests covering properties, models, and client creation
- Integration tests: 3 tests for API connectivity and streaming
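The provider above pairs a static model list with an OpenAI-compatible client. A minimal sketch of that shape, in Python for brevity (the repository itself is C#, and the class and field names here are illustrative stand-ins, not the PR's actual `IExternalModelProvider` API):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExternalModelProvider:
    """Illustrative stand-in for the repo's external-provider abstraction."""
    name: str
    base_url: str
    models: List[str] = field(default_factory=list)

# MiniMax ships a fixed, static model list rather than querying the API for one.
MINIMAX = ExternalModelProvider(
    name="MiniMax",
    base_url="https://api.minimax.io/v1",
    models=["M2.7", "M2.5", "M2.5-highspeed"],
)

print(MINIMAX.models)
```

A static list keeps the picker UI responsive and avoids a network round-trip before an API key has been validated.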
Summary
Add MiniMax as a first-class cloud LLM provider, following the existing external model provider pattern alongside OpenAI, Ollama, Foundry Local, and Lemonade.
MiniMax offers OpenAI-compatible API endpoints at api.minimax.io/v1, enabling seamless integration via the existing Microsoft.Extensions.AI.OpenAI NuGet package (no additional dependencies needed).

Models included:
- M2.7
- M2.5
- M2.5-highspeed
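Because the endpoint speaks the OpenAI wire protocol, any OpenAI-style client can target it by overriding the base URL. A hedged sketch of the request body such a chat call assembles (shown in Python rather than the repo's C#; the model id string is taken from the PR's model list, and the exact wire id MiniMax expects is an assumption):

```python
import json

# Base URL from the PR description; requests would be POSTed to
# {MINIMAX_BASE_URL}/chat/completions with an Authorization header.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_chat_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Assemble an OpenAI-style /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

body = build_chat_request("M2.5", "Hello, MiniMax!")
print(json.dumps(body))
```

This is why no new dependency is needed: the existing OpenAI client code path only has to be pointed at a different base URL and key.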
Changes (18 files, 790 additions)
Core provider:
- MiniMaxModelProvider.cs: provider implementing IExternalModelProvider; uses the OpenAI SDK with a custom endpoint
- HardwareAccelerator.MINIMAX: enum value added in all 3 locations (main, source generator, project template)
- ExternalModelHelper.cs: registered MiniMax in the provider list
- ModelDetailsHelper.cs: added to the IsLanguageModel() check

UI:

- MiniMaxPickerView.xaml + .xaml.cs: API key entry and model selection view (mirrors the OpenAI picker pattern)
- ModelPickerDefinitions.cs: registered the MiniMax picker tab

Tests:

- Unit tests: 10 tests covering properties, models, and client creation
- Integration tests: 3 tests for API connectivity and streaming (requires the MINIMAX_API_KEY env var)

Docs:
- AddingSamples.md: added MINIMAX to the HardwareAccelerator values list

Test plan
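The streaming integration test exercises the OpenAI-style server-sent-events format. A sketch of how such a stream is typically accumulated, assuming the standard OpenAI chat-completions SSE chunk shape (this is illustrative Python, not the PR's actual C# test code):

```python
import json

def collect_stream_text(sse_lines):
    """Join the delta content from OpenAI-style streaming SSE chunks."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip comments/blank keep-alive lines
        data = line[len("data: "):]
        if data.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

# Canned chunks in place of a live MiniMax response:
sample = [
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": " world"}}]}',
    'data: [DONE]',
]
print(collect_stream_text(sample))  # → Hello world
```

Feeding canned chunks like this is also a reasonable way to unit-test the streaming path without spending the MINIMAX_API_KEY quota.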