feat: add o3.fan provider #1259
Conversation
📝 Walkthrough
Adds O3.fan as a new OpenAI-compatible provider: model DB entries, a provider implementation, registration in the provider factory/config, and a UI icon; also normalizes the BurnCloud provider name in the model DB.
Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant UI as Client/UI
  participant Registry as ProviderRegistry
  participant Provider as O3fanProvider
  participant DB as ProviderDB (resources/model-db)
  participant OpenAI as OpenAI-Compatible API
  Note over UI,Registry: Flow when user selects or uses an O3.fan model
  UI->>Registry: Request provider instance for "o3fan"
  Registry->>Provider: instantiate/return O3fanProvider
  Provider->>DB: fetch provider & model metadata (providerDbLoader)
  DB-->>Provider: return model definitions
  Provider->>Provider: map models -> MODEL_META (vision, functionCall, reasoning, etc.)
  UI->>Provider: completions / generateText / summaries request (modelId, messages/prompt)
  Provider->>OpenAI: openAICompletion(...) (mapped payload)
  OpenAI-->>Provider: completion response
  Provider-->>UI: formatted LLMResponse
```
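To make the flow concrete, here is a minimal TypeScript sketch of the sequence above; the registry map and the `fetchModels`/`completions` signatures are illustrative assumptions, not the project's actual presenter API.

```typescript
// Minimal sketch of the flow above; names and signatures are illustrative, not the real API.
interface ModelMeta {
  id: string
  vision: boolean
  functionCall: boolean
  reasoning: boolean
}

interface LLMResponse {
  content: string
}

interface LLMProvider {
  fetchModels(): Promise<ModelMeta[]>
  completions(
    modelId: string,
    messages: { role: string; content: string }[]
  ): Promise<LLMResponse>
}

async function askO3fan(registry: Map<string, LLMProvider>, prompt: string): Promise<LLMResponse> {
  // 1. Look up the provider instance registered under the 'o3fan' id.
  const provider = registry.get('o3fan')
  if (!provider) throw new Error('o3fan provider not registered')

  // 2. Model metadata comes from the model DB and is mapped to MODEL_META-style entries.
  const [firstModel] = await provider.fetchModels()
  if (!firstModel) throw new Error('no models returned for o3fan')

  // 3. Completions are delegated to the shared OpenAI-compatible request path.
  return provider.completions(firstModel.id, [{ role: 'user', content: prompt }])
}
```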
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In @src/main/presenter/configPresenter/providers.ts:
- Around line 750-764: The o3fan provider entry (id 'o3fan') has inconsistent
URLs: set defaultBaseUrl to match baseUrl by changing it to
'https://api.o3.fan/v1', and set official to the bare homepage 'https://o3.fan'
(remove the '/v1'); update the fields defaultBaseUrl and official in the
provider object so defaultBaseUrl equals baseUrl and the official field is the
root domain.
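A rough sketch of the corrected entry in providers.ts under these suggestions; only `apiType`, `baseUrl`, `defaultBaseUrl`, and `official` are taken from the review, and the remaining field names are assumptions.

```typescript
// Illustrative shape only: field names other than those discussed above are assumed.
export const o3fanProviderEntry = {
  id: 'o3fan',
  name: 'O3.fan',
  apiType: 'openai',
  baseUrl: 'https://api.o3.fan/v1',
  defaultBaseUrl: 'https://api.o3.fan/v1', // matches baseUrl, per the comment above
  official: 'https://o3.fan' // bare homepage, without the '/v1' suffix
}
```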
In @src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts:
- Around line 132-133: The PROVIDER_TYPE_MAP currently includes an entry
['o3fan', O3fanProvider] which is incorrect because O3fan uses
provider.apiType='openai' and thus will never be matched by a lookup by apiType;
remove the ['o3fan', O3fanProvider] entry from PROVIDER_TYPE_MAP and ensure
O3fan remains registered only in PROVIDER_ID_MAP (consistent with other
OpenAI-compatible providers like TokenFlux/PPIO/JieKou) so lookups by
provider.apiType will work correctly.
- Around line 99-100: Remove the ['o3fan', O3fanProvider] entry from the ACP
providers list (leave the existing ['acp', AcpProvider] entry untouched) because
O3fan uses apiType='openai'; then add the O3fan registration instead under the
OpenAI-compatible providers section (or in PROVIDER_ID_MAP) in alphabetical
order with other OpenAI-compatible entries (near 'modelscope'/'moonshot') as
['o3fan', O3fanProvider]. Ensure you reference the O3fanProvider symbol and
update only the provider registration locations, not provider implementation.
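Taken together, the intended end state might look roughly like the following sketch; class names other than O3fanProvider and AcpProvider, and the exact map typing, are placeholders rather than the real file contents.

```typescript
// Placeholder classes so the sketch is self-contained; the real file imports concrete providers.
class O3fanProvider {}
class AcpProvider {}
class MoonshotProvider {}

type ProviderCtor = new () => unknown

// Lookup by provider id: O3fan is registered here with the other OpenAI-compatible providers,
// kept in alphabetical order near 'modelscope'/'moonshot'.
export const PROVIDER_ID_MAP = new Map<string, ProviderCtor>([
  ['moonshot', MoonshotProvider],
  ['o3fan', O3fanProvider]
])

// Lookup by provider.apiType: no 'o3fan' entry, because O3fan declares apiType 'openai'
// and would never be matched here.
export const PROVIDER_TYPE_MAP = new Map<string, ProviderCtor>([['acp', AcpProvider]])
```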
🧹 Nitpick comments (2)
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (1)
58-75: Minor: Chinese full-width colon in prompt.
Line 68 uses a Chinese full-width colon "：" instead of a regular colon ":". While this is a minor cosmetic issue and appears in other providers, using consistent punctuation would be preferred.
♻️ Optional fix for punctuation consistency

```diff
-      content: `You need to summarize the user's conversation into a title of no more than 10 words, with the title language matching the user's primary language, without using punctuation or other special symbols：\n${text}`
+      content: `You need to summarize the user's conversation into a title of no more than 10 words, with the title language matching the user's primary language, without using punctuation or other special symbols:\n${text}`
```
resources/model-db/providers.json (1)
61721-61725: Consider aligning `display_name` with the corrected `name` casing.
The `name` field was updated to "BurnCloud" but `display_name` remains "burncloud". If the intent is to show proper capitalization to users, `display_name` should likely be updated as well for consistency.
Suggested fix

```diff
   "burncloud": {
     "id": "burncloud",
     "name": "BurnCloud",
-    "display_name": "burncloud",
+    "display_name": "BurnCloud",
     "models": [
```
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`src/renderer/src/assets/llm-icons/o3-fan.png` is excluded by `!**/*.png`
📒 Files selected for processing (5)
resources/model-db/providers.json
src/main/presenter/configPresenter/providers.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/renderer/src/components/icons/ModelIcon.vue
🧰 Additional context used
📓 Path-based instructions (16)
src/renderer/**/*.vue
📄 CodeRabbit inference engine (CLAUDE.md)
src/renderer/**/*.vue: Use Vue 3 Composition API for all components
Use Tailwind CSS for styling with scoped styles
All user-facing strings must use i18n keys via vue-i18n
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.{ts,tsx,vue}
📄 CodeRabbit inference engine (CLAUDE.md)
Use `usePresenter.ts` composable for renderer-to-main IPC communication via direct presenter method calls
Ensure all code comments are in English and all log messages are in English, with no non-English text in code comments or console statements
Use VueUse composables for common utilities like `useLocalStorage`, `useClipboard`, `useDebounceFn`
Vue 3 renderer app code should be organized in `src/renderer/src` with subdirectories for `components/`, `stores/`, `views/`, `i18n/`, and `lib/`
Files:
src/renderer/src/components/icons/ModelIcon.vue
**/*.{js,ts,tsx,jsx,vue,mjs,cjs}
📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)
All logs and comments must be in English
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
**/*.{js,ts,tsx,jsx,vue,json,mjs,cjs}
📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)
Use Prettier as the code formatter
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
resources/model-db/providers.json
src/renderer/src/**/*.{vue,ts,tsx}
📄 CodeRabbit inference engine (.cursor/rules/i18n.mdc)
src/renderer/src/**/*.{vue,ts,tsx}: Use vue-i18n framework for internationalization located at src/renderer/src/i18n/
All user-facing strings must use i18n keys, not hardcoded text
src/renderer/src/**/*.{vue,ts,tsx}: Use `ref` for primitives and references, `reactive` for objects in Vue 3 Composition API
Prefer `computed` properties over methods for derived state in Vue components
Import Shadcn Vue components from `@/shadcn/components/ui/` path alias
Use the `cn()` utility function combining clsx and tailwind-merge for dynamic Tailwind classes
Use `defineAsyncComponent()` for lazy loading heavy Vue components
Use TypeScript for all Vue components and composables with explicit type annotations
Define TypeScript interfaces for Vue component props and data structures
Use `usePresenter` composable for main process communication instead of direct IPC calls
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.vue
📄 CodeRabbit inference engine (.cursor/rules/i18n.mdc)
Import useI18n from vue-i18n in Vue components to access translation functions t and locale
src/renderer/src/**/*.vue: Use `<script setup>` syntax for concise Vue 3 component definitions with Composition API
Define props and emits explicitly in Vue components using `defineProps` and `defineEmits` with TypeScript interfaces
Use `provide`/`inject` for dependency injection in Vue components instead of prop drilling
Use Tailwind CSS for all styling instead of writing scoped CSS files
Use mobile-first responsive design approach with Tailwind breakpoints
Use Iconify Vue with lucide icons as primary choice, following pattern `lucide:{icon-name}`
Use `v-memo` directive for memoizing expensive computations in templates
Use `v-once` directive for rendering static content without reactivity updates
Use virtual scrolling with `RecycleScroller` component for rendering long lists
Subscribe to events using `rendererEvents.on()` and unsubscribe in `onUnmounted` lifecycle hook
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/components/**/*.vue
📄 CodeRabbit inference engine (.cursor/rules/vue-stack-guide.mdc)
Name Vue components using PascalCase (e.g., `ChatInput.vue`, `MessageItemUser.vue`)
Files:
src/renderer/src/components/icons/ModelIcon.vue
**/*.vue
📄 CodeRabbit inference engine (AGENTS.md)
Vue components must be named in PascalCase (e.g., `ChatInput.vue`) and use Vue 3 Composition API with Pinia for state management and Tailwind for styling
Files:
src/renderer/src/components/icons/ModelIcon.vue
**/*.{ts,tsx,vue}
📄 CodeRabbit inference engine (AGENTS.md)
**/*.{ts,tsx,vue}: Use camelCase for variable and function names; use PascalCase for types and classes; use SCREAMING_SNAKE_CASE for constants
Configure Prettier with single quotes, no semicolons, and line width of 100 characters. Run `pnpm run format` after completing features
Files:
src/renderer/src/components/icons/ModelIcon.vue
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
**/*.{ts,tsx,js,jsx}
📄 CodeRabbit inference engine (CLAUDE.md)
Use English for logs and comments in TypeScript/JavaScript code
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
**/*.{ts,tsx}
📄 CodeRabbit inference engine (CLAUDE.md)
Use TypeScript with strict type checking enabled
Use OxLint for linting JavaScript and TypeScript files; ensure lint-staged hooks and typecheck pass before commits
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
src/main/presenter/**/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
src/main/presenter/**/*.ts: Use EventBus to broadcast events from main to renderer via `mainWindow.webContents.send()`
Implement one presenter per functional domain in the main process
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
src/main/**/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
src/main/**/*.ts: Use EventBus from `src/main/eventbus.ts` for decoupled inter-process communication
Context isolation must be enabled with preload scripts for secure IPC communication
Electron main process code should reside in `src/main/`, with presenters organized in `presenter/` subdirectory (Window, Tab, Thread, Mcp, Config, LLMProvider), and app events managed via `eventbus.ts`
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
src/main/presenter/llmProviderPresenter/providers/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
LLM provider implementations must follow the standardized event interface with `coreStream` method
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
**/*.{js,ts,tsx,jsx,mjs,cjs}
📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)
Use OxLint as the linter
Files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
src/main/presenter/configPresenter/providers.ts
src/main/presenter/configPresenter/**/*.ts
📄 CodeRabbit inference engine (CLAUDE.md)
Custom prompts are managed independently of MCP through config data source using `configPresenter.getCustomPrompts()`
Files:
src/main/presenter/configPresenter/providers.ts
🧠 Learnings (4)
📚 Learning: 2026-01-05T02:41:31.619Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/vue-stack-guide.mdc:0-0
Timestamp: 2026-01-05T02:41:31.619Z
Learning: Applies to src/renderer/src/**/*.vue : Use Iconify Vue with lucide icons as primary choice, following pattern `lucide:{icon-name}`
Applied to files:
src/renderer/src/components/icons/ModelIcon.vue
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : LLM provider implementations must follow the standardized event interface with `coreStream` method
Applied to files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
📚 Learning: 2025-08-28T05:55:31.482Z
Learnt from: zerob13
Repo: ThinkInAIXYZ/deepchat PR: 804
File: src/main/presenter/llmProviderPresenter/providers/tokenfluxProvider.ts:153-156
Timestamp: 2025-08-28T05:55:31.482Z
Learning: TokenFlux models generally support function calling by default, so it's reasonable to assume hasFunctionCalling = true for TokenFlux provider implementations in src/main/presenter/llmProviderPresenter/providers/tokenfluxProvider.ts
Applied to files:
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/mcpPresenter/inMemoryServers/**/*.ts : MCP tool implementations must be registered in `mcpPresenter/index.ts` after implementation
Applied to files:
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
🧬 Code graph analysis (1)
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts (1)
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (1)
`O3fanProvider` (13-95)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: build-check (x64)
🔇 Additional comments (9)
src/renderer/src/components/icons/ModelIcon.vue (2)
70-70: LGTM! The icon import follows the established pattern and uses an appropriate asset path.
77-77: LGTM! The icon registration correctly maps the 'o3fan' key to the imported asset, enabling icon resolution for O3.fan models via the `iconKey` computed property logic.
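For context, the registration described here boils down to something like the following sketch; the import alias and map name are assumptions, and ModelIcon.vue is the source of truth.

```typescript
// Hypothetical shape of the icon registration; the real component's map name may differ.
import o3fanIcon from '@/assets/llm-icons/o3-fan.png'

// Keys are matched by the component's iconKey computed property against provider/model ids.
export const modelIconMap: Record<string, string> = {
  o3fan: o3fanIcon
}
```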
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts (1)
37-37: LGTM! The import follows the established pattern for provider implementations.
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (4)
13-16: LGTM! The constructor follows the standard pattern for OpenAI-compatible providers.
18-47: LGTM! The model fetching logic is well-implemented with proper fallbacks and sensible defaults:
- Safe handling when provider or models are missing
- Reasonable defaults for contextLength (8192) and maxTokens (4096)
- Correct derivation of model capabilities from metadata
- Proper model type classification based on modalities
This follows the same pattern as other providers like SiliconcloudProvider.
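As a reference for reviewers, here is a minimal sketch of the mapping pattern described above; the DB field names (`limit`, `modalities`, `tool_call`) and the `type` values are assumptions for illustration, not the actual o3fanProvider.ts code.

```typescript
// Sketch of the model-mapping pattern: field names on the DB entries are assumed.
interface DbModel {
  id: string
  limit?: { context?: number; output?: number }
  modalities?: { input?: string[]; output?: string[] }
  tool_call?: boolean
  reasoning?: boolean
}

interface ModelMeta {
  id: string
  contextLength: number
  maxTokens: number
  vision: boolean
  functionCall: boolean
  reasoning: boolean
  type: 'chat' | 'imageGeneration'
}

function mapModels(dbModels: DbModel[] | undefined): ModelMeta[] {
  // Safe handling when the provider or its models are missing from the model DB.
  if (!dbModels || dbModels.length === 0) return []

  return dbModels.map(
    (m): ModelMeta => ({
      id: m.id,
      // Sensible defaults when the DB entry omits limits.
      contextLength: m.limit?.context ?? 8192,
      maxTokens: m.limit?.output ?? 4096,
      // Capabilities derived from metadata.
      vision: m.modalities?.input?.includes('image') ?? false,
      functionCall: m.tool_call ?? false,
      reasoning: m.reasoning ?? false,
      // Model type classified from modalities.
      type: m.modalities?.output?.includes('image') ? 'imageGeneration' : 'chat'
    })
  )
}
```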
49-56: LGTM! The completions method correctly delegates to the base class implementation.
77-94: LGTM! The generateText method correctly delegates to the base class implementation.
resources/model-db/providers.json (2)
95156-95191: O3.fan provider structure looks good. The provider entry follows the established pattern with proper id, name, display_name, and model definitions. The model metadata (modalities, limits, tool_call support, reasoning capabilities, costs) appears comprehensive.
95218-95256: Verify zero cost values for `gemini-3-pro-preview` and `gpt-5.2`. Both models have `input` and `output` costs set to `0`. If these are placeholder values rather than intentionally free models, consider updating them with accurate pricing to ensure cost estimation features work correctly.
* feat: add o3.fan provider
* fix: update o3.fan provider apiType and official URL
* chore: provider orders and defaultBaseUrl
---------
Co-authored-by: zerob13 <[email protected]>
feat: add o3.fan provider
Close #610