Conversation

@CodeYHJ CodeYHJ commented Jan 7, 2026

feat: add o3.fan provider

Closes #610

Summary by CodeRabbit

  • New Features

    • Added support for a new o3fan provider with many models spanning text, image, audio, video, and PDF modalities; several models support reasoning and tool-calling.
  • Chores

    • Registered the new provider in the app (disabled by default) and added its provider icon.
  • Misc

    • Normalized provider naming: "burncloud" renamed to "BurnCloud".


coderabbitai bot commented Jan 7, 2026

📝 Walkthrough

Walkthrough

Adds O3.fan as a new OpenAI-compatible provider: model DB entries, a provider implementation, registration in the provider factory/config, and a UI icon; also normalizes the BurnCloud provider name in the model DB.

Changes

Cohort / File(s) — Summary

  • Provider model DB (resources/model-db/providers.json): Adds a new o3fan provider object with a comprehensive list of model definitions (text, image, audio, video, PDF; modalities, limits, costs, tool_call flags) and updates burncloud.name to BurnCloud.
  • Config: provider list (src/main/presenter/configPresenter/providers.ts): Inserts an o3fan entry into DEFAULT_PROVIDERS (id: o3fan, apiType: o3fan, baseUrl: https://api.o3.fan/v1, enabled: false, websites map).
  • Provider registry / factory (src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts): Registers O3fanProvider in PROVIDER_ID_MAP and PROVIDER_TYPE_MAP so it can be instantiated/looked up by id and apiType.
  • Provider implementation (src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts): New exported O3fanProvider (extends OpenAICompatibleProvider) with fetchOpenAIModels(), completions(), summaries(), and generateText(); maps DB model defs to internal MODEL_META and delegates to OpenAI-compatible completion calls.
  • UI icon mapping (src/renderer/src/components/icons/ModelIcon.vue): Imports the o3fan color icon asset and registers the "o3fan" key in the icons registry for model icon resolution.
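The config entry summarized above can be sketched roughly as follows. This is a hypothetical illustration: the interface name ProviderEntry and its exact fields are assumptions based on the summary, not the actual LLM_PROVIDER type in the codebase.

```typescript
// Hypothetical sketch of the new DEFAULT_PROVIDERS entry described above.
// Field names (id, apiType, baseUrl, enabled, websites) are taken from the
// walkthrough summary; the real type in deepchat may differ.
interface ProviderEntry {
  id: string
  apiType: string
  baseUrl: string
  enabled: boolean
  websites: Record<string, string>
}

const o3fan: ProviderEntry = {
  id: 'o3fan',
  apiType: 'o3fan',
  baseUrl: 'https://api.o3.fan/v1',
  enabled: false, // disabled by default, per the summary
  websites: { official: 'https://o3.fan' }
}

console.log(o3fan.id, o3fan.enabled)
```

Shipping new providers disabled by default (enabled: false) lets users opt in explicitly, matching how other providers in the list are registered.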

Sequence Diagram(s)

```mermaid
sequenceDiagram
  participant UI as Client/UI
  participant Registry as ProviderRegistry
  participant Provider as O3fanProvider
  participant DB as ProviderDB (resources/model-db)
  participant OpenAI as OpenAI-Compatible API

  Note over UI,Registry: Flow when user selects or uses an O3.fan model
  UI->>Registry: Request provider instance for "o3fan"
  Registry->>Provider: instantiate/return O3fanProvider
  Provider->>DB: fetch provider & model metadata (providerDbLoader)
  DB-->>Provider: return model definitions
  Provider->>Provider: map models -> MODEL_META (vision, functionCall, reasoning, etc.)
  UI->>Provider: completions / generateText / summaries request (modelId, messages/prompt)
  Provider->>OpenAI: openAICompletion(...) (mapped payload)
  OpenAI-->>Provider: completion response
  Provider-->>UI: formatted LLMResponse
```
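The "map models -> MODEL_META" step in the diagram can be sketched roughly as follows. The DbModel and ModelMeta shapes here are assumptions reconstructed from the walkthrough and the review comments (including the 8192/4096 defaults noted later), not the actual types in the codebase.

```typescript
// Hypothetical sketch of mapping a provider-DB model definition to the
// internal MODEL_META shape mentioned in the walkthrough. All field names
// are illustrative; the real deepchat types may differ.
interface DbModel {
  id: string
  modalities?: { input?: string[]; output?: string[] }
  limit?: { context?: number; output?: number }
  tool_call?: boolean
  reasoning?: boolean
}

interface ModelMeta {
  id: string
  contextLength: number
  maxTokens: number
  vision: boolean
  functionCall: boolean
  reasoning: boolean
}

function toModelMeta(m: DbModel): ModelMeta {
  return {
    id: m.id,
    // Defaults mirror those noted in the review comments (8192 / 4096)
    contextLength: m.limit?.context ?? 8192,
    maxTokens: m.limit?.output ?? 4096,
    // Capability flags are derived from the DB metadata when present
    vision: m.modalities?.input?.includes('image') ?? false,
    functionCall: m.tool_call ?? false,
    reasoning: m.reasoning ?? false
  }
}

const meta = toModelMeta({ id: 'gpt-5.2', tool_call: true })
console.log(meta.contextLength, meta.functionCall)
```

Deriving capabilities from the DB entry with safe fallbacks is what lets the provider tolerate missing or partial model metadata, as the review later praises.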

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Poem

🐇 I hopped through JSON rows and code, a tiny, gentle fan,
O3.fan now joined the garden—icons, models, all at hand.
I stitched the registry, mapped the models, nudged BurnCloud's name just right,
Now hop I do, with carrot-cheer, beneath the coding light.

🚥 Pre-merge checks | ✅ 4 | ❌ 1
❌ Failed checks (1 warning)
  • Out of Scope Changes check — ⚠️ Warning. Most changes are directly related to adding the O3.fan provider (provider configuration, implementation class, provider registration, and icon asset), but the reformatting of 'burncloud' to 'BurnCloud' is unrelated to the O3.fan feature request. Resolution: remove the 'burncloud' rename from this PR and address it in a separate pull request focused on that correction.
✅ Passed checks (4 passed)
  • Description check — ✅ Passed. Check skipped: CodeRabbit's high-level summary is enabled.
  • Title check — ✅ Passed. The title 'feat: add o3.fan provider' clearly and concisely summarizes the main change: adding a new O3.fan LLM provider to the application.
  • Linked Issues check — ✅ Passed. The PR fully implements the issue #610 requirements: it adds the O3.fan provider with the correct API base URL (https://api.o3.fan/v1), an OpenAI-compatible implementation, Bearer token authentication support, and integration following existing provider patterns.
  • Docstring Coverage — ✅ Passed. No functions found in the changed files to evaluate; docstring coverage check skipped.



📜 Recent review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c779139 and 299f30c.

📒 Files selected for processing (2)
  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Fix all issues with AI agents
In @src/main/presenter/configPresenter/providers.ts:
- Around line 750-764: The o3fan provider entry (id 'o3fan') has inconsistent
URLs: set defaultBaseUrl to match baseUrl by changing it to
'https://api.o3.fan/v1', and set official to the bare homepage 'https://o3.fan'
(remove the '/v1'); update the fields defaultBaseUrl and official in the
provider object so defaultBaseUrl equals baseUrl and the official field is the
root domain.

In @src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts:
- Around line 132-133: The PROVIDER_TYPE_MAP currently includes an entry
['o3fan', O3fanProvider] which is incorrect because O3fan uses
provider.apiType='openai' and thus will never be matched by a lookup by apiType;
remove the ['o3fan', O3fanProvider] entry from PROVIDER_TYPE_MAP and ensure
O3fan remains registered only in PROVIDER_ID_MAP (consistent with other
OpenAI-compatible providers like TokenFlux/PPIO/JieKou) so lookups by
provider.apiType will work correctly.
- Around line 99-100: Remove the ['o3fan', O3fanProvider] entry from the ACP
providers list (leave the existing ['acp', AcpProvider] entry untouched) because
O3fan uses apiType='openai'; then add the O3fan registration instead under the
OpenAI-compatible providers section (or in PROVIDER_ID_MAP) in alphabetical
order with other OpenAI-compatible entries (near 'modelscope'/'moonshot') as
['o3fan', O3fanProvider]. Ensure you reference the O3fanProvider symbol and
update only the provider registration locations, not provider implementation.
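Taken together, the two registry fixes above amount to registering O3fan by provider id only, since its apiType is the generic 'openai'. The following sketch is a hypothetical illustration: the class stubs and neighboring map entries are placeholders, not the actual providerInstanceManager.ts contents.

```typescript
// Hypothetical sketch of the corrected registration described above.
// Empty stub classes stand in for the real provider implementations.
class OpenAICompatibleProvider {}
class O3fanProvider extends OpenAICompatibleProvider {}
class OpenAIProvider extends OpenAICompatibleProvider {}

// Lookup by provider id: O3fan belongs here, alphabetically with the
// other OpenAI-compatible providers (near 'modelscope'/'moonshot').
const PROVIDER_ID_MAP = new Map<string, typeof OpenAICompatibleProvider>([
  ['moonshot', OpenAICompatibleProvider], // placeholder neighbor entry
  ['o3fan', O3fanProvider]
])

// Lookup by apiType: no 'o3fan' key, because O3fan's apiType is 'openai'
// and an 'o3fan' entry here would never be matched.
const PROVIDER_TYPE_MAP = new Map<string, typeof OpenAICompatibleProvider>([
  ['openai', OpenAIProvider]
])

const byId = PROVIDER_ID_MAP.get('o3fan')
const byType = PROVIDER_TYPE_MAP.get('openai')
console.log(byId === O3fanProvider, byType === OpenAIProvider)
```

Keeping the two maps disjoint in responsibility (id vs. apiType) is what makes the lookup unambiguous for OpenAI-compatible providers like TokenFlux, PPIO, and JieKou.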
🧹 Nitpick comments (2)
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (1)

58-75: Minor: Chinese full-width colon in prompt.

Line 68 uses a Chinese full-width colon ":" (U+FF1A) instead of a regular ASCII colon ":". While this is a minor cosmetic issue and appears in other providers, consistent punctuation would be preferred.

♻️ Optional fix for punctuation consistency

```diff
-          content: `You need to summarize the user's conversation into a title of no more than 10 words, with the title language matching the user's primary language, without using punctuation or other special symbols:\n${text}`
+          content: `You need to summarize the user's conversation into a title of no more than 10 words, with the title language matching the user's primary language, without using punctuation or other special symbols:\n${text}`
```
resources/model-db/providers.json (1)

61721-61725: Consider aligning display_name with the corrected name casing.

The name field was updated to "BurnCloud" but display_name remains "burncloud". If the intent is to show proper capitalization to users, display_name should likely be updated as well for consistency.

Suggested fix

```diff
     "burncloud": {
       "id": "burncloud",
       "name": "BurnCloud",
-      "display_name": "burncloud",
+      "display_name": "BurnCloud",
       "models": [
```
📜 Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0b0d392 and 00d514a.

⛔ Files ignored due to path filters (1)
  • src/renderer/src/assets/llm-icons/o3-fan.png is excluded by !**/*.png
📒 Files selected for processing (5)
  • resources/model-db/providers.json
  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/renderer/src/components/icons/ModelIcon.vue
🧰 Additional context used
📓 Path-based instructions (16)
src/renderer/**/*.vue

📄 CodeRabbit inference engine (CLAUDE.md)

src/renderer/**/*.vue: Use Vue 3 Composition API for all components
Use Tailwind CSS for styling with scoped styles
All user-facing strings must use i18n keys via vue-i18n

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Use usePresenter.ts composable for renderer-to-main IPC communication via direct presenter method calls

Ensure all code comments are in English and all log messages are in English, with no non-English text in code comments or console statements

Use VueUse composables for common utilities like useLocalStorage, useClipboard, useDebounceFn

Vue 3 renderer app code should be organized in src/renderer/src with subdirectories for components/, stores/, views/, i18n/, and lib/

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
**/*.{js,ts,tsx,jsx,vue,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

All logs and comments must be in English

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
**/*.{js,ts,tsx,jsx,vue,json,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

Use Prettier as the code formatter

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
  • resources/model-db/providers.json
src/renderer/src/**/*.{vue,ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/i18n.mdc)

src/renderer/src/**/*.{vue,ts,tsx}: Use vue-i18n framework for internationalization located at src/renderer/src/i18n/
All user-facing strings must use i18n keys, not hardcoded text

src/renderer/src/**/*.{vue,ts,tsx}: Use ref for primitives and references, reactive for objects in Vue 3 Composition API
Prefer computed properties over methods for derived state in Vue components
Import Shadcn Vue components from @/shadcn/components/ui/ path alias
Use the cn() utility function combining clsx and tailwind-merge for dynamic Tailwind classes
Use defineAsyncComponent() for lazy loading heavy Vue components
Use TypeScript for all Vue components and composables with explicit type annotations
Define TypeScript interfaces for Vue component props and data structures
Use usePresenter composable for main process communication instead of direct IPC calls

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.vue

📄 CodeRabbit inference engine (.cursor/rules/i18n.mdc)

Import useI18n from vue-i18n in Vue components to access translation functions t and locale

src/renderer/src/**/*.vue: Use <script setup> syntax for concise Vue 3 component definitions with Composition API
Define props and emits explicitly in Vue components using defineProps and defineEmits with TypeScript interfaces
Use provide/inject for dependency injection in Vue components instead of prop drilling
Use Tailwind CSS for all styling instead of writing scoped CSS files
Use mobile-first responsive design approach with Tailwind breakpoints
Use Iconify Vue with lucide icons as primary choice, following pattern lucide:{icon-name}
Use v-memo directive for memoizing expensive computations in templates
Use v-once directive for rendering static content without reactivity updates
Use virtual scrolling with RecycleScroller component for rendering long lists
Subscribe to events using rendererEvents.on() and unsubscribe in onUnmounted lifecycle hook

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/components/**/*.vue

📄 CodeRabbit inference engine (.cursor/rules/vue-stack-guide.mdc)

Name Vue components using PascalCase (e.g., ChatInput.vue, MessageItemUser.vue)

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
**/*.vue

📄 CodeRabbit inference engine (AGENTS.md)

Vue components must be named in PascalCase (e.g., ChatInput.vue) and use Vue 3 Composition API with Pinia for state management and Tailwind for styling

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (AGENTS.md)

**/*.{ts,tsx,vue}: Use camelCase for variable and function names; use PascalCase for types and classes; use SCREAMING_SNAKE_CASE for constants
Configure Prettier with single quotes, no semicolons, and line width of 100 characters. Run pnpm run format after completing features

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use English for logs and comments in TypeScript/JavaScript code

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use TypeScript with strict type checking enabled

Use OxLint for linting JavaScript and TypeScript files; ensure lint-staged hooks and typecheck pass before commits

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
src/main/presenter/**/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

src/main/presenter/**/*.ts: Use EventBus to broadcast events from main to renderer via mainWindow.webContents.send()
Implement one presenter per functional domain in the main process

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
src/main/**/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

src/main/**/*.ts: Use EventBus from src/main/eventbus.ts for decoupled inter-process communication
Context isolation must be enabled with preload scripts for secure IPC communication

Electron main process code should reside in src/main/, with presenters organized in presenter/ subdirectory (Window, Tab, Thread, Mcp, Config, LLMProvider), and app events managed via eventbus.ts

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
src/main/presenter/llmProviderPresenter/providers/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

LLM provider implementations must follow the standardized event interface with coreStream method

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
**/*.{js,ts,tsx,jsx,mjs,cjs}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

Use OxLint as the linter

Files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
  • src/main/presenter/configPresenter/providers.ts
src/main/presenter/configPresenter/**/*.ts

📄 CodeRabbit inference engine (CLAUDE.md)

Custom prompts are managed independently of MCP through config data source using configPresenter.getCustomPrompts()

Files:

  • src/main/presenter/configPresenter/providers.ts
🧠 Learnings (4)
📚 Learning: 2026-01-05T02:41:31.619Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: .cursor/rules/vue-stack-guide.mdc:0-0
Timestamp: 2026-01-05T02:41:31.619Z
Learning: Applies to src/renderer/src/**/*.vue : Use Iconify Vue with lucide icons as primary choice, following pattern `lucide:{icon-name}`

Applied to files:

  • src/renderer/src/components/icons/ModelIcon.vue
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : LLM provider implementations must follow the standardized event interface with `coreStream` method

Applied to files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
📚 Learning: 2025-08-28T05:55:31.482Z
Learnt from: zerob13
Repo: ThinkInAIXYZ/deepchat PR: 804
File: src/main/presenter/llmProviderPresenter/providers/tokenfluxProvider.ts:153-156
Timestamp: 2025-08-28T05:55:31.482Z
Learning: TokenFlux models generally support function calling by default, so it's reasonable to assume hasFunctionCalling = true for TokenFlux provider implementations in src/main/presenter/llmProviderPresenter/providers/tokenfluxProvider.ts

Applied to files:

  • src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts
📚 Learning: 2026-01-05T02:40:52.831Z
Learnt from: CR
Repo: ThinkInAIXYZ/deepchat PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-05T02:40:52.831Z
Learning: Applies to src/main/presenter/mcpPresenter/inMemoryServers/**/*.ts : MCP tool implementations must be registered in `mcpPresenter/index.ts` after implementation

Applied to files:

  • src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts
🧬 Code graph analysis (1)
src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts (1)
src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (1)
  • O3fanProvider (13-95)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)
🔇 Additional comments (9)
src/renderer/src/components/icons/ModelIcon.vue (2)

70-70: LGTM!

The icon import follows the established pattern and uses an appropriate asset path.


77-77: LGTM!

The icon registration correctly maps the 'o3fan' key to the imported asset, enabling icon resolution for O3.fan models via the iconKey computed property logic.

src/main/presenter/llmProviderPresenter/managers/providerInstanceManager.ts (1)

37-37: LGTM!

The import follows the established pattern for provider implementations.

src/main/presenter/llmProviderPresenter/providers/o3fanProvider.ts (4)

13-16: LGTM!

The constructor follows the standard pattern for OpenAI-compatible providers.


18-47: LGTM!

The model fetching logic is well-implemented with proper fallbacks and sensible defaults:

  • Safe handling when provider or models are missing
  • Reasonable defaults for contextLength (8192) and maxTokens (4096)
  • Correct derivation of model capabilities from metadata
  • Proper model type classification based on modalities

This follows the same pattern as other providers like SiliconcloudProvider.


49-56: LGTM!

The completions method correctly delegates to the base class implementation.


77-94: LGTM!

The generateText method correctly delegates to the base class implementation.

resources/model-db/providers.json (2)

95156-95191: O3.fan provider structure looks good.

The provider entry follows the established pattern with proper id, name, display_name, and model definitions. The model metadata (modalities, limits, tool_call support, reasoning capabilities, costs) appears comprehensive.


95218-95256: Verify zero cost values for gemini-3-pro-preview and gpt-5.2.

Both models have input and output costs set to 0. If these are placeholder values rather than intentionally free models, consider updating them with accurate pricing to ensure cost estimation features work correctly.

@zerob13 zerob13 merged commit ef2d62c into ThinkInAIXYZ:dev Jan 8, 2026
2 checks passed
zerob13 added a commit that referenced this pull request Jan 8, 2026
* feat: add o3.fan provider

* fix: update o3.fan provider apiType and official URL

* chore: provider orders and defaultBaseUrl

---------

Co-authored-by: zerob13 <[email protected]>

Development

Successfully merging this pull request may close these issues.

[Feature] Add O3.fan as Provider
