
Bring LiteLLM support #264

@amanjaiswal73892

Description

Add support for LiteLLM in llm/response_api.py so that newer LLMs are easier to integrate and maintain.

  • Add a new model class for LiteLLM, similar to ChatCompletionModel (see the sketch below).
  • Include support for Anthropic-related parameters (cache_breakpoints, etc.).
  • Include other parameters such as reasoning effort.
  • Add cost tracking for LiteLLM in tracking.py, using litellm helper functions (see the cost-tracking example below).
  • Write tests using the mock param of litellm completion (see the test sketch below).
  • Test function calling support.
  • Replicate the OAI/Anthropic agents using LiteLLM and compare outputs.
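
A minimal sketch of what the new model class could look like. The class name, constructor, and `generate` method are assumptions for illustration; the actual interface should mirror whatever ChatCompletionModel exposes in llm/response_api.py.

```python
# Sketch only: LiteLLMModel is a hypothetical name, and the generate()
# signature is assumed to mirror ChatCompletionModel (verify against
# llm/response_api.py before implementing).
import litellm


class LiteLLMModel:
    def __init__(self, model_name: str, **default_params):
        self.model_name = model_name          # e.g. "anthropic/claude-3-5-sonnet-20241022"
        self.default_params = default_params  # e.g. temperature, reasoning_effort

    def generate(self, messages: list[dict], tools: list[dict] | None = None, **overrides):
        params = {**self.default_params, **overrides}
        # litellm.completion routes to the right provider based on the model prefix
        # and returns an OpenAI-style ModelResponse.
        response = litellm.completion(
            model=self.model_name,
            messages=messages,
            tools=tools,
            **params,
        )
        return response.choices[0].message
```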
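For the cost-tracking item, litellm provides a `completion_cost` helper that computes the USD cost of a response from the model's per-token pricing; how it gets wired into tracking.py is up to the implementation, but the call itself looks like this:

```python
import litellm

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)

# completion_cost reads token usage from the response and applies
# litellm's pricing table for the model.
cost_usd = litellm.completion_cost(completion_response=response)
print(f"call cost: ${cost_usd:.6f}")
```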
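For the test item, `litellm.completion` accepts a `mock_response` argument that returns a canned reply without any network call, which keeps the test suite offline. A sketch of such a test:

```python
import litellm


def test_litellm_completion_returns_mocked_content():
    # mock_response short-circuits the provider call and wraps the string
    # in a normal ModelResponse object.
    response = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "ping"}],
        mock_response="pong",
    )
    assert response.choices[0].message.content == "pong"
```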

Metadata

Labels

enhancement (New feature or request)
