Contact Details
[email protected]
What problem does this solve?
Currently the AI champion can only connect to OpenAI using an OpenAI API key. A setting that allows connecting to any OpenAI-compatible API should be possible. This would let users use locally hosted AI models or other compatible services.
The benefit would be not being locked into OpenAI, giving users the freedom to choose any AI model to resolve the issues. Specialist security models could even be fine-tuned or downloaded from Hugging Face to provide better results.
Proposed Solution
Many providers make their LLM APIs compatible with the same endpoints as OpenAI, so only the hostname is required as a new setting. It should be quite a straightforward change.
Anthropic Claude models:
https://docs.anthropic.com/en/api/openai-sdk
Self-hosted (e.g. LM Studio):
https://lmstudio.ai/docs/app/api/endpoints/openai
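To illustrate why only the hostname needs to change, here is a minimal stdlib-only sketch of how the same chat-completions request can target either OpenAI or a local server just by swapping the base URL. The function name, model names, and keys below are hypothetical, not part of the existing champion code; LM Studio's default local port of 1234 is taken from its documentation.

```python
import json
from urllib.request import Request


def build_chat_request(base_url: str, api_key: str, model: str, messages: list) -> Request:
    """Build a POST request for any OpenAI-compatible chat endpoint."""
    # OpenAI-compatible servers all expose POST {base_url}/chat/completions
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


messages = [{"role": "user", "content": "Suggest a fix for this finding."}]

# Hosted OpenAI (current behaviour); key and model are placeholders:
cloud = build_chat_request("https://api.openai.com/v1", "sk-...", "gpt-4o-mini", messages)

# Locally hosted LM Studio server; the API key is ignored by local servers:
local = build_chat_request("http://localhost:1234/v1", "not-needed", "local-model", messages)

print(local.full_url)  # http://localhost:1234/v1/chat/completions
```

Everything except the base URL stays identical, which is why a single new hostname setting should be sufficient.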
Importance Level
Nice to have
Additional Information
No response