Implement multi-provider support for chat functionality and enhance configuration management #26
base: main
Conversation
```ruby
def connection
  @connection ||= Faraday.new do |faraday|
    faraday.adapter Faraday.default_adapter
```
Adapter should come last
It's just a copy-paste from the client, so I didn't know it should come last.
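For reference, here is a minimal sketch of the builder ordering Faraday expects: middleware is registered first and the adapter comes last. The JSON middleware shown is an assumption tied to the later comments (it ships with Faraday 2.x):

```ruby
require 'faraday'

def connection
  @connection ||= Faraday.new do |faraday|
    # Middleware goes first in the builder...
    faraday.request :json   # encode Hash bodies as JSON (built into Faraday 2.x)
    faraday.response :json  # parse JSON responses
    # ...and the adapter must be the last entry in the stack.
    faraday.adapter Faraday.default_adapter
  end
end
```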
```ruby
response = connection.post do |req|
  req.url Configuration::PROVIDERS[@provider] || Configuration::BASE_URL
  req.headers.merge!(headers)
  req.body = body.to_json
```
Faraday can handle JSON automatically.
Same as in the previous comment.
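For illustration, the same request once the `:json` request middleware from the sketch above is registered on the connection; the body can then stay a plain Hash:

```ruby
response = connection.post do |req|
  req.url Configuration::PROVIDERS[@provider] || Configuration::BASE_URL
  req.headers.merge!(headers)
  # The :json request middleware serializes the Hash and sets the
  # Content-Type header, so no manual #to_json call is needed.
  req.body = body
end
```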
```ruby
when 'anthropic'
  {
    'model' => Configuration::MODELS[provider][model],
    'max_tokens' => 1024, # Required parameter for Anthropic API
```
Why stay with 1024 tokens? Can't we increase this?
My bad, this will be fixed in future commits; in one of the next PRs I've changed the provider configuration. I simply forgot about this.
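One possible way to avoid the hard-coded limit is to keep it next to the other provider settings; `Configuration::MAX_TOKENS` and `anthropic_body` below are hypothetical names used only for illustration, not part of this PR:

```ruby
module Configuration
  # Hypothetical per-provider token limits; the values are only illustrative.
  MAX_TOKENS = {
    'anthropic' => 4096,
    'openai'    => 4096
  }.freeze
end

def anthropic_body(provider, model, messages)
  {
    'model'      => Configuration::MODELS[provider][model],
    # Anthropic's Messages API requires max_tokens on every request;
    # read it from configuration instead of hard-coding 1024.
    'max_tokens' => Configuration::MAX_TOKENS.fetch(provider, 1024),
    'messages'   => messages
  }
end
```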
```ruby
case provider
when 'openai'
  {
    'Content-Type': 'application/json',
```
Faraday will add this header automatically
This line will be removed in future commits.
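With the `:json` request middleware in place, the per-provider headers can drop `Content-Type` and keep only authentication. The `Authorization` header name and `Configuration::API_KEYS` constant below are assumptions for illustration, not taken from this diff:

```ruby
def headers(provider)
  case provider
  when 'openai'
    {
      # Content-Type is added by the :json request middleware; only the
      # auth header is needed here. API_KEYS is a hypothetical constant.
      'Authorization' => "Bearer #{Configuration::API_KEYS[provider]}"
    }
  end
end
```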
All Submissions:
Summary of Changes
This pull request includes three main commits:
What’s New?
Let me know if you’d like tests for the Anthropic integration once I get access.