feat(adapters): support extra fields in chat-completion response in OpenAI adapter #2359
Conversation
This is still WIP (needs tests, typing and better docs), but @olimorris, would you accept this as an alternative to #1938? @SDGLBL, since you wrote the original PR, I'd love to know your thoughts on this too (there's an example snippet in the docs about how to configure this for the openrouter format).
Totally down for this. It'd be a welcome addition.
Nice. I'll also try to refactor the deepseek adapter to use this design. If that works, we'd automatically have unit tests for it as well. The deepseek documentation shows they're using the OpenAI Python SDK, so the same approach should fit there.
The tests are failing because the openai …
Description
Add an optional `parse_extra` handler that parses extra (non-standard) fields in the OpenAI `chat/completions` responses. This can be used by downstream adapters (openrouter, gemini, deepseek) to render their in-house reasoning format. The handler will be defined either in the downstream adapter (deepseek, gemini, etc.) or in the user config (when they use `extend` to customise the adapter), and will only be called when there's a non-nil `extra` field.
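For context, here is a minimal sketch of what the user-config route could look like. It assumes the handler sits under `handlers`, takes the adapter and the extra fields as arguments, and that an OpenRouter-style response exposes a `reasoning` field; none of these details are confirmed by the PR itself, and the exact config keys may differ by CodeCompanion version.

```lua
-- Illustrative only: wire a custom parse_extra handler into the OpenAI
-- adapter via `extend`. The placement under `handlers`, the argument
-- shape and the return value are assumptions for this sketch.
require("codecompanion").setup({
  adapters = {
    openai = function()
      return require("codecompanion.adapters").extend("openai", {
        handlers = {
          -- Per the PR description, this is only called when the
          -- chat/completions response carries a non-nil `extra` field.
          parse_extra = function(self, extra)
            -- Hypothetical: surface an OpenRouter-style `reasoning` field
            -- so it can be rendered alongside the normal chat output.
            if extra and extra.reasoning then
              return { reasoning = extra.reasoning }
            end
          end,
        },
      })
    end,
  },
})
```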
Related Issue(s)

Compared to #1938, this is a more flexible and less invasive design that can be made to work with more OpenAI-based APIs.
If this PR is accepted, I'll update #2306 to use this too.
Checklist
- I've run `make all` to ensure docs are generated, tests pass and my formatting is applied
- I've updated `CodeCompanion.has` in the init.lua file for my new feature