
Conversation

edwardzjl (Owner)

Pull Request Checklist

  • Test cases added for changed code.

Describe what you have changed in this PR.

This change replaces the Pydantic model used for the LLM configuration (`llm`) with a plain dictionary, improving flexibility by exposing all parameters directly to the deployer.

While it would be preferable to annotate `llm` as `langchain_openai.ChatOpenAI`, there is a limitation: Pydantic raises an error ("cannot pickle '_thread.RLock' object"), likely due to incompatibility within `langchain_openai.ChatOpenAI`.

Using a plain dictionary removes validation at startup, but the default empty dict is sufficient to create a `ChatOpenAI` instance, making this an acceptable tradeoff for now.
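
A minimal sketch of the resulting pattern (the settings class, field name, and pydantic-settings usage here are assumptions for illustration, not the exact code in this repo):

```python
from langchain_openai import ChatOpenAI
from pydantic import Field
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Plain dict instead of a typed model: any ChatOpenAI parameter
    # (model, temperature, base_url, api_key, ...) can be passed through
    # by the deployer without touching this schema.
    llm: dict = Field(default_factory=dict)


settings = Settings()

# Per the description above, an empty dict is still enough to construct
# a client; ChatOpenAI falls back to its own defaults and the usual
# OPENAI_* environment variables.
chat_model = ChatOpenAI(**settings.llm)
```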

@edwardzjl edwardzjl force-pushed the llmc branch 2 times, most recently from bace08b to 7554263 Compare October 11, 2024 09:34
@edwardzjl (Owner, Author)

I was wrong: since `chat_model` lives in `app_state` and is initialized at import time, the configuration is still validated at startup (in fact, even before startup).
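
A sketch of what this means in practice (file and module names assumed for illustration): constructing the client at module level makes the configuration get exercised on import, before any startup hook runs.

```python
# app_state.py (illustrative layout)
from langchain_openai import ChatOpenAI

from .config import settings  # assumed location of the Settings instance

# Because this runs at import time, a bad `llm` config fails here,
# before the web framework's startup hooks are ever reached.
chat_model = ChatOpenAI(**settings.llm)
```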

@edwardzjl edwardzjl merged commit 93ecddb into main Oct 11, 2024
6 checks passed
@edwardzjl edwardzjl deleted the llmc branch October 11, 2024 09:40