https://ollama.com/ is a way to run Llama (and other models) locally. It serves models at a local URL.
I'd like to use it in dspy_nodes, but maybe some modifications would be needed.
I'd be happy to make a pull request if you have an idea of how it could be done and could talk me through it a little.
There's a client object for it in DSPy: https://dspy-docs.vercel.app/api/local_language_model_clients/Ollama
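For reference, here is a minimal configuration sketch of what wiring that client up might look like, based on the linked DSPy docs. This assumes Ollama is running at its default local endpoint (`http://localhost:11434`) with a model already pulled; the model name and the way dspy_nodes would consume the configured LM are illustrative assumptions, not a tested integration.

```python
import dspy

# Point DSPy at a locally served Ollama model (per the docs linked above).
# Assumes `ollama pull llama2` has been run and the Ollama server is up.
ollama_lm = dspy.OllamaLocal(
    model="llama2",                     # illustrative model name
    base_url="http://localhost:11434",  # Ollama's default local URL
)
dspy.settings.configure(lm=ollama_lm)

# Once configured, dspy_nodes could presumably use this LM like any
# other DSPy language model, e.g. via a simple Predict module:
qa = dspy.Predict("question -> answer")
print(qa(question="What is Ollama?").answer)
```

If this is roughly the right shape, the dspy_nodes change might just be exposing the `model` and `base_url` parameters in its node configuration.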
tom-doerr