Conversation

@shatfield4
Collaborator

Pull Request Type

  • ✨ feat
  • πŸ› fix
  • ♻️ refactor
  • πŸ’„ style
  • πŸ”¨ chore
  • πŸ“ docs

Relevant Issues

resolves #4529

What is in this change?

  • Switched the embed() call to use the `input` parameter instead of the older `prompt` parameter, which supports batch processing (see the sketch after this list)
  • Added a new advanced-options setting for adjusting the embedding batch size in the Ollama provider (default is 1)
  • Updated logging to show batch size and progress
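
Below is a minimal sketch of the batching approach described above: chunks are grouped by a configurable batch size and sent to Ollama's /api/embed endpoint via the `input` field, which accepts an array, rather than one request per chunk with the older `prompt` field. The function and option names here (`embedChunks`, `batchSize`) are illustrative only and are not the actual AnythingLLM implementation.

```ts
// Sketch only: batched embedding against Ollama's /api/embed endpoint,
// which accepts an array of texts under `input` and returns `embeddings`.
type EmbedResponse = { embeddings: number[][] };

async function embedChunks(
  texts: string[],
  model: string,
  baseUrl = "http://localhost:11434",
  batchSize = 1 // mirrors the PR's default of 1; adjustable in advanced options
): Promise<number[][]> {
  const results: number[][] = [];
  const totalBatches = Math.ceil(texts.length / batchSize);

  for (let i = 0; i < texts.length; i += batchSize) {
    const batch = texts.slice(i, i + batchSize);
    // Log batch size and progress, as described in the change list.
    console.log(
      `Embedding batch ${Math.floor(i / batchSize) + 1}/${totalBatches} (${batch.length} chunks)`
    );

    const res = await fetch(`${baseUrl}/api/embed`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, input: batch }),
    });
    if (!res.ok) throw new Error(`Ollama embed request failed: ${res.status}`);

    const data = (await res.json()) as EmbedResponse;
    results.push(...data.embeddings);
  }
  return results;
}
```

With a batch size greater than 1, each request carries multiple chunks, so the number of round trips to Ollama drops proportionally; the default of 1 preserves the previous one-chunk-per-request behavior.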

Additional Information

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally


Labels

PR:needs review Needs review by core team


Development

Successfully merging this pull request may close these issues.

[FEAT]: OllamaEmbedder Support Batched or Parallel Embeddings to Improve Performance
