Out of memory - What are sensible upper bounds? #18

@the42

Description

I am using the NLP package for similarity search.

The input is a corpus of 300,000 []string entries, each consisting of about 10-50 words.

When I naively follow the example at https://pkg.go.dev/github.com/james-bowman/nlp#example-package, it panics with an out-of-memory error.

I reduced the set to 10,000 entries, and FitTransform still takes a very long time (minutes) to complete on my laptop.

Am I doing something wrong, or is this not the correct way to use this package for a task of this size?
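For scale, a rough back-of-envelope sketch of why a dense representation blows up at this corpus size. The vocabulary size of 100,000 unique terms below is an assumption for illustration; the real figure depends on the corpus. If any stage of the pipeline materialises the full term-document matrix densely as float64, the memory required is:

```go
package main

import "fmt"

// denseMatrixBytes returns the memory needed to hold a fully dense
// docs x vocab matrix with the given cell size in bytes.
func denseMatrixBytes(docs, vocab, bytesPerCell int) int {
	return docs * vocab * bytesPerCell
}

func main() {
	docs := 300000  // corpus size from the report above
	vocab := 100000 // hypothetical vocabulary size (assumption)
	bytes := denseMatrixBytes(docs, vocab, 8) // float64 = 8 bytes per cell
	fmt.Printf("dense term-document matrix: %.0f GB\n", float64(bytes)/1e9)
}
```

That is roughly 240 GB for the dense case, far beyond a laptop's RAM, which would explain the panic even though the sparse term counts themselves (10-50 words per document) are tiny by comparison.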
