Please describe the module you would like to add to bricks
A brick that counts the number of tokens in a text input. See here: https://platform.openai.com/tokenizer

Do you already have an implementation?
Tiktoken has a basic implementation in its docs, e.g. for gpt-3.5-turbo. We just need to calculate the number of tokens of a text input.

Additional context
This could also be used to estimate the complexity of a paragraph for RAG; for instance, more than 1000 tokens would indicate high complexity.

Thinking about it, those are two bricks: one that just calculates the number of tokens (i.e. returns an integer), and one that categorizes the text into discrete classes like "short", "medium", and "long".