Releases: shcherbak-ai/contextgem
v0.6.1
v0.6.0
Added LabelConcept - a classification concept type that categorizes content using predefined labels.
v0.5.0
Fixed params handling for reasoning (CoT-capable) models other than the OpenAI o-series. Enabled automatic retry of LLM calls, dropping unsupported params if any were set for the model. Improved handling and validation of LLM call params.
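The retry-with-dropped-params behavior can be sketched generically. This is not contextgem's internal code; `fake_llm_call` and `UNSUPPORTED` are hypothetical stand-ins for a provider call that rejects certain params:

```python
# Hypothetical set of params this particular model rejects.
UNSUPPORTED = {"temperature"}

def fake_llm_call(prompt, **params):
    # Stand-in for a provider call that errors on unsupported params.
    bad = UNSUPPORTED & params.keys()
    if bad:
        raise ValueError(f"unsupported params: {sorted(bad)}")
    return f"ok: {prompt}"

def call_with_param_fallback(prompt, **params):
    # First attempt with all user-set params; on an unsupported-param
    # error, retry once with the offending params removed.
    try:
        return fake_llm_call(prompt, **params)
    except ValueError:
        cleaned = {k: v for k, v in params.items() if k not in UNSUPPORTED}
        return fake_llm_call(prompt, **cleaned)

result = call_with_param_fallback("hi", temperature=0.7, top_p=0.9)
```

In the real library the offending params come from the provider's error response rather than a fixed set; the shape of the fallback is the same.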
Migrated to wtpsplit-lite - a lightweight version of wtpsplit that retains only the accelerated ONNX inference of SaT models, with minimal dependencies.
v0.4.1
Comprehensive docs on extracting aspects, extracting concepts, and LLM extraction methods.
v0.4.0
Support for local SaT model paths in Document's sat_model_id parameter.
v0.3.0
Expanded JsonObjectConcept to support nested class hierarchies, nested dictionary structures, lists containing objects, and literal types.
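The kinds of schemas described above can be illustrated with plain Python classes. The class and field names here are hypothetical; the assumed usage is passing such a class as the `structure` of a `JsonObjectConcept`:

```python
from typing import Literal

class Address:
    # Nested structure: used as a field type below.
    city: str
    country: str

class Person:
    name: str
    role: Literal["buyer", "seller"]  # literal type: constrained values
    address: Address                  # nested class hierarchy
    aliases: list[str]                # list field

# A dict matching the schema, as an extraction result might look:
extracted = {
    "name": "Jane Doe",
    "role": "buyer",
    "address": {"city": "Oslo", "country": "Norway"},
    "aliases": ["J. Doe"],
}
```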
v0.2.4
Fixed
- Removed 'think' tags and content from LLM outputs (e.g. when using DeepSeek R1 via Ollama) which was breaking JSON parsing and validation
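The failure mode can be shown in a few lines: a reasoning model prepends a `<think>` block to its answer, which breaks `json.loads` until the block is stripped. A minimal sketch (the regex approach here is illustrative, not necessarily the library's exact implementation):

```python
import json
import re

# Raw output as a reasoning model (e.g. DeepSeek R1 via Ollama)
# might return it: a think block followed by the actual JSON.
raw = '<think>Reasoning about the task...</think>\n{"status": "ok"}'

# Strip <think>...</think> blocks before parsing; DOTALL lets the
# pattern span newlines inside the block.
cleaned = re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL).strip()
data = json.loads(cleaned)
```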
Added
- Documentation for cloud/local LLMs and LLM configuration guide
v0.2.3
Updated the litellm dependency version after the encoding bug was fixed upstream. Updated README.
v0.2.2
Refactored DOCX converter internals for better maintainability. Updated README. Added CHANGELOG.
v0.2.1
Fix: encoding bug in litellm > v1.67.1. Docs update.