feat: add usage command #79
Conversation
Pull Request Overview
This PR adds comprehensive token usage tracking and visualization capabilities to the application. It introduces accurate tokenization for different LLM providers, persistent usage data storage, and a new /usage command to display token statistics.
Key Changes:
- Implements provider-specific tokenizers (OpenAI/tiktoken, Anthropic, Llama) with fallback support
- Adds models.dev integration for fetching model metadata (context limits, costs)
- Creates usage tracking infrastructure with session storage and daily aggregates
- Introduces the /usage command with a detailed breakdown and progress visualization
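The usage-tracking infrastructure described above can be sketched roughly as follows. These are hypothetical shapes for illustration only; the actual SessionTracker and token-breakdown types in source/usage/tracker.ts and source/usage/types.ts may differ.

```typescript
// Hypothetical token-breakdown shape; the PR's real interfaces may differ.
interface TokenBreakdown {
  input: number;
  output: number;
}

// Illustrative session tracker: accumulates per-request token counts
// into a running session total.
class SessionTracker {
  private usage: TokenBreakdown = { input: 0, output: 0 };

  // Record the token usage of one model request.
  record(input: number, output: number): void {
    this.usage.input += input;
    this.usage.output += output;
  }

  // Total tokens consumed this session.
  total(): number {
    return this.usage.input + this.usage.output;
  }

  // Defensive copy for display or persistence.
  snapshot(): TokenBreakdown {
    return { ...this.usage };
  }
}
```

A snapshot like this could then be persisted by the storage layer and rendered by the /usage command.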
Reviewed Changes
Copilot reviewed 27 out of 28 changed files in this pull request and generated 12 comments.
| File | Description |
|---|---|
| source/usage/types.ts | Defines TypeScript interfaces for token breakdown, session usage, and aggregate data |
| source/usage/tracker.ts | Implements SessionTracker class for tracking current session token usage |
| source/usage/storage.ts | Handles persistent storage of usage data with session history and daily aggregates |
| source/usage/calculator.ts | Provides token calculation utilities and formatting helpers |
| source/usage/index.ts | Re-exports usage tracking functionality |
| source/tokenization/types.ts | Defines Tokenizer interface and provider types |
| source/tokenization/tokenizers/*.ts | Implements provider-specific tokenizers (OpenAI, Anthropic, Llama, fallback) |
| source/tokenization/tokenizer-factory.ts | Factory for creating appropriate tokenizer based on provider/model |
| source/tokenization/index.ts | Exports tokenizer creation function |
| source/models/models-types.ts | Type definitions for models.dev API data structures |
| source/models/models-dev-client.ts | Client for fetching and querying model metadata from models.dev |
| source/models/models-cache.ts | Cache management for models.dev data with 7-day expiration |
| source/models/index.ts | Exports model context limit lookup function |
| source/hooks/useAppState.tsx | Integrates tokenizer into app state with memoization |
| source/hooks/useAppInitialization.tsx | Registers the new usage command |
| source/config/paths.ts | Simplifies app data path to use xdg-basedir |
| source/components/usage/usage-display.tsx | React component for displaying detailed usage statistics |
| source/components/usage/progress-bar.tsx | ASCII progress bar component for usage visualization |
| source/commands/usage.tsx | Implementation of /usage command handler |
| source/commands/index.ts | Exports the new usage command |
| source/ai-sdk-client.ts | Adds cached context size fetching from models.dev |
| scripts/fetch-models.js | Post-install script to pre-fetch models.dev data |
| package.json | Adds new dependencies and postinstall script |
| pnpm-lock.yaml | Updates lockfile with new dependencies |
| knip.json | Updates configuration to ignore usage module files |
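The ASCII progress bar mentioned for source/components/usage/progress-bar.tsx could be built around a pure rendering helper like the sketch below. The function name and bar characters are illustrative assumptions, not the PR's actual implementation.

```typescript
// Hypothetical helper for rendering an ASCII usage bar, e.g. for
// context-window consumption. Clamps at 100% and guards a zero limit.
function renderProgressBar(used: number, limit: number, width = 20): string {
  const ratio = limit > 0 ? Math.min(used / limit, 1) : 0;
  const filled = Math.round(ratio * width);
  const bar = "█".repeat(filled) + "░".repeat(width - filled);
  const pct = Math.round(ratio * 100);
  return `[${bar}] ${pct}%`;
}
```

Keeping the rendering logic pure like this makes it trivial to unit-test independently of the React/Ink component that wraps it.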
Files not reviewed (1)
- pnpm-lock.yaml: Language not supported
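The 7-day cache expiration noted for source/models/models-cache.ts reduces to a freshness check along these lines. The constant and function names here are assumptions for illustration.

```typescript
// Hypothetical cache-freshness check; the 7-day window comes from the
// PR's description of models-cache.ts.
const CACHE_TTL_MS = 7 * 24 * 60 * 60 * 1000;

// Returns true while the cached models.dev payload is still fresh.
function isCacheFresh(fetchedAtMs: number, nowMs: number): boolean {
  return nowMs - fetchedAtMs < CACHE_TTL_MS;
}
```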
Description
Solves #12
Type of Change
Testing
Automated Tests
- Tests in .spec.ts/.tsx files (pnpm test:all completes successfully)
Manual Testing
Checklist