feat: adding num_ctx to set the size of the model context window (#75)
* feat: adding num_ctx to set the size of the model context window.
* docs: updating main README.md
* feat: update help to remove the ollama num_ctx default, since it can change in the future
- 🎨 **Rich Terminal Interface**: Interactive console UI with modern styling
- 🌊 **Streaming Responses**: View model outputs in real-time as they're generated
- 🛠️ **Tool Management**: Enable/disable specific tools or entire servers during chat sessions
- 🧑‍💻 **Human-in-the-Loop (HIL)**: Review and approve tool executions before they run for enhanced control and safety
- 🎮 **Advanced Model Configuration**: Fine-tune 15+ model parameters including context window size, temperature, sampling, repetition control, and more
- 💬 **System Prompt Customization**: Define and edit the system prompt to control model behavior and persona
- 🧠 **Context Window Control**: Adjust the context window size (num_ctx) to handle longer conversations and complex tasks
- 🎨 **Enhanced Tool Display**: Beautiful, structured visualization of tool executions with JSON syntax highlighting
- 🧠 **Context Management**: Control conversation memory with configurable retention settings
- 🤔 **Thinking Mode**: Advanced reasoning capabilities with visible thought processes for supported models (e.g., gpt-oss, deepseek-r1, qwen3, etc.)
- 🗣️ **Cross-Language Support**: Seamlessly work with both Python and JavaScript MCP servers
- 🔍 **Auto-Discovery**: Automatically find and use Claude's existing MCP server configurations
- 🔁 **Dynamic Model Switching**: Switch between any installed Ollama model without restarting
- 💾 **Configuration Persistence**: Save and load tool preferences and model settings between sessions
- 🔄 **Server Reloading**: Hot-reload MCP servers during development without restarting the client
- ✨ **Fuzzy Autocomplete**: Interactive, arrow-key command autocomplete with descriptions
- 🏷️ **Dynamic Prompt**: Shows current model, thinking mode, and enabled tools

The `model-config` (`mc`) command opens the advanced model settings interface.

#### Key Parameters

- **System Prompt**: Set the model's role and behavior to guide responses
- **Context Window (num_ctx)**: Set how much chat history the model uses. Balance with memory usage and performance.
- **Keep Tokens**: Prevent important tokens from being dropped
- **Max Tokens**: Limit response length (0 = auto)
- **Seed**: Make outputs reproducible (set to -1 for random)
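
To see what these settings control, here is a minimal sketch, assuming the official `ollama` Python package, of how the parameters above map onto standard Ollama request options. The model name and values are illustrative, not this client's actual code:

```python
# Illustrative sketch (not this client's source): the parameters above
# correspond to standard Ollama request options.
import ollama

response = ollama.chat(
    model="qwen3",  # illustrative; any installed Ollama model works
    messages=[{"role": "user", "content": "Summarize our discussion so far."}],
    options={
        "num_ctx": 8192,     # Context Window: how much history the model can see
        "num_keep": 64,      # Keep Tokens: preserved when the context overflows
        "num_predict": 512,  # Max Tokens: cap on response length
        "seed": 42,          # Seed: fixed value makes outputs reproducible
    },
)
print(response["message"]["content"])
```

In practice the client manages these values for you; the snippet only shows which Ollama options they correspond to.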
#### Commands

- Enter parameter numbers `1-15` to edit settings
- Enter `sp` to edit the system prompt
- Use `u1`, `u2`, etc. to unset parameters, or `uall` to reset all
- `h`/`help`: Show parameter details and tips
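
As a rough illustration of this command grammar, here is a hypothetical sketch of how such input could be classified; it is not the client's actual parser, and the helper name is made up:

```python
import re

def parse_command(cmd):
    """Classify one model-config menu command (hypothetical helper)."""
    cmd = cmd.strip().lower()
    if cmd in ("sp", "uall", "h", "help"):
        return (cmd, None)
    if m := re.fullmatch(r"u(\d+)", cmd):      # u1, u2, ... unset one parameter
        return ("unset", int(m.group(1)))
    if cmd.isdigit() and 1 <= int(cmd) <= 15:  # 1-15 edit a parameter
        return ("edit", int(cmd))
    return ("unknown", None)

assert parse_command("7") == ("edit", 7)
assert parse_command("u2") == ("unset", 2)
assert parse_command("uall") == ("uall", None)
```
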
- **Large Context:** `num_ctx: 8192` or higher for complex conversations requiring more context

> [!TIP]
> All parameters default to unset, letting Ollama use its own optimized values. Use `help` in the config menu for details and recommendations. Changes are saved with your configuration.
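
To illustrate the unset-by-default behavior the tip describes, a minimal hypothetical sketch: only explicitly set parameters are forwarded, so Ollama keeps its own defaults for everything else (`user_settings` is a made-up stand-in for the client's saved configuration):

```python
# Hypothetical sketch of the unset-by-default behavior: only parameters the
# user explicitly set are forwarded, so Ollama keeps its own defaults for the
# rest. `user_settings` stands in for the client's saved configuration.
user_settings = {"num_ctx": 8192, "temperature": None, "seed": None}

options = {name: value for name, value in user_settings.items() if value is not None}
assert options == {"num_ctx": 8192}  # temperature and seed stay on Ollama defaults
```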