@@ -54,7 +54,7 @@ This implementation was adapted from the [Model Context Protocol quickstart guid
 - 🌐 **Multi-Server Support**: Connect to multiple MCP servers simultaneously
 - 🚀 **Multiple Transport Types**: Supports STDIO, SSE, and Streamable HTTP server connections
 - 🎨 **Rich Terminal Interface**: Interactive console UI
-- 🖥️ **Streaming Responses**: View model outputs in real-time as they're generated
+- 🌊 **Streaming Responses**: View model outputs in real-time as they're generated
 - 🛠️ **Tool Management**: Enable/disable specific tools or entire servers during chat sessions
 - 🧑‍💻 **Human-in-the-Loop (HIL)**: Review and approve tool executions before they run for enhanced control and safety
 - 🎮 **Advanced Model Configuration**: Fine-tune 10+ model parameters including temperature, sampling, repetition control, and more
@@ -72,6 +72,7 @@ This implementation was adapted from the [Model Context Protocol quickstart guid
 - 📊 **Usage Analytics**: Track token consumption and conversation history metrics
 - 🔌 **Plug-and-Play**: Works immediately with standard MCP-compliant tool servers
 - 🔔 **Update Notifications**: Automatically detects when a new version is available
+- 🖥️ **Modern CLI with Typer**: Grouped options, shell autocompletion, and improved help output
 
 
 ## Requirements
@@ -116,7 +117,16 @@ ollmcp
 
 ### Command-line Arguments
 
-#### Server Options:
+> [!TIP]
+> The CLI now uses `Typer` for a modern experience: grouped options, rich help, and built-in shell autocompletion. To enable autocompletion, run:
+>
+> ```bash
+> ollmcp --install-completion
+> ```
+>
+> Then restart your shell or follow the printed instructions.
+
+#### MCP Server Configuration:
 
 - `--mcp-server`: Path to one or more MCP server scripts (.py or .js). Can be specified multiple times.
 - `--servers-json`: Path to a JSON file with server configurations.
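
As a quick sketch of the server flags above (the script paths here are hypothetical placeholders, not files shipped with the project):

```bash
# Launch with two MCP servers at once; --mcp-server may be repeated
ollmcp --mcp-server ~/servers/weather.py --mcp-server ~/servers/files.js

# Or load every server defined in a JSON configuration file
ollmcp --servers-json ~/.config/ollmcp/servers.json
```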
@@ -126,11 +136,19 @@ ollmcp
 > Claude's configuration file is typically located at:
 > `~/Library/Application Support/Claude/claude_desktop_config.json`
 
-#### Model Options:
+
+#### Ollama Configuration:
 
 - `--model MODEL`: Ollama model to use. Default: `qwen2.5:7b`
 - `--host HOST`: Ollama host URL. Default: `http://localhost:11434`
 
+#### General Options:
+
+- `--version`: Show version and exit
+- `--install-completion`: Install shell autocompletion scripts for the client
+- `--show-completion`: Show available shell completion options
+- `--help`: Show help message and exit
+
 
 ### Usage Examples
 
 Connect to a single server:
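
The example block itself falls outside this hunk's context; a representative invocation, using only the flags and defaults documented above (the server path is hypothetical), might look like:

```bash
# Single server, with the documented default model and host made explicit
ollmcp --mcp-server ~/servers/weather.py --model qwen2.5:7b --host http://localhost:11434
```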
@@ -322,6 +340,12 @@ What would you like to do? (y):
 
 ## Autocomplete and Prompt Features
 
+### Typer Shell Autocompletion
+
+- The CLI supports shell autocompletion for all options and arguments via Typer
+- To enable, run `ollmcp --install-completion` and follow the instructions for your shell
+- Enjoy tab-completion for all grouped and general options
+
 ### FZF-style Autocomplete
 
 - Fuzzy matching for commands as you type
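
A brief sketch of the completion workflow described above; the flags shown are the ones documented earlier, though the exact output depends on the installed version:

```bash
# Preview the generated completion script without installing it
ollmcp --show-completion

# With completion installed, flags tab-complete in the shell:
# $ ollmcp --<TAB>
# --model  --host  --mcp-server  --servers-json  --version  --help
```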
@@ -448,6 +472,8 @@ This project is licensed under the MIT License - see the [LICENSE](LICENSE) file
 - [Model Context Protocol](https://modelcontextprotocol.io/) for the specification and examples
 - [Ollama](https://ollama.com/) for the local LLM runtime
 - [Rich](https://rich.readthedocs.io/) for the terminal user interface
+- [Typer](https://typer.tiangolo.com/) for the modern CLI experience
+- [uv](https://github.com/astral-sh/uv) for lightning-fast Python package and virtual environment management
 
 ---