---
description: Install the dlt MCP with your preferred LLM-enabled IDE.
keywords: [mcp, llm, agents, ai]
---

# Workspace MCP Server - current status

The server can do the following:
- list pipelines in the workspace
- inspect the table schema and data for the dataset of a particular pipeline
- run SQL queries

It is the same server that is referred to as **the open-source `dlt`** server in the documentation below.

Since all MCP clients work with the `sse` transport, it is the default when running the server. Previously we struggled with
launching the MCP server as part of the client process: there was no way to pass the right Python virtual environment and dlt run context.
There were also issues with `stdio` pollution from `print` statements (overall that was IMO a dead end; MCP is a server by nature).

To launch the server in workspace context:
```sh
dlt workspace mcp

INFO:     Started server process [24925]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:43654 (Press CTRL+C to quit)
```
The workspace MCP server uses **43654** as its default port and is served without any path (i.e. no `/sse`), so you can copy the link above
straight into the right client.
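
You can also talk to the server programmatically, which is handy for sanity-checking the connection outside of an IDE. Below is a minimal sketch using the MCP Python SDK (`pip install mcp`, a separate package, not part of `dlt`), assuming the workspace server is running on the default port shown above:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Connect over SSE to the workspace MCP server started by `dlt workspace mcp`.
    async with sse_client("http://127.0.0.1:43654/") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            # Perform the MCP initialization handshake.
            await session.initialize()
            # Discover the tools the server exposes (pipeline listing,
            # schema inspection, SQL queries, ...).
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())
```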

To launch the server in pipeline context:
```sh
dlt pipeline fruitshop mcp

Starting dlt MCP server
INFO:     Started server process [28972]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:43656 (Press CTRL+C to quit)
```
The pipeline MCP server uses **43656** as its default port. The pipeline is already attached when the MCP server starts. Pipeline and workspace MCPs can work side by side.
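
Because the pipeline is already attached, a client can call tools on it right after the handshake. The sketch below is illustrative only: `query_sql` and its argument shape are hypothetical placeholders (use `list_tools()` to discover the real tool names), and the default pipeline port from above is assumed:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # The pipeline server already has the `fruitshop` pipeline attached.
    async with sse_client("http://127.0.0.1:43656/") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # NOTE: "query_sql" and its arguments are hypothetical placeholders,
            # not the server's actual tool schema.
            result = await session.call_tool(
                "query_sql",
                arguments={"query": "SELECT 1"},
            )
            print(result.content)


asyncio.run(main())
```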

## Example client configurations

Cursor, Cline:
```json
{
  "mcpServers": {
    "dlt-workspace": {
      "url": "http://127.0.0.1:43654/"
    },
    "dlt-pipeline-mcp": {
      "url": "http://127.0.0.1:43656/"
    }
  }
}
```

Continue: for some reason it does not see the MCP configuration created in the dev container. Maybe someone will make it work...
```yaml
name: dlt mcps
version: 0.0.1
schema: v1
mcpServers:
  - name: dlt-workspace
    type: sse
    url: "http://localhost:43654"
```

## Configuration

The server can still be started with the `stdio` transport and a different port using the command line. The plan is to allow
deep configuration of the MCP via the dlt configuration system.

```toml
[workspace.mcp]
path="/sse"
port=888
```

```toml
[pipelines.fruitshop.mcp]
transport="stdio"
```
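
With `transport="stdio"` as in the TOML above, a client spawns the server as a subprocess instead of connecting to a URL. Here is a minimal sketch of that mode with the MCP Python SDK (illustrative, since the TOML-driven configuration is still a plan):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn the pipeline MCP server as a subprocess and talk to it over stdio.
params = StdioServerParameters(command="dlt", args=["pipeline", "fruitshop", "mcp"])


async def main() -> None:
    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```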

## Interactions with Runtime

This is a heads-up on how we host MCPs on the runtime. To be deleted.

* the deployed workspace dashboard has two routes: `/app` to see the notebook and `/mcp` to connect to the MCP server
* the workspace dashboard in single pipeline mode uses `/app/fruitshop` and `/mcp/fruitshop`
* I'm also pondering exposing some kind of MCP attached to each marimo notebook

# Project MCP server

This is our "project" MCP (the server that **integrates with `dltHub` features** in the documentation below) and it can be launched with:
```sh
dlt project mcp
```
It gets port **43655** and the project context is obtained before the server is launched.

# MCP Server

Currently, dltHub is [building two MCP servers](https://dlthub.com/blog/deep-dive-assistants-mcp-continue) that you can run locally and integrate with your preferred IDE. One server is for the open-source `dlt` library and the other integrates with `dltHub` features ([Learn more](ai.md)).