DataDam is a Model Context Protocol (MCP) server backed by Supabase. It supports both streamable HTTP endpoints and stdio connections, allowing multiple AI tools to share a single personal database.
Important: There is no auth yet. Do not store sensitive data. OAuth is planned.
## How it works
- Your AI tool invokes the necessary tools from your console/command line whenever it needs personal information.
- It also fills in the parameters of each call itself.
- Categories group related records (e.g., `books`, `contacts`, `basic_information`). All datapoints are assigned to a category.
- Tags are an optional refinement to narrow down results within each category.
- More information on how each tool works can be found in the tool reference below; a minimal client sketch follows.
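To make the flow concrete, here is a minimal client sketch using the official TypeScript SDK. It is illustrative only (not part of this repo); the URL is a placeholder and the arguments mimic what a model would fill in:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Connect the way a hosted AI tool would (placeholder URL).
const client = new Client({ name: "datadam-demo", version: "0.1.0" });
await client.connect(
  new StreamableHTTPClientTransport(new URL("https://<YOUR_RENDER_URL>/mcp"))
);

// The AI tool discovers the tools, then fills in the call parameters itself.
const { tools } = await client.listTools();
const result = await client.callTool({
  name: "datadam_search_personal_data",
  arguments: { query: "John", categories: ["contacts"], tags: ["family"], limit: 10 },
});
```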
## Data model
- Categories are maintained in the database and surfaced via the `data://categories` resource (see the sketch below), which is static at the moment.
- Filtering order: choose a category first, then use `tags` to further narrow results within that category (tags are optional refinements, not replacements).
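A connected client (like the hypothetical one sketched above) can read this resource before filtering; a one-line sketch:

```typescript
// Read the category list exposed by the server's resources API.
const categories = await client.readResource({ uri: "data://categories" });
```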
## Server tools (at `…/mcp`)
| Tool | Title | Purpose | Required | Optional |
|---|---|---|---|---|
| `datadam_search_personal_data` | Search Personal Data | Find records by title and content; filter by categories/tags. | `query` | `categories`, `tags`, `classification`, `limit`, `userId` |
| `datadam_extract_personal_data` | Extract Personal Data by Category | List items in one category, optionally filtered by tags. | `category` | `tags`, `limit`, `offset`, `userId`, `filters` |
| `datadam_create_personal_data` | Create Personal Data | Store a new record with category, title, and JSON content. | `category`, `title`, `content` | `tags`, `classification`, `userId` |
| `datadam_update_personal_data` | Update Personal Data | Update fields on an existing record by ID. | `recordId` | `title`, `content`, `tags`, `category`, `classification` |
| `datadam_delete_personal_data` | Delete Personal Data | Delete one or more records; optional hard delete. | `recordIds` | `hardDelete` |
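To illustrate the required/optional split in the table, here is a hedged `callTool` sketch for the create tool, reusing the `client` from the earlier sketch; the field values are made up:

```typescript
await client.callTool({
  name: "datadam_create_personal_data",
  arguments: {
    category: "contacts",                    // required
    title: "Jane Doe",                       // required
    content: { email: "jane@example.com" },  // required JSON content
    tags: ["friend"],                        // optional refinement
  },
});
```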
## ChatGPT endpoint tools (at `…/chatgpt_mcp`)
| Tool | Title | Purpose | Required | Optional |
|---|---|---|---|---|
| `search` | Search (ChatGPT) | Return citation-friendly results for a query. | `query` | — |
| `fetch` | Fetch (ChatGPT) | Return full document content by ID. | `id` | — |
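ChatGPT drives these two tools in sequence: `search` to find candidate records, then `fetch` to pull one by ID. A sketch, assuming a client connected to the `…/chatgpt_mcp` URL rather than `…/mcp`:

```typescript
// Find candidate documents, then fetch one by its ID.
const hits = await client.callTool({ name: "search", arguments: { query: "contacts" } });
const doc = await client.callTool({ name: "fetch", arguments: { id: "<DOCUMENT_ID>" } });
```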
## Connection methods
DataDam supports two connection methods:

**HTTP Streamable**
- Use case: hosted deployments, multiple clients, web-based AI tools
- Setup: deploy to a cloud service (e.g., Render) and configure clients with the URL
- Environment: server-side environment variables in the hosting platform
- Protocol: HTTP/HTTPS with MCP over streamable transport

**Stdio (Standard Input/Output)**
- Use case: local development, single-client setups, desktop AI applications
- Setup: run `server.js` locally and configure clients to launch the process
- Environment: local environment variables, or passed via the client config
- Protocol: MCP over stdio transport with direct process communication (sketched below)
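For the stdio side, the client launches the server process itself. A sketch with the official SDK, reusing the `client` from the earlier example; the path and credentials are placeholders:

```typescript
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// The client spawns server.js and speaks MCP over stdin/stdout.
const transport = new StdioClientTransport({
  command: "node",
  args: ["path/to/server.js"],
  env: {
    SUPABASE_URL: "your_supabase_url",
    SUPABASE_SERVICE_ROLE_KEY: "your_service_role_key",
  },
});
await client.connect(transport);
```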
## Prerequisites
- Homebrew: package manager for macOS and Linux
- Git: version control system
- Node.js + npm: JavaScript runtime and package manager
- Accounts: Supabase (required), Render (for hosting)
## Installation
1. Clone this repository:
   ```
   git clone https://github.com/KennethLeeJE8/datadam_mcp.git && cd datadam_mcp
   ```
2. Install dependencies:
   ```
   npm install
   ```
3. Build the TypeScript code:
   ```
   npm run build
   ```
Happy to help if you have any problems with the setup! Shoot me a message or send me an email at kennethleeje8@gmail.com :)
## Supabase setup
1. Create a Supabase account:
- Go to Supabase Sign Up to create your account
- Important: Remember your password - you'll need it for the database connection later
- Create a new project and wait for it to finish setting up
2. Load the database schema (choose your preferred option):
   Option 2a) Using the Supabase SQL Editor:
   - Copy the entire contents of `src/database/schema.sql`
   - Supabase Dashboard → SQL Editor → New query
   - Paste the copied schema code into the editor
   - Click "Run" to execute the schema
3. Verify the schema loaded: the "Table Editor" view should now be populated with tables.
✅ Supabase setup is complete! Your database is ready to use.
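If you want to sanity-check the database outside of MCP, here is a small supabase-js sketch; it assumes the `category_registry` table this README mentions later, so adjust the table name if your schema differs:

```typescript
import { createClient } from "@supabase/supabase-js";

// Query the category registry directly with your .env credentials.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);
const { data, error } = await supabase.from("category_registry").select("*");
if (error) throw error;
console.log(data); // should list the seeded categories (interests, books, ...)
```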
Select the connection method based on your AI tool:
- **Option A: Stdio (Standard Input/Output)**
  - Use for: coding agents (Cursor, Windsurf, etc.), Claude Desktop (free tier)
  - Next step: continue to the Local Testing section below
- **Option B: HTTP Streamable**
  - Use for: ChatGPT Plus or higher, Claude Pro or higher
  - Next step: skip to the Render Deployment section
## Local Testing
1. Set up environment variables by copying the example file:
   ```
   cp .env.example .env
   ```
   Edit `.env` and add your Supabase credentials:
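The file only needs the two variables used throughout this README (values are placeholders):

```
SUPABASE_URL=your_supabase_url
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
```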
   To find your `SUPABASE_URL`:
   - Supabase Dashboard → Project Settings → Data API → Project URL
   To find your `SUPABASE_SERVICE_ROLE_KEY`:
   - Supabase Dashboard → Project Settings → API Keys → service_role (click "Reveal" to copy)
2. Test the connection with the MCP Inspector:
   ```
   npm run inspector:stdio
   ```
   - Transport: select "stdio"
   - Arguments: enter "server.js"
   - Click "Connect"
3. Verify the setup:
   - Verify: the Inspector should connect and list the available tools, confirming the Supabase database connection
   - Test: go to the Tools tab, click "List Tools", find `datadam_extract_personal_data`, enter "interests" for `category`, and click "Run Tool" to verify database connectivity
   - You should see a datapoint on "MCP (Model Context Protocol)"
## Render Deployment
Feel free to use any hosting platform; Render is just a personal preference.
1. Go to Render Dashboard and click New > Web Service
2. Choose "Build and deploy from a Git repository" and click Next
3. Connect to the public GitHub repository:
   - Repository URL: `https://github.com/KennethLeeJE8/datadam_mcp.git`
   - Branch: `main`
4. The `render.yaml` file automatically configures most settings (name, runtime, build/start commands, root directory, etc.)
5. Fill in the environment variables in the Advanced section:
   - `SUPABASE_URL`: get from Supabase Dashboard → Project Settings → API → Project URL
   - `SUPABASE_SERVICE_ROLE_KEY`: get from Supabase Dashboard → Project Settings → API → Project API keys → service_role (click "Reveal" to copy)
6. Click Create Web Service to deploy
Notes:
- Health check path is automatically set to `/health` via `render.yaml`
- Free tier can hit limits; use Standard tier for reliable uptime
- Health endpoint: `curl http://{render_url}/health`
- Test: go back to the command line and run:
  ```
  npm run inspector:http
  ```
  - Server URL: enter `https://<YOUR_RENDER_URL>/mcp`
  - Transport: select "HTTP"
  - Click "Connect"
- Verify: the Inspector should connect and list the available tools, confirming the Supabase database connection
- Test: go to the Tools tab, click "List Tools", find `datadam_extract_personal_data`, enter "interests" for `category`, and click "Run Tool" to verify database connectivity
## Client configuration (HTTP)
For hosted deployments using streamable HTTP:
Notes
- The server's database credentials belong in hosting platform environment variables, not in clients
### Claude Desktop (Custom Connector)
- Open Claude Desktop → Connectors → Add Custom Connector
- Name: `dataDam`
- Type: HTTP
- URL: `https://<YOUR_RENDER_URL>/mcp`
- No local `.env` needed; the server reads credentials from Render.
### ChatGPT (Connectors / Deep Research)
- Note: ChatGPT only supports HTTP connections, not stdio
- Requirement: custom connectors require a ChatGPT Pro, Business, Enterprise, or Edu subscription
- Enable Developer Mode in Settings → Connectors → Advanced → Developer mode
- Add a custom MCP server using the ChatGPT endpoint:
  - URL: `https://<YOUR_RENDER_URL>/chatgpt_mcp`
- The server implements `search` and `fetch` as required.
### Cursor (and similar coding agents)
- Many editors/agents use a similar JSON shape for MCP servers. Adapt paths and UI as needed.
```json
{
  "mcpServers": {
    "dataDam": {
      "type": "http",
      "url": "https://<YOUR_RENDER_URL>/mcp"
    }
  }
}
```
### Generic MCP Clients
- If your tool supports MCP over HTTP, configure:
  - Type: `http`
  - URL: `https://<service>.onrender.com/mcp`
## Client configuration (stdio)
For local development using stdio transport:
Notes
- Clone this repository locally and use the `server.js` file
- The client launches the server process directly
MCP Client Config:
```json
{
  "mcpServers": {
    "dataDam": {
      "command": "node",
      "args": ["path/to/server.js"],
      "env": {
        "SUPABASE_URL": "your_supabase_url",
        "SUPABASE_SERVICE_ROLE_KEY": "your_service_role_key"
      }
    }
  }
}
```
Update the `args` path to point to your local `server.js` and replace the placeholder environment variables with your actual Supabase credentials.
### Claude Desktop
- Open Claude Desktop → Settings → Developer → Edit Config
- Add the MCP server configuration
### Claude Code
- Open your `.claude.json` file in your IDE (use the search tool to look for "mcp" if you can't find it)
- Add the MCP server configuration under `mcpServers`
Config file locations:
- Codex: `~/.codex/config.toml` (see docs)
- Other coding agents: similar JSON format in their respective config files
This works with all the coding agents.
## Tech stack
- Framework: Express.js with TypeScript
- Database: Supabase (PostgreSQL) with Row Level Security
- MCP SDK: `@modelcontextprotocol/sdk`
- CORS: Configured for browser-based clients
- Environment: dotenv for configuration management
Categories available:
- interests
- digital_products
- favourite_authors
- basic_information
- contacts
- books
## Usage
The MCP server is designed to teach the AI to retrieve the personal information it needs to answer your questions. Your AI tool should make tool calls whenever it needs personal context to give you a better answer.
Tips for using the tools:
- Mention DataDam MCP in your prompt to let the AI tool know you want data from it
- Using "my {category_name}" in your query will trigger the AI to use DataDam
- Use the plural form for categories, such as 'books' instead of 'book' and 'contacts' instead of 'contact'
You can add categories in the `category_registry` table and they will update dynamically in resources.
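For example, something like `INSERT INTO category_registry (name) VALUES ('recipes');` should do it, but the exact column names depend on `src/database/schema.sql`, so check the table definition first.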
## Tool reference
- **`datadam_search_personal_data`**
  - Purpose: Find records by title and content; optionally filter by categories and tags.
  - Args: `query` (required); `categories?` string[]; `tags?` string[]; `classification?` one of `public|personal|sensitive|confidential`; `limit?` number (default 20); `userId?` string (UUID).
  - Example: `{ "query": "John", "categories": ["contacts"], "limit": 10 }`
- **`datadam_extract_personal_data`**
  - Purpose: List items in a single category; refine with tags.
  - Args: `category` (required string); `tags?` string[]; `limit?` number (default 50); `offset?` number; `userId?` string (UUID); `filters?` object.
  - Example: `{ "category": "contacts", "tags": ["family"], "limit": 20 }`
- **`datadam_create_personal_data`**
  - Purpose: Store a new record.
  - Args: `category` (required string); `title` (required string); `content` (required object/JSON); `tags?` string[]; `classification?` (default `personal`); `userId?` string (UUID).
  - Example: `{ "category": "documents", "title": "Passport", "content": { "number": "A123...", "country": "US" }, "tags": ["important"] }`
- **`datadam_update_personal_data`**
  - Purpose: Update fields on an existing record by ID.
  - Args: `recordId` (required string UUID); plus any fields to change: `title?`, `content?`, `tags?`, `category?`, `classification?`.
  - Example: `{ "recordId": "<UUID>", "title": "Emergency Contact – Updated" }`
- **`datadam_delete_personal_data`**
  - Purpose: Delete one or more records; optional hard delete for permanent removal.
  - Args: `recordIds` (required string[] of UUIDs); `hardDelete?` boolean (default false).
  - Example: `{ "recordIds": ["<UUID1>", "<UUID2>"], "hardDelete": false }`
ChatGPT endpoint tools:
- **`search`**
  - Purpose: Return citation-friendly results for a query.
  - Args: `query` (required string).
  - Example: `{ "query": "contacts" }`
- **`fetch`**
  - Purpose: Return full document content by ID.
  - Args: `id` (required string UUID).
  - Example: `{ "id": "<DOCUMENT_ID>" }`
## Optional: user scoping
If you want to scope data to specific users, you can set up user authentication and profiles:
- Create a user in Supabase Authentication → Users; copy the UUID for later.
- Insert a profile row for your Auth user:
  ```sql
  INSERT INTO profiles (user_id, username, full_name, metadata)
  VALUES ('<AUTH_USER_UUID>'::uuid, 'your_username', 'Your Name', '{}'::jsonb);
  ```
- Some tools can scope operations to a particular user by accepting a `userId` argument (UUID from Supabase Auth). This field is optional.
- If your client supports passing environment variables to tool calls, you may set a convenience variable like `DATABASE_USER_ID` in the client's MCP config and have your prompts/tools use it when needed.
- Otherwise, just supply `userId` explicitly in the tool call input when you want to target a specific user (example below).
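For example, the search call from the tool reference above becomes `{ "query": "John", "userId": "<AUTH_USER_UUID>" }` when scoped to a user.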
## Troubleshooting
- **Health check fails**
  - Verify Render env vars are set; inspect Render logs
  - Confirm Supabase URL/key values
- **Empty categories/data**
  - Insert data; run `select * from get_active_categories();`
- **Client cannot connect**
  - Use the `…/mcp` URL (or `…/chatgpt_mcp` for ChatGPT)
  - Check CORS/firewall and that the service is not sleeping (Starter tier)
## Security
- No authentication yet; do not store sensitive data
- Use `SUPABASE_SERVICE_ROLE_KEY` (server-side only, in Render) for full functionality and the complete toolset
- OAuth and stronger auth are planned
- If you only need reads or limited writes, you can deploy with `SUPABASE_ANON_KEY` instead of the service role key; writes will then depend on your Row Level Security (RLS) policies, and some tools (create/update/delete) may fail under anon
## License
MIT License