mtwKernel
Open source · Self-hosted · MCP-native

Your personal AI kernel.

60+ life modules — tasks, contacts, finance, agents, smart home — exposed as MCP tools to any LLM. Your data stays on your machine. Plug it into Claude, ChatGPT, or your own local model.

$ curl -fsSL https://mtwkernel.com/install.sh | sh

Free · Apache-2.0 · No telemetry · No accounts

60+ life modules
7 notification channels
4 LLM providers
100% on your machine

Your entire life, one kernel.

Every module is an MCP tool. Each one is optional, hot-swappable, and runs locally against your SQLite database.

Tasks · GTD-style, with deps
👥 CRM · Contacts + interactions
📅 Events · Calendar + RSVPs
Reminders · With smart triage
💰 Finance · Accounts + budgets
📈 Trading · CCXT + Pine + GDS
🛒 Shopping · Lists, stores, prices
💪 Training · Programs + PRs
🥗 Nutrition · Food log + fasting
📝 Notes · Markdown + tags
🎯 Goals · OKRs + key results
⏱️ Time tracking · Per task / project
📚 Learning · SRS flashcards
📰 RSS · Feeds + discovery
🎬 Cinema · Watchlist + ratings
📖 Books · Library + highlights
✈️ Travel · Trips + flights + docs
🏠 Life log · Daily / weekly digests
💡 Lights · Zones + scenes
📷 Cameras · ONVIF discovery
📧 Comms · Gmail + IMAP + threads
🐦 Social · Nostr + Twitter
🤖 Agents · Offices + chains + sandbox
🧠 Graph intel · Neo4j + GDS, optional
🔐 Security · Audit, secret scan, ports
🌐 Web intel · Scrape + search
🔁 Federation · Mesh + peer sync
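
Concretely, each module exposes a handful of named tools over MCP. A minimal sketch of what that surface might look like — the `ToolDef` shape and the `tasks_add` tool are illustrative assumptions, not the actual mtwKernel API:

```typescript
// Illustrative only: field names and the handler signature are
// assumptions, not the real mtwKernel module API.
interface ToolDef {
  name: string;
  description: string;
  handler: (args: Record<string, unknown>) => Record<string, unknown>;
}

// A "Tasks" module might register tools like tasks_add / tasks_list:
const tasksModule: ToolDef[] = [
  {
    name: "tasks_add",
    description: "Create a task, optionally depending on other tasks",
    handler: (args) => ({ id: 1, title: args.title, deps: args.deps ?? [] }),
  },
];

const result = tasksModule[0].handler({ title: "Ship v1" });
```

Because every module speaks the same tool interface, an LLM client discovers and calls them uniformly, whatever the domain.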

One server. Every AI sees it.

mtwKernel speaks the Model Context Protocol over both stdio and Streamable HTTP. Any MCP-capable client connects to the same set of tools.

Your AI clients

Claude Desktop · via stdio
Cursor / Zed · via stdio
ChatGPT · via HTTP
Local LLM · LM Studio · Ollama

mtwKernel

MCP Server (stdio + HTTP)
Module registry · 60+ tools
SQLite + (optional) Neo4j
Local embeddings · ONNX
localhost:3086

Your stuff

Life data · SQLite
Integrations · Gmail · Calendar · GitHub · IMAP
Channels · WhatsApp · Telegram · Slack · Discord
Smart home · Lights · Cameras · ONVIF
{
  "mcpServers": {
    "mtw-kernel": {
      "command": "bun",
      "args": ["run", "bin/mcp-server.ts"],
      "cwd": "/path/to/mtwKernel"
    }
  }
}
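
The config above covers stdio clients. For HTTP clients, MCP's Streamable HTTP transport carries plain JSON-RPC 2.0 over POST. A sketch of building such a request — the `/mcp` endpoint path is an assumption, not something the docs above confirm:

```typescript
// Build a JSON-RPC 2.0 envelope, the message format MCP transports use.
// The endpoint path ("/mcp") below is an assumption for illustration.
function mcpRequest(method: string, params: object = {}, id = 1): string {
  return JSON.stringify({ jsonrpc: "2.0", id, method, params });
}

const body = mcpRequest("tools/list");
// POST `body` to http://localhost:3086/mcp
// with Content-Type: application/json
```

A `tools/list` call like this is how a client such as ChatGPT enumerates the 60+ modules before invoking any of them.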

Install in one line.

Detects your OS, downloads the right package, and opens the dashboard.

One-line install (macOS / Linux)
curl -fsSL https://mtwkernel.com/install.sh | sh
Debian / Ubuntu
sudo apt install ./mtwkernel_*.deb
Fedora / RHEL
sudo dnf install ./mtwkernel-*.rpm
Manual download .deb · .rpm →
One-line install (PowerShell)
irm https://mtwkernel.com/install.ps1 | iex
Manual download .msi · .zip →
Docker Compose
git clone https://github.com/fastslack/mtwKernel
cd mtwKernel
cp .env.example .env
docker compose up -d --build

Runs kernel + dashboard + Neo4j + WhatsApp bridge. Dashboard at http://localhost:3086.

After install, the dashboard opens at http://localhost:3086. Configure LLM keys via ~/.config/mtwkernel/.env.

Why run it yourself?

🔒

Privacy by default

Your tasks, contacts, finance, health data — all in a SQLite file on your machine. Nothing leaves unless you tell it to.

🧩

Extensible by design

Every module is a plugin. Drop a .mtwext bundle in, or write your own — same MCP surface.
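
One plausible shape for a bundle's manifest — every field name here is a guess for illustration, not the documented .mtwext format:

```typescript
// Purely illustrative manifest shape for a .mtwext bundle;
// all field names are assumptions, not the real spec.
interface MtwExtManifest {
  name: string;
  version: string;
  entry: string;    // module entry point the kernel loads
  tools: string[];  // MCP tool names the module registers
}

const manifest: MtwExtManifest = {
  name: "weather",
  version: "0.1.0",
  entry: "index.ts",
  tools: ["weather_now", "weather_forecast"],
};
```

Whatever the real format, the point stands: a third-party bundle registers its tools on the same MCP surface as the built-in modules.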

⚡

Fast & light

Bun runtime, statically linked. Starts in under a second. ~150 MB on disk.

🛠️

Truly open

Apache-2.0 license. Fork it, sell plugins on top, build a competing distro — that's the point.

🤖

Local-first LLMs

LM Studio and Ollama work out of the box. Run agents on your own GPU.
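
Both runtimes expose an OpenAI-compatible chat endpoint on localhost (Ollama on port 11434, LM Studio on 1234 by default). A sketch of the payload an agent would send — the model name is just an example of whatever you have pulled locally:

```typescript
// Sketch: the OpenAI-compatible chat payload that local runtimes
// like Ollama and LM Studio accept. The model name is an example.
function chatPayload(model: string, prompt: string): string {
  return JSON.stringify({
    model,
    messages: [{ role: "user", content: prompt }],
  });
}

const payload = chatPayload("llama3.1", "Summarize today's tasks");
// POST to http://localhost:11434/v1/chat/completions (Ollama)
// or   http://localhost:1234/v1/chat/completions  (LM Studio)
```

Since the wire format is the same as the hosted providers', switching an agent from a cloud model to a local one is a matter of changing the base URL.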

📡

Federated, not centralized

Mesh with other instances over the Matware mesh protocol. Your phone, your laptop, your VPS — same kernel.

Ready to take your life off the cloud?