Self-hosted RAG engine with hybrid semantic and keyword search, document ingestion, local privacy, and seamless OpenClaw integration via Docker.
MCP tool access to MarkItDown -- a library that converts many file formats (local or remote) to Markdown for LLM consumption.
An MCP server for checking the local time on the client machine or fetching the current UTC time from an NTP server
MCP of MCPs. Automatically discovers and configures MCP servers on your local machine.
AI image generation & editing MCP server with 1,500+ curated prompt library, smart prompt enhancement, and multi-provider routing (local ComfyUI, MeiGen Cloud, OpenAI-compatible APIs).
Access the time in any timezone and get the current local time
Five-level AI collaboration system with persistent memory and anticipatory capabilities. MCP-native integration for Claude and other LLMs with local-first architecture via MemDocs.
Chroma MCP server to access local and cloud Chroma instances for retrieval capabilities
Lightweight local RAG MCP server for semantic vector search over markdown documents. Reduces token consumption by 40x with sqlite-vec and multilingual-e5-small embeddings. Supports filtered search by
Interact with Obsidian vaults for knowledge management. Create, read, update, and search notes. Works with local Obsidian vaults using filesystem access.
Persistent memory and guardrails for Claude Code. Features mistake tracking, loop detection, scope guard, and hooks that block risky edits. Runs locally with Ollama.
Desktop app that captures screen activity via event-driven screenshots, stores AI-generated summaries and OCR text locally in SQLite, and exposes your activity history to AI assistants via MCP with se
Persistent intelligence infrastructure for agentic development that gives AI coding assistants cumulative memory and pattern learning. Hybrid TypeScript/Rust implementation with local-first storage us
Local MCP server for generating image assets with Google Gemini (Nano Banana 2 / Pro). Supports transparent PNG/WebP output, exact resizing/cropping, up to 14 reference images, and Google Search grounding.
Python Streamable HTTP Server you can run locally to interact with [memvid](https://github.com/Olow304/memvid) storage and semantic search.
Calculate detention, demurrage, and local shipping charges for ports worldwide. Compare inland haulage rates and CFS costs across multiple shipping lines to optimize logistics expenses. Access compreh
Persistent memory for AI coding agents with semantic search, auto-capture, cross-session learning, and intelligent forgetting. 12 MCP tools, local-first.
Zero-dependency MCP memory server. Single binary, 100% local, no Docker.
Local MCP server for searching 300,000+ foods, nutrition facts, and barcodes from the OpenNutrition database.
Analyze React code locally and generate docs/llm.txt for the whole project at once
Allows the AI to read from your local Apple Notes database (macOS only)
An MCP server that enables AI assistants to interact with your local browsers.
An MCP server for LocalStack to manage local AWS environments, including lifecycle operations, infra deployments, log analysis, fault injection, and state management.
A CLI for interacting with GitKraken APIs. Includes an MCP server via `gk mcp` that not only wraps GitKraken APIs, but also Jira, GitHub, GitLab, and more. Works with local tools and remote services.
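Local servers like those above are typically registered in an MCP client's configuration file. A minimal sketch of a Claude Desktop-style `mcpServers` entry, where the server name `local-rag`, the package name, and the `DATA_DIR` variable are hypothetical placeholders:

```json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": ["-y", "example-local-rag-server"],
      "env": { "DATA_DIR": "/path/to/docs" }
    }
  }
}
```

The client launches the configured command over stdio and exposes the server's tools to the model; Docker-based servers (like the first entry) would swap the command for a `docker run` invocation.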