Multi-LLM deliberation with anonymized peer review. Runs a 3-stage council: parallel responses → anonymous ranking → synthesis. Based on Andrej Karpathy's LLM Council concept.
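The three-stage pipeline can be sketched as follows. This is a minimal illustration of the council flow, not the project's actual API; `call_model()` and the model names are placeholder stand-ins, and the parallel stage is written sequentially for clarity.

```python
import random

def call_model(model: str, prompt: str) -> str:
    # Placeholder for a real LLM API call.
    return f"{model} answer to: {prompt}"

def run_council(models, question):
    # Stage 1: every model answers the question independently.
    answers = {m: call_model(m, question) for m in models}

    # Stage 2: shuffle and relabel answers so rankings can't favor a known peer.
    order = list(answers)
    random.shuffle(order)
    anonymized = {f"Response {i+1}": answers[m] for i, m in enumerate(order)}
    ranking_prompt = "Rank these responses:\n" + "\n".join(
        f"{label}: {text}" for label, text in anonymized.items())
    rankings = {m: call_model(m, ranking_prompt) for m in models}

    # Stage 3: one model synthesizes a final answer from answers and rankings.
    synthesis_prompt = (f"Question: {question}\n"
                        f"Answers: {anonymized}\nRankings: {rankings}")
    return call_model(models[0], synthesis_prompt)
```

The anonymization step is the key design choice: models rank relabeled peer responses, so a ranking cannot be biased by knowing which model produced which answer.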
llmctx transforms technical documentation into AI-ready formats. It provides a simple way to access condensed, LLM-friendly versions of popular framework and library documentation through preset URLs.
An MCP server that enables cross-LLM communication and memory sharing, allowing different AI models to collaborate and share context across conversations.
Token usage tracker for OpenAI and Claude APIs with MCP support, real-time session tracking, and accurate pricing for 2025 models.
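The core cost accounting behind such a tracker reduces to simple per-token arithmetic. A minimal sketch, assuming a per-million-token price table; the model name and prices below are placeholder values for illustration, not real 2025 pricing.

```python
# USD per 1M tokens; placeholder figures, not actual vendor pricing.
PRICE_PER_MTOK = {
    "example-model": {"input": 3.00, "output": 15.00},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost of one request: tokens in each direction times that rate."""
    p = PRICE_PER_MTOK[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000
```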
Connect your AI coding assistant directly to up-to-date Svelte 5 and SvelteKit documentation via this MCP server.
This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).
Unify and supercharge your LLM workflows by connecting your applications to any model. Easily switch between various LLM providers and leverage their unique strengths for complex reasoning tasks.
Agent infrastructure that runs itself. Six tools for autonomous agents: free LLM model recommendations across 17 models, real-time provider availability monitoring, LLM routing proxy with 3% margin, and more.
Serve MCP resources and tools over a streamable HTTP interface to enable dynamic integration with LLM applications. Provide efficient, real-time access to external data and actions through a standardized interface.
Convert any webpage to clean markdown and feed it directly into AI agent workflows. Why this matters: adding webpages to LLM conversations usually means dumping raw HTML bloated with ads and scripts.
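The cleanup step such a converter performs can be sketched with the standard library alone: drop `<script>`/`<style>` content and keep only visible text. This is an illustrative reduction, not the project's implementation; a real converter would also emit markdown structure (headings, links, lists).

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    SKIP = {"script", "style"}  # tags whose content never reaches the LLM

    def __init__(self):
        super().__init__()
        self.depth = 0      # nesting level inside skipped tags
        self.chunks = []    # visible text fragments, in document order

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.chunks)
```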
AI-powered web search via Parallel API. Returns ranked results with LLM-optimized excerpts. Use for up-to-date research, fact-checking, and domain-scoped searching.
Provide a server implementation that integrates with the Model Context Protocol to expose tools, resources, and prompts for LLM applications. Enable dynamic interaction with external data and actions.
Use natural language to explore LLM observability, traces, and monitoring data captured by Opik.
Provide a scaffolded environment to develop and run MCP servers with ease. Enable rapid prototyping and integration of tools, resources, and prompts for LLM applications. Simplify MCP server setup.
An MCP server that replaces REST clients like Postman/Insomnia by letting your LLM maintain and use API collections.
Manage your Clockify time entries effortlessly by sending prompts to an LLM. Add, view, and control your time tracking directly through AI-powered interactions. Simplify your time management workflow.
Enable seamless integration of Jira issue tracking within your LLM applications. Fetch, create, and manage Jira issues and projects dynamically to enhance your workflow automation.
MCP server for Apache Kafka that lets LLM agents inspect topics and consumer groups, and safely manage offsets (reset, rewind).
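The "safely" in offset management comes down to one invariant: a reset target must stay within the partition's retained offset range. A minimal sketch of that check; the function name is illustrative, and a real server would fetch the earliest/latest offsets from Kafka before applying it.

```python
def safe_reset_offset(target: int, earliest: int, latest: int) -> int:
    """Clamp a requested consumer-group offset to the retained range.

    earliest/latest are the partition's beginning and end offsets;
    anything outside that window would point at deleted or future records.
    """
    if earliest > latest:
        raise ValueError("earliest offset cannot exceed latest")
    return max(earliest, min(target, latest))
```

For example, rewinding to offset 0 on a partition whose oldest retained record is at offset 10 lands on 10, not on data that no longer exists.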
The Unitree Go2 MCP Server is built on the Model Context Protocol and enables users to control the Unitree Go2 robot with natural language commands interpreted by an LLM.
Enable seamless interaction with Alpaca's trading API through a standardized protocol. Execute trading operations, fetch market data, and manage portfolios efficiently within your LLM applications.