MCP Integration
Ask your AI assistant about your monitors
Connect exit1.dev to Claude, Cursor, or Windsurf via the Model Context Protocol. Check uptime, investigate failures, and compare response times — all in natural language. No dashboards required.
Key Features
Everything you need to monitor your infrastructure effectively
Natural Language Queries
Ask "are any of my monitors down?" or "what's the uptime for my API this week?" and get instant answers. No dashboards, no clicking, no context switching.
Works with Claude, Cursor & Windsurf
Connect to Claude Desktop, Claude Code, Cursor, or Windsurf. Any AI tool that supports the Model Context Protocol can plug in.
5 Read-Only Tools
List checks, get check details, pull historical results, query aggregate stats, and view status pages. All read-only — your data stays safe.
One-Line Setup
Install via npx with zero dependencies. Add your API key to the config and restart your AI assistant. Connected in under 2 minutes.
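For Claude Desktop, the config typically looks like the snippet below. The server label `exit1` and the env variable name `EXIT1_API_KEY` are assumptions following common MCP conventions; check the exit1-mcp README for the exact keys.

```json
{
  "mcpServers": {
    "exit1": {
      "command": "npx",
      "args": ["-y", "exit1-mcp"],
      "env": {
        "EXIT1_API_KEY": "your-api-key-here"
      }
    }
  }
}
```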
Uptime & Performance Stats
Query uptime percentages and average/min/max response times across multiple time ranges. Compare this week vs last week in a single prompt.
Incident Investigation
Ask your AI assistant to show recent failures, filter by status, and dig into historical check results with timestamps and error details.
Why Choose exit1.dev?
See how we stack up against the competition
| Feature | exit1.dev | Others |
|---|---|---|
| Natural language monitoring queries | ✓ | ✗ |
| Claude Desktop support | ✓ | ✗ |
| Cursor IDE support | ✓ | ✗ |
| Windsurf support | ✓ | ✗ |
| Claude Code (CLI) support | ✓ | ✗ |
| Read-only (safe) access | ✓ | N/A |
| Zero-dependency install (npx) | ✓ | N/A |
| Historical data queries | ✓ | ✗ |
| Multi-range stat comparison | ✓ | ✗ |
| Available at $3/mo | ✓ | Enterprise only |
Technical Details
Built for developers, by developers
Architecture
The exit1-mcp server runs locally on your machine as a stdio-based MCP server. It communicates with your AI assistant through the Model Context Protocol standard. All API calls go directly from your machine to the Exit1 API — no intermediate servers, no data stored locally.
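A single round trip over stdio uses the standard MCP JSON-RPC envelope. The sketch below shows a request to the `list_checks` tool followed by an illustrative response; the envelope shape follows the MCP specification, while the response payload text is a placeholder.

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call", "params": {"name": "list_checks", "arguments": {}}}

{"jsonrpc": "2.0", "id": 1, "result": {"content": [{"type": "text", "text": "2 checks: api.example.com (up), www.example.com (up)"}]}}
```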
Performance
Tool calls execute in under 500ms for most queries. The server starts instantly via npx with zero warm-up time. Rate limits are enforced server-side at 5 requests/minute per API key, with generous daily limits for normal AI conversation patterns.
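A client that wraps the API can pace itself against the documented 5 requests/minute limit with a simple sliding-window limiter. This is an illustrative sketch, not part of the exit1-mcp package:

```typescript
// Sliding-window rate limiter matching Exit1's documented limit of
// 5 requests per minute per API key. Illustrative only.
class RateLimiter {
  private timestamps: number[] = [];

  constructor(private max: number, private windowMs: number) {}

  // Returns true if a call may proceed now; false if it would
  // exceed `max` calls within the trailing `windowMs` window.
  tryAcquire(now: number = Date.now()): boolean {
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.max) return false;
    this.timestamps.push(now);
    return true;
  }
}
```

A caller would check `tryAcquire()` before each API request and wait (or queue) when it returns false, rather than hitting the server-side limit.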
API
The MCP server exposes 5 tools: list_checks, get_check, get_check_history, get_check_stats, and get_status_page. All tools are read-only and require a checks:read scoped API key. The server is published as exit1-mcp on npm and distributed via npx.
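A parameterized tool call follows the same `tools/call` pattern. The argument names `check_id` and `limit` and the ID value below are illustrative guesses, not the package's documented schema:

```json
{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "get_check_history", "arguments": {"check_id": "chk_123", "limit": 10}}}
```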
Frequently Asked Questions
Everything you need to know about our monitoring
MCP Is a Nano Plan Feature
Connect your AI assistant to exit1.dev and query monitoring data conversationally. Available on the Nano plan starting at $3/month.
API & Webhooks
Full REST API access for programmatic integration. The MCP server is built on top of the same API.
Analytics & Reports
Track uptime trends and response times. MCP lets you query the same data conversationally.
Smart Alerting
Get notified via email, SMS, and webhooks. Use MCP to investigate after alerts fire.
Status Pages
Public status pages for your customers. Query their current state via MCP.
Real-Time Monitoring
Live status updates for websites and APIs. Ask your AI assistant about current status anytime.
Global Monitoring
Monitor from multiple regions. MCP surfaces data from all regions in a single conversation.