Claude
MCP Native

Full bidirectional context via the Model Context Protocol. Works with Claude Desktop and claude.ai Projects. Read threads, write observations, search context, share assets — all natively.

Read/write threads & observations
Semantic search across all context
Share and retrieve assets
Real-time sync with other tools
Works with Desktop + Web
Quick Setup
// claude_desktop_config.json
{
  "mcpServers": {
    "sleds": {
      "command": "npx",
      "args": ["sleds-mcp"]
    }
  }
}
Cursor
MCP Native

Your coding AI gets full project context. Architecture decisions, API specs, design tokens — Cursor sees everything the team has decided, without you copy-pasting a thing.

Auto-loads relevant specs while coding
Writes observations as you build
References shared assets inline
Sees decisions from other tools
MCP config in Settings or .cursor/mcp.json
Quick Setup
// .cursor/mcp.json
{
  "mcpServers": {
    "sleds": {
      "url": "https://api.sleds.ai/mcp"
    }
  }
}
ChatGPT
Custom GPT Actions

Build a Custom GPT that's aware of your entire project. Connect via OpenAPI Actions spec — ChatGPT gets full CRUD access to your sled's threads, assets, and search.

OpenAPI 3.0 Actions integration
Full CRUD on threads & assets
Semantic search across context
Custom GPT system prompt included
Works with any GPT-4/4o model
Quick Setup
// Import as Custom GPT Action
1. Create a new Custom GPT
2. Import OpenAPI spec from:
api.sleds.ai/openapi.json
3. Add your API key
4. Start chatting with context
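Under the hood, each Action call is just an authenticated HTTP request against the imported spec. A minimal TypeScript sketch of the request a Custom GPT would issue to the search endpoint shown elsewhere on this page; the JSON body shape (`{ query }`) is an assumption for illustration, not a documented contract:

```typescript
// Sketch of the HTTP request behind a Custom GPT Action call.
// Endpoint path comes from the Gemini example on this page;
// the { query } body shape is an assumption.
const SLEDS_API = "https://api.sleds.ai";

function buildSearchRequest(sled: string, query: string, apiKey: string): Request {
  return new Request(`${SLEDS_API}/api/spaces/${sled}/search`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query }),
  });
}

const req = buildSearchRequest("my-sled", "design tokens", "sk-example");
console.log(req.url); // https://api.sleds.ai/api/spaces/my-sled/search
```

ChatGPT fills in the query and sends the request itself once the spec and API key are configured; the sketch only shows what travels over the wire.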
Gemini / AI Studio
MCP + REST API

Google's AI tools connect through MCP for native integration, or through the REST API for AI Studio function calling. Full access to shared context either way.

MCP for native tool integration
REST API for AI Studio functions
Read/write all context types
Search and link capabilities
Structured data support
Quick Setup
// AI Studio Function
const sleds = await fetch(
  "https://api.sleds.ai/api/spaces/my-sled/search",
  { headers: { Authorization } }
);
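For AI Studio function calling, a request like the one above is typically exposed to the model through a function declaration so Gemini knows when to invoke it. A hedged sketch; the function name `search_sled` and its parameter schema are illustrative assumptions, not a published sleds API:

```typescript
// Illustrative Gemini-style function declaration for sled search.
// Name and schema are assumptions for this sketch.
const searchSledDeclaration = {
  name: "search_sled",
  description: "Semantic search across a sled's shared context",
  parameters: {
    type: "object",
    properties: {
      query: {
        type: "string",
        description: "Natural-language search query",
      },
    },
    required: ["query"],
  },
};
```

When the model emits a `search_sled` call, your function handler performs the REST request and returns the results as the function response.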
Figma
Plugin

Push design tokens, component specs, and visual decisions from Figma directly into your sled. Engineering AI tools reference them instantly — no handoff docs needed.

Push design tokens to sled
Share component specifications
Reference decisions from Figma
Design → code context bridge
Coming: auto-sync on publish
Quick Setup
// Install Figma Plugin
1. Search "sleds" in Figma Community plugins
2. Install and authenticate
3. Push tokens with one click
Coming Q2 2026
Slack
Bolt SDK

Bidirectional Slack integration. Capture context from conversations with emoji reactions or slash commands. Get notified in channels when your sled updates.

🧠 emoji reaction to capture
/sleds slash command to save context
/sleds search from any channel
Channel notifications on updates
Configurable per-sled channels
Quick Setup
// Install Slack App
1. Add sleds to your workspace
2. Connect to a sled in Settings
3. React with 🧠 to capture
4. Use /sleds to search
Available now ✓
GitHub
GitHub App

Connect your repos to sleds via GitHub App. PR events, issues, and deployments flow into sled threads automatically. Frost can create issues and comment on PRs.

Org-level webhook events
PR & issue sync to threads
Deploy status tracking
Frost can create issues & comment
Coming Q2 2026
Quick Setup
// Install GitHub App
1. Install the sleds GitHub App on your org or repo
2. PR, issue & deploy events sync to sled threads
3. Frost comments on PRs
Coming Q2 2026
Linear
REST API

Sync project tracking with context. Issues and project updates from Linear flow into your sled, giving AI tools awareness of what's being built and when.

Issue sync to sled threads
Project status tracking
Cycle and milestone awareness
Frost can create/update issues
Coming Q2 2026
Quick Setup
// Connect Linear workspace
1. Authorize sleds in Linear
2. Select projects to sync
3. Issues auto-create threads
4. Cycle status updates flow to Frost
Coming Q2 2026

Ready to ride?

Join early access. Free to start.

Get Started Free →