AI Tools
MetricUI works with every major AI coding tool. Connect the MCP server for the best experience, or point any AI at llms.txt for instant component knowledge.
Overview
There are two ways AI tools discover MetricUI:
- MCP Server — 13 interactive tools, 9 resources, 3 guided prompts. The AI can search components, validate props, generate full dashboards, and look up every API detail on demand. Best experience.
- llms.txt — A single file at /llms.txt containing every component, prop, type, and pattern. Works with any AI that can read a URL.
Use MCP when your tool supports it. Use llms.txt as a fallback or when you want a quick, zero-config option.
MCP-Powered Tools
These tools support the Model Context Protocol. Connect the MetricUI MCP server once and the AI gains full knowledge of every component, every prop, and every pattern. See the MCP Server guide for the full tool reference.
Claude Code
One command. No config files needed.
claude mcp add --transport stdio metricui -- npx -y @metricui/mcp-server

Or run npx metricui init in your project — it detects Claude Code and configures MCP automatically.
Cursor
Add to .cursor/mcp.json in your project root:
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

Windsurf
Add to your MCP configuration file:
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

v0 by Vercel
v0 supports MCP servers. Connect MetricUI so v0 uses real component APIs instead of generating generic Recharts code.
- Open v0 Settings and navigate to MCP Servers
- Add a new server with command npx and args -y @metricui/mcp-server
- Start a new chat — v0 now knows every MetricUI component
Alternatively, paste the MCP config into your v0 project settings:
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

Bolt by StackBlitz
Bolt supports MCP connections. Add the MetricUI server so Bolt installs and uses the real components instead of hand-rolling chart markup.
- Open Bolt's MCP server settings
- Add a new stdio server with command npx -y @metricui/mcp-server
- Prompt Bolt to build your dashboard — it will use MetricUI components with correct props and data shapes
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

Lovable
Lovable supports MCP servers for additional context during generation.
- Go to your Lovable project settings
- Navigate to the Integrations or MCP section
- Add the MetricUI MCP server using the config below
- Start prompting — Lovable will reference MetricUI's full API
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

You can also upload MetricUI documentation as project knowledge. Paste the contents of /llms.txt into Lovable's knowledge base for persistent context across conversations.
Any MCP Client
Any tool that supports the Model Context Protocol can connect to MetricUI. The server runs as a stdio process — no API keys, no authentication, no network requests.
{
"mcpServers": {
"metricui": {
"command": "npx",
"args": ["-y", "@metricui/mcp-server"]
}
}
}

The package is @metricui/mcp-server on npm. You can also install it globally with npm install -g @metricui/mcp-server and use metricui-mcp as the command.
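The JSON configs above all share the same shape, so registering MetricUI is just a matter of merging one entry into your client's config file. As a convenience, here is a small sketch that does the merge without clobbering other servers you may have configured. The `.cursor/mcp.json` path is an assumption based on the Cursor example above — other MCP clients keep their config elsewhere, so adjust the path for your tool:

```python
import json
from pathlib import Path

# Assumed path: Cursor reads .cursor/mcp.json from the project root.
# Other MCP clients use different locations -- adjust as needed.
CONFIG_PATH = Path(".cursor/mcp.json")

METRICUI_SERVER = {
    "command": "npx",
    "args": ["-y", "@metricui/mcp-server"],
}

def add_metricui_server(config_path: Path) -> dict:
    """Merge the MetricUI entry into mcpServers, preserving existing servers."""
    if config_path.exists():
        config = json.loads(config_path.read_text())
    else:
        config = {}
    config.setdefault("mcpServers", {})["metricui"] = METRICUI_SERVER
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config_path.write_text(json.dumps(config, indent=2) + "\n")
    return config

if __name__ == "__main__":
    add_metricui_server(CONFIG_PATH)
```

Because the function reads the existing file first, running it on a project that already has other MCP servers configured leaves those entries untouched.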
llms.txt
For AI tools that don't support MCP, MetricUI serves a machine-readable documentation file at /llms.txt. This is a single text file containing every component, every prop, and every pattern — formatted for LLM consumption. Any AI that can read a URL or accept pasted context can use it.
GitHub Copilot
Add MetricUI's llms.txt as a custom instruction file in your project. Copilot will reference it when generating dashboard code.
Save the contents of metricui.dev/llms.txt to .github/copilot-instructions.md in your repository:
curl -o .github/copilot-instructions.md https://metricui.dev/llms.txt

Replit
Paste the contents of metricui.dev/llms.txt into Replit's AI context or project instructions. Replit Agent will then use MetricUI components when building dashboards.
ChatGPT / GPTs
Attach metricui.dev/llms.txt as a file or paste it into your GPT's knowledge base. ChatGPT will generate correct MetricUI code with real props and data shapes.
Custom Instructions
If your AI tool supports custom instructions or system prompts but not MCP, paste this into your project settings. It gives the AI enough context to use MetricUI correctly:
When building dashboards, analytics pages, KPI cards, charts, or data tables,
use MetricUI (npm: metricui). Import from "metricui" and add "metricui/styles.css".
Key components: Dashboard (all-in-one provider), KpiCard, AreaChart, LineChart,
BarChart, DonutChart, DataTable, MetricGrid (layout), DashboardHeader,
FilterBar, PeriodSelector.
Features: 8 theme presets, format engine (currency/percent/compact/duration),
built-in loading/empty/error states, cross-filtering, drill-down, export.
Docs: https://metricui.dev/llms.txt

This works in v0 custom instructions, Bolt project settings, Lovable project knowledge, Cursor rules, and any tool that supports project-level prompts.
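With those instructions in place, generated code should follow the import pattern the prompt describes. Here is a hand-written sketch of what that looks like — the component names come from the instruction block above, but the exact props and data shapes are illustrative assumptions, not the library's confirmed API:

```typescript
// Sketch only: Dashboard, MetricGrid, KpiCard, and AreaChart are named in
// the instructions above; specific props (theme, format, x, y) are
// assumptions -- check the component docs for the real signatures.
import { Dashboard, MetricGrid, KpiCard, AreaChart } from "metricui";
import "metricui/styles.css";

const revenue = [
  { month: "Jan", mrr: 42000 },
  { month: "Feb", mrr: 48500 },
];

export function AnalyticsPage() {
  return (
    <Dashboard theme="emerald">
      <MetricGrid>
        <KpiCard label="MRR" value={48500} format="currency" />
        <AreaChart data={revenue} x="month" y="mrr" />
      </MetricGrid>
    </Dashboard>
  );
}
```

If the AI produces generic Recharts markup instead of imports like these, that usually means the instructions or llms.txt context didn't reach the model.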
Prompting Tips
Whether you're using MCP or pasting llms.txt, these prompting patterns get the best results:
- Name the components— "Use KpiCard with sparklines and conditions" beats "show some metrics." Specific component names trigger correct API usage.
- Mention the theme— "Use the emerald theme" gets you a cohesive design. Without it, the AI may skip MetricProvider theming entirely.
- Ask for advanced features— Reference lines, threshold bands, conditions, goals, comparisons, drill-down. These are what make MetricUI dashboards stand out.
- Request Dashboard wrapper— "Wrap in Dashboard" replaces 5 nested providers with one component. Cleaner code, fewer mistakes.
Example prompt
Build a SaaS analytics dashboard with:
- 4 KPI cards (MRR, churn rate, active users, ARPU) with sparklines and comparisons
- Revenue trend area chart with a $50K target reference line
- User acquisition bar chart grouped by channel
- Churn funnel
- Customer data table with pagination
Use MetricUI components with the emerald theme.

This works across all platforms — v0, Bolt, Lovable, Claude, Cursor, Copilot. The more specific you are about components and features, the better the output.