Unifying AI Coding Assistant Configs with Ruler
Ruler actually solves the fucking nightmare of managing configs across 20+ AI coding assistants
Every AI coding tool wants its configuration in a different fucking place. Claude Code uses `CLAUDE.md`, OpenCode wants `AGENTS.md`, Cursor needs `.cursor/rules/`, Copilot expects `.github/copilot-instructions.md`. If you're using multiple tools, you're copy-pasting the same instructions everywhere like an idiot.
The Problem: Configuration Chaos
Look at this shit:

- Claude Code: `~/CLAUDE.md`
- OpenCode: `~/AGENTS.md`
- Cursor: `.cursor/rules/ruler_cursor_instructions.md`
- Copilot: `.github/copilot-instructions.md`
- Windsurf: `.windsurf/rules/ruler_windsurf_instructions.md`
- Aider: both `AGENTS.md` AND `.aider.conf.yml`
- 20+ more tools, each with their own bullshit location
When I update my instructions, I have to remember to update ALL of them. Fuck that.
The Solution: Ruler Does the Distribution
Ruler maintains ONE source of truth in `.ruler/` and automatically distributes it to all tools. It's already built by Eleanor Berger (intellectronica) and works perfectly; don't reinvent it.
```shell
# Install Ruler (it's npm now, not pip)
npm install -g @intellectronica/ruler

# Initialize in your home directory
cd ~
ruler init

# This creates:
# .ruler/
# ├── AGENTS.md   (your main instructions)
# └── ruler.toml  (config for which tools to update)
```
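For reference, `ruler.toml` looks roughly like this. Key names are from the Ruler README as I remember them, so treat it as a sketch and check the docs for your installed version:

```toml
# Sketch only - verify key names against the Ruler README
# Tools Ruler should distribute to by default
default_agents = ["claude", "opencode"]

# Per-tool overrides
[agents.copilot]
enabled = false
```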
Critical Safety: Preventing Context Explosion
Here's how I fucked up, so you can avoid it:

NEVER create output files in the `.ruler/` directory. Ruler includes ALL `.md` files recursively. If output files end up there, they get re-included on the next run, causing exponential growth. I hit 2.7 million tokens.
The safety protocol:
```shell
# GOLDEN RULE: only these files belong in .ruler/
~/.ruler/AGENTS.md    # Primary source
~/.ruler/agents/*.md  # Agent modules
~/.ruler/ruler.toml   # Configuration

# NEVER create these in .ruler/:
#   .windsurf/   # Tool outputs
#   .cursor/     # Tool configs
#   *.backup     # Temp files

# If contamination happens:
cd ~/.ruler
rm -rf .[a-z]*   # Remove hidden dirs
ruler revert     # Undo everything
ruler apply      # Re-apply clean
```
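To catch contamination before it snowballs, I'd run a quick check like this. This is my own guard, not a Ruler command; the allow-list matches the golden rule above:

```shell
#!/bin/sh
# Sketch (not part of Ruler): print anything in .ruler/ that isn't on
# the allow-list, since Ruler concatenates every .md it finds and stray
# tool output snowballs the context window.
check_ruler() {
  dir="$1"
  # hidden dirs like .windsurf/ or .cursor/ at the top level = contamination
  find "$dir" -mindepth 1 -maxdepth 1 -name '.*' -print
  # any other file outside AGENTS.md, agents/*.md, ruler.toml is suspect
  find "$dir" -mindepth 1 -name '.*' -prune -o -type f \
    ! -path "$dir/AGENTS.md" \
    ! -path "$dir/agents/*.md" \
    ! -name 'ruler.toml' \
    -print
}

# usage: prints nothing when clean, offenders otherwise
# check_ruler "$HOME/.ruler"
```

If it prints anything, that's your cue for `ruler revert && ruler apply`.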
MCP Server Configuration
Ruler also manages Model Context Protocol servers. Define them once in TOML:
```toml
[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/home/wv3"]

[mcp_servers.git]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-git", "--repository", "."]
```
These get distributed to compatible tools automatically. No more maintaining separate MCP configs.
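For Claude Code, for instance, that TOML ends up in the shape of the standard MCP config JSON. This is sketched from memory, so diff the file Ruler actually generates against it:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/wv3"]
    },
    "git": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-git", "--repository", "."]
    }
  }
}
```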
Results
- One source of truth in `.ruler/`: edit once, applies everywhere
- `ruler revert` if you fuck up

Why This Matters
Mad awesome frameworks are dropping like 2x a day: lots good, lots whatever, some real frickin' gems. I want to stay on top of them and be able to quickly test ideas, but using only Claude Code I realized I was seriously limiting myself. OpenCode is fucking sweet; that guy Dax knows how to build a TUI.
Now I edit `.ruler/AGENTS.md`, run `ruler apply`, and all tools get updated. When a new AI tool drops next week, I add 3 lines to `ruler.toml` and it's integrated.
Implementation Notes
Shit that will bite you:
- File discovery is recursive: ALL `.md` files get included
- Order matters: files concatenate alphabetically
- Ruler NEVER writes outside your project root (except global config)
- Agent formats differ between tools; conversion scripts are needed if you want to carry them over
- I've only done OpenCode and Claude Code so far
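Until I write real converters, the carry-over step is basically this: fan `.ruler/agents/*.md` out to each tool's agent directory. The destination paths are my assumption of where Claude Code and OpenCode look (`~/.claude/agents/` and `~/.config/opencode/agent/`), so verify them, and swap the `cp` for actual format conversion where tools disagree:

```shell
#!/bin/sh
# Sketch: fan agent modules out to each tool's directory. A straight
# copy was close enough for my Claude Code / OpenCode pair; a real
# converter would rewrite frontmatter per tool instead of just cp-ing.
sync_agents() {
  src="$1"; shift
  for dest in "$@"; do
    mkdir -p "$dest"
    # TODO: per-tool frontmatter conversion goes here
    cp "$src"/*.md "$dest"/ 2>/dev/null
  done
}

# usage (destination paths are assumptions - check your tools):
# sync_agents ~/.ruler/agents ~/.claude/agents ~/.config/opencode/agent
```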
Don't create your own distribution system. Ruler fucking works.
Next is plugging all this back into the inter-agent messaging system I built last week, which makes my different Claude instances talk to each other via tmux commands. Excited to see how this looks leveraging the smaller, cheaper models. OpenRouter literally has free models you can use; they aren't the most powerful, but admin tasks around the homelab don't need giant coding models to get the job done. OpenCode's prompting is solid enough that these free models literally work well.
Also fwiw, the free Grok Code Fast model on OpenRouter is snappy rn. Get it while it's hot. It feels like gpt-5 if it laid down the actual lines of code a bit heavier. Really nice time with it yesterday.