The first docs platform with native MCP support

Documentation your AI can read

Your architects write. Your AI implements. The bridge between human knowledge and AI action.

cursor — architect-docs
>Read the authentication architecture from Inovisum
Connecting to MCP server...
Found: /architecture/auth-flow.md
Found: /specs/oauth-implementation.md
Found: /decisions/ADR-042-jwt-strategy.md
>Implement the OAuth flow following the documented patterns
THE WORKFLOW

From architect's mind
to developer's code

01

Architect Documents

Write specs, ADRs, and architecture decisions in your vault. Native diagrams, rich formatting, living documentation.

02

AI Reads via MCP

Cursor, Claude, Copilot connect directly to your docs. No copy-paste. No stale context. Always current.

03

Developer Ships

AI implements following your documented patterns. Code matches architecture. Teams stay aligned.

"The architect's documentation becomes the AI's implementation guide.
Finally, specs that actually get followed."
DIFFERENTIATORS

What makes us
impossible to ignore

ONLY PLATFORM WITH THIS

Native MCP Support

Model Context Protocol is how AI agents read external data. We're the only documentation platform with first-class MCP support. Your Cursor, Claude, or custom agents can read and write your docs natively.

  • AI reads your architecture specs directly
  • Agents write drafts for review
  • @AI comments for async document review
  • Zero copy-paste, always in sync
// mcp-config.json
{
  "servers": {
    "inovisum": {
      "url": "https://your-org.inovisum.io",
      "vault": "architecture",
      "capabilities": [
        "read",
        "write",
        "search"
      ]
    }
  }
}
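Under the hood, MCP clients speak JSON-RPC 2.0. As an illustration of what a tool call from an editor to a docs vault might look like, here is a minimal sketch — the `search_docs` tool name and its arguments are hypothetical, not Inovisum's actual API; only the `tools/call` method and message envelope come from the Model Context Protocol specification:

```python
import json

def build_mcp_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    MCP messages follow JSON-RPC 2.0; "tools/call" is the standard
    MCP method for invoking a server-side tool.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)

# Hypothetical call: ask the docs server for the auth architecture.
request = build_mcp_tool_call(
    "search_docs",
    {"query": "authentication flow", "vault": "architecture"},
)
print(request)
```

In practice the MCP client library in your editor builds and transports these messages for you; the sketch only shows what travels over the wire.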
DATA SOVEREIGNTY

Your LLMs. Your Data.

Not everyone can send their proprietary architecture to OpenAI. Connect your own Ollama, vLLM, or any local model. Semantic search, AI features, zero cloud exposure.

  • Connect Ollama or any OpenAI-compatible endpoint
  • Semantic search with your own embeddings
  • AI features without data leaving your infra
  • Perfect for regulated industries
Self-hosted models
Zero cloud exposure
You own the repo
Local embeddings
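Semantic search over local embeddings reduces to nearest-neighbor lookup by cosine similarity. A minimal sketch, assuming the embedding vectors were already produced by a local model (the toy three-dimensional vectors and document paths below are illustrative only):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot product over norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, doc_vecs):
    """Rank document paths by cosine similarity to the query embedding."""
    ranked = sorted(
        doc_vecs.items(),
        key=lambda kv: cosine_similarity(query_vec, kv[1]),
        reverse=True,
    )
    return [path for path, _ in ranked]

# Toy embeddings standing in for vectors from a local model (e.g. Ollama).
docs = {
    "/architecture/auth-flow.md": [0.9, 0.1, 0.0],
    "/specs/payment-api.md": [0.1, 0.8, 0.2],
}
print(search([0.95, 0.05, 0.0], docs)[0])  # → /architecture/auth-flow.md
```

A real deployment would use a vector index instead of a linear scan, but the ranking principle is the same — and nothing in it requires a cloud API.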
TEXT-TO-DIAGRAM

28 Diagram Formats

Native support for Mermaid, PlantUML, D2, Excalidraw, GraphViz, and 23 more via Kroki integration. Your diagrams live in version control, render live, and never go stale.

Mermaid · PlantUML · D2 · Excalidraw · GraphViz · BPMN · C4 · ERD · +20 more
```mermaid
sequenceDiagram
    Architect->>Inovisum: Write specs
    Developer->>Cursor: "Read auth flow"
    Cursor->>Inovisum: MCP request
    Inovisum-->>Cursor: Architecture docs
    Cursor->>Developer: Implements pattern
```
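Kroki renders a diagram from a URL that encodes the source itself: deflate-compress, then URL-safe base64. A sketch of building such a URL for a snippet of the sequence diagram above, using the public kroki.io endpoint for illustration:

```python
import base64
import zlib

def kroki_url(diagram_type: str, source: str, fmt: str = "svg") -> str:
    """Encode diagram source the way Kroki expects:
    zlib deflate at max compression, then URL-safe base64."""
    compressed = zlib.compress(source.encode("utf-8"), 9)
    encoded = base64.urlsafe_b64encode(compressed).decode("ascii")
    return f"https://kroki.io/{diagram_type}/{fmt}/{encoded}"

url = kroki_url("mermaid", "sequenceDiagram\n    Architect->>Inovisum: Write specs")
print(url)
```

Because the whole diagram travels in the URL, rendering is stateless and cache-friendly; a self-hosted Kroki instance works the same way with a different base URL.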
ZERO LOCK-IN

You Own the Repo

Your documentation lives in Git. Full history, branch workflows, PR reviews for docs. If you ever leave, you keep everything. Standard Markdown, no proprietary format.

  • Full Git history for all changes
  • Branch and PR workflows for docs
  • Standard Markdown + extensions
  • Export anytime, no lock-in
git log --oneline
a3f2c1d Update OAuth flow diagram
8b4e2a1 Add ADR-042: JWT strategy
c9d3f5e Refactor auth architecture
2a1b8c4 Initial architecture docs
COMPARISON

The honest comparison

We're not trying to be everything. We're built for AI-native technical documentation.

| Feature | Inovisum | Notion / Confluence / GitBook |
| --- | --- | --- |
| MCP Protocol (AI reads docs) | ✓ | — |
| Local LLM Support | ✓ | — |
| Diagram Formats | 28 | 3 |
| @AI Comment Review | ✓ | — |
| Git-Backed Storage | ✓ | — |
| Semantic Search | ✓ | basic |
| Real-time Collaboration | ✓ | ✓ |
| Data Sovereignty | ✓ | self-host $$$ |
SECURITY-FIRST

Your architecture.
Your infrastructure.

We know your architecture docs contain sensitive information. That's why we give you complete control over where your data lives and which AI models touch it.

Bring Your Own LLM
Connect Ollama, vLLM, or any OpenAI-compatible endpoint. AI features work without sending data to external providers.
Git-Native Storage
Your docs live in a Git repository you control. Full history, full ownership, full portability.
SSO & RBAC
Enterprise-grade access control. SAML, OIDC, domain restrictions, role-based permissions.
Your Git Repository
Full ownership, portable
Your LLM (Ollama)
Running locally
Zero Cloud Exposure
Data never leaves
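Role-based access control boils down to a mapping from roles to allowed actions. A minimal sketch — the role and action names below are hypothetical, not Inovisum's actual permission model:

```python
# Hypothetical role → allowed-actions mapping for a docs vault.
ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "admin": {"read", "write", "manage_members"},
}

def is_allowed(role: str, action: str) -> bool:
    """Check whether a role may perform an action on a vault.

    Unknown roles get no permissions at all (deny by default).
    """
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("editor", "write"))   # True
print(is_allowed("viewer", "write"))   # False
```

Deny-by-default is the important design choice here: an unrecognized role or action fails closed rather than open.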
EARLY ACCESS

Free during beta.
All features unlocked.

Help us shape the future of AI-native documentation. Get full access while we're in beta.

FREE BETA
Early Access
$0
Full access while in beta
  • 1 personal organization (default)
  • Up to 2 additional organizations
  • 3 vaults per organization
  • 100 MB storage per organization
  • All diagram formats (28+)
  • Full MCP integration
  • Local LLM support via MCP
  • Semantic search
  • Private & public vaults
  • Git sync
Paid Plans
Coming Soon
After beta ends

We're focused on building the best AI-native documentation platform. Once we're out of beta, we'll introduce fair, transparent pricing.

What to expect:
  • Generous free tier stays
  • Team plans for larger orgs
  • Enterprise with SSO & SLA

Ready to bridge the gap
between docs and code?

Join engineering teams who've made their documentation an active part of their development workflow.

Inovisum · Documentation for AI-native teams
© 2025 Inovisum. All rights reserved.