Grant Application: CoW Protocol Integration Documentation
Problem Statement
A developer builds an order submission flow. The request compiles. Parameters match the API reference. The order fails silently. A few hours later, the answer appears in the source code: buyAmount requires slippage to be packed manually.
Situations like this are common during a first integration. Developers often discover that some important behaviors, such as amount semantics, approval flows, or certain error responses, are clearer in the codebase than in the documentation.
Improving how these details are surfaced in the documentation could make the integration experience much smoother, allowing developers to build confidently without inspecting the source.
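To make the friction concrete: the kind of detail that should live in the docs is the amount math itself. The sketch below is illustrative only, assuming a sell order where the integrator must reduce the quoted buy amount by a slippage tolerance before submission; the helper name and basis-point convention are our assumptions, not the confirmed CoW API contract.

```typescript
// Hypothetical helper showing the amount semantics the docs should spell out:
// for a sell order, the slippage tolerance is applied to the quoted buyAmount
// before the order is submitted. Amounts are bigint to avoid float rounding.

/** Reduce a quoted buy amount by a slippage tolerance given in basis points. */
function applySlippageBps(quotedBuyAmount: bigint, slippageBps: bigint): bigint {
  if (slippageBps < 0n || slippageBps > 10_000n) {
    throw new Error("slippageBps must be between 0 and 10000");
  }
  return (quotedBuyAmount * (10_000n - slippageBps)) / 10_000n;
}

// Example: a quoted buyAmount of 1,000,000 units with 0.5% (50 bps) tolerance.
const minBuyAmount = applySlippageBps(1_000_000n, 50n);
console.log(minBuyAmount); // 995000n
```

Documenting this one formula, with its rounding direction, would have saved the developer in the scenario above several hours.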
1. Why CoBuilders
CoBuilders is a technical research and development team that works with blockchain protocols on infrastructure, integrations, and developer tooling.
Our work focuses on the layer where developers actually interact with protocols: APIs, relayers, signing flows, and integration tooling. This perspective strongly influences how we approach documentation: not as external writers, but as builders who have faced the same integration challenges.
Our team has collaborated with organizations such as OpenZeppelin, Arbitrum, Worldcoin, and ZetaChain, delivering work that spans protocol research, developer infrastructure, and technical documentation.
Within the CoW ecosystem, we have already completed a grant project integrating Otterscan into the CoW Protocol Playground, improving transaction inspection and debugging for local development environments. The project progressed through multiple milestones and was delivered through public pull requests and documentation updates.
- December 5, 2025: grant approval by the CoW grants committee.
- February 28, 2026: milestone 3 completion posted.
- Source: CoW Forum grant thread
Additional technical documentation was delivered as part of that engagement.
Because of this background, we approach documentation from a practical integration perspective: identifying where developers actually get blocked and designing documentation that directly resolves those friction points.
2. What We Deliver
To address both immediate integration friction and long-term documentation maintainability, we propose a hybrid approach composed of two complementary tracks.
Track B: Targeted Long-Form Documentation
- Documentation Overhaul
Rewrite of every high-friction area identified in the RFC. All code examples are in TypeScript, Python, and cURL.
| Area | What Changes |
|---|---|
| Order creation | Amount semantics, slippage packing, fee handling, validTo semantics, and appData hashing; field-level precision with working examples |
| Approval setup | GPv2VaultRelayer addresses per chain, gasless approval flow, ABI-level relay patterns |
| Quote selection | Fast vs optimal: when to use each, response differences, latency tradeoffs, timing between quote and order creation |
| Error reference | Structured by integration stage: quoting → signing → submission → settlement. Root cause + concrete fix for each |
| Rate limits & quotas | Per-endpoint limits, backoff strategy, and quota headers; critical for bots and aggregator backends |
To better identify real integration friction, we will also consult active integrators and developers building on CoW Protocol. These conversations help surface common onboarding blockers and ensure the documentation overhaul focuses on the issues developers actually encounter during integration.
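As an example of the concrete guidance the rate-limit section would ship with, here is a sketch of an exponential backoff policy expressed as a pure delay computation, so it is easy to document and test. The base delay, cap, and retry count below are placeholders, not CoW's actual quotas.

```typescript
// Illustrative backoff policy for retrying rate-limited (HTTP 429) requests:
// exponential growth with a hard cap. Constants are placeholders for the
// values the real documentation would state per endpoint.

/** Delay in milliseconds before retry attempt `attempt` (0-based). */
function backoffDelayMs(attempt: number, baseMs = 250, capMs = 8_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

const delays = [0, 1, 2, 3, 4, 5, 6].map((a) => backoffDelayMs(a));
console.log(delays); // [250, 500, 1000, 2000, 4000, 8000, 8000]
```

Pairing this with the actual quota headers returned by the API would let integrators tune the constants instead of guessing.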
- Three Quickstart Paths (< 10 minutes each)
Each targets a different integration pattern and language:
| Path | Language | Use Case |
|---|---|---|
| Swap Order | TypeScript + Python | Frontend/dApp, most common integration entry point |
| Limit Order | TypeScript + Python | Backend bots, automation, algorithmic trading |
| Raw API | cURL | Quick exploration, language-agnostic validation |
Each path goes from zero to a confirmed order on staging. Every quickstart is backed by an executable artifact that CI validates on every push.
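One way a quickstart artifact can stay CI-testable without hitting the network is to build the request body as a pure function and assert on it, with the live POST kept in the staging quickstart. The field names below follow the public order-book quote endpoint, but they are assumptions to be verified against the OpenAPI spec, not an authoritative schema.

```typescript
// Sketch of a CI-verifiable quickstart building block: construct the quote
// request payload as data, so tests can assert on it deterministically.
// Field names are assumed from the public order-book API and must be
// checked against the OpenAPI specification.

interface QuoteRequest {
  sellToken: string;
  buyToken: string;
  from: string;
  kind: "sell" | "buy";
  sellAmountBeforeFee: string;
}

function buildSellQuoteRequest(
  sellToken: string,
  buyToken: string,
  from: string,
  sellAmount: bigint,
): QuoteRequest {
  return {
    sellToken,
    buyToken,
    from,
    kind: "sell",
    // On-chain amounts travel as decimal strings in the JSON API.
    sellAmountBeforeFee: sellAmount.toString(),
  };
}
```

The quickstart itself would then POST this payload to the staging order-book quote endpoint, keeping the network call out of the unit-tested path.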
- Full-Section Validation + Cross-Repo CI
Every authored section ships with:
- A test-case manifest (YAML).
- Executable code artifacts, the same snippets shown in the docs.
- Deterministic pass/fail checks.
- Evidence logs for reviewer verification.
Every authored documentation section is backed by executable code artifacts and a test manifest. When documentation or snippets change, CI runs these artifacts to verify they still match the expected API behavior.
To prevent documentation from drifting as the protocol evolves, the system also performs cross-repository drift detection. When a relevant cow-services repository merges a change into its main branch, it triggers a repository_dispatch event in the documentation repository. This automatically re-runs the artifact validation pipeline against the live API contract.
If an API change breaks a documented claim, the pipeline flags the mismatch and surfaces a visible warning so maintainers can update the documentation before integrators encounter stale information.
This goes beyond typical CI checks on quickstarts: it introduces contract-level validation between implementation and documentation across repositories.
Implementation details, including CI wiring, job topology, and notification channels, will be finalized with CoW maintainers to align with the existing repository governance.
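The core of the drift check can be illustrated as a comparison between the field names a documentation section claims and the schema exposed by the live OpenAPI document. This is a simplified sketch; the real pipeline would be wired to repository_dispatch and the actual spec URL, both to be agreed with maintainers.

```typescript
// Simplified core of the cross-repo drift check: given the fields a docs
// section documents and the JSON-schema object fetched from the live
// OpenAPI spec, report documented fields the spec no longer exposes.

type Schema = { properties?: Record<string, unknown> };

/** Return documented field names that no longer exist in the spec schema. */
function findDriftedFields(documented: string[], schema: Schema): string[] {
  const live = new Set(Object.keys(schema.properties ?? {}));
  return documented.filter((field) => !live.has(field));
}

// Example with a mock schema: the docs still mention `feeAmount`,
// but the (hypothetical) spec has dropped it.
const mockOrderSchema: Schema = {
  properties: { sellToken: {}, buyToken: {}, validTo: {} },
};
console.log(findDriftedFields(["sellToken", "feeAmount"], mockOrderSchema));
// [ 'feeAmount' ]
```

A non-empty result would fail the pipeline and surface the warning described above.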
Track A: Systematic/AI-Based Tooling
- MCP Assistant
An AI assistant exposed via the Model Context Protocol (MCP) that helps developers interact with protocol documentation and API specifications.
Rather than acting as a traditional chatbot, the assistant helps integrators navigate documentation and generate integration artifacts, such as request payloads, parameter explanations, and example snippets.
The final shape of the assistant will be defined collaboratively with the CoW core team. Two architectural approaches are possible.
Possible Approaches
- Documentation RAG
One approach is to build a retrieval-augmented system that indexes the protocol documentation and OpenAPI specifications.
In this model:
- Documentation and specs are indexed and kept in sync with public sources.
- The LLM retrieves relevant documentation fragments and generates answers based on them.
- The assistant behaves similarly to an AI documentation reader that explains API behavior and integration patterns.
This approach prioritizes flexibility and natural documentation exploration.
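The retrieval step can be pictured with a toy sketch: score documentation chunks against the query and return the best matches. A production system would use embeddings and a vector index; this keyword-overlap version only illustrates the flow and is entirely our assumption about the eventual design.

```typescript
// Toy retrieval step for the RAG approach: rank documentation chunks by
// keyword overlap with the query. A real implementation would use
// embeddings; this sketch only demonstrates the retrieve-then-answer flow.

function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().split(/\W+/).filter(Boolean));
}

/** Return the k chunks sharing the most tokens with the query. */
function topChunks(query: string, chunks: string[], k = 2): string[] {
  const queryTokens = tokenize(query);
  return chunks
    .map((chunk) => {
      let score = 0;
      for (const token of tokenize(chunk)) if (queryTokens.has(token)) score++;
      return { chunk, score };
    })
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((scored) => scored.chunk);
}
```

The retrieved chunks would then be passed to the LLM as grounding context for the answer.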
- Schema-driven MCP tools
Another approach is to expose deterministic tools derived from the OpenAPI specification through the MCP server.
In this model:
- Tools are generated directly from OpenAPI operations.
- The LLM interprets the user’s request and calls the appropriate tool.
- The tool produces structured outputs such as request payloads, parameter tables, or example code.
This reduces the risk of hallucinations because the LLM does not need to infer API schemas from documentation; the schema is generated directly from the spec and resolved by the tool.
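The schema-driven idea can be sketched as a deterministic transformation from an OpenAPI operation object to a parameter listing, so the model never has to guess field names. The operation shape below is a minimal subset of OpenAPI 3; the rendering format is a placeholder.

```typescript
// Sketch of a schema-driven MCP tool: derive a parameter listing directly
// from an OpenAPI operation object. Because the output comes from the spec,
// not from model inference, it cannot hallucinate field names.

interface Param {
  name: string;
  required?: boolean;
  schema?: { type?: string };
}
interface Operation {
  parameters?: Param[];
}

/** Render "name: type (required|optional)" rows from an operation. */
function parameterRows(op: Operation): string[] {
  return (op.parameters ?? []).map(
    (p) =>
      `${p.name}: ${p.schema?.type ?? "unknown"} (${p.required ? "required" : "optional"})`,
  );
}

// Hypothetical operation fragment for illustration.
const op: Operation = {
  parameters: [
    { name: "orderUid", required: true, schema: { type: "string" } },
    { name: "offset", schema: { type: "integer" } },
  ],
};
console.log(parameterRows(op));
// [ 'orderUid: string (required)', 'offset: integer (optional)' ]
```

The same generation step could emit request payload templates or example code instead of tables.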
These two approaches are not mutually exclusive, and the final implementation may combine both, for example:
- RAG for documentation discovery and conceptual questions
- deterministic tools for schema generation and request construction
The architecture will be finalized together with the CoW team once the integration requirements and preferred developer workflow are validated.
- Access Points
The assistant can be accessed from multiple environments:
- Documentation site: via a Docusaurus plugin embedded in the docs.
- External LLM clients: developers can connect their preferred client (Claude, ChatGPT, or local models) to the MCP server and access the same assistant capabilities from their own workflow.
- Monthly Gap Report
The assistant logs questions it cannot confidently answer using the available documentation or tools.
A monthly report summarizes:
- frequently unanswered queries
- documentation gaps revealed by integrator questions
- recommended improvements or additions to the docs
This provides continuous feedback about where documentation can be improved.
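The aggregation behind the report is deliberately simple; a sketch of it, with storage and thresholds left as placeholders, could look like this:

```typescript
// Sketch of the monthly gap-report aggregation: count logged unanswered
// queries and surface the most frequent ones. Persistence, normalization,
// and reporting thresholds are placeholders to be decided with maintainers.

/** Return the k most frequent queries as [query, count] pairs. */
function topUnanswered(log: string[], k = 3): [string, number][] {
  const counts = new Map<string, number>();
  for (const query of log) counts.set(query, (counts.get(query) ?? 0) + 1);
  return [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, k);
}

console.log(
  topUnanswered([
    "how do I set validTo?",
    "what is appData?",
    "how do I set validTo?",
  ]),
);
// [ [ 'how do I set validTo?', 2 ], [ 'what is appData?', 1 ] ]
```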
Handover
All work is delivered through production-ready pull requests and repositories, allowing the CoW team to review, merge, and maintain the system within their existing workflow.
Documentation
- Documentation improvements delivered as PRs to the official docs repository
- Includes rewritten sections, quickstarts, examples, and error references
Documentation Validation
- CI validation system delivered either:
  - integrated into the docs repo, or
  - as a separate docs-testing repository (if maintainers prefer separation)
- Includes executable artifacts, test manifests, and cross-repo drift detection.
AI Assistant (Track A)
If Track A is implemented, the MCP assistant is delivered as a separate deployable repository with Docker/Kubernetes deployment support.
Operational Notes
Runbooks and setup documentation are included to enable the CoW team to maintain and extend the system independently.
3. Budget & Timeline
Commercial Envelope
| Scope | Budget | Duration |
|---|---|---|
| Core (docs overhaul + 3 quickstarts + validation + cross-repo CI + handover) | $16,500 | 6 weeks |
| MCP assistant + gap reports | $6,500 | +2 weeks |
| Full engagement | $23,000 | 8 weeks |
Payment Structure
| Milestone | Amount | Deliverable |
|---|---|---|
| M1: Kickoff + scope lock | $2,500 | Scope map, baseline audit, quickstart skeletons |
| M2: Documentation delivery | $7,000 | All docs sections + 3 quickstart paths + error reference + rate limits |
| M3: Validation + cross-repo CI | $4,500 | Test manifests, executable artifacts, cross-repo dispatch setup |
| M4: Handover | $2,500 | Runbook, ownership notes, final review |
| M5: MCP assistant | $6,500 | MCP server, Docusaurus plugin, monthly gap report template |
Core Team Time
Fixed estimate: 10 hours total.
| Activity | Hours |
|---|---|
| Kickoff and scope alignment | 2 |
| Technical review of docs sections | 3 |
| Quickstart validation | 2 |
| Validation and CI review | 1.5 |
| Final handover | 1.5 |
Delivery Timeline
| Week | Checkpoint |
|---|---|
| 1 | Scope map, baseline audit, quickstart skeletons |
| 2–3 | Full docs sections: order creation, approvals, quoting, error reference, rate limits |
| 4 | 3 quickstart paths complete and testable |
| 5 | Validation manifests + cross-repo CI setup |
| 6 | Handover: runbook, ownership notes, final review |
| 7–8 | MCP assistant: server, plugin, gap report template |
4. Out of Scope
Aligned with RFC exclusions:
- Solver documentation.
- CoW AMM docs.
- MEV Blocker docs.
- Platform migration unrelated to this integration scope.
- End-user conceptual rewrites.