CoW Protocol Integration Documentation

CoW Protocol’s current documentation creates measurable friction for integrators. BD has reported that even large partners (Binance-tier) struggle with basic order creation. The search functionality returns poor results. Parameter descriptions are written from an internal engineering perspective rather than from an integrator’s.

Recent examples illustrate the gap. A team that has already integrated limit orders reached out to ask how to set token approval via the ABI for a gasless swap. Another partner couldn't determine that the buy amount field requires slippage to be packed in. These aren't conceptual misunderstandings; they are symptoms of the parameter-level clarity the existing documentation lacks.
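To make the second example concrete, here is a hedged sketch of what "packing slippage into the buy amount" means for a sell order. The function name and the basis-points convention are illustrative, not taken from the actual API; amounts are integers in the token's smallest unit.

```python
# Illustrative only: names and conventions are assumptions, not the real API.

def min_buy_amount(quoted_buy_amount: int, slippage_bps: int) -> int:
    """Reduce the quoted buy amount by the slippage tolerance.

    In a signed sell order, buyAmount is the minimum the trader will
    accept, so the tolerance must be subtracted here rather than passed
    as a separate parameter.
    """
    return quoted_buy_amount * (10_000 - slippage_bps) // 10_000

# A 0.5% (50 bps) tolerance on a quoted 1_000_000 units:
print(min_buy_amount(1_000_000, 50))  # 995000
```

This is exactly the kind of two-line explanation the current docs omit and partners ask BD about.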

This adds unnecessary steps to what should be a straightforward integration path. Every extra click or context switch is an opportunity for a partner to deprioritize the integration or reach out to BD for help that documentation should provide.

Reference Models

The council identified DFlow (Solana) as the north star. After reviewing their documentation alongside Jupiter and 0x, clear patterns emerge.

  • DFlow organizes endpoints by use case rather than internal architecture. Each endpoint page includes a description, a parameter table with types and constraints, a response schema, and cURL examples. An “Ask AI” button appears on every page. Their quickstart guides help integrators through end-to-end workflows.

  • Jupiter takes a two-tier approach with Ultra (simple) and Swap API (advanced), providing clear guidance on when to use which. Code examples in TypeScript appear directly on reference pages alongside migration guides.

  • 0x provides a cheat sheet for quick contract and endpoint reference, decision trees for choosing between integration approaches, and human-readable error documentation. Their quickstart helps developers go from zero to their first successful call.

The common thread across all three: endpoint reference pages with parameter tables, code samples in multiple languages, interactive playground functionality, a quickstart that delivers results in under 10 minutes, and an error code reference with troubleshooting steps.

Approach Options

This RFC invites proposals across a spectrum of approaches. Applicants should indicate which track they propose to join.

Track A: Systematic/AI-Based

Proposals that deploy trained RAG systems, documentation generators, or other programmatic approaches that pull from source code and existing documentation. These solutions sit as a meta-layer that improves as the underlying documentation improves.
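For readers unfamiliar with the "meta-layer" framing, a toy sketch of the retrieval step behind such a system follows. Real proposals would use embeddings and a proper index; this only illustrates that existing documentation chunks are the training data, so answer quality tracks doc quality.

```python
# Toy keyword-overlap retrieval; doc snippets below are fabricated examples.

def top_chunk(query: str, chunks: list[str]) -> str:
    """Return the chunk sharing the most words with the query."""
    q = set(query.lower().split())
    return max(chunks, key=lambda c: len(q & set(c.lower().split())))

docs = [
    "buyAmount must include the slippage tolerance",
    "approvals are granted to the relayer contract",
]
print(top_chunk("how do I set slippage on buyAmount", docs))
```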

The council will evaluate Track A proposals based on demonstrated framework or methodology from previous deployments, how training data is sourced and maintained over time, and operational cost structure, including API access and hosting. Applicants should provide sample queries and outputs from similar projects to demonstrate accuracy. The council will also consider the review burden: what does the core team need to verify before each release?

Track B: Targeted Long-Form

Proposals for manual documentation of specific high-friction areas. These demand depth of subject-matter expertise and will require core team availability for interviews and technical reviews.

The council will evaluate Track B proposals based on portfolio quality, with a focus on similar API or integration documentation. Applicants should propose specific sections rather than a vague, full-overhaul pitch, estimate the core team hours needed for interviews and reviews, and describe their process for validating technical accuracy before delivery.

The council may fund multiple proposals across both tracks if responses warrant it.

Target Scope

The primary objective is to make the integration setup clear for developers with limited context on CoW Protocol internals. This is not about conceptual explanations (which already exist) but about parameter-level precision.

Priority areas based on BD feedback:

  • Order creation: which fields require slippage, and how amounts should be formatted
  • Approval setup: ABI-level guidance for relay approval patterns
  • Error responses: what each error means and how to resolve it
  • Endpoint selection: when to use fast vs optimal quoting
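As an illustration of the ABI-level approval guidance requested above, a minimal sketch of encoding an ERC-20 `approve` call by hand. The spender must be the protocol's relayer contract (the address should be taken from the official docs, not this sketch); the token and amounts here are placeholders.

```python
# First 4 bytes of keccak256("approve(address,uint256)") -- standard ERC-20.
APPROVE_SELECTOR = "095ea7b3"

def encode_approve(spender: str, amount: int) -> str:
    """Build raw calldata for ERC-20 approve(spender, amount)."""
    addr = spender.lower().removeprefix("0x").rjust(64, "0")  # left-pad to 32 bytes
    amt = f"{amount:064x}"                                    # uint256, big-endian hex
    return "0x" + APPROVE_SELECTOR + addr + amt

# Placeholder spender address; 2**256 - 1 grants unlimited allowance.
calldata = encode_approve("0x" + "ab" * 20, 2**256 - 1)
print(len(calldata))  # 138 hex characters: "0x" + selector + two 32-byte words
```

Whether unlimited or exact-amount approval is recommended is a decision for the documentation itself to state.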

Phase 1 deliverables depend on the approach track. Track A should deliver a deployed system with sample queries demonstrating accuracy in priority areas. Track B should deliver restructured reference pages for the Order Book API, including parameter tables, type constraints, and plain-language descriptions. Both tracks should deliver a quickstart that gets an integrator to a successful first order in under 10 minutes, with correct amount calculation covering fees, slippage, and quote handling.
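For the endpoint-selection item, a hedged sketch of assembling a quote request. Field names mirror the public Order Book API as the author understands it, but must be verified against the live OpenAPI spec before any of this lands in docs; token addresses and the owner are placeholders.

```python
# Field names are assumptions to be checked against the live spec.

def build_quote_request(sell_token: str, buy_token: str,
                        sell_amount: int, owner: str,
                        fast: bool = False) -> dict:
    """Assemble a sell-order quote request body."""
    return {
        "sellToken": sell_token,
        "buyToken": buy_token,
        "sellAmountBeforeFee": str(sell_amount),  # integer string, smallest unit
        "kind": "sell",
        "from": owner,
        # "fast" trades price optimality for latency; "optimal" otherwise.
        "priceQuality": "fast" if fast else "optimal",
    }

body = build_quote_request("0x" + "11" * 20, "0x" * 0 + "0x" + "22" * 20,
                           10**18, "0x" + "33" * 20)
print(body["priceQuality"])  # optimal
```

The quickstart deliverable should replace the placeholders above with real, copy-pasteable values and state plainly when "fast" is the right choice.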

Out of scope: solver documentation, CoW AMM docs, MEV Blocker docs, platform migration, and conceptual guides for end users.

Coordination Requirements

Before finalizing any award, the council needs to confirm the scope boundaries between internal roadmap work and grant-funded work, the availability of technical reviewers for milestone reviews, and the core team’s bandwidth for applicant interviews (Track B in particular).

Applicants must specify:

  • Estimated hours of core team time required
  • Number and purpose of sync calls needed
  • What existing documentation or source code access is required
  • How technical accuracy will be verified without heavy core team review

The Q1 roadmap includes partner usability improvements. Proposals should account for potential overlap and describe how they would coordinate rather than duplicate effort. The information architecture work should align with the existing product separation proposal (PR #551). The core team should address Swagger/OpenAPI sync as a prerequisite to ensure documentation accuracy remains maintainable.

Applicant Profile

Track A applicants should have prior experience with documentation generation or RAG-based systems and a demonstrated record of output quality. They need an understanding of operational costs and maintenance requirements, and familiarity with parsing OpenAPI/Swagger specifications programmatically.
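To illustrate the OpenAPI parsing skill mentioned above, a small sketch that walks a spec and extracts endpoint/parameter pairs a generator or RAG index could ingest. The mini-spec is fabricated for the example; the real input would be the protocol's published Swagger/OpenAPI file.

```python
# The mini-spec below is made up; only the OpenAPI structure is standard.

def extract_params(spec: dict) -> list[tuple[str, str, str]]:
    """Return (METHOD path, parameter name, type) triples from an OpenAPI dict."""
    rows = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            for p in op.get("parameters", []):
                ptype = p.get("schema", {}).get("type", "unknown")
                rows.append((f"{method.upper()} {path}", p["name"], ptype))
    return rows

mini_spec = {"paths": {"/api/v1/orders/{uid}": {"get": {"parameters": [
    {"name": "uid", "in": "path", "schema": {"type": "string"}}]}}}}
print(extract_params(mini_spec))  # [('GET /api/v1/orders/{uid}', 'uid', 'string')]
```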

Track B applicants should have a portfolio that includes API documentation for DeFi or trading systems and experience interviewing subject-matter experts to extract technical details. Availability for 3-5 sync calls with the core team during execution is required, along with familiarity with Docusaurus or similar static site generators.

Both tracks benefit from a Web3/DeFi background, understanding of EVM transactions and approval patterns, and prior work with intent-based trading, batch auctions, or aggregator protocols.

Submission Guidance

Proposals should include:

  • Which track (A, B, or hybrid) you are proposing under
  • Specific sections or capabilities you would deliver
  • Estimated budget range and timeline
  • Estimated core team time required
  • For Track A: example outputs from similar deployments
  • For Track B: portfolio links to comparable documentation work

The council may select multiple proposals if complementary approaches emerge.
