Grant Application: Open-Source Tool for Verifying MEV Blocker Transactions

Grant Title:
Development of an Open-Source Tool for Verifying MEV Blocker Transactions

Author:

About You:
I am a Python developer with six years of experience, currently working on an anonymous P2P messenger using Web3 technology. My expertise lies in blockchain interaction and Django-based web applications.

Grant Category:
Developer tools (SDK)

Grant Description:
This project aims to build an open-source tool that verifies MEV Blocker transactions, maximizes user refunds, and flags rule violations. The tool will operate as a standalone service that depends on an archive RPC node and Dune, running its verification logic on each newly seen block with some delay. Alerts will be pushed through channels such as Telegram (via python-telegram-bot, chosen mainly for its async support), Slack, Dune, and stderr.
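
As a rough illustration of the intended architecture (not a final design), the main loop could look like the sketch below, assuming web3.py; the node URL, delay, and poll interval are placeholders:

    import time
    from web3 import Web3

    w3 = Web3(Web3.HTTPProvider("https://archive-node.example"))  # placeholder URL
    BLOCK_DELAY = 5      # only process blocks once they are this many blocks old
    POLL_INTERVAL = 12   # seconds between head checks, roughly one block time

    def run(start_block: int) -> None:
        next_block = start_block            # can be initialised with a past block
        while True:
            head = w3.eth.block_number
            while next_block <= head - BLOCK_DELAY:
                block = w3.eth.get_block(next_block, full_transactions=True)
                # ... hand the block to the verification logic and push alerts ...
                next_block += 1
            time.sleep(POLL_INTERVAL)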

Grant Goals and Impact:
The primary goal is to develop a reliable and efficient tool to verify MEV Blocker transactions. This will maximize user refunds and provide real-time monitoring and alerts, enhancing the transparency and security of the CoW Protocol ecosystem. The successful execution of this project will benefit users by ensuring fair transaction processes and increasing trust in the protocol.

Milestones:

Milestone                         Payment (xDAI)
Data Gathering                    1000
Bundle Simulation                 2000
Alerting and Productionisation    1000
Total                             4000

Milestone Descriptions:

Milestone 1: Data Gathering

  • Tasks:
    • Fetch block contents from the RPC.
    • Identify potential MEV Blocker bundles from Dune.
  • Outcomes:
    • A robust data gathering service that accurately fetches and identifies relevant block data.
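
As a rough sketch of the Dune side of this milestone (assuming the dune-client package; the query id, API key handling, and function name are placeholders):

    from dune_client.client import DuneClient
    from dune_client.query import QueryBase

    def fetch_mev_blocker_bundles(api_key: str, query_id: int) -> list[dict]:
        """Run the (placeholder) MEV Blocker bundles query and return its rows."""
        client = DuneClient(api_key)
        results = client.run_query(QueryBase(query_id=query_id))
        return results.get_rows()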

Milestone 2: Bundle Simulation

  • Tasks:
    • Apply a greedy algorithm to compute merged bundle candidates.
    • Simulate refunds from bundle candidates using trace_callMany with stateDiff.
    • Compare the optimal outcome for users with the actual payout received.
  • Outcomes:
    • A simulation service that accurately calculates and compares potential refunds.
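
An illustrative sketch of the candidate enumeration for this milestone, assuming the target transaction and its submitted backruns were collected in Milestone 1 (names and the permutation cap are placeholders); each candidate would then be simulated via trace_callMany with stateDiff and the best simulated refund compared against the payout actually received:

    from itertools import permutations

    def bundle_candidates(target_tx: dict, backruns: list[dict], max_backruns: int = 3):
        """Yield merged bundle candidates: the target transaction followed by
        a permutation of up to `max_backruns` of the submitted backruns."""
        for length in range(1, min(max_backruns, len(backruns)) + 1):
            for ordering in permutations(backruns, length):
                yield [target_tx, *ordering]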

Milestone 3: Alerting and Productionisation

  • Tasks:
    • Create a development container and Docker image.
    • Develop deployment instructions.
    • Integrate alert channels (logs, Slack, Telegram).
  • Outcomes:
    • A production-ready tool with integrated alerting capabilities.
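
A minimal sketch of the alert dispatch, assuming python-telegram-bot v20+ (async API) and the standard logging module; the token and chat id are placeholders, and the Slack channel is omitted for brevity:

    import asyncio
    import logging
    from telegram import Bot

    logger = logging.getLogger("mev-blocker-verifier")  # warnings go to stderr by default

    async def send_alert(text: str, token: str, chat_id: str) -> None:
        logger.warning(text)                      # log / stderr channel
        async with Bot(token=token) as bot:       # documented async usage pattern
            await bot.send_message(chat_id=chat_id, text=text)

    # e.g. asyncio.run(send_alert("Refund below optimum", TOKEN, CHAT_ID))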

Funding Request:
The total funding requested is 4000 xDAI, justified by the scope and complexity of the project, as well as the anticipated impact on the CoW Protocol ecosystem.
I would prefer to receive the funding in xDAI.

Budget Breakdown:

Category         Amount (xDAI)
Development      2500
Testing          700
Deployment       300
Documentation    300
Miscellaneous    200
Total            4000

Gnosis Chain Address (to receive the grant):
0xC6C985637eabC798c232dDAd3afFde525242dFd7

Other Information:
This project builds on my previous work in blockchain and Django, ensuring a high level of reliability and performance. I have attached relevant documents and previous work samples for reference.

Referral:
Olga Fetisova

Terms and Conditions:
By submitting this grant application, I acknowledge and agree to be bound by the CoW DAO Participation Agreement and the CoW Grant Terms and Conditions.

4 Likes

Thanks april for the detailed specification!

I’m not sure a Django project will be needed here (we are not looking for a webserver to query information). I believe a standalone service with dependencies on an archive RPC node and Dune, which runs its logic on each newly seen block - with some delay - and can be initialised with a starting block in the past, would be sufficient. Communication with the outside world should happen by pushing messages/alerts into different communication channels (e.g. Telegram, Slack, Dune, or simply stderr).

As for the simulations, you will probably have to use trace_callMany with stateDiff in order to simulate refunds from bundle candidates (by creating a list of all transactions that were in the block until the target transaction, followed by the target transaction and the different backrun permutations). I don’t believe this is currently exposed in web3.py, but I might be wrong and you will surely find a way around that.

In terms of deliverables, I think it would make sense to split the project into three phases:

  1. Data gathering (1 & 2.1 in your proposal): This includes fetching the block contents from the RPC as well as all potential MEV Blocker bundles from Dune.
  2. Bundle Simulation (2.2 & 3 in your proposal): This includes applying the greedy algorithm to compute merged bundle candidates, as well as computing the optimal outcome for the user and comparing it to the payout they actually received.
  3. Alerting and productionisation (4, 5 & 6 in your proposal): This includes the creation of a devcontainer, Docker image, and deployment instructions, as well as integration with different alert channels (logs, Slack, Telegram).

In terms of payouts (DAI), I think 1k, 2k and 1k (thus higher than in your proposal) sounds fair to me.

1 Like

Dear Felix,

Thank you for your detailed feedback and valuable suggestions. I appreciate the direction provided. Here’s how I plan to adjust the project:

Adjustments and Workarounds

  1. Standalone Service:

    • I will develop a standalone service instead of a Django project, relying on an archive RPC node and Dune.
    • The service will run logic on each newly seen block with some delay and push messages/alerts to channels like Telegram, Slack, Dune, or stderr.
  2. Simulation of Refunds:

    • For simulating refunds, I will use the trace_callMany method with stateDiff. Although trace_callMany is not directly exposed in web3.py, I can send the raw RPC call through the provider’s make_request method, for example:
      def trace_call_many(web3, calls, block_identifier="latest"):
          # `calls` is a list of [transaction, trace_types] pairs, e.g.
          # [[{"from": ..., "to": ..., "data": ...}, ["stateDiff"]], ...].
          # make_request builds the JSON-RPC envelope itself, so only the
          # params are passed here.
          return web3.provider.make_request(
              "trace_callMany", [calls, block_identifier]
          )
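      # Illustrative usage with assumed names: `prefix_txs` are the block's
      # transactions before the target, `target_tx` is the MEV Blocker
      # transaction being verified, and `backrun_perm` is one candidate
      # backrun ordering. Each call is traced with "stateDiff", on top of
      # the parent block so that the in-block state is reproduced.
      calls = [
          [tx, ["stateDiff"]]
          for tx in [*prefix_txs, target_tx, *backrun_perm]
      ]
      diffs = trace_call_many(web3, calls, block_identifier=hex(block_number - 1))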
      

Revised Phases and Milestones

  1. Data Gathering:

    • Fetch block contents from the RPC and MEV Blocker bundles from Dune.
    • Payment: 1000 DAI
  2. Bundle Simulation:

    • Apply a greedy algorithm for bundle candidates and use trace_callMany with stateDiff for simulation.
    • Compare the optimal outcome for the user with the payout they actually received.
    • Payment: 2000 DAI
  3. Alerting and Productionisation:

    • Create a development container, Docker image, and deployment instructions.
    • Integrate alert channels (logs, Slack, Telegram).
    • Payment: 1000 DAI

I believe these adjustments will meet the project’s objectives efficiently. Please let me know if there are any further modifications needed.

Best regards,
Artem

1 Like

Thanks @april for adjusting according to the provided feedback.
I upgraded your user to a higher trust level; I hope this allows you to edit the original post with the adjusted plan (instead of just replying to the OP).
A few additional feedback points:

  • Something that was unclear to me is whether you expect payment in COW, xDAI or a mix?
  • Can you specify some minimal level of documentation of the code?
  • Can you provide some time estimation of each milestone?

Thanks!

1 Like

Hi @middleway.eth,
Thanks for upgrading my trust level and for the additional feedback. Here are my responses:

Payment Preferences

I can accept payment in xDAI, COW, or a mix of both. I prefer xDAI because it is a stablecoin, but I’m also open to receiving COW tokens; supporting the COW token benefits the protocol and the ecosystem in ways I’m eager to contribute to.

Minimal Documentation

I’ll make sure the code includes:

  • Detailed comments and docstrings for all functions and classes.
  • A README.md file explaining the project setup, usage, and dependencies. (For an example of my README style, see my project SpyLock; since this project is a tool, its README will be considerably larger and more thoroughly explained than SpyLock’s.)
  • Usage examples and instructions for running and configuring the standalone service.

Time Estimation for Each Milestone

  1. Data Gathering:

    • Duration: 3 days
  2. Bundle Simulation:

    • Duration: 5 days
  3. Alerting and Productionisation:

    • Duration: 3 days

Clarification on Post Editing

Could you please confirm which specific post I need to edit (I assume it’s the original post by you)? Also, please guide me on the format and how I should mark the edits.
UPD: removed information that was already mentioned above.

Thanks again for your support and guidance.

1 Like

What I meant is, you should update your original grant proposal post above with all the changes that you’ve proposed in various comments.

One last ask - please choose your preferred way of payment (xDAI, COW, or a mix) and specify it in the original post as well.
Then I’d encourage you to submit the grant to Snapshot.

1 Like

Hello!
Successfully published on Snapshot and updated on the forum.

2 Likes

Project Status Update

Milestone 1 seems to be ready.
GitHub

For data gathering, I used this SQL query from Dune (cowprotocol).

Additionally, I added multiprocessing (for future development) and a C extension (for simulation optimization in Milestone 2).

I want to ensure that everything meets the requirements. If anything is missing, I’m open to feedback :)

3 Likes

I found one small issue with your query execution and reported it here:

Thank you very much for the feedback!

Project Status Update: Milestone 1 reached.
@middleway.eth I’ve updated everything following @fleupold’s feedback.
Submitting Milestone 1 for a final review. Felix also proposed adding logic to fill gaps, which I’ve added, keeping in mind that Chen wanted the per-block delay to remain configurable.

I’ll continue with Milestone 2.

Thanks for submitting milestone 1 :raised_hands:
We’ll queue it for review and update here. @fleupold

Hello!
I’m pushing an extremely important hotfix to Milestone 1 right now.
I ran into two different issues when switching to the “run_query” method (I did not notice them earlier, as I was not getting enough log information from Dune).
Changelog:

1. To ensure we don’t run out of API quota (which is crucial), I explicitly moved “run_query” outside of the multiprocessing polling, as multiprocessing caused unexpected behaviour with the “run_query” function.
2. For the same reason, we now check whether the query has executed and finished only every 9 minutes (I’ve noticed that the query mentioned above takes ~17 minutes to execute).

These two crucial changes ensure that API usage stays within the limit while still giving us the needed output; a rough sketch of the new polling follows.
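
A sketch of that slow-polling flow (dune-client’s execute_query / get_execution_status / get_execution_results interface is assumed here; exact method names and response fields may differ):

    import time
    from dune_client.client import DuneClient
    from dune_client.models import ExecutionState
    from dune_client.query import QueryBase

    POLL_SECONDS = 9 * 60  # the query takes ~17 minutes, so only a few status checks per run

    def run_query_with_slow_polling(client: DuneClient, query_id: int):
        """Start the query once, then check its status only every 9 minutes."""
        execution = client.execute_query(QueryBase(query_id=query_id))
        while True:
            time.sleep(POLL_SECONDS)
            status = client.get_execution_status(execution.execution_id)
            # (a production version would also handle FAILED / CANCELLED states)
            if status.state == ExecutionState.COMPLETED:
                return client.get_execution_results(execution.execution_id)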

1 Like

Hi @april, I’ve reviewed your submission and opened a couple of issues that I consider required for milestone 1.

Currently, the data being fetched is neither optimal nor sufficient for the analysis that will be done in Milestone 2.

Hi @fleupold. I’ve seen your issues on GitHub and fixed almost all of them (except one thing).
The main scope of work provided by @middleway.eth stated that I only need to gather information from inside the block itself.
In your GitHub issue you asked me to “Fetch all submitted backruns (even the ones that didn’t make it on chain)”.

This clearly increases the amount of work and time required (as I would have to collect backruns from the mempool that never made it on chain and are not collected by Dune), so I would have to either:

  1. Build my own service that periodically checks the node directly, or
  2. Use Flashbots.

What should I do in this case? Should I proceed without this, or should we discuss the terms and next steps given the new scope?

2 Likes

Commented on the original issue; let’s keep the technical discussion on GitHub.

3 Likes

Updated Project status!

1 Like

Hello!
Here to say that Milestone 1 is fully completed and revised.
I had several issues with Dune itself, but now everything is functional and working correctly.

1 Like

Thanks for the update.
The milestone was reviewed and payment will be processed in the coming days.

(I’d add that we usually require submitting evidence of milestone completion. In this case, the evidence can be found in the following repo.)

2 Likes

Hello!
Here to report on some issues and progress.
There’s already a branch on GitHub with Milestone 2, but there’s an issue related to Dune.
I’ve found that Dune somehow detects this project’s query patterns and instantly bans any free account that runs the query. I’m sure that’s an error; I wrote to the support team but haven’t received a response yet (UPD: got an answer, and I’ll wait until they resolve the issue).
Milestone 2 looks theoretically sound at this point, but I cannot test it accurately right now (because of the issue mentioned above).

Also, I will push the Milestone 3 branch later today.