IOSG Ventures: What can ZK co-processors do from 0 to 1?

IOSG Ventures
2023-10-31 22:08:57
The ZK co-processor is an exciting innovation in the field of blockchain.

Author: IOSG Ventures


Thanks to Mo Dong from Celer Network and Brevis for the in-depth discussion on the core concepts and use cases of ZK co-processors, which inspired the creation of this series of articles.

ZK co-processors are an exciting innovation in the blockchain space. They were first introduced by projects such as Brevis, Axiom, Lagrange, and Herodotus, and are expected to revolutionize the way we develop applications on the blockchain. With ZK co-processors, developers can create data-driven dApps that leverage the full history of on-chain data, across multiple chains, to perform complex computations without relying on any additional trust assumptions. More importantly, this enables a new development paradigm: asynchronous application architecture, which brings unprecedented efficiency and scalability to the Web 3.0 software stack.

In this series of articles, we will unveil the mysteries of ZK co-processors. Whether you are interested in their concepts, practical applications, underlying mechanisms, challenges faced, or market strategies, or if you want to compare different projects, we hope these articles will provide you with new insights.

The Case of the Missing VIP Trader Programs on DEXs

To understand the basic idea behind ZK co-processors, let's start with a motivating example from the real world.

One clear difference between centralized exchanges (CEXs) and decentralized exchanges (DEXs) is the existence of volume-based fee structures, commonly packaged as "VIP trader loyalty programs." These programs are powerful tools for retaining traders, increasing liquidity, and ultimately boosting exchange revenue.


Interestingly, while every CEX has at least one such program, DEXs have none at all. Why is that?

This is because implementing this functionality on a DEX is more challenging and costly than on a CEX.

In a CEX, implementing a loyalty program requires:

  • Recording every user's trading history in a centralized database, which is trivial to do and also keeps future queries cheap.

  • Running a direct query against that high-performance centralized database once a month to determine each user's trading volume and fee tier from the historical data (a minimal sketch follows this list).
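On the CEX side this is a routine batch job. The sketch below illustrates it with a hypothetical `trades` table and made-up fee tiers; the schema, tier thresholds, and numbers are all assumptions for illustration, not any exchange's actual program.

```python
import sqlite3

# Hypothetical trades table: (user_id, quote_volume_usd, ts).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (user_id TEXT, quote_volume_usd REAL, ts INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("alice", 120_000, 1), ("alice", 2_000_000, 2), ("bob", 50_000, 3)],
)

# Hypothetical tiers: 30-day volume threshold -> taker fee in basis points.
TIERS = [(5_000_000, 2), (1_000_000, 4), (0, 10)]

def fee_tier(volume: float) -> int:
    """Return the taker fee (bps) for a given 30-day volume."""
    return next(fee for threshold, fee in TIERS if volume >= threshold)

# The "once a month" job: one cheap aggregation over the full history.
for user, volume in conn.execute(
    "SELECT user_id, SUM(quote_volume_usd) FROM trades GROUP BY user_id"
):
    print(user, volume, f"{fee_tier(volume)} bps")
```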

However, DEXs face significant challenges when trying to follow the same steps:

  • Due to the high cost of on-chain storage, it is not feasible to store each user's full trading history directly in smart contract storage. Implementing such logic would mean users pay roughly 4 times the transaction fees for every trade.

  • Even if we did store the trading records, running statistical queries and calculations over this data on-chain is even more expensive. For example, computing the trading volume of a single user with 10K trades would cost 156M gas (yes, we actually calculated it; see the rough back-of-envelope below).
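To see why on-chain computation over this data is so expensive, here is a rough, illustrative back-of-envelope. It is not the article's exact 156M-gas calculation; the per-trade slot count and loop overhead are assumptions, but even this simplified model lands in the tens of millions of gas.

```python
# Rough illustration: reading historical records inside the EVM pays per storage slot.
COLD_SLOAD_GAS = 2_100   # post-EIP-2929 cost of a cold storage read
SLOTS_PER_TRADE = 3      # assumption: amount, price/pair, timestamp
LOOP_OVERHEAD_GAS = 200  # assumption: arithmetic + memory per iteration
N_TRADES = 10_000

total = N_TRADES * (SLOTS_PER_TRADE * COLD_SLOAD_GAS + LOOP_OVERHEAD_GAS)
print(f"{total:,} gas")  # ~65,000,000 gas before any call/ABI overhead
```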

You might say: "Wait, what are you talking about? Every user's transaction is automatically stored on the blockchain (because it's the blockchain!). Smart contracts are native to the blockchain, so they should have access to all of this data at any time, right?"

Unfortunately, that's not the case!

The data stored on the blockchain and the data accessible to smart contracts within the blockchain virtual machine are two entirely different things.

Full/archive nodes store a vast amount of data covering the blockchain's entire history. Through these nodes, you can easily access:

  • The state of the entire blockchain at any given point in history (e.g., who the first owner of a Cryptopunk was).

  • Transactions and events generated by transactions at any given point in history (e.g., Charlie exchanged $1,000 for 0.5 ETH).

In fact, popular off-chain data indexing or analytics tools (like Nansen and Dune Analytics) can leverage this extensive dataset for in-depth analysis.
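As a concrete illustration of what an archive node exposes off-chain, here is a minimal sketch using web3.py. The RPC URL, address, and block numbers are placeholders; any archive-grade endpoint would work the same way.

```python
from web3 import Web3

# Placeholder archive-grade RPC endpoint.
w3 = Web3(Web3.HTTPProvider("https://example-archive-node.invalid"))

some_address = "0x0000000000000000000000000000000000000000"  # placeholder

# 1) Historical state: the account's ETH balance at an arbitrary past block.
balance_then = w3.eth.get_balance(some_address, block_identifier=15_000_000)

# 2) Historical events: every log this address emitted in a past block range.
logs = w3.eth.get_logs({
    "address": some_address,
    "fromBlock": 15_000_000,
    "toBlock": 15_001_000,
})
print(balance_then, len(logs))
```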


However, for smart contracts embedded in the blockchain virtual machine, the limitations on data access are much greater. They cannot use data generated by off-chain indexing solutions because that would introduce additional trust issues with these external and often centralized indexing solutions.

In fact, smart contracts can only easily and trustlessly access the following data:

  • Data stored in the virtual machine state (excluding transaction or event data).

  • Data from the latest block (access to historical data is limited).

  • Data from other smart contracts that are made public through the "view" function (excluding private or internal contract data).

A key nuance in the above statement is the word "easily."

Smart contracts are not completely blind to historical data. In the EVM, a smart contract can access the hashes of the most recent 256 blocks (via the BLOCKHASH opcode). Each block hash commits to all activity on the blockchain up to that block, condensed into a single 32-byte value through Merkle trees and Keccak hashing.


Compressed data can be decompressed… it's just not easy?

Imagine you want to trustlessly access a specific piece of historical data starting from those recent block hashes. The approach involves fetching the data and its Merkle proofs off-chain from an archive node, then submitting them on-chain so the EVM can verify, by walking the proofs back to a known block hash, that the data is genuinely part of the chain's history. Such operations are both cumbersome and expensive: merely retrieving a few past token balances this way can consume tens of millions of gas.
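Here is a minimal sketch of the off-chain half of this "hard way", using web3.py's `eth_getProof`. The endpoint, contract address, storage slot, and block number are placeholders. The returned Merkle-Patricia proof links an account and its storage to a historical state root; verifying it (plus the chain of block headers) inside the EVM is what makes this approach so gas-hungry.

```python
from web3 import Web3

# Placeholder archive RPC endpoint.
w3 = Web3(Web3.HTTPProvider("https://example-archive-node.invalid"))

proof = w3.eth.get_proof(
    "0x0000000000000000000000000000000000000000",  # placeholder contract
    [0],                                            # storage slot to prove
    15_000_000,                                     # historical block number
)

# The proof ties the account and its storage slot to that block's state root;
# an on-chain verifier would have to replay these Merkle-Patricia nodes in the EVM.
print(len(proof["accountProof"]), len(proof["storageProof"][0]["proof"]))
```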

The root of this challenge lies in the fact that the blockchain virtual machine itself lacks the capability to handle large volumes of data and intensive computations (like the decompression tasks mentioned above).


ZK Co-processor Architecture (Source: Brevis presentation slides at ETHSG)

If there were a magic solution that could offload such data-intensive, cumbersome computation from the blockchain, and return the results on-chain quickly and cheaply without any additional trust assumptions, that would be ideal.

Friends, this is precisely the purpose of ZK co-processors.

The name "co-processor" is inspired by the history of computer architecture. GPUs, for example, were introduced as co-processors to CPUs because CPUs needed to offload important computational tasks that are expensive and inefficient to run on a CPU (such as graphics rendering or AI training) to an "auxiliary processor," namely the GPU.

But what does "ZK" mean in ZK co-processors? Before delving into the complex technical details, let's first understand the broad significance and potential of this innovative technology.

We Need Data-Driven dApps in Web 3.0

Transaction fee rebates are a great example. With ZK co-processors, various loyalty programs can be seamlessly integrated into numerous DeFi protocols.

However, this goes far beyond DeFi loyalty programs. You may already see that similar issues exist in other areas of Web 3.0. Think about it: all modern Web 2.0 applications are data-driven, and Web 3.0 applications should be no different. To create "killer applications" with user experiences comparable to traditional internet applications, this data-driven approach is essential.

Let's look at another example in the DeFi space: improving liquidity efficiency by redesigning liquidity mining reward mechanisms.

Currently, the liquidity incentive mechanism on AMM DEXs adopts a "pay-as-you-go" model. In this model, when LPs contribute liquidity, farming rewards are immediately allocated to them. However, this model is far from optimal. Professional farmers can quickly withdraw liquidity when they sense market volatility to avoid impermanent loss. As a result, the value they provide to the protocol is minimal, yet they still receive substantial rewards.

An ideal AMM liquidity incentive mechanism would retrospectively assess the steadfastness of LPs, especially during significant market fluctuations. Those who consistently support the liquidity pool in such situations should receive the highest rewards. However, obtaining the historical behavior data of LPs, which is crucial for this model, is still unfeasible today.

To achieve this, you need ZK co-processors.
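As a sketch of what such a retrospective mechanism could look like, the snippet below weights each LP's rewards by the liquidity they kept in the pool during volatile periods. The snapshots, volatility weights, and reward budget are entirely hypothetical; in practice the historical pool data would be computed and proven by a ZK co-processor rather than assumed.

```python
# Hypothetical historical snapshots: (volatility_weight, {lp_address: liquidity}).
snapshots = [
    (1.0, {"0xLP_A": 100, "0xLP_B": 100}),   # calm market
    (5.0, {"0xLP_A": 100, "0xLP_B": 0}),     # crash: LP_B pulled liquidity
    (1.0, {"0xLP_A": 100, "0xLP_B": 100}),
]

REWARD_BUDGET = 1_000  # hypothetical reward tokens to distribute

# Score each LP by volatility-weighted liquidity across snapshots.
scores = {}
for weight, liquidity in snapshots:
    for lp, amount in liquidity.items():
        scores[lp] = scores.get(lp, 0) + weight * amount

total = sum(scores.values())
rewards = {lp: REWARD_BUDGET * s / total for lp, s in scores.items()}
print(rewards)  # LP_A, who stayed through the crash, earns the larger share
```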

In the DeFi space, we can cite many similar examples, whether it's using predetermined algorithms and rules for proactive LP position management, establishing credit lines using non-token liquidity positions, or determining dynamic liquidation preferences for loans based on past repayment behavior.

However, the potential of ZK co-processors is not limited to DeFi.

Building on-chain games with excellent user experiences using ZK co-processors


Example of real-time operations in Web 2.0 games

When you enter a newly installed Web 2.0 game, every move you make is meticulously recorded. This data is not idle; it significantly impacts your gaming journey. It determines when to offer you in-game purchase options, when to launch reward games, when to send you carefully worded push notifications, and which opponents to match you with, among other things. These are all components of what the gaming industry refers to as LiveOps, which are the cornerstones of enhancing player engagement and revenue streams.

To make the user experience of fully on-chain games comparable to classic Web 2.0 games, these LiveOps features are necessary. These features should be based on players' historical interactions and transactions with the game smart contracts.

Unfortunately, in blockchain games, such features are either completely absent or still driven by centralized solutions. The reason is similar to the DEX example: it is challenging to mine and compute historical game data on the blockchain.

Yes, similarly, you need ZK co-processors to achieve this.

Web 3.0 social and identity applications are another area that cannot operate without the support of ZK co-processors.


In the blockchain world, your digital identity is a web woven from your past behaviors.

  • Want to prove you're an NFT OG? You need to prove you were one of the original minters of Cryptopunk.

  • Boasting about being a big trader? Prove to me that you've paid over $1 million in trading fees on a DEX.

  • Close ties with Vitalik? Prove to me that his address has sent funds to your address.

Off-chain systems, whether human or Web 2.0 applications, can easily generate such proofs because, like the trading volume example, they can access archive nodes containing all this data.

A proof of identity based on this kind of direct data access requires publicly linking wallet addresses to your identity, and thus sacrifices privacy, but it is at least feasible.

However, just like in the trading volume example, if you want a smart contract to trustlessly accept your OG status and unlock new experiences based on it, without introducing additional trust assumptions, there is really no good way to do so today.

With ZK co-processors, you can weave a reliable proof of identity, a proof of your past behaviors, a proof that any smart contract would accept without question. Your interactions across different applications and even different blockchains can be cleverly combined to form this proof.

What's more appealing is the inherent privacy that ZK brings. Your wallet address does not have to be publicly associated with your identity. For example, you can prove that you own a Cryptopunk NFT without revealing which wallet address holds it. Or you can prove that you have executed more than 10,000 trades on Uniswap without disclosing the exact count.

ZK co-processors open up a whole new realm for building data-driven dApps, but their significance goes far beyond that.

Beyond the Data-Driven Paradigm: Pioneering Web 3.0 Asynchronous Models with ZK Co-processors


While the data-driven dApp model is appealing, it is just the tip of the iceberg.

The emergence of ZK co-processors will fundamentally change our perception of blockchain computing, ushering in an era where asynchronous processing becomes the standard for Web 3.0. This shift redefines how tasks are processed, allowing dedicated processors to operate independently, thereby enhancing efficiency.

Let’s first understand what asynchronous processing is.

Imagine a synchronous restaurant where one person plays both chef and waiter. When you order a dish, he starts preparing it, and you wait. Only after serving you can he attend to the next customer. This setup works, but its efficiency is very hard to improve.

In contrast, in an asynchronous restaurant, different chefs and waiters work together. After the waiter takes your order, he quickly hands it off to the chef while serving other customers. Once the dish is ready, the chef signals the waiter, who promptly serves it to you.

In computer systems:

Synchronous architecture is like the first restaurant, where one person waits for each task to complete before moving on. This architecture is straightforward but may be slow because it processes one task at a time. Moreover, this person may be a good waiter but not a good chef.

Asynchronous architecture is like the second restaurant: decoupled, specialized system components coordinate by sending messages and tasks to each other. This allows each component to work through its own queue of tasks concurrently. While it requires more sophisticated coordination, this architecture is faster and more efficient.
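A toy illustration of the difference, sketched with Python's asyncio. "Cooking" stands in for the slow, heavy computation (the co-processor's job); the timings only show how the asynchronous version overlaps the waits.

```python
import asyncio
import time

def cook_sync(order: str) -> str:
    time.sleep(1)                      # the single worker blocks on each dish
    return f"{order} ready"

async def cook_async(order: str) -> str:
    await asyncio.sleep(1)             # the chef works while the waiter moves on
    return f"{order} ready"

def synchronous_restaurant(orders):
    return [cook_sync(o) for o in orders]                           # ~1s per order

async def asynchronous_restaurant(orders):
    return await asyncio.gather(*(cook_async(o) for o in orders))   # ~1s total

start = time.time()
synchronous_restaurant(["pasta", "soup", "steak"])
print(f"sync:  {time.time() - start:.1f}s")   # ~3.0s

start = time.time()
asyncio.run(asynchronous_restaurant(["pasta", "soup", "steak"]))
print(f"async: {time.time() - start:.1f}s")   # ~1.0s
```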

Every modern internet application is built on an asynchronous architecture to improve efficiency and scalability, and we believe Web 3.0 should be the same.

ZK co-processors will be the pioneers of this transformation. For dApp developers, the blockchain acts like the waiter in our asynchronous restaurant. It primarily handles computations that directly change the blockchain state, such as changes in asset ownership. All other computations should be handled by robust ZK co-processors, which, like skilled chefs, efficiently cook up results and send them to the waiter through the powerful capabilities of asynchronous processing.

Specifically, if the computations in a blockchain application meet either of the following two "feasibility conditions," ZK co-processors should be considered.

Conditions under which offloading to a ZK co-processor makes sense:

  • On-chain computation cost > off-chain ZK co-processor computation cost (including proof generation) + on-chain verification cost

  • On-chain computation latency > off-chain ZK co-processor computation latency (including proof generation) + on-chain verification latency

Even if they only meet one of these conditions, it is worth considering!
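To make the two conditions concrete, here is a tiny sketch that evaluates them. Every figure is illustrative: the 156M gas echoes the earlier DEX example, and the proof-verification cost is a rough ballpark, not a measurement.

```python
def worth_offloading(onchain_cost, offchain_cost, verify_cost,
                     onchain_latency, offchain_latency, verify_latency):
    cheaper = onchain_cost > offchain_cost + verify_cost
    faster = onchain_latency > offchain_latency + verify_latency
    return cheaper or faster          # meeting either condition is enough

print(worth_offloading(
    onchain_cost=156_000_000,         # computing 10K trades' volume in the EVM
    offchain_cost=100_000,            # assumed prover cost in gas-equivalent terms
    verify_cost=500_000,              # rough ballpark for verifying a ZK proof on-chain
    onchain_latency=12,               # seconds
    offchain_latency=60,              # proof generation takes longer...
    verify_latency=12,
))  # True: offloading wins on cost even though proving adds latency
```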

Now you can see that it’s not just about data-driven dApps! It’s a new way to bring high-level general computing, such as ML, into the blockchain, but more importantly, it introduces an asynchronous architecture to build dApps, which was previously impossible.

Next Chapter….

If we have successfully convinced you that ZK co-processors are an idea that will have a profound impact, then perhaps it’s time to talk about how they work. In the next blog, we will explore the key architecture of ZK co-processors and discuss the biggest technical challenges that still exist in this field.
