SevenX Ventures: How much application space is there for coprocessors after Uniswap v4?

SevenX Ventures
2023-12-28 12:00:23
Exploring how Uniswap v4 inspires a new type of blockchain infrastructure: the coprocessor

Author: Hill

Recently, Uniswap v4 was released. Although its feature set is not yet complete, we hope the community will broadly explore the unprecedented possibilities it opens up. Since there will likely be plenty of articles discussing Uniswap v4's significant impact on DeFi, this article instead explores how Uniswap v4 inspires a new type of blockchain infrastructure: the coprocessor.

Introduction to Uniswap v4

As stated in its white paper, Uniswap v4 has four main improvements:

  • Hooks: Hooks are externally deployed contracts that execute developer-defined logic at specified points during pool execution. Through these hooks, integrators can create flexible and customizable concentrated liquidity pools.

  • Singleton: Uniswap v4 adopts a singleton design pattern, where all pools are managed by a single contract, reducing pool deployment costs by 99%.

  • Flash Accounting: Each operation updates an internal net balance, known as a delta, and external token transfers occur only at the end of the lock. Flash accounting simplifies complex pool operations such as atomic swaps and adds (see the sketch after this list).

  • Native ETH: Trading pairs can use native ETH directly, with no need to wrap it into WETH.
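
To make flash accounting concrete, here is a minimal sketch of the idea in TypeScript: operations inside a lock only adjust in-memory deltas, and a single settlement per token happens when the lock is released. Every name here is illustrative, not Uniswap v4's actual interface.

```typescript
// A minimal model of flash accounting: operations inside a lock update
// in-memory deltas only; one external transfer per token settles everything
// when the lock is released. Names are illustrative, not Uniswap v4's API.

type Token = string;

class FlashAccounting {
  // Net balance per token: positive = owed to the user, negative = owed to the pool.
  private deltas = new Map<Token, bigint>();

  // Record a balance change instead of performing a transfer immediately.
  accrue(token: Token, amount: bigint): void {
    this.deltas.set(token, (this.deltas.get(token) ?? 0n) + amount);
  }

  // Called when the lock is released: settle each non-zero delta with a
  // single external transfer.
  settle(transfer: (token: Token, amount: bigint) => void): void {
    for (const [token, delta] of this.deltas) {
      if (delta !== 0n) transfer(token, delta);
    }
    this.deltas.clear();
  }
}

// A two-hop swap touches USDC in the middle, but USDC nets to zero, so only
// ETH and DAI are actually transferred at settlement.
const acct = new FlashAccounting();
acct.accrue("ETH", -(10n ** 18n));         // user pays 1 ETH
acct.accrue("USDC", 1800n * 10n ** 6n);    // hop 1 output...
acct.accrue("USDC", -(1800n * 10n ** 6n)); // ...consumed by hop 2
acct.accrue("DAI", 1795n * 10n ** 18n);    // user receives DAI
acct.settle((token, amount) => console.log(`transfer ${amount} ${token}`));
```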

Most of the gas savings come from the latter three improvements, but the most exciting new feature is undoubtedly the one highlighted at the start of this article: hooks.

Hooks make liquidity pools more complex and powerful

The main enhancement of Uniswap v4 revolves around the programmability unlocked by hooks. This feature makes liquidity pools more complex and powerful, providing greater flexibility and customization than ever before. Compared to the concentrated liquidity of Uniswap v3 (a net upgrade from Uniswap v2), the hooks in Uniswap v4 offer a broader range of possibilities for how liquidity pools operate.
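
As a rough illustration of the control flow (not Uniswap v4's real Solidity interface), the sketch below models a pool that invokes developer-defined logic before and after its core swap logic; every name is invented for the example.

```typescript
// A toy pool that exposes hook points around its core swap logic.
// Illustrative only: real Uniswap v4 hooks are Solidity contracts with
// callbacks such as beforeSwap/afterSwap wired into the pool manager.

interface SwapParams {
  amountIn: bigint;
  zeroForOne: boolean; // direction of the swap
}

interface Hooks {
  beforeSwap?(p: SwapParams): void;             // e.g. enforce a dynamic fee or a limit
  afterSwap?(p: SwapParams, out: bigint): void; // e.g. update an embedded TWAP oracle
}

class Pool {
  constructor(private hooks: Hooks) {}

  swap(p: SwapParams): bigint {
    this.hooks.beforeSwap?.(p);           // developer-defined logic, pre-execution
    const amountOut = this.execute(p);    // the pool's core swap math
    this.hooks.afterSwap?.(p, amountOut); // developer-defined logic, post-execution
    return amountOut;
  }

  private execute(p: SwapParams): bigint {
    return (p.amountIn * 997n) / 1000n; // placeholder fee math, not real AMM pricing
  }
}

// Wire a hook that records every swap for an embedded oracle:
const pool = new Pool({
  afterSwap: (p, out) => console.log(`observed swap: ${p.amountIn} -> ${out}`),
});
pool.swap({ amountIn: 10_000n, zeroForOne: true });
```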

On paper this version is a net upgrade from Uniswap v3, but in practice it may not be. Compared to a Uniswap v2 pool, a Uniswap v3 pool is always an upgrade, because the "worst" thing you can do in Uniswap v3 is concentrate liquidity across the entire price range, which behaves just like Uniswap v2. In Uniswap v4, however, the programmability of liquidity pools may not translate into a good trading or liquidity-provision experience, and it opens the door to bugs and new attack vectors. Given the many changes to how liquidity pools operate, developers looking to leverage hooks must proceed with caution: they need to thoroughly understand how their design choices affect pool functionality and what risks they pose to liquidity providers.

The introduction of hooks in Uniswap v4 marks a significant shift in how code executes on the blockchain. Traditionally, blockchain code runs in a predetermined sequential order. Hooks allow a more flexible execution order, guaranteeing that certain code runs before or after other code. This pushes complex computation to the edges of the call stack rather than resolving everything within a single call stack.

Essentially, hooks support executing more complex computations outside of Uniswap's native contracts. This was achievable in Uniswap v2 and v3 by calculating manually outside of Uniswap and triggering via external actors such as other smart contracts, but Uniswap v4 integrates hooks directly into the liquidity pool's smart contract. This integration makes the process more transparent, verifiable, and trustless than the earlier manual approach.

Another benefit of hooks is extensibility. Uniswap no longer needs to rely on new smart contracts (which require liquidity migration) or forks to ship innovations: new features can now be implemented directly as hooks, refreshing how liquidity pools work.

The present of Uniswap v4 liquidity pools is the future of other dApps

I expect that more and more dApps will push computations outside their smart contracts like Uniswap v4.

The way Uniswap v4 operates today allows liquidity pool execution to be split at any step, arbitrary conditions to be inserted, and computations to be triggered outside the Uniswap v4 contract. The only similar precedent so far is the flash loan, where execution reverts if the loan is not repaid within the same transaction; even there, the computation still happens inside the flash loan contract.

The design of Uniswap v4 brings many advantages that were either not feasible or poorly executed in Uniswap v3. For example, embedded oracles can now be used, reducing reliance on external oracles that often introduce potential attack vectors. This embedded design enhances the security and reliability of price information, which is a key factor for the operation of DeFi protocols.

Additionally, automation that previously had to be triggered externally can now be directly embedded into liquidity pools. This integration not only alleviates security concerns but also addresses reliability issues associated with external triggers. Furthermore, it allows liquidity pools to operate more smoothly and efficiently, enhancing their overall performance and user experience.

Finally, with the introduction of hooks in Uniswap v4, a more diverse range of security features can be implemented directly within liquidity pools. In the past, security measures for liquidity pools mainly relied on audits, bug bounties, and purchasing insurance. With Uniswap v4, developers can now design and implement various fail-safe mechanisms and low liquidity alerts directly within the pool's smart contracts. This development not only enhances the security of the pools but also provides liquidity providers with greater transparency and control.

Compared to traditional phones, the advantage of smartphones lies in their programmability. Smart contracts have long lived in the shadow of being mere "persistent scripts." Now, with Uniswap v4, liquidity pool smart contracts have received a programmability upgrade and become "smarter." Given the opportunity to upgrade from a Nokia to an iPhone, I cannot see why any dApp would not want to move in that direction. I can understand that some smart contracts may prefer the status quo, since a Nokia is more reliable than an iPhone, but I am talking about the future direction of dApp development.

dApps wishing to use their own "hooks" face scalability issues

Imagine applying this to all other dApps: we could insert trigger conditions, and then arbitrary computations, between the steps of the original transaction sequence.

This sounds like how MEV operates, but MEV is not an open design space for dApp developers. For them it is more like hiking through an unknown dark forest: at best they seek external MEV protection and hope for the best.

Suppose the flexibility of Uniswap v4 inspires a new generation of dApps (or upgrades existing dApps) to adopt similar concepts, resulting in more programmable execution sequences. Since these dApps typically deploy on a single chain (L1 or L2), we expect most state changes to occur on that chain.

  • The additional computations inserted during a dApp's state changes may be too complex and cumbersome to run on that chain: we may quickly exceed gas limits, or the computation may be infeasible on-chain altogether. It also brings numerous challenges, particularly around security and composability.

  • Not all computations are equal. The reliance of dApps on external protocols like oracles and automation networks proves this point. However, this reliance may introduce security risks.

To summarize the issue: consolidating all computations into a single chain's state-changing smart contract execution is far from optimal.

Solution Hint: Already Solved in the Real World

To address the issues raised by this new generation of dApps (likely inspired in large part by Uniswap v4), we must get to the core of the problem: the single chain. A blockchain operates like a distributed computer that processes every task on a single CPU, and modern CPUs have made great progress on exactly this problem.

Just as computers moved from single-core CPUs to modular designs with multiple efficiency cores, performance cores, GPUs, and NPUs, dApp computation can scale in the same way: by specializing processors and combining their outputs. Outsourcing some computation beyond the main processor brings flexibility, optimality, security, scalability, and upgradability.

Practical Solutions

There are essentially two types of coprocessors:

  • External Coprocessors

  • Embedded Coprocessors

External Coprocessors

External coprocessors are akin to cloud GPUs; they are easy to use and powerful, but there is additional network latency between CPU and GPU communication. Moreover, you do not ultimately control the GPU, so you must trust that it is doing the work correctly.

Taking Uniswap v4 as an example: suppose ETH and USDC are added to a liquidity pool based on the 5-minute TWAP of the pool's price. If the TWAP calculation is performed in Axiom, then Uniswap v4 is effectively using Ethereum as the main processor and Axiom as the coprocessor.
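
For intuition, here is the kind of TWAP computation the coprocessor would perform over historical observations; the data shapes are assumptions for illustration, not Axiom's actual interface.

```typescript
// The kind of computation the coprocessor would run: a time-weighted average
// price over the last `window` seconds of observations. Data shapes are
// assumptions for illustration, not Axiom's interface.

interface Observation {
  timestamp: number; // unix seconds
  price: number;     // e.g. ETH priced in USDC
}

function twap(observations: Observation[], now: number, window: number): number {
  const cutoff = now - window;
  const inWindow = observations
    .filter((o) => o.timestamp >= cutoff)
    .sort((a, b) => a.timestamp - b.timestamp);

  let weighted = 0;
  let elapsed = 0;
  for (let i = 0; i < inWindow.length; i++) {
    // Each price holds until the next observation (or until `now` for the last one).
    const end = i + 1 < inWindow.length ? inWindow[i + 1].timestamp : now;
    const dt = end - inWindow[i].timestamp;
    weighted += inWindow[i].price * dt;
    elapsed += dt;
  }
  return elapsed > 0 ? weighted / elapsed : NaN;
}

const now = 1_700_000_000;
const observations = [
  { timestamp: now - 300, price: 1800 }, // 5 minutes ago
  { timestamp: now - 120, price: 1820 }, // 2 minutes ago
];
console.log(twap(observations, now, 300)); // 1808: weighted toward the older price
```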

Axiom

Axiom is Ethereum's ZK coprocessor, providing smart contracts with trustless access to all on-chain data and the ability to compute arbitrary expressions on that data.

Developers can query Axiom and use the trustless results verified by zero-knowledge (ZK) in their smart contracts. To complete a query, Axiom performs three steps:

  • Read: Axiom uses zero-knowledge proofs to trustlessly read block headers, states, transactions, and receipts from any historical Ethereum block. Since all Ethereum on-chain data is encoded in one of these formats, Axiom can access anything an archive node can.

  • Compute: After obtaining the data, Axiom applies verified computational primitives based on that data. This includes a range of operations from basic analytics (summing, counting, max, min) to cryptography (signature verification, key aggregation) and machine learning (decision trees, linear regression, neural network inference). The validity of each computation will be verified in the zero-knowledge proof.

  • Verify: Axiom attaches a zero-knowledge validity proof to the result of each query, proving (1) that the input data was correctly obtained from on-chain and (2) that the computation was correctly applied. This zero-knowledge proof is verified on-chain in the Axiom smart contract, and the final result is made trustless for all downstream smart contracts.
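
The following sketch mocks this three-step flow end to end. It is not Axiom's SDK: every function is a hypothetical stand-in, and the `proof` strings stand in for real zero-knowledge proofs.

```typescript
// Mocked end-to-end flow of the three steps. Not Axiom's SDK: every name is
// hypothetical and the `proof` strings stand in for real ZK proofs.

interface ProvenResult<T> {
  value: T;
  proof: string; // placeholder for a zero-knowledge validity proof
}

// 1. Read: fetch historical on-chain data plus a proof that it matches the
//    canonical block headers (mocked here with fixed values).
function readHistoricalPrices(fromBlock: number, toBlock: number): ProvenResult<number[]> {
  const prices = [1800, 1810, 1795]; // stand-in for archive-node data
  return { value: prices, proof: `read(${fromBlock},${toBlock})` };
}

// 2. Compute: apply a verified primitive (here: max) inside the proof system.
function computeMax(input: ProvenResult<number[]>): ProvenResult<number> {
  return { value: Math.max(...input.value), proof: `max(${input.proof})` };
}

// 3. Verify: an on-chain verifier contract checks the proof; only then do
//    downstream contracts consume the result.
function verifyOnChain<T>(result: ProvenResult<T>): boolean {
  return result.proof.length > 0; // placeholder for real proof verification
}

const maxPrice = computeMax(readHistoricalPrices(18_000_000, 18_000_100));
if (verifyOnChain(maxPrice)) {
  console.log(`trustless result: ${maxPrice.value}`);
}
```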

Warp Contracts (via RedStone)

Warp Contracts is the most widely used SmartWeave implementation, designed to be a reliable, fast, production-ready smart contract platform/engine on Arweave. In essence, SmartWeave is an ordered array of Arweave transactions, and it benefits from the absence of a block-inclusion fee market on Arweave: these properties allow unlimited transaction data at no cost beyond storage.

SmartWeave employs a unique approach called "lazy evaluation," shifting the responsibility of executing smart contract code from network nodes to the users of the smart contracts. Essentially, this means that the computation for transaction validation is deferred until needed, reducing the workload on network nodes and allowing for more efficient transaction processing. Through this method, users can execute as many computations as needed without incurring additional costs, providing functionalities that other smart contract systems cannot achieve. Clearly, attempting to evaluate contracts with thousands of interactions on the user's CPU is ultimately futile. To overcome this challenge, an abstraction layer, such as Warp's DRE, was developed. This abstraction layer consists of a distributed network of validators that handle contract computations, significantly shortening response times and improving user experience.
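
A minimal sketch of lazy evaluation, assuming a toy counter contract with invented names: the chain stores only an ordered interaction log, and state is derived by replaying it on demand.

```typescript
// Lazy evaluation in miniature: the chain stores only an ordered log of
// interactions; state is derived by replaying the log when someone reads it
// (or, as with Warp's DRE, by a delegated network). Toy contract, invented names.

interface Interaction {
  caller: string;
  input: { op: "inc" | "dec" };
}

interface State {
  counter: number;
}

// The contract is a pure transition function over the interaction log.
function transition(state: State, ix: Interaction): State {
  return { counter: state.counter + (ix.input.op === "inc" ? 1 : -1) };
}

// Nothing is computed at write time; reading the state replays every
// interaction in order.
function evaluateState(log: Interaction[]): State {
  return log.reduce(transition, { counter: 0 });
}

const log: Interaction[] = [
  { caller: "alice", input: { op: "inc" } },
  { caller: "bob", input: { op: "inc" } },
  { caller: "carol", input: { op: "dec" } },
];
console.log(evaluateState(log)); // { counter: 1 }
```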

Moreover, SmartWeave's open design allows developers to write logic in any programming language, providing a fresh alternative to the often rigid Solidity codebase. By delegating certain high-cost or high-throughput operations to Warp, seamless SmartWeave integration can enhance existing social graph protocols built on EVM chains, leveraging the strengths of both technologies.

Hyper Oracle

Hyper Oracle is a ZK oracle network designed specifically for blockchains. Currently, the ZK oracle network operates solely on the Ethereum blockchain. It retrieves data from each block of the blockchain using zkPoS and serves as a data source while processing data with programmable zkGraph running on zkWASM, all in a trustless and secure manner.

Developers can define custom off-chain computations using JavaScript, deploy these computations to the Hyper Oracle network, and utilize Hyper Oracle Meta Apps to index and automate their smart contracts.

The indexing and automation Meta Apps of Hyper Oracle are fully customizable and highly flexible. Any computation can be defined, and all computations (even machine learning computations) will be protected by the generated zero-knowledge proofs.

  • The Ethereum blockchain serves as the initial on-chain data source for the ZK oracle, though any network could be supported in the future.

  • Hyper Oracle ZK oracle nodes consist of two main components: zkPoS and zkWASM.

  • zkPoS retrieves the block headers and data roots of the Ethereum blockchain by proving Ethereum's consensus in zero knowledge. The proof generation process can be outsourced to a decentralized network of provers. zkPoS acts as the outer loop for zkWASM.

  • zkPoS provides the block headers and data roots to zkWASM. zkWASM uses this data as the fundamental input for running zkGraph.

  • zkWASM runs custom data mappings or any other computations defined by zkGraph and generates zero-knowledge proofs for these operations. The operators of the ZK oracle nodes can choose how many zkGraphs they wish to run (from one to all deployed zkGraphs). The zero-knowledge proof generation process can be outsourced to a decentralized network of provers.

  • The output of the ZK oracle is off-chain data, which developers can consume through Hyper Oracle Meta Apps (described above). The data comes with zero-knowledge proofs attesting to its validity and to the computation that produced it.
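
To illustrate what such a handler might look like, here is a zkGraph-style mapping sketched in TypeScript. Hyper Oracle's actual zkGraph API may differ; every shape below is an assumption.

```typescript
// A zkGraph-style handler: zkPoS supplies verified block data, zkWASM runs a
// developer-defined mapping over it and proves the run. The shapes below are
// assumptions for illustration, not Hyper Oracle's actual zkGraph API.

interface EventLog {
  address: string;
  topics: string[];
  data: string;
}

interface BlockInput {
  number: number;
  logs: EventLog[];
}

const TRACKED_POOL = "0xPool..."; // hypothetical contract being indexed

// The mapping zkWASM would execute (and prove) for each block: count swap
// events emitted by the tracked pool.
function handleBlock(block: BlockInput): { block: number; swapCount: number } {
  const swaps = block.logs.filter(
    (log) => log.address === TRACKED_POOL && log.topics[0] === "Swap",
  );
  return { block: block.number, swapCount: swaps.length };
}

console.log(
  handleBlock({
    number: 18_000_000,
    logs: [{ address: TRACKED_POOL, topics: ["Swap"], data: "0x" }],
  }),
); // { block: 18000000, swapCount: 1 }
```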

Other Noteworthy Projects

If you decide to adopt this approach, there are several projects that can serve as external coprocessors. However, these projects overlap with other verticals of blockchain infrastructure and have not been individually categorized as coprocessors.

  • RiscZero: If a dApp uses RiscZero to compute machine learning tasks for on-chain agents and provides the results to game contracts on StarkNet, StarkNet would serve as the main processor while RiscZero acts as the coprocessor.

  • IronMill: If a dApp runs zk loops in IronMill but deploys smart contracts on Ethereum, Ethereum would serve as the main processor while IronMill acts as the coprocessor.

Potential Use Cases for External Coprocessors

  • Governance and Voting: Historical on-chain data can help decentralized autonomous organizations (DAOs) record the number of voting rights each member possesses, which is essential for voting. Without this data, members may be unable to participate in the voting process, potentially hindering governance.

  • Underwriting: Historical on-chain data can help asset managers assess their performance beyond raw profits. They can review the levels of risk taken and the kinds of drawdowns experienced, which supports more informed decisions about compensation and potential rewards.

  • Decentralized Exchanges: On-chain historical price data can help decentralized exchanges trade based on past trends and patterns, potentially yielding higher profits for users. Additionally, historical trading data can assist exchanges in improving algorithms and user experiences.

  • Insurance Products: Insurance companies can use historical on-chain data to assess risks and set premiums for different types of policies. For example, when setting premiums for DeFi projects, insurance companies may review past on-chain data.

Please note that all the use cases above are asynchronous: the client dApp calls the external coprocessor's contract when triggered in block N, and the computation result can only be accepted or verified in a later block, N+1 at the earliest. So a dApp can use the coprocessed result no sooner than one block after the trigger. This pattern is much like a cloud GPU: it can run your machine learning models well, but the latency means you would not enjoy playing fast-paced games on it.
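
A small sketch of this asynchronous request/callback pattern, with invented names: the request goes out in block N, and the result is only consumable from the block in which the verified result lands (N+1 at the earliest).

```typescript
// The asynchronous pattern in miniature: a request goes out in block N, the
// coprocessor posts a verified result in a later block, and the dApp can only
// consume it from that block onward. All names are invented.

interface Result {
  value: bigint;
  availableAt: number; // block in which the verified result landed
}

const results = new Map<number, Result>();

// Block N: the dApp emits a request; nothing can be read back in this block.
function requestComputation(id: number, currentBlock: number): void {
  console.log(`request ${id} emitted in block ${currentBlock}`);
}

// Block N+1 (or later): the coprocessor posts the verified result on-chain.
function fulfill(id: number, value: bigint, currentBlock: number): void {
  results.set(id, { value, availableAt: currentBlock });
}

// The dApp can consume the result only in the block it landed, or later.
function consume(id: number, currentBlock: number): bigint {
  const r = results.get(id);
  if (!r || currentBlock < r.availableAt) throw new Error("result not ready");
  return r.value;
}

requestComputation(1, 100);   // trigger in block N = 100
fulfill(1, 42n, 101);         // coprocessor responds in block 101
console.log(consume(1, 101)); // earliest point the dApp can use the result
```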

Embedded Coprocessors

Embedded coprocessors are similar to GPUs on a personal computer motherboard, located next to the CPU. The communication latency between the GPU and CPU is very low. Moreover, the GPU is entirely under your control, so you can be very sure it has not been tampered with. However, getting it to run machine learning as quickly as a cloud GPU comes at a high cost.

Taking Uniswap v4 as an example again: suppose ETH and USDC are added to a liquidity pool deployed on Artela based on a 5-minute TWAP. If that pool runs in Artela's EVM and the TWAP calculation is done in Artela's WASM, the pool is effectively using Artela's EVM as the main processor and Artela's WASM as the coprocessor.

Artela

Artela is an L1 built using Tendermint BFT. It provides a framework that supports dynamic scaling of arbitrary execution layers for on-chain custom functionalities. Each Artela full node runs two virtual machines simultaneously.

  • EVM, the main processor that stores and updates the state of smart contracts.

  • WASM, the coprocessor that stores and updates the state of Aspects.

Aspects represent arbitrary computations that developers wish to run without touching the state of smart contracts. They can be viewed as a Rust script that provides custom functionalities beyond the native composability of smart contracts for dApps.

If this is hard to understand, try looking at it from the following two perspectives:

  • From the perspective of blockchain architecture:

    -- Aspects are a new execution layer.

    -- In Artela, the blockchain runs two execution layers simultaneously: one for smart contracts and one for other computations.

    -- This new execution layer introduces no new trust assumptions and therefore does not affect the security of the blockchain itself; both virtual machines are protected by the same set of nodes running the same consensus.

  • From the perspective of the application runtime:

    -- Aspects are programmable modules that work alongside smart contracts, supporting the addition of custom functionality and independent execution.

    -- They have several advantages over a standalone smart contract:

    -- Non-intrusive: They can intervene before and after contract execution without modifying the smart contract code.

    -- Synchronous execution: They support hook logic throughout the transaction lifecycle, allowing for fine-tuned customization.

    -- Direct access to global state and underlying layer configurations, supporting system-level functionalities.

    -- Elastic block space: They provide protocol-backed independent block space for dApps with high transaction throughput requirements.

    -- Compared to static precompiles, they support dynamic and modular upgrades at runtime for dApps, balancing stability and flexibility.

By introducing this embedded coprocessor, Artela achieves an exciting breakthrough: arbitrary, scalable modules (Aspects) can now execute within the same transaction as a smart contract. Developers can bind their smart contracts to Aspects so that every transaction calling the contract is also processed by the bound Aspects.

Moreover, like smart contracts, Aspects store data on-chain, allowing smart contracts and Aspects to read each other's global state.

These two features greatly enhance the composability and interoperability between smart contracts and Aspects.

  • Aspect Features:

    Compared to smart contracts, the functionality Aspects provide focuses primarily on the stages before and after transaction execution. Aspects do not replace smart contracts but complement them, offering applications the following capabilities that smart contracts lack:

  • Automatically inserting reliable transactions into upcoming blocks (e.g., for scheduled tasks).

  • Reversing state data changes caused by transactions (only authorized contract transactions can be reversed).

  • Reading static environment variables.

  • Passing temporary execution states to downstream Aspects.

  • Reading temporary execution states passed from upstream Aspects.

  • Dynamic and modular upgradability.

  • Differences between Aspects and Smart Contracts:

    The main differences are:

  • Smart contracts are accounts with code, while Aspects are native extensions of the blockchain.

  • Aspects can run at different points in the transaction and block lifecycle, while smart contracts execute only at fixed points.

  • Smart contracts can access their own state and limited context of the block, while Aspects can interact with global processing contexts and system-level APIs.

  • The execution environment of Aspects is designed for near-native speed.

    Aspects are merely code logic snippets, independent of accounts, and therefore cannot:

  • Write, modify, or delete contract state data.

  • Create new contracts.

  • Transfer, destroy, or hold native tokens.

These Aspects make Artela a unique platform that can extend the functionalities of smart contracts and provide a more comprehensive and customizable development environment.
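
As a conceptual sketch of the binding model described above: real Aspects compile to WASM (e.g., from Rust), but the TypeScript below illustrates the control flow of pre- and post-transaction join points, with every name invented for the example.

```typescript
// The binding model in miniature: once a contract is bound to an Aspect, the
// Aspect's join points run before and after every transaction that calls the
// contract, without modifying the contract itself. Real Aspects compile to
// WASM (e.g. from Rust); this TypeScript only illustrates the control flow.

interface Tx {
  to: string;
  calldata: string;
  block: number;
}

interface Aspect {
  preTxExecute?(tx: Tx): void;  // may veto the transaction by throwing
  postTxExecute?(tx: Tx): void; // may inspect results or schedule follow-ups
}

class Runtime {
  private bindings = new Map<string, Aspect[]>(); // contract address -> bound Aspects

  bind(contract: string, aspect: Aspect): void {
    const list = this.bindings.get(contract) ?? [];
    list.push(aspect);
    this.bindings.set(contract, list);
  }

  executeTx(tx: Tx, contractLogic: (tx: Tx) => void): void {
    const aspects = this.bindings.get(tx.to) ?? [];
    for (const a of aspects) a.preTxExecute?.(tx);  // WASM coprocessor, pre-tx
    contractLogic(tx);                              // EVM main processor
    for (const a of aspects) a.postTxExecute?.(tx); // WASM coprocessor, post-tx
  }
}

// Bind a simple gatekeeping Aspect to a pool contract:
const runtime = new Runtime();
runtime.bind("0xPool", {
  preTxExecute: (tx) => {
    if (!tx.calldata.startsWith("0x")) throw new Error("rejected by aspect");
  },
});
runtime.executeTx({ to: "0xPool", calldata: "0xabc", block: 1 }, () =>
  console.log("pool logic executed"),
);
```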

*Please note that, strictly speaking, the Aspects described above are "built-in" Aspects: embedded coprocessors run by Artela full nodes. dApps can also deploy their own heterogeneous Aspects, which run on external coprocessors, either on an external network or by a subset of nodes under a separate consensus. This is more flexible, since dApp developers can execute virtually any operation they wish, as long as it is safe and reasonable. This is still being explored, and specific details have not yet been disclosed.

Potential Use Cases for Embedded Coprocessors

  • Complex computations involved in new DeFi projects (e.g., complex game theory mechanisms) may require embedded coprocessors to have more flexible and iterative real-time computing capabilities.

  • More flexible access control mechanisms for various dApps. Currently, access control is often limited to blacklists or whitelists based on smart contract permissions. Embedded coprocessors can unlock immediate and fine-grained levels of access control.

  • Certain complex functionalities in fully on-chain games (FOCGs). FOCGs have long been constrained by the EVM. Things may become simpler if the EVM keeps the basic functionality, such as transferring NFTs and tokens, while the rest of the logic and state updates are computed by coprocessors.

  • Security mechanisms. dApps can introduce their own proactive security monitoring and fail-safe mechanisms. For example, a liquidity pool could block withdrawals exceeding 5% of its liquidity within any 10-minute window. If the coprocessor detects such a withdrawal, the smart contract can halt it and trigger alert mechanisms, such as injecting emergency liquidity within a certain dynamic price range (a sketch follows this list).
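
Here is a minimal sketch of the fail-safe from the last bullet, with illustrative thresholds and names: cap withdrawals at 5% of pool liquidity per rolling 10-minute window, and treat a breach as the signal to halt and alert.

```typescript
// The fail-safe from the last bullet: cap withdrawals at 5% of current pool
// liquidity per rolling 10-minute window; a breach is the signal to halt and
// alert. Thresholds and names are illustrative.

const WINDOW_SECONDS = 600; // 10 minutes
const MAX_FRACTION = 0.05;  // 5% of current liquidity

interface Withdrawal {
  amount: number;
  timestamp: number; // unix seconds
}

class WithdrawalGuard {
  private recent: Withdrawal[] = [];

  // Returns false when the withdrawal would breach the rolling cap; on-chain,
  // this is where the contract would halt and trigger alert mechanisms.
  allow(amount: number, poolLiquidity: number, now: number): boolean {
    this.recent = this.recent.filter((w) => now - w.timestamp < WINDOW_SECONDS);
    const windowTotal = this.recent.reduce((sum, w) => sum + w.amount, 0);
    if (windowTotal + amount > poolLiquidity * MAX_FRACTION) {
      return false; // breach: halt and alert instead of processing
    }
    this.recent.push({ amount, timestamp: now });
    return true;
  }
}

const guard = new WithdrawalGuard();
console.log(guard.allow(40, 1000, 0));  // true: 4% of liquidity
console.log(guard.allow(20, 1000, 60)); // false: 6% total within the window
```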

Conclusion

It is inevitable that dApps become large, bloated, and overly complex, so the proliferation of coprocessors is also inevitable. It is just a matter of time and the adoption curve.

Running an external coprocessor lets a dApp stay in its comfort zone: whichever chain it already lives on. But for new dApp developers looking for an execution environment to deploy on, embedded coprocessors are like the GPU in a personal computer: a machine that claims to be a high-performance PC must have a decent GPU.

Unfortunately, the above projects have not yet launched on the mainnet. We cannot truly benchmark them or demonstrate which project is better suited for which use case. However, one thing is certain: technology is on a spiral upward trajectory. It may seem like we are going in circles, but remember, from the side, history will witness that technology is indeed evolving.

Long live the scalability trilemma, long live coprocessors.
