Roundtable Discussion: Analysis Framework and Opportunities in the ZK Track

ChainCatcher Selection
2023-03-25 19:55:36
Guests shared their industry observations on what the field is paying attention to, how to tell whether a zk project is credible, and where the real opportunities in zk lie.

Author: ChainCatcher

The zk track continues to be hot. Since last year, zk scaling projects have been making continuous efforts and accelerating their progress, with more zk-Rollup projects expected to go live on the mainnet this year.

Additionally, at the recently concluded ETH Denver conference, zk became the most frequently discussed buzzword among developers and investors. Last week, ChainCatcher, in collaboration with PKU Blockchain, ETH Beijing Hackathon, and Soshow, held a Twitter Space event themed "The Hot Layer2 Track: What Should We Pay Attention to About zk?" This was also the first session of our "zk Masterclass" series. Seven guests focused on the zk field contributed their industry observations on topics such as what everyone is paying attention to, how to identify whether zk projects are reliable, and where the real opportunities in zk lie.


Below is a summary of the event:

1. Host Claudia: At the recently concluded ETH Denver event, zk became a hot topic. What zk hotspots have the guests noticed?

Godot: At the ETH Denver conference, I noticed several interesting phenomena.

First, people are starting to pay attention to zk's entry barriers. zk development has a relatively high threshold, requiring knowledge of cryptography, number theory, and other areas. A consensus emerged at ETH Denver around enabling developers who do not understand zk to use it through DSLs and compilers, or by directly calling APIs or SDKs.

Another point is that the properties of zk validity proofs open up more possible scenarios for zk. For example, Modulus Labs is a zk-based on-chain AI protocol that uses zk to make machine learning trustless: it ensures that the inputs the model received are consistent with the user's inputs, and that the computation was executed according to the established logic.

There is also zk poker. When playing poker offline, there is usually someone in the middle to deal or shuffle the cards. In zk poker, because everything is on-chain and decentralized, every player must participate in shuffling. Each player encrypts the card faces, and the next player encrypts on top of the already-encrypted state. This relies on commutative/homomorphic encryption, which allows computation to be carried out on encrypted data and ensures the outcome is the same regardless of the order in which players apply their encryption. Players must then remove their encryption layers in sequence for someone to obtain their starting cards, which raises the question of how to guarantee that these computations on encrypted data were performed correctly; at that point, a zk proof can be added to verify the correctness of the results.
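To make the "order does not matter" point concrete, here is a minimal, self-contained Python sketch of SRA-style commutative encryption, the classic mental-poker building block. It only illustrates the commutativity property described above: the parameters are toy values, and a real protocol would additionally attach zk proofs that each encryption and shuffle step was performed honestly.

```python
import random

P = 2**127 - 1  # a Mersenne prime, used purely as a toy modulus

def keygen():
    # Pick an exponent e invertible modulo P-1; d is the matching decryption exponent.
    while True:
        e = random.randrange(3, P - 1)
        try:
            d = pow(e, -1, P - 1)  # requires Python 3.8+
            return e, d
        except ValueError:
            continue  # retry until e is coprime to P-1

def encrypt(card, e):
    return pow(card, e, P)

def decrypt(card, d):
    return pow(card, d, P)

# Two players each add an encryption layer to every card.
deck = [random.randrange(2, P) for _ in range(5)]
(e1, d1), (e2, d2) = keygen(), keygen()

# Because modular exponentiation commutes, the final ciphertexts are identical
# no matter which player encrypts first.
a_then_b = [encrypt(encrypt(c, e1), e2) for c in deck]
b_then_a = [encrypt(encrypt(c, e2), e1) for c in deck]
assert a_then_b == b_then_a

# To reveal a card, the players strip their layers, again in any order.
assert decrypt(decrypt(a_then_b[0], d2), d1) == deck[0]
```

In an actual game the ciphertexts would also be permuted between encryption rounds, and zk proofs would show that each permutation is a valid shuffle of the same deck.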

Jeffrey: We have noticed that besides the popular scaling topics of Layer2 and zkEVM, people are starting to explore zk applications in other fields, such as privacy, data sharing, and identity verification for social applications. As zk algorithms mature, it is becoming clear that zk has use cases beyond scaling. Additionally, zk cross-chain bridges are also a hot topic.

Todd: One phenomenon is that previously, discussions about scaling or smart contracts were mostly focused on zkEVM, but at ETH Denver, people began to pay attention to zkVM (virtual machine).

zkVM does not necessarily have to support Solidity or the whole EVM framework; it can also support more traditional programming languages like Rust and C++. A star example is RISC Zero, an open-source general-purpose zkVM.

There is also zkLLVM (a zk circuit compiler), which exists not as a VM but as a compiler. The Nil Foundation's project, for instance, lets developers use different languages to build zk-based functionality or programs.

These are all hotspots, beyond zkEVM, seen from the perspective of scaling or implementing smart contracts.

Additionally, I have observed some combinations of zk and machine learning. In the past the two were often considered incompatible, because the models involve far too much computation to be suitable for Ethereum and proof size also has to be taken into account. Recently, however, some preliminary applications have started to emerge. For example, I saw a small team compressing machine-learning models, particularly for image recognition, onto the chain. In the future it is very likely that zk will bring AI applications or machine learning models together with on-chain solutions.

Young: I would like to add that, to lower the entry barrier for developers coming into zk, both project teams and communities, such as the Halo2 community, are actively promoting their own tooling and ZKP-based infrastructure.

Moreover, the DID track derived from zk is also quite popular. The PSE group under the Ethereum Foundation is currently working on related projects, and there is a project called Axiom, a blockchain-based privacy protocol. In short, compared with the earlier focus on privacy coins and privacy alone, there are now many more application directions to explore with zk.

2. Host Claudia: Currently, there are already many zk projects in the market, but some zk projects are only trying to forcefully ride on this narrative or trend. What analytical frameworks and dimensions can we use to identify a reliable zk project?

LonersLiu: I generally look at zk projects from the following dimensions:

First: the direction of the project, because zk itself offers only a few properties: completeness, soundness, and zero-knowledge.

Second: the team's ability to solve problems. Whatever direction they pursue will run into various issues, so we look at whether the team has matching technical capabilities, and at the completeness of their business model, go-to-market strategy, BD, regulatory compliance, relationships up and down the industry chain, and supporting development tools, as well as how they solve the problems they face.

Third: the competitive situation of the project’s sector. If this sector already has strong leaders, what competitive approach will be taken to enter the market?

Fourth: the project's target valuation, and whether it aligns with the institution's investment stage.

These are roughly the four dimensions we use to assess a zk project. As for whether a project is reliable, it mainly reflects whether the team has a clear understanding of its position and whether it has matching technical and commercialization capabilities. Just because the technology is strong does not mean it can lead in the direction it is pursuing.

Also, does the project direction necessarily require zk? Or can it use other better solutions (such as MPC, TEE)? Is zk the optimal solution among these options?

Additionally, is there someone on the team who understands cryptography? With the popularization of zkEVM and better zero-knowledge proof tooling, developers no longer need to understand cryptography or circuits in depth; but because cryptography keeps evolving, having someone on the team who is well-versed in it can provide an edge over other teams: if a technological iteration benefits the project, they can spot it earlier. This also depends on whether the founder understands their own project deeply enough.

Todd: I think the core factor is actually the team, and whether what they are doing makes sense or is reasonable.

Many zk projects boast about the performance of their networks, and good reference figures are already available in various forums and materials. For projects whose concepts, visions, or claimed efficiency sound bizarre or too good to be true, you can compare their overall plan, chosen technical route, and claimed efficiency against what has actually been measured in practice, which helps filter out the most outlandish projects.

Godot: First, I would look at a zk project from a meso-level, sector perspective, examining its overall upstream and downstream relationships and the supply-demand situation in that sector.

For example, in the "ETH war," the underlying layer revolves around ETH PoS, with DVT and LSD focused on the β (beta) yield on ETH; further up are collateralized lending, structured derivatives, DEXs, and so on, which make up the α (alpha) portion. Second, what are the leading projects in each horizontal sector, and what are their barriers? The key point is whether the zk attribute can help a relative latecomer gain network effects or break through those barriers.

Compared to zk technology itself, I would pay more attention to liquidity, network effects, and user experience.

Then, I would look at whether zk can bring about a paradigm shift and change the overall competitive dynamics of the market. For example, after using Didi, users may never go back to hailing taxis the traditional way.

As for whether one can capture a share of the project token's upside, that depends on the token's economic model and release curve, and on whether the token can capture the value created as the project grows, such as protocol revenue; the token's specific supply and demand also need to be analyzed.

Jeffrey: I would like to add three points. First, is the business model of the zk direction reasonable? For example, I have seen some privacy-oriented projects, but during conversations, I found that the business model was not particularly reasonable, making it difficult for users to get started, and the cooperating service providers were also unfamiliar with its business model, which could indicate that they are trying to forcefully ride the trend.

Second, the team. It mainly depends on the direction the project is pursuing. If it is a very foundational project, it will involve a lot of modifications and innovations, which requires a strong background in cryptography and possibly collaboration with academic resources. However, if the project only applies relatively mature technologies in zk, it may not necessarily require a team with a strong cryptographic background.

Third, if one wants to make a judgment on zk, a relatively quick way is to look at the specific design details of the project. If it is a project that is completely riding the zk trend, after a few rounds of questioning, they may not be able to provide more specific details. Additionally, one can look at its design architecture; if it is copying from Ethereum or other relatively mature and popular ecosystems to other networks, then one should be cautious.

Finally, I would also pay attention to the project's metrics, such as performance, testing methodology, and zkEVM compatibility information.

Maxlion: I would like to add some insights on the analysis framework for zk-Rollup.

When assessing the progress of a zk-Rollup, first check whether its nodes are decentralized, including how open the sequencer and other nodes are; second, look at the toolchain, such as whether the compiler and IDE have been developed; third, examine the underlying protocol, including its consensus mechanism, design, and degree of completion.

These three aspects will directly affect the overall mainnet progress. In fact, many Layer2 projects that announced they had gone live on the mainnet last year were half-finished mainnets, not fully formed mainnets.

Additionally, one can look at the developer community and application ecosystem of a zk-Rollup. For general zk-Rollups, the developer community is certainly a fundamental aspect of its long-term development; a solid developer community can drive continuous application incubation, rather than just a user community.

Moreover, zk-Rollup applications can be analyzed based on application migration and application innovation. Application migration often focuses on migrating more DeFi from Ethereum or bringing NFT and GameFi applications on-chain. However, some zk-Rollups may lean more towards application innovation.

Hill: First, check whether zk plays a role in, or replaces, the economic game originally used to generate consensus.

The second point is how the generation, transmission, and verification of zk proofs are decentralized. For example, Arweave may incentivize miners to provide storage through economic games, while Filecoin may rely more on zk in this aspect.

Currently, many applications rely on economic-game token models. If tokens are instead used to incentivize zk proof generation and verification, whether that model is better than the previous economic game, together with how to decentralize zk proof generation and transmission, is a question many L2s are trying to solve.

3. Host Claudia: This also reveals the significant obstacles facing the current development of zk. Next, could each guest discuss what dilemmas zk currently faces?

LonersLiu: From Aztec's response, it is clear that Aztec Connect is not shutting down because of regulatory threats but mainly for commercial reasons. Aztec Connect's maintenance cost is very high, and the team's goal is to build the next generation of programmable smart contracts, leaning towards the privacy track. You can think of it as Aztec transitioning from application-specific privacy to general-purpose privacy, somewhat like zkSync's transition from 1.0 to 2.0. Aztec's current product therefore does not fit well with its future vision, so it makes sense to invest more energy into developing the new product.

Why is the maintenance cost of Aztec Connect high? Simply put, it is not profitable, and the root cause lies in the Rollup business model. A Rollup's profit currently comes mainly from reselling gas: Layer2 collects fees from users but has to pay Ethereum for data availability. Aztec Connect could not earn enough within Layer2 while continuously paying Layer1, and ultimately ran at a loss. In addition, Aztec Connect's transaction data is quite large and had not been optimized much, making the cost of submitting data to Layer1 particularly high.
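As a rough illustration of that margin structure, here is a back-of-the-envelope sketch in Python. All the numbers are made up for illustration and are not Aztec's actual figures; the only real constant echoed here is that Ethereum prices non-zero calldata at 16 gas per byte, so large, unoptimized transaction data quickly eats the fee revenue.

```python
# Hypothetical rollup batch economics: revenue from user fees on L2 versus
# the cost of posting data to L1 and verifying the batch. Illustrative only.

def batch_margin_eth(txs, fee_per_tx_eth, bytes_per_tx,
                     gas_per_byte, l1_gas_price_eth, verify_gas):
    revenue = txs * fee_per_tx_eth
    l1_gas = txs * bytes_per_tx * gas_per_byte + verify_gas
    return revenue - l1_gas * l1_gas_price_eth

COMMON = dict(txs=500, fee_per_tx_eth=0.0002,
              gas_per_byte=16, l1_gas_price_eth=30e-9, verify_gas=500_000)

# Compact transaction data: the batch is profitable.
print(batch_margin_eth(bytes_per_tx=100, **COMMON))    # ~ +0.061 ETH
# Large, unoptimized transaction data: the same batch loses money.
print(batch_margin_eth(bytes_per_tx=1_500, **COMMON))  # ~ -0.275 ETH
```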

Regarding transaction censorship resistance, I remember that during the Ethereum Foundation's seventh AMA someone asked whether a Rollup-centric roadmap is necessarily the best, and how censorship resistance would be achieved in the future. Justin mentioned designing a mechanism whereby, even if the nodes above refuse to process certain transactions, users can still force those transactions into the queue to be included and processed, which is one solution.

Todd: There are many factors to consider in early zk development, such as proof generation speed on the prover side, block size, and how to apply zk-friendly hash functions. For Layer2, EVM compatibility and on-chain costs also have to be considered.

In the face of different development environments, we initially encountered many pitfalls. Therefore, we later focused on how to lower the usage threshold of zk, allowing developers who do not understand zk to use it directly through APIs and SDKs.

Hill: The biggest obstacle zk may face is user mentality and consensus. The technology still carries a high cost, and users are not yet ready to pay for zk. It is like when users first used Ethereum and found a few dollars in gas fees expensive, so they had little motivation to explore the appeal of decentralization. If zk costs come down and users' awareness of private transactions grows, zk can be accepted by more users.

4. Host Claudia: I am very curious how the guests view the popular market opinion of the moment. Many people say to bet on op in the short term and zk in the long term. What is the essence of the debate between op and zk?

LonersLiu: Currently, op has better EVM compatibility, but it is constrained by the challenge period. zk performs better in terms of security and privacy, but building a zkEVM is harder, and generating ZKPs adds computational overhead.

In the early stages, op can leverage these advantages to grow its community and ecosystem, while using third-party liquidity bridges to temporarily work around the challenge period. Of course, op will also roll out proof and building solutions along the lines of OP Stack or Arbitrum Nova to strengthen its ecosystem. Their first-mover advantage is quite evident.

As a latecomer, zk can enhance the zk-EVM experience in areas such as hardware acceleration and zk algorithm optimization. At the same time, it can use powerful recursive algorithms to create new scenarios. I believe op and zk will coexist for a long time, and the specific market share will depend on who can better attract developers and provide better infrastructure and experience for users.

Meanwhile, the Ethereum Foundation has consistently promoted zkEVM in its AMAs, believing that the future Layer1 will itself take a SNARK form that can aggregate or compress things as needed. Vitalik has organized a team of about ten people within the Ethereum Foundation to study how to upgrade towards enshrined rollups. This kind of Rollup has many benefits: nodes no longer need to re-execute transactions to validate blocks. Today every Layer1 node must recompute a great deal, whereas with zk only a few nodes need to produce a ZKP for the others to verify. Nodes no longer need to re-execute, and therefore no longer need state witnesses, so clients can be much more lightweight.
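The "verify instead of re-execute" idea can be sketched as follows. This is a conceptual Python illustration under loose assumptions: the "proof" here is just a hash binding the old state root, the batch commitment, and the new state root, standing in for a real SNARK verifier, and all function names are hypothetical.

```python
# Conceptual sketch: a node accepts a block by checking a succinct proof
# instead of re-executing the whole transaction batch. The "proof" below is
# only a hash-based stand-in for a real SNARK; it has none of a SNARK's
# security or succinctness properties and exists purely to show the interface.
import hashlib
from dataclasses import dataclass

@dataclass
class Block:
    prev_state_root: bytes
    new_state_root: bytes
    tx_commitment: bytes  # commitment to the ordered transaction batch
    proof: bytes          # would be produced by a prover after executing the batch

def verify_proof(block: Block) -> bool:
    # Stand-in check; a real verifier runs in milliseconds regardless of batch size.
    expected = hashlib.sha256(
        block.prev_state_root + block.tx_commitment + block.new_state_root
    ).digest()
    return block.proof == expected

def accept_block(local_state_root: bytes, block: Block) -> bytes:
    # No re-execution: the node only checks that the proof ties the old root,
    # the batch, and the claimed new root together.
    if block.prev_state_root != local_state_root:
        raise ValueError("block does not build on our current state")
    if not verify_proof(block):
        raise ValueError("invalid validity proof")
    return block.new_state_root  # adopt the new state root directly
```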

Maxlion: I believe the main competition between op and zk lies in Ethereum compatibility in the short term and the trade-offs of public chain performance in the long term. Because op requires a challenge period, its scalability is constrained, so op has a clear ceiling, whereas over a longer horizon zk can potentially reach thousands of TPS. Both have their pros and cons, but I believe zk may be the ultimate winner.

Jeffrey: I completely agree that implementing zkEVM is very challenging. Solutions like op require a challenge period, while zk technology can provide higher credibility.

This year, several zkEVM mainnets may go live, including Polygon zkEVM. Although most of the compatibility work has been completed, performance and hardware integration still need time. From current discussions with some hardware manufacturers, commercially available zk proving hardware may not arrive until the end of this year or early next year. In the long run, zk is a direction worth looking forward to, because it provides verifiability.

The second issue is essentially one of verifiability. Solutions like OP currently require trust because many fraud proofs have not yet been fully implemented and enabled. If zk can deliver verifiability, we no longer need to trust but can verify whether the submitted data is correct. That is a significant improvement: it not only addresses gas costs but also makes the computation itself verifiable. Of course, the two technical approaches may coexist; cross-chain bridges, for example, also rely on trust or on incentives to keep centralized nodes from acting maliciously, but if zk can achieve cross-chain verification, it may be the more elegant solution.

Young: I think the previous guests have summarized the current advantages and disadvantages of zk and op very well. From the perspective of the entire Layer2 track, they both address Ethereum's congestion issues.

Although both follow the Rollup route, their implementations differ: zk-Rollup relies on validity (mathematical) proofs, while Optimistic Rollup relies on fraud proofs. From a security perspective, zk-Rollup may be better.

Additionally, when submitting data to Layer1, an Optimistic Rollup posts somewhat more data, whereas a zk-Rollup gets by with less. With current improvements in hardware efficiency, proof submission on Layer2 can happen roughly once an hour, which is acceptable as long as final withdrawal times stay normal; by contrast, under Ethereum's earlier PoW mechanism it would be unthinkable to go several hours without producing a block.

In the future, the controllability of Layer1 may worsen, so zk-Rollup may play a larger role in Layer2.

LonersLiu: I want to ask: if a user transfers money over the op network to Coinbase, but the transaction is later challenged and rolled back, how is that handled? I have yet to see a good explanation. What do the guests think?

Young: I think this should depend on the cross-chain bridge the user uses. If it is the official bridge, the official party should bear the loss. I believe all Layer2s currently cannot solve this problem; they do not have rollback resistance and lack suitable solutions.

LonersLiu: Right, so from this perspective, I think zk might be better.

5. Host Claudia: Recently, many zkEVM testnets have gone live, but how far are we from the first truly usable zkEVM mainnet?

Young: I would like to share the progress of Scroll's zkEVM. We got involved with the Ethereum Foundation's zk-Rollup work at an early stage. The first version of the testnet officially launched in August last year, a version upgrade followed in October, and a major upgrade landed at the end of February this year.

Currently, the entire testnet is completely permissionless, with nearly 900,000 wallet addresses and over 3.7 million transactions, figures that put us near the front among zk-Rollup peers. At the same time, every testnet upgrade goes through extensive security audits and stability tests, because experience, security, and stability are our top priorities.

I believe the zk-Rollup track is a very long one. Regardless of how everyone implements the zkEVM mainnet, it requires many iterations. Currently, our zkEVM testnet is very close to the mainnet version, and the main issue is solving engineering development problems. We will launch our mainnet at some point this year.

Maxlion: StarkNet may launch a more complete mainnet around this summer or towards the end of the year. Before that, three testing phases need to be completed: a usability phase, a performance-enhancement phase, and a phase promoting network decentralization.

It is currently in the second, performance-enhancement phase, which focuses on issues like the network congestion that previously kept transactions from going through. The sequencer and related components have been open-sourced, and the prover has also been announced for open-sourcing, expected to happen officially this summer.

The toolchain may be completed by Q2 or the end of Q1. StarkNet has made many modifications to the development language, and some tools may not be very complete and need adjustments. Regarding token governance, StarkNet has conducted a simulation vote and plans to conduct real governance in Q2 this year. Additionally, StarkNet may have an important version upgrade in the next two weeks.

6. Host Claudia: Solutions like Polygon, Optimism, and Arbitrum occupy a large share of the market. As zk solutions mature, where do you see the nearest-term, the longest-term, and the biggest opportunities?

Young: The nearest-term opportunity may be zkVM, as it lets developers use traditional programming languages to build their programs and ultimately bring them on-chain through the virtual machine.

The longer-term opportunity, in my opinion, lies in privacy, especially in financial scenarios, where ZKP is indeed an excellent way to solve privacy problems.

LonersLiu: I also agree that the launch of zkVMs will bring opportunities. Still, the nearest-term opportunity is the launch of zkEVMs, which gives cross-chain bridges a new environment to operate in and creates opportunities in circuit audits, the ecosystem projects built on top, and the software and hardware acceleration business around ZKP generation. Beyond that, there are opportunities in building MEV and Layer3 on top of Layer2, and in one-click Rollup deployment services.

The biggest opportunity is that after the launch of zkEVM, various Rollups will face many issues related to liquidity and composability, which will also spur significant innovation opportunities.

Maxlion: The recent opportunity may be zk cross-chain bridges, as they can solve the fragmentation between different networks; the opportunity that comes a bit later than zk cross-chain bridges is zk-Rollup, as zk-Rollup may still need a few quarters to complete testing and stabilization.

The opportunity after zk-Rollup is the application layer opportunities deployed on zk-Rollup. One is the opportunity in KYC, such as deploying a zk-based KYC model on-chain to help us verify user identity information in a decentralized and trustworthy manner.

The second is the combination of fully on-chain games and zk, such as using zk to expand gameplay and game logic. For example, players could buy, sell, or verify the authenticity of in-game intel through zk, and agreements made between players off-chain could be turned into on-chain smart contract commitments, which traditional games cannot achieve. The biggest opportunity may still be the combination of zk and machine learning.

Hill: The nearest-term opportunity for zk is to replace part of the consensus generated by economic games with mathematically verifiable products, such as public chains, cross-chain bridges, and databases. Beyond products, there are also opportunities reminiscent of Ethereum PoW, such as the emergence of mining-style machines that aggregate computing power.

Secondly, there are opportunities in privacy, as well as opportunities for information asymmetry games and the combination of full-chain games with zk mentioned by previous guests.

The biggest opportunity is to drive the security costs of Web3 infrastructure and applications very low, creating opportunities to compete with Web2 applications in scenarios Web2 currently covers.

The farthest-out opportunity is that arbitrary computation taken to the extreme, plus verifiability and programmable privacy, will reshape society's rules of trust, letting individuals and organizations disclose less information while reducing the frequency of lies. For example, people may use zk proofs instead of traditional language and other means to earn others' trust, ultimately approaching the ideal harmonious society everyone envisions.

Audience @zoezts asks: For a zkVM that supports Rust and C++, where should the zk proofs live? In the verification process, do you provide the raw information and then run a verifier on-chain to prove that the computation is correct? Does that mean the chain is not an EVM chain but one based on Rust or something else? Is this a constraint on the zkVM ecosystem or infrastructure?

Young: In fact, what they really upload is the computation result. As long as a verifier smart contract on-chain checks the proof and accepts the zkVM's computation result, you are not limited by what runs on the chain or how the chain is structured. In other words, the results of programs run through a zkVM can ultimately be used on any chain.
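A minimal sketch of the data flow described here, with hypothetical names and a hash-based mock in place of a real zkVM API (it is not, for example, RISC Zero's actual interface): the program runs off-chain, and only the program identifier, the output, and the proof go to the chain, where a small verifier checks them without re-running anything.

```python
# Sketch of the zkVM flow: arbitrary off-chain computation, off-chain proving,
# and a chain-agnostic on-chain verification step. prove()/onchain_verify() are
# hypothetical stand-ins, not a real proving system.
import hashlib, json

def program(inputs):  # arbitrary off-chain computation, e.g. written in Rust/C++
    return sum(x * x for x in inputs)

def prove(program_id: str, inputs, output):
    # A real prover would execute the program inside the zkVM and emit a SNARK/STARK;
    # here the "proof" is just a hash over the claim, for illustration only.
    blob = json.dumps([program_id, inputs, output]).encode()
    return hashlib.sha256(blob).digest()

def onchain_verify(program_id: str, inputs, output, proof) -> bool:
    # All a verifier contract needs to do, on any chain (EVM or not):
    # check the proof against the claimed output, never re-run the program.
    blob = json.dumps([program_id, inputs, output]).encode()
    return proof == hashlib.sha256(blob).digest()

inputs = [1, 2, 3, 4]
output = program(inputs)                              # computed off-chain
proof = prove("sum-of-squares-v1", inputs, output)    # proved off-chain
assert onchain_verify("sum-of-squares-v1", inputs, output, proof)
```

In practice a chain would usually receive only a commitment to the inputs rather than the raw inputs, but the key point stands: the verifier is small and independent of the language or VM the program was written for.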

@zoezts: zkVM and zkEVM are not targeting the same market?

Maxlion: They are different narratives, with different focuses both technically and in terms of application trends. zkEVM aims to strike a balance between zk technology and Ethereum compatibility, while zkVM seeks to get the most out of zk's native characteristics.

Relative to the original Ethereum design, a zkVM discards many zk-unfriendly parts and uses more zk-native designs, which may give it better performance than a zkEVM. The trade-off is that, while a zkVM may be friendlier to work with technically, it is not directly compatible with Ethereum and may need something like a layer3 to achieve that compatibility.
