a16z Crypto CTO: Protocol design is more important than token economic design

a16z
2023-05-27 14:43:56

Original Title: Protocol design: Why and how

Author: Eddy Lazzarin

Compiled by: Sissi (TEDAO)

Introduction:

a16z has established an influential position in the crypto industry through its in-depth writing, which has helped shape how the space thinks about its own development. Recently, a16z has been focusing on topics that go beyond token economics: first a talk on "token design," then the publication of "Tokenology: Beyond Token Economics," and now the launch of the highly anticipated "Protocol Design" course.

As the main speaker of this course, Eddy Lazzarin, CTO of a16z Crypto, repeatedly emphasizes that the key to going beyond token economics lies in protocol design; token design is merely a supporting tool. In this course on protocol design, he shared over an hour of insights to help entrepreneurs understand the critical role protocol design plays in a project's success. This article is a condensed translation; for the complete content, see the full translated version.

1. The Inherent Laws of Protocol Evolution

1.1. Internet Protocols: The Bonds of Interaction

The internet is a network of protocols, encompassing various types of protocols. Some protocols are straightforward, like the state diagram of HTTP, while others are quite complex, such as the interaction diagram of the Maker protocol. The following diagram showcases various protocols, including internet protocols, physical protocols, and political protocols. On the left side of the diagram, we see an interaction diagram of a street intersection, which feels familiar and interesting.

The commonality among these protocols is that they are all formalized interaction systems capable of facilitating complex group behaviors, which is a core component of protocols. The power of internet protocols lies in their ability to connect interactions between people as well as with software. We know that software is highly adaptable and efficient, capable of integrating various mechanisms. Therefore, internet protocols can be considered one of our most important, if not the most important, types of protocols.


1.2. The Evolution of Protocols: Web1 → Web2 → Web3

In the following chart, the horizontal axis represents the degree of decentralization and centralization of protocols, specifically the extent of control over the protocol. The vertical axis represents the economic model of the protocol, indicating whether the economic model is explicit or implicit. This distinction may seem subtle, but it has significant implications.


  • Web1: Decentralized & No Explicit Economic Model

Protocols from the Web1 era (such as NNTP, IRC, SMTP, and RSS) maintained neutrality in terms of value flow, ownership, access, and payment mechanisms, lacking an explicit economic model. Among them, Usenet is a protocol similar to today's Reddit, used for exchanging posts and files. IRC was an early and widely used chat protocol, while SMTP and RSS served email and content subscriptions, respectively.


Usenet organizes content into categories, allowing users to post to topic-specific newsgroups. It was an important part of early internet culture, existing outside of HTTP. Using Usenet requires a dedicated client and an internet service provider (ISP) that supports it. Usenet is distributed across a large, constantly changing set of news servers that anyone can run; posts are automatically forwarded between servers, forming a decentralized system.

While users rarely paid directly for Usenet access, some began paying for commercial Usenet servers in the late 2000s. Overall, Usenet lacked an explicit protocol-level economic model; any payment arrangements happened outside the protocol.

These Web1 protocols are architecturally similar, stemming from the same values. Even with limited knowledge of the protocols, we can still understand how they work, demonstrating the importance of the readability and clarity of Web1 protocols. However, these protocols gradually faced failure or change over time.

The reasons for failure can be summarized in two aspects: first, the lack of specific features that could compete with Web2 counterparts; second, difficulties in obtaining funding. Ultimately, whether a protocol can adopt a decentralized approach and develop a sustainable economic model to integrate specific features determines its success or failure. In summary, Web1 protocols can be categorized as decentralized and lacking an explicit economic model.


  • Web2: Centralized & Explicit Economic Model

Web2 brought an interesting trend: Reddit replaced forums like Usenet, while centralized messaging systems like WhatsApp and iMessage replaced IRC. Although email still exists, it faces challenges with spam.

Moreover, RSS performed poorly in competition with Twitter. Web2 addressed the limitations of Web1 protocols by providing specific features. Email and other decentralized protocols could not verify message legitimacy, sender identity, permissions, and economic relationships, making spam management a problem. In immature decentralized systems, the lack of these features allowed centralized competitors to surpass their predecessors by offering unique functionalities.


Web2 protocols are entirely controlled by their owners, limited only by business strategies and legal constraints. To promote the development of Web1 protocols, a more explicit economic model was needed. However, achieving an explicit economic model while maintaining decentralization is impossible unless decentralized consensus, verifiable computation, and cryptographic tools are utilized. Protocols typically transition from the lower left corner of the design space to the upper right corner.

Sometimes, protocols become effectively centralized, as email has. Over half of all email is handled by a few centralized providers, making the system highly centralized in practice. Email faces pressure from spam, the lack of an economic model, the cost of DNS registration, and high switching costs.


Without a viable economic model, email survives mainly as a side project of large tech companies. Spam reduction relies on economies of scale and data aggregation: a company hosting millions of email accounts can detect anomalies far more easily. Switching costs are another significant factor. We now need to recognize two centralizing forces that act on different components of a protocol and come into play at every turning point of the design process: network effects and switching costs.


Network effects refer to the phenomenon where power accumulates as the scale and widespread use of a system increase. Switching costs refer to the economic, cognitive, or time costs required for users to leave the current system and switch to another. In the case of email, switching costs are crucial for users of Gmail. If you use Gmail but do not have your own domain, the switching costs will be high.

However, if you own your domain, you can freely switch email service providers and continue using any provider to receive emails. A company can increase switching costs through protocol design, forcing or encouraging users to use specific components, thereby reducing the likelihood of users switching to other providers.

Take Reddit as an example; it is a system that allows moderators to unilaterally control sub-forums, blurring the lines between decentralization and centralization. While allowing anyone to become a moderator may be seen as a form of decentralization, if ultimate power remains concentrated in the hands of administrators (e.g., the Reddit team), it is still a fully centralized system.

A high-quality user experience does not require centralized power, but it usually does require funding. In the Web1 era, decentralized protocols often struggled to deliver a good user experience precisely because they lacked funding.

  • Web3: Decentralized & Explicit Economic Model

On platforms like Twitter, Facebook, Instagram, or TikTok, user choices are restricted, subject to the platform's interface decisions. However, how will the decentralized components introduced by Web3 change protocols? Utilizing cryptography and blockchain technology can reduce reliance on trust while clarifying economic principles and supporting decentralization. Web3 offers openness, interoperability, and open-source characteristics, with explicit economic models that can integrate funding into protocols for sustainable development, avoiding the monopolization of all value.


As a developer, building on a decentralized system with an explicit economic model is the best choice. It gives you confidence that the system will persist and that the economic relationships around it are legible, rather than developing opaquely outside the protocol. Stability and value capture each need to be considered on their own terms. Building on decentralized systems also helps avoid platform risk and gives a project the lasting potential to grow into the largest system in its category.

Building on the internet is no longer seen as a crazy act, because the internet itself is a decentralized system; likewise, open-source programming languages and web browsers have become reliable foundations for ambitious projects. Building on centralized systems, by contrast, limits a project's potential scope and reach. Web3 attracts talented developers who want to build larger, more ambitious projects, and new systems may emerge that comply with regulations, hold competitive advantages, and compete fiercely with existing Web2 platforms.

The biggest problem with Web2 networks is their fragility and over-optimized business models. These networks optimize for specific metrics while neglecting everything else, which stifles innovation and the development of new products. Their network effects, while strong, are not strong enough to form true monopolies, leaving them vulnerable once competitors target their weaknesses.

In contrast, Web3 provides a more resilient and innovative space through decentralization and explicit economic models. Similar to a rich and diverse rainforest ecosystem, Web3 systems establish infrastructure and protocols suitable for the development of various interesting things, providing a more fertile ground for innovation. By leveraging cryptocurrencies and token economic models, the creativity and adventurous spirit of participants can be rewarded, further driving the development of the system.

Thus, Web3 possesses better ecosystem sustainability and innovation potential, rather than merely relying on the accumulation of economic resources. The explicit economic model and decentralized characteristics enable Web3 to achieve genuine innovation and development, steering clear of the pitfalls of excessive optimization and concentration of accumulation in a single domain. By introducing cryptographic technology and token economic models, Web3 offers participants greater creative space and reward mechanisms, propelling the system towards a more valuable and lasting direction.

2. Web3 Protocol Design Case Studies

2.1. Case Background and Design Goals

Let's start with an interesting example, "Stable Horde," a free image-generation system and a Web2 protocol. It uses a collaborative worker layer that lets users request help from others to generate images: clients submit tasks to a work queue, workers perform the inference, and the results are written to a result store from which clients retrieve them, paying workers in Kudos points. In Stable Horde, Kudos is a free points system used to prioritize tasks. However, because compute is donated and limited, the longer the queue, the longer image generation takes.
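The client/queue/worker flow described above can be condensed into a minimal in-memory sketch. All names here (`JobQueue`, `submit`, `claim`, `deliver`, `collect`) are illustrative assumptions for this article, not the real Stable Horde API:

```python
import uuid
from collections import deque

class JobQueue:
    """Toy model of the Stable Horde flow: queue -> worker -> result store."""

    def __init__(self):
        self.pending = deque()   # jobs waiting for a worker, oldest first
        self.results = {}        # job_id -> generated image (stubbed as bytes)
        self.kudos = {}          # worker -> accumulated Kudos points

    def submit(self, prompt: str) -> str:
        """Client enqueues an image-generation request and gets a job id."""
        job_id = str(uuid.uuid4())
        self.pending.append((job_id, prompt))
        return job_id

    def claim(self):
        """Worker takes the oldest pending job (first come, first served)."""
        return self.pending.popleft() if self.pending else None

    def deliver(self, worker: str, job_id: str, image: bytes, reward: int = 10):
        """Worker posts the result and earns Kudos for the contribution."""
        self.results[job_id] = image
        self.kudos[worker] = self.kudos.get(worker, 0) + reward

    def collect(self, job_id: str):
        """Client retrieves the finished image, if ready."""
        return self.results.get(job_id)

queue = JobQueue()
job = queue.submit("a watercolor fox")
job_id, prompt = queue.claim()
queue.deliver("worker-1", job_id, b"<png bytes>")
print(queue.collect(job))       # b'<png bytes>'
print(queue.kudos["worker-1"])  # 10
```

Note that nothing in this sketch verifies the delivered bytes, which is exactly the gap the rest of the case study works through.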


We face an interesting question: How can we scale this system to make it larger and more professional while maintaining openness and interoperability, and avoiding the risk that centralization undermines the project's original spirit? One suggestion is to convert Kudos points into ERC-20 tokens recorded on a blockchain. However, simply bolting on a blockchain can trigger a series of issues, such as false-result attacks.

Let’s rethink the protocol design process. It should always start with a clear goal, then consider constraints, and finally determine mechanisms. When designing a system, it is necessary to weigh the goals and identify effective mechanisms. Constraints can take endogenous and exogenous forms, and by limiting the design space, mechanisms can be more clearly defined. Mechanisms are the substantive content of the protocol, such as settlement, pricing, staking, incentives, payments, and verification. The design should conform to constraints and meet explicit goals.


2.2. Web3 Protocol Example: Unstable Confusion

Let’s continue discussing a new Web3 protocol called "Unstable Confusion." In the following content, we will outline some interesting directions proposed in the context of converting the existing Web2 protocol "Stable Horde" into the Web3 protocol "Unstable Confusion."

As mentioned earlier, there is the issue of workers sending false results, so a mechanism is needed to ensure users receive the content they requested; this is "verified inference." In simple terms, we need to verify each inference to ensure its result meets expectations. Another issue involves the workers in "Stable Horde": workers poll the database for the next task, and tasks are assigned to whichever worker asks first.

However, in a system involving money, workers may claim tasks en masse to earn more rewards without actually intending to complete those tasks. They may compete for low latency, snatching tasks, leading to system congestion.

To address these issues, several solutions have been proposed. First is "payment by contribution," where workers are compensated based on their contributions, competing for tasks in a way that benefits the network. Second is "flexible participation," allowing workers to freely join or exit the system at a lower cost, attracting more participants.

Finally, "low latency" is crucial for user experience regarding the speed and responsiveness of applications. Returning to our goal, it is to establish a decentralized, interoperable image generation marketplace. While we still have some key constraints, more specific details can be added, modified, or clarified later. Now, we can evaluate the feasibility of different mechanisms.
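One way to make "payment by contribution" and "flexible participation" concrete is a claim-with-stake mechanism: a worker must lock a small stake to claim a task, and an expired claim forfeits the stake so the task can be re-queued. This is a hedged sketch of one possible design, not the protocol's actual mechanism; all names and numbers are invented for illustration:

```python
class ClaimRegistry:
    """Toy stake-backed task claiming to discourage mass task-hoarding."""

    def __init__(self, stake: int = 5, timeout: float = 30.0):
        self.stake = stake        # tokens locked per claim
        self.timeout = timeout    # seconds a worker has to deliver
        self.claims = {}          # task_id -> (worker, deadline, locked stake)
        self.balances = {}        # worker -> token balance

    def claim(self, worker: str, task_id: str, now: float):
        """Lock stake to reserve a task; hoarding N tasks costs N stakes."""
        if self.balances.get(worker, 0) < self.stake:
            raise ValueError("insufficient stake")
        self.balances[worker] -= self.stake
        self.claims[task_id] = (worker, now + self.timeout, self.stake)

    def complete(self, worker: str, task_id: str, now: float, reward: int = 10):
        """On timely delivery, return the stake plus the task reward."""
        owner, deadline, locked = self.claims.pop(task_id)
        assert owner == worker and now <= deadline
        self.balances[worker] += locked + reward

    def expire(self, task_id: str, now: float) -> bool:
        """Slash an overdue claim so the task can be re-queued."""
        owner, deadline, locked = self.claims[task_id]
        if now > deadline:
            del self.claims[task_id]  # stake is forfeited
            return True
        return False

reg = ClaimRegistry()
reg.balances["w1"] = 5
reg.claim("w1", "t1", now=0.0)
reg.complete("w1", "t1", now=10.0)
print(reg.balances["w1"])  # 15: stake returned plus reward
```

The design choice here is that claiming is no longer free, so racing to grab tasks you cannot finish becomes directly unprofitable.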

  • Potential Mechanism Design


a. Verification Mechanism

We can employ game-theoretic and cryptographic methods to ensure the accuracy of inferences. Game-theoretic mechanisms can power a dispute-resolution system in which users escalate disputes to designated arbiters. Continuous or sampled audits are another method: work is occasionally re-assigned to different workers for review, and the audited work is recorded. Zero-knowledge proofs can produce efficient cryptographic proofs that an inference was computed correctly. Traditional methods, such as trusted third parties and user ratings, also exist, but they carry centralization risks and network-effect problems.

Other possible verification mechanisms include having multiple workers complete the same task, from which users can choose results. This may incur higher costs, but if the costs are low enough, implementing this method could be considered.
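The redundant-computation check just described can be sketched as a majority vote over result digests. This assumes deterministic inference (same task, same output), which real diffusion models only satisfy with fixed seeds; all names here are hypothetical:

```python
import hashlib
from collections import Counter

def digest(result: bytes) -> str:
    """Compact fingerprint of a worker's output."""
    return hashlib.sha256(result).hexdigest()

def majority_result(results: dict):
    """Given worker -> output for one task run redundantly, return the
    majority output and the list of workers who disagreed with it
    (candidates for audit or slashing)."""
    counts = Counter(digest(r) for r in results.values())
    winner_hash, _ = counts.most_common(1)[0]
    dissent = [w for w, r in results.items() if digest(r) != winner_hash]
    canonical = next(r for r in results.values() if digest(r) == winner_hash)
    return canonical, dissent

out, cheaters = majority_result({
    "w1": b"image-A",
    "w2": b"image-A",
    "w3": b"garbage",
})
print(out)       # b'image-A'
print(cheaters)  # ['w3']
```

As the text notes, running every task k times multiplies cost by k, so this is only attractive when per-task compute is cheap or when applied as a sampled audit rather than on every task.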

b. Pricing Strategy

Regarding pricing strategies, an on-chain order book can be established, or an on-chain, verifiable compute-resource metric analogous to gas can be used. The latter differs from a simple free market in which users post the inference fee they are willing to pay and workers accept or bid on tasks. Instead, the protocol defines a gas-like metric: a given inference requires a certain amount of computational resources, which directly determines its price. This simplifies the operation of the entire mechanism.

Additionally, an off-chain order book can be utilized, which has lower operational costs and may be very efficient. However, the issue is that the person owning the order book may concentrate network effects on themselves.

c. Storage Mechanism

The storage mechanism is crucial to ensure that work results can be correctly delivered to users, but it is challenging to reduce trust risks and prove whether work has been correctly delivered. Users may question whether items have been delivered, similar to complaints about not receiving expected goods. Auditors may need to verify the computational process and check the accuracy of output results. Therefore, output results should be visible to the protocol and stored in a place accessible to the protocol.

In terms of storage mechanisms, we have several options. One is to store data on-chain, but this is expensive. Another option is to use dedicated storage encryption networks, which, while more complex, can attempt to solve problems in a peer-to-peer manner. Alternatively, data can be stored off-chain, but this raises other issues, as those controlling the storage system may influence the verification process and the transfer of final payments.
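A common middle ground between the options above is to keep the bytes off-chain but make delivery auditable by committing only a content hash to the protocol. This is a sketch of that idea under the assumption that the protocol can store a short commitment cheaply; the function names are illustrative:

```python
import hashlib

def commit(result: bytes) -> str:
    """On-chain commitment: a 32-byte hash is cheap to store and
    cryptographically binds the worker to this exact output."""
    return hashlib.sha256(result).hexdigest()

def verify_delivery(delivered: bytes, commitment: str) -> bool:
    """Client or auditor checks the delivered bytes against the
    committed hash before payment is released."""
    return hashlib.sha256(delivered).hexdigest() == commitment

c = commit(b"generated image bytes")
print(verify_delivery(b"generated image bytes", c))  # True
print(verify_delivery(b"tampered bytes", c))         # False
```

This addresses tampering but not withholding: a worker can still commit a hash and never serve the bytes, which is why the text worries about who controls the storage system and the release of payment.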

d. Task Allocation Strategy

The method of task allocation also needs to be considered, and it is a relatively complex area. Workers can choose tasks themselves after submission, the protocol can allocate tasks after submission, or users or end-users can choose specific workers. Each method has its pros and cons, and the protocol can also constrain which workers may request which tasks.

Task allocation involves many interesting details. For example, in a protocol-based system, it is necessary to know whether workers are online and available to decide whether to assign tasks to them. It is also essential to understand each worker's capabilities and load. Therefore, various additional factors need to be considered in the protocol, and the initial simple implementation may not have included these factors.
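The considerations above (online status, capability, load) can be folded into a simple protocol-side allocator. This is a hedged sketch; the worker record fields and least-loaded policy are assumptions for illustration:

```python
def assign(task: dict, workers: list):
    """Assign a task only to a worker that is online, advertises the
    required model, and has spare capacity; prefer the least-loaded one.
    Returns the chosen worker id, or None to leave the task queued."""
    candidates = [
        w for w in workers
        if w["online"]
        and task["model"] in w["models"]
        and w["load"] < w["capacity"]
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda w: w["load"])["id"]

workers = [
    {"id": "w1", "online": True,  "models": {"sd15"}, "load": 2, "capacity": 4},
    {"id": "w2", "online": True,  "models": {"sd15"}, "load": 0, "capacity": 2},
    {"id": "w3", "online": False, "models": {"sd15"}, "load": 0, "capacity": 2},
]
print(assign({"model": "sd15"}, workers))  # w2: online, capable, least loaded
```

This illustrates the text's point: even a "simple" allocator forces the protocol to track per-worker state (liveness, capability, load) that the initial first-come-first-served implementation never needed.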

3. Key Points of Decentralized Protocol Design


3.1. Seven Key Design Elements That May Lead to Centralization Risks

These include namespaces (as introduced in the email discussion), payment systems, and reputation, as well as storage, matching, pricing systems, and verification systems. These elements may become centralized through network effects or high switching costs. By mitigating the accumulation of network effects, guiding network effects into the protocol itself, and establishing decentralized control layers within the protocol to manage them, the long-term health of the system can be ensured. Volatile tokens or other governance designs, such as reputation systems or rotating-election mechanisms, can be used to achieve decentralized control.

3.2. Reducing Switching Costs and Promoting Interoperability

To encourage entrepreneurs to build applications on the system, it is crucial to reduce switching costs and promote interoperability between different systems. Avoid introducing high switching costs and reduce excessive reliance on off-chain order books or third-party verification systems.

3.3. Utilizing Web3 Technologies to Create Decentralized Systems

Utilize Web3 tools and principles to design systems that empower entrepreneurs and avoid excessive centralization. Protocols that embrace Web3 principles typically have larger scales, longer lifespans, and more vibrant ecosystem vitality, providing fertile grounds for innovative exploration beyond the boundaries set by existing major companies.

3.4. In-Depth Research and Selection of the Best Solutions

When designing protocols and determining strategies, it is essential to conduct in-depth research on various aspects. For verification, cryptographic solutions are often the best choice. For pricing, using on-chain verified computational resource metrics can adapt to various inference or machine learning tasks. For task allocation, employing protocols that update worker capabilities and statuses in real time can fairly allocate tasks and allow workers to choose whether to accept tasks. Storage issues can consider solutions like prototype sharding technology to address problems within short time windows and adopt temporary storage methods.

Considering the above points when designing decentralized systems can help build systems with long-term robustness and decentralized characteristics.
