The three dilemmas of airdrops: how should projects acquire users, retain them, and get airdrops right?

Deep Tide TechFlow
2024-03-27 21:38:15
Stop conducting one-time airdrops!

Written by: KERMAN KOHLI

Compiled by: Shenchao TechFlow

Recently, Starkware launched a highly anticipated airdrop. Like most airdrops, it has sparked plenty of controversy.

So why does this keep happening? You might hear some of the following viewpoints:

  • Team insiders just want to sell off and cash out billions of dollars

  • The team doesn't know better practices and hasn't received the right advice

  • Whales should be given higher priority because they bring total value locked (TVL)

  • Airdrops are meant to democratize participation in cryptocurrency

  • Without yield farmers, there would be no usage or stress testing of the protocol

  • Mismatched airdrop incentives continue to produce strange side effects

None of these viewpoints are wrong, but none are entirely correct either. Let's delve into some of these perspectives to ensure we have a comprehensive understanding of the current issues.

When conducting an airdrop, you must choose between three factors:

  • Capital Efficiency

  • Decentralization

  • Retention Rate

You will often find that airdrops perform well on one dimension but rarely achieve a good balance across two or all three dimensions.

Capital Efficiency refers to the criteria used to decide how many tokens each participant receives. The more capital-efficient the distribution, the more it resembles liquidity mining (one token for every dollar deposited), which benefits whales.

Decentralization refers to who receives your tokens and based on what criteria. Recent airdrops have adopted arbitrary criteria to maximize the breadth of the recipient crowd. This is often a good thing, as it can help you avoid legal trouble and makes more people rich, which earns goodwill.

Retention Rate measures how many recipients remain users after the airdrop. In a sense, it gauges how aligned recipients are with your goals: the lower the retention, the weaker the alignment. As an industry benchmark, a 10% retention rate means that only 1 out of 10 addresses is an actual user!

Setting retention rate aside, let's take a closer look at the first two factors: capital efficiency and decentralization.

Capital Efficiency

To understand the first point about capital efficiency, let's introduce a new term: the "sybil coefficient." It essentially measures how much extra benefit you gain by splitting one dollar of capital across a number of accounts.

Your position within this range will ultimately determine how wasteful your airdrop becomes. If your sybil coefficient is 1, technically, this means you are running a liquidity mining program, which will frustrate many users.

However, when you have a project like Celestia, where the sybil coefficient skyrockets to 143, you will see extremely wasteful behavior and rampant sybil farming.
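The article never formalizes this, so here is a minimal sketch under one assumption: the sybil coefficient is the reward from splitting a fixed amount of capital across N wallets, divided by the reward from keeping it in one wallet. Both reward functions below are hypothetical.

```python
# A minimal sketch of the "sybil coefficient" idea, assuming it is the
# ratio of rewards from splitting capital across N wallets versus
# keeping it in a single wallet. The reward functions are hypothetical.

def proportional_reward(deposit_usd: float) -> float:
    """Pure liquidity mining: one token per dollar deposited."""
    return deposit_usd

def flat_reward(deposit_usd: float, minimum_usd: float = 10.0) -> float:
    """Flat airdrop: every wallet above a small minimum gets 100 tokens."""
    return 100.0 if deposit_usd >= minimum_usd else 0.0

def sybil_coefficient(reward_fn, capital_usd: float, n_wallets: int) -> float:
    """Reward from splitting capital over n wallets, relative to one wallet."""
    split = n_wallets * reward_fn(capital_usd / n_wallets)
    single = reward_fn(capital_usd)
    return split / single

# Proportional rewards cannot be gamed by splitting: coefficient stays 1.
print(sybil_coefficient(proportional_reward, 10_000, 100))  # 1.0

# Flat per-wallet rewards pay off linearly in wallet count: coefficient == 100.
print(sybil_coefficient(flat_reward, 10_000, 100))  # 100.0
```

Under a purely proportional scheme there is nothing to gain from splitting; under a flat per-wallet scheme the coefficient grows linearly with the number of wallets, which is the kind of behavior a Celestia-style number suggests.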

Decentralization

This brings us to the second point about decentralization: you ultimately want to help the "little guy," who is a genuine user and willing to use your product early on, even though they may not be wealthy. If your sybil coefficient is close to 1, you will hardly give the "little guy" much of the airdrop, while most of it goes to "whales."

Now, the airdrop debate becomes heated. There are three types of users here:

  1. "Little Guy A," who just wants to make some quick money and then leave (perhaps using a few wallets in the process)

  2. "Little Guy B," who wants to stick around after receiving the airdrop and likes your product

  3. "Professional yield farmers who behave like many little guys," who are definitely there to take most of your incentives and then move on to the next project.

The third type is the worst, the first type is somewhat acceptable, and the second type is the best. How we distinguish between these three is a significant challenge in the airdrop issue.

So, how do you solve this problem? While I don't have a specific solution, I have a philosophical thought on how to address this issue that I've been contemplating and observing personally over the past few years: project-relative segmentation.

Let me explain what I mean. Zooming out, think about the meta-question: you have all the users, and you need to be able to segment them into several groups based on some value judgment. The value here is context-specific to the observer and will vary by project. Trying to impose some "magical airdrop filter" is never enough. By exploring the data, you can start to understand the true situation of your users and begin to make data-driven decisions on how to execute your airdrop specifically.
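To make "project-relative segmentation" a little more concrete, here is a minimal sketch in pandas. The feature columns and thresholds are hypothetical placeholders; each project would substitute features and cut-offs that reflect its own value judgments.

```python
# A minimal sketch of project-relative segmentation with pandas.
# Column names and thresholds are hypothetical; every project would
# pick features and cut-offs that match its own definition of value.
import pandas as pd

users = pd.DataFrame({
    "address":         ["0xa", "0xb", "0xc", "0xd"],
    "tx_count":        [250, 3, 40, 2],
    "distinct_days":   [90, 1, 30, 2],
    "wallet_age_days": [700, 5, 400, 3],
})

def segment(row) -> str:
    # Long-lived, consistently active wallets look like genuine users.
    if row.distinct_days >= 30 and row.wallet_age_days >= 180:
        return "likely genuine user"
    # Fresh wallets with a brief burst of activity look like farming.
    if row.wallet_age_days < 30 and row.tx_count < 10:
        return "likely farmer"
    return "needs review"

users["segment"] = users.apply(segment, axis=1)
print(users[["address", "segment"]])
```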

Why doesn't anyone do this? That's another article I will write in the future, but a very brief summary is that it's a problem that requires data expertise, time, and money. Not many teams are willing or able to do this.

Retention Rate

The last dimension I want to discuss is retention rate. Before we talk about it, it's best to define what it means. I would summarize it as follows: Retention Rate = Number of addresses still holding the airdrop / Number of addresses that received the airdrop

Most airdrops make a typical mistake of treating this as a one-time event.

To prove this point, some data is needed! Fortunately, OP has actually executed multiple rounds of airdrops. I had hoped to find a simple Dune dashboard with the retention data I wanted, but unfortunately there wasn't one, so I decided to gather the data myself.

I didn't want to overcomplicate things; I just wanted to understand one simple thing: how the percentage of recipients with a non-zero OP balance changes across consecutive airdrops.

I accessed this website to get the list of all addresses that participated in the OP airdrops. Then I built a small crawler to fetch the OP balance of each address in the list (using some of our internal RPC endpoints for this) and did some data processing.
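For the curious, a rough reconstruction of that script might look like the sketch below, using web3.py. The RPC URL is a placeholder, I'm assuming the standard OP token contract on OP Mainnet, and the recipient list would come from the airdrop file mentioned above.

```python
# A rough reconstruction of the balance check described above, using
# web3.py. The RPC URL is a placeholder; the token address is assumed
# to be the OP token on OP Mainnet; the address list would come from
# the airdrop recipient file.
from web3 import Web3

RPC_URL = "https://your-op-mainnet-rpc.example"          # placeholder
OP_TOKEN = "0x4200000000000000000000000000000000000042"  # OP token on OP Mainnet
ERC20_ABI = [{
    "name": "balanceOf", "type": "function",
    "stateMutability": "view",
    "inputs": [{"name": "owner", "type": "address"}],
    "outputs": [{"name": "", "type": "uint256"}],
}]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
token = w3.eth.contract(address=OP_TOKEN, abi=ERC20_ABI)

def share_with_nonzero_balance(addresses: list[str]) -> float:
    """Fraction of airdrop recipients still holding any OP at all."""
    holders = 0
    for addr in addresses:
        balance = token.functions.balanceOf(
            Web3.to_checksum_address(addr)).call()
        if balance > 0:
            holders += 1
    return holders / len(addresses)

# recipients = [...]  # loaded from the airdrop list
# print(f"retention: {share_with_nonzero_balance(recipients):.1%}")
```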

Before we dive deeper, an important note: each OP airdrop is independent of the previous one. There were no rewards tied to retaining tokens from an earlier round.

Airdrop 1

According to the criteria provided here, 248,699 recipients were sent tokens. In short, users received tokens based on the following actions:

  • OP mainnet users (92,000 addresses)

  • Repeated OP mainnet users (19,000 addresses)

  • DAO voters (84,000 addresses)

  • Multisig signers (19,500 addresses)

  • Gitcoin donors on L1 (24,000 addresses)

  • Users priced out of Ethereum by high gas fees (74,000 addresses)

After analyzing all these users and their OP balances, I obtained the following distribution. A balance of 0 indicates that the user has sold, since unclaimed OP tokens were eventually sent directly to eligible addresses; for details, see this website.

In any case, compared to previous airdrops I've observed, this first one was surprisingly good! Most holders retained over 90% of their allocation, and only 40% of addresses had a balance of 0, which is remarkably good.

Then I wanted to understand how each criterion played a role in whether users retained their tokens. The one problem with this approach is that addresses may belong to multiple categories, which can distort the data, so I won't take the numbers at face value; treat them as a rough indicator:

Among one-time OP users, the proportion with a balance of 0 is the highest, followed by those priced out of Ethereum. Clearly these are not the best user groups. Multisig signers have the lowest proportion, which I think is a good signal: yield farmers are unlikely to go to the trouble of setting up a multisig just to farm an airdrop!
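A hedged sketch of that per-criterion breakdown, assuming a DataFrame with one boolean flag per eligibility criterion plus the current balance; an address is counted once per criterion it matches, which is exactly the distortion noted above:

```python
# A sketch of the per-criterion breakdown. The flag columns and sample
# rows are hypothetical; real data would come from the recipient file.
import pandas as pd

df = pd.DataFrame({
    "op_user":        [True, True, False, False],
    "repeat_op_user": [False, True, False, False],
    "dao_voter":      [False, False, True, True],
    "multisig":       [False, False, False, True],
    "balance":        [0, 120, 0, 300],
})

criteria = ["op_user", "repeat_op_user", "dao_voter", "multisig"]
for criterion in criteria:
    group = df[df[criterion]]
    if len(group):
        zero_share = (group["balance"] == 0).mean()
        print(f"{criterion:>15}: {zero_share:.0%} of addresses at 0 balance")
```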

Airdrop 2

This airdrop was distributed to 307,000 addresses, but in my opinion it was not thought through well enough. The criteria were as follows:

  • Governance rewards based on the amount of OP delegated and the duration of delegation.

  • Partial gas refunds for active OP users who spent a certain amount on gas fees.

  • Multiplier rewards determined by additional attributes related to governance and usage.

To me, this intuitively feels like a poor set of criteria, because governance voting is easily gamed by bots and is quite predictable. As we will see below, my intuition was not far off; I was surprised at how low the actual retention was!

Close to 90% of addresses held 0 OP balance! This is a common retention statistic that people are used to seeing with airdrops. I would love to discuss this further, but I want to turn to the remaining airdrops.

Airdrop 3

This was definitely the best-executed airdrop by the OP team, with more complex criteria than before. It was distributed to about 31,000 addresses, making it smaller but more effective. Here are the details, source link here:

  • Cumulative OP delegated per day, i.e., delegating 20 OP for 100 days counts as 20 * 100 = 2,000 OP-days (see the sketch after this list).

  • Delegates must have cast an on-chain vote in OP governance during the snapshot window (January 20, 2023, 00:00 UTC to July 20, 2023, 00:00 UTC).
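A minimal sketch of the OP-days arithmetic from the first criterion; the input format is hypothetical:

```python
# Each delegation contributes amount * days_delegated, summed over the
# snapshot window. The (amount, days) pair format is hypothetical.

def cumulative_op_days(delegations: list[tuple[float, int]]) -> float:
    """delegations: (op_amount, days_delegated) pairs within the window."""
    return sum(amount * days for amount, days in delegations)

# 20 OP delegated for 100 days, then 50 OP for 10 days:
print(cumulative_op_days([(20, 100), (50, 10)]))  # 2500.0 OP-days
```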

A key detail to note here is that the on-chain voting criterion covered a period after the previous airdrop. Users who participated in the earlier round might have thought, "Okay, I've done what I needed for the airdrop; time to move on to the next thing." This is great because it helps isolate the retention statistics!

Only 22% of airdrop recipients had a token balance of 0! To me, this indicates that the wastefulness of this airdrop is far less than any previous ones. This supports my argument that retention rate is crucial, and the additional data from multiple airdrops is more useful than people think.

Airdrop 4

This airdrop was distributed to 23,000 addresses and had more interesting criteria. I initially expected retention to be high, but after some thought I have a hypothesis for why it came in lower than expected. The criteria:

  • You created NFTs that were used on the Superchain: measured by the total gas spent in transactions transferring NFTs created by your address on OP Chains (OP Mainnet, Base, Zora) over the 365 days before the airdrop deadline (January 10, 2023 to January 10, 2024).

  • You created appealing NFTs on Ethereum mainnet: measured by the total gas spent on Ethereum L1 in transactions transferring NFTs created by your address over the same 365-day window.

You would certainly think that those who create NFT contracts would be a good indicator, right? Unfortunately, that is not the case. The data indicates the opposite.

While the situation is not as bad as airdrop 2, we took a significant step back in retention rate compared to airdrop 3.

My hypothesis is that if they had applied extra filtering to exclude NFT contracts marked as spam, or required some signal of "legitimacy," these numbers would have improved significantly; the criterion is simply too broad. Additionally, since tokens were airdropped directly to these addresses (no claim required), you end up with scam NFT creators thinking, "Wow, free money. Time to sell."

Conclusion

As I wrote this article and gathered data myself, I managed to prove/disprove some of my assumptions, which turned out to be very valuable. In particular, the quality of your airdrop is directly related to your filtering standards. Those trying to create a universal "airdrop score" or use advanced machine learning models will fail due to inaccurate data or a lot of false positives. Machine learning is great until you try to understand how it arrived at its conclusions.

While writing the scripts and code for this article, I obtained data from the Starkware airdrop, which was also an interesting exercise. I will discuss this in the next article. The key points the team should learn from this are:

  • Stop doing one-off airdrops! It's like shooting yourself in the foot. Deploy incentives the way you would run A/B tests: iterate extensively and let past results guide your future rounds.

  • By building criteria on top of past airdrops, you improve your efficiency. For example, give more tokens to wallets that still hold tokens from previous rounds (see the sketch after this list), and make it clear to users that they should stick to one wallet and change wallets only when absolutely necessary.

  • Get better data to enable smarter, higher-quality airdrop segmentation. Bad data = bad results. As we saw above, the less "predictable" the criteria, the better the retention results.
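As a hedged illustration of the second point, here is a sketch of a cross-airdrop multiplier; the multiplier values and inputs are hypothetical, not anything OP or the author prescribes:

```python
# Wallets that still held tokens from earlier rounds get a multiplier
# in the next one. Multiplier values and data shape are hypothetical.

def allocation(base_tokens: float, held_round_1: bool, held_round_2: bool) -> float:
    """Boost the base allocation for each prior round still held."""
    multiplier = 1.0
    if held_round_1:
        multiplier += 0.5  # held through round 1
    if held_round_2:
        multiplier += 0.5  # held through round 2
    return base_tokens * multiplier

print(allocation(100, held_round_1=True, held_round_2=True))    # 200.0
print(allocation(100, held_round_1=False, held_round_2=False))  # 100.0
```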
