
The Airdrop Trilemma

    Written by: Kerman Kohli

    Compiled by: Deep Wave TechFlow


    Recently, Starkware launched its highly anticipated airdrop campaign. Like most airdrops, this caused a lot of controversy.

    So why does this happen over and over again? Some of the explanations you'll hear include:

    • Team insiders just want to sell off and cash out billions

    • The team doesn’t know what to do better and isn’t getting the right advice

    • Whales should be given higher priority because they bring in total value locked (TVL)

    • Airdrops are an effort to democratize participation in cryptocurrencies

    • Without airdrop farmers, there would be no usage or stress testing of protocols

    • Mismatched airdrop incentives continue to have strange side effects

    None of these ideas are wrong, but none of them are entirely correct either. Let’s dive into some of these perspectives to ensure we have a comprehensive understanding of the issues at hand.

    When making an airdrop you have to choose between three factors:

    • Capital efficiency

    • Decentralization

    • Retention rate

    You tend to find that airdrops perform well in one dimension, but rarely strike a good balance in two or all three dimensions.

    Capital efficiency refers to the criteria used to determine how many tokens are provided to participants. The more efficiently you distribute your airdrop, the more it turns into liquidity mining (one token for every dollar deposited), which benefits whales.

    Decentralization refers to who gets your tokens and according to what criteria. Recent airdrops have adopted an approach that employs arbitrary criteria to maximize the reach of those who receive the tokens. This is usually a good thing: it keeps you out of legal trouble, and making more people money earns you goodwill.

    Retention rate measures how many users stick around after the airdrop. In a sense, it gauges how well your users' behavior matches your intent: the lower the retention rate, the more your users diverge from it. As an industry benchmark, a 10% retention rate means that only 1 out of 10 addresses is a real user!

    Putting retention aside, let’s look at the first two factors in more detail: capital efficiency and decentralization.

    Capital efficiency

    To understand the first point about capital efficiency, let's introduce a new term: the "sybil coefficient". It basically measures how much extra benefit an actor gets by splitting a dollar of capital across a number of accounts instead of keeping it in one.

    [Chart: the sybil coefficient spectrum]

    Where you fall within this range will ultimately determine how wasteful your airdrop will be. If your sybil coefficient is 1, technically this means you are running a liquidity mining scheme that will piss off many users.

    However, when you look at a project like Celestia, where the sybil coefficient explodes to 143, you see extremely wasteful behavior and rampant sybil farming.
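    To make the idea concrete, here is a minimal sketch of the sybil coefficient under an assumed toy payout model (the flat per-address reward and per-dollar rate are illustrative, not from the article): the ratio of what you earn by splitting a deposit across many accounts to what you earn keeping it in one account.

```python
# Toy airdrop payout model (hypothetical parameters, for illustration only):
# each qualifying address gets a flat base reward plus a pro-rata share
# proportional to its deposit.
def airdrop_payout(deposit, n_accounts, base_reward=100.0, rate_per_dollar=0.1):
    per_account = deposit / n_accounts
    return n_accounts * (base_reward + rate_per_dollar * per_account)

def sybil_coefficient(deposit, n_accounts, **kw):
    """How much more you earn by splitting capital across n_accounts
    versus keeping it all in a single account."""
    return airdrop_payout(deposit, n_accounts, **kw) / airdrop_payout(deposit, 1, **kw)

# Pure liquidity mining (no flat reward): splitting gains nothing.
print(sybil_coefficient(1000, 10, base_reward=0.0))  # 1.0
# With a flat per-address reward, splitting pays off handsomely.
print(sybil_coefficient(1000, 10))  # 5.5
```

    This matches the text above: a coefficient of 1 means payouts track capital exactly (liquidity mining), while any flat per-address component pushes the coefficient up and rewards sybils.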

    Decentralization

    This brings us to the second point about decentralization: what you ultimately want to help is the “little guy” who is a real user and is willing to use your product early, even though they are not wealthy. If your sybil coefficient is close to 1, then you will give almost no airdrops to the "little guys" and most of the airdrops to the "whales".

    Now, the airdrop debate is getting heated. There are three types of users here:

    1. "Little guy A", they just want to make some quick money and leave (maybe using a few wallets in the process)

    2. "Little guy B", they still want to stay after getting the airdrop and like your product

    3. Professional airdrop farmers posing as many "little guys", who are absolutely out to capture most of your incentives and move on to the next project.

    The third type is the worst, the first type is still acceptable to some extent, and the second type is the best. How we distinguish between the three is a major challenge in the airdrop problem.

    So, how do you solve this problem? While I don't have a concrete solution, I have a philosophical thought on how to solve this problem that I've been thinking about and observing personally over the past few years: project-relative segmentation.

    I'll explain what I mean. Zoom in and think about the meta-problem: you have all your users, and you need to be able to group them into groups based on some value judgment. The value here is relative to the observer's specific context and therefore will vary from project to project. Trying to impart some kind of "magic airdrop filter" is never enough. By exploring the data, you can start to understand what your users are really like and start making data science-based decisions about how to execute your airdrops.
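    As a concrete (and entirely hypothetical) illustration of project-relative segmentation, here is a sketch that buckets airdrop recipients using simple rules over a few on-chain features. The feature names and thresholds are mine, not the article's; a real project would choose features and cutoffs that reflect its own context.

```python
# Hypothetical segmentation sketch. addr_stats maps an address to a dict of
# illustrative features: tx_count, distinct_active_days, retained_balance.
def segment_addresses(addr_stats):
    segments = {"farmer": [], "tourist": [], "real_user": []}
    for addr, s in addr_stats.items():
        if s["tx_count"] > 500 and s["distinct_active_days"] < 5:
            # Burst of activity compressed into a few days: likely a sybil.
            segments["farmer"].append(addr)
        elif s["retained_balance"] == 0:
            # Claimed and immediately sold: "little guy A".
            segments["tourist"].append(addr)
        else:
            # Organic activity and still holding: "little guy B".
            segments["real_user"].append(addr)
    return segments
```

    The point is not these particular rules but the workflow: pull the data, look at it, and encode a value judgment that is specific to your project rather than relying on a universal "magic airdrop filter".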

    Why doesn't anyone do this? This is another article I will write in the future, but the very short summary is that this is a difficult problem that requires data expertise, time, and money. Not many teams are willing or able to do this.

    Retention rate

    The last dimension I want to discuss is retention. Before we talk about it, it's a good idea to define what retention rate means. I would summarize it as follows: Retention rate = number of people who retained the airdrop / number of people who received the airdrop

    A classic mistake most airdrops make is making this a one-time thing.

    To prove this, I figured I needed some data! Luckily, Optimism (OP) has actually performed multiple rounds of airdrops. I hoped to find a simple Dune dashboard with the retention data I wanted, but no luck. So I decided to get the data myself.

    I don’t want to overcomplicate it, just want to understand one simple thing: how the percentage of users with non-zero OP balance changes as successive airdrops progress.

    I visited this website and got a list of all addresses participating in the OP airdrop. I then built a small crawler that manually fetches the OP balance for each address in the list (using some of our internal RPC endpoints for this) and did some data processing.
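    Once the crawler has produced an address-to-balance map, the analysis itself is one line. A minimal sketch (the fetching step is assumed to have already happened; the data below is illustrative):

```python
# Given a {address: current_OP_balance} map produced by the crawler,
# compute the share of airdrop recipients who now hold zero OP.
def pct_zero_balance(balances):
    zero = sum(1 for b in balances.values() if b == 0)
    return 100.0 * zero / len(balances)

# Illustrative data, not real airdrop balances:
sample = {"0x1": 0, "0x2": 5, "0x3": 0, "0x4": 1}
print(pct_zero_balance(sample))  # 50.0
```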

    Before we delve deeper, an important note is that each OP airdrop is independent of the previous airdrop. There are no rewards or links to retain tokens from the previous airdrop.

    Airdrop 1

    Tokens were issued to 248,699 recipients according to the criteria provided here. In short, users were awarded tokens based on the following actions:

    • OP Mainnet Users (92,000 addresses)

    • Repeat OP Mainnet users (19,000 addresses)

    • DAO voters (84,000 addresses)

    • Multisig signers (19,500 addresses)

    • Gitcoin donors on L1 (24,000 addresses)

    • Users priced out of Ethereum (74,000 addresses)

    After analyzing all these users and their OP balances, I got the following distribution. A balance of 0 indicates that the user sold, since unclaimed OP tokens were sent directly to eligible addresses (see the linked page for details).

    Regardless, compared to previous airdrops I have observed, this first one performed surprisingly well! Most airdrops see sell-off rates above 90%; only 40% of addresses sitting at a 0 balance is surprisingly good.

    [Chart: OP balance distribution of Airdrop 1 recipients]

    I then wanted to understand how each criterion plays a role in determining whether a user is likely to retain tokens. The only problem with this approach is that addresses may belong to more than one category, which skews the data. I wouldn't take it at face value, but it's a rough indicator:

    [Chart: zero-balance share by eligibility criterion]

    Among one-time OP users, the highest proportion are those with a 0 balance, followed by users priced out of Ethereum. It's obvious that these are not the best user base. Multisig signers have the lowest share, which I think is a good signal, because setting up a multisig just to farm airdrops is not an obvious move for airdrop farmers!

    Airdrop 2

    This airdrop was distributed to 307,000 addresses, but in my opinion it was not well thought out. The criteria were as follows:

    • Governance delegation rewards based on the amount of OP delegated and the delegation duration.

    • Partial gas refund for active OP users who have spent a certain amount on gas fees.

    • Multiplier rewards determined by additional attributes related to governance and usage.

    To me, this doesn't intuitively feel like a good standard because governance voting is something that's easily manipulated by bots and fairly predictable. As we'll see below, my intuition wasn't too far off the mark. I was surprised at how low the actual retention rate was!

    [Chart: OP balance distribution of Airdrop 2 recipients]

    Close to 90% of addresses hold 0 OP balance! This is a common airdrop retention statistic that people are used to seeing. I'd love to discuss this in more depth, but I'd rather turn to the remaining airdrops.

    Airdrop 3

    This is definitely the best-executed airdrop by the OP team. Its criteria were more sophisticated than ever. This airdrop was distributed to approximately 31,000 addresses, making it smaller but more effective. Here are the details (source here):

    • Cumulative OP delegated per day (e.g. 20 OP delegated for 100 days: 20 × 100 = 2,000 OP-days).

    • Delegates must have voted on-chain in OP governance during the snapshot period (January 20, 2023 00:00 UTC to July 20, 2023 00:00 UTC).

    A key detail to note here is that the on-chain voting window falls after the previous airdrop period. Users who participated in the first round thinking "ok, I've done what the airdrop needed, time to move on to the next thing" are therefore filtered out. This is great for the analysis, and look at those retention statistics!
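    The "OP-days" criterion above is just a sum of amount × duration over each delegation period. A minimal sketch (the period representation is my own, for illustration):

```python
# Cumulative delegation measured in OP-days: sum of (amount delegated ×
# days delegated) over each delegation period.
def op_days(periods):
    """periods: list of (op_amount, days_delegated) tuples."""
    return sum(amount * days for amount, days in periods)

# The article's example: 20 OP delegated for 100 days.
print(op_days([(20, 100)]))  # 2000
```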

    [Chart: OP balance distribution of Airdrop 3 recipients]

    Only 22% of airdrop recipients have a token balance of 0! To me, this shows that this airdrop was far less wasteful than any previous one. It fits with my argument that retention is critical, and that the additional data from multiple airdrops is more useful than people give it credit for.

    Airdrop 4

    This airdrop was distributed to a total of 23,000 addresses and had more interesting criteria. I personally thought the retention rate would be high this time around, but after thinking about it, I have a hypothesis as to why it might be lower than expected:

    • You created NFTs on the Superchain that saw transaction activity: measured by the total gas used by transactions on the OP Chains (OP Mainnet, Base, Zora) that transferred NFTs created by your address, within the 365 days before the airdrop deadline (January 10, 2023 to January 10, 2024).

    • You created NFTs on Ethereum mainnet that saw transaction activity: measured by the total gas used on Ethereum L1 by transactions transferring NFTs created by your address, within the same 365-day window.

    You would surely think that people creating NFT contracts would be a good indicator, right? Unfortunately, this is not the case. The data suggests the opposite.

    [Chart: OP balance distribution of Airdrop 4 recipients]

    Although the situation is not as bad as Airdrop 2, compared to Airdrop 3, we have taken a big step back in terms of retention rate.

    My hypothesis is that if they had additionally filtered out NFT contracts marked as spam, or applied some sort of "legitimacy" check, these numbers would improve significantly. The criterion is too broad. Additionally, since the tokens were airdropped directly to these addresses (without having to claim them), scam NFT creators end up thinking "Wow, free money. Time to sell."
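    The extra filtering hypothesized above could look something like this minimal sketch. The data shape and the spam list are hypothetical; the point is simply to exclude flagged contracts before crediting gas toward eligibility.

```python
# Hypothetical refinement of the NFT gas criterion: ignore transfer
# transactions involving contracts flagged as spam before summing gas.
def eligible_nft_gas(nft_transfer_txs, spam_contracts):
    """nft_transfer_txs: list of {"contract": addr, "gas_used": int} dicts.
    spam_contracts: set of contract addresses flagged as spam."""
    return sum(tx["gas_used"] for tx in nft_transfer_txs
               if tx["contract"] not in spam_contracts)

txs = [
    {"contract": "0xgood", "gas_used": 100},
    {"contract": "0xspam", "gas_used": 900},
]
print(eligible_nft_gas(txs, {"0xspam"}))  # 100
```

    With a filter like this, the scam creator generating spam NFT volume earns nothing, while legitimate creators are unaffected.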

    Final thoughts

    As I wrote this article and obtained the data myself, I managed to prove or disprove some of my hypotheses, which was very valuable. In particular, the quality of your airdrop is directly related to your screening criteria. People trying to create a universal "airdrop score" or use advanced machine learning models will fail due to inaccurate data or a large number of false positives. Machine learning is great until you try to understand how it arrives at the answer.

    While writing the scripts and code for this article, I also obtained data for the Starkware airdrop, which was a fun exercise. I will talk about that in my next post. The key points teams should take away from this are:

    • Stop doing one-off airdrops! This is shooting yourself in the foot.
