On-chain data analysis: How can we extract value insights from the rapidly growing DeFi world?
Contributors: Ye Wang@THUBA Research Core
Reviewers: Yofu@DAOrayaki
Abstract
Key Takeaway
Currently, mainstream on-chain data analysis tools are in the transition phase from Web2 to Web3, and future data analysis services for DeFi will certainly be rooted in the native data characteristics of Web3. In particular, the ability to gain data insights based on smart contract code logic will become the most important moat for on-chain data analysis platforms.
Data products have also diverged along two development paths. One focuses on building developer and user communities: it primarily targets developers and technical users, encouraging them to publish their own data insights as community contributions and thereby creating a positive feedback loop for the product; Dune is the canonical example. The other relies on powerful in-house analytical capabilities to provide exclusive, customized data insight services for professional institutional investors, as Nansen and Glassnode do. Once a mature developer community is established, community-driven data tools can also produce relatively high-quality insights through product iteration and community feedback, so a combination of the two models is likely to become mainstream.
From Web2 to Web3: Overview of On-Chain Data
As the name suggests, on-chain data refers to "data recorded on the blockchain." Because the blockchain is a public, decentralized distributed ledger, all transactions stored and verified cannot be altered or deleted, and anyone can access it. Therefore, on-chain data has characteristics such as public transparency, immutability, and security. These features mean that people can publicly view all transactions recorded on the blockchain and extract various raw data, including the amount and time of token transfers, wallet addresses, fees paid to miners, and on-chain transaction volumes.
With the development of smart contracts, decentralized applications (Dapps) have disrupted the architecture of traditional Web2 applications: the backend of a Web2 application is replaced by smart contracts. Any data generated by interacting with a smart contract is published on the blockchain, where anyone can access it, making it a public good; this includes asset information, transaction data, and contract code. In theory, as long as there is enough block space, any data can be stored on-chain, and some projects are even attempting to use the blockchain itself as a database. The large-scale use of Dapps has greatly enriched on-chain data, which now covers activity data from interactions across the entire blockchain ecosystem.
Web3 Application Architecture
Image source: Preethi Kasireddy, The Architecture of a Web 3.0 application
If we classify on-chain data, it can be roughly divided into the following three categories:
- Transaction data: Data containing transaction information, such as transfer amounts, sending and receiving addresses, transaction fees, etc.;
- Block data: Data about the block itself, such as timestamps, validators, included transactions, and transaction ordering;
- Contract code: Non-user-interaction-generated code data deployed on the blockchain, i.e., pre-defined smart contract code.
*Note:
User assets and transaction data on centralized exchanges (CEX) are stored in their internal databases, and each transaction is settled and recorded only within their database; transaction data is not on-chain. Therefore, it is important to clarify that the transaction data of such CEXs is not included in on-chain data and cannot be queried through blockchain explorers.
Data can only be appended to the Ethereum blockchain; users can never update existing data.
Taking Ethereum as an example, we can query information about any on-chain transaction on the blockchain explorer Etherscan. Below is a transaction on block #15537394 (transaction hash 0x3af096859a880d9c33718eda59cb96e1504db7390d0e086c7260d91e87139eab), which is a PoS block after Ethereum's Merge.
As can be seen, the page contains information about the transaction's hash, status, block, timestamp, addresses of both parties, the amount of ETH transferred in the transaction, transaction fees, gas fees, and more. This information constitutes the basic components of on-chain data.
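Among these fields, the transaction fee relates to gas in a simple way: fee = gas used × price paid per unit of gas, denominated in wei (10^18 wei = 1 ETH). A minimal sketch with hypothetical values:

```python
from decimal import Decimal

WEI_PER_ETH = Decimal(10) ** 18

def tx_fee_eth(gas_used: int, effective_gas_price_wei: int) -> Decimal:
    """Transaction fee in ETH = gas actually consumed * price paid per gas unit."""
    return Decimal(gas_used) * Decimal(effective_gas_price_wei) / WEI_PER_ETH

# Hypothetical values: a plain ETH transfer consumes 21,000 gas; assume 20 gwei per gas.
fee = tx_fee_eth(21_000, 20 * 10**9)
print(fee)  # 0.00042
```

Block explorers such as Etherscan display exactly this product (post-EIP-1559, part of the effective gas price is burned as the base fee and the rest goes to the validator as a tip).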
Track Analysis
Because market prices (especially in the cryptocurrency market) are susceptible to manipulation, broader market information plays an important role in gaining market insight. As an on-chain infrastructure, on-chain data analysis bridges this information gap: it lets traders observe all activity occurring within the blockchain ecosystem, understand how funds flow through the market, and assess which activities may foreshadow price changes, thereby analyzing trends in the cryptocurrency market. For crypto investment, on-chain data is akin to the publicly available financial and business data of traditional stock markets, enabling investors to value different Web3 projects on a reasonable basis and refine their investment decisions.
For example, investors can query the TVL (Total Value Locked) of DeFi protocols through DefiLlama to assess a protocol's value. TVL is the total value of all tokens locked in a protocol's contracts. If a user deposits $100 worth of cryptocurrency into a DeFi lending platform, the platform's TVL increases by $100; if deposits from many users reach a total of $1 million, the platform's TVL is $1 million.
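The TVL arithmetic described here is a straightforward sum over locked balances; a minimal sketch with a hypothetical snapshot of balances and prices:

```python
def protocol_tvl(locked_positions: dict, prices_usd: dict) -> float:
    """TVL = sum over all locked tokens of (amount locked * current USD price)."""
    return sum(amount * prices_usd[token] for token, amount in locked_positions.items())

# Hypothetical snapshot of a lending pool's locked balances and token prices.
locked = {"ETH": 500.0, "USDC": 1_000_000.0}
prices = {"ETH": 2_000.0, "USDC": 1.0}
print(protocol_tvl(locked, prices))  # 2000000.0
```

Real trackers like DefiLlama do the same aggregation across every contract of every protocol, repricing balances continuously, which is why TVL moves with both deposits/withdrawals and token prices.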
TVL reflects the liquidity and potential risks of the protocol, thus indicating its popularity and the level of trust users have in the protocol. For instance, due to the Ethereum Shanghai upgrade in March allowing staked ETH to be withdrawn, many believe that more users will be willing to stake their ETH. As the leading project in Ethereum liquid staking, Lido quickly became the market darling, with its TVL continuously increasing after the new year, surpassing Maker to become the DeFi protocol with the highest TVL.
Conversely, if a protocol currently has a low TVL with a downward trend, it indicates insufficient user confidence in the protocol, which may present risk points, and its price may be overvalued.
As shown in the figure below, following the UST de-pegging and LUNA collapse in May 2022, the total TVL of the DeFi market rapidly shrank from about $140B to below $80B. In June, affected by the collapse of crypto lending giant Celsius, the plummeting of stETH, and the bankruptcy events of a series of affected crypto companies (such as Voyager and Three Arrows Capital), the total TVL fell again to about $52B. The entire market entered a bear market, with the total TVL of the DeFi market fluctuating between $40-50B.
Market Size of On-Chain Data Analysis
Judging from current data services, on-chain data analysis products generally monetize through subscriptions. Market size can be roughly estimated with the standard formula:
TAM = number of customers × average purchase frequency per period × average transaction value
Currently, data analysis businesses can be divided into ToB and ToC, with their provided data services roughly categorized into data APIs, higher granularity/customized data tables, and data research reports. Although on-chain data is publicly transparent and can be accessed by anyone, only effective data analysis can unleash its potential. Therefore, on-chain data analysis products have extremely high added value, with many on-chain data products targeting professional users and having high average transaction values (over $200/month).
Currently, the number of unique addresses trading DeFi assets is 6.868M. If 10% of them are potential users, taking Dune's monthly fee of $420 as an example, then TAM = 6.868M * 10% * $420/month * 12 months = $3.46 billion/year.
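This back-of-envelope estimate can be reproduced directly (the 10% conversion rate is the article's assumption, not measured data):

```python
unique_defi_addresses = 6_868_000   # unique addresses trading DeFi assets, per the article
conversion_rate = 0.10              # assumed share that become paying users
monthly_fee_usd = 420               # Dune's member-tier monthly fee

tam_per_year = unique_defi_addresses * conversion_rate * monthly_fee_usd * 12
print(f"${tam_per_year / 1e9:.2f}B/year")  # $3.46B/year
```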
In reality, however, tiered pricing differs enormously across applications (sometimes by a factor of several dozen), and institutional and individual users differ greatly in their willingness and ability to pay for data services. This article therefore holds that the TAM formula alone cannot accurately measure the market size of on-chain data services. For reference, we can compare the current global data market size and the valuations of data service giants:
According to Statista, the global big data and business analytics (BDA) market was valued at approximately $274.3 billion in 2022. The market capitalization of traditional financial data provider Thomson Reuters is about $56.7 billion. Even business intelligence applications like Tableau have valuations reaching $15.7 billion. In contrast, the unicorn Dune, which is a benchmark for on-chain data analysis platforms, has only recently reached a valuation of $1 billion.
Currently, the total market capitalization of the crypto market has just climbed back above $1 trillion, and the total TVL of the DeFi market is below $50 billion. During the last bull market, however, total crypto market capitalization approached $3 trillion, and DefiLlama data shows that DeFi TVL peaked at $180 billion. According to a report by Grand View Research, the global Web3.0 blockchain market is projected to grow at a compound annual growth rate (CAGR) of 44.9% from 2022 to 2030.
A MarketsandMarkets report predicts that the global blockchain market will reach $67.4 billion by 2026. As market confidence rebounds, DeFi valuations should recover, and on-chain data products still have broad application prospects; according to @0xwayne_z, the on-chain data market should currently be worth tens of billions of dollars.
Dissection of On-Chain Data Analysis Product Tracks
We can dissect the on-chain data product track from the following three aspects.
From the perspective of the data stack, a complete on-chain data analysis process requires the following steps: data extraction (Extract) → data cleaning and transformation (Transform) → loading (Load) into the data warehouse → analyzing data based on different business lines and observation metrics → outputting quantitative results. Blockchain data products can be divided into data sources, data development tools (Data Dev), and data applications (Data App), while the on-chain data analysis products we generally discuss mainly include the latter two.
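As a minimal sketch of this pipeline, the snippet below runs extract → transform → load → analyze, with SQLite standing in for the data warehouse; the sample transactions and schema are illustrative, not any product's actual format.

```python
import sqlite3

# Extract: raw transactions as they might arrive from a node (hypothetical sample).
raw_txs = [
    {"hash": "0xaa", "from": "0x2", "to": "0x1", "value_wei": 10**18},
    {"hash": "0xbb", "from": "0x3", "to": "0x1", "value_wei": 5 * 10**17},
]

# Transform: convert wei to ETH and keep only the columns an analyst will query.
rows = [(t["hash"], t["from"], t["to"], t["value_wei"] / 1e18) for t in raw_txs]

# Load: store in a relational table so it can be queried with ordinary SQL.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE transactions (hash TEXT, sender TEXT, recipient TEXT, value_eth REAL)")
db.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", rows)

# Analyze: a business-level metric, e.g. total ETH received per address.
for recipient, total in db.execute(
    "SELECT recipient, SUM(value_eth) FROM transactions GROUP BY recipient"
):
    print(recipient, total)  # 0x1 1.5
```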
From the perspective of data processing mechanisms, the data processing mode can be divided into Web2-native or Web3-native, distinguishing whether the product has transplanted and improved Web2's processing logic or developed a unique methodology based on the characteristics of Web3 activity data. This perspective mainly focuses on whether the data analysis product can find similar data processing logic in Web2.
From the perspective of data application fields, the currently relatively mature Web3 tracks for data analysis applications include DeFi, NFT, and GameFi, as these tracks themselves are large in scale, have a high demand for data-driven insights, and generate sufficient amounts and richness of on-chain interaction data to support data mining and analysis. This article mainly focuses on on-chain data analysis applications in the DeFi field, which include tracking and insight platforms for different dimensions of data such as markets, transactions, wallet addresses, asset portfolios, lending, and arbitrage.
According to @zk7hao's classification, many players already occupy niches in the on-chain data analysis track. By target user, Dapps can be divided into consumer-facing products and products for enterprises and professional users, while data development tools split into those aimed at developer communities and those aimed at enterprises. By vertical, there are also data products targeting the specific analytical needs of DEXs, NFTs, AML, DAOs, and so on.
Crypto Data Stack product ecosystem. Image source: @zk7hao
Competitive Landscape of DeFi Data Analysis Products
Since DeFi is currently the track with the highest demand for data analysis, this article focuses on interpreting DeFi data analysis products.
Y-axis: data processing stage, from data development to data insight
X-axis: data processing mechanism, from Web2-native to Web3-native
Competitive Landscape of DeFi Data Analysis Products
Data analysis development tools target data analysis developers and can complete ETL (Extract-Transform-Load) tasks, parsing transaction, state, and event log data into formats that can be queried using traditional languages like SQL or GraphQL and stored in databases for subsequent queries. They offer a high degree of customization but do not provide data analysis results. Leading application: Dune Analytics.
Data analysis insight tools target ordinary users, providing directly accessible data content for investors' decision-making reference. They generally present results through data visualization dashboards, making them highly readable. They offer a low degree of customization, and users can only read pre-analyzed data results, making it difficult to customize outputs according to their needs. Leading application: Nansen.
Dune, Footprint, and similar tools reproduce data-handling logic familiar from MySQL and Tableau in the Web2 world, while CoinMarketCap and CoinGecko share the logic of traditional financial market data monitoring platforms, so their data processing leans Web2-native.
By contrast, Etherscan's indexing of on-chain transactions and the blocks that contain them (sending and receiving addresses, contract logs, transaction ordering, validator information, etc.), as well as the asset profit-and-loss calculations and fund-flow tracking of Nansen, EigenPhi, and 0xScope's Watchers, are all born from interaction with on-chain contracts. In the traditional financial world this information is either held in centralized silos or simply missing, so the data processing logic of these products leans Web3-native.
Overview of Typical On-Chain Data Analysis Tools
Data Analysis Development Tools
Dune
https://dune.com
【Core Functionality】 Built on a PostgreSQL database, Dune stores Ethereum's on-chain data in structured form in relational tables, allowing users to run custom analyses with SQL and build their own data visualization dashboards for other users to view.
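For illustration, the snippet below runs a Dune-style dashboard query (daily transfer volume) against an in-memory SQLite table; the table name and schema are invented for this example and do not match Dune's actual schema.

```python
import sqlite3

# Stand-in for a decoded on-chain table (illustrative schema, not Dune's).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE eth_transfers (block_date TEXT, value_eth REAL)")
db.executemany(
    "INSERT INTO eth_transfers VALUES (?, ?)",
    [("2023-03-01", 1.0), ("2023-03-01", 2.5), ("2023-03-02", 0.5)],
)

# A typical dashboard aggregate: daily transfer volume, newest day first.
query = """
SELECT block_date, SUM(value_eth) AS daily_volume
FROM eth_transfers
GROUP BY block_date
ORDER BY block_date DESC
"""
for day, volume in db.execute(query):
    print(day, volume)
```

On Dune itself, a query like this is saved, attached to a chart widget, and pinned to a public dashboard.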
【Business Model】 Subscription-based. As of June 21, 2022, Dune Analytics has accumulated query fees exceeding $17 million.
- Ordinary users: Free
- Member users: $420/month, allowing users to create 2000 high-performance queries at 4x speed, 100 private queries, 10 private dashboards, export 250 query results to CSV per month, and hide watermarks.
- Elite users: $1337/month, enhancing high-performance queries to 8x speed and increasing the number to 4000, in addition to allowing users to create 1000 private queries, 100 private dashboards, and export 1000 query results to CSV per month.
Footprint
https://www.footprint.network
【Core Functionality】 Like Dune, Footprint lets users build customized data analysis charts; it provides both raw and processed data together with charting tools so users can assemble their own data analysis panels.
Compared to Dune, Footprint is more beginner-friendly, as users do not need to use SQL queries but can choose to enter the "chart" interface. At this point, the interface resembles Excel, allowing users to filter, sort, calculate, and create pivot tables through the provided menu, quickly building their own data panels.
Footprint also places great emphasis on the developer community, allowing users to search for and view publicly generated charts and dashboards by other users, and copy other users' dashboards for their own modifications.
Footprint currently supports parsing 24 chains, making it the most widely covered data analysis tool for public chains. Its supported data includes 7,432 DeFi protocols, 4.35 million smart contracts, and 4.29 million NFTs.
Footprint categorizes data into three levels: bronze, silver, and gold, where bronze-level data consists of unprocessed raw on-chain transactions, transfers, activities, logs, etc.; silver-level data includes NFT, GameFi, and DeFi data across multiple chains, with extractions and labels for transactions, addresses, etc.; and gold-level data consists of aggregated business-level data, including user profiles, market capitalization, TVL, etc., which can be used directly.
【Business Model】 Subscription-based.
- Analysis Service
Free version: Includes a 1G data limit per query, 5 CSV uploads.
Business version: $299/month, paid features include access to complete historical data, 10G data limit per query, unlimited API uploads, and unlimited CSV uploads and downloads.
Team version: Custom pricing, including customized data, unlimited custom alerts, and unlimited custom dashboards.
- Data API
Free version: Limited to 3,000 calls/month, 100 calls/day, 1 call/second, with 30 days of historical data retention.
Growth version: $79/month, limited to 300,000 calls/month, 10,000 calls/day, 10 calls/second, with 6 months of historical data retention, able to return 10 rows of static data and 100 rows of non-static data.
Professional version: $360/month, limited to 10,000,000 calls/month, 10,000,000 calls/day, 50 calls/second, with all historical data retention, able to return 10 rows of static data and 100 rows of non-static data, able to call SQL API database interfaces, plus a complimentary $299/month business version analysis service.
Enterprise version: Custom pricing, including customizable throughput, custom APIs, and other personalized services.
【Insights】 Footprint has its own research report column, regularly publishing independent or collaborative research reports.
GeniiData
https://studio.geniidata.com/
【Core Functionality】 Similar to Dune, GeniiData is also a cross-chain data analysis platform based on SQL queries, providing cleaned and reliable data sources, allowing analysts to create charts and build visual dashboards by writing SQL queries.
Its advantage lies in supporting some chains that other tools do not, such as the popular new chain Aptos.
Currently, GeniiData also has its own developer community, where users can view dashboards created and shared by others.
【Business Model】 Subscription-based.
Free version: Does not allow uploading or exporting CSV data.
Premium version: Currently by invitation only, allows uploading 10 CSV data tables and unlimited CSV data exports.
Data Analysis Insight Tools
Nansen
https://pro.nansen.ai/
【Core Functionality】 Nansen is famous for its "wallet labels" feature: it tags and identifies wallet addresses and claims the largest wallet-label library currently available. From Nansen's main interface, users can query wallet address portfolios, macro data on public chains, stablecoins and DeFi, NFT market data, and more.
- Wallet labels: By categorizing wallets, originally anonymous on-chain addresses are tagged with multiple labels, such as smart contract, exchange, smart money, fund, heavy DEX trader, or NFT collector. Users can quickly see what types of wallets are executing transactions, which facilitates due diligence on contracts and projects and helps surface market opportunities from the activity of whale and fund wallets.
- Smart money: The most famous wallet label in Nansen is "smart money." Wallet addresses with the smart money label belong to elites in the crypto world, including whales, VCs, etc. Tracking the movements of smart money addresses can help users follow the real-time activities of whales and heavy DeFi players.
- Portfolio: Nansen users can log in to their website using wallet addresses, and once logged in, they can view all assets, transaction records, asset analysis, etc., under that wallet address.
- NFT: Nansen currently offers services such as NFT Paradise, NFT God Mode, NFT Wallet Profiler, NFT Item Profiler, NFT Leaderboards, and Smart NFT Trader dashboards. In the NFT Paradise page, users can browse macro NFT market dynamics, including real-time updates of floor prices from OpenSea, and can see NFT trends and popular NFT collections. By using the NFT profit leaderboard or NFT God Mode, users can identify NFT whales and find top holders of specific NFT collections, then use wallet address tracking to monitor their activities in real-time and discover opportunities.
- Token: Nansen's newly launched feature, similar to the NFT dashboard, provides macro issuance and trading data for tokens, with main dashboard data including Token Paradise and Token God Mode. In Token God Mode, users can see the distribution and balance of that token among CEX/DEX/smart money holders, as well as the profit and loss distribution of trades, assisting investors in decision-making.
- Smart alerts: Nansen allows users to subscribe to smart alerts, notifying them when trading activities occur at the addresses they subscribe to.
- Watchlist: Users can add wallet addresses they want to monitor to the watchlist to keep track of the movements of those addresses.
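Nansen's labeling methodology is proprietary; the sketch below only illustrates the general idea of rule-based wallet labels, with invented thresholds and field names.

```python
def label_wallet(stats: dict) -> list:
    """Assign coarse labels from on-chain activity stats (thresholds are illustrative)."""
    labels = []
    if stats.get("is_contract"):
        labels.append("smart contract")
    if stats.get("dex_trades_30d", 0) >= 100:
        labels.append("heavy DEX trader")
    if stats.get("nft_holdings", 0) >= 50:
        labels.append("NFT collector")
    if stats.get("balance_usd", 0) >= 10_000_000:
        labels.append("whale")
    return labels

# A hypothetical address that trades heavily on DEXs and holds a large balance.
print(label_wallet({"dex_trades_30d": 250, "balance_usd": 12_000_000}))
# ['heavy DEX trader', 'whale']
```

In practice, label systems combine such heuristics with curated lists (known exchange deposit addresses, VC wallets, etc.) and clustering, which is what makes the label library itself the product's moat.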
【Business Model】 Subscription-based.
- Basic version: $150/month, with access to Nansen's NFT, DeFi, Smart Money, Wallet, and Token Watchlist data and NFT research reports, 10 smart alerts, and Nansen's portfolio tracking service.
- VIP version: $1,500/month, including all features of the basic version, increasing the limit of smart alerts to 100, and enjoying advanced dashboard services, including filtering, CSV export, and other customization features.
- Alpha version: $3,000/month, providing more closed community and private investment consulting services on top of the VIP version.
- Enterprise version: Custom pricing, providing Nansen's API and other custom services based on the Alpha version.
【Insights】 Nansen has its own research column, regularly publishing data insight reports and investment research analyses.
Glassnode
https://studio.glassnode.com/
【Core Functionality】 Glassnode primarily provides data on BTC, ETH, and LTC chains, including various modeled key market data indicators, such as addresses, token distribution, holding entities, fees, derivatives rates and leverage, exchange funds, miners, market, profit and loss, supply, transaction numbers, etc., helping users comprehensively analyze the holding situation of the entire market and make trading decisions.
Although most of the data provided by Glassnode is officially modeled analysis data, user-defined features are also available in the dashboard and workbench modules, allowing users to define the indicators they need and generate charts on Glassnode Studio.
【Business Model】 Subscription-based.
- Standard version: Free, only allowing observation of first-level indicators, updated at a 24-hour level, limited to 30 API requests/min. Allows observation of complete on-chain historical data but not derivative historical data, and allows creating 1 alert.
- Advanced version: $29/month, allowing observation of second-level indicators, with first-level indicators updated at a 1-hour level and second-level indicators at a 24-hour level, limited to 120 API requests/min, allowing observation of 1 month of derivative historical data, and creating 10 alerts.
- Professional version: $799/month, allowing observation of third-level indicators, with all three levels of indicator data updated at a 10-minute level, limited to 600 API requests/min, allowing observation of complete derivative historical data, and creating 50 alerts.
- Institutional version: Custom pricing, allowing observation of third-level indicators; custom indicators + custom queries.
【Insights】 Glassnode has a research column on its official website, regularly publishing data interpretations and weekly insight reports.
EigenPhi
https://eigenphi.io/
【Core Functionality】 EigenPhi is a DeFi liquidity analysis tool focused on MEV trading and related analysis. It currently provides real-time MEV data, including monitoring of arbitrage, sandwich attacks, liquidations, and flash-loan behavior; trading data for the hottest tokens and liquidity pools; and identification of malicious tokens (tokens that may charge transfer or transaction fees without notifying users).
- MEV trading data: The most microscopic data level, including the type of MEV for that transaction (arbitrage/sandwich/liquidation, whether a flash loan was used), the EOA address of the MEV seeker, the contract address of the MEV bot, the block where the transaction is located, internal token fund flows in the transaction, gas fees, and profit and loss calculations.
- MEV contract/address data: The mesoscopic data level, including the types of MEV initiated by that address and corresponding profit and loss statistics, return on investment, frequently traded tokens, the proportion of miners accepting MEV trades, and a list of all MEV trades that address has participated in recently. If that address is a victim of a sandwich trade, it can also query the number of times it was attacked during the query period, the amount lost, and the corresponding sandwich trades and attackers.
- MEV market data: The macroscopic data level, reflecting the overall market's activity and profit and loss situation, including the profit and loss distribution and changes of different MEV types of trades, as well as the leaderboard of the most profitable MEV trades, and popular tokens and token pairs in MEV trades. In the main function page's upper right corner, the Reports menu also provides daily summaries and rankings for the above data.
- MEV real-time flow monitoring: Real-time MEV trading display flow can also help seekers quickly locate tokens and liquidity pools currently experiencing price fluctuations. If the trading tokens in the MEV real-time flow are all WETH/USDC, it is likely that there is a price difference between a certain WETH/USDC liquidity pool and other pools, thus creating numerous MEV arbitrage opportunities.
- EigenTx: EigenPhi's visualization tool for the internal fund flows of arbitrage, sandwich, and similar trades, available at: https://eigenphi.io/mev/eigentx
Entering a transaction hash produces a flow chart of every token transfer in that transaction, helping users identify the MEV transaction's strategy. For example, in the transaction below, the seeker found a WOOF/WETH price difference between two liquidity pools: it first sent 0.01 ETH to its MEV bot contract, swapped ETH for 29.76 WOOF in a Shibaswap liquidity pool (SSLP), then swapped the 29.76 WOOF for 9.959 WETH in a Uniswap V2 pool. The transaction paid the miner 0.9954 ETH as a tip and transferred the remaining funds to two external addresses.
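The profit-and-loss accounting behind such an arbitrage reduces to output minus input minus tips and gas; the sketch below uses hypothetical round numbers rather than the figures from the WOOF/WETH example.

```python
from decimal import Decimal as D

def arbitrage_pnl(eth_in: D, weth_out: D, miner_tip: D, gas_fee: D) -> D:
    """Net profit of a two-pool arbitrage, all amounts expressed in ETH terms."""
    return weth_out - eth_in - miner_tip - gas_fee

# Hypothetical round trip: buy cheap in pool A, sell dear in pool B.
print(arbitrage_pnl(D("1.0"), D("1.25"), D("0.15"), D("0.02")))  # 0.08
```

Platforms like EigenPhi perform this accounting per transaction by netting the token flows in and out of the seeker's addresses, repriced to a common denomination.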
【Business Model】 Free for the public. Provides data APIs and customized research collaboration services for institutions.
【Insights】 EigenPhi has a research column on its official website, providing data analysis research reports on MEV arbitrage, liquidity, and security events in the DeFi field.
0xScope
https://0xscope.com/
【Core Functionality】 Similar to Nansen, the 0xScope protocol also focuses on address identification, but emphasizes the association analysis of addresses. According to 0xScope's official statement, the protocol aims to identify the actual controlling entities behind multiple addresses through address clustering algorithms and view the transactions between associated addresses. Currently, 0xScope provides two data services: API and data analysis tool Watchers.
API: Developers can call 0xScope's API to query wallets, tokens, contracts, and other addresses. API documentation: https://0xscope.readme.io/reference/chains
Watchers: A tool 0xScope built on top of its protocol, available at: https://www.watchers.pro/
Similar to Nansen, Watchers also provides address labels, recent trading activities of addresses, etc. The difference is that Watchers can also view associated addresses and transactions, query the fund flows between addresses, and conduct anti-money laundering (AML) risk identification for associated transactions.
Additionally, on Watchers users can inspect specific wallets, tokens, smart contracts, and projects. For project evaluation, the address clustering algorithm can identify the unique entities behind multiple addresses, making it easy to obtain "dewatered" real-user interaction data with wash-trading activity stripped out, which is very useful for due diligence. For AAVE, for example, one can query the number of currently active entities alongside the token price. If a newly launched project has 100,000 new addresses but only 10,000 corresponding entities, many of those new addresses are likely engaged in wash trading.
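0xScope's clustering algorithm is not public; the minimal union-find sketch below only illustrates the general mechanism of merging addresses that share association evidence (the links are hypothetical).

```python
def cluster_addresses(links: list) -> list:
    """Union-find over (addr_a, addr_b) association evidence; returns entity clusters."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving keeps trees shallow
            a = parent[a]
        return a

    for a, b in links:
        parent[find(a)] = find(b)  # merge the two components

    clusters = {}
    for addr in list(parent):
        clusters.setdefault(find(addr), set()).add(addr)
    return list(clusters.values())

# Hypothetical evidence links: shared funding source, repeated transfers, etc.
links = [("0xA", "0xB"), ("0xB", "0xC"), ("0xD", "0xE")]
print(cluster_addresses(links))  # two entities behind five addresses
```

With such clusters, "100,000 addresses but 10,000 entities" falls out directly: count clusters instead of addresses.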
【Business Model】 Subscription-based.
- API
Free version: 100,000 points, limited to 5 calls per second; can access only public endpoints.
Advanced version: $180/month, 3 million points, limited to 5 calls per second, can access API endpoints.
Enterprise version: Custom pricing, customizable API call volume, access to all endpoints, and custom endpoints.
- Watchers
Free version: Monitoring list can track up to 5 entities; can only select addresses with a fixed certainty of 70%; can only view up to the highest three levels of related addresses for address association analysis; can only set 1 alert, with 50 alert pushes per hour.
Pro users: $28/month, paid features include unlimited entity monitoring lists and addresses with customizable certainty; can view all levels of related addresses for queried address clustering; can set 10 alerts, with 200 alert pushes per hour.
【Insights】 Watchers has released some data dashboards targeting specific events and entities within the tool, including tracking dashboards for the FTX & Alameda Research events and the positions of major CEXs.
Conclusion
The high added value of on-chain data analysis means its users are highly differentiated, so the business model is mostly subscription-based with steep tiered pricing. In terms of expansion paths, current data products have diverged along two routes. One focuses on building developer and user communities, primarily targeting developers and technical users and encouraging them to publish their own data insights as community contributions, creating a positive feedback loop for the product; the leader Dune, as well as newer entrants such as Footprint, 0xScope, and EigenPhi, put relatively more emphasis on community building. The other relies on powerful in-house analytical capabilities to provide exclusive, customized data insight services for professional institutional investors, as Nansen and Glassnode do. In my view, once a mature developer community is established, community-driven data tools can also produce relatively high-quality insights through product iteration and community feedback, as the proprietary research reports already released by Footprint, EigenPhi, and 0xScope demonstrate. I believe a combination of the two models will be mainstream in the future.
The public transparency of on-chain data means on-chain data service providers can no longer profit by selling real-time financial data and information the way traditional financial data giants such as Bloomberg and Reuters do; they must instead extract incremental value from the mass of on-chain information. The data processing logic of today's leading on-chain analysis tools is mostly in transition from Web2 to Web3. Future data analysis services for DeFi will certainly be rooted in Web3's native data characteristics, and in particular the ability to derive insights from smart contract code logic will become the most important moat for data analysis platforms in the Web3 world.
From the user's perspective, I firmly believe that compared to narrative-driven analytical logic, returning to data-driven analytical logic can better find rational threads in the warming market sentiment. As one of the infrastructures for the development of the blockchain industry, the market value of the data analysis track will only continue to rise with the overall development of the industry, and the future prospects are broad.
Reference
[1] IOSG Weekly Brief |Current Status and Outlook of On-Chain Data Analysis Platforms #132 https://mp.weixin.qq.com/s/o1pO7unj3cUS9sWt4q_gBw
[2] What Are We Talking About When We Talk About Web3 Data?|ZONFF Research https://mp.weixin.qq.com/s/TnVIj93CYchY0wnaazLRTg
[3] The Architecture of a Web 3.0 application
https://www.preethikasireddy.com/post/the-architecture-of-a-web-3-0-application
[4] Becoming an On-Chain Data Analyst
https://sixdegreelab.gitbook.io/mastering-chain-analytics/00_introductions
[5] Seven New Generation Web3 Data Tools
https://mp.weixin.qq.com/s/CvMey3rPKRgdukYUCXw_4Q
[6] Understanding the Unicorns, Disruptors, and Future Stars in the Web3 Data Track | SevenX Ventures
https://www.panewslab.com/zh/articledetails/7f6b20861yji.html
[7] Review of On-Chain Data Analysis Tools | IOBC Capital
https://mp.weixin.qq.com/s/Oj2jl0WXoGOz_2DX7G5-lw
[8] Crypto Data Stack Landscape
https://twitter.com/zk7hao/status/1412076712444108805
[9] Data Track Projects and Simple Data Analysis | UZ Capital
https://mp.weixin.qq.com/s/PfqYmayVXbLS0XGSPVjw
[10] Research Report on Data Analysis Tool Dune Analytics
https://foresightnews.pro/article/detail/8869
[11] A&T Family: Why We Invest in Footprint Analytics?
https://mp.weixin.qq.com/s/fAsBuNnKy4tc3c-X0lmJFg
0xScope Protocol: Web3.0 Tianyancha (Watchers)
https://mp.weixin.qq.com/s/ReqafssBIMpKFIV1ARVRZA
[12] Nansen Wallet Labels & Emojis: What Do They Mean?
https://www.nansen.ai/guides/wallet-labels-emojis-what-do-they-mean
[13] Dune Documentation
https://dune.com/docs/
[14] EigenPhi user guide
https://eigenphi-1.gitbook.io/arbitrage-scan-user-guide/
https://docs.google.com/spreadsheets/d/1OZR9LITGc2RMVh65fO_N6TJlD2CYG3VdndTcE7Qs4qY/edit
[15] On-chain Data Analysis --- A Crucial Tool to Evaluate the Market and Make Reasonable Investment Decisions
https://medium.com/polkafoundry/on-chain-data-analysis-a-crucial-tool-to-evaluate-the-market-and-make-reasonable-investment-4d17cfcbc3eb
[16] DeFi Statistics [updated in 2023] by Nansen
https://www.nansen.ai/guides/defi-statistics-in-2022
[17] Web 3.0 Blockchain Market Size, Share & Trends Analysis Report
https://www.grandviewresearch.com/industry-analysis/web-3-0-blockchain-market-report
[18] Worldwide Big Data Business Analytics Revenue
https://www.statista.com/statistics/551501/worldwide-big-data-business-analytics-revenue/