The Next Wave of Narrative Evolution in the Crypto AI Sector: Catalysts, Development Paths, and Related Assets
Author: Alex Xu, Mint Ventures
Introduction
As of now, this cryptocurrency bull market cycle has been the most lackluster in terms of commercial innovation, lacking the phenomenal hot tracks of the previous bull market such as DeFi, NFT, and GameFi. As a result, the overall market lacks industry hotspots, and growth in users, industry investment, and developers has been sluggish.
This is also reflected in current asset prices. Throughout this cycle, most altcoins, including ETH, have continuously lost value against BTC. After all, the valuation of smart contract platforms is determined by the prosperity of their applications; when application development and innovation are lackluster, it is difficult for public chains to raise their valuations.
AI, as a relatively new category in the cryptocurrency business, benefits from the explosive growth and continuous stream of hotspots in the broader AI industry, and may still bring considerable attention to AI projects in the cryptocurrency space.
In the IO.NET report published by the author in April, the necessity of combining AI with crypto was outlined: the advantages of crypto-economic solutions in determinism, resource allocation, and trustlessness may address three of AI's challenges, namely randomness, resource intensity, and the difficulty of distinguishing humans from machines.
In the AI track of the crypto economy, the author attempts to discuss and reason through several important issues in this article, including:
- What emerging narratives exist in the cryptocurrency AI track that may explode in the future
- The catalytic paths and logic of these narratives
- Project targets related to the narratives
- The risks and uncertainties of narrative deductions
This article reflects the author's thinking as of publication and may change in the future. The views expressed are highly subjective and may contain errors of fact, data, or reasoning. Please do not treat this as investment advice; constructive criticism and discussion from peers are welcome.
The following is the main text.
The Next Wave of Narratives in the Cryptocurrency AI Track
Before formally reviewing the next wave of narratives in the cryptocurrency AI track, let’s first look at the current main narratives in cryptocurrency AI. Based on market capitalization, those exceeding $1 billion include:
- Computing Power: Render (RNDR, circulating market cap $3.85 billion), Akash (AKT, circulating market cap $1.2 billion), IO.NET (recently valued at $1 billion in its latest round of primary financing)
- Algorithm Networks: Bittensor (TAO, circulating market cap $2.97 billion)
- AI Agents: Fetch.ai (FET, pre-merger circulating market cap $2.1 billion)
Data as of May 24, 2024; figures in USD.
Aside from the above fields, which AI sub-track will produce the next individual project with a market cap exceeding $1 billion?
The author believes we can speculate from two perspectives: the narrative from the "supply side of the industry" and the narrative of the "GPT moment."
The First Perspective of AI Narratives: Opportunities in Energy and Data Behind AI from the Supply Side of the Industry
From the supply side of the industry, the four driving forces for AI development are:
- Algorithms: High-quality algorithms can execute training and inference tasks more efficiently.
- Computing Power: Both model training and inference require GPU hardware to provide computing power. This is currently the industry's main bottleneck, as the chip shortage has driven up prices for mid- and high-end chips.
- Energy: The data centers AI requires consume significant energy. Beyond the electricity GPUs use for computing tasks, substantial energy goes into heat dissipation; the cooling system of a large data center accounts for about 40% of its total energy consumption.
- Data: The improvement in large model performance requires expanding training parameters, which means a massive demand for high-quality data.
Of these four driving forces, the algorithm and computing power tracks each already have cryptocurrency projects with circulating market caps exceeding $1 billion, while the energy and data tracks have yet to see projects of comparable size.
In reality, shortages in the supply of energy and data may soon emerge, becoming the next wave of industry hotspots and thereby driving a surge in related cryptocurrency projects.
Let’s first discuss energy.
On February 29, 2024, Musk mentioned at the Bosch Connected World 2024 conference: "I predicted the chip shortage over a year ago; the next shortage will be electricity. I believe that next year there will not be enough electricity to run all the chips."
Looking at concrete data: the Stanford Institute for Human-Centered Artificial Intelligence (HAI), co-led by Fei-Fei Li, publishes an AI Index Report annually. In its 2022 report, assessing the AI industry in 2021, the research group estimated that AI then accounted for only 0.9% of global electricity demand, putting limited pressure on energy and the environment. In 2023, however, the International Energy Agency (IEA) estimated that global data centers consumed about 460 terawatt-hours (TWh) of electricity in 2022, or 2% of global electricity demand, and predicted that by 2026 global data center energy consumption would reach at least 620 TWh and could rise to 1,050 TWh.
In fact, the IEA's estimates may still be conservative, as numerous AI-related projects are about to launch, and their corresponding energy demand will far exceed the IEA's 2023 projections.
For example, Microsoft and OpenAI are planning the Stargate project, expected to start in 2028 and be completed around 2030. The project aims to build a supercomputer with millions of dedicated AI chips, providing OpenAI with unprecedented computing power to support its research in artificial intelligence, especially in large language models. The plan is expected to cost over $100 billion, 100 times the cost of today's large data centers.
The energy consumption of the Stargate project alone is projected to be as high as 50 TWh.
Because of this, OpenAI's founder Sam Altman mentioned at the Davos Forum in January this year: "The future of artificial intelligence requires breakthroughs in energy, as the electricity consumed by AI will far exceed people's expectations."
After computing power and energy, the next likely area of shortage in the rapidly growing AI industry is data.
In fact, the shortage of high-quality data required by AI has already begun to materialize.
Through the evolution of GPT, humanity has largely grasped the growth pattern of large language model capabilities: by expanding model parameters and training data, model capabilities can be dramatically enhanced, and this process shows no short-term technical bottleneck.
However, the problem is that high-quality and publicly available data may become increasingly scarce in the future, and AI products may face a supply-demand contradiction similar to that of chips and energy.
First, there is an increase in disputes over data ownership.
On December 27, 2023, The New York Times sued OpenAI and Microsoft in U.S. federal district court, accusing them of using millions of its articles without permission to train GPT models. The suit demands that they bear "billions of dollars in statutory and actual damages" for the illegal copying and use of uniquely valuable works, and destroy all models and training data containing The New York Times' copyrighted material.
At the end of March, The New York Times published a new report, targeting not only OpenAI but also Google and Meta. It reported that OpenAI had transcribed a large number of YouTube videos with a speech-recognition tool called Whisper and used the resulting text to train GPT-4. The report argued that it has become common for large companies to use gray-area methods to train AI models, and that Google does the same, converting YouTube video content into text to train its own large models, essentially infringing on the rights of video content creators.
The lawsuit between The New York Times and OpenAI, the "first AI copyright case," may not conclude quickly, given the complexity of the case and its profound implications for content and the future of the AI industry. One possible outcome is an out-of-court settlement, with the financially strong Microsoft and OpenAI paying substantial compensation. Either way, more data copyright disputes will inevitably raise the overall cost of high-quality data.
Additionally, Google, the operator of the world's largest search engine, has revealed that it is considering charging for its search capability, with the charges aimed not at the general public but at AI companies.
Source: Reuters
Google's search engine servers store a vast amount of content, arguably everything that has appeared on the internet in the 21st century. AI-driven search products, such as Perplexity overseas and Kimi and Mita (Metaso) in China, process the data obtained through such searches with AI and output it to users. Charging AI companies for search-engine access would inevitably raise the cost of data acquisition.
In fact, beyond publicly available data, AI giants are also eyeing non-public internal data.
Photobucket is an established image and video hosting site that had 70 million users and nearly half of the U.S. online photo market in the early 2000s. With the rise of social media, Photobucket's user base declined significantly, to around 2 million active users today (each paying a steep $399 per year). Under the agreement and privacy policy users sign at registration, accounts unused for more than a year are reclaimed, and Photobucket retains the right to use the images and videos users upload. Photobucket CEO Ted Leonard revealed that its collection of 1.3 billion photos and videos is highly valuable for training generative AI models. He is negotiating with several tech companies to sell this data, at prices ranging from $0.05 to $1 per photo and over $1 per video, and estimates the data Photobucket can provide is worth over $1 billion.
Epoch, a research team focused on AI development trends, published a report titled "Will we run out of data? An analysis of the limits of scaling datasets in Machine Learning." Based on machine learning's data usage and the rate of new data generation as of 2022, and accounting for the growth of computational resources, the report concluded that high-quality text data would likely be exhausted between February 2023 and 2026, and image data between 2030 and 2060. If data-utilization efficiency does not improve significantly or new data sources do not emerge, the trend of large machine learning models relying on massive datasets may slow down.
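To make the logic of such projections concrete, here is a toy Python sketch of the "fixed stock vs. growing consumption" reasoning. The token counts and growth rate below are invented for illustration only; they are not Epoch's actual figures or methodology.

```python
# Illustrative sketch of the reasoning behind "data exhaustion" projections.
# The stock and growth figures are assumptions for demonstration only;
# they are NOT Epoch's actual estimates or methodology.

def exhaustion_year(stock_tokens: float, used_tokens: float,
                    annual_growth: float, start_year: int) -> int:
    """Return the year cumulative training-data usage exceeds the stock."""
    year, cumulative = start_year, used_tokens
    while cumulative < stock_tokens:
        used_tokens *= 1 + annual_growth  # demand grows each year
        cumulative += used_tokens
        year += 1
    return year

# Assumed: ~9e12 high-quality text tokens in stock, 5e11 consumed in 2022,
# demand growing ~50% per year -> exhaustion within a handful of years.
print(exhaustion_year(9e12, 5e11, 0.5, 2022))
```

The point of the exercise: once consumption compounds while the stock is roughly fixed, the exhaustion date is highly sensitive to the growth rate, which is why efficiency gains or new data sources can materially delay it.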
Given that AI giants are now actively buying data at high prices, free high-quality text data does appear largely exhausted, and Epoch's prediction from two years ago looks quite accurate.
At the same time, solutions to the AI data shortage are also emerging, namely AI data-provision services.
Defined.ai is a company that provides customized high-quality data for AI companies.
Examples of data types provided by Defined.ai: https://www.defined.ai/datasets
Its business model works as follows: AI companies give Defined.ai their data requirements. For images, for example, they may specify quality criteria such as minimum resolution, no blur or overexposure, and authentic content. On the content side, AI companies can customize themes to their training tasks, such as nighttime photos of traffic cones, parking lots, and road signs to improve AI recognition in low-light conditions. The public can take on these tasks, upload photos for company review, and get paid per accepted submission, at roughly $1-2 for a high-quality image, $5-7 for a short video of a few seconds, and $100-300 for a high-quality video over 10 minutes. Contributors who take on subcontracted tasks receive about 20% of the fee. Data provision may become another crowdsourced business after "data labeling"; a rough payout sketch follows.
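Using the per-item price ranges quoted above, here is a minimal Python sketch of what a contributor's payout might look like. The function names and the interpretation of the 20% subcontractor share are illustrative assumptions, not Defined.ai's actual fee schedule.

```python
# A minimal sketch of the crowdsourced pricing described above, using the
# per-item price ranges quoted in the text. The 20% subcontractor share
# interpretation is an illustrative assumption.

PRICES = {                     # (low, high) in USD, from the text
    "image": (1, 2),           # high-quality photo
    "short_video": (5, 7),     # a few seconds long
    "long_video": (100, 300),  # high quality, over 10 minutes
}

def contributor_payout(item_type: str, accepted_count: int,
                       subtask_share: float = 0.20) -> tuple[float, float]:
    """Estimated payout range for a subcontracted contributor."""
    low, high = PRICES[item_type]
    return (low * accepted_count * subtask_share,
            high * accepted_count * subtask_share)

# e.g. 100 accepted nighttime photos: roughly $20 to $40 for the contributor
print(contributor_payout("image", 100))
```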
Globally distributed tasks, economic incentives, the pricing and circulation of data assets, privacy protection, and open participation: this sounds very much like a business category suited to the Web3 paradigm.
AI Narrative Targets from the Supply Side Perspective
The attention sparked by the chip shortage has spilled into the crypto industry, making distributed computing power the hottest and highest-market-cap AI category to date.
So, if the supply-demand contradiction in energy and data in the AI industry erupts in the next 1-2 years, what narrative-related projects currently exist in the cryptocurrency industry?
Let’s first look at energy-related targets.
Very few energy-related projects have listed on leading centralized exchanges (CEXs); Power Ledger (token: POWR) is essentially the only one.
Power Ledger, founded in 2017, is a comprehensive blockchain-based energy platform that aims to decentralize energy trading, enable direct electricity trading between individuals and communities, support the broad adoption of renewable energy, and ensure transaction transparency and efficiency through smart contracts. Power Ledger initially ran on a consortium chain adapted from Ethereum; in the second half of 2023 it updated its white paper and launched its own comprehensive public chain, adapted from the Solana technology stack to handle high-frequency microtransactions in the distributed energy market. Currently, Power Ledger's main businesses include the following (a simplified sketch of the peer-to-peer trading flow follows the list):
- Energy Trading: Allowing users to buy and sell electricity directly on a peer-to-peer basis, especially electricity from renewable sources.
- Environmental Product Trading: Trading carbon credits and renewable energy certificates, as well as financing based on environmental products.
- Public Chain Operation: Attracting application developers to build applications on the Power Ledger blockchain, with transaction fees paid in POWR tokens.
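To make the energy-trading idea concrete, here is a minimal Python sketch of peer-to-peer order matching. It illustrates the general mechanism only and is not Power Ledger's actual protocol; all names, prices, and settlement details are assumptions.

```python
# A simplified sketch of peer-to-peer electricity matching as described above.
# This illustrates the general mechanism, NOT Power Ledger's actual protocol;
# names, prices, and settlement logic are assumptions.

from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    kwh: float
    price_per_kwh: float  # denominated in a settlement token, e.g. POWR

def match_order(offers: list[Offer],
                kwh_needed: float) -> list[tuple[str, float, float]]:
    """Fill a buy order from the cheapest offers first; return the fills."""
    fills = []
    for offer in sorted(offers, key=lambda o: o.price_per_kwh):
        if kwh_needed <= 0:
            break
        take = min(offer.kwh, kwh_needed)
        fills.append((offer.seller, take, take * offer.price_per_kwh))
        kwh_needed -= take
    return fills

offers = [Offer("rooftop_solar_A", 5.0, 0.12), Offer("wind_coop_B", 20.0, 0.10)]
print(match_order(offers, 12.0))  # buys 12 kWh, cheapest source first
```

On a real network, each fill would be settled by a smart contract rather than returned from a function, but the matching logic is the core of the marketplace.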
Currently, the circulating market cap of the Power Ledger project is $170 million, with a total market cap of $320 million.
Compared to energy-related cryptocurrency targets, the number of data track cryptocurrency targets is more abundant.
The author lists the data track projects that he is currently following, which have been launched on at least one of Binance, OKX, or Coinbase, arranged by FDV from low to high:
1. Streamr -- DATA
Streamr's value proposition is to build a decentralized real-time data network that allows users to freely trade and share data while maintaining complete control over their own data. Through its data marketplace, Streamr aims to enable data producers to sell data streams directly to interested consumers without intermediaries, thereby reducing costs and increasing efficiency.
Source: https://streamr.network/hub/projects
In one practical collaboration, Streamr partnered with DIMO, a Web3 vehicle-hardware project, to collect temperature, air pressure, and other data through DIMO hardware sensors installed in vehicles, forming a weather data stream transmitted to institutions that need it.
Compared with other data projects, Streamr focuses more on Internet of Things and hardware-sensor data. Beyond the DIMO vehicle data above, its projects include a real-time traffic data stream for Helsinki. For this reason, Streamr's token DATA doubled in a single day last December, when the DePIN concept was at its peak. A sketch of the underlying publish/subscribe pattern follows.
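Conceptually, such a network exposes a publish/subscribe flow: sensors publish readings into a stream, and paying consumers subscribe to it. Streamr's real client is a JavaScript SDK; the Python sketch below only mimics the general pattern, and every name in it is hypothetical.

```python
# Illustration of the publish/subscribe pattern behind sensor data streams.
# Streamr's real client is a JavaScript SDK; this Python sketch only mimics
# the general flow, and every name here is hypothetical.

import json
import time

class DataStream:
    def __init__(self, stream_id: str):
        self.stream_id = stream_id
        self.subscribers = []

    def subscribe(self, callback):
        """Register a consumer; on a real network this would require payment."""
        self.subscribers.append(callback)

    def publish(self, payload: dict):
        """Push a timestamped message to every subscriber."""
        message = json.dumps({"stream": self.stream_id,
                              "ts": time.time(), "data": payload})
        for cb in self.subscribers:
            cb(message)

# e.g. a vehicle sensor (cf. the DIMO integration) publishing weather readings
weather = DataStream("vehicle/weather")
weather.subscribe(print)  # a buyer consuming the stream in real time
weather.publish({"temp_c": 21.4, "pressure_hpa": 1013.2})
```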
Currently, the circulating market cap of the Streamr project is $44 million, with a total market cap of $58 million.
2. Covalent -- CQT
Unlike other data projects, Covalent provides blockchain data. The Covalent network reads data from blockchain nodes via RPC, processes and organizes it, and builds an efficient query database, allowing Covalent's users to quickly retrieve the information they need without running complex queries directly against blockchain nodes. This service is known as "blockchain data indexing"; a sketch of what a query looks like from the client side follows.
Covalent's clients are primarily businesses, including Dapp projects such as various DeFi applications as well as many centralized crypto companies, such as ConsenSys (the parent company of MetaMask), CoinGecko (the well-known crypto asset tracking site), Rotki (a tax tool), and Rainbow (a crypto wallet). Traditional financial giants like Fidelity and Big Four accounting firms such as Ernst & Young are also Covalent clients. According to data disclosed by Covalent, the project's revenue from data services has already surpassed that of The Graph, the leading project in the same field.
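As an illustration of what indexing buys the client, the sketch below fetches token balances through a single REST call instead of many raw RPC requests. The endpoint shape follows Covalent's publicly documented balances_v2 API at the time of writing, but treat the exact path, parameters, and response fields as an assumption to verify against the current docs.

```python
# A minimal sketch of querying indexed chain data instead of raw RPC nodes.
# The endpoint shape follows Covalent's public "balances_v2" REST API as
# documented at the time of writing; treat the exact path and parameters as
# an assumption and verify against current docs before use.

import requests

API_KEY = "ckey_..."  # placeholder; obtain a key from covalenthq.com
BASE = "https://api.covalenthq.com/v1"

def token_balances(chain_id: str, address: str) -> list[dict]:
    """One HTTP call replaces many raw RPC queries against a node."""
    url = f"{BASE}/{chain_id}/address/{address}/balances_v2/"
    resp = requests.get(url, params={"key": API_KEY}, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]["items"]

# e.g. all token balances of an Ethereum mainnet address in one request
for item in token_balances("1", "0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045"):
    print(item["contract_ticker_symbol"], item["balance"])
```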
Thanks to the completeness, openness, authenticity, and real-time nature of on-chain data, the Web3 industry is expected to become a high-quality data source for specific AI scenarios and "small AI models." Covalent, as a data provider, has begun supplying data for various AI scenarios and has launched verifiable structured data tailored for AI.
Source: https://www.covalenthq.com/solutions/decentralized-ai/
For example, Covalent provides data for the on-chain smart trading platform SmartWhales, which uses AI to identify profitable trading patterns and addresses; Entendre Finance utilizes Covalent's structured data, processed by AI, for real-time insights, anomaly detection, and predictive analytics.
Currently, Covalent's on-chain data services are primarily focused on finance, but as Web3 products and data types diversify, the use cases for on-chain data will further expand.
Currently, the Covalent project has a circulating market cap of $150 million and a total market cap of $235 million, a notable valuation discount relative to The Graph, the blockchain data indexing project in the same track.
3. Hivemapper -- HONEY
Among all data types, video data often commands the highest unit price. Hivemapper can provide AI companies with data combining video and map information. Hivemapper itself is a decentralized global mapping project that aims to create a detailed, dynamic, and accessible mapping system through blockchain technology and community contributions. Participants capture map data with dashcams and add it to the open-source Hivemapper data network, earning the project token HONEY as a reward for their contributions. To strengthen network effects and reduce interaction costs, Hivemapper is built on Solana.
Hivemapper was founded in 2015 with the original vision of creating maps using drones, but it later found this model hard to scale and shifted to dashcams and smartphones to capture geographic data, significantly reducing the cost of building global maps.
Compared to street view and mapping software like Google Maps, Hivemapper can more efficiently expand map coverage, maintain the freshness of real-world maps, and improve video quality through its incentivized network and crowdsourcing model.
Before AI's demand for data exploded, Hivemapper's main clients included autonomous-driving departments in the automotive industry, navigation service companies, governments, and insurance and real estate companies. Now Hivemapper can also provide extensive road and environmental data to AI and large models through APIs, enabling AI and ML models to better convert the data into improved capabilities on tasks involving geographic location and visual judgment (see the hypothetical API sketch below).
Data source: https://hivemapper.com/blog/diversify-ai-computer-vision-models-with-global-road-imagery-map-data/
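As a rough illustration of how an ML pipeline might consume such data, the sketch below requests imagery metadata for a geographic bounding box. The endpoint, parameters, and response fields here are hypothetical placeholders, not Hivemapper's actual API; consult its real API documentation.

```python
# How an ML pipeline might consume crowdsourced road imagery via an API.
# The endpoint, parameters, and response fields below are HYPOTHETICAL
# placeholders for illustration; consult Hivemapper's actual API docs.

import requests

def fetch_road_imagery(api_url: str, api_key: str,
                       bbox: tuple[float, float, float, float]) -> list[dict]:
    """Request recent imagery metadata for a lon/lat bounding box."""
    resp = requests.get(
        api_url,  # placeholder URL, not a real documented path
        headers={"Authorization": f"Bearer {api_key}"},
        params={"bbox": ",".join(map(str, bbox)), "limit": 100},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("frames", [])

# Downstream, each frame's image URL and GPS pose would feed a computer-vision
# training set (e.g. sign detection across varied weather and lighting).
```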
Currently, Hivemapper has a circulating market cap of $120 million and a total market cap of $496 million.
Beyond the three projects above, the data track also includes The Graph -- GRT (circulating market cap $3.2 billion, FDV $3.7 billion), which offers blockchain data indexing services similar to Covalent's, and Ocean Protocol -- OCEAN (circulating market cap $670 million, FDV $1.45 billion; the project is about to merge with Fetch.ai and SingularityNET, with tokens converting to ASI), an open-source protocol that aims to facilitate the exchange and monetization of data and data-related services by connecting data consumers with data providers while ensuring trust, transparency, and traceability.
The Second Perspective of AI Narratives: The Reappearance of the GPT Moment and the Arrival of Artificial General Intelligence
In the author's view, the inaugural year of crypto's "AI track" was 2023, the year GPT shocked the world; the eruption of crypto AI projects was largely heat spilling over from the explosive development of the AI industry itself.
Although capabilities have continued to improve since GPT-3.5 (GPT-4 and GPT-4 Turbo, Sora's astonishing video generation, and the rapid progress of large language models outside OpenAI), it is undeniable that the cognitive shock AI's technical advances deliver to the public is fading. People are gradually getting used to AI tools, and large-scale job replacement has yet to occur.
So, will there be another "GPT moment" in the future, showcasing a leap in AI development that shocks the public and makes them realize that their lives and work will be changed as a result?
That moment may be the arrival of Artificial General Intelligence (AGI).
AGI refers to a machine with comprehensive cognitive abilities comparable to humans, able to solve a wide variety of complex problems rather than only specific tasks. AGI systems would exhibit a high degree of abstract thinking, broad background knowledge, general cross-domain reasoning, an understanding of causality, and the ability to transfer learning across disciplines. An AGI's performance would match the best humans across fields, and in overall capability it would entirely surpass the best collective human performance.
In fact, whether in science fiction, games, and films or in public expectations following GPT's rapid spread, society has long anticipated the emergence of AGI surpassing human cognition. In a sense, GPT itself is a precursor of AGI, a prophecy of general artificial intelligence.
The immense industrial energy and psychological impact of GPT stem from its speed of implementation and performance exceeding public expectations: People did not expect that an artificial intelligence system capable of passing the Turing test would actually arrive, and so quickly.
In reality, AGI may replicate the suddenness of the "GPT moment" within 1-2 years: just as people are adapting to GPT's assistance, they will discover that AI is no longer just an assistant but can independently complete extremely creative and challenging tasks, including problems that have stumped top human scientists for decades.
On April 8 of this year, Musk was interviewed by Nicolai Tangen, Chief Investment Officer of the Norwegian Sovereign Wealth Fund, and discussed the timing of AGI's emergence.
He said, "If we define AGI as being smarter than the smartest part of humanity, I think it is likely to appear in 2025."
According to his estimation, it will take at most another year and a half for AGI to arrive, provided that "energy and hardware keep up."
The benefits of AGI's arrival are evident.
It means that human productivity levels will leap forward, and numerous scientific challenges that have trapped us for decades will be resolved. If we define "the smartest part of humanity" as the level of Nobel Prize winners, it implies that as long as energy, computing power, and data are sufficient, we could have countless tireless "Nobel laureates" working around the clock to tackle the most challenging scientific problems.
In reality, Nobel laureates are not once-in-a-hundred-million rarities; in capability and intelligence, most are at the level of top university professors, but they happened to choose the right direction and persist until results came. Equally capable colleagues, in a parallel universe, might have won Nobel Prizes instead. The problem is that the number of people with top-professor-level ability participating in scientific breakthroughs is still too small, so the pace of "traversing all the correct scientific directions" is slow.
With AGI and an ample supply of energy and computing power, we could deploy an unlimited number of "Nobel laureate"-level AGIs to explore every potentially fruitful scientific direction in depth, accelerating the pace of technological progress by dozens of times. That progress would make resources we now consider expensive and scarce, such as food production, new materials, new drugs, and high-quality education, hundreds of times more abundant over the next 10 to 20 years, with the cost of obtaining them falling correspondingly, allowing us to support a larger population with fewer resources while rapidly increasing per capita wealth.
Global GDP total trend chart, data source: World Bank
This may sound a bit sensational, so let’s look at two examples that the author has also used in previous research reports on IO.NET:
In 2018, Nobel laureate Frances Arnold said at the award ceremony: "Today, we can read, write, and edit any DNA sequence in practical applications, but we still cannot compose it." Just five years later, in 2023, researchers from Stanford University and Salesforce Research in Silicon Valley published a paper in Nature Biotechnology describing how they used a GPT-style large language model to generate a million new protein sequences from scratch, among which they found two structurally distinct proteins that both possess antibacterial properties and may become alternatives to antibiotics. In other words, with AI's help, the bottleneck of protein "creation" has been broken.
Earlier, the AI algorithm AlphaFold predicted the structures of virtually all 214 million known proteins on Earth within 18 months, a result hundreds of times the cumulative output of all human structural biologists to date.
The transformation has already occurred, and the arrival of AGI will further accelerate this process.
On the other hand, the challenges brought by the arrival of AGI are also immense.
AGI will not only replace large numbers of knowledge workers; it will also affect physical-service workers currently considered "less exposed to AI." As robotics matures and new materials lower production costs, the share of jobs replaced by machines and software will rise rapidly.
At that time, two issues that once seemed very distant will quickly come to the forefront:
- The employment and income issues of a large number of unemployed individuals.
- In a world where AI is ubiquitous, how to distinguish between AI and humans.
Worldcoin (Worldchain) attempts to provide a solution to both: a Universal Basic Income (UBI) system to provide basic income to the public, and iris-based biometrics to distinguish humans from AI.
In fact, giving money to everyone through UBI is not a pipe dream: Finland and England have run UBI trials, and political parties in Canada, Spain, and India are actively proposing related experiments.
The advantage of distributing UBI via biometric recognition plus blockchain is its global reach, covering a broader population. Moreover, the user network grown through income distribution can support other business models, such as financial services (DeFi), social networking, and task crowdsourcing, creating business synergies within the network (a minimal sketch of biometric-gated distribution follows).
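Mechanically, such a system needs two invariants: only verified-unique humans can claim, and each can claim at most once per epoch. The Python sketch below illustrates that general mechanism; it is not Worldcoin's actual contract logic, and all names and numbers are assumptions.

```python
# A minimal sketch of biometric-gated UBI distribution: only registered,
# verified-unique humans can claim once per epoch. This illustrates the
# general mechanism and is NOT Worldcoin's actual contract logic.

EPOCH_GRANT = 10.0  # tokens per verified human per epoch (assumed)

class UBILedger:
    def __init__(self):
        self.verified = set()   # proof-of-personhood registry (e.g. iris hash)
        self.claimed = {}       # person_id -> last epoch claimed
        self.balances = {}

    def register(self, person_id: str):
        self.verified.add(person_id)  # one id per unique human, by assumption

    def claim(self, person_id: str, epoch: int) -> bool:
        if person_id not in self.verified:
            return False  # bots and unverified ids get nothing
        if self.claimed.get(person_id) == epoch:
            return False  # double-claim within an epoch is blocked
        self.claimed[person_id] = epoch
        self.balances[person_id] = self.balances.get(person_id, 0) + EPOCH_GRANT
        return True

ledger = UBILedger()
ledger.register("iris_hash_alice")
print(ledger.claim("iris_hash_alice", epoch=1))  # True: first claim succeeds
print(ledger.claim("iris_hash_alice", epoch=1))  # False: double-claim blocked
```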
A corresponding target for the impact of AGI's arrival is Worldcoin -- WLD, with a circulating market cap of $1.03 billion and a total market cap of $47.2 billion.
Risks and Uncertainties of Narrative Deductions
Unlike many previous project and track research reports published by Mint Ventures, this article's narrative deductions and predictions carry a significant degree of subjectivity. Readers should regard the content of this article as a divergent discussion rather than a prophetic prediction of the future. The narrative deductions made by the author face many uncertainties that could lead to erroneous conjectures. These risks or influencing factors include, but are not limited to:
Energy Aspect: Rapid Decline in Energy Consumption Due to GPU Upgrades
Despite surging AI energy demand, chip manufacturers led by NVIDIA keep upgrading hardware to deliver higher computing power at lower energy consumption. In March this year, NVIDIA released the next-generation AI computing platform GB200, which integrates two B200 GPUs and one Grace CPU, achieving four times the training performance and seven times the inference performance of the previous-generation flagship AI GPU H100 while consuming only a quarter of its energy (the implied efficiency arithmetic is worked below). Nevertheless, AI's appetite for power remains far from satisfied: as unit energy consumption falls and AI application scenarios and demand expand further, total energy consumption may actually rise.
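Working through the quoted figures, normalized to H100 = 1 and taking the article's multiples at face value:

```python
# Normalizing the quoted claims to H100 = 1: 4x training and 7x inference
# performance at 1/4 the energy implies roughly 16x and 28x performance
# per unit of energy, respectively. These multiples are the article's
# quoted claims, not independently verified benchmarks.

train_perf, infer_perf, energy = 4.0, 7.0, 0.25  # relative to H100

print(train_perf / energy)  # 16.0x training performance per unit energy
print(infer_perf / energy)  # 28.0x inference performance per unit energy
```

Even so, as the paragraph above notes, rising efficiency per watt tends to expand the set of economically viable AI workloads, so total consumption can still grow.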
Data Aspect: The Q* Plan to Achieve "Self-Generated Data"
There has been a persistent rumor about an internal OpenAI project called "Q*", mentioned in internal communications sent to OpenAI employees. According to Reuters, citing OpenAI insiders, Q* may represent a breakthrough in OpenAI's pursuit of superintelligence / artificial general intelligence (AGI). Q* reportedly could not only solve previously unseen mathematical problems through abstraction but also generate its own data for training large models, without relying on real-world data. If the rumor is true, the bottleneck of insufficient high-quality data for training large AI models would be broken.
The Arrival of AGI: OpenAI's Concerns
Whether AGI arrives in 2025, as Musk predicted, remains uncertain, but its arrival seems only a matter of time. As a direct beneficiary of the AGI narrative, however, Worldcoin, widely regarded as OpenAI's "shadow token," may find that its greatest uncertainty comes from OpenAI itself.
On May 14, OpenAI showcased the latest GPT-4o at its spring product launch event, alongside comprehensive task scores for 19 other versions of large language models. From the table, GPT-4o scored 1310, seemingly far ahead of the pack, yet in total score it was only 4.5% higher than the second-place GPT-4 turbo, 4.9% higher than the fourth-place Google Gemini 1.5 Pro, and 5.1% higher than the fifth-place Anthropic Claude 3 Opus.
Just over a year after GPT-4's shocking debut, OpenAI's competitors have already closed much of the gap (even though GPT-5 has yet to be released and is expected this year). Whether OpenAI can maintain its industry lead looks increasingly uncertain. If OpenAI's competitive edge and dominance are diluted or overtaken, the narrative value of Worldcoin as OpenAI's shadow token will diminish as well.
Furthermore, beyond Worldcoin's iris-verification scheme, more competitors are entering the proof-of-personhood market. For example, the palm-scanning ID project Humanity Protocol recently announced it had raised $30 million in a new financing round at a $1 billion valuation, and LayerZero Labs announced it would run on Humanity and join its validator node network, using ZK proofs for identity verification.
Conclusion
Finally, although the author has attempted to deduce the next narratives in the AI track, it is important to note that the AI track differs from crypto-native tracks like DeFi: it is more a product of the AI boom spilling over into the cryptocurrency space. Many projects have not yet proven their business models, and many are closer to AI-themed memes (for example, RNDR trades like an NVIDIA proxy meme, and Worldcoin like an OpenAI proxy meme). Readers should treat these developments with caution.