OORT, a DeAI Project to Watch: Breaking AI's Development Bottleneck and Motivating Everyone to Contribute Data
Author: ChainCatcher
The AI sector has entered an explosive era. According to the "2024 AI Investment Report" from research firm Dealroom, global AI investment is expected to reach $65 billion, accounting for one-fifth of all venture capital. Goldman Sachs' research department has likewise projected that global AI investment could approach $200 billion by 2025.
Riding the AI boom, capital is pouring into AI-related assets. For example, the A-share company Cambricon has surged over 560% from its February low this year, with a market capitalization exceeding 250 billion RMB, while the U.S. company Broadcom has passed a $1 trillion market value, becoming the eighth-largest publicly traded company in the U.S.
The combination of AI and Crypto is also heating up. Around the artificial intelligence conference hosted by Nvidia, Bittensor (TAO) led the sector with a market value of over $4.5 billion, while assets such as Render (RNDR) and Fetch.ai (FET) also saw rapid value growth.
Following large language models, AI Agents have become the engine of this AI market cycle. For instance, the GOAT token surged over 100x within 24 hours, and ACT rose nearly 20x in a single day, igniting the Crypto world's enthusiasm for AI Agents.
However, behind the rapid development of AI, there are also concerns. According to an article by Dr. Max Li, founder and CEO of OORT, published in Forbes titled "AI Failures Will Surge in 2025: A Call for Decentralized Innovation," the AI industry faces numerous issues, such as data privacy, ethical compliance, and trust crises caused by centralization, which increase the risk of AI failures. Therefore, decentralized innovation has become an urgent priority.
Currently, OORT has built one of the world's largest decentralized cloud infrastructures, with network nodes covering over 100 countries and generating millions of dollars in revenue, and has launched the open-source Layer 1 Olympus protocol, whose consensus algorithm, Proof of Honesty (PoH), is protected by U.S. patents. Through its native token OORT, it encourages everyone to contribute data, creating a closed incentive loop. Recently, OORT launched OORT DataHub, a further step toward global, diverse, and transparent data collection that lays a solid foundation for the coming DeAI boom.
OORT: Born from a Chance Classroom Moment
To understand the OORT project, one must first understand the problems OORT aims to solve. This involves discussing the current bottlenecks in AI development, primarily related to data and centralization issues:
1. Disadvantages of Centralized AI
1. Lack of transparency leading to trust crises. The decision-making process of centralized AI models is often opaque, seen as "black box" operations. Users find it difficult to understand how AI systems make decisions, which can lead to severe consequences in critical applications such as medical diagnosis and financial risk control.
2. Data monopoly and unequal competition. A few large tech companies control vast amounts of data, creating a data monopoly. This makes it difficult for new entrants to obtain sufficient data to train their own AI models, hindering innovation and market competition. Additionally, data monopolies may lead to the misuse of user data, further exacerbating data privacy issues.
3. Ethical and moral risks are hard to control. The development of centralized AI has raised a series of ethical and moral issues, such as algorithmic discrimination and bias amplification. Moreover, the application of AI technology in military and surveillance fields has raised concerns about human rights, security, and social stability.
2. Data Bottleneck
1. Data desert. In the booming development of artificial intelligence, the issue of data deserts has gradually emerged as a key factor restricting further development. The demand for data from AI researchers has exploded, yet the supply of data has struggled to keep up. Over the past decade, the continuous expansion of neural networks has relied on large amounts of data for training, as seen in the development of large language models like ChatGPT. However, traditional datasets are nearing exhaustion, and data owners are beginning to restrict content usage, making data acquisition increasingly difficult.
The causes of data deserts are multifaceted. On one hand, data quality is uneven, with issues of incompleteness, inconsistency, noise, and bias severely affecting model accuracy. On the other hand, scalability challenges are significant; collecting sufficient data is costly and time-consuming, maintaining real-time data is difficult, and manual annotation of large datasets poses a bottleneck. Additionally, access and privacy restrictions cannot be ignored; data silos, regulatory constraints, and ethical issues make data collection arduous.
Data deserts have a profound impact on AI development. They limit model training and optimization, potentially forcing AI models to shift from pursuing large-scale to more specialized and efficient approaches. In industry applications, achieving precise predictions and decisions becomes challenging, hindering AI's greater role in fields like healthcare and finance.
To address data deserts, researchers and companies are actively exploring various avenues. For instance, attempts to collect non-public data face issues of legality and quality; focusing on specialized datasets, though promising, still requires validation of their availability and practicality; generating synthetic data, while having some potential, also presents numerous drawbacks. Furthermore, optimizing traditional data collection methods and exploring decentralized data collection solutions have become important directions for solving data deserts. In summary, the issue of data deserts urgently needs resolution to promote the continuous and healthy development of AI.
2. The "data black box" of centralized AI leads to issues such as privacy concerns, lack of diversity, and opacity.
In the current model, the data collection and processing processes lack transparency, leaving users often unaware of the fate and usage of their personal data. Many machine learning algorithms require vast amounts of sensitive user information for training, which poses risks of data leakage. If privacy protection measures are inadequate, users' private information may be misused, leading to a trust crisis.
A lack of diversity is also a significant drawback. Currently, the data relied upon by centralized AI is often concentrated in a few fields or regions, with most mainstream international datasets primarily in English, resulting in a singular data source. This makes AI models trained on such data perform poorly in diverse real-world scenarios, easily leading to bias. For example, when handling multilingual tasks or data from different cultural backgrounds, models may struggle to accurately understand and respond, limiting the broad applicability and fairness of AI technology.
Opacity permeates the entire data processing workflow. From the source of data collection to processing methods and ultimately how it translates into decisions, these stages are like a black box to outsiders. This lack of transparency not only makes it difficult for users to assess data quality but also obscures whether models are biased due to data, thereby affecting the fairness and accuracy of decisions. In the long run, this is detrimental to the healthy development of AI technology and its widespread acceptance in society.
3. Challenges in data collection have become a key factor hindering AI development. According to Dr. Max Li's column in Forbes, common issues often arise from the following aspects:
(1) Data quality issues.
Incompleteness: Missing values or incomplete data can impair the accuracy of AI models.
Inconsistency: Data collected from multiple sources often has mismatched formats or conflicting entries.
Noise: Irrelevant or erroneous data can weaken meaningful insights and confuse models.
Bias: Data that does not represent the target population can lead to biased models, causing ethical and practical issues.
(2) Scalability issues.
Quantity challenges: Collecting enough data to train complex models can be costly and time-consuming.
Real-time data requirements: Applications like autonomous driving or predictive analytics require continuous and reliable data streams, which can be challenging to maintain.
Manual annotation: Large datasets often require human labeling, creating significant time and labor bottlenecks.
(3) Access and privacy issues.
Data silos: Organizations may store data in isolated systems, limiting access and integration.
Compliance: Regulations like GDPR and CCPA restrict data collection practices, especially in sensitive areas like healthcare and finance.
Ethical issues: Collecting data without user consent or transparently can lead to reputational and legal risks.
Other common bottlenecks in data collection include a lack of diverse and truly global datasets, high costs associated with data infrastructure and maintenance, challenges in processing real-time and dynamic data, and issues related to data ownership and licensing.
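The quality defects listed above (incompleteness, inconsistency, noise, bias) can be screened for with simple automated checks before data reaches a training pipeline. A minimal sketch in Python, where the field names, allowed label vocabulary, and sample records are all hypothetical illustrations rather than anything specific to OORT:

```python
# Minimal data-quality screening sketch; field names and the allowed
# label set ("cat"/"dog") are illustrative assumptions.

def quality_report(records, required_fields):
    """Flag common collection defects: incompleteness, inconsistency, noise."""
    report = {"incomplete": [], "inconsistent": [], "noisy": []}
    seen_types = {}
    for i, rec in enumerate(records):
        # Incompleteness: a required field is missing or empty
        if any(rec.get(f) in (None, "") for f in required_fields):
            report["incomplete"].append(i)
        # Inconsistency: the same field carries conflicting types across records
        for field, value in rec.items():
            t = type(value)
            if field in seen_types and seen_types[field] is not t:
                report["inconsistent"].append(i)
                break
            seen_types.setdefault(field, t)
        # Noise: a label outside the allowed vocabulary
        if rec.get("label") not in (None, "cat", "dog"):
            report["noisy"].append(i)
    return report

sample = [
    {"text": "a photo of a cat", "label": "cat"},
    {"text": "", "label": "dog"},           # incomplete: empty text
    {"text": "blurry frame", "label": 42},  # noisy and type-inconsistent
]
print(quality_report(sample, required_fields=["text", "label"]))
# {'incomplete': [1], 'inconsistent': [2], 'noisy': [2]}
```

Checks like these catch only mechanical defects; representation bias, the fourth issue above, requires comparing the dataset's distribution against the target population.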
OORT emerged from practical needs, and its founding was somewhat accidental. In 2018, while Max was teaching an artificial intelligence course to graduate students at Columbia University, his students struggled to complete a project that required training AI agents because traditional cloud services were prohibitively expensive. To solve this dilemma, Max conceived a decentralized AI platform, "OORT": the team initially explored using blockchain as an incentive layer to connect underutilized nodes around the world, built a preliminary prototype of a decentralized cloud solution, and experimented with PayPal for payments and credit allocation, laying the groundwork for the birth of OORT's native token.
Today, OORT has become a leader in DeAI, combining blockchain verification with a global network of data centers and edge devices to design state-of-the-art artificial intelligence infrastructure.
In response to the current shortage of AI training data, OORT connects underutilized nodes worldwide through blockchain to enable global data collection. To incentivize participation and solve the challenge of cross-border micropayments, OORT adopted cryptocurrency for payments, establishing a unique business model. Its OORT DataHub product, launched on December 11, primarily addresses data collection and annotation bottlenecks; its customer base includes SMEs as well as some of the world's leading tech companies. The product's decentralized design enables truly global, diverse, and transparent data collection: contributors around the world can easily earn cryptocurrency rewards, while blockchain keeps data provenance and usage records on-chain, addressing many pain points faced by Web2 cloud services and AI companies. As of this writing, OORT DataHub has recorded data uploads from over 80,000 contributors worldwide.
Strong Research and Academic Background, Funded by Giants, Serving Over 10,000 Enterprises and Individuals
The OORT team is formidable. Max is not only the founder and CEO of OORT but also a faculty member at Columbia University, a co-founder of Nakamoto & Turing Labs in New York, and a founding partner of New York's Aveslair Fund. He holds over 200 granted and pending international and U.S. patents and has published numerous papers in well-known academic journals across communications, machine learning, and control systems. He also serves as a reviewer and technical program committee member for leading journals and conferences in these fields, and as a funding reviewer for the Natural Sciences and Engineering Research Council of Canada.
Before founding OORT, Max collaborated with Qualcomm's research team on 4G LTE and 5G system design. Nakamoto & Turing Labs, which he co-founded, is a New York-based lab focused on blockchain and AI investment, education, and consulting.
Max is a regular contributor to Forbes magazine, and in his latest articles "AI Failures Will Surge in 2025: A Call for Decentralized Innovation" and "Focus on Decentralized AI in 2025: The Convergence of AI and Cryptocurrency," he emphasizes the development and importance of decentralized AI in the cryptocurrency field, highlighting its transformative potential. It is clear that Max is a solid supporter of decentralized AI.
Michael Robinson, chairman of the OORT Foundation, is also a managing board member of Agentebtc, a managing board member of Burble, managing partner of Aveslair Fund, co-founder and chairman of the Reed-Robinson Fund, and a partner at Laireast, bringing rich cross-disciplinary experience and a dedication to promoting the integration of global business and technology.
Other core team members come from top universities and well-known institutions such as Columbia University, Qualcomm, AT&T, and JPMorgan Chase. OORT's development has also drawn support from well-known crypto venture firms such as Emurgo Ventures (of the Cardano ecosystem).
To date, OORT has raised $10 million from notable investors including Taisu Ventures, Red Beard Ventures, and Sanctor Capital, received funding from Microsoft and Google, and established partnerships with industry giants such as Lenovo Imaging, Dell, Tencent Cloud, and BNB Chain.
OORT completed its early explorations from 2018 to 2019 and focused on research and development from 2020 to 2021, building core technologies for data storage, computing, and management and beginning work on the OORT ecosystem's infrastructure. During this period, OORT launched decentralized storage nodes and edge devices, forming a preliminary product prototype and laying the technical foundation for subsequent commercialization.
Since 2022, OORT has begun exploring commercialization pathways:
OORT has built a data marketplace platform connecting data providers and data users. Data providers can sell their data on the platform, while data users can purchase the data they need for AI model training and other purposes. OORT profits by charging transaction fees, and to encourage data providers to offer high-quality data, the platform has established a reward mechanism that provides corresponding rewards based on data quality, diversity, and usage frequency.
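A reward rule of the shape described here, scoring contributions by quality, diversity, and usage frequency, could be sketched as follows. The weights and the scoring formula are illustrative assumptions, not OORT's published mechanism:

```python
# Illustrative contribution scoring: the weights and the saturation
# constant are invented for this sketch, not OORT's actual formula.

def reward_score(quality, diversity, usage_count,
                 w_quality=0.5, w_diversity=0.3, w_usage=0.2):
    """Combine normalized quality/diversity scores (0..1) with usage frequency."""
    # Squash raw usage counts into 0..1 with diminishing returns,
    # so a single very popular dataset cannot dominate the score.
    usage_norm = usage_count / (usage_count + 10)
    return w_quality * quality + w_diversity * diversity + w_usage * usage_norm

# A high-quality, diverse, frequently used contribution outranks
# a low-quality, rarely used one.
assert reward_score(0.9, 0.8, 50) > reward_score(0.4, 0.3, 5)
```

The design choice worth noting is the diminishing-returns term on usage: a linear usage bonus would reward hoarding popular data, while saturation keeps quality the dominant factor.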
Providing decentralized cloud storage and computing services, enterprises and individuals can rent OORT's cloud resources to run their AI applications. Compared to traditional cloud services, OORT's decentralized cloud services offer higher security, lower costs, and better scalability. Users can flexibly choose the cloud resources they need based on their actual requirements and pay according to usage.
For the specific needs of large enterprises, OORT offers customized AI solutions. These solutions are based on OORT's decentralized technology architecture, providing one-stop services for data management, model training, and intelligent decision-making. By collaborating with enterprises, OORT not only secures a stable source of income but also accumulates industry experience to further optimize its products and services.
Currently, OORT serves over 10,000 enterprise and individual clients worldwide, with its network nodes generating millions of dollars in revenue, proving the feasibility of its business model.
Everyone Can Participate in AI Development and Benefit from It
OORT has several products, including OORT Storage, OORT Compute, and OORT DataHub. Based on the application layers of these three products, OORT also offers a solution called OORT AI, which helps enterprises quickly integrate intelligent assistants. Specifically, the functions of the three major products are as follows:
OORT Storage is currently the only decentralized solution that can match the performance of AWS S3 storage services, with numerous registered enterprise and individual clients.
OORT Compute aims to achieve decentralized data analysis and processing, providing better cost-effectiveness for AI model training and inference. It is still in preparation and has not yet launched.
OORT DataHub, officially launched on December 11, marks a new development phase for the project and is expected to become OORT's next focus and a potential "cash cow."
OORT DataHub provides an innovative way to collect and annotate data, allowing global contributors to collect, classify, and preprocess data for AI applications. By leveraging blockchain technology, it addresses the issues of single data sources and low annotation efficiency in traditional data collection methods while enhancing security. Notably, OORT DataHub has successfully launched on the Shenzhen Data Exchange, opening a new avenue for AI companies and research institutions to acquire high-quality, diverse, and compliant datasets.
OORT DataHub offers users multiple ways to earn points, such as daily logins, completing tasks, verifying tasks, and referral programs. Accumulated points qualify users for monthly draws, with rewards paid in dollar-pegged USDT.
This product effectively eliminates intermediaries in data collection, providing a safer, participant-controlled process, aligning with the growing calls for more ethical approaches to AI.
Based on OORT DataHub, OORT has also launched the OORT DataHub Mini App, which will seamlessly integrate with Telegram's mini-app platform, enabling users to contribute data more easily and participate in decentralized data collection, further expanding the OORT ecosystem and increasing user engagement. This integration is expected to bring millions of users and drive the platform's development.
OORT DataHub embodies OORT's vision, which is to enable everyone to participate in AI development and benefit from it, regardless of their geographical location, economic status, or technical background. OORT's mission is to provide reliable, secure, and efficient decentralized AI solutions, promoting the global adoption and application of AI technology while ensuring data privacy, security, and ethical compliance.
Through a decentralized data marketplace model, OORT breaks the data monopoly, allowing data providers from around the world to upload their data to the platform for trading and sharing. Whether individual users or enterprise users, anyone with valuable data can earn corresponding benefits on the OORT platform, achieving fair distribution of data value.
The decentralized architecture ensures that data is no longer stored in a single server or data center but is distributed across nodes worldwide. Each node encrypts the data, and only authorized users can access and use it. Additionally, the immutable nature of blockchain technology ensures the integrity and authenticity of the data, effectively preventing risks of data leakage and tampering.
Since OORT's decentralized network consists of numerous nodes, there is no single point of failure. Even if one node is attacked or fails, other nodes can continue to operate normally, ensuring the stability and reliability of the entire system. Furthermore, the decentralized consensus mechanism makes it difficult for attackers to alter system data or control the entire network, enhancing system security. For example, in the face of distributed denial-of-service (DDoS) attacks, OORT's distributed architecture can disperse attack traffic, allowing the system to maintain normal operations and ensuring that users' data and services are unaffected.
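The failover behavior described above, where a failed node's traffic simply shifts to the surviving nodes, can be sketched as a hash-based router. The node names and routing rule are illustrative, not OORT's implementation:

```python
import zlib

def route(request_id, nodes, healthy):
    """Deterministically route a request to a healthy node.

    Failed nodes are skipped, so no single node is a point of failure;
    spreading requests by hash also disperses flood traffic across
    the cluster rather than concentrating it on one server.
    """
    candidates = sorted(n for n in nodes if healthy.get(n, True))
    if not candidates:
        raise RuntimeError("no healthy nodes available")
    idx = zlib.crc32(str(request_id).encode()) % len(candidates)
    return candidates[idx]

nodes = ["node-a", "node-b", "node-c"]
# All nodes healthy: traffic is spread across the cluster.
print(route("req-1", nodes, {}))
# node-b down: its requests transparently shift to the survivors.
print(route("req-1", nodes, {"node-b": False}))
```

Using `zlib.crc32` rather than Python's built-in `hash` keeps the routing stable across process restarts, which matters when multiple gateways must agree on placement.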
On the other hand, OORT addresses data collection, control, and management issues by providing innovative data collection and annotation methods, establishing strict data quality control and verification mechanisms, and employing advanced AI algorithms for intelligent management and analysis of data.
OORT places a high priority on data protection and privacy compliance, strictly adhering to data protection regulations worldwide, such as GDPR and HIPAA, ensuring that user data is processed legally.
By reviewing OORT's existing product line and product progress, combined with OORT's vision for the future, we can see that OORT has built a fair, transparent, and trustworthy AI ecosystem.