After data assetization, how much of the market can privacy infrastructure break into?
Author: Jason
Seventy years ago, when computers were only beginning to catch on, we could hardly have imagined that, as a fully digital society approached, individuals would gain a "second life" in the digital world. Our digital selves keep expanding our boundaries in the vast "parallel universe" of the network, constantly trying new things and improving our material lives, while leaving behind a flowing trail of imprints: data.
What is data? This fundamental yet complex question in information science has no obvious answer. Simply put, data is the product of observation. The subjects of observation include objects, individuals, institutions, events, and their environments. Observation is conducted based on a series of perspectives, methods, and tools, accompanied by corresponding symbolic expression systems, such as measurement units.
Data is the product of recording the characteristics and behaviors of observed subjects using these symbolic expression systems. It can take the form of text, numbers, charts, sound, and video, and it can exist in digital or non-digital form (such as records on paper). With the development of information and communication technology (ICT), however, more and more data is being digitized.
According to Statista, the number of connected devices worldwide is expected to reach 30.9 billion by 2025. Connected devices and services create a massive amount of data: IDC predicts that by 2025 global data will expand to 163 ZB (1 ZB equals one trillion GB), roughly ten times the 16.1 ZB generated in 2016. Faced with this torrent of data, the question of how to mine its intrinsic value finds its answer in artificial intelligence.
Sixty Years of Artificial Intelligence
In the summer of 1956, at a workshop held at Dartmouth College, the term "artificial intelligence" was put forward in discussions among young scientists such as McCarthy and Minsky.
It wasn't until 2006 that Professor Hinton's introduction of "deep learning" neural networks led to a breakthrough in AI performance. This wave of artificial intelligence is distinctly different from the previous two. Machine learning algorithms based on big data and powerful computing capabilities have made groundbreaking progress in a series of fields such as computer vision, speech recognition, and natural language processing. Applications based on AI technology have also begun to mature, allowing artificial intelligence to truly move towards "intelligence" and practical application.
Today, artificial intelligence is no longer a strange technology to everyone; it has entered countless details of people's lives, from online shopping to factory production, showcasing the convenience and progress brought by AI technology.
The increasing maturity of theory and technology has produced breakthroughs in application fields and driven continuous leaps in commercialization. More and more governments and corporate organizations worldwide are coming to recognize the economic and strategic importance of artificial intelligence and are moving into AI through national strategies and business activities.
Ten years ago, the rise of the mobile internet brought artificial intelligence to a "singularity" of explosive development, driven by mobile device makers represented by Apple and Samsung and by mobile internet service providers represented by Alibaba, Tencent, Facebook, and Google.
Their accelerated iteration broke through the time and space boundaries of the traditional desktop internet and made human-computer interaction far more convenient, while AI technologies led by natural language processing, machine learning, and vision algorithms achieved breakthrough development.
Deloitte's 2019 Global Artificial Intelligence Development White Paper estimates that the global AI market will exceed $6 trillion by 2025, with a compound annual growth rate of 30% from 2017 to 2025. PwC's research report on the economic impact of AI on the global economy states that by 2030, the emergence of AI will bring an additional 14% boost to global GDP.
This equates to a growth of $15.7 trillion, surpassing the combined GDP of China and India. The global AI market is expected to experience phenomenal growth in the coming years.
Over sixty years, the spark of artificial intelligence has grown into a blaze, yet as the Fourth Industrial Revolution unfolds, its ceiling is gradually coming into view.
Emerging Constraints
Artificial intelligence has been able to become the pivotal variable and core technology of the new round of scientific and industrial transformation by relying on three key elements: data, algorithms, and computing power.
Since the rise of the internet, and especially since the mobile internet entered ordinary households, data has grown massively worldwide. This real, valuable data has supplied artificial intelligence with its "raw material."
At the same time, stronger chip processing capabilities, the large-scale adoption of cloud computing, and sharply falling hardware prices have set off a global computing boom, providing artificial intelligence with a genuine "production engine."
Thanks to the leapfrog breakthroughs in deep learning, machine learning, neural networks, and computer vision, the vast industrial and solution markets have allowed AI algorithms to develop rapidly. From an industry perspective, artificial intelligence has already been applied in various vertical fields such as healthcare, finance, education, and security. Algorithms provide effective "production tools" for artificial intelligence.
With the support of these three elements, artificial intelligence has ushered in a "golden decade," but the sword of Damocles hanging over artificial intelligence has also begun to emerge.
The first issue is the pressure of data regulation and privacy. As early as 2018, the European Union introduced the General Data Protection Regulation (GDPR), and in 2021, China's Data Security Law and Personal Information Protection Law were successively implemented.
The Personal Information Protection Law in particular centers on individual rights, aiming to protect citizens' privacy, personality rights, personal safety, property, and other interests. It defines "personal information" as all kinds of information, recorded electronically or by other means, that relates to identified or identifiable natural persons. Tighter regulation of personal privacy data undoubtedly places a strong constraint on data misuse.
Moreover, the pressure of data privacy also comes from within. For companies that own data, there is a significant contradiction: sharing and interacting with data can clearly enhance the effectiveness of AI algorithms, but at the same time, they must ensure that their data does not leak.
Whether it's the use of data between different internal departments or data collaboration with third-party partners, strict compliance must be guaranteed. When launching various projects involving data collaboration, the primary consideration is often the security of data flow.
Secondly, the cost of model training is high. Although advancements in hardware and software have been driving down AI training costs by 37% annually, the rapid growth of AI model scales (10 times per year) has led to a continued rise in total training costs. Some organizations believe that the cost of the most advanced AI training models could increase by 100 times, soaring from about $1 million to over $100 million by 2025.
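To make the tension between falling per-unit costs and exploding model scale concrete, the following is a minimal back-of-the-envelope sketch in Python using the figures quoted above as assumptions (a 37% annual cost reduction, a tenfold annual growth in model scale, and a roughly $1 million starting cost); the exact numbers are illustrative only.

```python
# Back-of-the-envelope projection of frontier AI training cost.
# Assumptions taken from the paragraph above: hardware/software advances cut
# per-unit training cost by ~37% per year, while model scale grows ~10x per year.

def projected_cost(base_cost: float, years: int,
                   scale_growth: float = 10.0,
                   efficiency_gain: float = 0.37) -> float:
    """Return the projected total training cost after `years` years."""
    net_multiplier = scale_growth * (1.0 - efficiency_gain)  # ~6.3x per year
    return base_cost * net_multiplier ** years

if __name__ == "__main__":
    base = 1_000_000  # ~$1M for a leading model today (assumption from the text)
    for year in range(1, 4):
        print(f"year {year}: ~${projected_cost(base, year):,.0f}")
    # Even with 37% annual efficiency gains, total cost still grows ~6.3x per year,
    # so a 100x increase arrives in well under three years.
```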
Faced with issues such as data privacy, high costs, and technological centralization, how can artificial intelligence break through these constraints and reach new heights?
Some cutting-edge technology research and applications have paved the way for progress.
Artificial Intelligence for Everyone
The emergence of blockchain and privacy computing technologies provides new ideas for artificial intelligence.
Woven cleverly together around data, blockchain, privacy computing, and AI set off a chemical reaction with one another in different ways. Combining these technologies can lift data utilization to a new level, while strengthening blockchain's underlying infrastructure and unlocking more of AI's potential.
Blockchain's consensus algorithms can help the participants in AI systems complete collaborative tasks, and its technical characteristics can also turn data into assets, incentivizing broader participation of data, algorithms, and computing power to build more efficient AI models.
Where applications involve private data, privacy computing makes it possible to analyze and compute on data without disclosing the providers' original data, keeping data "available but invisible" as it circulates and is integrated. This delivers the privacy and security controls that compliance requires while promoting data sharing and value exchange.
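To make "available but invisible" concrete, here is a minimal sketch of one privacy computing primitive, additive secret sharing: several parties learn their joint sum without any of them revealing its raw input. It is a generic teaching example in Python, not the protocol of any particular platform mentioned in this article; the participant values and the field modulus are assumptions.

```python
import random

# Additive secret sharing over a prime field: each party splits its private
# value into random shares, only shares are exchanged, and the joint sum is
# reconstructed without any single party seeing another's raw input.

PRIME = 2_147_483_647  # field modulus (illustrative choice)

def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def joint_sum(private_inputs: list[int]) -> int:
    """Each party shares its input; summing all shares reveals only the total."""
    n = len(private_inputs)
    all_shares = [share(x, n) for x in private_inputs]
    # Each party aggregates the one share it receives from every input.
    partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]
    return sum(partial_sums) % PRIME

if __name__ == "__main__":
    salaries = [8200, 9100, 7600]   # private inputs, never exchanged directly
    print(joint_sum(salaries))      # 24900: only the aggregate is revealed
```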
Currently, a range of platform products based on privacy computing and blockchain can already be seen on the market, such as Ant Chain's MORSE multi-party secure computation platform and Baidu's MesaTEE secure computing platform. Most of these products target B-end (enterprise) services, and the reason is simple: data business between enterprises is the most fundamental commercial need.
Such products resolve enterprises' basic contradiction between sharing and exchanging data to enhance AI algorithms and keeping that data secure, but they have not yet ventured into democratizing artificial intelligence or building secure, general-purpose artificial intelligence.
Enterprise services are merely the first landing point that artificial intelligence can reach today. In the foreseeable future, data ownership will ultimately return to individuals, and technology, the means of production, and the tools of production will likewise be handed back to individuals. Centered on data as the new generation of production factors, AI, blockchain, and privacy computing will serve as the technological infrastructure that fosters the emergence and evolution of advanced artificial intelligence and explores pathways toward general artificial intelligence.
Recently, a company focused on cutting-edge technology research released a product that offers users and the market a new direction in the application of universal general artificial intelligence.
The PlatON privacy computing network (a provisional name) is a decentralized infrastructure network for data sharing and privacy computing. From the very beginning of its product design it has taken a different approach, innovatively integrating the three elements of artificial intelligence (computing power, algorithms, and data) into a single product for users. Any user can log into the platform as a data owner, data user, algorithm developer, or computing power provider and fulfill a variety of task requirements, gathering the data, algorithms, and computing power a computation needs in a decentralized way to create a new paradigm of secure, universal artificial intelligence.
As a commercial-grade product, the PlatON privacy computing network is no longer positioned as a B2B enterprise-level product but is broadly open to institutions and individuals. For example:
As data owners, individuals and institutions can join as data nodes and contribute data to computing tasks published on the platform, achieving a strikingly innovative goal: effectively certifying, pricing, and protecting data so that it can be truly assetized under the premise of privacy protection.
As computing power providers, individuals and institutions can contribute machine resources to the platform, putting idle servers to work on computing tasks in the network and earning the corresponding task rewards.
As algorithm providers, individual AI developers can realize their full potential, supplying AI algorithms that help complete computing tasks and earning rewards in return.
Together, these roles form a free, open, and sustainably developing "AI marketplace": data and computing power are published on the platform, and algorithms are run against them. Through cryptoeconomics on the blockchain, data, computing power, and algorithms can all be monetized, creating an effective incentive mechanism that draws more data, algorithms, and computing power into the network and gradually giving rise to a decentralized market for sharing and trading all three.
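As an illustration of how the roles described above could meet around a single computing task, here is a hedged sketch in Python. It is not PlatON's actual API or economic model; every class, field, and participant name is hypothetical, and the matching and settlement rules are deliberately simplified.

```python
from dataclasses import dataclass, field

# Hypothetical model of a decentralized AI marketplace: a data node, an
# algorithm provider, and a compute provider each register a resource, a data
# user posts a task with a budget, and rewards are split on completion.

@dataclass
class Resource:
    owner: str
    kind: str          # "data" | "algorithm" | "compute"
    price: float       # asking price in the network's token (assumption)

@dataclass
class Task:
    requester: str
    budget: float
    contributors: list[Resource] = field(default_factory=list)

    def match(self, offers: list[Resource]) -> bool:
        """Greedily pick the cheapest offer of each kind, within the budget."""
        for kind in ("data", "algorithm", "compute"):
            candidates = [o for o in offers if o.kind == kind]
            if not candidates:
                return False
            self.contributors.append(min(candidates, key=lambda o: o.price))
        return sum(o.price for o in self.contributors) <= self.budget

    def settle(self) -> dict[str, float]:
        """Pay each contributor its asking price once the task completes."""
        return {o.owner: o.price for o in self.contributors}

if __name__ == "__main__":
    offers = [
        Resource("hospital_A", "data", 40.0),
        Resource("ml_dev_B", "algorithm", 25.0),
        Resource("idle_server_C", "compute", 15.0),
    ]
    task = Task(requester="insurer_D", budget=100.0)
    if task.match(offers):
        print(task.settle())  # {'hospital_A': 40.0, 'ml_dev_B': 25.0, 'idle_server_C': 15.0}
```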
In addition, PlatON sets up multiple layers of protection for data privacy, combining secure multi-party computation, zero-knowledge proofs, homomorphic encryption, verifiable computation, and federated learning to perform collaborative computing while keeping data local, truly achieving "data available but invisible."
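Of the techniques listed above, federated learning is perhaps the simplest to illustrate: each data owner trains on its own data and shares only model parameters, never raw samples. The following is a minimal federated-averaging sketch in Python using NumPy; it is a generic illustration under simplifying assumptions, not the network's actual training protocol.

```python
import numpy as np

# Federated averaging (FedAvg) sketch: each party takes a local gradient step
# on its private data, and only the resulting weight vectors are averaged
# centrally; raw samples never leave their owners.

def local_step(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
               lr: float = 0.1) -> np.ndarray:
    """One gradient step of linear regression on a party's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(weights: np.ndarray,
                    parties: list[tuple[np.ndarray, np.ndarray]]) -> np.ndarray:
    """Each party updates locally; only weight vectors are shared and averaged."""
    local_updates = [local_step(weights.copy(), X, y) for X, y in parties]
    return np.mean(local_updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three parties, each holding a private slice of data that never leaves them.
    parties = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        parties.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))
    w = np.zeros(2)
    for _ in range(200):
        w = federated_round(w, parties)
    print(np.round(w, 2))  # approaches [ 2. -1.] without pooling any raw data
```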
Beyond the data itself, the privacy of computing results, such as trained AI models, can also be safeguarded. The product can furthermore execute smart contracts efficiently and run popular deep learning frameworks smoothly, ensuring universality, compatibility, and high availability.
Viewed panoramically, the privacy computing network builds platform-level management capabilities for the entire data lifecycle, with AI, blockchain, and privacy computing as its core competencies, collaborating seamlessly with the underlying economic model, data, algorithms, and computing power resources according to application needs. Starting from individual data, it breaks down data silos, allowing data not only to be protected and utilized but also to become an asset for individuals and institutions.
The product is currently in the internal testing phase. It is not hard to imagine that such a large and complex platform product will inevitably face significant challenges. For example: How to price data among multiple parties? How to achieve precise capture and application of data in multi-party circulation? How can core algorithms attract AI developers to contribute?
Even so, it is clear that this is an unprecedented, super-scale commercial vehicle for data. Integrating and applying new technologies takes time, and refining a product takes even longer. The PlatON privacy computing network has already taken the first step in exploring the commercialization of data; looking ahead, the "singularity" of the data economy's leap may well blossom from here.