Dialogue io.net: Aiming to Compete with AWS Cloud Services by Offering More Convenient Decentralized GPUs

Deep Tide TechFlow
2024-03-06 13:09:24
We hope to embody the spirit of Web3 while defeating AWS and GCP.

Interview with the COO of io.net: Aiming to Compete with AWS Cloud Services and Provide More Convenient Decentralized GPUs (Includes Airdrop Interaction Tutorial)

Written by: AYLO

Compiled by: Deep Tide TechFlow

Today, I bring you an interview with another project that I am very optimistic about.

This project covers several currently popular verticals: AI + DePIN + Solana. io.net Cloud is an advanced decentralized computing network that lets machine learning engineers access distributed cloud clusters at a significantly lower cost than centralized services. I spoke with COO Tory Green to learn more. The IO token will soon launch on Solana, and I highly recommend reading this article. I will also include information on how to participate in the airdrop (at the end of the article). I am a private investor in io.net and firmly believe in their platform, as their GPU cluster solution is truly unique.

Introduction to io.net

  • A decentralized AWS for ML (machine learning) training on GPUs
  • Instant, permissionless access to a global GPU and CPU network, live today
  • 25,000 nodes already on the network
  • Revolutionary technology that clusters distributed GPUs into unified cloud clusters
  • Can cut computing costs by up to 90% for large-scale AI startups
  • Integrated with Render and Filecoin
  • Built on Solana

They just announced a $30 million funding round, attracting some of the biggest supporters in the field.

Why Should People Pay Attention to io.net?

We are not only competing with other crypto projects but also with cloud computing. One of the main advantages we offer customers is significantly lower prices, up to 90% cheaper. What we are really providing is choice, which is the truly interesting part. On our platform you can access affordable, fully decentralized consumer-grade GPUs at up to a 90% discount, and I highly recommend giving it a try. However, if you need high performance, you can use top-tier hardware like the A100 to recreate an AWS-like experience, perhaps only 30% cheaper, but still less expensive than AWS. In some cases we even offer better performance than AWS, which can be crucial for specific industries like hedge funds.

For one of our major clients, our services are 70% cheaper than AWS and 40% cheaper than what they get elsewhere. Our platform is user-friendly and permissionless, unlike AWS, which may require detailed information like business plans. Anyone can join and start a cluster immediately, whereas AWS may take days or weeks.

Compared to decentralized competitors, if you try to get a cluster on platforms like Akash, you will find that it is not done instantly. They are more like travel agencies, calling their data centers to find available GPUs, which can take weeks. Here, it is instant, cheaper, and permissionless. We want to embody the spirit of Web3 while beating AWS and GCP.

What Does the Roadmap for 2024 Look Like?

It is divided into a business roadmap and a technical roadmap. From a business perspective, the TGE (token generation event) is coming soon. We plan to hold a summit this year, during which we will announce many product-related items. Our focus is on continuing to build the network, because despite all the excitement around the TGE, we see ourselves as a real business and a legitimate competitor to AWS.

We will continue to aggressively grow our sales team. We hope to emulate companies like Chainlink and Polygon by recruiting senior sales executives from companies like Amazon and Google to build a world-class sales team. This will help us attract AI clients and establish partnerships with entities like Hugging Face and Predibase.

Our initial customer base consists of large AI startups facing enormous AI computing costs. I am part of a group of CFOs in the Bay Area tech scene, and one of the biggest issues they face is the high cost of AI computing. One Series A SaaS startup spends $700,000 a month on AI computing, which is unsustainable. Our goal is to significantly reduce costs for businesses like theirs.

Once we prove the concept with these initial customers, we will explore adjacent markets. Our network has SOC 2-compliant GPUs, so we can target large tech companies or enterprises like JPMorgan or Procter & Gamble, which certainly have their own internal AI departments. Our technology can support clusters of up to 500,000 GPUs, potentially allowing us to surpass AWS or GCP in capacity, as they cannot deploy that many GPUs in one location. This could attract major AI projects like OpenAI for future versions of GPT. However, building a marketplace requires balancing supply and demand; we currently have 25,000 GPUs in the network, while roughly 200,000 are on the waiting list. Over time, our goal is to expand the network to meet the growing demand. That is the business roadmap.

From a technical perspective, there is clearly a lot to do. Currently we support Ray and Kubernetes, and we are actively building on that. But as I mentioned, we are considering expanding the product. If you think about how AI works: when you use ChatGPT, that is an application; ChatGPT is built on a model, GPT-3; and GPT-3 runs all of its inference on GPUs. We could ultimately start from the GPUs and build out the entire stack.
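To make that concrete, here is a minimal sketch of what running a workload on a Ray cluster looks like, using Ray's public API; the cluster address, model logic, and prompts are placeholders for illustration, not io.net specifics:

import ray

# Connect to an existing Ray cluster through the Ray Client endpoint.
# The address below is a placeholder, not an io.net endpoint.
ray.init(address="ray://cluster-head-node:10001")

@ray.remote(num_gpus=1)
def run_inference(prompt: str) -> str:
    # Placeholder for real model code: load a model onto the GPU
    # Ray assigns to this task and run a forward pass on the prompt.
    return f"generated output for: {prompt}"

# Fan the work out across whatever GPUs the cluster provides.
futures = [run_inference.remote(p) for p in ["a cat", "a boat", "a city"]]
print(ray.get(futures))

The same script works whether the cluster is one machine or thousands of nodes, which is why orchestration layers like Ray and Kubernetes sit at the bottom of the stack described above.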

We are also collaborating with Filecoin, and many of these partner data centers already have substantial CPU and storage, so we can also start storing models. This will enable us to provide computing, model storage, and SDKs for building applications, creating a fully decentralized AI ecosystem, almost like a decentralized app store.

What Role Does the Token Play in the Network?

At a high level, this is a utility token that will be used to pay for computing on the network. That is the simplest explanation. I also recommend checking out the website bc8.ai.

This is a proof of concept we built, a Stable Diffusion clone, and I believe it is currently the only fully on-chain AI dApp. Users pay small amounts in crypto on Solana to generate images. Each transaction compensates the four key stakeholders involved in creating the image: the app creator, the model creator, us, and the GPUs used. Currently we let people use it for free, because we are both the app owner and the model owner, but this is more of a proof of concept than an actual business.
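As an illustration of that four-way split, here is a hedged sketch in Python; the proportions are invented for the example, since the interview only says that four parties are compensated, not in what ratio:

# Hypothetical revenue split for one image-generation payment on Solana.
# The shares below are made up purely for illustration.
LAMPORTS_PER_SOL = 1_000_000_000

def split_payment(total_lamports: int) -> dict:
    shares = {
        "app_creator": 0.25,
        "model_creator": 0.25,
        "network": 0.25,       # io.net's portion in this sketch
        "gpu_provider": 0.25,
    }
    return {name: int(total_lamports * share) for name, share in shares.items()}

# e.g. a 0.001 SOL payment for one generated image
print(split_payment(int(0.001 * LAMPORTS_PER_SOL)))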

We plan to expand the network to allow others to host models and build fully decentralized AI applications. The IO token will power not only our models but also any models created. The tokenomics are still being finalized and may not be announced until around April.

Why Choose Solana?


I think there are two reasons. First, we really like the community, and second, frankly, it is the only blockchain that can support us. If you look at our cost analysis, every time someone performs an inference there are about five transactions: the inference itself, and then payments to all the stakeholders. So when we run the cost analysis at 60,000, 70,000, or 100,000 transactions, every one of those transactions needs to cost a hundredth or a tenth of a cent. Given our transaction volume, Solana is really our only option. Additionally, they have provided a lot of support as partners, and the community is very strong. It was almost a no-brainer choice.
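To make that cost argument concrete, here is a back-of-the-envelope sketch; the per-transaction fee and volumes are assumed figures for illustration, not quoted Solana prices:

# Back-of-the-envelope settlement overhead per day.
# All values below are assumptions for illustration.
inferences_per_day = 100_000
txs_per_inference = 5        # the inference record plus stakeholder payouts
fee_per_tx_usd = 0.0001      # roughly a hundredth of a cent per transaction

daily_overhead_usd = inferences_per_day * txs_per_inference * fee_per_tx_usd
print(f"Daily settlement overhead: ${daily_overhead_usd:,.2f}")  # $50.00

# At fees of a few cents per transaction instead, the same volume would
# cost thousands of dollars a day, which is the point being made.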

What Do You Think Your Market Size Is?

I think it is hard to predict, you know. We throw around numbers like a trillion dollars, but even then it is hard to grasp the full scope. For example, forecasts from firms like Gartner suggest that by 2030 model training could account for about 1% of GDP, roughly $300 billion; that statistic is relatively easy to find. However, Nvidia's CEO has said that training is only about 10% of the AI compute market, which changes the picture: if training alone is a $300 billion market, the entire AI GPU market, just for computing services, could be a $3 trillion market. Beyond that, Cathie Wood predicts the entire AI market could reach $80 trillion. The potential market size is almost beyond comprehension.
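Spelled out, the arithmetic behind that estimate looks like this, using the figures quoted above (which are the interviewee's assumptions rather than verified forecasts):

# Market-size arithmetic using the figures quoted in the interview.
training_market_usd = 300e9          # ~$300 billion for training by 2030
training_share_of_compute = 0.10     # Nvidia CEO's quoted figure: training is ~10%

total_ai_compute_market_usd = training_market_usd / training_share_of_compute
print(f"Implied AI compute market: ${total_ai_compute_market_usd / 1e12:.1f} trillion")  # ~$3.0 trillion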

What Do You Think is the Biggest Barrier to io.net's Development?

Building a marketplace is very difficult, and while it may be easier in the crypto space, it still has its challenges. For example, most of our clients request A100s, which are top-tier, enterprise-grade GPUs that cost about $30,000 each, and they are in short supply right now. Our sales team is working hard to source these GPUs, which is a significant challenge because they are in high demand and expensive.

We also have many 3090s, which are more of a consumer product, and the demand is not as high. This means we have to adjust our strategy and find clients specifically looking for these types of GPUs. However, this situation can be found in any market, and we address it by hiring the right people and implementing effective marketing strategies.

From a strategic perspective, as I mentioned, we are currently the only platform capable of building decentralized clusters across different geographic locations, and that is our moat. In the short term we have a significant competitive advantage, and I believe that extends into the medium term. For partners like Render, if they can leverage our network and retain 95% of the value, it makes no sense for them to try to replicate our model.

The team spent about two years developing this capability. So, it is not an easy task. However, there is always the possibility that someone else will come up with this method in the future. By then, we hope to have established a sufficient moat. We are already the largest decentralized GPU network by an order of magnitude, with 25,000 GPUs, while Akash has only 300 and Render has only a few thousand.

Our goal is to reach 100,000 GPUs and 500 clients, creating a network effect similar to Facebook, where the question becomes, "Where else can you go?" Our aim is to become the go-to platform for anyone needing GPU computing, just as Uber dominates ride-sharing and Airbnb dominates accommodation services. The key is to act quickly to secure our position in the market and become synonymous with decentralized GPU computing.

How to Get the IO Airdrop?

There are two main ways to qualify for the IO airdrop:

They are running a Galxe campaign called "Ignition". You just need to complete the tasks. This requires you to prove you are human by minting a Galxe Passport, which is great because it cannot be faked.

Provide your GPU/CPU to io.net by following the instructions in the documentation. Even if you are not tech-savvy, you can complete it in about 10-15 minutes; it is quite simple.
