From Academic Ivory Towers to Decentralized AI Innovation: An Interview with OORT Founder Dr. Max Li

WuBlockchain
9 min read · Dec 17, 2024


About OORT

OORT is a decentralized infrastructure built on the Olympus protocol, designed to provide a trusted cloud for decentralized AI. OORT offers three core decentralized AI products to enterprise and individual customers: OORT Storage, OORT DataHub, and the soon-to-be-launched OORT Compute. To date, OORT has raised a total of USD 10 million in funding from investors including Taisu Venture, Red Beard Venture, and Sanctor Capital, and has received grants from Google and Microsoft.

Host (bms):

We are delighted to have invited Dr. Li, Founder and CEO of OORT, to discuss the future of decentralized AI. Dr. Li, hello! To start, could you introduce your background and experience? How did you first become involved in AI and blockchain?

Dr. Li:

Hello, and thank you for the invitation. I’m very pleased to have this opportunity to share some of my experiences, especially regarding decentralized AI research and practice. I am currently the Founder and CEO of the OORT project, which started in the summer of 2021 and has been running for nearly four years now. I also serve on the faculty of Columbia University in New York City, in the Department of Electrical Engineering.

Academically, my research focuses on AI and blockchain technology, with a specialization in reinforcement learning. I authored Reinforcement Learning for Cyber-physical Systems, published by Taylor & Francis CRC Press, an English-language textbook currently used by several universities. It was also translated by Tsinghua University’s Computer Science Department, and the Chinese edition was officially released in China in 2021.

Host (bms):

How did you initially come into contact with blockchain? Was there a particular turning point?

Dr. Li:

I began delving deeply into blockchain around 2017. At that time, the global ICO frenzy was at its peak, and my students became very interested in cryptocurrencies and blockchain technology. At their request, I attempted to offer a blockchain-related course at Columbia. Almost no one had such experience, so I had to gather scattered information on my own, study whitepapers and the limited academic papers available, and then compile a systematic syllabus with homework and exam questions. The goal was not just to have students grasp superficial concepts, but to truly understand the underlying mathematics and engineering principles, enabling them to become future blockchain experts who can drive the industry forward.

Host (bms):

Without mature teaching materials or reference cases at that time, how did you tackle these teaching and research challenges? Did you encounter any particularly interesting or difficult aspects?

Dr. Li:

Indeed, the biggest difficulty then was the lack of teaching materials and the fragmented nature of the information. I spent a great deal of time searching online resources, combining whitepapers with the few available academic papers, and delving deep into the cryptographic foundations of blockchain. For example, why use SHA-256 instead of SHA-512 or SHA-128? Behind that decision are fundamental mathematical theories like the Birthday Paradox. These topics had to be organized into course materials suitable for engineering graduate and doctoral students. They were training to become genuine technical experts, not just novices who understand a few buzzwords.
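The digest-length tradeoff behind that classroom example can be made concrete with the standard birthday-bound approximation: among n uniformly random b-bit digests, a collision occurs with probability roughly 1 - exp(-n^2 / 2^(b+1)), so collisions only become likely around 2^(b/2) hashes. A minimal sketch in Python:

```python
import math

def collision_probability(num_hashes: float, digest_bits: int) -> float:
    """Birthday-bound approximation: probability of at least one collision
    among num_hashes uniformly random digests of digest_bits bits,
    p ~= 1 - exp(-n^2 / 2^(b+1)).
    math.expm1 keeps extremely small probabilities numerically accurate."""
    exponent = -(num_hashes ** 2) / (2.0 ** (digest_bits + 1))
    return -math.expm1(exponent)

# Even 2^64 hashes leave a 256-bit digest's collision odds vanishingly
# small, while a hypothetical 128-bit digest is already at real risk.
for bits in (128, 256, 512):
    p = collision_probability(2.0 ** 64, bits)
    print(f"{bits:3}-bit digest, 2^64 hashes: p ~= {p:.3e}")
```

This is why a collision-resistance target of about 2^128 work makes SHA-256 a comfortable choice, while doubling to SHA-512 buys margin most protocols do not need.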

Host (bms):

You originally focused on reinforcement learning and AI. What prompted you to move from the AI field into the blockchain world, and further, what inspired you to consider combining the two?

Dr. Li:

Before entering academia, I worked in the R&D division of Qualcomm, involved in 5G chipset design, starting from around 2012 with fundamental algorithm research. After joining academia, I mainly focused on reinforcement learning for AI. But by chance, I became involved with blockchain and began teaching it. While exploring blockchain and AI, I realized that their combination holds enormous potential: a decentralized network can provide novel infrastructure for AI’s data and computing resources, while AI technologies can, in turn, optimize the operation and efficiency of blockchain systems. This interdisciplinary perspective motivated me to consider the possibilities and practical pathways of decentralized AI.

Host (bms):

Recently, Vitalik discussed the drawbacks of centralized AI in an interview. In your view, what are the fundamental technical advantages and commercial value propositions of decentralized AI compared to centralized AI?

Dr. Li:

This is a good question. I believe the greatest technical advantage of decentralized AI lies in achieving transparency and traceability throughout the AI development process through blockchain technology.

Right now, openness and transparency of AI are hot topics. Take OpenAI, for instance: many criticize it for no longer being truly “Open,” since we have no way to determine which training data it uses, whether the model contains bias, the specifics of its training methods, or whether outputs are manipulated behind the scenes. Such issues are notoriously difficult to clarify in centralized AI setups.

Decentralized AI, by contrast, uses blockchain to make key steps of the training process publicly verifiable, thereby building trust. With transparency and traceability, users have greater confidence in AI’s decisions and data sources. OORT’s three upcoming flagship products are fundamentally designed to enhance AI transparency and traceability via blockchain.

In terms of commercial value, decentralized AI allows more people to participate in building and using AI. Just like the Web2 social media giants, which harness the collective intelligence and contributions of countless individuals, decentralized AI fosters a positive cycle: “Made by people, for people.” Individuals and organizations contribute data, computing power, and resources to build AI, and the final product serves everyone. This model has enormous commercial potential, extending well beyond traditional B2B or B2C approaches. Decentralization invites widespread participation, making AI a genuine public infrastructure created by and for the masses, which can unlock even greater commercial value.

Host (bms):

Was OORT established under this kind of backdrop? Could you introduce the OORT project to us?

Dr. Li:

Yes, OORT took shape relatively early. Back in 2017, when I taught a reinforcement learning course at Columbia, my students needed extensive computing resources to train their AI agents for their final projects. Cloud services like AWS or Google Cloud were very expensive, and many students complained about the financial burden.

I asked myself: Given so many idle computing resources worldwide, why not use blockchain to pool these scattered resources into a decentralized computing network that not only reduces costs but also enhances flexibility?

Back in 2017, the concept of “decentralized AI” wasn’t clearly defined and barely existed, but integrating fragmented computing power was already a key step in building a decentralized AI infrastructure. During our lab research, we published academic papers and applied for key U.S. patents. By the summer of 2021, when our core patent was granted, we decided to commercialize our work and founded OORT.

Host (bms):

When building a decentralized AI infrastructure, what do you see as the biggest technical challenges, such as in computing power or data privacy and storage? How is OORT addressing these challenges?

Dr. Li:

The biggest challenge isn’t just privacy or data security — it’s how to achieve performance comparable to centralized cloud providers.

When considering a shift to decentralized infrastructure, enterprises often prioritize performance indicators like reliability, availability, and latency. If decentralized cloud performance can’t match that of AWS or Google Cloud, it’s hard to convince customers, even if we offer advantages in cost, privacy, and flexibility.

Take OORT’s decentralized storage as an example. We fragment and encrypt files, distributing them across hundreds or even thousands of nodes. When the user wants to retrieve the file, we fetch fragments in parallel from multiple nodes and then decode and decrypt them.

To keep latency on par with centralized providers, we apply advanced coding theory and algorithmic optimizations at the fundamental level. Drawing on methodologies similar to 5G’s underlying algorithms, we ensure that even if some nodes fail or go offline, we can quickly reassemble data. As a result, we can achieve retrieval times close to traditional cloud storage — around 100 milliseconds difference — while reducing costs by about 50% to 60% compared to Amazon S3.
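The fragment-and-reassemble flow described above can be illustrated with a toy erasure code. Production systems use stronger codes (for example Reed-Solomon) that survive many simultaneous node failures; the single-parity XOR scheme below, whose function names are illustrative and not OORT's API, tolerates the loss of any one fragment:

```python
from typing import List, Optional

def split_with_parity(data: bytes, k: int) -> List[bytes]:
    """Split data into k equal-size fragments plus one XOR parity fragment.
    Any single lost fragment can be rebuilt from the remaining k."""
    frag_len = -(-len(data) // k)  # ceiling division
    padded = data.ljust(frag_len * k, b"\0")
    frags = [padded[i * frag_len:(i + 1) * frag_len] for i in range(k)]
    parity = bytearray(frag_len)
    for frag in frags:
        for i, byte in enumerate(frag):
            parity[i] ^= byte
    return frags + [bytes(parity)]

def reassemble(frags: List[Optional[bytes]], orig_len: int) -> bytes:
    """Rebuild the original bytes; at most one fragment may be None."""
    missing = [i for i, f in enumerate(frags) if f is None]
    if len(missing) > 1:
        raise ValueError("XOR parity tolerates only one lost fragment")
    if missing:
        frag_len = len(next(f for f in frags if f is not None))
        rebuilt = bytearray(frag_len)
        for frag in frags:
            if frag is not None:
                for i, byte in enumerate(frag):
                    rebuilt[i] ^= byte
        frags[missing[0]] = bytes(rebuilt)
    return b"".join(frags[:-1])[:orig_len]
```

Dropping any one of the k+1 fragments, as if a storage node went offline, still allows exact reconstruction; real deployments pair such codes with per-fragment encryption and parallel fetches, as described above.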

In summary, achieving performance parity with centralized clouds, while retaining the advantages of decentralization, is the hardest problem we’ve tackled at OORT.

Host (bms):

The Sino-U.S. rivalry in AI technology is widely discussed. How do you view the competition between China and the U.S. in AI? Under such geopolitical conditions, can decentralized AI gain greater policy support and development space? How can we mitigate risks arising from geopolitical tensions?

Dr. Li:

Globally, the two nations making enormous investments in AI are the U.S. and China. The U.S. holds the edge in original technology and foundational research — every wave from the internet and mobile internet to AI typically starts in the U.S. China’s strength lies in application and rapid commercialization. Its market and environment enable swift large-scale deployment of new technologies.

Sino-U.S. collaboration should ideally be a win-win situation: the U.S. provides groundbreaking tech while China excels at rapid implementation. Yet, geopolitical issues and trade wars have made this synergy more challenging.

In this environment, decentralized AI proves resilient. On the policy level, both China and the U.S. indirectly or directly support relevant technologies. For example, China promotes distributed computing pools, and the U.S. emphasizes decentralized storage and computing for national security reasons — both indirectly foster decentralized advancements.

The beauty of decentralized AI lies in its global nature. It doesn’t depend on one country’s policy. When individuals and organizations from all over the world build decentralized AI through open-source collaboration, this ecosystem cannot be easily stifled by any single geopolitical force. Just like Linux, an open-source OS that no single government can eradicate.

Thus, I believe decentralized AI has tremendous potential for development. Not only does it align with technological evolution, but its global, widely participatory nature ensures it won’t be severely impacted by geopolitical turmoil.

Host (bms):

You’ve given us a new perspective: decentralization is not just a technical matter but also a global ideological trend.

Recently, CZ also joined the discussion on AI and blockchain, noting that AI annotation, and AI data tasks more broadly, are well-suited to blockchain: they can leverage affordable global labor and pay contributors via crypto. In this context, how do you see OORT further advancing these ideas within the current decentralized AI ecosystem?

Dr. Li:

This is a great question, and the timing is perfect for us. As I mentioned, OORT has three core products: OORT DataHub, OORT Storage, and OORT Compute, corresponding to data collection and labeling, data storage, and computation. These three areas are precisely the core infrastructure for building AI models.

We just launched OORT DataHub this week, the world’s first decentralized AI data collection and labeling platform. This aligns perfectly with CZ’s vision. Using blockchain, we can leverage participants worldwide to gather and label data, paying them in crypto. This approach is geographically impartial and yields higher quality data at lower cost and greater efficiency.

Moreover, OORT is also a BNB Greenfield storage provider. All data collection processes are recorded on-chain to ensure transparency and immutability. This data is then automatically stored on OORT Storage or BNB Greenfield and provided to AI companies for training or fine-tuning models. Unlike centralized servers, where data can be easily deleted or manipulated, our decentralized storage ensures data integrity.
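The tamper-evidence that on-chain recording provides rests on content addressing: a canonical hash of each record is anchored publicly, so any later edit to the stored copy is detectable. A minimal sketch, with hypothetical record fields rather than OORT's actual schema:

```python
import hashlib
import json

def record_digest(record: dict) -> str:
    """Canonically serialize a labeling record and return its SHA-256 hex
    digest. Anchoring this digest on a public chain makes any later
    modification of the stored record detectable."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical labeling record; the field names are illustrative only.
record = {"task_id": "demo-1", "labeler": "node-42", "label": "crater"}
anchored = record_digest(record)
tampered = dict(record, label="dune")
assert record_digest(tampered) != anchored  # any edit changes the digest
```

Sorting keys and fixing separators makes the serialization deterministic, so the same record always yields the same digest regardless of field order.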

For payments, we use OORT’s native token. Our target markets include Africa, Latin America, and Southeast Asia, where the general public has a high acceptance of crypto, reducing educational overhead and increasing market penetration.

It’s worth noting we’ve already tested our approach. This summer, our community voluntarily used a decentralized process to help NASA label a large number of Martian surface images captured by the Curiosity rover between 2011 and 2012. This provided us with valuable experience and proves our solution is not just a concept but a viable, tested approach.

Moving forward, we will continue to capitalize on this first-mover advantage, enabling more users to directly use OORT’s products, to experience and contribute to building a decentralized AI future. I believe this is not only our mission but a key step in letting users genuinely engage with and become part of this global transformation.

Host (bms):

I’ve read several of your past interviews. You are an experienced engineer, professor, and inventor with over 200 patents, and you have published multiple academic research papers in IEEE journals. Among your research experiences, which do you think have been most beneficial for OORT’s current development? Or in other words, how have you successfully commercialized these academic achievements?

Dr. Li:

Some of the patents we hold are fundamental to our current products. We have two core patents being used in OORT’s offerings — one involves a “Proof of Honesty” (POH) consensus mechanism. This mechanism is crucial, and we have already partially integrated it into OORT’s products.

However, it’s important to note that there is often a significant gap between a paper or patent and a fully developed product. We must gradually translate the theoretical concepts from patents or papers into practical product features.

For example, in the OORT DataHub platform, data quality control is critical, and at its core we use the POH mechanism. Currently, we may have implemented only 30%-40% of the patent’s content, and more development and fine-tuning are needed for the rest. This iterative process is inevitable — rarely can a patent or paper be seamlessly converted into a final product without continuous refinement.

That said, OORT has a clear roadmap to gradually integrate the entire POH mechanism, ultimately empowering the decentralized AI space with a high-quality technical foundation. That’s the direction we are working toward.

Host (bms):

Thank you very much, Dr. Li, for sharing these fascinating insights into decentralized AI. We appreciate your time and expertise!

References:

Forbes: https://www.forbes.com/sites/maxli/

Past interviews:

https://cryptoslate.com/podcasts/max-li-emphasizes-blockchains-role-in-ai-trust-and-ethics-revolution/

https://www.youtube.com/watch?v=59UIxaWl46I

Recent news:

https://www.coinspeaker.com/oort-githon-technology-seal-3-year-deal-customer-satisfaction-ai-agent/

Official website: https://www.oortech.com/

Official foundation: https://www.oortfoundation.org/

Follow us
Twitter: https://twitter.com/WuBlockchain
Telegram: https://t.me/wublockchainenglish

Written by WuBlockchain

Colin Wu, Chinese journalist, won 2013 China News Award