Interview: The Round Trip
Compilation & Editing: Yuliya, PANews
As the AI wave sweeps across the globe at unprecedented speed, an "arms race" over computing power has already begun. With NVIDIA's market value surpassing one trillion dollars and giants like AWS and Google Cloud nearly monopolizing cloud computing power, a profound challenge now confronts all AI innovators: will the centralization of computing power stifle open innovation and lock the future of AI inside the "walled gardens" of a few dominant companies?
David and Daniel Laborman, co-founders of Gonka AI, have a track record that includes selling a company to Snapchat for $60 million and co-founding Product Science, which provides AI code optimization services to top enterprises. Drawing on serial entrepreneurial experience spanning parallel computing and AR, they offer a unique perspective on breaking this deadlock: building a fully community-driven, decentralized AI computing network.
In "Founder's Talk," a new series of The Round Trip co-produced by PANews and Web3.com Ventures, David and Daniel explained in detail why they draw inspiration from the history of Bitcoin's infrastructure build-out: they aim to replicate the "ASIC revolution" in the AI field through an open financial incentive framework and thereby break through the constraints of computing power costs. They also shared how Gonka AI attracted a $50 million investment from industry players such as Bitfury, and offered their perspective on the current "AI bubble" debate.

From Games and AR to Decentralized AI
PANews: Welcome, David and Daniel! We're very happy to have you here. I know you both have strong technical backgrounds and have been deeply involved in this field for many years. Could you first share your background stories with our audience?
Gonka AI: Hello everyone. First of all, we are brothers, and our lives and careers have always been closely intertwined. Our journey traces back to 2003, when we first became deeply interested in parallel computing and decentralized networks.
Later, we entered the online gaming industry, which is essentially a form of large-scale parallel computing: thousands of players interacting in real time over the Internet. To improve the efficiency of game animation production and reduce costs, we then dove into computer vision.
Computer vision then led us in an entirely new direction: we began developing AR virtual avatars for Snapchat. That work was very successful, and Snapchat eventually acquired our company for $60 million, a major turning point in our careers.
Throughout our work on different projects and companies, we have always held one aspiration: to create something that truly makes a significant impact on society, especially on the way people interact with each other. Everything changed when AI entered our lives in a completely new form: the large language model (LLM). It is no longer the machine learning we were familiar with in the past, but a powerful tool capable of real conversation that genuinely helps us solve problems. And we are seeing that the new generation of AI built on the Transformer architecture is much more than a language model. Whether it's image generation, video generation, breakthroughs in fields such as biology, chemistry, and physics, or even more efficient designs and operating methods for nuclear power plants, this wave of AI is influencing almost everything.
Next, we will also witness the rapid development of robotics software and self-driving cars, and these changes are happening right now.
But this brings a concern: not the sci-fi fear of "The Terminator," but a worry about the real-world situation. Currently, approximately 65% of global cloud computing power is controlled by three U.S.-based companies (such as AWS and Google Cloud). Adding China's Alibaba and Tencent, these five companies together control as much as 80% of the world's cloud computing power. The core of AI is computing power, and AI at this stage is almost synonymous with cloud computing power. These companies are competing fiercely to control 100% of AI computing power. If this trend continues, we will enter a very strange world:
Only a very few companies truly own and control all the AI, and these AIs will:
- Replace a large number of job positions
- Restructure the entire economic system
- Change the way society operates
Therefore, we believe that decentralized AI is a crucial and unavoidable issue.
This is why we eventually arrived at Gonka AI.
PANews: Indeed, you are not newcomers to the AI field. Before founding Gonka AI, you also founded Product Science, a company backed by renowned investors such as Coatue, K5, and Slow Ventures. Can you talk about that experience and how it led you to eventually start Gonka?
Gonka AI: Of course. Computer vision, our previous focus, is essentially AI and machine learning. Some of the earliest practical applications of AI were in areas such as image generation and animation production, and that work established our reputation in the machine learning industry.
After leaving Snap, we founded Product Science. This company uses AI to provide code optimization services for top global companies such as Walmart, JPMorgan Chase, and Airbnb. While it's now well known that AI can help write code, it's equally important to ensure that the code runs efficiently. Before we shifted our full focus to Gonka and the decentralization of AI infrastructure, improving code performance was the core of our business.
Gonka AI's "Bitcoin"-style Vision
PANews: You mentioned the issue of centralized computing power, which is indeed concerning. Recently, Cloudflare's large-scale outage paralyzed half of the crypto world, and AWS also experiences frequent outages, each time impacting numerous applications. How will Gonka AI address this? It seems to be not a general-purpose decentralized cloud, but something more focused on the AI field.
Gonka AI: Yes, in the face of the current highly centralized computing power dilemma, the only way forward we see is decentralization.
At the model level, independent labs like DeepSeek have demonstrated that they are fully capable of training high-quality models that rival those of the tech giants. However, computing power remains the core bottleneck. Currently, many cutting-edge labs rely on infrastructure built by large cloud service providers, while in the decentralized domain, solutions of comparable scale have yet to emerge. Even the largest decentralized AI computing network today, Bittensor, has only about 5,000 data center-grade GPUs, while companies like OpenAI and xAI are building massive clusters with millions of top-tier GPUs. The scale gap between the two is enormous.
We realized that the only way to ensure AI truly belongs to the people, and to avoid single points of failure, is to build a decentralized computing network of comparable scale. Here we drew great inspiration from Bitcoin. We view it not merely as "digital gold," but as one of the greatest frameworks ever devised for building large-scale infrastructure.
Over the past 15 years, the Bitcoin community has built an incredible infrastructure in a decentralized manner. Today, the Bitcoin network has a data center-scale capacity of about 26 gigawatts, which even exceeds the combined total of Google, Amazon, Microsoft, OpenAI, and xAI. This is a massive project jointly constructed by countless independent participants around the globe, all striving to move away from centralized systems.
Equally impressive is the speed of innovation in hardware. Over those 15 years, the energy required per terahash of Bitcoin computation has dropped from about 5 million joules to just 15 joules, an astonishing improvement of roughly 300,000-fold. We believe that if the same transformation can be brought to AI computing power, true "computing abundance" will become possible, making AI accessible to every person on Earth.
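The magnitude of that efficiency gain is easy to verify with a quick back-of-envelope calculation. The two energy figures come from the interview itself; the script is purely illustrative:

```python
# Back-of-envelope check of the Bitcoin mining efficiency gain cited above.
early_j_per_th = 5_000_000   # ~5 million joules per terahash in the early GPU era
modern_j_per_th = 15         # ~15 joules per terahash on modern ASIC miners

improvement = early_j_per_th / modern_j_per_th
print(f"Efficiency improvement: ~{improvement:,.0f}x")  # prints "Efficiency improvement: ~333,333x"
```

The exact ratio is about 333,000x, consistent with the "roughly 300,000-fold" figure quoted in the interview.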
PANews: I noticed that Bitfury, an early Bitcoin infrastructure giant, has just announced a $50 million investment in you. Does this mean the market is seeing a similar pattern? Bitcoin made energy "interchangeable": whether the energy is in Siberia or Silicon Valley, it can be converted into homogeneous computational value. Are you making computing power "interchangeable" as well? Given that AI is very sensitive to latency and other factors, will this be a challenge?
Gonka AI: We believe a similar story will unfold in the computing power domain. Currently, NVIDIA's chips are extremely expensive, and the majority of the data center construction costs at companies like OpenAI are paid to NVIDIA. However, if we can replicate the transition to ASICs (application-specific integrated circuits) in the AI field, the landscape will look very different.
Once the hardware cost per computing unit drops significantly, energy costs will again become the key variable. Early mining companies and hardware manufacturers like Bitfury are now investing in this ecosystem, which is a strong signal: they recognize the same pattern they saw during Bitcoin's early development.
Looking back to 2012, GPUs were still the mainstream mining equipment. However, just a few years later, ASICs became the only viable mining option, thanks to their efficiency, which was dozens of times greater than that of general-purpose chips. Remarkably, the companies that developed these ASICs were not large tech giants, but rather obscure startups. This was made possible entirely by the financial incentive structure of Bitcoin:
- Open competition: No matter who you are, if you can supply the most effective computing power to the network, you receive the largest share of token rewards.
- Positive cycle: As the token price rises, the rewards become more attractive, motivating more people to join the competition and increasing the network's total hash power.
- Lower barriers to innovation: A small company in South Korea or San Francisco that designs a more efficient chip needs no large sales team, no relationships with industry giants, and perhaps not even traditional investors. It simply connects its chips to the network; once proven effective, it starts generating profit immediately.
This framework dramatically lowers the barriers and complexity of the "computing power production" business. We firmly believe a similar scenario will unfold in the AI chip industry. Once the protocol is established, people will be able to earn money by connecting their computing devices to the network, whether their own computers, purchased NVIDIA GPUs, or computing power rented from data centers, contributing resources and receiving rewards in return. We expect this financially driven innovation to bring hundreds or even thousands of times more computing power to the AI network within the next one to two years, completely breaking through today's computing power bottleneck.
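The incentive loop described above boils down to a simple rule: each participant's share of an epoch's token rewards is proportional to the effective compute it contributes. The sketch below illustrates that proportional split only; the function name, participant labels, and numbers are hypothetical and are not Gonka's actual protocol:

```python
# Toy model of proportional reward splitting: each compute contributor receives
# a share of the epoch's token reward equal to its fraction of the network's
# total contributed compute. All names and figures here are made up.

def reward_shares(contributions: dict[str, float], epoch_reward: float) -> dict[str, float]:
    """Split epoch_reward in proportion to each participant's contributed compute."""
    total = sum(contributions.values())
    return {name: epoch_reward * c / total for name, c in contributions.items()}

# A hypothetical epoch with three very different participants.
shares = reward_shares(
    {"data_center": 70.0, "gpu_startup": 25.0, "hobbyist": 5.0},
    epoch_reward=1000.0,
)
print(shares)  # {'data_center': 700.0, 'gpu_startup': 250.0, 'hobbyist': 50.0}
```

Because rewards depend only on contributed compute, a participant that doubles its effective compute immediately doubles its share, with no sales team or negotiation required: the "open competition" property the Bitcoin analogy relies on.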
How Are Decentralized Networks Reshaping the Computing Power Market?
PANews: This model is quite interesting and reminds me of the early days of cryptocurrency mining, when miners used idle GPUs in schools. Now, many companies have purchased expensive H100 GPUs that sit idle most of the time because they don't know how to fully utilize them. Has your network attracted users like these?
Gonka AI: We have indeed encountered many similar and even more exciting cases. Some very successful AI startups purchased hundreds of H200 GPUs with investors' money during the early hype, but to this day only half of them have been effectively utilized.
Another more common scenario: many companies are already renting computing power from large data centers to run open-source models. Through our network, they found they could do something smarter. Instead of running the models themselves inefficiently, they use the same service via the Gonka network's API, while installing Gonka nodes on the GPUs they rent and contributing them to the network. This way they not only use AI models but also earn token rewards at the same time, achieving significantly higher efficiency and returns than before.
To efficiently utilize GPUs, you need to process thousands or even millions of requests simultaneously, which is extremely challenging for a single project. As a result, companies either suffer from low hardware utilization (whether owned or rented) or pay expensive API fees—neither of which is optimal. Joining the network and becoming part of the ecosystem is a better solution.
Many participants in our network do not have only "idle" computing power. For example, data centers like Gcore and Hyperfusion are themselves efficient commercial operators with limited idle capacity. However, in the past few months, they have found that connecting their GPUs to the Gonka network generates higher returns than directly leasing them to customers, as they gain exposure to the value created by the network's growth. As a result, they have started gradually transferring hundreds of GPUs from their leasing business into our network.
This is precisely the key reason why the network can scale from thousands to millions of GPUs. Although industry giants like OpenAI have purchased the majority of GPUs available in the market, millions of GPUs remain distributed among these independent participants. Individually, they cannot compete, but collectively, they can form a formidable force.
This logic also applies at the national level.
A year ago, when we communicated with governments of some countries, their mainstream view was, "We need to build our own clusters and develop sovereign AI."
A year later, when we met again with ministers from countries such as the UAE and Kazakhstan, they all clearly recognized that as independent players with only a small number of GPUs, they would be unable to compete with the giants.
But if they jointly join a large, trustworthy decentralized network, each can fully maintain its own sovereignty, because each can trust the decentralized network itself.
The Debate Over the AI Bubble: A Wave of the Times, or the Collapse of a Specific Bet?
PANews: There is no doubt that the field of AI is experiencing tremendous enthusiasm and rapid growth. But under the high expectations of investors and users, are we heading toward an "AI bubble"? Many people compare it to the dot-com bubble of 2000.
Gonka AI: This is an interesting question. Look back at the dot-com bubble of 2000: although the bubble burst at the time, what has the world become 25 years later? The Internet was a genuine technological revolution, and the economic transformation it brought was real. The companies of that era have grown into trillion-dollar giants and completely changed our lives.
Compared to the Internet, the changes brought by AI will be even more radical and thorough. Imagine that in the next 30 to 50 years, everyone has a personal robot that can work in a factory on their behalf. This is not science fiction but a reality that is about to arrive. So when investors pour hundreds of billions of dollars into this technology, it is not irrational.
Of course, there will certainly be failed investments along the way, just as has happened in the venture capital industry over the past 30 years, during which a large amount of capital was lost. However, viewed as a whole, the returns from this field have been extremely substantial, and it has genuinely transformed the world.
Therefore, whether it's a bubble or not depends on your perspective. Some companies may go bankrupt due to wrong assumptions. For example, Gonka's judgment about the feasibility of decentralized AI might be wrong; conversely, all the investments placed on NVIDIA today could also turn out to be a huge bubble.
History has played out a similar scenario. In 2012, due to the narrative around cryptocurrencies, NVIDIA's stock price surged significantly, as the market believed the company would dominate the mining market. However, the ASIC revolution soon followed, and NVIDIA nearly lost this market entirely. Now, AI is bringing even greater value growth to NVIDIA, as the market expects it to be a massive industry worth trillions of dollars. This expectation might be correct, but no one can guarantee that NVIDIA will maintain its dominant position forever. What would happen if an ASIC revolution were to occur in the AI field again?
Imagine trying to rebuild the entire Bitcoin network's computing power today, but instead of using ASIC miners, you used NVIDIA's latest Blackwell chips. You would need to invest 50 quadrillion U.S. dollars! This is clearly unsustainable.
Therefore, what we may be discussing is not a "bubble in AI" itself, but rather a bubble formed by overconfidence in specific companies and particular technological pathways. If the market's assessment of NVIDIA turns out to be wrong, it could result in losses for five to seven companies with trillion-dollar valuations. However, this would not necessarily mean that AI itself is a bubble. AI technology will not disappear, nor will its ongoing transformation of life and business. What may change are the companies that carry these values.
PANews: I completely agree. Just as we no longer say "I am using the Internet" but rather "I am using an app" that happens to use the Internet, in the future every application will use AI in some form. It will become so ubiquitous that we won't even notice its presence.
Gonka AI: Absolutely. If you look at the NASDAQ's candlestick chart from its inception to the present, the "major crisis" of 2000 was just a minor ripple in decades of long-term growth. At the time, people believed all products would be sold online within five years. That didn't happen in five years, but it did within 15.
The same applies to artificial intelligence. A future where robots are everywhere might not happen within five years, but it is almost certain to occur, and no force can stop it. From this perspective, a surge in our future demand for computing power by thousands of times is inevitable. What we need is a long-term economic model, designed for the next several decades, to support this vision—just like Bitcoin.

