Odaily Planet Daily report: On March 16 local time, at NVIDIA GTC 2026, NVIDIA founder Jensen Huang laid out the company’s comprehensive vision for the future of the AI industry—from next-generation AI computing architectures and data-center business models for the inference era, to software ecosystems and industry alliances built around agents. The event showcased not merely upgrades to individual hardware products, but an integrated AI infrastructure system centered on computing power. In his speech, Huang boldly predicted that by 2027, the market for AI chips and infrastructure could reach $1 trillion.
Beyond the technical announcements, Huang proposed a new narrative for the AI industry: “Data centers are factories that produce tokens; inference is the workload, tokens are the new commodity, and computing power equals revenue. In the future, every CEO will need to monitor the efficiency of their token factory.” In his view, AI development is entering a new inflection point: from chatbots, to systems with reasoning capabilities, and now to task-executing agents. Each leap in capability significantly increases the computing power required per inference while simultaneously driving rapid growth in overall usage.
Based on this trend, NVIDIA introduced a new tiered AI service model, ranging from a free tier to an Ultra tier, with each tier corresponding to different model sizes, context lengths, response speeds, and token prices. Under this system, computing infrastructure directly determines the economic viability of AI services, and more advanced AI services require more powerful computing platforms. (AIPress)
