Cloud services' middle game starts with AI applications

October 8, 2024

From last year's battle of large models to this year's competition in AI applications, cloud services are entering a new stage of development. AI acts as a lever to further advance the competitive strategies of various vendors.

How many possibilities can “cloud + AI” incubate? In the industry's view, “cloud + AI” implies new suspense: In what direction will cloud computing evolve? How will the market landscape of large models change? What role will cloud service giants play in this? Can large models and AI applications become the second growth curve for cloud service providers?

A series of questions await their answers.

AI Applications Become the “Main Event”

In the past two years, AI applications represented by large models have gained momentum in China. According to public data, over 190 large models have been registered and launched in China to provide services to the public, with more than 600 million registered users.

A recent survey by Accenture revealed that 59% of Chinese enterprises plan to increase investment in digital transformation within the next year, an increase of 6 percentage points from last year. Meanwhile, many Chinese enterprises hope to leverage AI technology to continuously innovate and accelerate transformation. This not only indicates the potential demand for AI services across industries but also highlights the significant commercial value of large models.

For cloud service giants, key players in the large model space, these trends signal a new business logic: large model applications consume significant computing power, driving cloud service revenue growth. At the same time, large models enhance software functionality and user experience, boosting software revenue and surfacing new user needs, which in turn further mature the models.

The deployment of large models does more than help industry clients pursue AI-driven business innovation and upgrades; unlike previous technology iterations, it drives both infrastructure reconstruction and upper-layer application change. Looking back at the history of IT, before the advent of large models the industry was dominated by the “CPU + OS + software” combination. Later, as virtualization of computing, networking, and storage matured, computing power became a crucial resource, and cloud computing transformed the capabilities and forms of infrastructure.

In today's AI era, this combination has been replaced by “GPU + cloud + AI.” With GPUs providing the computing power for large models, cloud service providers must reconstruct their service architectures from the bottom layer up to the application layer, while large models reshape everything from research and development to application. Meanwhile, MaaS (Model-as-a-Service) has become an essential new component of cloud architectures, attracting fierce competition among cloud service providers and driving significant changes in their technology systems. As cloud services integrate deeply with large models, cloud service providers will use the cloud model to promote large model servitization and application scalability.

As a result, cloud service giants like Alibaba Cloud, Tencent Cloud, and Baidu Intelligent Cloud, after lows in recent years, have made AI a focal point for future development, reshaping their technology businesses to drive performance growth, and their financial reports show AI taking a growing share of revenue. In August 2024, Alibaba Group's quarterly results showed that cloud and AI group revenue grew 6% year-on-year to RMB 26.549 billion, with AI-related products achieving triple-digit year-on-year growth. Baidu's Q2 2024 report showed intelligent cloud revenue of RMB 5.1 billion, up 14% year-on-year, with AI accounting for 9% of that revenue, up from 6.9% in the previous quarter.

Cloud service giants' deep involvement and heavy investment in AI have gradually cleared the smoke from last year's large model battle, and AI applications driven by large models are now taking center stage. This marks the emergence of a new cloud service ecosystem centered on MaaS platforms and AI-native applications, in which AI-transformed application software will occupy a more prominent position.

Due to the “Matthew Effect” in the cloud industry, underlying resources like data, computing power, and storage will be dominated by a few cloud service giants. Commercial opportunities in cloud services are gradually shifting upwards, towards PaaS, MaaS, and SaaS layers, particularly SaaS, which will undergo fundamental changes in both R&D and business models.

Currently, cloud service giants are upgrading and transforming their products with large models while offering them to different industries to help clients and partners rapidly develop AI applications.

For example, hundreds of Tencent products have integrated Tencent's Hunyuan large model, expanding the paid user base through intelligent upgrades. In Q1 2024, Tencent Meeting revenue doubled year-on-year, while WeCom (WeChat Work) revenue surged 200% year-on-year. This indicates that China's cloud industry, propelled by the AI wave represented by large models, has entered a new stage, and cloud service providers have embarked on an elimination race around large models and applications. Against this backdrop, the giants are gearing up for another round of intense competition.

From “Integration” to MaaS

In recent years, domestic cloud vendors, including Alibaba Cloud, were bogged down in customized projects. Although Alibaba Cloud's public-cloud-first strategy may dent government-related business in the short term, it will benefit Alibaba Cloud in the long run.

Similarly, Tencent Cloud announced its “integration” strategy in 2022. Years earlier, Pony Ma stated that Tencent would entrust half of its fate to partners.

In the short term, serving as the prime integrator benefits cloud vendors by securing large contracts. This approach not only explores specific vertical scenarios during the initial growth phase but also enhances industry know-how accumulation. However, this model offers limited profitability. In the long run, being the prime integrator impacts both profitability and business model development for cloud vendors. Consequently, over the past few years, vendors have been continuously investing, leveraging funds, cloud computing discounts, and training to accelerate the growth of their ecosystem partners. Nowadays, this integration model has evolved, with MaaS emerging as a new focus.

Historically, domestic cloud vendors' core competencies largely centered on IaaS-level “computing, storage, and networking,” while their PaaS-level cloud service capabilities lagged behind international vendors such as AWS. Yet cloud vendors' true value lies in the PaaS platform: “leveraging the strengths of IaaS, PaaS, and SaaS to create modular capabilities that can be reassembled on the platform, ultimately fostering business and platform innovation.”

This is the essence of the industry cloud platform, as stated in Gartner's Top 10 Strategic Technology Trends for 2023. The keyword here is “modularity,” emphasizing cloud vendors' need to harness their PaaS platform capabilities, which explains why domestic cloud giants have recently advocated for “integration.” Nevertheless, a challenge lies in the fact that not all cloud vendor capabilities have transitioned to PaaS, limiting full-service capabilities for ecosystem partners. Moreover, in crucial areas, cloud vendors must develop their products (e.g., databases) to ensure high profitability, creating a conflict between full PaaS adoption and the ecosystem's needs.

AI large models, or MaaS, are subtly transforming this dynamic. So far, cloud computing giants like Alibaba Cloud, Huawei Cloud, Tencent Cloud, Baidu Cloud, and JD Cloud have launched MaaS services. MaaS's incremental value for cloud vendors manifests in two ways: Firstly, through API invocation services, where vendors can charge based on usage or time. Secondly, by providing AI computing power for training and running large models.

As domestic model parameters improve and vertical model adoption surges, demand for model training is also growing. Within the MaaS ecosystem, partners can start from a common AI service baseline, mitigating previous product conflicts. Furthermore, with strong profitability drivers, cloud vendors can offer clearer and more comprehensive service guarantees to ecosystem enterprises.

In contrast to previous IaaS and PaaS efforts, MaaS offers cloud vendors a new growth avenue. Critically, its greater value to the ecosystem lies in the model capabilities distilled from data, which bring clients, developers, startups, and ISVs together to raise efficiency and cut costs. From standalone PaaS to MaaS, cloud vendors' ecosystem strategies are evolving.

The Cloud Industry Embarks on an “Elimination Race”

For any industry, a “price war” is a direct and effective strategy, and cloud computing is no exception. The industry is capital-intensive, and enterprises' annual IT budgets are largely fixed. The market is in a state of stock competition: any cloud provider that wins more clients and larger contracts inevitably eats into competitors' performance.

Cloud giants believe that price cuts can raise cloud computing penetration across industries, expand the user base and loyalty, and push non-internet industries from basic cloud adoption to deep integration. The resulting economies of scale lower marginal costs and lift profits, funding new resource procurement and R&D and ultimately consolidating competitive advantage. The same logic applies to AI large models.

In May 2024, cloud service giants slashed large model invocation prices: ByteDance's flagship Doubao model was priced at RMB 0.0008 per thousand tokens in the enterprise market, 99.3% below the industry average; Alibaba Cloud's Tongyi Qianwen flagship Qwen-Long, billed as GPT-4-class, costs RMB 0.0005 per thousand input tokens (a 97% reduction) and RMB 0.002 per thousand output tokens (a 90% drop)... Such drastic cuts make it hard for large model startups to compete.
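To make the per-thousand-token arithmetic concrete, here is a minimal sketch of usage-based invocation billing using the list prices quoted above. The function name and the assumption that Doubao's single quoted rate applies to both input and output tokens are illustrative, not taken from any vendor's actual billing API.

```python
# Hedged sketch: token-based billing arithmetic with the article's list prices.
# Rates are RMB per 1,000 tokens, as (input, output) pairs.
PRICES_RMB_PER_1K = {
    "qwen-long": (0.0005, 0.002),   # Alibaba Cloud Qwen-Long (input, output)
    "doubao":    (0.0008, 0.0008),  # ByteDance Doubao: one quoted rate,
                                    # assumed here to cover both directions
}

def invocation_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the cost in RMB of one invocation under per-token pricing."""
    in_rate, out_rate = PRICES_RMB_PER_1K[model]
    return input_tokens / 1000 * in_rate + output_tokens / 1000 * out_rate

# Example workload: 1M input tokens and 200k output tokens on Qwen-Long
cost = invocation_cost("qwen-long", 1_000_000, 200_000)
print(f"{cost:.2f} RMB")  # 0.50 (input) + 0.40 (output) = 0.90 RMB
```

At these rates, even a million-token workload costs well under one yuan, which illustrates why providers expect volume growth, not per-call margin, to spread their computing costs.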

Evidently, cloud giants aim to expand their markets and user bases through price reductions, a long-term strategy. By lowering large model invocation prices, they hope to stimulate increased invocations, spreading computing costs and gradually generating profits. Moreover, continuous price reductions attract developers and ecosystem partners. During this process, other large model service providers may falter due to pricing and cost pressures, leaving only a handful of major players in the market—precisely the goal of cloud service providers in driving this “elimination race.”

However, this also exposes the room for improvement in cloud giants' large model strategies, and the costs are daunting: OpenAI's ChatGPT incurs annual R&D, training, and staffing costs running into billions of dollars.

This underscores the high technical and financial barriers of large models, which demand continuous upgrades and investment and carry significant risk. Domestic tech giants' large model platforms currently look much alike, with pronounced homogenization. The crucial question is how much further growth AI large models can actually drive for cloud service giants. As investment intensifies, will AI deliver the rising returns these giants anticipate? These remain open questions.

An ideal large model ecosystem should focus on intermediate services like computing power, data, and large model training, or on industry applications derived from large models that genuinely create value, while relying on one or two robust foundational large models. However, an original foundational large model comparable to ChatGPT has yet to emerge, let alone upper-layer applications and services.

Cloud service giants now face mounting uncertainties, with true AI large model challenges still ahead. Global tech giants like Amazon, Microsoft, and Google have launched large model products through investments or in-house development, charting a clear path in cloud, software, AI application, and cloud service integration. Domestic large models and AI application services are still in their infancy, with ample room for market growth and industrial maturity. Competition is fierce. Under these circumstances, how can we cultivate a rigid demand user base for large models? How should we define their business boundaries? How can large models become a stable and sustainable growth engine?

How can we ensure privacy and data security as large models are implemented across industries? Addressing these issues will attract more entrepreneurs, developers, partners, and enterprise users, turning large models and AI applications into new engines for intelligent upgrades across industries, unleashing new momentum and advantages for innovation and development.


[Original Reporting by Tech Cloud News]
