Deadlock on AI: Feishu Half a Jin, DingTalk Eight Liang

June 28, 2024

DingTalk has gathered the "Seven Dragon Balls," while Feishu boasts having the "Five Tigers."

Written by | Lan Dong Business, Zhao Weiwei

Halfway into 2024, Feishu has not held its spring "Future Unlimited" conference as it did in previous years, while DingTalk has already held two conferences centered on AI. In the past, these twice-yearly industry conferences aimed at the B-end market were a rhythm Feishu and DingTalk shared.

DingTalk's pace is clearly accelerating. The first conference came at the beginning of the year, when DingTalk's Super Assistant opened up to all scenarios, with Alibaba's large model Tongyi Qianwen as the underlying layer. Most recently, DingTalk has brought in six large model vendors - MiniMax, Moonshot AI, Zhipu AI, OrionStar, 01.AI, and Baichuan Intelligence - in an attempt to create the most open AI ecosystem and give users the right to choose.

DingTalk has gathered the "Seven Dragon Balls," while Feishu boasts having the "Five Tigers."

Around the same time, Feishu's customer advertisements appeared in airports. The list includes the large model "Five Tigers" that overlap with DingTalk's lineup, namely MiniMax, Moonshot AI, Zhipu AI, 01.AI, and Baichuan Intelligence.

Feishu emphasizes that these large model companies, MiniMax among them, all use Feishu internally and are themselves Feishu customers. Feishu has a high penetration rate in the AI large model industry and has poached star customers like Pop Mart from Enterprise WeChat.

DingTalk, for its part, places more value on dancing with large model vendors, starting from customer scenarios to explore the most practical ways to put large models to work. As DingTalk President Ye Jun puts it, "We start from the nail, not the hammer."

For large model vendors, DingTalk is more like an "amplifier" that magnifies their value, because DingTalk has more than 700 million users and over 25 million organizations, which means a huge customer base and a wealth of B-end application scenarios. As these personalized scenarios and demands are continuously met, large models get used more and more, which in turn lowers the threshold for adopting them.

On one side, the capabilities of large models keep improving, and AI is becoming an underlying utility like water, electricity, and coal. On the other, collaborative office software, the largest testing ground for productivity, is evolving along its own path toward intelligence.

Previously, "Lan Dong Business" proposed in the article "DingTalk, Feishu Wrestling with AI, Pressuring Enterprise WeChat" that 2024 will become a dividing line in the evolution history of DingTalk, Feishu, and Enterprise WeChat. Enterprise WeChat is rushing ahead silently on the path of achieving profitability first, while DingTalk and Feishu are continuously satisfying AI application scenarios within their own ecosystems on AI, bringing productivity improvements to B-end enterprise customers.

Half a year on, the divergence between DingTalk and Feishu has become apparent. Feishu President Zhang Nan resigned, while DingTalk fully opened up its underlying models. At least in embracing AI, today's Feishu and DingTalk are "half a jin and eight liang": evenly matched.

Validating Customer Value

At the DingTalk Ecosystem Conference, "customers" was one of the words Ye Jun used most often. He drew a clear boundary: DingTalk's responsibility is not to innovate on large models but to explore practical application scenarios for them.

Seen this way, DingTalk's gathering of the "Seven Dragon Balls" of large model vendors can be understood as the customers' choice: the B-end market is vast, and it is difficult for any single company to fully satisfy customers across different industries. Only a diversity of services can meet such complex market demand.

DingTalk now has more than 5,600 ecosystem partners in total; among them, AI ecosystem partners exceed 100, spanning AI Agent products, AI solutions, and AI plugins in addition to AI large model partners. DingTalk has helped bring more than 700 AI assistants onto its marketplace. One key data point: DingTalk's AI is called more than 10 million times a day.

But in Ye Jun's view, 10 million is not yet a large number, which suggests far more room to grow. Hence DingTalk's announcement that it is open to all large model vendors: after all, large-scale application in real scenarios is the only way to validate the value of large models and the road that leads to AGI.

Openness is also a competitive necessity. Last November, Feishu released "Feishu Smart Partners," an open AI service framework that lets enterprises choose suitable underlying large models for their own business scenarios; it was said to support Baichuan Intelligence, Zhipu AI, MiniMax, and others.

Feishu indeed has a high penetration rate in the AI industry.

An industry insider at a large model company revealed that the AI companies they have worked with over the past two years all use Feishu for office work, and that Feishu's AI ecosystem is very open, supporting not only Coze (Kouzi), the C-end large model development platform, but also Dify, the open-source B-end platform for developing large language model applications.

Observant AI developers have already noticed that ByteDance's development platform Coze (Kouzi), which previously supported only the Doubao large model, has recently added mainstream domestic models such as Tongyi Qianwen, MiniMax, and Kimi to its underlying options, giving users a richer selection of large models. Opening up the model layer is clearly a rhythm that DingTalk and Feishu now share.

At the moment, DingTalk has a larger enterprise customer base and needs correspondingly richer forms of cooperation to satisfy those customers' demands for large models.

For example, in education scenarios DingTalk's cooperation with Moonshot AI builds on the Moonshot model's long-text understanding and generation capabilities, while its cooperation with MiniMax taps MiniMax's voice and music capabilities to meet users' songwriting needs. In short, DingTalk matches the distinctive strengths of different large models to different products and scenarios.

AI assistants (AI Agents), meanwhile, are an important vehicle for DingTalk: they essentially help enterprises find suitable AI application scenarios and build their own AI assistants. When enterprises build such assistants, beyond the default Tongyi Qianwen model they can also choose underlying models from the six vendors, such as MiniMax.

Of course, the most important and difficult task is to build private large models for enterprise customers and truly integrate AI capabilities into the business, which is the key to validating customer value.

An enterprise's private data is part of its core competitiveness. The third cooperation model DingTalk proposes is to work with large model vendors to tailor intelligent solutions to each customer's specific scenarios and needs, offering services such as model training and optimization, AI solution design, and custom AI application development, up to and including private deployment of models.

Therefore, exploring the landing of large model applications will be a continuous action for Feishu and DingTalk for a long time in the future, which will greatly test their customer value and service capabilities.

Search Remains King

The large model industry has made rapid progress for a year and a half, and AI search remains the most certain direction.

Driven by ChatGPT, Microsoft became the earliest leader in commercializing generative AI. In the first quarter of this year, Microsoft's revenue grew 17% year-on-year, with search and advertising revenue up 12% year-on-year, as AI improved the user experience of Bing search and the Edge browser and lifted their market share.

On the other hand, AI assistants are also an important tool for advancing intelligent collaborative office work. Microsoft's Copilot generative AI assistant has been embedded across Microsoft's entire product line, providing value-added services to users. Driven by AI, Office 365 revenue grew 15%, and paid subscribers now total 80.8 million.

It is not hard to see that, whether in the form of a search box or an AI assistant, this is essentially the evolution of human-computer interaction in the AI era, and both ultimately serve the same end: efficiency. Feishu and DingTalk are moving toward that same goal.

Feishu has recently been promoting "Search Magic," which essentially turns multi-dimensional tables into a search tool: without writing code, users can retrieve results from the enterprise's vast pool of internal information and have that information structured and tagged.

The pursuit of efficiency is the goal of enterprise evolution. Previously, Pop Mart was a star case of Enterprise WeChat, relying on Enterprise WeChat to establish a large number of official communities. Private traffic within the WeChat ecosystem became one of the focuses of its sales channel expansion. However, two years ago, Pop Mart switched its internal collaboration software to Feishu.

Now Pop Mart uses Feishu for store management and runs its omnichannel e-commerce operations through it. Efficiency tools such as Feishu's multi-dimensional tables deliver the most visible improvements in Pop Mart's sales and conversion data.

Besides building the most open AI ecosystem, the other important change announced at the DingTalk Ecosystem Conference is a brand-new AI search enabled in version 7.6. In essence, AI assistants are meant to solve the problem of DingTalk having too many functions, while AI search is meant to solve the problem of information on DingTalk being scattered and hard to navigate.

"In the AI era, it's easy to search for knowledge from all mankind, but hard to find the knowledge accumulated by enterprises or individuals," said Ye Jun at the DingTalk Ecosystem Conference.

Ye Jun demonstrated the new search capability on the spot, using the query "What are some of the complaints and customer feedback DingTalk has received recently?" as an example. DingTalk quickly generated an answer from work chats and documents and surfaced the related content.

Compared with public AI search, DingTalk's AI search more directly showcases its ability to work with an enterprise's own private data; improving accuracy and efficiency is one of the key goals of workplace search.

Unlike traditional search, DingTalk's AI search significantly strengthens memory and planning. Drawing on large models' understanding, reasoning, and generation abilities, it supports natural-language queries, organizes and summarizes fragmented information, and presents users with a clear knowledge network. Search results can also be automatically arranged into DingTalk mind maps and outlines with a clear information structure.
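What the paragraph above describes is, in broad strokes, a retrieve-then-summarize pattern common to enterprise AI search. The sketch below is a generic, heavily simplified illustration of that pattern under assumed inputs; it is not DingTalk's actual implementation, and the sample corpus, the naive keyword-overlap retriever, and the summarize_with_llm stub are all hypothetical stand-ins.

```python
# Generic retrieve-then-summarize sketch for enterprise AI search.
# Illustration of the general pattern only; NOT DingTalk's implementation.
# The corpus, scoring, and the LLM call below are hypothetical stand-ins.

from typing import List

# Hypothetical fragments drawn from work chats and documents.
CORPUS = [
    "Customer A complained that search results in the old version were hard to find.",
    "Weekly report: two customers asked for better mind-map export.",
    "Release notes for version 7.6 draft.",
]

def retrieve(query: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Rank fragments by naive keyword overlap with the query
    (a stand-in for a real semantic/embedding retriever)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def summarize_with_llm(query: str, fragments: List[str]) -> str:
    """Stub for the large-model step that would organize retrieved
    fragments into an outline or mind map; replace with a real model API."""
    bullets = "\n".join(f"- {f}" for f in fragments)
    return f"Q: {query}\nRelevant fragments:\n{bullets}"

if __name__ == "__main__":
    question = "What complaints and customer feedback have we received recently?"
    hits = retrieve(question, CORPUS)
    print(summarize_with_llm(question, hits))
```

In a production system the retrieval step would search permissioned enterprise data rather than a hard-coded list, and the summarization step would be a real large-model call that produces the outline or mind-map structure the article mentions.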

Seizing the Brief Monetization Window

"There will be no significant changes in the large model industry in the short term. This is a relatively optimistic period for application-level startups. The application layer of large models will see an explosion in the next year," believes Zhu Xiaohu, managing partner of GSR Ventures.

Last year, Zhu Xiaohu and Fu Sheng disagreed over the prospects of ChatGPT-style entrepreneurship and sparred publicly from a distance. Now the two increasingly appear together at large model forums for peaceful dialogue.

On where large models go next, the two share a common judgment: the technology's iteration curve has slowed markedly, and in the short term OpenAI will not release a GPT-5 with dramatically stronger capabilities.

For domestic players, this means a relatively brief window in which to commercialize and monetize.

As an investor, Zhu Xiaohu has always cared about how large models make money. He believes large model APIs in China will inevitably become free, and that in five years there will be no standalone large model companies, only front-end application companies and back-end cloud service companies. For AI entrepreneurs, he therefore still recommends focusing on concrete landing scenarios: building products in vertical industries that truly understand customer needs.

This also highlights DingTalk's importance in the current market: working with large model vendors to find the most practical application scenarios. One especially important consensus is that, with AI large models still far from widely adopted, the B-end market may offer more opportunities than the C-end market.

"DingTalk has the largest customer base and scenarios. Our cooperation with DingTalk is a process of value amplification," said Zhang Fan, COO of Zhipu AI. As models are used more and more, it means further lowering the threshold for AI use, making it easier for users to gain experience in business scenarios, and spreading out costs, gradually enabling profitability.

Well before ByteDance's Doubao model kicked off a price war with its own cuts, Zhipu AI had already lowered the price of its entry-level GLM-3-Turbo from 5 yuan per million tokens to 1 yuan, an 80% reduction. Over the past six months, Zhipu AI's daily API consumption has nonetheless grown 50-fold. Zhang Fan's view is that "essentially, the application layer has ever higher requirements for model quality."

But before they can capture this brief monetization window, the primary issue for large model companies is survival, which is one reason they embrace Feishu and DingTalk.

After extensive contact with large model vendors, DingTalk Vice President Wang Ming came away with a shared judgment: model-level costs will fall significantly, and hardware costs will drop by 90% within the next few years. Large models will become very low-cost infrastructure, like water, electricity, and coal. "As long as you can hold on until then, you can make money."

In Wang Ming's view, therefore, solving user problems is fundamental. The cooperation between DingTalk and large model vendors is not only about generating revenue but also about validating, in broader scenarios, whether the model holds up. "As long as you can validate a business model now, stay at the table, and remain in the first tier over the long run, you will naturally be the one reaping the harvest when costs come down."

Wang Ming also pointed to a coming trend: starting in July this year, domestic large model vendors will release a wave of multimodal large models.

The landing of large model applications thus remains the industry's expectation, and for today's DingTalk and Feishu, working with large model vendors to crack commercialization has become one of the key tasks.
