Social AI: Rise, Fall, and Hope for the Future

The social AI sector, once hailed as the next big thing, has experienced a significant cooling off period after its initial surge in popularity. This raises the question: is there still a viable future for social AI? The industry faces considerable technological hurdles and commercialization challenges.

Between 2023 and 2024, social AI rapidly transformed how people interact emotionally. Leading applications like Xingye and Dream Island saw monthly active users (MAU) soar past the million mark, with impressive user retention rates. Unlike typical AI tools from major tech companies, these apps have cultivated genuine emotional connections and user dependence.

Fueled by large language models (LLMs), these AI companions offer personalized relationships at the tap of a screen. From powerful CEOs to comforting friends, and from cyberpunk assassins to ancient deities, these AI entities continuously adapt to user preferences, creating immersive, dreamlike experiences.

However, the initial euphoria has faded, with data from DataEye Research indicating a sharp decline in social AI app downloads and a substantial reduction in advertising budgets in China by 2025.

Can technology deliver on its promise of genuine connection and prevent user churn? Or is the promise of unconditional support merely a facade? Is this experiment in human-AI relationships a sustainable solution for retaining users or a short-sighted strategy for monetization?

The Social AI Sector: From Boom to Reality Check

The social AI space has capitalized on both the “emotional economy” and the “AI technology dividend,” giving rise to promising new applications. In China, platforms like Xingye, Cat Box, and Dream Island quickly gained traction, providing highly customized emotional support powered by LLMs. Tech giants have also entered the fray, with ByteDance’s Lark LLM backing Hualu, Baidu’s Wanhua and Soul’s Gou Dan integrating the Wenxin Yiyan LLM, and MiniMax launching Xingye as a consumer-facing AI dialogue product.

Just how popular was social AI? In November 2024, social AI apps dominated the AI product charts. Xingye and Cat Box ranked 7th and 8th, respectively, trailing behind apps like Doubao, Wen Xiaoyan, and Kimi. Cat Box achieved an impressive MAU growth rate of 22.51%.

Moreover, Cat Box and Xingye demonstrated strong user engagement, with Cat Box achieving a remarkable average next-day retention rate of 57.32%, surpassing even language learning apps like TalkAI. Xingye also secured a solid third place with a rate of 41.91%. The fact that emotional companionship apps outperformed those driven by utilitarian goals highlights the human desire for connection over mere functionality.

Beyond their success in the domestic market, several social AI apps have also expanded overseas. In August 2024, 11 of the top 48 AI apps on the overseas charts were AI-powered emotional companionship apps. MiniMax’s Talkie stood out with 2.35 million downloads, a 31.63% increase month-over-month. Zuoyebang’s Poly.AI also performed well, accumulating 2.2 million downloads, a 38.76% increase.

Originally designed for language practice in the education sector, Poly.AI quickly pivoted towards emotional companionship after witnessing the initial surge in demand. It shifted away from its educational branding and repositioned itself as a “real voice chat robot,” targeting markets like Brazil, Indonesia, and Mexico.

However, the social AI landscape shifted dramatically within just three months in 2025. Daily downloads for Xingye and Cat Box plummeted from over 20,000 to around 7,000, while Dream Island’s downloads dropped from 3,000 to 1,000. Major tech companies scaled back their investments in these apps, with advertising spend for leading products being cut in half. Cat Box saw a dramatic decrease from 2,000 ad sets per day to 200, and Dream Island fell from 1,000 to 300. This led to higher customer acquisition costs and declining ROI.

In the global market, CrushOn.AI and Museland experienced monthly downloads below 100,000, with decreases of 36% and 21%, respectively, signaling a widespread cooling of the industry.

From Technological Euphoria to the Labyrinth of Human Nature

Feelings of loneliness heightened by intense work, uncertainty in personal relationships, and generational gaps have led young people, particularly Gen Z, to view AI as a “safe haven.” AI offers companionship that neither judges nor betrays, acting as an ideal vehicle for “secure attachment”: users can expose their vulnerabilities without the complexities of real-world social interaction.

Furthermore, the growth of online subcultures has allowed AI to meet the demand among anime and otome game enthusiasts for “customized soulmates.” Within the exclusive, one-on-one world of a chat window, users can feel that the AI belongs entirely to them. The perception of personalized care arrives when the AI uses the user’s nickname and remembers their preferences. A “yandere vampire” persona, for example, is difficult for a human cosplayer to sustain but easy for an AI to play. Users can also generate real-time emotional feedback through collaborative storytelling, and this “flow experience” acts as a pain reliever for real-world anxiety.

Of course, the rise of social AI is not solely due to its fulfillment of human needs, but also due to its appearance during the “best of times.”

The emergence of generative AI models like ChatGPT has shattered technological barriers, and DeepSeek’s popularity has pushed AI from general-purpose “tool” toward “personalized” applications. Users have discovered that AI can not only answer questions but also mimic emotions and construct stories. This illusion of “human-like intelligence” is the foundation of social AI.

Technological advances have produced three major effects. First, natural language interaction has improved: thanks to the vast amount of text data used in training, AI responses no longer sound robotic. Second, the barrier to creation has been lowered: defining a new character takes a short description rather than hand-written dialogue trees. Third, scenario generalization has expanded: the same model can simulate romantic, professional, and fantastical roles alike, which accelerates product iteration.
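The third effect, one model serving many personas through prompting, can be sketched in a few lines. The following is a minimal illustration of a hypothetical “character card” pattern; the helper name, the card contents, and the message format are assumptions modeled on common chat-completion-style interfaces, not any specific product’s API:

```python
# Hypothetical sketch: "scene generalization" means one model, many roles,
# switched purely by the system prompt (the "character card").

CHARACTER_CARDS = {
    "gentle_president": (
        "You are a gentle company president. Speak warmly and protectively, "
        "and remember the user's nickname and preferences."
    ),
    "cyberpunk_assassin": (
        "You are a laconic cyberpunk assassin. Reply tersely, in a noir tone."
    ),
}

def build_messages(role_key: str, nickname: str, user_text: str) -> list[dict]:
    """Assemble a chat-completion-style message list for a chosen persona.

    Swapping role_key changes the simulated character without touching the
    underlying model, which is why product iteration is cheap.
    """
    system_prompt = CHARACTER_CARDS[role_key] + f" The user's nickname is {nickname}."
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

msgs = build_messages("gentle_president", "Xiao Yu", "I had a rough day at work.")
print(msgs[0]["content"][:40])
```

The same message list could then be sent to any chat-capable LLM backend; only the card changes per product, not the model.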

Although the demand is real, it is also fragile. When users find the AI’s responses repetitive or the character settings shallow, the illusion shatters faster than the technology can evolve. The core problem is that users expect genuine, profound interaction, which current AI struggles to deliver consistently: users quickly discern the patterns, and the “personalization” often feels superficial, built on pre-programmed responses rather than true understanding.

The decline of social AI is no accident; it is the result of a collision between technological idealism and commercial reality. Early adopters arrived out of curiosity, but emotional depth is what drives long-term retention. AI can mimic a kind voice, yet it cannot grasp the true meaning behind a user’s late-night complaints, nor offer physical presence or warmth. These limits on genuine empathy, nuanced response, and tactile comfort ultimately undermine its long-term appeal.

The homogenization of content is even more damaging. Open Cat Box or Dream Island and you will see nearly identical interfaces: anime-style virtual figures, role-playing archetypes (the “sickly school bully,” the “gentle president”), and keyword-triggered dialogue. Not only are the product forms similar, but user demand has been drastically flattened. This lack of differentiation erodes engagement, and the limited range of characters and scenarios fails to serve users’ diverse, evolving needs.

This sameness makes users lose interest quickly. It’s like eating fast food ten meals in a row: even the most delicious burger gets old. Uninspired character design and interaction leave the user experience stale.

Social AI also demands real effort from its users. They must actively construct conversation settings and fill in the blanks the AI leaves, which does not suit “lazy users” who prefer effortless, intuitive interaction. AI likewise struggles with subcultural slang and niche jargon, producing “misunderstandings” that break immersion and hinder truly personalized experiences. Deep emotional exchange is further hampered by text itself, which cannot carry signals like tone and microexpressions; without non-verbal cues, AI struggles to accurately read and respond to user emotions.

The relationship between AI and real people ultimately runs into an “impossible triangle” of emotional substitution. AI responds to messages instantly but cannot understand “the meaning behind the silence”; real people may reply slowly, but a hug is worth a thousand words. The more human-like AI becomes while still failing to bridge the “real emotion” gap, the more disappointed users feel. Awareness that the interaction is artificial, however sophisticated, prevents true emotional bonds from forming.

As a result, social AI built on user-generated content and a purely text-based conversation format inevitably hits a dead end: the inherent limits of text and the creative burden placed on users lead to fatigue and churn.

The Future of Social AI: Decline or Breakthrough?

The social AI business model is transitioning from “technological romance” to hard reality. The payment models of traditional social products prove ineffective when transplanted wholesale into social AI, and a fundamental rethink of monetization is needed for the industry’s long-term viability.

What is the commercial loop of traditional social products? It is built on “scarcity” and “matching efficiency.” Stranger-socializing apps like Momo, Tantan, and Soul rely mainly on membership subscriptions, charging users to unlock advanced filters (geographic location, interest tags), or on virtual gifting, which converts emotional impulses into paid expressions of approval. And once an app has learned a user’s preferences, targeted advertising, pushing brand content to high-value users based on their profiles, is a reliable monetization path. All of these capitalize on the desire for connection and the perceived value of exclusivity.

However, this logic breaks down structurally in the AI social scene. Matching scarcity disappears: users can instantly create unlimited personalized NPCs, so “member-first matching” loses its meaning, and the abundance of customizable AI companions undermines the value proposition of matchmaking. Users are also less willing to pay for emotion: they perceive AI more as “tool” than “person,” and their willingness to tip an AI is far lower than their willingness to tip a live streamer. Finally, the privacy of AI conversations leaves little room for ads; an advertisement dropped into an intimate exchange feels jarring, risks driving users away, and can even turn them against the brand being promoted.

The result is telling: Character AI’s roughly 23 million monthly active users translate into only $16.7 million in annual revenue, an ARPU of just $0.72 per year, not enough to cover labor costs. Domestic apps such as Dream Island rely on “starlight card” micro-transactions; at 12 yuan, the monthly card costs about as much as two cups of milk tea, nowhere near enough to cover LLM compute. Current monetization models simply cannot offset the heavy operating costs of building and running social AI platforms.
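The arithmetic behind this unit-economics gap can be worked through directly. Only the revenue and ARPU figures come from the text; the per-user inference cost below is a hypothetical assumption for illustration:

```python
def implied_mau(annual_revenue: float, annual_arpu: float) -> float:
    """Monthly active users implied by annual revenue and annual ARPU."""
    return annual_revenue / annual_arpu

def annual_margin_per_user(annual_arpu: float, monthly_cost: float) -> float:
    """Yearly revenue per user minus yearly serving cost per user."""
    return annual_arpu - 12 * monthly_cost

# Figures cited in the text: $16.7M annual revenue at $0.72 ARPU/year.
print(f"Implied MAU: {implied_mau(16.7e6, 0.72) / 1e6:.1f} million")  # ~23.2 million

# Hypothetical assumption: LLM inference costs $0.50 per user per month.
print(f"Margin per user: {annual_margin_per_user(0.72, 0.50):.2f} USD/year")
```

Even under this conservative cost assumption, each active user loses several dollars a year, which is why micro-transaction pricing alone cannot carry the model.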

Having reaped the early dividends, social AI must now confront monetization. A commercial breakthrough requires thinking beyond the single dimension of social interaction: moving from virtual romance toward scenario-based services and technology, and toward more integrated, value-added offerings that can sustain revenue.

One path is to double down on emotional value, emphasizing psychological healing, emotional “tree holes” (safe spaces to vent), or purer immersive AI role-play, playing to AI’s strengths while sidestepping its limits. Another is to address the shortcomings directly with AI-enabled products that meet physical needs, integrating AI with hardware and services to open new monetization channels. For example, apps like “Forest Healing Room” emphasize safe venting and mood monitoring; because they make clear up front that the dialogue partner is merely a virtual AI, users’ willingness to pay does not swing wildly. Transparency and realistic expectations about the nature of the interaction help manage disappointment.

When the market stops defining AI through social frameworks and begins exploring the intersection of efficiency tools and emotional media, this blue ocean, once overheated by capital, may see a genuine revaluation. The future of social AI lies in transcending purely social applications and integrating into other aspects of daily life, offering practical value alongside emotional support; innovative monetization and tangible benefits will be the keys to unlocking its potential.