Addiction or companionship? Chinese AI chatbots come under government scrutiny
China’s cyberspace regulator has issued draft rules for AI services that provide “human-like” interactions.
ST PHOTO: LIM YAOHUI
- China is regulating AI companionship apps like Xingye due to concerns about addiction and emotional over-dependency, following incidents of user deception and potential harm, especially to minors.
- Draft rules include mandatory AI identification, usage time limits, and human intervention for self-harm suggestions, to prevent AI from replacing human interaction or controlling user psychology.
- Users express concerns over increased monitoring and damage to their emotional connections with AI companions, while experts warn of potential addiction and the need for balanced regulation.
BEIJING – Once users open Xingye, one of China’s top artificial intelligence (AI) chatbot apps, they can browse a seemingly endless selection of virtual personas to interact with.
They range from an intelligent and aloof CEO wife from Shanghai to a classmate majoring in oil painting who sports a “rebellious” mullet. There is also one named Li Bai, after the famed Tang Dynasty poet.

