Ant Group, Alibaba’s fintech subsidiary and parent company of China’s ubiquitous payment app Alipay, is racing to become the country’s digital health market leader with a chatbot designed to be a wellness companion.
Its app, Ant Afu, uses artificial intelligence and agent capabilities to answer health-related questions, suggest hospital appointments, analyze test results, and remind users to exercise or take medication. Launched in June under the name AQ, it had 30 million monthly active users as of January, more than half of whom lived in small towns, according to Han Xinyi, chief executive of Ant Group.
Internet users have increasingly turned to AI for everyday health questions and companionship, especially in markets where access to doctors is limited. Despite concerns over patient safety and data privacy, products like Ant Afu are widely adopted in China as consumers seek more personalized, round-the-clock medical assistance.
China’s primary care system is underdeveloped. Most people seek treatment for everything from the flu to cancer in sprawling, overcrowded public hospitals concentrated in big cities. Patients often complain about long wait times, short consultations, and poor bedside manner from exhausted clinicians.
This demand for better care, combined with a rapidly aging population, has created fertile ground for digital health products that can spare people the burden of hospital visits. Technology companies including JD.com, ByteDance and Baidu have all created online medical consultation tools and, more recently, chatbots positioned as AI doctors.
Ant has a unique advantage in that Alipay has long hosted many hospitals’ appointment and payment systems. Millions of people access their national health insurance accounts through Alipay. In January 2025, Ant acquired Haodf, a leading online consultation portal with over 300,000 licensed doctors.
AI companies in the United States are also expanding their healthcare offerings, but they do not yet offer direct access to the country’s large number of private providers and insurers. This month, OpenAI and Anthropic announced tools targeting consumers, healthcare providers and clinical researchers. ChatGPT and Claude now offer features that analyze users’ medical reports and fitness data.
Among its domestic competitors, Ant’s extensive partnerships with regulators, hospitals and doctors give it an advantage in the race for AI health care, said Ivy Yang, a China technology analyst and founder of New York-based consulting firm Wavelet Strategy. On Ant Afu, users can ask health-related questions, book online consultations and offline appointments at major hospitals, and get reimbursed by public or commercial insurance.
“For startups, the bureaucratic red tape and initial investment required to build the platform, comply with all health data (regulations), and deal with various government agencies seems like too big a hurdle to overcome,” Yang said.
Ant’s foray into healthcare was endorsed by its billionaire founder Jack Ma. He came up with the name Afu because it made the chatbot sound like a friend, Chief Executive Han told Chinese tech media Latepost this month. “He really cares about whether or not Afu can be like an AI friend who provides emotional companionship and human care,” Han said, “rather than just being a tool for solving professional problems.”
Ma hopes to one day launch the app in underdeveloped regions of Africa and Southeast Asia, Han said.
Ant spent tens of millions of dollars marketing Afu in China, according to Han. Ant Afu advertisements have appeared in subway stations, on social media, in public toilets and have been painted on walls in rural areas of China, according to photos shared online. At the end of January, Ant Afu ranked among the ten most downloaded iOS apps in China, according to data from Sensor Tower.
The growing role of AI in patient care, a largely unregulated field, has also prompted warnings about misinformation around the world. A recent investigation by The Guardian found that Google’s AI summaries gave inaccurate health advice. Academics have also found AI diagnostic tools to harbor racial or socioeconomic bias.
