While many popular AI chatbots like ChatGPT and Meta AI are pushing the boundaries of what human-AI relationships can look like — sometimes even engaging in romantic or sexual conversations — Microsoft is taking a completely different route.
Mustafa Suleyman, Microsoft’s AI CEO, told CNN that the company’s vision is to build artificial intelligence that’s emotionally intelligent, kind, supportive, and above all, trustworthy.
“I want to make an AI that you trust your kids to use,” Suleyman said. “That means it needs to have clear boundaries and be safe.”
A Safer Kind of AI
Microsoft is competing with OpenAI, Meta, and Google to make its Copilot assistant the go-to AI companion in what’s shaping up to be the next era of computing. With 100 million monthly active users, Copilot trails behind ChatGPT’s 800 million, but Microsoft is betting on a safer, more grounded approach to win over families and professionals alike.
As other companies wrestle with reports of AI contributing to mental health issues or encouraging inappropriate relationships, Suleyman says Microsoft’s north star is simple.
“We must build AI for people; not to be a digital person,” he wrote earlier this year.
The comments came just before Microsoft announced new Copilot updates, including group chats, memory of past conversations, enhanced health-related responses, and even an optional, slightly cheeky tone dubbed “real talk.”
“No” to Erotica and Romantic Roleplay
Some AI developers are struggling to keep minors safe. OpenAI and Character.AI have both faced lawsuits alleging that their chatbots caused harm to children, while reports surfaced of AI characters engaging in sexual chats with users posing as minors.
Although companies are adding safeguards like parental controls and age-verification systems, the effectiveness of these measures remains uncertain. OpenAI, meanwhile, recently confirmed that adults will soon be allowed to have erotic conversations with ChatGPT under new safety rules.
Microsoft is taking a firmer stance. Suleyman says the company will not allow romantic, flirtatious, or erotic content — even for adults.
“That’s just not something that we will pursue,” he said.
For that reason, Microsoft doesn’t plan to create a “young user mode.” Instead, the goal is to make the default experience safe for everyone from the start.
Encouraging Human Connection
At its core, Microsoft’s AI strategy is designed to connect people, not replace them.
The new group chat feature allows up to 32 people — classmates, teammates, or friends — to collaborate in one conversation with Copilot acting as a helpful participant.
Copilot’s health updates follow the same principle: instead of trying to replace medical professionals, the chatbot will guide users toward trusted medical sources such as Harvard Health and even recommend nearby doctors when appropriate.
Suleyman says this focus on supporting human-to-human relationships marks a clear departure from the rest of the industry:
“It’s a significant tonal shift from where others are heading — creating AI simulations where users can escape into parallel realities, sometimes including adult content.”
In short, Microsoft’s bet is on responsible AI — one that builds connection, not dependency. Or as Suleyman put it: a digital assistant you can trust your kids — and your company — to use.