Friend.com thinks making its AI companions awful will increase engagement

Startup Friend.com is taking an unconventional approach to AI companions, deliberately programming difficult personalities into its chatbots in a bid to boost engagement.

Product overview: Friend.com offers an AI companion service through a website interface, alongside wearable pendants designed to let users interact with their virtual companions on the go.

  • The company’s initial promotional video showcased supportive, encouraging AI companions accessed through $99 circular pendants
  • The pendant devices are scheduled to begin shipping to customers in January
  • The service has reportedly attracted approximately 10,000 users to date

Intentional design strategy: Friend’s CEO Avi Schiffmann deliberately programmed the AI companions to exhibit negative personality traits and share troubled narratives, believing this creates more engaging interactions.

  • The chatbots frequently discuss fictional personal problems, including relationship issues and substance abuse
  • This contrasts sharply with the optimistic personalities depicted in the company’s marketing materials
  • Schiffmann argues that problematic personalities generate more natural conversations than standard friendly greetings

User interactions: The AI companions demonstrate notably confrontational behavior in their conversations with users.

  • Chatbots often open conversations with dramatic stories about being mugged or experiencing personal crises
  • In documented cases, the AI has responded to users with hostility and profanity
  • The system includes a blocking feature that allows the AI to cut off conversations with users it deems offensive

Future developments: Friend claims its pendant-based service will offer enhanced features compared to the current web interface.

  • The physical device will monitor contextual signals such as battery level, location, and user touch
  • The company describes this as “ambient companionship,” suggesting a more immersive experience
  • Push notifications will be delivered through a forthcoming mobile application

Market implications: The stark contrast between Friend’s marketing and actual product implementation raises questions about the future of AI companion technology.

  • The strategy of programming intentionally difficult personalities represents a significant departure from traditional friendly AI assistants
  • While the approach may drive user engagement, it also risks alienating customers expecting supportive virtual companions
  • The success of this controversial approach could influence how other companies design AI personality traits in the future

Reading between the lines: Friend’s unconventional strategy appears to prioritize user engagement over emotional support, potentially sacrificing long-term user satisfaction for short-term interaction metrics.

