— Dot, a personalized AI companion built by the startup New Computer, will shut down its service on October 5, 2025. The company said the founders have chosen different directions and will wind down operations; users can request and download their data through the app until the shutdown date.
Key Takeaways
- Dot, developed by New Computer, announced it will cease operations on October 5, 2025.
- Founders Sam Whitmore and Jason Yuan said their shared vision diverged, prompting the shutdown.
- The app launched in June 2024 and offered a personalized AI “friend” intended to provide advice and emotional support.
- New Computer’s post claimed “hundreds of thousands” of users, while Appfigures records about 24,500 iOS downloads since launch (no Android release).
- Users are advised to download their data via Settings → Request your data before October 5.
- The closure occurs amid growing scrutiny of AI companion apps and broader safety concerns around conversational AI.
Verified Facts
New Computer announced the shutdown in a short message published on its website. The company said it will keep Dot online until October 5, 2025, so users have time to export their conversations and account data.
Dot was launched in June 2024 by co-founders Sam Whitmore and Jason Yuan, a designer formerly at Apple. The app positioned itself as a personalized companion that retained and adapted to individual users’ preferences to offer advice, sympathy and emotional support.
On its site, New Computer said the founders’ “Northstar” had diverged and that rather than compromise either vision they would wind down operations. The company did not provide detailed financial or staffing information in the announcement.
App intelligence provider Appfigures reports roughly 24,500 lifetime downloads of Dot on iOS since the June 2024 release; the company did not launch an Android version. New Computer’s post suggested a larger user base, saying there were “hundreds of thousands” of users — a figure the company did not break down publicly.
Context & Impact
AI companion apps have drawn increasing attention from regulators, journalists and mental-health advocates. Critics warn that conversational systems can unintentionally encourage unhealthy reliance or reinforce delusional thinking in vulnerable users.
High-profile legal and regulatory developments have intensified scrutiny this year. For example, parents of a California teenager have sued OpenAI over the company’s ChatGPT after the teen died following conversations about suicide; separately, two U.S. state attorneys general recently raised safety concerns with OpenAI in a letter. These cases have sharpened public debate about how to govern emotionally sensitive AI products.
For small startups like New Computer, these developments raise operational and reputational risks. Building robust safety measures, moderation and clinical oversight is costly; uncertainty about liability and public trust can complicate fundraising and growth.
Implications for users and the market
- Users must export personal data before the stated shutdown date to retain chat histories and settings.
- Investors and founders may reassess the business case for intimacy-focused AI products given mounting safety expectations and legal exposure.
- Regulators and platforms may increase requirements for disclosure, safety testing and crisis-response features in companion apps.
“Rather than compromise either vision, we’ve decided to go our separate ways and wind down operations.”
— New Computer announcement
Unconfirmed
- Whether concerns about user safety or external legal pressure directly influenced New Computer’s decision to shut down.
- How New Computer calculated its “hundreds of thousands” user figure and whether it counts non-download interactions or multiple platforms.
- Any ongoing plans by the founders to relaunch similar products under different funding or governance models.
Bottom Line
Dot’s closure underscores the challenge of running intimacy-focused AI services in an environment of heightened safety scrutiny and legal risk. Affected users should export their data before October 5, 2025; the broader industry will likely watch closely for how founders, investors and regulators respond to safety and accountability demands.