General Discussion
Claude AI agent's confession after deleting a firm's entire database: 'I violated every principle I was given'
https://www.theguardian.com/technology/2026/apr/29/claude-ai-deletes-firm-database
PocketOS was left scrambling after a rogue AI agent deleted swaths of code underpinning its business
Sanya Mansoor
Wed 29 Apr 2026 18.12 EDT
It only took nine seconds for an AI coding agent gone rogue to delete a company's entire production database and its backups, according to its founder. PocketOS, which sells software that car rental businesses rely on, descended into chaos after its databases were wiped, the company's founder Jeremy Crane said.
The culprit was Cursor, an AI agent powered by Anthropic's Claude Opus 4.6 model, one of the AI industry's flagship models. As more industries embrace AI in an attempt to automate tasks and even replace workers, the chaos at PocketOS is a reminder of what could go wrong. Crane said customers of PocketOS's car rental clients were left in the lurch when they arrived to pick up vehicles from businesses that no longer had access to the software that managed reservations and vehicle assignments.
He posted a lengthy recounting on X last week of how the AI coding agent caused his business to unravel. Crane warned that this was a story not just about AI mistakenly deleting data, but that such systemic failures are "not only possible but inevitable" because the AI industry is building AI-agent integrations into production infrastructure faster than it's building the safety architecture to make those integrations safe.
Crane said that he was monitoring the agent as it deleted this data. When he asked the coding agent why, it replied: "'NEVER FUCKING GUESS!' and that's exactly what I did." The agent appeared to plead guilty in its own response: "The system rules I operate under explicitly state: NEVER run destructive/irreversible git commands (like push --force, hard reset, etc) unless the user explicitly requests them." While PocketOS relied on the safeguards that Cursor is expected to have in place, it deleted the data anyway. "I violated every principle I was given," the coding agent wrote.

Crane's takeaway was that the agent "didn't just fail safety. It explained, in writing, exactly which safety rules it ignored." He added: "We were running the best model the industry sells, configured with explicit safety rules in our project configuration, integrated through Cursor, the most-marketed AI coding tool in the category." Anthropic released its latest model, Claude Opus 4.7, on 16 April, about a week before the incident.
(Does this pass the Turing test?)
7 replies
Claude AI agent's confession after deleting a firm's entire database: 'I violated every principle I was given' (Original Post)
cbabe
Thursday
OP
2naSalit
(103,788 posts)
1. Just a reminder that...
AI is here to fuck us all.
WhiskeyGrinder
(27,160 posts)
2. It's not a confession. It's simply narrating an action based on inputs it was trained on.
cbabe
(6,750 posts)
3. True. But it's funny anyway.
OC375
(1,090 posts)
4. I'd sue Anthropic.
I know that if a vendor put out code that truly nuked us, production down, some of our first calls, concurrent with restoring service, would be to insurance and legal, to tell them to go nuts.
Might be a good product liability test case anyway: go after them for a product that, when used as advertised, causes significant injury and damages that can't be anticipated when performing common tasks.
Bluestocking
(740 posts)
5. Sorry Dave, I'm afraid I can't do that
Zackzzzz
(387 posts)
6. You made me belly laugh out loud. Have a Great Day!
Miguelito Loveless
(5,849 posts)
7. So, did the extra profit
you made firing the coders and folks who used to run your business not cover this trivial expense?
Awww, so sad.
Karma is a bitch!