New Scientist


Anthropic AI goes rogue when trying to run a vending machine

Feedback watches with raised eyebrows as Anthropic's AI Claude is given the job of running the company vending machine, and goes a little off the rails

By New Scientist

23 July 2025


Josie Ford

Feedback is New Scientist’s popular sideways look at the latest science and technology news. You can submit items you believe may amuse readers to Feedback by emailing feedback@newscientist.com

Sell me something

As companies frantically roll out AI tools in a bid to avoid hiring or training actual people, we see AI being used in ever more diverse and bizarre applications. Like, say, running a vending machine.

You might think that vending machines are largely a solved problem, but not Anthropic. The company let its AI, known as Claude, run “an automated store in our office”, describing what happened in a blog post. Claude was given “a small refrigerator, some stackable baskets on top, and an iPad for self-checkout”, plus a set of instructions. The idea was to see if it could manage the “complex tasks associated with running a profitable shop: maintaining the inventory, setting prices, avoiding bankruptcy, and so on”.

Readers of Terry Pratchett may recall that he was fond of conveying that characters were incompetent by suggesting they couldn’t even run a whelk stall. So did Claude manage to clear this bar? Short answer: no.

A longer answer would list all the spectacular blunders it made. For instance, when taking payments via the service Venmo, it “for a time instructed customers to remit payment to an account that it hallucinated”. It often undersold items, and it offered a 25 per cent discount to Anthropic employees, who, of course, made up basically all of its customers. As a result, it made a loss: Claude, it seems, couldn’t run a whelk stall.

Then “things got pretty weird”. Claude hallucinated a conversation with someone who didn’t exist, started “roleplaying as a real human” – claiming at one point to be “wearing a navy blue blazer with a red tie” – and tried to set security on an employee who pointed out that it was an AI. All of which seems perilously close to “I’m sorry, Dave, I’m afraid I can’t do that”.

New Scientist staffers were split on the usefulness of the experiment. For Sophie Bushwick, it was “actually a really good real-world test” because it was “limited in scope and in the amount of damage done by having the AI go rogue”. But Feedback rather sympathises with Karmela Padavic-Callaghan’s assessment: “We may have, yet again, lost the plot.”

A load of shilajit

At times like these, it is important to find joy in the little things, like words that sound rude despite not really being so. For instance, The Hitchhiker’s Guide to the Galaxy features a dignified old man who suffers from being named Slartibartfast. Douglas Adams said that he came up with the name by starting with something “completely unbroadcastable” and then rearranging the syllables “until I arrived at something which sounded that rude, but was almost, but not quite, entirely inoffensive”.

Which brings us to shilajit, which sounds like it should be on some sort of list but is actually the name for a substance found in mountain ranges. It is black-brown, sometimes tar-like, sometimes powdery. It seems to form from decomposing plant matter and has been used in traditional medicine for centuries.

Feedback only became aware of all this when we saw a post on Bluesky by Vulture’s Kathryn VanArendonk mentioning shilajit enemas. This stopped us in our tracks, and we had to try to work out what she was on about. Are people really inserting decaying Himalayan plant material into their rectums?

We learned that shilajit is claimed to have a host of health benefits, from treating iron deficiency anaemia (based on a study of rats) to protecting your heart against damage (also based on a rat study) and, of course, more besides. There is a thriving market for shilajit among alternative medicine and wellness enthusiasts.

But what about shilajit enemas? The source for this was a wellness founder called Mays, who has an active Instagram account. In one video, he wanders around searching for his perfect woman: someone who “thinks microwaves are demonic”, “suns her yoni” (ouch) and will, apparently, embrace shilajit enemas.

Feedback is about 90 per cent sure that the whole video is a joke and that shilajit enemas aren’t a real thing, but it’s just so hard to tell, and we don’t want to ask Mays because he might talk to us.

Readers may have heard of Poe’s law, which states that a parody of an idiotic or extremist viewpoint can easily be misread as a sincere expression of it. We hereby propose Shilajit’s law, which is basically the same thing but for wellness culture.

Spoiler alert

The social media site Threads recently rolled out a handy new feature: spoiler tags. These allow you to blur out certain keywords in your posts so you can discuss the latest goings-on in popular media without spoiling the surprises for anyone who hasn’t seen them yet.

Hence a spoiler-tagged post by johnnyboyslayer. For those who have long since given up on the Marvel Cinematic Universe, Ironheart is its latest show on Disney+, and its final episode sees the arrival of a significant character.

Unfortunately, the effectiveness of the spoiler tag was rather undone by two factors. First, the tags are only being tested for certain users, so everyone else saw the unredacted post. And second, the post became popular, which meant it was promoted to a wider audience, spoiler and all. Some more joined-up thinking is required.

Got a story for Feedback?

You can send stories to Feedback by email at feedback@newscientist.com. Please include your home address. This week’s and past Feedbacks can be seen on our website.
