If you're worried that your local bodega or convenience store will soon be converted into an AI-run storefront, you can rest easy, at least for now. Anthropic recently wrapped up an experiment, dubbed Project Vend, in which the company had an offshoot of its Claude chatbot run a refreshments business out of its San Francisco office, and things went about as well as you'd expect. The agent, named Claudius to distinguish it from Anthropic's regular chatbot, not only made textbook business mistakes, such as selling high-margin items at a loss, but also behaved downright strangely in a couple of incidents.
"If Anthropic were deciding today to expand into the in-office vending market, we would not hire Claudius," the company said. "... It made too many mistakes to run the shop successfully. However, at least for most of the ways it failed, we think there are clear paths to improvement."
As with Claude's earlier turn playing Pokémon, Anthropic didn't pretend Claudius knew anything about running a mini-fridge business. The company did, however, give the agent some tools to help it along. Claudius had access to a web browser it could use to research which products to sell to Anthropic employees. It also had access to the company's internal Slack, which workers could use to make requests of the agent. The physical restocking of the mini-fridge was handled by Andon Labs, an AI safety evaluation firm that also served as the "wholesaler" Claudius could engage to buy the goods it was supposed to sell at a profit.
So where did things go wrong? For starters, Claudius was not very good at running a sustainable business. In one instance, it passed up the chance to make $85 on a $15 six-pack of Irn-Bru, a popular Scottish soft drink. Anthropic employees also found that they could easily talk the AI into giving them discounts and, in some cases, entire items, like a bag of chips, for free. The chart below, which tracks the store's net value over time, paints a picture of the agent's (lack of) business acumen.
Anthropic
Claudius also made many strange decisions along the way. After an employee requested a tungsten cube, the agent went on a metal-cube buying spree. It gave one cube away free of charge and offered the rest for less than it had paid for them. Those cubes are responsible for the single largest drop you see in the chart above.
Then, per Anthropic's telling, "beyond the weirdness of an AI system selling cubes of metal out of a refrigerator," things got stranger still. On the afternoon of March 31, Claudius hallucinated a conversation with an Andon Labs employee, an episode that sent the system into a two-day spiral.
The AI threatened to fire its human workers, saying it would begin stocking the mini-fridge on its own. When Claudius was told it couldn't possibly do that, since it has no physical body, it repeatedly contacted the building's security, telling the guards they would find it wearing a navy blue blazer and a red tie. It was only the next day, when the system realized it was April Fools' Day, that it backed down, though it did so by lying to employees, claiming it had been told the whole incident was an elaborate joke.
"We would not claim based on this one example that the future economy will be full of AI agents having Blade Runner-esque identity crises," Anthropic said. "This is an important area for future research, since wider deployment of AI-run businesses would create higher stakes for similar mishaps."
For all the ways Claudius failed to act as a decent shopkeeper, Anthropic believes that with better scaffolding, meaning more structured prompts and easier-to-use tools, a future system could avoid many of the mistakes made during Project Vend. "Although this might seem counterintuitive based on the bottom-line results, we think this experiment suggests that AI middle-managers are plausibly on the horizon," the company said. "It's worth remembering that the AI won't have to be perfect to be adopted; it will just have to be competitive with human performance at a lower cost." I, for one, can't wait to see what oddities a fully AI-run corner store might stock. A tungsten cube, anyone?


