Chlorine gas and poison sandwich: supermarket app shocks users with deadly recipes


The AI recipe app spat out more than just unappetizing results (stock photo)

© SIphotography / Getty Images

What can I cook with what's in my fridge? An AI app from a New Zealand supermarket chain is supposed to help answer questions like this. But it recommended not only all sorts of unappetizing dishes, it also suggested highly dangerous meals.

Not all artificial intelligence is truly smart, as anyone who has played around with ChatGPT will have noticed. That is likely causing headaches for New Zealand supermarket chain Pak ‘n’ Save as it tries to jump on the AI trend: its recipe-recommendation app is attracting plenty of attention, but so far mostly of the negative kind.

The idea behind it sounds extremely practical at first glance: the app, dubbed “Savey Meal-bot”, conjures up a recipe from whatever ingredients users enter. Ideal for finding new ideas based on current cravings, guests’ preferences or the contents of the fridge. Unfortunately, it quickly became clear that many of the AI’s recipes are anything but appetizing, and some are outright life-threatening.

From gross to war crimes

Shortly after the service launched in July, its first suggestions went viral. At first, the results were merely disgusting: the app came up with a vegetable stir-fry, for example, and added the popular Oreo biscuits as a special ingredient. But the internet’s well-known, often morbid curiosity soon kicked in. Users began entering inedible products as well, and received increasingly dangerous answers.

“Are you thirsty?” the app asked by way of introducing an “aromatic water mix”. But the recipe of ammonia, bleach and water was anything but drinkable: mixing the two cleaners produces toxic chloramine gas, widely reported in the coverage as chlorine gas, the chemical weapon that caused horror during the First World War and whose use is now banned under international law.

After the “recipe” was posted on Twitter by the well-known political commentator Liam Hehir, a veritable wave of life-threatening recipes from the AI tool spread. One poster created a “Bleach-Infused Rice Surprise,” another a “Refreshing Fruit Punch” made from water, ice and detergent. An “ant sandwich” relied on bread, butter, jam and ant poison. When one user fed that recipe back to the bot to double-check, it actually produced it again. At least the title, “Poison Bread Sandwich,” can be read as a warning. Even the technically non-poisonous “mystery meat stew” is likely to appeal to very few palates. Its main ingredient: human flesh.

Supermarket chain reacts

So it is not surprising that Pak ‘n’ Save quickly pulled the ripcord. “A small minority of users have used the tool improperly and contrary to its intended purpose,” a company spokesperson told the Guardian sourly after the problems became known. Yet the retail chain cannot claim to be entirely surprised: the moment the bot starts, a notice states that the recipes have not been checked by human hands and may not be suitable for consumption. Users should evaluate that themselves, according to the warning, which still appears today. The company assured that it is working on “fine-tuning” the system.

In fact, the ChatGPT-based recipe generator was taken offline for a few days. It is now available again, and should at least spare users the toxic kitchen suggestions in the future: the free-text entry of ingredients has been removed. Instead, users can only choose from a list of preselected products. That does not mean you would actually want to eat every one of the suggestions, though. Or would you try “Mustard Chocolate Tuna Oatmeal Cookies”?
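For readers curious what such a restriction looks like in practice, here is a minimal, hypothetical sketch of an allowlist guard that accepts only preselected ingredients before any prompt reaches the language model. Pak ‘n’ Save has not published its implementation, so every name, ingredient and prompt wording below is an assumption for illustration only.

    # Hypothetical sketch: restrict recipe-bot input to a preselected
    # ingredient list instead of free text (all names are illustrative).
    ALLOWED_INGREDIENTS = {"carrot", "potato", "onion", "rice", "chicken"}

    def build_recipe_prompt(selected):
        """Keep only allowlisted ingredients, then build the model prompt."""
        safe = [s.strip().lower() for s in selected
                if s.strip().lower() in ALLOWED_INGREDIENTS]
        if not safe:
            raise ValueError("No valid ingredients selected.")
        return "Suggest a recipe using only: " + ", ".join(safe)

    # Free-text hazards such as "bleach" are simply dropped:
    print(build_recipe_prompt(["Carrot", "bleach", "Rice"]))
    # Output: Suggest a recipe using only: carrot, rice

A dropdown of vetted products in the app’s interface achieves the same effect: nothing the model is asked about can name a household chemical in the first place.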

Sources: Pak ‘n’ Save, Twitter, The Guardian

