cross-posted from: https://lemmy.ml/post/3109500
Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items
Agree. “Chatbot outputs ridiculous response when given ridiculous inputs” gets old.
This was at least funny.
Though I would say that its spitting out recipes for things that aren't even ingredients indicates it's not a useful tool. It isn't basing recipe recommendations on any knowledge of food, cooking, flavours, textures, or chemistry. It seems to just be arbitrarily fitting a list of ingredients into some recipe-shaped pattern.
If it doesn't understand "this isn't a safe ingredient", I doubt it understands much about which non-poisonous ingredients would actually go well together, beyond ones it has seen paired in its training set.
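To make "arbitrarily fitting a list of ingredients into a pattern" concrete, here's a hypothetical toy sketch (plain Python, no real model, all names made up): a "recipe bot" that just slots whatever it's given into a recipe-shaped template. Nothing in it checks that the inputs are food, let alone safe to combine, which is essentially the failure mode in the article.

```python
# Hypothetical toy illustrating the failure mode discussed above:
# a "bot" that fits any input list into a fluent recipe template,
# with no notion of edibility or safety.

def savey_style_recipe(ingredients: list[str]) -> str:
    """Slot an arbitrary ingredient list into a fixed recipe pattern.

    There is no check that the items are food -- mirroring how a
    pattern-matching model can emit a confident-sounding recipe
    for anything it is handed.
    """
    title = " & ".join(item.title() for item in ingredients) + " Surprise"
    steps = "\n".join(
        f"{n}. Add the {item} and stir well."
        for n, item in enumerate(ingredients, start=1)
    )
    return f"{title}\n\n{steps}\n\nServe chilled. Enjoy!"

# Bleach and ammonia flow straight through: the template has no
# concept of "this combination releases toxic chloramine gas".
print(savey_style_recipe(["bleach", "ammonia", "water"]))
```

A real LLM is vastly more sophisticated than a string template, but the point stands: fluency of output is no evidence of an underlying model of chemistry or safety.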