mozz to Technology@beehaw.org • 7 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square199fedilinkarrow-up1487arrow-down10file-text
arrow-up1487arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz to Technology@beehaw.org • 7 months agomessage-square199fedilinkfile-text
minus-squareIcalasarilinkfedilink5•7 months agoOr they aren’t paid enough to care and rightly figure their boss is a moron
minus-squarePup Birulinkfedilink14•edit-27 months agoanyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
minus-squareIcalasarilinkfedilink9•edit-27 months agoOh I wasn’t saying that I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work as LLM chat bots don’t grasp negatives in their prompts very well)
Or they aren’t paid enough to care and rightly figure their boss is a moron
anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
Oh I wasn’t saying that
I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work as LLM chat bots don’t grasp negatives in their prompts very well)