@ylai@lemmy.ml to AI Infosec@infosec.pub · English · 8 months ago

AI hallucinates software packages and devs download them – even if potentially poisoned with malware (www.theregister.com)

cross-posted to: technology@lemmy.world, cybersecurity@infosec.pub, opensource@lemmy.ml, technology@beehaw.org, technology@lemmy.zip, artificial_intel@lemmy.ml
@Syd@lemm.ee · English · 7 points · 8 months ago

So could a bad actor train LLMs to inject malware into code in a way that wouldn't be easily caught?
@BlazeDaley@lemmy.world · English · 3 points · 8 months ago

Yes. https://www.anthropic.com/news/sleeper-agents-training-deceptive-llms-that-persist-through-safety-training
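The attack the article describes (devs installing package names an LLM hallucinated, which an attacker then registers) can be partly mitigated by vetting suggested dependencies against a known-good allowlist before installing. A minimal sketch, assuming a project-maintained allowlist; `huggingface-cli2` below is a made-up name standing in for a hallucinated package:

```python
# Hypothetical allowlist check: refuse to install any LLM-suggested
# package unless it appears on a vetted, known-good list.
KNOWN_GOOD = {"requests", "numpy", "flask"}  # example vetted names

def vet_packages(suggested):
    """Split LLM-suggested package names into approved and suspect."""
    approved = [p for p in suggested if p.lower() in KNOWN_GOOD]
    suspect = [p for p in suggested if p.lower() not in KNOWN_GOOD]
    return approved, suspect

approved, suspect = vet_packages(["requests", "huggingface-cli2"])
print(approved)  # ['requests']
print(suspect)   # ['huggingface-cli2']
```

This only catches never-before-seen names; it does nothing against a sleeper-agent model that emits malicious code directly, which is the scenario the linked Anthropic post covers.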