I thoroughly agree: you should always have CI tooling that ensures the code builds, passes tests, and meets whatever formatting and/or linting standards the team sets. I was specifically responding to “Rust makes it harder for a ‘contributor’ to sneak in LLM-generated crap”.

If I get a contribution from an untrusted party, I start with the assumption that it’s utter garbage: buggy, broken, and malicious, and I review it until I’m convinced it’s not. Not because I assume the dev is bad, but because it’s safer to assume the code is garbage.

If I get a contribution from a trusted party (e.g. a member of the dev team/employee/whatever), I still review the code carefully, though not with as much paranoia. I don’t particularly care whether my teammates use LLMs, but submitting code they don’t understand is a great way to get ejected from the “trusted contributors” group, and if they’re an employee it’s a good way to get fired if they keep doing it after being warned not to.
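For concreteness, here’s a minimal sketch of the kind of CI gate I mean, assuming a Rust project on GitHub Actions (the workflow name, trigger, and exact commands are just illustrative, and it assumes the runner already has a Rust toolchain; any CI system running equivalent build/test/format/lint steps does the job):

    name: ci
    on: [push, pull_request]
    jobs:
      check:
        runs-on: ubuntu-latest
        steps:
          # Check out the contribution under review
          - uses: actions/checkout@v4
          # Fail if it doesn't compile
          - run: cargo build --all-targets
          # Fail if tests don't pass
          - run: cargo test
          # Fail on formatting drift
          - run: cargo fmt --all -- --check
          # Fail on lint warnings
          - run: cargo clippy --all-targets -- -D warnings

None of that replaces review, of course; it just means the reviewer never spends time on code that doesn’t even build or lint cleanly.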