Losing control of artificial intelligence (AI) is the biggest concern around computer science, the Technology Secretary has said.
Michelle Donelan said a Terminator-style scenario was a “potential area” where AI development could lead but “there are several stages before that”.
Ahh. Several stages before that.
Humanity has shown itself to be pretty shit at taking preemptive action to stop the worst from happening.
This is just a diversion from the more imminent threat to jobs, which Rishi claimed at the same conference isn't an issue, instead parroting Microsoft Copilot marketing material.
Do we need to start putting rules around failsafes for more complex, system-wide AIs? Yes. Is it as time-sensitive as putting in job protections? Fuck no.
Without them, businesses will just take the cheapest option they can get away with, and those that don't will not be able to compete on price.
It’s ok, it’s not one stage but “several”. Nothing to worry about. Yet.