Losing control of artificial intelligence (AI) is the biggest concern in computer science, the Technology Secretary has said.
Michelle Donelan said a Terminator-style scenario was a “potential area” where AI development could lead, but that “there are several stages before that”.
The biggest risk is idiots not understanding that it's a prediction engine producing probabilities based on its training set, and trying to assign intelligence to it, like they're doing here. These systems aren't going to go out of control Skynet-style and gain sentience; far more likely, they hit genuine edge cases and fail completely. Think AI target detection flagging bushes as tanks, or pickups as tanks, or self-driving cars running into people. When the environment or image is novel, a probability engine has two choices: err toward false negatives and block anything unknown from getting a detection, or err toward false positives, which can cause severe harm.
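A minimal sketch of that trade-off, with entirely made-up detector scores and a hypothetical `evaluate` helper: moving the confidence cutoff on a probabilistic detector directly trades missed targets (false negatives) against false alarms (false positives); no threshold eliminates both.

```python
# Hypothetical detector outputs: (confidence score, is_actually_tank)
detections = [
    (0.95, True), (0.85, True), (0.70, False),  # a bush scoring high on a novel image
    (0.60, True), (0.40, False), (0.30, True),  # a tank in unusual conditions scoring low
    (0.20, False), (0.10, False),
]

def evaluate(threshold):
    """Count false positives and false negatives at a given confidence cutoff."""
    fp = sum(1 for score, is_tank in detections if score >= threshold and not is_tank)
    fn = sum(1 for score, is_tank in detections if score < threshold and is_tank)
    return fp, fn

# A strict cutoff suppresses unknowns (more misses); a loose one flags them (more false alarms).
for threshold in (0.25, 0.50, 0.75):
    fp, fn = evaluate(threshold)
    print(f"threshold={threshold:.2f}  false positives={fp}  false negatives={fn}")
```

On these toy numbers, raising the threshold from 0.25 to 0.75 drives false positives from 2 down to 0 while false negatives climb from 0 up to 2; the failure mode is chosen, never removed.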