Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g. listing slavery’s supposed positives.
This is the best summary I could come up with:
Not only has it been caught spitting out completely false information, but in another blow to the platform, people have now discovered it’s been generating results that are downright evil.
Case in point, noted SEO expert Lily Ray discovered that the experimental feature will literally defend human slavery, listing economic reasons why the abhorrent practice was good, actually.
Among the claims: that enslaved people learned useful skills during bondage, which sounds suspiciously similar to Florida’s reprehensible new educational standards.
In another query, on the pros and cons of carrying a gun, the listed pros included the dubious point that carrying one signals you are a law-abiding citizen, which she characterized as a “matter of opinion,” especially given how many mass shootings have been carried out with legally obtained weapons.
Imagine having these results fed to a gullible public — including children — en masse, if Google rolls the still-experimental feature out more broadly.
But how will any of these problems be fixed when the number of controversial topics seems endless, and the internet they draw from is filled with potentially erroneous information and slanted garbage?
The original article contains 450 words, the summary contains 170 words. Saved 62%. I’m a bot and I’m open source!