Hey it might be nice having some intelligence in charge again. We haven’t had that since that hole in the ozone layer killed off the lizard people decades ago.
I am actually hoping for AGI to take over the world but in a good way. It’s just that I worry about the risk of it being misaligned with “human goals” (whatever that means). Skynet seems a bit absurd but the paperclip maximizer scenario doesn’t seem completely unlikely.
Human goals are usually pretty terrible. Become the wealthiest subset of humans. Eradicate some subset of humans. Force all other humans to align with a subset of humans. I guess cure diseases sometimes. And some subsets probably fuck.
We need an adult.
why would adjusted gross income take over the world?
bring in the tropical iguana people