My favorite comment in the lesswrong discussion: https://www.lesswrong.com/posts/DfrSZaf3JC8vJdbZL/how-to-make-superbabies?commentId=oyDCbGtkvXtqMnNbK
It’s not that eugenics is a magnet for white supremacists, or that rich people might give their children an even more artificially inflated sense of self-worth. No, the risk is that the superbabies might be Khan and kick-start the Eugenics Wars. Of course, this isn’t a reason not to make superbabies, it just means the idea needs some more workshopping via Red Teaming (hacker lingo is applicable to everything).
One comment refuses to leave me: https://www.lesswrong.com/posts/DfrSZaf3JC8vJdbZL/how-to-make-superbabies?commentId=C7MvCZHbFmeLdxyAk
The commenter makes an extended, tortured analogy to machine learning… in order to say that maybe genes correlated with IQ won’t add to IQ linearly. It’s an encapsulation of many lesswrong issues: veneration of machine learning, overgeneralization of comp sci into unrelated fields, a need to use paragraphs to say what a single sentence could, and a failure to actually state firm, direct objections to blatantly stupid ideas.
If we’re casting eugenics warriors, at least Ricardo Montalban had some bodacious pecs
I feel coding people like they’re software might not be much better than coding software to pretend it’s people
Don’t get sucked into a eugenics cult
You are right, but the Wronger chuds are way too far up their own buttholes to figure this out
You don’t understand, it is important to look at all diverse viewpoints (no not those), there might be some good ideas up there.
Working in the field of genetics is a bizarre experience. No one seems to be interested in the most interesting applications of their research. […] The scientific establishment, however, seems to not have gotten the memo. […] I remember sitting through three days of talks at a hotel in Boston, watching prominent tenured professors in the field of genetics take turns misrepresenting their own data […] It is difficult to convey the actual level of insanity if you haven’t seen it yourself.
Like Yudkowsky writing about quantum mechanics, this is cult shit. “The scientists refuse to see the conclusion in front of their faces! We and we alone are sufficiently Rational to embrace the truth! Listen to us, not to scientists!”
Gene editing scales much, much better than embryo selection.
“… Mister Bond.”
The graphs look like they were made in Matplotlib, but on another level, they’re giving big crayon energy.
Working in the [field] is a bizarre experience. No one seems to be interested in the most interesting applications of their research
depending on the field, it might be crackpottery or straight-up criminal. but if you post shit like this on linkedin, then it’s suddenly “inspiring” and “thought-provoking”
Our knowledge has advanced to the point where, if we had a safe and reliable means of modifying genes in embryos, we could literally create superbabies
and from that point on it’s all counterfactual
Am I misunderstanding the data? No, it is all the scientists who are wrong. (He is also ignoring the “scientists” who do agree with him, who all seem to have a special room for WW2 paraphernalia.)
Okay what is LW’s misunderstanding?
a fairly sizable chunk of everything, generally.
That there is a secret group of scientists who know something is up and they are suppressing this technology.
watching prominent tenured professors in the field of genetics take turns misrepresenting their own data
I meant their misunderstanding of the data regarding eugenics, though I’m no longer hoping to get that here.
secret group of scientists
The text you’ve just quoted says that it is all geneticists that are unwittingly wrong, the precise opposite of the proposition in your attempt to paint their discourse with the purples of crackpot conspiracy.
Soyweiser has likely accurately identified that you’re JAQing in bad faith, but on the slim off-chance you actually want to educate yourself, the RationalWiki page on Biological Determinism and Eugenics is a decent place to start to see the standard flaws and fallacies used to argue for pro-eugenic positions. RationalWiki has a scathing and sarcastic tone, but that tone is well deserved in this case.
To provide a brief summary: in general, the pro-eugenicists misunderstand correlation and causation, misunderstand the direction of causation, overestimate what little correlation there actually is, fail to understand environmental factors (especially systemic inequalities that might require leftist solutions to actually have any chance at fixing), and refuse to acknowledge the context of genetics research (i.e. all the neo-Nazis and alt-righters that will jump on anything they can get).
The lesswrongers and SSCs sometimes whine they don’t get fair consideration, but considering they take Charles Murray the slightest bit seriously they can keep whining.
Oh wait, you weren’t asking me to explain what I meant, you were asking me to defend the correctness of the professors in genetics vs a crackpot, at an event I wasn’t at, which I’m not qualified to judge since I’m not a professor in genetics (nor is Yud), right after I just mentioned that I don’t think unqualified people should be talking about this. So you were trying the Socratic method? How is that working out for you?
The text you’ve just quoted
Yes, and I’m quoting the LW crackpot. They are not saying the geneticists are unwittingly wrong; the article hints that they are intentionally wrong. It does this with some very dodgy analogies (no, making a chicken bigger isn’t like creating a 14-foot human; that is a crazy comparison due to the whole thing in biology where stuff works differently at different scales (see also the strength of ants), though it is powerful hype language) and unscientific shit (the random asspulled graphs). Also note that the whole article uses their fears of AI to promote doing more eugenics (with the weirdest logic imaginable: we should take care not to make mistakes and do everything slow in AI, so we need to do eugenics fast) and to claim that the professors are wrong/holding things back. And this is just what I can come up with after quickly skimming parts of the article (I don’t have the time/energy/expertise to do more anyway; imagine if I had to look up all the literature they reference and check whether it is correct (all ~5 of them, because you did notice that there were only about 5 links to actual scientific articles, right? Not an amount of backing I would want to base my political actions on (you noticed that too, right?))). It also hits classic crankery levels: not only are the professors missing/suppressing something, this thing is also revolutionary and could save humanity. (Also note he admits that the technology for editing even one gene in babies is not solved yet (but they are close!), which should make you wonder why they are so dismissive of “ethical issues”.)
It also doesn’t help that your reactions are pattern-matching to the “I’m just curious, could you explain yourself” kind of person we used to get on r/sneerclub, who 90% of the time wasn’t curious but was actually just very pro race science, or an annoying contrarian debatebro with YouTube-induced brain damage (which got them banned very quickly, so word of warning).
E: and oh, you did notice that people in the comments are trying to say they should rope the guy who was recently famous for not being able to keep his arm down into this, right? (Fucking Ents who are pretending that the rest of the world doesn’t affect them.)
the user has been gently directed to the debate club
I was tempted to give them their free ticket to the egress for saying “paint their discourse with the purples”.
lol these dorks don’t even realise they already made a superbabies and it sucks: https://www.rottentomatoes.com/m/super_babies_baby_geniuses_2
Also LWers: https://youtu.be/AFj3tuNukTs
“Fund my company and your child might live to adulthood and/or have sperm that glows green.”
But could even a generation of Johns von Neumann outsmart the love child of Skynet and Samaritan from Person Of Interest?
Heh. For a while there I had a phone live wallpaper that did the SamaritanOS You_Are_Being_Watched thing. Good times. Shame about Caviezel though.
All Person Of Interest fanfiction must by Internet law be extremely gay to spite Caviezel.
One of the most important projects in the world. Somebody should fund it.
The Pioneer Fund (now the Human Diversity Foundation) has been funding this bullshit for years, Yud.
So AGI is 0.5-2 years away, after which the singularity happens and, depending on how AI alignment goes, we are either immortal forever or everybody is diamondoid paperclips.
A normal human takes 18 years to grow to maturity. So for the sake of argument (yes yes, don’t hand it to ISIS), say a supergene baby can do that in 9 years (poor kid). Those timelines seem at odds with each other (and that is assuming the research were even possible now).
I know timelines and science fiction stories are a bit fluid but, come on, at least pretend you believe in it. I’m not saying he is full of shit but… no wait, I am saying that.
As we know, the critical age for a boy genius is somewhere from 11 (Harry Potter) to 15 (Paul Atreides), so the gene-enhanced baby ought to have a fair shot after a few months or so.
Superbabies is a backup plan; focus the energy of humanity’s collective genetic endowment into a single generation, and have THAT generation to solve problems like “figure out how to control digital superintelligence”.
The academic institutions in charge of exploring these ideas are deeply compromised by insane ideologies. And the big commercial entities are too timid to do anything truly novel; once they discovered they had a technology that could potentially make a few tens of billions treating single gene genetic disorders, no one wanted to take any risks; better to take the easy, guaranteed money and spend your life on a lucrative endeavor improving the lives of 0.5% of the population than go for a hail mary project that will result in journalists writing lots of articles calling you a eugenicist.
oh no, not a eugenicist!
I got caught on that quote too…
Superbabies is a backup plan; focus the energy of humanity’s collective genetic endowment into a single generation, and have THAT generation to solve problems like “figure out how to control digital superintelligence”.
Science-fiction solutions for science-fiction problems!
Let’s see what the comments say!
Considering current human distributions and a lack of 160+ IQ people having written off sub-100 IQ populations as morally useless […]
Dude are you aware where you are posting.
Just hope it never happens, like nuke wars?
Yeah that’s what ran the Cold War, hopes and dreams. JFC I keep forgetting these are kids born long after 1989.
Could you do all the research on a boat in the ocean? Excuse the naive question.
No, please keep asking the naive questions, it’s what provides fodder for comments like this.
(regarding humans having “[F]ixed skull size” and therefore a priori being unable to compete with AI):
Artificial wombs may remove this bottleneck.
This points to another implied SF solution. It’s already postulated by these people that humans are not having enough babies, or rather that the right kind of humans aren’t (wink wink). If we assume that they don’t adhere to the Platonic ideal that women are simply wombs and all traits are inherited from males, then to breed superbabies you need buy-in from the moms. Considering how hard it is for these people to have a normal conversation with the fairer sex, the prospect of them both convincing a partner to have a baby and getting her to let some quack from El Salvador mess with its genes seems insurmountable. Artificial wombs will resolve this nicely. Just do a quick test at around puberty to determine the God-given IQ level of a female, then harvest her eggs and implant them into artificial wombs. The less intelligent ones can provide eggs for the “Beta” and “Gamma” models…
But you don’t go from a 160 IQ person with a lot of disagreeability and ambition, who ends up being a big commercial player or whatnot, to 195 IQ and suddenly get someone who just sits in their room for a decade and then speaks gibberish into a youtube livestream and everyone dies, or whatever.
These people are insane.
esprit d’escalier (staircase wit)
This whole “superbabies will save us from AI” thing presupposes that the superbabies are immune to the pull of LW ideas. Just as LW discounts global warming, fascism, etc. to focus on runaway AI, who says the superbabies won’t have a similar problem? It’s just one step up the metaphorical ladder:
LW: “ugh normies don’t understand the x-risk of AI!”
Superbabies: “ugh our LW parents don’t understand the x-risk of Evangelion being actually, like, real!”