Via Reddit's sneerclub. Thanks u/aiworldism.
I have called LW a cult incubator for a while now, and while the term hasn't caught on, it's nice to see more reporting on the problem that LW makes you more likely to join a cult.
https://www.aipanic.news/p/the-rationality-trap is the original link, for the people who don't like archive.is. I used the archive because I don't like Substack and want to discourage its use.
I don’t know why they decided to put an AI slop image right up top in their banner and then repeat it later.
Falling down the “Rationality Trap” rabbit hole (illustration by ChatGPT 5 Thinking)
Yeah, the washed-out faux-Ghibli aEStheTiC was a bit of a clue. Gonna go scrub my brains with the Tenniel illustrations for Alice now, with a Ralph Steadman chaser.
I don’t know why they decided to put an AI slop image right up top in their banner and then repeat it later.
I’ve written off otherwise informative newsletters because of this. It’s, quite literally, filler with less information than the prompt typed in to generate the image. So what else are they bullshitting me about?
I hadn’t heard of Black Lotus. Also, the article fails to mention rationalist/LessWrong ties to that AI-doom-focused Zen Buddhism cult that was discussed on LessWrong recently (looking it up, the name is Maple), so you can add that to the cult count.