Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post; there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    • @blakestacey@awful.systems
      6 months ago

      Hmm, a xitter link, I guess I’ll take a moment to open that in a private tab in case it’s passingly amusing…

      To the journalists contacting me about the AGI consensual non-consensual (cnc) sex parties—

      OK, you have my attention now.

      To the journalists contacting me about the AGI consensual non-consensual (cnc) sex parties—

      During my twenties in Silicon Valley, I ran among elite tech/AI circles through the community house scene. I have seen some troubling things around social circles of early OpenAI employees, their friends, and adjacent entrepreneurs, which I have not previously spoken about publicly.

      It is not my place to speak as to why Jan Leike and the superalignment team resigned. I have no idea why and cannot make any claims. However, I do believe my cultural observations of the SF AI scene are more broadly relevant to the AI industry.

      I don’t think events like the consensual non-consensual (cnc) sex parties and heavy LSD use of some elite AI researchers have been good for women. They create a climate that can be very bad for female AI researchers, with broader implications relevant to X-risk and AGI safety. I believe they are somewhat emblematic of broader problems: a coercive climate that normalizes recklessness and crossing boundaries, which we are seeing playing out more broadly in the industry today. Move fast and break things, applied to people.

      There is nothing wrong imo with sex parties and heavy LSD use in theory, but combined with the shadow of 100B+ interest groups, it leads to some of the most coercive and fucked up social dynamics that I have ever seen. The climate was like a fratty LSD version of 2008 Wall Street bankers, which bodes ill for AI safety.

      Women are like canaries in the coal mine. They are often the first to realize that something has gone horribly wrong, and to smell the cultural carbon monoxide in the air. For many women, Silicon Valley can be like Westworld, where violence is pay-to-play.

      I have seen people repeatedly get shut down for pointing out these problems. Once, when trying to point out these problems, I had three OpenAI and Anthropic researchers debate whether I was mentally ill on a Google document. I have no history of mental illness; and this incident stuck with me as an example of blindspots/groupthink.

      I am not writing this on behalf of any interest group. Historically, many of the OpenAI-adjacent shenanigans have been blamed on groups with weaker PR teams, like Effective Altruism and rationalists. I actually feel bad for the latter two groups for taking so many undeserved hits. There are good and bad apples in every faction. There are so many brilliant, kind, amazing people at OpenAI, and there are so many brilliant, kind, and amazing people in Anthropic/EA/Google/[insert whatever group]. I’m agnostic. My one loyalty is to the respect and dignity of human life.

      I’m not under an NDA. I never worked for OpenAI. I just observed the surrounding AI culture through the community house scene in SF, as a fly-on-the-wall, hearing insider information and backroom deals, befriending dozens of women and allies and well-meaning parties, and watching many of them get burned. It’s likely these problems are not really about OpenAI but symptomatic of a much deeper rot in the Valley. I wish I could say more, but probably shouldn’t.

      I will not pretend that my time among these circles didn’t do damage. I wish that 55% of my brain was not devoted to strategizing about the survival of me and of my friends. I would like to devote my brain completely and totally to AI research— finding the first principles of visual circuits, and collecting maximally activating images of CLIP SAEs to send to my collaborators for publication.

      • @earthquake@lemm.ee
        6 months ago

        Useful context: this is a followup to this post:

        The thing about being active in the hacker house scene is you are accidentally signing up for a career as a shadow politician in the Silicon Valley startup scene. This process is insidious because you’re initially just signing up for a place to live and a nice community. But given the financial and social entanglement of startup networks, you are effectively signing yourself up for a job that is way more than meets the eye, and can be horribly distracting if you are not prepared for it. If you play your cards well, you can have an absurd amount of influence in fundraising and being privy to insider industry information. If you play your cards poorly, you will be blacklisted from the Valley. There is no safety net here. If I had known what I was getting myself into in my early twenties, I wouldn’t have signed up for it. But at the time, I had no idea. I just wanted to meet other AI researchers.

        I’ve mind-merged with many of the top and rising players in the Valley. I’ve met some of the most interesting and brilliant people in the world who were playing at levels leagues beyond me. I leveled up my conception of what is possible.

        But the dark side is dark. The hacker house scene disproportionately benefits men compared to women. Think of frat houses without Title IX or HR departments. Your peer group is your HR department. I cannot say that everyone I have met has been good or kind.

        Socially, you are in the wild west. When I joined a more structured accelerator later, I was shocked by the amount of order and structure there was in comparison.

        • @self@awful.systems
          6 months ago

          it is just straight up fucked that there’s a hacker house scene where you’ll be so heavily indoctrinated (with sexual coercion and forced drug use to boot (please can the capitalists leave acid the fuck alone? also, please can the capitalists just leave?)) that a fucking Silicon Valley startup accelerator seems like a beacon of sanity

          like, as someone who was indoctrinated into a bunch of this hacker culture bullshit as a kid (and a bunch of other cult shit from my upbringing before that), I get a fucking gross feeling inside imagining the type of grooming it takes to get someone to want to join up with a “just hacker culture and AI research 24/7, abandon your family and come here” house, and then stay in that fucking environment with all the monstrous shit going on because you’ve given up everything else. that shit brings me back in a bad way.

          • @earthquake@lemm.ee
            6 months ago

            I want to tell myself that it’s probably a tiny scene of 10s to 100s, that it’s just vestigial cult mindset to assume that what she went through is the real SV VC scene, and that most of it is just the more pedestrian techbro buzzword pptx deck tedium… but even then, it’s still incredibly tragic for everyone who went through and is going through that manipulation and abuse.

      • @earthquake@lemm.ee
        6 months ago

        Very grim that she feels the need to couch her damning report with “some, I assume, are good people” for a paragraph. I guess that’s one of her survival strategies.

      • @o7___o7@awful.systems
        6 months ago

        Good thing that none of this mad-science bullshit is in danger of working, because I don’t think that the spicy autocorrect leadership cadre would hesitate to hurt people if they could build something that’s actually impressive.

        • David Gerard (OP)
          6 months ago

          some of these guys get in touch with me from time to time, apparently i have a rep as a sneerer (I am delighted)

          (i generally don’t have a lot specific to add - I’m not that up on the rationalist gossip - except that it’s just as stupid as it looks and frequently stupider, don’t feel you have to mentally construct a more sensible version if they don’t themselves)