• 0 Posts
  • 4 Comments
Joined 2 months ago
Cake day: May 24th, 2025



  • Alternate, less hostile interpretation:

    Non-white patients have, across the board, received worse medical care and had worse medical outcomes than white patients. The American medical system is racially biased. That’s not identity politics, that’s statistics.

    And you can’t effectively fight racial bias, or judge how well your efforts to fight it are going, if your data set doesn’t include information on race.

    Or, as the US government put it:

    Racial and ethnic disparities in health care and health outcomes have been well documented. Such disparities are particularly relevant for Medicaid given that more than half of the program’s 73 million beneficiaries identify as Black, Hispanic, Asian American, or another non-white race or ethnicity. Addressing disparities and promoting equity in coverage, access, experience, and outcomes among historically marginalized and underserved populations will depend in part on having complete and systemically collected data by race and ethnicity. Source.

    If you don’t measure it, you can’t manage it, as the saying goes.

  • None of this is to say there are absolutely no concerns about LLMs. Obviously there are. But there is no reason to suspect LLMs are going to end humanity unless some moron hooks one up to nuclear weapons.

    And what are the odds LLMs will be used to write code for DoD systems? Or that AI agents will be integrated into routine nuclear power plant operations? I think the odds of some moron hooking up a nuke to a brainless language generator are greater than you think.