I lost my job after AI recruitment tool assessed my body language, says make-up artist

A make-up artist says she lost her job at a leading brand after an AI recruitment tool that used facial recognition technology marked her down for her body language.

    • Shirasho
      43 · 10 months ago

      For a brief moment I worked in that industry as a programmer. The whole point is not to find the most qualified candidate but to find the one that fits into the company culture the most in order to reduce turnover. These algorithms will throw away applications from people of color because they have “behaviors not in line with the company culture” or applications from disabled people because they would “not react properly to certain situations”.

      Of course they aren’t explicitly rejecting these people, but the questions and answers on the tests for applications are specifically and painstakingly crafted to filter out these people without making it clear what type of person the question is trying to filter out.

      This doesn’t necessarily have to do with the AI in question, but my point is that the entire hiring/firing process is totally fucked, and companies are constantly looking for ways to get around discrimination laws.

      • @linearchaos@lemmy.world
        14 · 10 months ago

        Using an AI to grade someone’s body language seems like a horrible thing.

        Although I will say there is some validity to being careful about who you hire company culture wise and I’m not talking about race gender or disability.

        We’ve turned down the ‘best programmer’ numerous times, some people that really had some solid skills, because they came in aggressive and brash.

        One guy got his “sorry, but no thanks” and replied, “Look at my resume — I’m an absolute master at everything you do.” He wasn’t kidding; he was very good. We told him we recognized his skills, but that he had been difficult and abrasive just in the interviews, and there was no way we could subject the rest of the company to that. He unleashed a string of profanities and asked whether we could just have him work somewhere else on his own separate projects. No, that wasn’t going to be an option.

        Nobody wants to hire somebody who’s going to make a workplace toxic. That means sometimes you turn down some of the better-skilled candidates, but you can always find somebody nicer and train/educate them.

        As far as race, gender, and quirks go, we have you meet with everybody, and a group takes you out to lunch. You can be shy, flighty, uncomfortable, awkward — the basic test is: can you mostly do the job, and would other people want to work with you? If the people come back with the answer of no, we don’t bring you on. We’ve done that since the very beginning, so everybody there is already pretty much a tolerant, nice person.

        I had this one guy interview for my department. He made it through the morning interviews with no problems. Gold star. Then the lunch crew took him out to lunch. He turned it into a people-watching affair and started making horrible comments about all the people coming in the door. One of the strongest personalities I know was at that lunch, and he came back to me and said, “He makes me very uncomfortable.” I sent him packing.

        I hope we don’t get to the point where all jobs are using AI to weed people out without humans checking behind it.

      • @Nawor3565@lemmy.blahaj.zone
        7 · 10 months ago

        Funny. That sounds exactly like how they tried to use “intelligence tests” to prevent Black people from voting. The questions didn’t explicitly exclude Black people, but they were written in a vague and subjective way so that the test-giver could claim that any answer was right/wrong and thereby exclude anyone they wanted.

    • @TheObviousSolution@lemm.ee
      13 · 10 months ago

      They are not using it to stop bias. If history has proven anything, it’s that AI is biased as shit. They are using AI to excuse bias, because “computers, ergo cold hard logic,” while ignoring that the models aren’t trained on ethical and moral considerations.