Hello World! As many of you have probably noticed, there is a growing problem on the internet with undisclosed bias in both amateur and professional reporting. While not every outlet can be C-SPAN or Reuters, we also believe it’s impossible to remove the human element from the news, especially when it concerns, well, humans.

To this end, we’ve created a media bias bot, which we hope will keep everyone informed about the WHO, not just the WHAT, of posted articles. The bot uses Media Bias/Fact Check to add a simple reply showing the bias of the linked source, which we feel is especially important with the US election coming up. It will also link to Ground.News, which we feel is a great resource for seeing the WHOLE coverage of a given article and/or topic.
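For anyone curious what the bot actually does when it replies, here’s a rough sketch of the idea. This is illustrative only, not our production code: the ratings table, the domains, and the Ground.News link below are placeholders.

```python
# Rough sketch of the bot's reply logic. Illustrative only: the ratings table,
# the domains, and the Ground.News link are placeholders, not our production code.
from urllib.parse import urlparse

# Hypothetical lookup table; the real bot consults Media Bias/Fact Check.
MBFC_RATINGS = {
    "reuters.com": {"bias": "Least Biased", "factual": "Very High", "credibility": "High"},
}

def build_reply(article_url: str) -> str | None:
    """Return the reply the bot would post for a linked article, or None if the source is unknown."""
    domain = urlparse(article_url).netloc.removeprefix("www.")
    rating = MBFC_RATINGS.get(domain)
    if rating is None:
        return None
    return (
        f"Media Bias/Fact Check for {domain}:\n"
        f"- Bias: {rating['bias']}\n"
        f"- Factual reporting: {rating['factual']}\n"
        f"- Credibility: {rating['credibility']}\n"
        "See the whole coverage on Ground.News: https://ground.news/ (placeholder link)"
    )

if __name__ == "__main__":
    print(build_reply("https://www.reuters.com/world/example-story"))
```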

As always, feedback is welcome. This is an active project that we really hope will benefit the community.

Thanks!

FHF / LemmyWorld Admin team 💖

  • @breakfastmtn@lemmy.ca

    > Got rebuttals for any of my criticisms about the methodology?

    I do!

    I think the importance of American bias is overstated. What matters is that they’re transparent about it. That bias also impacts the least important thing they track. People often fixate on that metric even though it has little impact on the other metrics or on the most important question for this community: ‘how likely is it that this source is telling the truth?’

    Left and right are relative terms that change drastically over time and space. They even mean different things at local and national levels within the same country. That’s not really an MBFC problem, it’s a the-world-is-complicated problem that isn’t easily solved. And it’s not like they’re listing far-right publications as far-left. Complaints are almost always like, “this source is center, not center-left!” It’s small problems in the murky middle that shouldn’t be surprising or unexpected.

    It’s also capturing something that happens more at the extremes where publications have additional goals beyond news reporting. Ignoring Fox’s problem with facts/misinfo, it doesn’t really bother me that they’re penalized for wanting to both report the news and promote a right-wing agenda. Promoting an agenda and telling the truth are often in conflict (note Fox’s problem with facts/misinfo). CBC News, for example, probably should have a slightly higher score for having no agenda beyond news reporting.

    It might matter more if it impacted the other metrics, but it doesn’t really. Based on MBFC’s methodology, it’s actually impossible for editorial bias alone to impact the credibility rating without having additional problems: you can lose a maximum of 2 points for bias, but must lose 5 to be rated “medium credibility”. I don’t know why FAIR is rated highly factual (and I’d love for them to be a bit more transparent about it), but the criticism that bias led to them being rated both highly factual and highly credible feels like less than a death blow. If it’s a problem, it seems like a relatively small one.
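    To put rough numbers on that, here’s a minimal sketch; the point values are my reading of the methodology page, not an official MBFC calculator:

    ```python
    # Sketch of the constraint described above. The point values come from my
    # reading of MBFC's published methodology, not from an official tool.
    MAX_BIAS_POINT_LOSS = 2      # the most a source can lose for editorial bias alone
    MEDIUM_CREDIBILITY_LOSS = 5  # points a source must lose to drop to "medium credibility"

    def bias_alone_reaches_medium(points_lost_to_bias: int) -> bool:
        """Can editorial bias by itself push a source down to 'medium credibility'?"""
        capped = min(points_lost_to_bias, MAX_BIAS_POINT_LOSS)
        return capped >= MEDIUM_CREDIBILITY_LOSS

    # Even the maximum possible bias penalty falls short of the threshold.
    print(bias_alone_reaches_medium(10))  # False
    ```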

    MBFC also isn’t an outlier compared to other organizations. This study looked at 6 bias-monitoring organizations and found them basically in consensus across thousands of news sites. If they had a huge problem with bias, it’d show in that research.

    On top of that, none of this impacts this community at all. It could be a problem if the standard here was ‘highest’ ratings exclusively, but it isn’t, and no one’s proposing that it should be. I post stories from the Guardian regularly without a problem, and they’re rated mixed factual and medium credibility for failing a bunch of fact checks, mostly in op-eds (and I think the Guardian is a great, paywall-less paper that should fact-check a bit better).

    So I think the things you point out are well buffered by their methodology and by not using the site in a terrible, draconian way.

    • @TrippyFocus@lemmy.ml

      > I think the importance of American bias is overstated. What matters is that they’re transparent about it. That bias also impacts the least important thing they track.

      It affects the overall credibility rating of the source; how is that the least important thing? They also seem to let it affect the factual reporting rating, despite not clearly stating that in the methodology.

      > Based on MBFC’s [methodology](https://mediabiasfactcheck.com/methodology/), it’s actually impossible for editorial bias alone to impact the credibility rating without having additional problems

      This is only true if you frame it as “a great source can’t have its credibility rating lowered.” A source that isn’t great on facts can still get a high credibility rating if it’s deemed centrist enough, which again is arbitrary, based on (effectively) one guy’s personal opinion.

      High credibility score requirement: 6 points

      Example 1: Factual Reporting Mixed (1) + no left/right bias (3) + high traffic (2) = 6

      Example 2: Factual Reporting Mostly Factual (2) + no left/right bias (3) + medium traffic (1) = 6

      See how weighing credibility on a (skewed) left/right bias metric waters this down? Both of these examples would get high credibility.
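      A minimal sketch of that arithmetic (the point values are my reading of the methodology page; this isn’t an official calculator):

      ```python
      # Illustration of the point math above. The values are my reading of MBFC's
      # methodology page; this is not an official calculator.
      HIGH_CREDIBILITY_THRESHOLD = 6

      def credibility_points(factual: int, bias: int, traffic: int) -> int:
          # "No left/right bias" alone is worth 3 points, which is what lets a
          # weaker factual record still clear the high-credibility bar.
          return factual + bias + traffic

      example_1 = credibility_points(factual=1, bias=3, traffic=2)  # Mixed, no bias, high traffic
      example_2 = credibility_points(factual=2, bias=3, traffic=1)  # Mostly Factual, no bias, medium traffic

      for total in (example_1, example_2):
          label = "high credibility" if total >= HIGH_CREDIBILITY_THRESHOLD else "lower"
          print(total, label)
      ```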

      > On top of that, none of this impacts this community at all. It could be a problem if the standard here was ‘highest’ ratings exclusively, but it isn’t.

      That’s a fair point, and I did state in my original post that, despite my own feelings, I’d be open to something like this if the community had been more involved in choosing one (and in deciding whether one is necessary at all), and if the bot’s post clearly called out its biases, maybe with an explanation of its methodology and the inherent risks in it.

      The way it’s been pushed by the mod first, without polling the community, and the reaction to criticism (some of which was constructive), is really my main issue here.

      • @breakfastmtn@lemmy.ca

        > This is only true if you frame it as “a great source can’t have its credibility rating lowered.” A source that isn’t great on facts can still get a high credibility rating if it’s deemed centrist enough, which again is arbitrary, based on (effectively) one guy’s personal opinion.

        The impact either way is slight. I’m sure you could find a few edge cases to argue about because no methodology is perfect, but each outlier represents a vanishingly small (~0.01%) share of their content. When you look at rigorous research on the MBFC dataset, though, the effect just isn’t really there.

        Here’s another study that concludes that the agreement between bias-monitoring organizations is so high that it doesn’t matter which one you use. I’ve looked, and I can’t find research that finds serious bias or methodological problems. Looking back at the paper I posted in my last comment, consensus across thousands of news organizations is just way too high to be explainable by chance. If it were truly as arbitrary as people often argue, MBFC would be an outlier. If all the methodologies were bad, the results would be all over the map, because there are many more ways to make a bad methodology than a good one. What the research says is that if one methodology is better than the others, it isn’t much better.

        Again, I think you make a really good argument for why MBFC and sites like it shouldn’t be used in an extreme, heavy-handed way. But what matters is whether it has enough precision for our purposes. Like, if I’m making bread, I don’t need a scale that measures in thousandths of a gram. A gram scale is fine. I could still churn out a top-shelf loaf with a scale that measures in 10-gram units.

        This bot is purely informational. People are reacting like it’s a moderation change, but it isn’t: MBFC continues to be one resource among many that mods use to make decisions. Many react as though MBFC declares a source either THE BEST or THE WORST (I think a lot of those folks aren’t super great with nuance), but what it mostly does is say ‘this source is fine, but there’s additional info or context worth considering.’

        Critics often get bent out of shape about the ranking but almost universally neglect the fact that, if you click that link, there’s a detailed report on each source covering its ownership history, funding model, publishing history, biases, and the press freedom of the country it’s in. Almost every time, there are reasonable explanations for the rankings in the report. I have not once seen someone say, like, ‘MBFC says this is owned by John Q. Newspaperman but it’s actually owned by the Syrian government,’ or ‘they claim they had a scandal with fabricated news but that never happened.’ Is there a compelling reason why we’re worse off knowing that information? If you look at the actual reports for Breitbart and the Kyiv Independent, is there anything in there that we’re better off not knowing?

        • @TrippyFocus@lemmy.ml

          Like I kinda said in my last paragraphs, you’ve got fair points that it may be good enough for what it’s being used for here (despite its clear biases), since it’s not being used to disallow posts. Although other commenters have said it has a pro-Zionist bias as well, which is honestly more concerning than the things I’ve pointed out. I haven’t had time to check beyond the ADL one.

          Overall, my main issue is that the community wasn’t really asked whether one was desired, which one should be used, how it should be used, etc. Because of that, and the lack of a good response from the poster, I’ve already decided to follow other world news communities instead of this one.