Original thread:

  • Amaltheamannen@lemmy.ml · 1 year ago

    Liberals are right of center. America is almost the only place on Earth where liberalism is considered left wing.

      • jaybone@lemmy.world · 1 year ago

        I wonder if these are people from outside of the US referring to what we call libertarians.

        But anyway, the “left” in the US is certainly nowhere near as left as the left in, say, European countries.

    • FlexibleToast@lemmy.world · 1 year ago

      I always hear this, but it doesn’t make sense to me. It sounds like we define “liberal” differently. What specific stances make a liberal right of center? I’d like to know whether we’re defining the word the same way.

      • JungleJim@sh.itjust.works · 1 year ago (edited)

        I’ve thought of it this way: a liberal is a master giving liberally, showering their workers with treats to keep them happy and productive. A leftist believes in class consciousness and the class struggle, and that human rights are innate, not something handed down from masters.