In the past two weeks, the site has been very unstable: one second a page loads, the next it won't load at all. lemmy.world's status fluctuates a lot on lemmy-status.org, and there have been times when the site was down for hours.

  • th3dogcow · 1 year ago

    There have been numerous DDoS attacks etc. on the site recently. Ruud, the operator, has been very forthcoming with information.

    • Draconic NEO · 1 year ago

      Didn’t they also say that other kinds of attacks were being used as well, not just DDoS? I may be remembering wrong, but I think I heard one of the admins mention a database-related attack, where they overloaded the database with download requests or something like that.

  • freamon · 1 year ago · edited

    Either the problems with its API responses are breaking lemmy.world, or a broken lemmy.world is causing problematic API responses.

    Currently, you can ask lemmy.world for page one billion of its communities and it’ll return a response (for the communities it thinks it has on that page, rather than an empty one, as it should). For something like lemmyverse.net, this means its crawler can never get to the end of a scan, and some apps may be trying to endlessly load the list.

    References:
    https://github.com/tgxn/lemmy-explorer/issues/139
    https://lemmy.world/post/2651283
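A minimal sketch of why that behavior breaks crawlers (the `fetch_communities` function is a hypothetical stand-in for the API, not the real endpoint): a crawler that stops on an empty page never terminates, unless it imposes its own page cap.

```python
# Hypothetical stand-in for lemmy.world's community listing at the time:
# any page number, however large, returns a non-empty list instead of [].
def fetch_communities(page):
    return [f"community_{page}_{i}" for i in range(3)]  # never empty

def naive_crawl(max_pages=None):
    """Stop-on-empty crawling: loops forever here unless a page cap is set."""
    results, page = [], 1
    while True:
        batch = fetch_communities(page)
        if not batch:  # the expected terminator -- never fires against this API
            break
        results.extend(batch)
        page += 1
        if max_pages is not None and page > max_pages:  # crawler's own safeguard
            break
    return results

print(len(naive_crawl(max_pages=5)))  # 15 (3 items x 5 pages); uncapped, it hangs
```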

    • @marsara9@lemmy.world · 1 year ago · edited

      This is the same reason I had to turn off my search engine’s crawler.

      There were changes made to the API to ignore any page > 99: if you ask for page 100 or page 1_000_000_000, you get the first page again. This caused my crawler to never finish fetching “new” posts.

      lemm.ee, on the other hand, made a similar change, but anything over 99 returns an empty response. lemm.ee also flat-out ignores sort=Old, always returning an empty array.

      Both of these servers did it for, I assume, the same reason: using a high page number significantly increases the response time. Before pages over 99 were blocked, those responses could take over 8-10 seconds, while asking for a low page number would return in 300 ms or less. Since the existing queries are hard (and maybe impossible) to optimize, the problematic APIs were simply disabled for now.
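A client can cope with both workarounds, empty pages past the end (lemm.ee style) and the first page repeating (lemmy.world style), by also stopping when a page contributes no unseen items. A sketch, with a hypothetical simulator standing in for the API:

```python
def make_wrapping_api(pages):
    """Simulates the lemmy.world behavior: out-of-range pages return page 1."""
    def fetch(page):
        return pages[page - 1] if page <= len(pages) else pages[0]
    return fetch

def crawl(fetch):
    """Paginate until a page is empty OR contains only already-seen items."""
    seen, results, page = set(), [], 1
    while True:
        batch = fetch(page)
        if not batch:                        # empty-past-the-end style
            break
        ids = {item["id"] for item in batch}
        if ids <= seen:                      # page wrapped around to the start
            break
        results.extend(i for i in batch if i["id"] not in seen)
        seen |= ids
        page += 1
    return results

pages = [[{"id": 1}, {"id": 2}], [{"id": 3}]]
print(len(crawl(make_wrapping_api(pages))))  # 3 -- terminates despite the wrap-around
```

The duplicate-page check also guards against any future server that recycles pages differently, since it keys on item IDs rather than on page numbers.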

    • Oh wow, I thought this was a bug in the Lemmy API. I was implementing comment paging in my app the other day and noticed it would just infinitely load pages of duplicate comments. I guess I should have tested with another instance before disabling the feature.
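      One way an app can guard against duplicate pages like that (a sketch; the comment shape here is simplified, not the real Lemmy API schema) is to track the IDs it has already rendered and stop paging once a page adds nothing new:

      ```python
      def load_more(existing_ids, batch):
          """Keep only comments whose IDs haven't been rendered yet.

          Returns the fresh comments; an empty result tells the caller
          the page was all duplicates, so it should stop requesting more.
          """
          fresh = [c for c in batch if c["id"] not in existing_ids]
          existing_ids.update(c["id"] for c in fresh)
          return fresh

      ids = set()
      print(len(load_more(ids, [{"id": 1}, {"id": 2}])))  # 2: both new
      print(len(load_more(ids, [{"id": 2}, {"id": 1}])))  # 0: duplicate page, stop
      ```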

      • freamon · 1 year ago

        Yep. Worked this time, so I’ve edited my comment.

  • Draconic NEO · 1 year ago

    It’s been under attack by malicious actors: some DDoS, some database-related (they overload the database with download requests). Likely because lemmy.world is one of the largest instances, attackers figure they can cause the most damage by targeting it.