• acastcandream@beehaw.org · 1 year ago

    Once again reaffirming why I refuse to host an instance. If I ever do, I’m not federating with any of you degenerates lol

  • pinkdrunkenelephants@sopuli.xyz · 1 year ago

    I’m not gonna lie, I’m surprised it took this long for some dipshit to try something like this. Lemmy’s security has more holes in it than a piece of Swiss cheese and we’re fools if we think it’s viable enough for it to serve as a long-term home for new social media.

    We really, really need a better social structure than federation.

    • KairuByte@lemmy.dbzer0.com · 1 year ago

      Lemmy’s security has more holes in it than a piece of Swiss cheese

      This has very little to do with security. There’s nothing inherently “insecure” about posting CSAM, since the accounts and images were likely posted just like any others.

      What really needs to happen is some sort of detection of that kind of content (which would likely require a large change to the code) or additional moderation tools.
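
      For a rough sense of what such detection could look like, here is a minimal, hypothetical sketch in Python: it compares a perceptual hash of each uploaded image against a blocklist of hashes of known abuse material, which is roughly how industry tools such as PhotoDNA operate. The blocklist entry, the threshold, and the function names are invented for illustration; this is not Lemmy’s code or API.

      ```python
      # Hypothetical sketch: screen uploads against a blocklist of perceptual hashes.
      # The blocklist value is a placeholder; a real deployment would use hash lists
      # distributed by a trusted reporting organisation, not hashes curated locally.
      from PIL import Image
      import imagehash

      BLOCKLIST = {imagehash.hex_to_hash("0f0f0f0f0f0f0f0f")}  # placeholder entry
      MAX_DISTANCE = 5  # Hamming-distance threshold for a "near match"

      def is_flagged(path: str) -> bool:
          """Return True if the image at `path` is a near-duplicate of a blocklisted hash."""
          h = imagehash.phash(Image.open(path))
          return any(h - bad <= MAX_DISTANCE for bad in BLOCKLIST)

      if __name__ == "__main__":
          import sys
          for p in sys.argv[1:]:
              print(p, "FLAGGED" if is_flagged(p) else "ok")
      ```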

    • neeeeDanke@feddit.de · 1 year ago

      I know that guy Tobias Fünke, although he also is an analyst. He had some clever abbreviation for that as well!

  • Carlos Solís@communities.azkware.net · 1 year ago

    In the meantime, my YunoHost-based instance, which still hasn’t managed to get Pict-RS working and therefore can’t store images even if it wanted to, is doing juuuuust fine

    • Etienne_Dahu@jlai.lu · 1 year ago

      Come to think of it, if you’re the only user, it’s kinda protecting you, isn’t it? (hello fellow Yunohost user!)

  • db2@sopuli.xyz · 1 year ago

    There’s always someone who doesn’t mind ruining it for everyone else. Probably safest to just delete all the images, that way there’s no need to look.

    • Szymon@lemmy.ca · 1 year ago

      Bad actors will try to nuke the entire platform to maintain a monopoly on this format of communication and community.

    • robotrash@lemmy.robotra.sh · 1 year ago

      Federation still causes those images to be saved on your hardware, even if the account that posted them is hosted somewhere else.
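
      For illustration, here is a much-simplified sketch of that mechanism (not Lemmy’s actual implementation; the cache directory and helper function are invented): when a post from a remote instance arrives over ActivityPub, the receiving server downloads the attached image and stores its own copy so it can serve it locally, which is how the file ends up on your hardware.

      ```python
      # Simplified illustration of why federated media ends up on the receiving
      # server's disk. Paths and structure are hypothetical, not Lemmy's code.
      import pathlib
      import urllib.request

      CACHE_DIR = pathlib.Path("./media-cache")  # placeholder for the local media store

      def cache_remote_image(activity: dict) -> pathlib.Path | None:
          """Download the first image attachment of an incoming post to local disk."""
          for attachment in activity.get("attachment", []):
              if attachment.get("mediaType", "").startswith("image/"):
                  url = attachment["url"]
                  CACHE_DIR.mkdir(exist_ok=True)
                  dest = CACHE_DIR / url.rsplit("/", 1)[-1]
                  urllib.request.urlretrieve(url, dest)  # the copy now lives on this server
                  return dest
          return None
      ```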

      • whofearsthenight@lemm.ee · 1 year ago

        This is kind of a major problem with Lemmy, and the fact that they don’t have CSAM detection on the roadmap is going to make wide adoption a near impossibility. The other thing, though, is that even automated CSAM detection isn’t 100%, so hosting your own instance likely means you’re going to have to view CSAM and other fucked up shit at some point to properly moderate it, even if you’re just hosting for yourself. Tbh I was strongly considering hosting my own instance because it’s not like, that hard/expensive, but this saga has turned me completely off of that idea, even just for myself.

        This actually makes me wonder how much Reddit mods deal with this type of thing, as opposed to paid employees like at Facebook, which has a paid army handling content moderation. Oh, and speaking of xitter, which has neither volunteer mods nor a moderation team since Elon fired them all, I assume the freaks have just decided that’s their hosting platform of choice.

  • 𝕸𝖔𝖘𝖘@infosec.pub · 1 year ago

    I’m glad s/he was able to nuke the CSAM, even if other material was nuked with it. This crap is why I’m not hosting.

    Please, call it CSAM (child sexual abuse material) and not CP (child pornography). The children in these photos/videos can’t make pornography; they’re sexually abused into making this material. CP insinuates that it’s legitimate porn with children. CSAM, on the other hand, calls it what it is: sexual abuse of children.

    • Trantarius@programming.dev · 1 year ago

      That is needlessly pedantic. I have never heard of anyone using the word pornography to imply legality or moral acceptability. There is no such thing as “legitimate” CP, so there is no need to specify that it’s not ok every time it is mentioned. No one in their right mind would presume he’s some kind of CP supporting monster for failing to do so.

      • TheFrirish@jlai.lu · 1 year ago

        If we spent more time fixing things rather than naming them the world would be a better place.

      • 𝕸𝖔𝖘𝖘@infosec.pub · edited · 1 year ago

        No one in their right mind would assume that OP is. But the term was created to legitimize the material. So, while you’re correct that it is picky, it is picky for a reason. Words are powerful. We should fight to keep from lending that term legitimacy, among other things.

        • Trantarius@programming.dev · edited · 1 year ago

          But the term was created to legitimize the material.

          Do you have a source for that? I can’t find anything that states the origin of the term itself is seedy. Besides, it’s just a plain description: it’s pornography with children in it.

          The only sources I can find that support CSAM over CP claim that CP somehow implies consent. But I’m saying that simply isn’t the case. I am not saying that words aren’t powerful. I am not saying that no words ever need to be changed. I am saying that these words don’t need to be changed.

          Based on those same sources, I’d speculate that this outrage is just misplaced anger. They almost immediately start talking about how bad sexual abuse is, which is not really relevant to whether it should be called CP or CSAM. Just because CP is bad does not mean the term CP is bad.

          • 𝕸𝖔𝖘𝖘@infosec.pub · 1 year ago

            Honestly, I don’t care what you choose to call it. Our world grants us certain freedoms, and how we use those freedoms will set the stage for the future world. As you’ve said, CP implies consent. If you would like to spread the implication that these children somehow consented to be part of this sexual abuse material, then keep calling it CP. I, along with the rest of those who wish not to spread the lie that they consented, will call it CSAM.