as seen here and here, some instances are feeding posts wholesale to prompts, for what seem like extremely unsound reasons to me

any of you run into this shit yet?

  • db0@lemmy.dbzer0.com (banned from community) · 7 points · edited · 2 days ago
    I don’t accept that the LLM summary didn’t influence the decision, because the mod in question confirmed that he knew the LLM agreed with him (that’s bias, and also not something LLMs are actually capable of doing), and because if it didn’t influence anything, then the summary is worthless.

    In this case, according to the admin in question, the LLM summary came after the decision, as a sort of test. I.e. the admin made a decision, then wanted to see if an LLM would subsequently agree with it. In this specific case, it did, which is why they misguidedly decided to keep its summary in the modlog (opening us up to this whole shitstorm). But ultimately, that admin decided anyway that LLMs in the mix are not good at all, which is why you never again saw an LLM summary in the modlog.

    I can only put so much fault on a person for just testing shit out, yanno? I am not happy that they decided to use the output of the test, because they are not familiar with how quickly disinfo breeds, but ultimately they came to the right decision anyway. If they had not, and had raised the idea of using LLMs officially, they would have been shut down.

    • [deleted]@piefed.world · 9 points · 2 days ago
      Having an LLM confirm a decision is the same thing as having the LLM make a decision and then checking whether the mod agrees with it. If they might have chosen not to rule that way based on the LLM output, then the LLM was part of the decision-making process. The order does not matter.

      Having the LLM output something that implies a determination, at any step, automatically makes it part of the process.

        • self@awful.systems · 5 points · 2 days ago
          hey fucko, you know we don’t have to take their word for it right? we can read all the relevant posts and come to the conclusion that actually the use of LLMs as stated fucking sucks, and that we don’t fucking want it. we can read something and come to a different conclusion than you, believe it or not.