• Boddhisatva@lemmy.world · 17 days ago

    How many lawyers need to screw themselves over by using LLMs to write legal briefs before the others realize that doing so just might be a bad idea?

    I mean, come on, people. There is no such thing as actual artificial “intelligence.” There are programs, like LLMs, that try to mimic intelligence, but they are not actually intelligent. These models are trained on data from all over the internet with no vetting for accuracy. When one of these things searches for legal cases to cite, it is just as likely to cite a fictional case from some story as it is to cite a real one.

    • JcbAzPx@lemmy.world · 17 days ago

      It’s not like it’s looking anything up, either. It’s just putting words together that sound right to us. It could hallucinate a citation that never existed even as a fictional case, let alone a real one.

    • dhork@lemmy.world · 17 days ago

      At this point, everyone should understand that every single thing a public AI “writes” needs to be vetted by a human, particularly in the legal field. Lawyers who don’t understand this should no longer be lawyers.

      (On the other hand, I bet all the good law firms are maintaining their own private AI, where they feed it the relevant case histories directly, and specifically instruct it to provide citations to published works and not make shit up on its own. Then they validate it all, anyway, because their professional reputation depends on it).

    • bthest@lemmy.world · edited · 16 days ago

      The fact that so many lawyers are pulling this shit should have people terrified about how many AI-generated documents are making it into the record without being noticed.

      It’s probably only a matter of time before one of these non-existent cases results in a decision that causes serious harm.