• jballs@sh.itjust.works
      2 days ago

      My wife works at a library. People constantly come in asking to use the library fax machine, because Google’s AI says they have one.

      They don’t have one. Their website says they don’t have one. But LLMs have determined it’s plausible for libraries to have a fax machine, so Google tells people that they have one.

      You’d be surprised at the number of people who can’t accept that people working at the library know more about the library than Google.

      • BozeKnoflook@lemmy.world
        2 days ago

        I’ve had this experience myself. I’m an American living in the Netherlands, and sometimes I just don’t know the name of the thing I need or where to buy one. LLM bots are fine for the translation part, but they make wild assumptions, like telling me I can buy a kitchen strainer at the hardware store, or food spices at a place called Kruidvat. The name roughly translates to “spice barrel,” but the store is actually most like a CVS without the pharmacy, and it doesn’t sell any food besides some candy and chips.

        It’s hilarious how quickly these bots can swing from super useful to actively harmful to trust.