• dohpaz42@lemmy.world
    2 days ago

    If a human does not know an answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?

    • ramirezmike@programming.dev
      2 days ago

      That’s a lie: they knowingly made something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”

    • JuxtaposedJaguar@lemmy.ml
      2 days ago

      If you train a parrot to say “I can do calculus!” and then you ask it if it can do calculus, it’ll say “I can do calculus!”. It can’t actually do calculus, so would you say the parrot is lying?

      • 5too@lemmy.world
        2 days ago

        This is what I’ve been calling it — not as a pejorative, just as a description. It has no concept of truth or not-truth; it just tells good-sounding stories. It’s just bullshitting. It’s a bullshit engine.