Do you think AI is, or could become, conscious?

I think AI might one day emulate consciousness to a high level of accuracy, but that wouldn’t mean it would actually be conscious.

This article mentions a Google engineer who “argued that AI chatbots could feel things and potentially suffer”. But surely in order to “feel things” you would need a nervous system right? When you feel pain from touching something very hot, it’s your nerves that are sending those pain signals to your brain… right?

  • MagicShel@lemmy.zip · 6 days ago
    1. Let’s say we do an algorithm on paper. Can it be conscious? Why is it any different if it’s done on silicon rather than paper?

    2. Because they are capable of fiction. We write stories about sentient AI and those inform responses to our queries.

    I get playing devil’s advocate and it can be useful to contemplate a different perspective. If you genuinely think math can be conscious I guess that’s a fair point, but that would be such a gulf in belief for us to bridge in conversation that I don’t think either of us would profit from exploring that.

    • peanuts4life@lemmy.blahaj.zone · edited 6 days ago

      I don’t expect current AI are really configured in such a way that they suffer or exhibit more than rudimentary self-awareness. But it’d be very unfortunate to be a sentient, conscious AI in the near future and be denied fundamental rights because your thinking is done “on silicon” rather than on meat.

      • MagicShel@lemmy.zip · 6 days ago

        I said on paper. They are just algorithms. When silicon can emulate meat, it’s probably time to reevaluate that.

        • amelia@feddit.org · edited 5 days ago

          You talk like you know what the requirements for consciousness are. How do you know? As far as I know that’s an unsolved philosophical and scientific problem. We don’t even know what consciousness really is in the first place. It could just be an illusion.

          • MagicShel@lemmy.zip · edited 5 days ago

            I have a set of attributes that I associate with consciousness. We can disagree in part, but if your definition is so broad as to include math formulas, there isn’t even common ground for us to discuss it.

            If you want to say contemplation/awareness of self isn’t part of it, then I guess I’m not very precious about it the way I would be over a human-like perception of self; fine, people can debate what ethical obligations we have to an ant-like consciousness when we can achieve even that, but we aren’t there yet. LLMs are nothing but a process of transforming input to output. I think consciousness requires rather more than that, or we wind up with erosion being considered a candidate for consciousness.

            So I’m not the authority, but if we don’t adhere to some reasonable layman’s definition it quickly gets into weird wankery that I don’t see any value in exploring.