Feel like we’ve got a lot of tech-savvy people here, so it seems like a good place to ask. Basically, as a dumb guy that reads the news, it seems like everyone that lost their mind (and savings) on crypto just pivoted to AI. On top of that, you’ve got all these people invested in AI companies running around with flashlights under their chins like “bro, this is so scary how good we made this thing”. Seems like bullshit.

I’ve seen people generating bits of programming with it, which seems useful, but idk man. Coming from CNC, I don’t think I’d just send it with some ChatGPT code. Is it all hype? Is there something actually useful under there?

  • unknowing8343@discuss.tchncs.de · 1 year ago

    I am not saying it works exactly like a human inside of the black box. I’m just saying it works. It learns and then creates thoughts. And it works.

    You talk about how human cognition is more complex and squishy, but nobody really knows how it truly works inside.

    All I see is the same kind of black box. A kid trying many, many times to stand up, or to say “papa”, until it somehow works, and now the pathway is set up in the brain.

    Obviously ChatGPT is just dealing with text. But does that make it NOT intelligent? I think it makes it very text-intelligent. Just add together all the AI pieces we are building and you’ve got yourself a general AI that will do anything we do.

    Yeah, maybe it does not work like our brain. But is the human brain’s structure the only possible structure for intelligence? I don’t think so.

    • Alex@lemmy.ml · 1 year ago

      If you consider the amount of text an LLM has to consume to replicate something approaching human-like language, you have to appreciate that there is something else going on with our cognition. LLMs give responses that make statistical sense, but humans can actually understand why one arrangement of words might not make sense over another.
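      To make “statistical sense” concrete: a toy bigram model (a deliberately simplified sketch, nothing like a production LLM) picks each next word purely from co-occurrence counts, with no notion of what the words mean:

```python
import random
from collections import Counter, defaultdict

# Toy corpus; a real LLM trains on trillions of tokens, not a few sentences.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count which word follows which: pure co-occurrence statistics.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev`."""
    options = bigrams[prev]
    words = list(options)
    weights = list(options.values())
    return random.choices(words, weights=weights)[0]

# Generate a "plausible" sequence with zero understanding of cats or mats.
word, out = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    out.append(word)
print(" ".join(out))
```

      The output tends to look grammatical because the statistics of the corpus are grammatical, not because the model understands anything; the debate is whether scaling this idea up changes that in kind or only in degree.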

      • unknowing8343@discuss.tchncs.de · 1 year ago

        Yes, it’s inefficient… and OpenAI and Google are losing exactly because of that.

        There are already open-source models out there that rival ChatGPT and that you can train on your 10-year-old laptop in a day.

        And this is just the beginning.

        Also… maybe we should check how many words of exposure a kid gets throughout their life before they can develop arguments like ChatGPT’s… because the thing is, ChatGPT knows way more about many things than any human being ever will. Like, easily thousands of times more.

        • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

          And this is just the beginning.

          Absolutely agreed, so long as protections are put in place to defang it as a weapon against labor (if few have the leisure time or income to support tech development, I see great danger of stagnation). LLMs do clearly seem to be an important part of advancing towards real AI.