
  • Rhaedas@fedia.io
    3 days ago

    AGI might use LLM tech in its process, but LLMs by themselves aren’t going to become aware. What happened is that LLM tech became a gold mine: some who were doing AGI research jumped on it instead, and others followed. There is certainly still AGI research going on somewhere, but it’s buried by the race to… something. The biggest problem I see, beyond the need for profit guiding all of this, is that what they’re building has become so complex that they no longer fully understand it; they just keep tacking things on to reach some higher level without knowing why it works (or why it will break).

    And while LLMs aren’t AGI, they still have the problem of misalignment, even without self-awareness. We saw early on how models would use misdirection to reach a goal, and today’s models are more sophisticated. Maybe it’s not their own goal but a misunderstood one, and they’ll say and do anything to get to it.

    Good thing we’re not putting them in control of important things, or full access to systems, right? Right?

    • trxxruraxvr@lemmy.world
      3 days ago

      Research into AGI has always been the domain of universities, not of companies trying to attract investment or turn a profit. It’s still going on, but you’ll only hear about it when there’s a new development that some company tries to monetize.