• SleeplessCityLights@programming.dev
    2 days ago

    I predict that the slopocalypse will hit in 2027, and all of the corporations that jumped headfirst into the still-filling swimming pool will hit bottom. Using an LLM to code will only make things worse. It’s a fucking entropy injector. You can’t continuously add entropy without hitting a point where the codebase just isn’t cohesive anymore, and LLMs get really bad when the context is too large.

    • A Sharky Anthro@fedia.io
      2 days ago

      I hope the slopocalypse does hit in 2027, because there is no way to escape technical debt unless you build software with smart people carefully maintaining it as they add features. LLMs could never do that, and they will cause the worst tech disasters. I cannot wait to see the aftermath of all these corporations fucking around and finally finding out how stupid they really are!

      • SleeplessCityLights@programming.dev
        2 days ago

        It already happened to a project at my work. Nobody understands enough of the code base, or can make sense of it, to be able to add features. It is the buggiest fucking thing ever, which makes LLM debugging an endless exercise in finding more bugs. It also means we can’t prompt an LLM effectively to make targeted changes. The only thing left is letting an agent fuck shit up worse by running it with a vague prompt. We don’t know what to do. It cost a lot to make, counting man-hours, and the traditional software development mentality hates throwing something away completely.