who needs free software or getting rid of planned obsolescence?

  • RedClouds · 41 points · 6 months ago

    Without giving anything specific away, I am a software developer and a consultant, and mostly work on web stuff.

    I’ll try to keep this short, but in general, yes. Basically, computers keep getting faster, which allows software developers to use higher-level libraries, which are actually less efficient, so your average piece of software takes more processing power and RAM than it did back in the day.

    As well, because of those high-level libraries, programming is a lot easier than it used to be. Unfortunately, that means that we just hire cheaper developers that aren’t as skilled, and they have a harder time tracking down and fixing bugs. Which is doubly worse because those higher-level libraries are black boxes, and you can’t always fix things that arise inside of them easily.

    But software development companies have basically figured out that shitty software doesn’t really hurt their bottom line in the end. For the most part, people will use it if it’s a name brand piece of software, like Google or Apple or Microsoft. They don’t need to build high quality software because it’s basically going to be used by default. The money will come in if you corner a market or if you build something unique, or contract with another business. It doesn’t actually have to be high quality.

    As well, websites make more money the more ads you put on them. So it doesn’t matter how efficiently you build it; it’s going to be slow. And it doesn’t matter how slow it is, because you’re going to make more money the more ads and tracking you have. All you need is good search engine optimization and you will get traffic by default. SEO is easier said than done, but the point is nobody really focuses on performance when it’s more profitable to focus on search engines.

    • CannotSleep420 · 7 points · 6 months ago

      > As well, because of those high-level libraries, programming is a lot easier than it used to be. Unfortunately, that means that we just hire cheaper developers that aren’t as skilled, and they have a harder time tracking down and fixing bugs. Which is doubly worse because those higher-level libraries are black boxes, and you can’t always fix things that arise inside of them easily.

      The Luke Smith/Mental Outlaw type chuds call these developers “soydevs”.

      • RedClouds · 21 points · 6 months ago

        Yeah, I’m not one to use insulting terms; it’s more a natural consequence of an industry lowering the barrier to entry.

        But there really is something to be said for those old applications that were built rock solid, even if they only came out with a new version once every four years.

        There’s nothing wrong with more frequent releases of a smaller feature set. I’d be happy getting high-quality application updates every month or so.

        But as with all things, the analysis comes down to capitalism just not incentivizing the right things. Quarterly profit drives lots of features delivered poorly instead of a few good features delivered occasionally. Of course the developers get blamed for this when really they’re just a product of a broken system. We invent insulting terms for them instead of going after the real problem because, of course, we don’t have an understanding of materialism in the West.

        Oh well.

    • sooper_dooper_roofer [none/use name]@hexbear.net · 6 points · 6 months ago

      > Basically, computers keep getting faster, which allows software developers to use higher-level libraries, which are actually less efficient,

      Could you elaborate on this?

      So this means that an app that does basically the same thing today as it did in 2005 is going to be way more resource-intensive because of this, right?

      • RedClouds · 6 points · 6 months ago

        Yeah, this was a quick and dirty thought, but effectively that’s exactly what I mean. An application built from scratch today using modern high-level programming libraries will, generally speaking, take more RAM and CPU than an app written in 2005 that does the same thing.

        Of course, people who still write C or C++, or who choose Rust, Go, or one of the other low-level languages, or even Java without major frameworks, can still achieve the kind of performance an app written in 2005 could. But people coming out of college and/or code schools nowadays just reach for a big fat framework like Spring, or use a high-level language like JavaScript, Python, or Ruby with big frameworks, and the application will by default use more resources.

        Now, the application might still be fast enough. I’m not even saying that an application written in Python will be slow, but I will say that an application written in Python will by default use about 10x more CPU and RAM than a similar application written in Rust. Maybe the application only uses 10 megabytes of RAM where the equivalent efficient application would use 1 megabyte; both of those are very efficient, very fast, and would be just fine. But when the difference is between 10 gigabytes of RAM and 1 gigabyte of RAM, yeah, at that point you’re pretty much just taking advantage of RAM being cheap.
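
        To make the overhead a bit more concrete, here’s a rough toy sketch in Python (my own illustration of per-object overhead, not a real benchmark, and not something tied to any particular framework; the exact numbers depend on your interpreter and version). A plain Python list of a million integers carries pointer and object overhead that a packed, C-style array of the same values doesn’t:

        ```python
        import sys
        from array import array

        N = 1_000_000

        # A regular Python list stores pointers to full int objects.
        as_list = list(range(N))
        list_bytes = sys.getsizeof(as_list) + sum(sys.getsizeof(n) for n in as_list)

        # array('q') packs the same values as raw 64-bit integers, much like a C array.
        as_packed = array('q', range(N))
        packed_bytes = sys.getsizeof(as_packed)

        print(f"list of int objects: {list_bytes / 1e6:.1f} MB")
        print(f"packed 64-bit array: {packed_bytes / 1e6:.1f} MB")
        ```

        On a typical CPython build the list version comes out several times larger for the exact same data, and that’s before any framework code is involved at all.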

        And it’s not even necessarily a bad thing that we do this. There’s just a balance to be had. It’s okay to write in a higher-level language if it means you can get some stuff done faster. But major applications nowadays choose to ship an entire browser as the base layer of their application, just because it’s more convenient to write cross-platform code that way. That’s too much, and there’s already a lot of work going towards fixing that problem as well. We’re just sort of seeing the worst of it right now.

      • redtea · 3 points · 6 months ago

        I’m not a computer person, but my understanding is that this is what bricked my MacBook a long time ago. It worked perfectly fine. Not the fastest at seven years old, but it was fine. Then along came a Google Chrome update which uncapped the RAM usage. Suddenly 8 GB of RAM wasn’t enough to do anything. Nothing else worked if the browser was open and I needed to multitask. Chrome was the only browser compatible with work software. Had to get another machine about a week later (not a Mac, that time).

    • comradecalzone · 3 points · 6 months ago

      > Which is doubly worse because those higher-level libraries are black boxes, and you can’t always fix things that arise inside of them easily.

      If by “higher level” you mean something like Java libraries, I’d say the opposite is true: even if you don’t have the source for a Java class, it’s trivial to decompile it and get something immediately readable. Can’t say the same for something like a DLL originally written in C++.

      • RedClouds · 2 points · 6 months ago

        More high-level than that; think really deeply embedded JavaScript frameworks. In that situation, even Java is comparatively low-level. Although a lot of people just rely on Spring and Spring Boot and don’t understand how they work.