• northbound_goat@szmer.info · 2 years ago

    I don’t want to sound jaded, but I have a feeling that eventually someone will use it to run Electron apps in Flatpak on Linux compiled to WASM running on Chrome, all because portability is hard and Qt’s licensing is unpleasant.

    • wazowski@lemmy.ml · 2 years ago

      still hoping that rust will bring us one native gui framework to rule them all, one that will offload at least some of the burden from electron or whatever else js developers use 🤷‍♀️

  • Zerush@lemmy.ml · 2 years ago

    In a few years, our powerful super gaming PCs will seem like an abacus to us. The first personal quantum computer has been on the market since last year, for $5000 (2 qubits, not really useful yet), but remember how far things have come in 15-20 years with the current exponential advances. Today's cheap smartphones match high-end PCs from 15 years ago.

    • brombek@lemmy.ml · 2 years ago

      There were good reasons to think that silicon-based electronics would advance exponentially (basically, scaling down components makes them exponentially faster by default). Does quantum computing tech have similar physical properties? Perhaps silicon was special, and now we are at the end of it with nothing left to replace it that has this exponential property.
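
      Roughly, the "exponentially faster by default" part is Dennard scaling. A back-of-the-envelope sketch (the ~0.7× shrink per generation is a textbook assumption, not something from this thread):

      ```python
      # Dennard scaling, idealized: shrink feature size by k and you fit 1/k^2
      # more transistors per area at roughly constant power density, while
      # switching speed improves by about 1/k.
      k = 0.7  # classic ~0.7x linear shrink per process generation (assumption)

      density_gain = 1 / k**2  # ~2x transistors per area per generation
      frequency_gain = 1 / k   # ~1.4x clock per generation (idealized)

      print(f"per shrink: density x{density_gain:.2f}, clock x{frequency_gain:.2f}")
      print(f"after 10 generations: density x{density_gain**10:.0f}")
      ```

      That compounding is what made silicon exponential; the open question is whether qubit counts or error rates compound the same way.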

      • Zerush@lemmy.ml · 2 years ago

        Yes, scaling down silicon-based electronics brings speed gains, but only up to a limit, and components are now approaching the atomic scale. Because of this, other materials are being investigated, like synthetic diamond, which withstands much more heat and therefore allows higher processing speeds.

        Quantum computing offers much more speed because it isn't limited to 1 OR 0: it can process 1 AND 0 AND everything in between simultaneously (see the sketch below). Until now, the difficulties have been that it needs very low temperatures and is unstable under the slightest interference. But the technology is advancing: cooling by magnetic means makes it possible to dispense with nitrogen/helium cooling and allows its use in desktop PCs, although at the moment only with three qubits, suitable for research and educational use on complex calculations. The same thing happened with the first PCs.
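
        A minimal numpy sketch of that "1 AND 0 at once" claim, assuming an equal superposition (the numbers are illustrative, not from this comment): a qubit's state is a unit vector a|0⟩ + b|1⟩, and measuring it yields 0 or 1 with probabilities |a|² and |b|².

        ```python
        import numpy as np

        # A qubit's state is a unit vector a|0> + b|1>; measurement collapses it
        # to 0 with probability |a|^2 or to 1 with probability |b|^2.
        a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)  # equal superposition (assumption)
        state = np.array([a, b], dtype=complex)
        assert np.isclose(np.linalg.norm(state), 1.0)  # states are normalized

        rng = np.random.default_rng(0)
        samples = rng.choice([0, 1], size=1000, p=[abs(a) ** 2, abs(b) ** 2])
        print(f"measured 0: {np.mean(samples == 0):.1%}, 1: {np.mean(samples == 1):.1%}")
        ```

        The real speedups come from an n-qubit register holding 2ⁿ amplitudes at once, not from a single qubit on its own.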

  • obbeel@lemmy.ml · 2 years ago

    Lasers, graphene - those will be expensive logic gates. Why not use other kinds of light?

  • ree@lemmy.ml · 2 years ago

    I’m gonna go full luddite here, but I don’t see the point of having faster computers.

    • 0x00cl@lemmy.ml · 2 years ago

      Servers. There are going to be more people, more people with access to technology and the internet, and a lot of services are online.

      Not only that: look at how big data centers are and how much cooling they need (they need a LOT of water). This could make them more efficient.

      But to be fair, this article says very little about something that is just a proof of concept.

    • comfy@lemmy.ml · 2 years ago

      For personal computing, sure. That’s not full luddite; we’ve basically reached a point where most things a person does on their computer can be done well with a $500 laptop or phone.

      For servers, media rendering, hash cracking, prime searching, machine-learning training, video and image enhancement, medical simulations and other applications, we still love more power.

      • pingveno@lemmy.ml · 2 years ago

        And it’s not just about having more powerful hardware available. It can also mean producing less hardware in the first place. Or, in the case of a phone, maybe more and more people can just hook it up to a USB-C dock when they need the form factor of a laptop, but otherwise carry around only the phone.

      • ree@lemmy.ml · 2 years ago

        Most industrial ML applications are just fluff. I don’t need my bank, my social network, and my government to profile me or predict my behavior… And the entertainment industry can tell beautiful stories with or without computers.

        The only legit application imo is medical research, which has a positive outcome on people’s lives, but then I’m sure that if GAFAM repurposed their massive tracking infrastructure we’d be good for a while.

        • comfy@lemmy.ml · 2 years ago

          Those applications you listed are fluff, but there are a significant number of others (‘most’ or not, I don’t know) that aren’t: medical research, other engineering research, accessibility tools. I appreciate GAN content upscaling and de-noising for older art, but that’s a more personal case.

          I do think it’s unfortunate that we live in a system that encourages powerful tools to be absolutely wasted on fluff.

          • ree@lemmy.ml · 2 years ago

            Couldn’t have put it better than your last sentence.

            Have a nice day :)