What’s the point of “gaming” PCs? I just realized there’s no difference between an ultra high-end PC and a “gaming” PC, so I don’t see the need for a distinction.

>Bbbububuututt RGB

For god’s sake just add RGB and other high-end components to a normal PC or just swap the case. BOOM! Gaming PC.

What exactly is the definition of gaming PC again?

> A gaming computer, also known as a gaming PC, is a specialized personal computer designed for playing video games at high standards.

We have generic GPUs with performance that is on par with, if not far better than, RTX in terms of gaming, we have generic RAM sticks with higher capacity, we even have non-gaming hardware that outperforms said “gaming” hardware. Hell, I would fucking argue you can make a sick monster build for less money.

But no. Thanks to capitalism pestering everything we love, literally millions of gamers flock to anything labeled “gamer” without really looking into whether or not that “gaming” hardware they’re about to buy is actually any good.

Now gamers end up wasting thousands, if not tens of thousands, of dollars just to find something that looks “gamer-y”.

I mean if you want to make a custom PC that looks like a “gamer” one (with RGB and shit) just have the RGB on these only:

  • Keyboard
  • Mouse
  • Case fans
  • Light strips in the case;

but at the end of the day it’s fucking pointless to call it a “gaming PC” since it’s just a super-powerful computer with high-end hardware.

End of rant. Period.

  • 🏳️‍⚧️ Elara ☭
    link
    11
    1 year ago

    I’d say you’re mostly right, but there are unfortunately no “generic” GPUs. We have Nvidia’s GPUs (RTX), AMD’s GPUs, and now we also have Intel’s Arc GPUs. Personally, I’d avoid Nvidia and go with AMD or Intel, because I run Linux and using an Nvidia card on Linux is hell, not to mention all the other anti-consumer practices Nvidia performs (the others do as well, but Nvidia is worse, especially when it comes to the open source community). Intel and AMD’s Linux GPU drivers are also completely open source and included in most kernel builds out of the box, so that’s a big plus for me.

    Currently, I’m using an Arc A770. Its DirectX performance is horrible, but I run Linux so I don’t have to worry about that at all, and it handles anything I throw at it perfectly. Before this, I had an RTX 2080 which was pulled out of an HP prebuilt, so I was able to get it for pretty cheap. Using that thing on Linux was horrible. The drivers were unable to do even basic things like detecting the proper resolution on my ultrawide monitor. The Arc outperforms it, while being much nicer to use and not requiring me to use a proprietary driver.

    Unfortunately, Nvidia has cornered the machine learning market by intentionally giving away GPUs to researchers, so their GPUs are generally the only practical option for machine learning, but that’s slowly changing with AMD’s GPUs getting better at it, and Intel entering the market with a surprisingly good option as well.

    Personally, I’m hoping China can eventually make a GPU with performance on par with Nvidia/AMD/Intel GPUs. That would be great.

    • @Prologue7642
      link
      7
      1 year ago

      I’m really looking forward to eventual Chinese CPUs and GPUs, especially if they’re RISC-V based. It would be really nice if we could get some reasonable options other than AMD/Intel/Nvidia.

      Btw, does Arc really outperform your RTX 2080? I was under the impression that Arc’s performance is not great.

      • 🏳️‍⚧️ Elara ☭
        link
        6
        1 year ago

        Btw, does Arc really outperform your RTX 2080? I was under the impression that Arc’s performance is not great.

        It outperforms or matches the RTX 2080 (non-Ti, non-Super) in everything I’ve tested so far, but I run Linux so the results will definitely differ on Windows. Also, I don’t use ray tracing. Obviously, the RTX will win at that.

        • @Prologue7642
          link
          3
          1 year ago

          Hmm, looking at benchmarks, it really does look pretty good, especially in compute workloads, but the gaming drivers seem far from mature. Did you by any chance test machine learning workloads? This seems like a pretty compelling alternative to Nvidia GPUs. It’s currently really annoying to try to do any machine learning work on my 6900 XT, and I want to avoid Nvidia as much as possible (I’m also on Linux).

          https://www.phoronix.com/review/intel-arc-march23 https://www.phoronix.com/review/arc-graphics-compute-q1

          • 🏳️‍⚧️ Elara ☭
            link
            4
            edit-2
            1 year ago

            the gaming drivers seem to be far from mature.

            I have played some games, and the ones I tested weren’t bad; they mostly matched the 2080, maybe with slightly less consistent frame times. However, the games I tested are probably not a very good benchmark for overall gaming performance. I tested Minecraft with the most demanding shaders I could find (easily 200–700 fps depending on where I was looking), SpaceEngine with the highest graphics settings it would allow me to set (consistently >100 fps), and KSP 2, which is so incredibly unoptimized that it runs about as badly even on 3090s (around 30 fps most of the time).

            Did you by any chance test machine learning workloads?

            I tried, but it’s annoying, mostly because none of the things I tried supported anything other than CUDA, and I gave up after spending several hours trying to modify stuff to make it work. I’ll probably try it again in a little while.

          • @silent_clash
            link
            1
            9 months ago

            Unfortunately, Nvidia has way better software support for almost all machine learning and AI applications. It can vary, though, if you look up the specific software you want to use.

            • @Prologue7642
              link
              1
              9 months ago

              Unfortunately yes. Thankfully, at least for inference, there are other options, especially with things like ONNXRuntime.
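
              (For anyone curious, this is roughly what that looks like — a minimal sketch assuming you have the `onnxruntime` package installed. The provider names are the ones ONNX Runtime documents; which ones are actually available depends on the wheel you install:)

              ```python
              import onnxruntime as ort

              # Ask the installed ONNX Runtime build which execution providers it ships.
              # The CPU-only wheel reports just "CPUExecutionProvider"; GPU builds add
              # e.g. "CUDAExecutionProvider" (Nvidia) or "ROCMExecutionProvider" (AMD).
              providers = ort.get_available_providers()
              print(providers)

              # Every build includes the CPU fallback provider.
              assert "CPUExecutionProvider" in providers
              ```

              When you create a session you pass a preference-ordered list, e.g. `ort.InferenceSession("model.onnx", providers=["ROCMExecutionProvider", "CPUExecutionProvider"])` (hypothetical model file), so the same exported model falls back to CPU if the GPU provider isn’t there.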

    • @StugStig
      link
      6
      edit-2
      1 year ago

      China’s Biren BR100 beats Nvidia’s A100 and matches the H100 in ML workloads, but it really isn’t competitive in non-AI GPGPU applications.

      China’s Moore Threads S3000 is a more conventional GPGPU with theoretical performance between the A3000 and A4000. It also somehow supports CUDA. The S80, the same chip oriented towards gaming, doesn’t perform all that well due to immature drivers and a general lack of support for most games.

      Still, they have to compete with Nvidia, since all the US sanctions did was cap NVLink (the multi-card interconnect that succeeded SLI) bandwidth to 400 GB/s on the A800 and H800, which are identical to the A100 and H100 in all other respects.

  • @Prologue7642
    link
    8
    edit-2
    1 year ago

    We have generic GPUs with performance that is on par with, if not far better than, RTX in terms of gaming

    Not really sure what you mean. RTX is mainly just Nvidia’s high-end cards. If you want the highest performance from Nvidia, you don’t really have a choice and need to use RTX cards, or you can go with AMD, but the difference is mainly branding (although I would still always take AMD). I would say the opposite: gaming GPUs are much better value than the professional ones. If you’re working on machine learning or anything else that requires heavy computation, buying RTX cards is much cheaper than buying specialized hardware for it (Tesla, Quadro, etc.). That deserves much more criticism, because in a professional setting you are often forced to buy the professional cards due to Nvidia’s license agreements.

    we have generic RAM sticks with higher capacity

    Capacity is very rarely the issue; frequency is much more important, especially for gaming. Would it be nice if we could buy high-frequency, high-capacity ECC RAM for consumer PCs? Sure, but there is no such product.

    we even have non-gaming hardware that outperforms said “gaming” hardware

    Not really sure what you mean; other than the CPU, GPU, and maybe RAM, there isn’t much that determines gaming performance.

    Not that I’m disagreeing with your overall point, but I do disagree with your specific points. The main issue here is that almost everyone who actually builds their own PC is doing it for gaming, or maybe gaming plus some work.

    Btw, what do you mean by “high-end software”, Windows?

    • JoeMarx 193 OP
      link
      0
      1 year ago

      I never said shit about software.

      • @Prologue7642
        link
        5
        1 year ago

        since it’s just a super-powerful computer with high-end software.

        Last sentence, maybe you meant hardware?