• laenurd@lemmy.lemist.de · 9 months ago

    “[removed]” who bought Nvidia here.

    I know it’s 4chan banter and I generally agree with anon’s points, but here goes:

    • ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff.
    • I have yet to hear from anyone with an 8 GB card who maxes out that memory on current-gen games at 1080p.
    • Apart from frame generation, you DO get DLSS 3 features on 3000-series cards.
    • PrivateNoob@sopuli.xyz · 9 months ago

      “Based” who bought AMD here.

      ROCm is still in its infancy. ROCm literally isn’t supported for my 6700 XT, so I had to go back to Google Colab to work on my AI thesis project.

      • gaiussabinus@lemmy.world · 9 months ago

        ROCm support makes me angry, but NVIDIA fumbled their drivers too. There is no good option, so pick your poison. I run ROCm right now with a workaround on my 6900 XT to get the card detected, and I have also gone from 10 it/s down to 4 or even 2 with updates. Shit sucks.
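The commenter doesn’t say which workaround they used; a commonly cited one for RDNA2 cards that ROCm doesn’t officially list (this is an assumption on my part, not something stated above) is overriding the reported GPU ISA so the HSA runtime treats the card as a supported gfx1030 part:

```shell
# Assumed workaround: make ROCm's HSA runtime treat an RDNA2 card
# (e.g. 6700 XT / 6900 XT) as the officially supported gfx1030 ISA.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# Any ROCm workload launched from this shell inherits the override.
echo "$HSA_OVERRIDE_GFX_VERSION"
```

This only changes what the runtime reports; it works in practice because RDNA2 consumer cards share the gfx103x ISA with officially supported parts.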

    • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · edited · 9 months ago

      “ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff”

      Same for me: I had to buy an Alienware laptop with an NVIDIA GPU during my PhD for some GPGPU coding, since CUDA was pretty much the only choice back then and OpenCL was a joke in terms of performance and wasn’t getting much love from GPU manufacturers. But right now I know for sure I won’t ever buy an NVIDIA GPU again. ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it’s also future-proof, since you’re almost writing CUDA code: if you ever switch back to an NVIDIA GPU, you’ll mostly just have to replace “hip” with “cuda” in your code, plus some magic constants (warp length in particular).
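To illustrate the “replace hip with cuda” point, here is a minimal HIP sketch (mine, not the commenter’s code; it assumes a working ROCm or HIP toolchain and is not meant as a complete program, since the buffers are never initialized or read back):

```cpp
// Minimal HIP SAXPY sketch. The HIP API mirrors CUDA almost one-to-one,
// so porting to NVIDIA is largely s/hip/cuda/ on the runtime calls,
// plus hardware constants: warp size is 64 on most AMD GPUs vs 32 on NVIDIA.
#include <hip/hip_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // identical in CUDA
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    hipMalloc(&x, n * sizeof(float));   // CUDA: cudaMalloc
    hipMalloc(&y, n * sizeof(float));
    hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0,
                       n, 2.0f, x, y); // CUDA: saxpy<<<blocks, threads>>>(...)
    hipDeviceSynchronize();             // CUDA: cudaDeviceSynchronize
    hipFree(x);                         // CUDA: cudaFree
    hipFree(y);
}
```

The warp-size caveat matters for anything that uses warp-level primitives or hardcodes 32: query `warpSize` at runtime rather than assuming it.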

  • CheeseNoodle@lemmy.world · 9 months ago

    “Not get DLSS 3.0 features on your 30 series”
    But DLSS 3.0 features do work on the 30 series, and on the 20 series. The only thing locked out is frame generation.

  • boletus@sh.itjust.works · 9 months ago

    Buy whatever card you need for your use case. Both are fine.

    For me, as a gamer, I think DLSS is good shit, and nothing really beats it right now. I also like using RTX in single-player games; I only expect 60–90 fps from games anyway.

    I’m a game developer, and I benefit from using an NVIDIA card because I get greater access to current standard APIs and graphics features, hardware acceleration for light baking, the option to use tensor cores for learning how to write shit for them, and generally better compatibility with dev tools.

    Nvidia cards also tend to keep their value more, at least down under.