I’ve started noticing articles and YouTube videos touting the benefits of branchless programming, making it sound like this is a hot new technique (or maybe a hot old technique) that everyone should be using. But it seems like it’s only really applicable to data processing applications (as opposed to general programming), and there are very few times in my career when I’ve needed to use, much less optimize, data processing code. And when I do, I use someone else’s library.

How often does branchless programming actually matter in the day to day life of an average developer?

  • LaggyKar@programming.dev · 1 year ago
    Or are GPUs particularly bad at branches?

    Yes. GPUs don’t have per-core branching; they run groups of dozens of cores in lockstep on the same instruction stream. So if some cores in a group should take the if branch and others the else branch, every core in the group executes both branches, and each core masks out the results of the branch it shouldn’t have run. I also don’t think they have the advanced branch prediction that CPUs have.

    https://en.wikipedia.org/wiki/Single_instruction,_multiple_threads
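
    Roughly what that masking amounts to, sketched in plain C rather than actual GPU code (the lane count, input values, and the example branch are all made up for illustration): every lane computes both sides of the if, and a per-lane mask selects the result it keeps. It’s the same select-instead-of-jump trick that hand-written branchless CPU code uses.

        #include <stdio.h>

        #define LANES 8  /* pretend warp width; real warps/wavefronts are 32 or 64 lanes */

        int main(void) {
            /* made-up input: each array slot stands for one GPU lane's private value */
            int x[LANES] = {3, -1, 7, -5, 2, -9, 4, -6};
            int out[LANES];

            for (int lane = 0; lane < LANES; lane++) {
                /* which lanes would have taken `if (x >= 0)` */
                int take_if = x[lane] >= 0;      /* 1 or 0 */
                int mask    = -take_if;          /* all-ones or all-zeros */

                /* every lane computes BOTH sides, like diverged GPU threads do */
                int then_result = x[lane] * 2;   /* the `if` side   */
                int else_result = -x[lane];      /* the `else` side */

                /* branchless select: keep only the side this lane's mask allows */
                out[lane] = (then_result & mask) | (else_result & ~mask);
            }

            for (int lane = 0; lane < LANES; lane++)
                printf("%d ", out[lane]);        /* prints: 6 1 14 5 4 9 8 6 */
            printf("\n");
            return 0;
        }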

    • Ethan@programming.dev (OP) · 1 year ago

      Makes sense. The most programming I’ve ever done for a GPU was a few simple shaders for a toy project.