I use ROCm for inference, both text generation via llama.cpp/LMStudio and image generation via ComfyUI.
Works pretty much perfectly on a 6900 XT. Very fast and easy to set up.
I had issues with some libraries only supporting CUDA when I tried to train, but that was almost six months ago, so things have probably improved in that area as well.
I used to have a second partition with Windows for such cases, but over time I just stopped bothering with those games.
Now I just refund anything that doesn't work and move on to the next game in my to-play list.
I still have a Windows VM for some applications and for doing firmware updates, but I never bothered to set it up for playing games.