Bizangel is trying to compile tabby with ROCm support on Windows but is running into errors: the build does not detect the ROCm option correctly and instead attempts to compile CUDA code. Meng Zhang notes the lack of a testing environment for Windows + ROCm and suggests first compiling llama.cpp with ROCm on Windows to verify that the toolchain works.
Asked by Bizangel on Mar 01, 2024
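
As a quick sanity check (not part of the original discussion), one might first confirm that the ROCm/HIP toolchain itself works on the Windows machine before attempting the llama.cpp or tabby build. A minimal sketch in C++ using the HIP runtime API, assuming the ROCm/HIP SDK is installed and `hipcc` is on the PATH:

```cpp
// hip_check.cpp -- minimal sanity check that the ROCm/HIP toolchain can
// compile code and see a GPU on this machine.
// Build (assumption: ROCm/HIP SDK installed): hipcc hip_check.cpp -o hip_check
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    hipError_t err = hipGetDeviceCount(&count);
    if (err != hipSuccess) {
        std::printf("hipGetDeviceCount failed: %s\n", hipGetErrorString(err));
        return 1;
    }
    std::printf("HIP devices found: %d\n", count);
    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop;
        if (hipGetDeviceProperties(&prop, i) == hipSuccess) {
            // Print the device name so the user can confirm the right GPU is detected.
            std::printf("  device %d: %s\n", i, prop.name);
        }
    }
    return 0;
}
```

If this compiles and lists the expected GPU, the ROCm installation itself is likely fine, and the problem is more plausibly in how the tabby/llama.cpp build scripts detect the ROCm option on Windows.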