
Can I compile tabby with ROCm support on Windows?

Bizangel is trying to compile tabby with ROCm support on Windows, but the build fails to detect the ROCm options properly and attempts to compile CUDA code instead. Meng Zhang notes that there is no testing environment for Windows + ROCm and suggests first compiling llama.cpp with ROCm on Windows to verify that the toolchain works.


Bizangel

Asked on Mar 01, 2024

  • Compiling tabby with ROCm support on Windows is not officially supported
  • There is no testing environment for Windows + ROCm
  • Try compiling llama.cpp with ROCm on Windows first to verify that the ROCm toolchain works (see the sketch at the end of this thread)
Mar 01, 2024
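
As a rough way to follow that suggestion, the sketch below shows a ROCm (HIP) build of llama.cpp on Windows. It is not an official recipe: the CMake flags (LLAMA_HIPBLAS, AMDGPU_TARGETS), the gfx1100 target, and the model path are assumptions based on llama.cpp's HIP build instructions from around that time and will likely need adjusting for your GPU and llama.cpp version (newer releases renamed some flags). If this build fails or falls back to CPU, the problem is in the Windows + ROCm toolchain itself rather than in tabby.

```
:: Hedged sketch, run from a llama.cpp checkout with the AMD HIP SDK installed.
:: Flag names and the gfx1100 target are assumptions; adjust for your setup.
set PATH=%HIP_PATH%\bin;%PATH%
cmake -S . -B build -G Ninja ^
  -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ ^
  -DCMAKE_BUILD_TYPE=Release ^
  -DLLAMA_HIPBLAS=ON ^
  -DAMDGPU_TARGETS=gfx1100
cmake --build build --config Release

:: Smoke test: offload layers to the GPU and check the logs mention ROCm/HIP.
:: The model path is a placeholder.
build\bin\main.exe -m path\to\model.gguf -ngl 32 -p "Hello"
```

If this llama.cpp build works, the same HIP SDK and compiler setup is what the tabby build would need to pick up; if it does not, fixing the llama.cpp build is the prerequisite step.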