
What does the switch to llama.cpp backend mean in Tabby?

Alessandro Versace

Asked on Nov 04, 2023

The switch to the llama.cpp backend in Tabby's 0.5.0 release is an internal change aimed at improving performance; it replaces the previous ctranslate2-based CPU/GPU backend. Note, however, that 0.5.0 is still a pre-release and may have bugs, particularly in its CPU support, so staying on version 0.4.0 is recommended if you run Tabby on CPU.
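One way to stay on 0.4.0, assuming you deploy Tabby with the official Docker image (whose tags track release versions), is to pin the image tag explicitly instead of pulling `latest`. This is a sketch, not an official recipe; the model name is just an example:

```shell
# Pin the Tabby image to 0.4.0 so a CPU host keeps the
# pre-llama.cpp (ctranslate2) backend. No --gpus flag: CPU only.
docker run -it \
  -p 8080:8080 \
  -v "$HOME/.tabby:/data" \
  tabbyml/tabby:0.4.0 \
  serve --model TabbyML/StarCoder-1B
```

Pinning the tag also keeps later upgrades deliberate: you move to 0.5.x only once its CPU support is stable for your setup.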
