
Is it possible to plug the llama.cpp web service into tabby-client or tabby-engine?

I've started llama.cpp as a web service as explained here: https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md, so I can send requests with a simple curl command:

    curl --request POST \
        --url http://localhost:8080/completion \
        --header "Content-Type: application/json" \
        --data '{"prompt": "Building a website can be done in 10 simple steps:","n_predict": 128}'
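The server answers with JSON along these lines (abridged and illustrative; the exact fields vary by llama.cpp version, but the generated text comes back in `content`):

    {
      "content": " 1. Register a domain name ...",
      "tokens_predicted": 128,
      "stop": true
    }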

clement-igonet

Asked on Nov 22, 2023

No, it is not possible to plug the llama.cpp web service directly into tabby-client or tabby-engine. Tabby heavily customizes its decoding steps and is not designed to rely on an OpenAI-like completion API. There is a discussion about this topic on the Tabby GitHub repository: https://github.com/TabbyML/tabby/issues/795
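The supported way to run a local model is to let Tabby manage inference itself rather than pointing it at an external `/completion` endpoint. A minimal sketch, assuming Tabby's standard `serve` command and an illustrative model name:

    # Tabby downloads the model and runs it with its own built-in
    # inference engine; no external llama.cpp server is involved.
    tabby serve --model TabbyML/StarCoder-1B --device cuda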

Nov 22, 2023