
Why is the Tabby extension failing to connect to the open-source LLM backend configured with Ollama in VS Code?

I'm attempting to set up an open-source copilot in VS Code using the Tabby extension, with an open-source LLM as the backend. On my laptop, which runs Windows 10, I've downloaded Ollama and pulled the 'starcoder:3b' model. It is serving at the http://localhost:11434 endpoint, and I receive responses when calling it from code or the command prompt. However, after configuring this same endpoint in the Tabby extension in VS Code (by updating the API: Endpoint field), Tabby fails to connect. Could you help me identify the issue?
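For reference, a minimal sketch of the kind of check described above, confirming that Ollama itself answers at its default endpoint; the model tag is the one mentioned in the question and the prompt is a placeholder:

```python
import json
import urllib.request

# Default Ollama endpoint from the question; model tag and prompt are placeholders.
OLLAMA_URL = "http://localhost:11434/api/generate"
payload = {
    "model": "starcoder:3b",
    "prompt": "def fib(n):",
    "stream": False,
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    # A successful call returns JSON with the generated text in "response".
    print(json.loads(resp.read())["response"])
```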


Jayaprakash Venugopal

Asked on Mar 04, 2024

  • The Tabby extension does not use Ollama to serve the model; it relies on Tabby's own serving stack, so the extension's endpoint must point at a running Tabby server rather than at Ollama (see the sketch below the list).
  • Refer to the Tabby documentation for installation details: Tabby Installation Documentation
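A minimal sketch of what a working setup would look like, assuming a Tabby server started per the Tabby installation docs (for example, from the tabbyml/tabby Docker image) and listening on its default port 8080; the health route name is an assumption and may differ across Tabby versions, so check your server's API docs if it 404s:

```python
import urllib.request

# Assumption: a local Tabby server (Tabby's own serving stack, not Ollama) is
# running and listening on the default port 8080. The VS Code extension's
# "API: Endpoint" setting should point here, not at Ollama's port 11434.
TABBY_ENDPOINT = "http://localhost:8080"

# Hypothetical health probe; consult the Tabby server's own API documentation
# for the exact routes exposed by your version.
with urllib.request.urlopen(f"{TABBY_ENDPOINT}/v1/health") as resp:
    print(resp.status, resp.read().decode("utf-8"))
```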
Mar 04, 2024