I'm attempting to set up an open-source copilot in VS Code using the Tabby extension, with an open-source LLM as the backend. On my laptop, which runs Windows 10, I've successfully installed Ollama and pulled the 'starcoder:3b' model. It's running at the http://localhost:11434 endpoint, and I receive responses when calling it from code or the command prompt. However, after configuring this same endpoint in the Tabby extension in VS Code (by updating the API: Endpoint field), Tabby fails to connect. Could you help me identify the issue?
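For reference, this is roughly how my VS Code configuration looks (a sketch; the `tabby.api.endpoint` key is my reading of the "API: Endpoint" field in the extension settings, and the endpoint value is the Ollama address I tested):

```json
{
  // Tabby extension setting corresponding to the "API: Endpoint" field,
  // pointed at the local Ollama server that serves starcoder:3b
  "tabby.api.endpoint": "http://localhost:11434"
}
```

With this in place, the extension still reports that it cannot connect, even though the same URL responds to direct requests.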
Jayaprakash Venugopal
Asked on Mar 04, 2024