What is the correct way to enable both --chat-model and --model in TabbyML to resolve issues with code completion feature?
Andy Wong encountered an issue in TabbyML where the code completion feature was unavailable. Zhiming Ma suggested enabling both the --chat-model and --model options, and Andy confirmed that doing so resolved the issue. Andy also noticed a slower response rate in the AI chat screen compared to GitHub Copilot. How can both --chat-model and --model be enabled in TabbyML to restore the code completion feature?
Andy Wong
Asked on Jun 22, 2024
- To enable both --chat-model and --model in TabbyML, use a start command such as:
  tabby serve --device metal --model TabbyML/StarCoder-1B --chat-model Mistral-7B
- Confirm that both the --chat-model and --model options are included in the start command to ensure that the code completion feature is available.
- Enabling both options lets TabbyML serve both the AI chat functionality and the code completion feature, resolving issues caused by a missing model.
- Note that enabling both options may affect the response rate of the AI chat screen, as Andy Wong observed in comparison to GitHub Copilot, since the server is loading and serving two models.
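As a sketch of the full workflow, the commands below start the server with both models and then check that it is responding. The health-check URL assumes the default port 8080 and that your Tabby version exposes the /v1/health endpoint; adjust both for your setup.

```shell
# Start Tabby with both a completion model (--model) and a chat model
# (--chat-model), using the model names from the answer above.
tabby serve --device metal --model TabbyML/StarCoder-1B --chat-model Mistral-7B

# In a second terminal, verify the server is up.
# (Default port 8080 and the /v1/health path are assumptions; check your
# Tabby version's docs if this returns an error.)
curl http://localhost:8080/v1/health
```

If the health check succeeds but completions are still missing in your editor, re-check that the --model flag (not only --chat-model) was present when the server started.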