I am planning to deploy Tabby on GCP and I need guidance on the infrastructure requirements.
Mitesh Bulsara
Asked on Nov 18, 2023
Tabby runs on open-source LLMs; general guidance on choosing a model is available at https://tabby.tabbyml.com/docs/models/. Since version 0.5.0, Tabby's inference runs on llama.cpp, so any model in GGUF format can be used with Tabby. Curated, benchmarked models are listed at https://github.com/TabbyML/registry-tabby.
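As a starting point on GCP, a common setup is a GPU-equipped Compute Engine VM (for example, one with an NVIDIA T4) with Docker and the NVIDIA Container Toolkit installed, then launching Tabby's official container. A minimal sketch, assuming that setup; the model name, port, and data path below are illustrative choices, not requirements:

```shell
# Run the Tabby server container on a GCP GPU VM.
# Assumes Docker and the NVIDIA Container Toolkit are installed on the VM.
docker run -it --gpus all \
  -p 8080:8080 \                 # expose the Tabby API/UI on port 8080
  -v $HOME/.tabby:/data \        # persist downloaded models and config on the VM disk
  tabbyml/tabby serve \
  --model StarCoder-1B \         # example model from the Tabby registry; pick per the model guide
  --device cuda                  # use the VM's GPU for inference
```

Remember to open the chosen port in the VPC firewall rules if you need to reach the server from outside the VM. Larger models from the registry will need a GPU with correspondingly more VRAM.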