What are the infrastructure requirements for deploying Tabby on GCP?

I am planning to deploy Tabby on GCP and I need guidance on the infrastructure requirements.


Mitesh Bulsara

Asked on Nov 18, 2023

Tabby runs on open-source LLMs; general guidance on choosing a model is available in the Tabby documentation. Since version 0.5.0, Tabby's inference runs on llama.cpp, so any model in GGUF format can be used with Tabby. A list of curated models that have been benchmarked is also available in the documentation.
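As a rough sketch of what a GCP deployment can look like: provision a GPU VM (for example an `n1-standard-*` instance with an NVIDIA T4 and the NVIDIA drivers installed), then run Tabby's official Docker image. The instance type, model name, and port below are illustrative assumptions, not requirements:

```shell
# Assumes a GCP VM with an NVIDIA GPU, drivers, Docker, and the
# NVIDIA Container Toolkit already installed.
# Model (TabbyML/StarCoder-1B) and port (8080) are example choices.
docker run -d \
  --gpus all \
  -p 8080:8080 \
  -v $HOME/.tabby:/data \
  tabbyml/tabby serve \
  --model TabbyML/StarCoder-1B \
  --device cuda
```

Remember to open the chosen port (8080 here) in the VM's GCP firewall rules so your IDE clients can reach the server. For CPU-only instances, drop `--gpus all` and `--device cuda`, though inference will be noticeably slower.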

Answered on Nov 20, 2023 (edited)