Is there a way to use the `codellama/CodeLlama-34b-hf` model for example?


Peter Ahlers

Asked on Nov 06, 2023

Yes, it is possible to use the `codellama/CodeLlama-34b-hf` model. However, for a code-completion use case the model may be too large to meet typical latency requirements. If you still want to use it, you can refer to the model-loading file in the repository to learn how to load the model from a local path. Additionally, you can acquire the model weights from the Hugging Face Hub.
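As a rough sketch of loading the model from a local directory with the `transformers` library (the path `/models/CodeLlama-34b-hf` is a placeholder, and the dtype/device settings are assumptions you should adjust to your hardware):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "codellama/CodeLlama-34b-hf"


def load_local_model(local_path: str):
    """Load tokenizer and model from a local snapshot directory.

    Assumes the weights were already downloaded, e.g. via
    `huggingface_hub.snapshot_download(MODEL_ID)`.
    """
    tokenizer = AutoTokenizer.from_pretrained(local_path)
    model = AutoModelForCausalLM.from_pretrained(
        local_path,
        torch_dtype=torch.float16,  # half precision to reduce memory; 34B still needs ~70 GB
        device_map="auto",          # spread layers across available GPUs/CPU
    )
    return tokenizer, model


if __name__ == "__main__":
    # Hypothetical local path -- replace with where you stored the snapshot.
    tokenizer, model = load_local_model("/models/CodeLlama-34b-hf")
    inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 34B model in float16 requires on the order of 70 GB of accelerator memory, which is why a smaller CodeLlama variant is usually preferred for interactive completion.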
