How do I integrate llama.cpp with Intel oneAPI and resolve a link error?

I am trying to compile and integrate llama.cpp with Intel oneAPI to accelerate generation speed, but I got a link error. I have already added the necessary link against the mkl_intel library and switched to the Intel compiler. What could be causing the link error, and how can I resolve it?


Hung Le

Asked on Feb 08, 2024

The link error "archive has no index; run ranlib to add one" usually occurs when a static archive library lacks a symbol index, so the linker cannot resolve symbols from it. To resolve the issue, run the ranlib command on the library file; in this case, run ranlib on libllama.a to add the index. Here's an example:

ranlib tabby/target/release/build/llama-cpp-bindings-6ba569c9c2194ae5/out/build/libllama.a

After running ranlib, try compiling and integrating llama.cpp again to see if the link error is resolved.

Feb 08, 2024