Describe the bug

Loading meta-llama/Llama-3.2-1B-Instruct in interactive LoRA mode with an adapter ordering file fails with `Error: Adapter files are empty.` right after the model weight filenames are resolved. Command used:
```
.\mistralrs-server -i lora -o .\llama_lora_ordering.json -m meta-llama/Llama-3.2-1B-Instruct -a jjaegii/Llama-3.2-1B-Instruct-LoRA-ko-kubefix
```

Output:

```
2024-12-07T00:23:28.343366Z INFO mistralrs_server: avx: true, neon: false, simd128: false, f16c: true
2024-12-07T00:23:28.343565Z INFO mistralrs_server: Sampling method: penalties -> temperature -> topk -> topp -> minp -> multinomial
2024-12-07T00:23:28.343692Z INFO mistralrs_server: Model kind is: lora
2024-12-07T00:23:28.346944Z INFO mistralrs_core::pipeline::normal: Loading `tokenizer.json` at `meta-llama/Llama-3.2-1B-Instruct`
2024-12-07T00:23:28.347410Z INFO mistralrs_core::pipeline::normal: Loading `config.json` at `meta-llama/Llama-3.2-1B-Instruct`
2024-12-07T00:23:30.032688Z INFO mistralrs_core::pipeline::paths: Found model weight filenames ["model.safetensors"]
Error: Adapter files are empty. Perhaps the ordering file adapters does not match the actual adapters?
```
llama_lora_ordering.json:
adapter_config.json of the LoRA adapter:
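Since the error message hints that the adapter names listed in the ordering file do not match the adapter actually passed via `-a`, a quick sanity check is to compare the two. The sketch below is a hypothetical diagnostic, not part of mistral.rs; it assumes the ordering file lists adapter names under an `order` key (the layout used in the mistral.rs LoRA examples) — adjust the key if your file differs.

```python
import json

def adapters_match(ordering_path: str, adapter_names: set[str]) -> bool:
    """Return True if the ordering file's `order` list names exactly the
    adapters we intend to load (assumed layout: {"order": [...], ...})."""
    with open(ordering_path) as f:
        ordering = json.load(f)
    return set(ordering.get("order", [])) == adapter_names

# Example: the ordering file must name exactly the adapter given with -a,
# e.g. adapters_match("llama_lora_ordering.json", {"<adapter name>"}).
```

If this returns False, regenerating the ordering file for the adapter being loaded may resolve the mismatch.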
Latest commit or version
v0.3.2