I encountered an issue while trying to load a model in Secret Llama. The UI works correctly, but the models fail to initialize, possibly because they are not linked to the UI. Instead, I get the following error message:
Could not load the model because Error: Cannot find adapter that matches the request
I couldn't find instructions in the repo on how to link LLMs to the UI.
Steps to Reproduce
1. Cloned the repo
2. Ran yarn and then yarn build-and-preview
3. Opened the UI and selected a model
Expected Behavior
The model should load and be ready for inference after selection.
Observed Behavior
The application displays the error:
Could not load the model because Error: Cannot find adapter that matches the request
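As far as I can tell, this message matches what gets thrown when navigator.gpu.requestAdapter() returns null, so it may be a WebGPU availability problem rather than a missing model link. Here is a small check I put together myself (not from the repo) that can be pasted into the devtools console to see whether an adapter is exposed, assuming the error does originate from that call:

```js
// Sketch: check whether the browser exposes a WebGPU adapter.
// Assumption: the "Cannot find adapter that matches the request" error
// is raised when navigator.gpu.requestAdapter() returns null.
async function checkWebGpuAdapter() {
  if (!("gpu" in navigator)) {
    console.log("WebGPU is not available in this browser/context.");
    return;
  }
  const adapter = await navigator.gpu.requestAdapter();
  if (adapter === null) {
    console.log("navigator.gpu is present, but no adapter was returned.");
  } else {
    console.log("WebGPU adapter found:", adapter);
  }
}

checkWebGpuAdapter();
```

If that logs no adapter, the issue would be with WebGPU/GPU access in the browser rather than with how the models are configured.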
Environment
Browser: Google Chrome Version 131.0.6778.205 (Official Build) (arm64)
Operating System: macOS Sequoia 15.1.1 (24B91)
Additional Information
I checked the issues page and documentation but couldn't find a solution.
Is there a specific configuration step for linking the downloaded LLMs to the UI?
Thank you for your help, and for the cool project!