Could not load the model because Error: Cannot find adapter that matches the request #35

q050cr opened this issue Jan 2, 2025 · 0 comments

q050cr commented Jan 2, 2025

I encountered an issue while trying to load a model in Secret Llama. The UI renders correctly, but the models fail to initialize, possibly because they are not linked to the UI. Instead, I get the following error message:

Could not load the model because Error: Cannot find adapter that matches the request

I couldn't find any instructions in the repo on how to link LLMs to the UI.


Steps to Reproduce

  1. Cloned the repo
  2. Ran yarn and then yarn build-and-preview
  3. Opened the UI in the browser

Expected Behavior

The model should load and be ready for inference after selection.


Observed Behavior

The application displays the error:

Could not load the model because Error: Cannot find adapter that matches the request
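
In case it helps with debugging: my guess (not verified against the Secret Llama source) is that this error is raised when the browser cannot provide a WebGPU adapter, i.e. when navigator.gpu.requestAdapter() resolves to null, rather than being a problem with the model files themselves. A minimal check I can run in the DevTools console, sketched in TypeScript:

  // Sketch only: assumes the failure is in WebGPU adapter discovery.
  async function checkWebGPU(): Promise<void> {
    // navigator.gpu is only present in WebGPU-capable browsers; cast because
    // the default DOM typings may not include it.
    const gpu = (navigator as any).gpu;
    if (!gpu) {
      console.log("WebGPU is not exposed (navigator.gpu is undefined).");
      return;
    }
    // requestAdapter() resolves to null when no suitable adapter is found,
    // which would match the "Cannot find adapter" error shown in the UI.
    const adapter = await gpu.requestAdapter();
    console.log(adapter === null
      ? "navigator.gpu exists, but requestAdapter() returned null."
      : "Got a WebGPU adapter.");
  }

  checkWebGPU();

If this check also fails on my machine, that would point to a browser/WebGPU configuration issue rather than to the repo itself.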

Environment

  • Browser: Google Chrome Version 131.0.6778.205 (Official Build) (arm64)
  • Operating System: macOS Sequoia 15.1.1 (24B91)

Additional Information

  • I checked the issues page and documentation but couldn't find a solution.
  • Is there a specific configuration step for linking the downloaded LLMs to the UI?

Thank you for your help, and for the cool project!
