Issues: meta-llama/llama-stack
- #829 Conversion of RawTextItem to TextContentItem causes no tokens to be generated (opened Jan 19, 2025 by AidanFRyan)
- #824 meta-llama/Llama-3.2-3B-Instruct-QLORA_INT4_EO8 not found (opened Jan 19, 2025 by AidanFRyan)
- #820 Want to use client_tool calls, but code_interpreter is used instead (opened Jan 18, 2025 by aidando73)
- #804 llama stack build not working with prefix based environments (opened Jan 17, 2025 by aidando73)
- #784 Determining model reference when deploying Ollama model with Docker (opened Jan 16, 2025 by WhyPine)
- #768 Free up GPU memory after unregistering model in meta reference inference [enhancement] (opened Jan 15, 2025 by SLR722)
- #740 500 Internal Server Error during image inferencing because "Request URL is missing an 'http://' or 'https://' protocol" (opened Jan 10, 2025 by dawenxi-007)
- #726 Executing llama on Windows causes a ModuleNotFoundError for termios [os::windows] (opened Jan 5, 2025 by briancabbott)
- #724 Executing setup.py build on Windows 11 throws a UnicodeDecodeError [os::windows] (opened Jan 5, 2025 by briancabbott)
- #691 Permission denied error on running 'llama-stack-client providers list' [question] (opened Dec 28, 2024 by anant2614)
- #667 [RFC] Support multi modal retrieval on top of llama stack, inference provider side (opened Dec 20, 2024 by benjibc)