Issues: InternLM/lmdeploy
[Bug] The program on the main graphics card gets stuck when running with multiple graphics cards (#3071) · opened Jan 22, 2025 by LKELN · 3 tasks done
[Bug] turbomind has not been built with fp32 support (#3065) · label: awaiting response · opened Jan 21, 2025 by syorami · 2 of 3 tasks
[Bug] LMDeploy v0.6.4-cu12 fails to start and run inference on two 4090 GPUs, while v0.6.0-cu12 starts and runs inference normally (#3062) · opened Jan 21, 2025 by simonwei97 · 3 tasks done
[Bug] If the multi-turn conversation history contains tool_calls, the tool_calls arguments in the response get wrapped in an extra layer of string (#3058) · opened Jan 21, 2025 by ExenVitor · 3 tasks done
[Bug] MLP.inter_size (Qwen-14B-Chat) read from config is not applied (#3054) · opened Jan 20, 2025 by coolhok · 2 of 3 tasks
[Bug] gemma2 model reports an error when doing evaluation (#3048) · opened Jan 17, 2025 by zhulinJulia24 · 3 tasks
Is there a convenient way to perform model conversion? (#3043) · labels: awaiting response, Stale · opened Jan 16, 2025 by Lanbai-eleven
[Bug] History tokens are not correct with /v1/chat/interactive (#3032) · opened Jan 15, 2025 by zhulinJulia24 · 3 tasks