Contact Information
[email protected]
MaxKB Version
v1.9.1
Problem Description
Local models deployed with xinference, such as deepseek and qwen2, can be connected successfully in MaxKB's model settings.
However, the local model local-glm-4v-9b, also deployed with xinference, cannot be connected in MaxKB's model settings.
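For reference, here is a minimal sketch of querying the same model directly through xinference's OpenAI-compatible endpoint, bypassing MaxKB. The base URL, port 9997, and the api_key placeholder are assumptions based on a default xinference setup; if this call works, the failure is likely on the MaxKB integration side rather than in xinference itself.

```python
# Minimal sketch: call the xinference-served model directly, without MaxKB.
# Assumptions: xinference runs locally on its default port 9997 and exposes
# its OpenAI-compatible API; the api_key value is only a placeholder.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:9997/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="local-glm-4v-9b",
    messages=[{"role": "user", "content": "Describe this test message."}],
)
print(resp.choices[0].message.content)
```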
Steps to Reproduce
The steps are as described above.
The expected correct result
MaxKB should be able to connect normally to the local model local-glm-4v-9b deployed with xinference.
Related log output
Additional Information
No response