Something wrong when parsing the JSON data returned by Ollama model #2996
Comments
The current connector doesn't support streaming; the issue tracking that is #2484. Can you try changing `request_body` in the connector?
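The suggested body itself was not captured in this thread. As a hedged sketch, a non-streaming `request_body` for Ollama's `/api/generate` endpoint could look like the following (the model name and the `${parameters.prompt}` placeholder are illustrative, not the exact values suggested in the issue):

```json
{
  "model": "llama2",
  "prompt": "${parameters.prompt}",
  "stream": false
}
```

With `"stream": false`, Ollama returns a single JSON object instead of newline-delimited chunks, which is what a strict JSON parser expects.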
Hi @ylwu-amzn, I did what you suggested, but it failed. Below are the request I sent, the corresponding (error) response, and my connector configuration. Hope to hear from you soon. Thanks a lot.
Change request_body to
@jerry0li Can you confirm whether this works or not?
@ylwu-amzn Sorry for the late response. I'm afraid it doesn't work, and it will not work unless the Ollama format below is supported.
I hope that's clear. Glad to hear from you. Thanks!
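The exact snippet referenced above was not captured in this thread. For context, when streaming is enabled, Ollama's `/api/generate` endpoint returns one JSON object per line (NDJSON) rather than a single JSON document, roughly like this (field values illustrative):

```json
{"model": "llama2", "created_at": "2024-01-01T00:00:00Z", "response": "Hel", "done": false}
{"model": "llama2", "created_at": "2024-01-01T00:00:01Z", "response": "lo", "done": false}
{"model": "llama2", "created_at": "2024-01-01T00:00:02Z", "response": "", "done": true}
```

A body like this is not itself valid JSON, which would explain a `MalformedJsonException` from a parser that expects a single document.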
Description
I'd like to create a connector to call my self-hosted Ollama model, but it fails with a `MalformedJsonException`. I don't know what is happening under the hood.
To Reproduce
Steps to reproduce the behavior:
`http://10.0.221.10:11434` is my self-hosted endpoint for the Ollama service. It works, so don't worry about it! Eventually, the model deploys successfully: its status shows as "Responding" when I click the Machine Learning button in the sidebar.
Error msg:
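For reference, a minimal ML Commons HTTP connector pointing at such an endpoint might look like the sketch below (the name, model, and prompt placeholder are illustrative assumptions, not the exact configuration used in this issue):

```json
POST /_plugins/_ml/connectors/_create
{
  "name": "ollama-connector",
  "description": "Connector to a self-hosted Ollama service",
  "version": 1,
  "protocol": "http",
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://10.0.221.10:11434/api/generate",
      "headers": { "Content-Type": "application/json" },
      "request_body": "{ \"model\": \"llama2\", \"prompt\": \"${parameters.prompt}\", \"stream\": false }"
    }
  ]
}
```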
OpenSearch logs:
As we can see, OpenSearch received the response from my self-hosted Ollama model service; the problem is in parsing the JSON.
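To illustrate the parsing failure, the short Python sketch below (hypothetical data, not the actual logged response) shows why a streamed, newline-delimited body breaks a parser that expects one JSON document, and how per-line parsing recovers the chunks:

```python
import json

# With streaming enabled, Ollama returns newline-delimited JSON (NDJSON):
# each line is a complete JSON object, but the body as a whole is NOT
# valid JSON, so a single-document parse fails.
ndjson_body = (
    '{"response": "Hel", "done": false}\n'
    '{"response": "lo", "done": false}\n'
    '{"response": "", "done": true}\n'
)

def parse_strict(body: str):
    """Parse the body as one JSON document (what the connector appears to do)."""
    return json.loads(body)  # raises json.JSONDecodeError ("Extra data") on NDJSON

def parse_ndjson(body: str) -> str:
    """Parse the body line by line and stitch the response chunks together."""
    chunks = [json.loads(line) for line in body.splitlines() if line.strip()]
    return "".join(chunk["response"] for chunk in chunks)

try:
    parse_strict(ndjson_body)
except json.JSONDecodeError as exc:
    print("strict parse failed:", exc.msg)

print(parse_ndjson(ndjson_body))  # prints "Hello"
```

Disabling streaming in the request body (or teaching the connector to split on newlines) avoids the strict-parse failure.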
Thank you for reading this and helping a dizzy developer.