Update (March 9, 2024):
Removed all the hacks!
@DJ4ddi had a better way to handle it, using a library built for exactly this case, and I have incorporated their changes!
(Old) Warning: I don't think this belongs in main, but maybe in another branch...
This is an awesome lightweight ollama client to play with. Thank you for creating it!
Sadly, my "ollama serve" was sending fragmented JSON responses, and the stream reader logic in api.ts using the /api/generate endpoint was getting confused, so nothing was loading in the chats.
Well, since there is a bug bounty on a completely different approach using the OpenAI API, this is just a hacky stopgap until then for anyone running into a similar issue on their local ollama server. Thought I would share it.
< bio-hazard suit needed below >
'chunk' was sometimes filled with responses of the form '{...}{...}...', which was breaking the JSON parse. So I added an inner loop that splits these and collapses them into a single response. The fix probably misses other cases, but it handles the problems I am seeing.
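Roughly, the workaround looks like this (a minimal sketch, not the exact diff; 'collapseChunk' is a hypothetical name, and the split heuristic assumes the '}{' boundary never occurs inside a JSON string value, which holds for ollama's compact single-line output but is still a heuristic):

```ts
// Split a chunk like '{"response":"a"}{"response":"b"}' into parseable
// pieces and collapse the partial responses into one.
function collapseChunk(chunk: string): { response: string; done: boolean } {
  // Break on '}{' boundaries (optionally separated by whitespace/newlines).
  const parts = chunk.split(/(?<=})\s*(?={)/);
  let response = '';
  let done = false;
  for (const part of parts) {
    const msg = JSON.parse(part);
    response += msg.response ?? '';
    done = done || msg.done === true;
  }
  return { response, done };
}
```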
Also, for the final 'done': true message with the context field: the reader was getting fragmented JSON because of that message's size, so another fix reads to the end of the stream before parsing, since this is the last message anyway.
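The idea for that last message is just to stop parsing per-chunk and accumulate until the stream closes (again a sketch with hypothetical names; ollama's final /api/generate message carries the 'context' array, which makes it much larger than the token deltas before it):

```ts
// Once the final message starts arriving fragmented, accumulate the
// remaining bytes and parse the completed JSON only after the reader
// reports done.
async function readFinalMessage(
  reader: ReadableStreamDefaultReader<Uint8Array>,
  firstFragment: string,
): Promise<{ done: boolean; context?: number[] }> {
  const decoder = new TextDecoder();
  let buffer = firstFragment;
  while (true) {
    const { value, done } = await reader.read();
    if (done) break; // stream closed: the final JSON should now be complete
    buffer += decoder.decode(value, { stream: true });
  }
  return JSON.parse(buffer); // safe only because this is the last message
}
```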