This repository has been archived by the owner on Mar 2, 2023. It is now read-only.
Just a heads up: this package has been folded into the official huggingface.js repo and is already several updates ahead, specifically with fixes around handling binary responses. Consider downloading the latest version from there.
This bug may already be resolved in that version. I will be archiving this repo and deprecating the huggingface package in the very near future.
I'm keeping this issue open for the time being and will investigate / reach a conclusion on this before archiving this repo.
Update:
I have tried the @huggingface/inference package, but the response was still `Unexpected token E in JSON at position 0` (the actual response body is `Expected request with Content-Type: application/json`).
Upon investigation, this only occurs when I use `google/flan-t5-xxl` as the model; `gpt2` works perfectly fine.
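The `Unexpected token E in JSON at position 0` error is what you get when `JSON.parse` is called on a plain-text error body whose first character is `E` (here, `Expected request with Content-Type: application/json`). A minimal sketch of a workaround: read the body as text first and only parse it when the response actually looks like JSON. (`parseInferenceResponse` is a hypothetical helper name, not part of either package's API.)

```javascript
// Hedged sketch: instead of calling res.json() directly, read the body as
// text and check status + content-type first. The Inference API sometimes
// returns a plain-text error body, which would otherwise surface as the
// confusing "Unexpected token E in JSON at position 0" parse error.
async function parseInferenceResponse(res) {
  const text = await res.text(); // read the body once, as text
  const type = res.headers.get("content-type") || "";
  if (!res.ok || !type.includes("application/json")) {
    // Surface the real server message instead of a JSON parse error.
    throw new Error(`Inference API error (HTTP ${res.status}): ${text}`);
  }
  return JSON.parse(text);
}
```

With this, a failing call against `google/flan-t5-xxl` would throw an error containing the server's actual message rather than a JSON syntax error.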
I tried
First I got
Then I tried logging the `res.text()` and found: I think we should add that to `request`.