useWhisper runs on the client, how to protect OPENAI_API_TOKEN? #55
Comments
There's an example of how to do this in the README: you can use the onTranscribe method and call your own server, where your API token is used.
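For reference, here's a minimal sketch of that pattern. It assumes the hook is imported from @chengsokdara/use-whisper, that onTranscribe receives the recorded Blob and should resolve to a { blob, text } object, and that /api/whisper is a hypothetical route on your own server (not part of the library):

```tsx
import { useWhisper } from '@chengsokdara/use-whisper'

const App = () => {
  // Runs in the browser: encode the recorded audio and hand it to our own
  // backend instead of calling OpenAI directly with an exposed key.
  const onTranscribe = async (blob: Blob) => {
    // base64-encode the Blob so it can travel as JSON
    const base64 = await new Promise<string>((resolve, reject) => {
      const reader = new FileReader()
      reader.onloadend = () => resolve(reader.result as string)
      reader.onerror = reject
      reader.readAsDataURL(blob)
    })

    // hypothetical route on your own server; the OpenAI key never leaves it
    const response = await fetch('/api/whisper', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ file: base64 }),
    })
    const { text } = await response.json()

    // the hook expects the original blob plus the transcribed text back
    return { blob, text }
  }

  const { recording, transcript, startRecording, stopRecording } = useWhisper({
    onTranscribe, // transcription is delegated to the server above
  })

  return (
    <div>
      <button onClick={() => startRecording()}>Start</button>
      <button onClick={() => stopRecording()}>Stop</button>
      <p>{recording ? 'Recording…' : transcript.text}</p>
    </div>
  )
}

export default App
```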
Maybe I'm super stupid, but I'm also having this same issue. The onTranscribe function turns the blob into a base64 string, which is not one of the file types the Whisper API accepts. What am I missing here? I'm trying to connect onTranscribe to my API endpoint in my Next.js 14 app so that the key doesn't get exposed to the client.
The base64 encoding is just to transport the data. In your backend you need to decode the base64 data and turn it back into a file, which you then send to the Whisper API endpoint.
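A minimal sketch of that server side, assuming a Next.js 14 App Router route handler at app/api/whisper/route.ts, that the client posts { file: <base64 data URL> } as in the sketch above, and that your key lives in a server-side OPENAI_API_TOKEN environment variable (adjust the audio type/extension to whatever your recorder actually produces):

```ts
import { NextResponse } from 'next/server'

export async function POST(req: Request) {
  const { file } = await req.json()

  // The client sent a data URL ("data:audio/...;base64,...."); strip the
  // prefix and decode the remainder back into raw bytes.
  const base64Data = String(file).split(',').pop() ?? ''
  const audioBuffer = Buffer.from(base64Data, 'base64')

  // Rebuild multipart/form-data for the Whisper API. The filename extension
  // must match an accepted format (webm, mp3, wav, m4a, ...).
  const formData = new FormData()
  formData.append(
    'file',
    new Blob([audioBuffer], { type: 'audio/webm' }),
    'audio.webm'
  )
  formData.append('model', 'whisper-1')

  const response = await fetch(
    'https://api.openai.com/v1/audio/transcriptions',
    {
      method: 'POST',
      headers: { Authorization: `Bearer ${process.env.OPENAI_API_TOKEN}` },
      body: formData,
    }
  )

  const { text } = await response.json()
  return NextResponse.json({ text })
}
```

The { text } response shape matches what the client-side sketch above reads out of the fetch call; rename the fields however you like, as long as both sides agree.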
@PrimeObjects Hey, is it working for you? Mine is giving { blob: undefined, text: undefined } as the response, do you have any idea about this?
Doesn't work with streaming, though? With streaming enabled it's still calling /transcription via OpenAI, and it only calls my API when I press stop...