
useWhisper runs on client, how to protect OPENAI_API_TOKEN? #55

Open
PrimeObjects opened this issue Jan 24, 2024 · 5 comments

Comments

@PrimeObjects

No description provided.

@jemstelos

There's an example of how to do this in the readme: you can use the onTranscribe callback and call your own server, where your API token is used.

const App = () => {
  /**
   * you have more control like this
   * do whatever you want with the recorded speech
   * send it to your own custom server
   * and return the response back to useWhisper
   */
  const onTranscribe = async (blob: Blob) => {
    // encode the recording as a base64 data URL and post it to your own server route
    const base64 = await new Promise<string>((resolve) => {
      const reader = new FileReader()
      reader.onloadend = () => resolve(reader.result as string)
      reader.readAsDataURL(blob)
    })
    const res = await fetch('/api/whisper', { method: 'POST', body: JSON.stringify({ file: base64 }) })
    const { text } = await res.json()
    return { blob, text } // useWhisper uses this as the transcript
  }
  const { transcript } = useWhisper({ onTranscribe })
  return <p>{transcript.text}</p>
}
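
For the server half, here is a minimal sketch of a Next.js App Router route handler that pairs with the snippet above. The /api/whisper path, the file name, and the use of the official openai Node SDK (v4) are illustrative assumptions, not part of useWhisper itself; the point is that OPENAI_API_TOKEN is only read from the environment on the server.

// app/api/whisper/route.ts (hypothetical route matching the client snippet above)
import { NextResponse } from 'next/server'
import OpenAI, { toFile } from 'openai'

// The token is read from the server environment and is never shipped to the browser.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_TOKEN })

export async function POST(request: Request) {
  const { file } = await request.json()                     // base64 data URL sent by onTranscribe
  const buffer = Buffer.from(file.split(',')[1], 'base64')  // strip the "data:...;base64," prefix and decode

  const transcription = await openai.audio.transcriptions.create({
    file: await toFile(buffer, 'speech.webm'),               // Whisper needs a named file, not a base64 string
    model: 'whisper-1',
  })

  return NextResponse.json({ text: transcription.text })
}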

@PatricioDieck

Maybe I'm super stupid, but I'm also having this same issue. The onTranscribe example converts the blob into a base64 string, which isn't one of the file types the Whisper API accepts. What am I missing here? I'm trying to connect onTranscribe to my API endpoint in my Next.js 14 app so that the key doesn't get exposed to the client.

@marcsalesgrid

The base64 encoding is just there to transport the data. In your backend you need to decode the base64 data and turn it into a file, which you then send to the Whisper API endpoint.

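For reference, if you'd rather not pull in the openai SDK, that decode-and-forward step can be done with plain fetch on Node 18+. This is only a sketch; the helper name and file name are assumptions, and the env var matches the one from the issue title.

// Hypothetical helper for a Node 18+ server route: decode the base64 data URL
// produced by FileReader.readAsDataURL and forward it to the Whisper endpoint.
async function transcribeDataUrl(dataUrl: string): Promise<string> {
  const buffer = Buffer.from(dataUrl.split(',')[1], 'base64') // raw audio bytes

  // Whisper expects multipart/form-data with a named file, not a base64 string.
  const form = new FormData()
  form.append('file', new Blob([buffer], { type: 'audio/webm' }), 'speech.webm')
  form.append('model', 'whisper-1')

  const res = await fetch('https://api.openai.com/v1/audio/transcriptions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_TOKEN}` },
    body: form,
  })
  const { text } = await res.json()
  return text
}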

@ashwin-maurya

@PrimeObjects Hey, is it working for you? Mine is returning {blob: undefined, text: undefined} as the response. Do you have any idea why?

@daveycodez

This doesn't work with streaming, though? With streaming enabled it's still calling /transcription on OpenAI directly; it only calls my API when I press stop...
