Public education strategy #10

Open
kirbysayshi opened this issue Mar 25, 2019 · 7 comments

@kirbysayshi
Contributor

It's hard to understand what's going on here, and why it's cool. Demos only get us so far.

Could a multipart blog post series help?

  • Introduction to the problem: shareable audio experiences and time stretching
  • How to make a sound board app (hit a button, enqueue a score, hear something)
  • Switching between two beat synced audio tracks on user click
  • ???
@hydrosquall
Contributor

One topic I'm interested in is figuring out how to hook into the player's output for visualization purposes. A post along the lines of what MDN did for the general Web Audio API would be very helpful (assuming that there's a way to provide what this library is doing as a "source" to an AudioContext).

@kirbysayshi
Contributor Author

kirbysayshi commented Apr 3, 2019

You could do that today with the existing ScriptProcessorRenderer:

import { ScriptProcessorRenderer, SmartPlayer } from 'nf-player';

const ctx = new AudioContext();
const a = ctx.createAnalyser(); // note: the Web Audio API spells this "Analyser"

const r = new ScriptProcessorRenderer(ctx);
const p = new SmartPlayer(r);

// This is the script processor node, so you can just .connect it
r.processor.connect(a);

// And then do whatever you want with the analyzer!

This might be a cool demo to add to the playground if you'd like to try building it!
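To actually read data out of the analyser each frame, the usual Web Audio pattern is `getByteFrequencyData` inside a `requestAnimationFrame` loop. Here's a minimal sketch; the `averageLevel` helper is hypothetical (not part of nf-player), and the browser-only wiring assumes `a` is the `AnalyserNode` from the snippet above:

```javascript
// Pure helper: average of an analyser's byte frequency data (values 0–255).
// This is illustrative only; it is not part of nf-player.
function averageLevel(data) {
  if (data.length === 0) return 0;
  let sum = 0;
  for (const v of data) sum += v;
  return sum / data.length;
}

// Browser-only wiring (assumes `a` is the AnalyserNode created above):
// const bins = new Uint8Array(a.frequencyBinCount);
// function draw() {
//   a.getByteFrequencyData(bins);      // fills `bins` with the current spectrum
//   const level = averageLevel(bins);  // e.g. drive a meter or canvas bars
//   requestAnimationFrame(draw);
// }
// requestAnimationFrame(draw);
```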

@hydrosquall
Contributor

hydrosquall commented Apr 4, 2019

Ah, thanks for the code snippet and the hint! I'd like to give this a try.

@hydrosquall
Contributor

hydrosquall commented Apr 7, 2019

Hi @kirbysayshi, I tried to give this a spin by modifying the WebAudioAPI visualizer example inside a Codesandbox.

Something odd I noticed is that I'm no longer getting any sound from the sandbox. Audio seems to be working on my computer (I checked YouTube), and from the network inspector it looks like the files are being loaded without error. I'm using the latest Chrome on macOS, and the sandbox worked before the latest update.

Anyway, I'm not familiar enough with the ScriptProcessorRenderer yet to figure out why the analyser isn't picking up the new data: all the frequencies coming out of the FFT are just 0s. The good news is that the base player works, since I'm at least getting sound out of it. I suspect there might be something up with how I'm wiring the audio nodes together (maybe the order).

Here's my code (see visualizer-sample.js): source / demo

For comparison, here is the same general code without SmartPlayer, to rule out build tool / codesandbox related blips (I modified the base example from webaudioapi.com a bit to make it work with webpack): https://codesandbox.io/s/zx2y3y9k1l .

(Update- I wasn't connecting the renderer to the SmartPlayer at initialization time, adding that fixed it. Now that this hello world is working, I will revisit this another weekend, since it's currently leaking memory and will crash the browser if it stays open too long.)

@kirbysayshi
Contributor Author

@hydrosquall Glad to hear you got it working, basically! Audio is still working for me at the Playground. Is it failing for you in all demos on both CODE and JSON? One thing I often forget is to hit the "Eval/Load Code from Editor" button in the CODE demo.

@hydrosquall
Contributor

Ah, not hitting the "eval" button before hitting "play" was exactly the issue for me in the sandbox! I was confused because the time indicator was incrementing, but no sound was coming out. Thanks.

The memory issue is taken care of as well now, I moved some of the constructors around. Now that the basics are working, what do you think about next steps? I thought about adding this to the existing playground, but there isn't really a place to put the Canvas in the current setup. Another option could be to do an example with a React component.

@kirbysayshi
Contributor Author

> Ah, not hitting the "eval" button before hitting "play" was exactly the issue for me in the sandbox! I was confused because the time indicator was incrementing, but no sound was coming out. Thanks.

It's something I mean to fix about the playground at some point :D

If you'd like, a simplified version of your demo could be added as a new React component here: https://github.com/spotify/NFPlayerJS/tree/master/demo/src/components

And then it could be added as a new "tab" (button) in the enum, https://github.com/spotify/NFPlayerJS/blob/164eab336ad929a7a8deeb24d417b2d868eed6a2/demo/src/components/App.tsx#L46-L49

and then rendered here: https://github.com/spotify/NFPlayerJS/blob/164eab336ad929a7a8deeb24d417b2d868eed6a2/demo/src/components/App.tsx#L92-L97

And if you use the JSON panel as a starting point: https://github.com/spotify/NFPlayerJS/blob/master/demo/src/components/JSONEditor/JSONEditor.tsx

There's also an existing canvas component from the visualizer: https://github.com/spotify/NFPlayerJS/blob/master/demo/src/components/ScoreVisualizer/CanvasPowered.tsx

Try to keep it as simple as possible!
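If the React component route sounds appealing, the drawing logic can stay mostly independent of React: compute bar heights from the analyser's bins, then paint them onto the canvas. A sketch of one way to do that (`barHeights` is a hypothetical helper, not an NFPlayerJS API):

```javascript
// Hypothetical helper for a canvas visualizer: downsample the analyser's
// frequency bins into `barCount` bar heights scaled to `maxHeight` pixels.
// Not part of NFPlayerJS; just a sketch for the demo component.
function barHeights(bins, barCount, maxHeight) {
  const heights = [];
  const binsPerBar = Math.floor(bins.length / barCount);
  for (let i = 0; i < barCount; i++) {
    let sum = 0;
    for (let j = 0; j < binsPerBar; j++) sum += bins[i * binsPerBar + j];
    const avg = sum / binsPerBar; // 0..255
    heights.push(Math.round((avg / 255) * maxHeight));
  }
  return heights;
}

// Inside the component's draw loop (browser only):
// const bins = new Uint8Array(analyser.frequencyBinCount);
// analyser.getByteFrequencyData(bins);
// const heights = barHeights(bins, 32, canvas.height);
// heights.forEach((h, i) =>
//   ctx2d.fillRect(i * barWidth, canvas.height - h, barWidth - 1, h)
// );
```

Keeping the math in a pure function like this also makes it easy to unit-test without a browser.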
