
How to sample the point cloud and choose the bound? #3

Open
maobenz opened this issue Jul 26, 2022 · 1 comment

Comments

@maobenz

maobenz commented Jul 26, 2022

Hello,
Thanks for sharing the code. I see that the preprocessed data includes point clouds with 2048 colored points. How did you choose these points? Did you just randomly sample 2048 points from the original point cloud?
I also see in the code that near and far are set to 2 and 6. Why did you choose these values?
Another question: how long does it take to train the whole model?
Thanks!

@Ideefixze
Collaborator

Hi,

Sorry for the delayed response.

Yes, randomly choosing 2048 points from the original point cloud (or another 3D shape) should be fine as data. Remember that you also need images and camera poses to train the model.
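As a sketch, random subsampling of a colored point cloud could look like this (a minimal example with illustrative names, not code from this repo):

```python
import numpy as np

def sample_point_cloud(points, colors, n_samples=2048, seed=0):
    """Randomly pick n_samples points (and their colors) without replacement."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), size=n_samples, replace=False)
    return points[idx], colors[idx]

# Toy cloud: 10000 points with XYZ positions and RGB colors.
pts = np.random.rand(10000, 3)
cols = np.random.rand(10000, 3)
sub_pts, sub_cols = sample_point_cloud(pts, cols)
print(sub_pts.shape)  # (2048, 3)
```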

The near and far arguments are for NeRF rendering. They describe the minimum and maximum distances for point sampling along each ray. We set those values so that our objects fit within this range. For example, 'near' doesn't need to be a very small value, because the object sits some distance away from the camera.
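For context, NeRF-style rendering typically draws sample depths between near and far with stratified sampling: the range is split into bins and one uniform sample is drawn per bin. A minimal sketch (assuming the near=2, far=6 values mentioned above; not code from this repo):

```python
import numpy as np

def sample_along_ray(near=2.0, far=6.0, n_samples=64, seed=0):
    """Stratified sampling of depths t in [near, far]:
    one uniform draw inside each of n_samples equal-width bins."""
    rng = np.random.default_rng(seed)
    bins = np.linspace(near, far, n_samples + 1)
    lower, upper = bins[:-1], bins[1:]
    t = lower + (upper - lower) * rng.random(n_samples)
    return t

t_vals = sample_along_ray()
# 3D sample positions on a ray would then be: origin + t_vals[:, None] * direction
```

If 'near' were set too small, many samples would land in empty space in front of the object and be wasted; matching [near, far] to where the object actually lies concentrates samples where they matter.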

As for training time, it took us at least two weeks for one ShapeNet class, and for better results or a bigger dataset it could be trained for another week or two.
