
Some doubts about quantization loss #132

Open
xiaoShen110141 opened this issue Jan 21, 2025 · 0 comments
Comments


xiaoShen110141 commented Jan 21, 2025

Thanks for your great work!
While trying to replicate the VQVAE (which I believe is actually an RQVAE), I found that the quantization loss rose continuously throughout training; I suspect this is related to the convolutional layers (\Phi) applied before the loss is computed. I tested on both 128×128 FFHQ and 256×256 FFHQ, and on 256×256 FFHQ the quantization loss reached around 1.03 after 120 epochs. Is this a normal phenomenon in VAR? (A minimal sketch of the loss I am referring to is below.)
I am sharing the training logs for FFHQ 256×256; please open them with TensorBoard: https://drive.google.com/file/d/1euW5LI6KYBO3UZyYWBsSN_BFV5lG1ajy/view?usp=sharing
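
For context, here is a minimal sketch of the multi-scale residual quantization I am describing, with the \Phi convolution applied to each upsampled quantized residual before the VQ loss is computed. This is my own reconstruction for illustration, not VAR's actual code: the class name `MultiScaleRQ`, the scale schedule, and hyperparameters like `beta` are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleRQ(nn.Module):
    """Minimal multi-scale residual quantizer sketch (an assumption for
    illustration, not VAR's actual implementation)."""
    def __init__(self, dim=32, vocab=4096, scales=(1, 2, 4, 8), beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(vocab, dim)
        # Phi: conv applied to each upsampled quantized residual before it is
        # accumulated into the reconstruction (and hence before the loss).
        self.phi = nn.Conv2d(dim, dim, kernel_size=3, padding=1)
        self.scales, self.beta = scales, beta

    def forward(self, f):                      # f: (B, C, H, W) encoder feature
        B, C, H, W = f.shape
        residual, f_hat = f.clone(), torch.zeros_like(f)
        for s in self.scales:
            # quantize a downsampled view of the current residual
            r = F.interpolate(residual, size=(s, s), mode='area')
            flat = r.permute(0, 2, 3, 1).reshape(-1, C)
            idx = torch.cdist(flat, self.codebook.weight).argmin(dim=1)
            z = self.codebook(idx).view(B, s, s, C).permute(0, 3, 1, 2)
            # upsample back to full resolution and pass through Phi
            z_up = self.phi(F.interpolate(z, size=(H, W), mode='bicubic'))
            f_hat = f_hat + z_up
            residual = residual - z_up
        # VQ loss: the codebook term pulls f_hat toward f; the commitment
        # term (weighted by beta) pulls the encoder output toward f_hat.
        q_loss = F.mse_loss(f_hat, f.detach()) \
               + self.beta * F.mse_loss(f, f_hat.detach())
        f_hat = f + (f_hat - f).detach()       # straight-through estimator
        return f_hat, q_loss
```

For example, `f_hat, q_loss = MultiScaleRQ()(torch.randn(2, 32, 8, 8))` returns the quantized feature and the scalar loss I am plotting. One thing this sketch makes visible: because each residual target depends on \Phi's trainable output at the previous scales, the quantity the loss measures shifts as \Phi is trained, so it is not obvious to me that it should decrease monotonically the way a plain VQVAE codebook loss does.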

@xiaoShen110141 xiaoShen110141 changed the title Some doubts about quantization errors Some doubts about quantization loss Jan 21, 2025