Thanks for your great work!
I recently found that, when trying to replicate the VQVAE (which I believe is essentially an RQ-VAE), the quantization loss showed a continuous upward trend during training, apparently due to the convolutional layers (\Phi) added before the loss calculation. I tested on both 128x128 and 256x256 FFHQ, and on FFHQ-256 the quantization loss reached around 1.03 after 120 training epochs. Is this a normal phenomenon in VAR?
I will share the training logs for FFHQ 256x256; please open them with TensorBoard: https://drive.google.com/file/d/1euW5LI6KYBO3UZyYWBsSN_BFV5lG1ajy/view?usp=sharing
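For reference, here is a minimal sketch of a VAR-style multi-scale residual quantizer, just to show where a \Phi convolution sits relative to the quantization/commitment loss being asked about. All names here (MultiScaleQuantizerSketch, phi, scales, beta) are illustrative assumptions, not the repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleQuantizerSketch(nn.Module):
    """Illustrative multi-scale residual quantizer with a Phi convolution.

    This is a sketch under assumptions, not VAR's actual implementation:
    `phi` stands in for the extra convolution applied to each up-sampled
    quantized residual before it is accumulated, so it sits in the path
    of the quantization loss discussed above.
    """

    def __init__(self, dim=32, vocab=512, scales=(1, 2, 4, 8), beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(vocab, dim)
        self.phi = nn.Conv2d(dim, dim, kernel_size=3, padding=1)
        self.scales = scales
        self.beta = beta

    def forward(self, f):                        # f: (B, C, H, W) encoder feature
        B, C, H, W = f.shape
        f_hat = torch.zeros_like(f)              # running multi-scale reconstruction
        residual = f
        for s in self.scales:
            # quantize the current residual at a coarser s x s resolution
            r_s = F.interpolate(residual, size=(s, s), mode='area')
            flat = r_s.permute(0, 2, 3, 1).reshape(-1, C)
            idx = torch.cdist(flat, self.codebook.weight).argmin(dim=1)
            z_q = self.codebook(idx).view(B, s, s, C).permute(0, 3, 1, 2)
            # up-sample and pass through the Phi convolution before accumulation
            f_hat = f_hat + self.phi(
                F.interpolate(z_q, size=(H, W), mode='bilinear', align_corners=False))
            residual = f - f_hat
        # quantization loss: codebook term + beta-weighted commitment term,
        # measured against the accumulated reconstruction after Phi
        vq_loss = F.mse_loss(f_hat, f.detach()) + self.beta * F.mse_loss(f_hat.detach(), f)
        # straight-through estimator so the encoder still receives gradients
        f_hat = f + (f_hat - f).detach()
        return f_hat, vq_loss


if __name__ == "__main__":
    q = MultiScaleQuantizerSketch()
    feats = torch.randn(2, 32, 8, 8)             # toy encoder output
    recon, loss = q(feats)
    print(recon.shape, loss.item())
```

In this sketch the loss is computed on the reconstruction after the learned \Phi layers, which is presumably the quantization loss whose upward trend the question describes.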
xiaoShen110141 changed the title from "Some doubts about quantization errors" to "Some doubts about quantization loss" on Jan 21, 2025.