Thank you for releasing the code. I am using a custom dataset of 10k graphs. I updated the code to log the KL divergence during training, to check whether the model overfits on the smaller dataset. While PosMSE looks fine, E_kl and X_kl always return NaN on the training samples. Could you please tell me if there is something wrong with my approach?
I set up the training metrics as:

```python
self.train_metrics = torchmetrics.MetricCollection(
    [custom_metrics.PosMSE(), custom_metrics.XKl(), custom_metrics.EKl()]
)
```
In my `training_step`, I invoke:

```python
nll, log_dict = self.compute_train_nll_loss(pred, z_t, clean_data=dense_data)
```
Finally, the method definition is
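As a standalone illustration of the failure mode I suspect (this is a generic sketch with a hypothetical `categorical_kl` helper, not the repo's `custom_metrics` implementation): a categorical KL divergence produces NaN when one distribution places exact-zero probability mass in a category, because `log(0) = -inf` and `0 * inf = nan` in the reduction.

```python
import torch

def categorical_kl(p: torch.Tensor, q: torch.Tensor, eps: float = 0.0) -> torch.Tensor:
    """KL(p || q) for batched categorical distributions along the last dim.

    With eps == 0, an exact zero in q (or p) yields -inf logs and NaN sums;
    clamping with a small eps keeps the result finite.
    """
    if eps > 0:
        q = q.clamp(min=eps)
    # Clamp p inside the log only, so p * log(p) -> 0 when p == 0.
    return (p * (p.clamp(min=1e-30).log() - q.log())).sum(dim=-1)

p = torch.tensor([[0.5, 0.5, 0.0]])
q = torch.tensor([[1.0, 0.0, 0.0]])  # zero mass where p > 0

print(categorical_kl(p, q))            # NaN without smoothing
print(categorical_kl(p, q, eps=1e-8))  # finite after clamping q
```

If the predicted node/edge distributions in my setup contain exact zeros (e.g. from one-hot targets or masked entries), that would explain the NaN in E_kl and X_kl while PosMSE stays well behaved.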
Any help would be highly appreciated.
Best,
Chinmay