Hi, @yyou1996. Thanks for your excellent work!
I have a question about how the similarities change when I run the unsupervised_TU code:
First, when I run gsimclr.py on the COLLAB dataset, I find that the loss gradually decreases, as expected.
Then, I print the pos_sim and neg_sim values (the numerator and denominator of the loss, respectively; see GraphCL/unsupervised_TU/gsimclr.py, line 132 at commit 1d43f79) during training to observe how they change.
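For reference, here is a minimal sketch of how I compute and accumulate the two quantities. It follows my reading of the NT-Xent loss in gsimclr.py, but the function and variable names below are my own illustrative ones, not the repo's exact code.

```python
import torch

def nt_xent_with_logging(x, x_aug, T=0.2):
    """NT-Xent loss split into its numerator (positive-pair similarity) and
    denominator (negative-pair similarities), so both can be logged per batch.

    x, x_aug: (batch_size, dim) embeddings of the two augmented views.
    """
    batch_size = x.size(0)
    idx = torch.arange(batch_size)

    # Cosine similarity between every embedding of view 1 and view 2, then exp(. / T).
    x_n = x / x.norm(dim=1, keepdim=True)
    x_aug_n = x_aug / x_aug.norm(dim=1, keepdim=True)
    sim_matrix = torch.exp(x_n @ x_aug_n.t() / T)  # shape (B, B)

    pos_sim = sim_matrix[idx, idx]                 # diagonal: positive pairs
    neg_sim = sim_matrix.sum(dim=1) - pos_sim      # off-diagonal: negative pairs

    loss = -torch.log(pos_sim / neg_sim).mean()
    # Return per-batch sums so they can be accumulated and printed per epoch.
    return loss, pos_sim.sum().item(), neg_sim.sum().item()
```

The "p sim" and "n sim" numbers below are these per-batch sums accumulated over each epoch.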
However, I find that pos_sim barely changes (I expected a significant increase in pos_sim), while neg_sim gradually decreases. The output is as follows:
Epoch 1, Loss 461.42543468475344, p sim 94.16790313720703, n sim 2914.846399307251
Epoch 2, Loss 437.4408010482788, p sim 93.44444546699523, n sim 2210.465358352661
Epoch 3, Loss 415.9113293170929, p sim 94.00770704746246, n sim 1676.811231994629
Epoch 4, Loss 399.2864877343178, p sim 89.69291515350342, n sim 1463.0717622756958
Epoch 5, Loss 404.180169916153, p sim 91.31595058441162, n sim 1554.844750976562
Epoch 6, Loss 407.7608342051506, p sim 95.31525194644928, n sim 1615.2845653533936
Epoch 7, Loss 382.0407901287079, p sim 91.04753322601319, n sim 1260.7995797157287
Epoch 8, Loss 379.83556154966357, p sim 89.12307305335999, n sim 1163.8013469696045
...
So my question is: has the training process mainly been reducing the similarity of negative pairs, without increasing the similarity of positive pairs? If so, does the conclusion that GraphCL improves the consistency between different graph views still hold?
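To make the question concrete, the decomposition I have in mind (assuming the positive pair is excluded from the denominator, which is what the printed values suggest) is

$$
\ell_i \;=\; -\log \frac{\exp\!\big(\mathrm{sim}(z_i, z_i')/T\big)}{\sum_{j \neq i} \exp\!\big(\mathrm{sim}(z_i, z_j')/T\big)}
\;=\; -\frac{\mathrm{sim}(z_i, z_i')}{T} \;+\; \log \sum_{j \neq i} \exp\!\big(\mathrm{sim}(z_i, z_j')/T\big),
$$

so the loss can decrease either because the positive similarity $\mathrm{sim}(z_i, z_i')$ grows or because the negative log-sum-exp term shrinks; my logs seem to show only the latter.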
Looking forward to your reply! Thanks.
scottshufe changed the title from "Question about the changes of the positive pairs and negative pairs" to "Question about the changes of the similarity of positive pairs and negative pairs" on Mar 31, 2022.