Hi, thank you for your awesome work. The experimental results of the code are quite consistent with the paper.
I have a question about the role of contrastive learning in SGL. I noticed that during training, the contrastive learning loss (computed in the function calc_ssl_loss_v2) does not keep falling.
Actually, after 7 or 8 epochs (on yelp2018, for example), while the BPR loss falls rapidly and metrics such as Recall rise rapidly, the SSL loss increases slightly, which I find quite confusing.
Introducing contrastive learning as an auxiliary task does accelerate convergence, but that is not reflected in its loss. I would like to know what you think of this. Thank you!
Hello, thank you for your work. This code reproduces the results in the paper perfectly, which I greatly admire.
I have a question about the role of contrastive learning in SGL. Unlike several other graph contrastive learning pre-training tasks (e.g., graph classification), the contrastive loss in SGL acts as an auxiliary loss and does not keep decreasing during training.
For example, on the yelp2018 dataset, the contrastive loss actually increases slightly during exactly the phase when metrics such as Recall rise fastest. As an auxiliary loss, contrastive learning indeed speeds up convergence and improves the model's performance, but this is not reflected in a decrease of the contrastive loss, which puzzles me a bit. How do you view this?
Thank you!
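For concreteness, here is the joint objective as I understand it from the paper, L = L_bpr + λ1·L_ssl + λ2·‖Θ‖². The sketch below is an illustrative PyTorch-style rewrite of that objective, not the repo's actual calc_ssl_loss_v2; names such as joint_loss, ssl_reg, and tau are placeholders I made up:

```python
import torch
import torch.nn.functional as F

def joint_loss(user_emb, pos_emb, neg_emb, view1, view2, ssl_reg=0.1, tau=0.2):
    """Illustrative SGL-style multi-task objective: BPR + weighted InfoNCE.

    Only the weighted sum is minimized, so nothing forces the SSL term
    itself to keep decreasing once the two tasks start to trade off.
    """
    # BPR loss on the main recommendation task
    pos_scores = (user_emb * pos_emb).sum(dim=-1)
    neg_scores = (user_emb * neg_emb).sum(dim=-1)
    bpr = -F.logsigmoid(pos_scores - neg_scores).mean()

    # InfoNCE between two augmented views of the same nodes:
    # the i-th rows of view1/view2 form a positive pair, and all
    # other rows in the batch act as negatives.
    z1 = F.normalize(view1, dim=-1)
    z2 = F.normalize(view2, dim=-1)
    pos_sim = (z1 * z2).sum(dim=-1) / tau   # diagonal (positive-pair) similarities
    all_sim = z1 @ z2.t() / tau             # full in-batch similarity matrix
    ssl = (torch.logsumexp(all_sim, dim=-1) - pos_sim).mean()

    # Return the parts separately so each term can be logged on its own
    return bpr + ssl_reg * ssl, bpr, ssl
```

Since the optimizer only minimizes the weighted sum, a gradient step can trade a small increase in the SSL term for a larger decrease in the BPR term, which seems consistent with the SSL loss creeping up exactly when the BPR loss drops fastest.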