
attention map visualization code #6

Open
ysj9909 opened this issue May 9, 2023 · 8 comments

@ysj9909

ysj9909 commented May 9, 2023

Thank you for the release of the code for your paper.

I was curious whether you could also share the code that produced Fig. 1 of the paper, i.e., the attention maps.

@ma-xu
Owner

ma-xu commented May 10, 2023

@ysj9909 Thanks for your interest.
I can add it by the end of May, since I'm quite busy at the moment. I will let you know once it is added.

@ysj9909
Author

ysj9909 commented May 11, 2023

Thank you!!!

@haiduo

haiduo commented Oct 22, 2023

> @ysj9909 Thanks for your interest. I can add it by the end of May, since I'm quite busy at the moment. I will let you know once it is added.

Hi, I'm a bit confused about how to obtain the corresponding attention map for different query points (different position patches). Is the attention obtained by masking that patch position, or something else? Looking forward to your reply.

@ma-xu
Owner

ma-xu commented Oct 23, 2023

@haiduo Hi, we just compute the attention at the corresponding position.
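
A minimal sketch of what "compute the attention at the corresponding position" can look like in code, assuming a standard ViT-style self-attention with a 14x14 patch grid (the function name, shapes, and head count below are illustrative assumptions, not the repository's actual API):

```python
# Sketch only: standard scaled dot-product attention, one query's row reshaped
# onto the patch grid. Not the authors' released visualization script.
import torch

def attention_map_for_query(q, k, query_index, grid_size=14):
    """q, k: (num_heads, num_tokens, head_dim) projections of the patch tokens.
    Returns a (grid_size, grid_size) attention map for a single query position."""
    scale = q.shape[-1] ** -0.5
    attn = (q @ k.transpose(-2, -1)) * scale   # (heads, tokens, tokens)
    attn = attn.softmax(dim=-1)                # normalize over key positions
    row = attn.mean(dim=0)[query_index]        # average heads, take one query's row
    return row.reshape(grid_size, grid_size)   # lay the row out on the patch grid

# Example: attention for the patch at grid position (row=3, col=7)
heads, tokens, head_dim = 8, 14 * 14, 64
q = torch.randn(heads, tokens, head_dim)
k = torch.randn(heads, tokens, head_dim)
amap = attention_map_for_query(q, k, query_index=3 * 14 + 7)
print(amap.shape)  # torch.Size([14, 14])
```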

@haiduo

haiduo commented Oct 23, 2023

> @haiduo Hi, we just compute the attention at the corresponding position.

Hello, would you mind sharing the code that generates the attention here? I still don't understand how "different query points (different position patches) give their corresponding attention maps". If possible, please add me on WeChat at abc1056530546, or email me at haiduohaiduo@outlook.com. Thank you for your reply.

@ma-xu
Owner

ma-xu commented Oct 23, 2023

@haiduo Sure. The code has not been cleaned up, but it should run directly: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link

@ysj9909 Very sorry about the late reply! The visualization code can be downloaded here: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link
It is not well organized, but it should work.

Please let me know if you have any further questions or concerns.

Best,
Xu

@haiduo

haiduo commented Oct 24, 2023

> @haiduo Sure. The code has not been cleaned up, but it should run directly: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link
>
> @ysj9909 Very sorry about the late reply! The visualization code can be downloaded here: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link It is not well organized, but it should work.
>
> Please let me know if you have any further questions or concerns.
>
> Best, Xu

I've read your code, and I now understand how you get a different attention map for each query. First, I don't quite agree that the weights computed for a single query (or token), i.e., one row of that query's global attention map, become an attention map simply by reshaping it to 14x14; at most it can be called the correlation between that token and the other tokens (i.e., a vector). Second, the phenomenon you observed is not really surprising: it has been shown before that in the ViT family, the attention maps of deeper blocks exhibit vertical stripe patterns (see: vit_demo), so the rows for different queries look essentially the same, which is what your reshaped 14x14 "attention map" amounts to.

Finally, thank you for sharing the code. If you have different thoughts, I'd be happy to discuss. Thanks.
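
A small synthetic illustration of the point above (random tensors, not outputs of the actual model): when an attention matrix is column-dominated ("vertical stripes"), its rows are nearly identical, so the reshaped 14x14 map looks the same regardless of which query is chosen.

```python
# Sketch under synthetic data: compare row similarity of a column-dominated
# attention matrix with that of a random one.
import torch

def row_similarity(attn):
    """attn: (num_tokens, num_tokens) softmaxed attention matrix.
    Returns the mean pairwise cosine similarity between its rows."""
    rows = torch.nn.functional.normalize(attn, dim=-1)
    sim = rows @ rows.t()
    n = sim.shape[0]
    return (sim.sum() - n) / (n * (n - 1))  # exclude the diagonal

tokens = 14 * 14

# Column-dominated attention: every query attends to the same few key tokens.
stripe = torch.zeros(tokens, tokens)
stripe[:, torch.randint(0, tokens, (5,))] = 1.0
stripe = stripe.softmax(dim=-1)

# Random attention for comparison.
rand = torch.randn(tokens, tokens).softmax(dim=-1)

print(row_similarity(stripe))  # ~1.0: every query's 14x14 map looks the same
print(row_similarity(rand))    # noticeably lower
```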

@ma-xu
Owner

ma-xu commented Oct 24, 2023

@haiduo Thank you for your thoughts and feedback; they help us improve our work.
