attention map visualization code #6
Thank you for the release of the code for your paper. I was curious whether you could also share the code that produced Fig. 1 of the paper, i.e. the attention maps.

Comments
@ysj9909 Thanks for your interest.
Thank you!!!
Hello, I don't quite understand how to obtain the corresponding attention map for different query points (patches at different positions). Do you mask that patch position to get the corresponding attention, or something else? Looking forward to your reply.
@haiduo Hi, we simply compute the attention at the corresponding position.
Hello, would you mind sharing the code that generates the attention maps? I still don't understand how to "obtain the corresponding attention map for different query points (patches at different positions)". If possible, add me on WeChat at abc1056530546, or email me at haiduohaiduo@outlook.com. Thank you for your reply.
@haiduo Sure. The code has not been cleaned up, but it should run as-is: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link

@ysj9909 Very sorry about the late reply! The visualization code can be downloaded here: https://drive.google.com/drive/folders/1okWf2noIrnDdOBNYUfl_iZe3LLjb_59n?usp=share_link Please let me know if you have any further questions or concerns. Best,
I've read your code and now understand how different queries correspond to different attention maps. First, I don't quite agree that the weights computed for a particular query (or token), i.e. one row of that query's global attention map, reshaped to 14x14, can be called an attention map; at most it is the correlation of that token with the other tokens (i.e. a vector). Second, the phenomenon you observed is not really surprising: it has been noted before that in ViT-family models, the attention maps of deeper blocks exhibit vertical-stripe patterns (see: vit_demo).
@haiduo Thank you for your thoughts and comments; they help us improve our work.
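
For readers following the thread, here is a minimal sketch (not the authors' released code) of the procedure discussed above: hook a ViT attention layer, take the row of the softmax attention matrix belonging to one query patch, and reshape it to the 14x14 patch grid. The timm model name, the hook-based recomputation of attention, and the choice of query index are illustrative assumptions.

```python
import torch
import timm

# A minimal sketch, assuming a standard ViT-B/16 at 224x224 resolution,
# i.e. 14x14 = 196 patch tokens plus one [CLS] token (197 total).
model = timm.create_model("vit_base_patch16_224", pretrained=True).eval()

attn_maps = []

def hook(module, inputs, output):
    # Recompute the softmax attention from the block's qkv projection,
    # since the attention weights are not stored by default.
    x = inputs[0]
    B, N, C = x.shape
    head_dim = C // module.num_heads
    qkv = module.qkv(x).reshape(B, N, 3, module.num_heads, head_dim)
    q, k, _ = qkv.permute(2, 0, 3, 1, 4)             # each: (B, heads, N, head_dim)
    attn = (q @ k.transpose(-2, -1)) * module.scale  # (B, heads, N, N)
    attn_maps.append(attn.softmax(dim=-1).detach())

handle = model.blocks[-1].attn.register_forward_hook(hook)
with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))  # replace with a real preprocessed image
handle.remove()

attn = attn_maps[0].mean(dim=1)[0]  # average over heads -> (197, 197)

# One row of this matrix is how a single query token attends to all tokens
# (a vector, as the comment above points out); dropping the [CLS] column
# and reshaping gives the 14x14 spatial map used in such visualizations.
query_idx = 1 + 7 * 14 + 7  # hypothetical query: the center patch
per_query_map = attn[query_idx, 1:].reshape(14, 14)
print(per_query_map.shape)  # torch.Size([14, 14])
```

Averaging over heads is one common choice; visualizing each head's row separately, or a different block's attention, works the same way.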