
Visualizing the attention map #105

Open
yancx8 opened this issue Nov 13, 2024 · 2 comments

yancx8 commented Nov 13, 2024

Thank you for your work!
I want to visualize the result of the cross-attention in BasicTransformerBlock. How can I do this? I tried to get the attention map, but there is no "need_weights" parameter.
Looking forward to your answer!

flymin (Member) commented Nov 18, 2024

You may check xformers if you want to acquire the attention map. xformers adopts a block-wise calculation, so no explicit "map" is ever stored in memory.
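Since the fused kernel never materializes the full attention matrix, one workaround is to recompute the map yourself from the module's inputs using a forward hook. Below is a minimal sketch, assuming a diffusers-style `Attention` module (with `to_q`/`to_k` projections and a `heads` attribute) and that cross-attention submodules are named `attn2`, as in `BasicTransformerBlock`; attribute and module names may differ across versions, and `unet` is a placeholder for whatever model holds the blocks.

```python
import torch

attn_maps = {}

def make_hook(name):
    # Recompute softmax(QK^T / sqrt(d)) from the hook inputs, since the
    # xformers block-wise kernel never stores the full attention map.
    def hook(attn, args, kwargs, output):
        hidden_states = args[0] if args else kwargs["hidden_states"]
        context = kwargs.get("encoder_hidden_states")
        if context is None:
            context = hidden_states  # no context given: self-attention
        q = attn.to_q(hidden_states)
        k = attn.to_k(context)
        # (batch, seq, heads * dim) -> (batch, heads, seq, dim)
        b = q.shape[0]
        head_dim = q.shape[-1] // attn.heads
        q = q.view(b, -1, attn.heads, head_dim).transpose(1, 2)
        k = k.view(b, -1, attn.heads, head_dim).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) * head_dim ** -0.5
        attn_maps[name] = scores.softmax(dim=-1).detach().cpu()
    return hook

# `unet` is a placeholder for the model containing the transformer blocks;
# "attn2" is the cross-attention submodule inside BasicTransformerBlock.
for name, module in unet.named_modules():
    if name.endswith("attn2"):
        module.register_forward_hook(make_hook(name), with_kwargs=True)
```

Note that this pays the quadratic memory cost that xformers deliberately avoids, so it is best restricted to a few layers of interest or run at reduced resolution; `register_forward_hook(..., with_kwargs=True)` requires PyTorch 2.0 or later.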


This issue is stale because it has been open for 7 days with no activity. If you do not have any follow-ups, the issue will be closed soon.

github-actions bot added the stale label Nov 25, 2024