Issues: Dao-AILab/flash-attention
Issues list
#1459 Conflict When Installing flash-attn 2.7.3 and 3.0.0b1 Together (opened Jan 24, 2025 by quanta42)
#1450 BUG? static_assert(!(!Mma1_is_RS && !IntraWGOverlap), "Mma1 must be RS if IntraWGOverlap is enabled"); (opened Jan 20, 2025 by ziyuhuang123)
#1439 IncompatibleTypeErrorImpl('invalid operands of type pointer<int64> and triton.language.int32') (opened Jan 11, 2025 by wuyouliaoxi)
#1423 ERROR: No matching distribution found for flash-attn==2.6.3+cu123torch2.4cxx11abifalse (opened Jan 6, 2025 by carolynsoo)