BUG? get the wrong value when logit_scale is 0 #1461

Open
shunshen93 opened this issue Jan 24, 2025 · 0 comments
shunshen93 commented Jan 24, 2025

https://github.com/Dao-AILab/flash-attention/blob/v2.7.3/flash_attn/ops/triton/cross_entropy.py#L54

  1. When logit_scale is 0.0, -inf * 0.0 is nan.
  2. When logit_scale is -1.0, -inf * (-1.0) is inf, and inf - inf is nan.

So whenever logit_scale is not greater than 0, the result will be nan (a minimal sketch of the arithmetic follows below).
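For illustration, here is a small standalone sketch of the IEEE 754 float arithmetic involved. It is plain Python, not the Triton kernel itself, and it only assumes that out-of-bounds columns are loaded as -inf and then multiplied by logit_scale (as the linked line appears to do), with a max-subtraction later in the kernel:

```python
import math

# Assumption: out-of-bounds columns are loaded as -inf and then scaled,
# i.e. something like tl.load(..., other=-inf) * logit_scale.
masked_logit = -math.inf

# logit_scale == 0.0: -inf * 0.0 is nan, which then propagates through
# the subsequent reductions.
print(masked_logit * 0.0)        # nan

# logit_scale == -1.0: -inf * -1.0 is +inf; a later "logits - max_logits"
# style subtraction then computes inf - inf, which is also nan.
scaled = masked_logit * -1.0     # inf
print(scaled - scaled)           # nan
```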

There is also a separate compile error when n_cols is 1.

shunshen93 changed the title from "BUG? get the wrong value when logit_scale is smaller than 0" to "BUG? get the wrong value when logit_scale is 0" on Jan 24, 2025