Fail when metric given as function #18
Comments
I mean it finishes training on the first batch but then fails on evaluation.
So in my case the shapes of
in the case of
Tracking this further, it looks like the metric passed as a function is not applied to each sample in a batch, so I've added comments around the part that creates
Output:
In a
but still a metric passed as a function returns a different shape than one passed as a string.
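The shape difference described above can be sketched without Keras at all. As an assumption (not taken from the elided snippets in this thread): built-in string metrics like `'accuracy'` return one value per sample, shape `(batch_size,)`, which the framework then averages, while a custom function that already reduces to a scalar returns shape `()`. A minimal numpy illustration of that mismatch:

```python
import numpy as np

def accuracy_per_sample(y_true, y_pred):
    # Built-in-style metric: one value per sample, shape (batch_size,)
    return (np.argmax(y_true, axis=-1) == np.argmax(y_pred, axis=-1)).astype(np.float32)

def accuracy_scalar(y_true, y_pred):
    # Custom-function style that already reduces over the batch: shape ()
    return np.mean(np.argmax(y_true, axis=-1) == np.argmax(y_pred, axis=-1))

y_true = np.eye(3)[[0, 1, 2, 1]]   # 4 one-hot labels
y_pred = np.eye(3)[[0, 1, 0, 1]]   # 3 of the 4 predictions are correct

per_sample = accuracy_per_sample(y_true, y_pred)  # shape (4,)
scalar = accuracy_scalar(y_true, y_pred)          # shape ()
```

Both carry the same information once averaged, but code that indexes the per-sample scores breaks when handed the already-reduced scalar.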
For me, a fix is to change (in
to
but this solution is not generic enough. And here you cannot use
The problem I now see with my fix is that I do get one value for my metric (for the whole batch), but I think it is wrong, since this metric should not be calculated on a single sample but on the whole batch. Hence I do not know whether my PR really fixed it and the remaining problem is with my metric, or the other way around.
Ok, so the problem is that my metric needs to be evaluated on the whole batch and does not make sense when calculated on a single sample; hence it cannot be fed to
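The distinction matters because some metrics are only defined across a batch. As a hypothetical example (the thread does not say which metric was used): a Pearson-correlation metric is meaningful over a batch of predictions but degenerates on a single sample, where centering leaves nothing to correlate:

```python
import numpy as np

def pearson_r(y_true, y_pred):
    # Batch-level metric: correlation across the batch dimension.
    yt = y_true - y_true.mean()
    yp = y_pred - y_pred.mean()
    return float((yt * yp).sum() / (np.sqrt((yt ** 2).sum() * (yp ** 2).sum()) + 1e-12))

batch_true = np.array([0.0, 1.0, 2.0, 3.0])
batch_pred = np.array([0.1, 0.9, 2.2, 2.8])

r_batch = pearson_r(batch_true, batch_pred)              # close to 1.0
r_single = pearson_r(batch_true[:1], batch_pred[:1])     # degenerate: 0.0
```

Evaluating such a metric sample-by-sample, as a per-sample wrapper would, cannot produce the intended value, which matches the conclusion above.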
No problem. I see your solution is more elegant than mine. :)
I have a model that I compile with
and it fails after the first epoch
When I run with 'accuracy' as a metric, e.g.
everything is fine.
I've tried it even with a dummy metric
and it also fails.
I am using CUDA 10.0, TF '1.13.0-rc1', Keras 2.2.4, and the latest keras-importance-sampling installed via pip. Without ImportanceTraining everything runs fine.
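The failure mode discussed in the comments can be sketched in isolation. This is a hypothetical mock, not the library's actual code: assume an importance-sampling wrapper needs one score per sample and therefore expects a metric to return shape `(batch_size,)`. A metric function that has already reduced over the batch then trips a shape check:

```python
import numpy as np

def per_sample_scores(metric_fn, y_true, y_pred):
    # Hypothetical sketch of a wrapper that needs one score per sample,
    # e.g. to weight samples; expects metric_fn -> shape (batch_size,).
    scores = np.asarray(metric_fn(y_true, y_pred))
    if scores.shape != (len(y_true),):
        raise ValueError(
            f"metric returned shape {scores.shape}, expected ({len(y_true)},)"
        )
    return scores

y_true = np.array([1.0, 2.0])
y_pred = np.array([1.5, 2.0])

# Per-sample metric: squared error per sample -> works
ok = per_sample_scores(lambda t, p: (t - p) ** 2, y_true, y_pred)

# Batch-reduced metric: already a scalar -> fails the shape check
try:
    per_sample_scores(lambda t, p: np.mean((t - p) ** 2), y_true, y_pred)
except ValueError as e:
    print("error:", e)
```

Under this assumption, a string metric like `'accuracy'` maps to a per-sample function and works, while a custom function that reduces over the batch fails after the first epoch, consistent with the report above.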