Leaky Quantized ReLU fix #961
Changes from all commits
@@ -0,0 +1,62 @@
from pathlib import Path

import numpy as np
import pytest
from qkeras.qlayers import QActivation
from qkeras.quantizers import quantized_relu
from tensorflow.keras.models import Sequential

import hls4ml

test_root_path = Path(__file__).parent


def randX(batch_size, N):
    return np.random.rand(batch_size, N)


@pytest.fixture(scope='module')
def randX_1000_1():
    return randX(1000, 1)


@pytest.mark.parametrize(
    'quantizer',
    [
        (quantized_relu(4, negative_slope=0.5)),
        (quantized_relu(4, 2, negative_slope=0.5)),
        (quantized_relu(8, negative_slope=0.125)),
        (quantized_relu(8, 4, negative_slope=1.0)),
        (quantized_relu(10, negative_slope=0.25)),
        (quantized_relu(10, 5, negative_slope=0.5)),
        (quantized_relu(10, 5, negative_slope=0.25)),
    ],
)
@pytest.mark.parametrize('backend', ['Vivado', 'Vitis', 'Quartus'])
@pytest.mark.parametrize('io_type', ['io_parallel', 'io_stream'])
def test_quantizer(randX_1000_1, quantizer, backend, io_type):
    '''
    Test a single quantizer as an Activation function.
    Uses numpy's assert_allclose to check that the relative difference between the
    converted layer's output and qkeras's is below 10^-5.
    '''
    X = randX_1000_1
    X = np.round(X * 2**10) * 2**-10  # make it an exact ap_fixed<16,6>
    model = Sequential()
    model.add(QActivation(input_shape=(1,), activation=quantizer, name='quantizer'))
    model.compile()

    config = hls4ml.utils.config_from_keras_model(model, granularity='name')
    output_dir = str(
        test_root_path
        / 'hls4mlprj_qkeras_quantizer_{}_{}_{}_{}_{}'.format(
            quantizer.__class__.__name__, quantizer.bits, quantizer.integer, backend, io_type
        )
    )
    hls_model = hls4ml.converters.convert_from_keras_model(
        model, hls_config=config, output_dir=output_dir, backend=backend, io_type=io_type
    )
    hls_model.compile()

    y_qkeras = model.predict(X)
    y_hls4ml = hls_model.predict(X)
    np.testing.assert_allclose(y_hls4ml, y_qkeras, rtol=1e-5, atol=0)
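
For a quick feel of what the new cases exercise: qkeras quantizers are callable objects, so the leaky behaviour can be inspected directly. A minimal sanity-check sketch, not part of the PR, assuming qkeras and TensorFlow are installed:

    import numpy as np
    import tensorflow as tf
    from qkeras.quantizers import quantized_relu

    # Same quantizer as the first parametrized case above.
    q = quantized_relu(4, negative_slope=0.5)
    x = tf.constant([-1.0, -0.25, 0.0, 0.3, 0.9])
    # With negative_slope set, negative inputs are scaled by the slope and
    # then quantized, instead of being clamped to zero as in a plain
    # quantized ReLU.
    print(np.array(q(x)))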
@@ -273,6 +273,7 @@ def randX_1000_1():
        (quantized_relu(8, 4)),
        (quantized_relu(10)),
        (quantized_relu(10, 5)),
        (quantized_relu(10, 5, negative_slope=0.25)),
    ],
)
@pytest.mark.parametrize('backend', ['Vivado', 'Vitis', 'Quartus'])

Review comment on the added negative_slope case:

    Aren't you already testing this in the other test?

Author's reply:

    In order to create the test suite requested for the PR, I added the line to the
    already-present qkeras test; however, due to problems with the environment required
    for the pytests, some issues unrelated to my fix came up. So I have written a
    separate script just to show that the "new" layer passes an accuracy test. If the …
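
As an aside, only this parametrized test needs to run to check the new case. A hypothetical invocation from Python (the file name test_qkeras.py is assumed from context; adjust to the actual test path):

    import pytest

    # Select the quantizer test by name; all backend/io_type/quantizer
    # combinations, including the new leaky case, will be collected.
    pytest.main(['-k', 'test_quantizer', 'test_qkeras.py'])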
Review comment on the activation check:

    I wonder, in what cases can this actually be None? One has to explicitly define the
    model like that, against all Keras tutorials.

Author's reply:

    I have found out that if the QBatchNormalization layer is used without specifying an
    activation function, the default layer.activation is None, which breaks the if
    statement when the linear check is performed. Now that I am thinking about this
    again, I am not sure whether another case should be added here, forcing a linear
    activation.
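
A minimal sketch of the guard under discussion, assuming the converter inspects layer.activation the way the comments describe (the actual hls4ml handler may be structured differently): a missing activation is treated the same as an explicit 'linear' one.

    def activation_is_linear(layer):
        # QBatchNormalization built without an activation argument leaves
        # layer.activation as None; treat that the same as Keras's 'linear'
        # activation so the converter skips inserting an activation node.
        activation = getattr(layer, 'activation', None)
        return activation is None or getattr(activation, '__name__', '') == 'linear'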