
Questions about the parameters in the config_HSIT.json file, the number of heads in multi-head attention, and the calculation of metrics #12

Open
EchoPhD opened this issue Apr 5, 2023 · 2 comments


EchoPhD commented Apr 5, 2023

@wgcban
Hello sir,
Thank you for your outstanding work on HyperTransformer and for providing the code. To cite and reproduce your paper properly, I have a few questions. First, the paper states that the best performance was achieved with 16 heads in the multi-head attention, yet the best model you provide uses 8 heads in config_HSIT.json, and the RGB parameters in the same file contain errors. Could you provide the correct best model and config_HSIT.json file? It is difficult to reproduce your method without them.
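
For clarity, here is a minimal sketch of what I mean by overriding the head count before training. The key name `num_heads` is only a guess on my part; I am not sure of the exact field name or nesting in config_HSIT.json:

```python
# Minimal sketch (hypothetical key name): bump the attention-head count
# in the shipped config to the 16 heads reported in the paper.
import json

with open("configs/config_HSIT.json") as f:
    cfg = json.load(f)

# "num_heads" is an assumed key; the real config may nest it
# differently (e.g., under a model/transformer section).
cfg["num_heads"] = 16

with open("configs/config_HSIT_16heads.json", "w") as f:
    json.dump(cfg, f, indent=2)
```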
Second, regarding the metric calculation: did you use the numbers produced directly by the code, or did you re-calculate them in MATLAB?
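
To illustrate why this matters, here is a minimal NumPy sketch of the textbook SAM and ERGAS definitions (not your evaluation code). The small choices below (degrees vs. radians, zero-vector handling, the resolution ratio) are exactly where Python and MATLAB implementations tend to diverge:

```python
import numpy as np

def sam(ref, est, eps=1e-12):
    """Mean Spectral Angle Mapper in degrees; ref, est have shape (H, W, B)."""
    dot = np.sum(ref * est, axis=-1)
    norms = np.linalg.norm(ref, axis=-1) * np.linalg.norm(est, axis=-1)
    # eps and clipping guard against zero-norm pixels and rounding past +/-1
    angles = np.arccos(np.clip(dot / (norms + eps), -1.0, 1.0))
    return float(np.degrees(angles.mean()))

def ergas(ref, est, ratio=4, eps=1e-12):
    """ERGAS; ratio is the ground-sampling ratio between PAN and LR bands."""
    rmse_sq = np.mean((ref - est) ** 2, axis=(0, 1))   # per-band squared RMSE
    means_sq = np.mean(ref, axis=(0, 1)) ** 2          # per-band squared mean
    return float(100.0 / ratio * np.sqrt(np.mean(rmse_sq / (means_sq + eps))))
```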
Your response would be greatly appreciated, and I am very grateful for your work.


EchoPhD commented May 3, 2023

@wgcban There are many errors in the code, and more have turned up since. Could you please provide the final, corrected code and the best model you trained? We look forward to reproducing your excellent work.

@hachreak

Hi @wgcban @HaiMaoShiTang,
is there any news about working code/config/pretrained weights? :)
Thanks a lot
