🦠 Model Request: NPBERT-Antimalaria #904
Comments
@GemmaTuron I couldn't access the complete article.
Hi @leilayesufu
Try to run the BERT model with their pretrained checkpoints.
@GemmaTuron I tried running the code as seen here: https://github.com/mldlproject/2021-NPBERT-Antimalaria/tree/main/training/NPBERT_pretrained_model/save_model
The model is on hold since the checkpoints are not provided. I've opened an issue on their repository asking for them.
@GemmaTuron Did they ever get back?
No, I'll do a follow-up!
This will not be implemented since we do not have the checkpoints and the authors haven't responded. |
Model Name
Predicting Antimalarial Activity in Natural Products Using Pretrained Bidirectional Encoder Representations from Transformers
Model Description
This model uses a molecular encoding scheme based on Bidirectional Encoder Representations from Transformers (BERT), employing a pretrained encoding model called NPBERT. Four machine learning algorithms, k-Nearest Neighbors (k-NN), Support Vector Machines (SVM), eXtreme Gradient Boosting (XGB), and Random Forest (RF), were used to build prediction models on top of these encodings. The results indicate that the SVM models outperform the others, and that the proposed NPBERT molecular encoding scheme is more effective than existing methods.
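The two-stage workflow described above (encode each molecule into a fixed-length vector, then train a classical classifier on those vectors) can be sketched with a small, dependency-free toy example. The embeddings below are made-up stand-ins for NPBERT output vectors, and the classifier is a minimal pure-Python k-NN, one of the four algorithms the paper evaluates; this is an illustration of the pipeline shape, not the authors' actual code.

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training vectors."""
    dists = sorted(
        (math.dist(vec, query), lab) for vec, lab in zip(train, labels)
    )
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical stand-ins for NPBERT molecular embeddings (real NPBERT
# vectors would be high-dimensional outputs of the pretrained encoder).
train_embeddings = [
    [0.9, 0.1], [0.8, 0.2],  # compounds labeled antimalarial-active
    [0.1, 0.9], [0.2, 0.8],  # compounds labeled inactive
]
activity = ["active", "active", "inactive", "inactive"]

print(knn_predict(train_embeddings, activity, [0.85, 0.15]))  # active
```

In the actual model the classifier (the paper's best performer being an SVM) would be fit on NPBERT embeddings of the training compounds' SMILES strings.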
Slug
NPBERT-Antimalaria
Tag
Malaria, P. falciparum
Publication
https://pubs.acs.org/doi/full/10.1021/acs.jcim.1c00584
Source Code
https://github.com/mldlproject/2021-NPBERT-Antimalaria
License
None