
# SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation

This is the official implementation of the paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (ICLR 2024).

Code coming soon. Stay tuned!

## Citation

```bibtex
@inproceedings{bazaga2024unsupervised,
  title={Unsupervised Pretraining for Fact Verification by Language Model Distillation},
  author={Adrián Bazaga and Pietro Liò and Gos Micklem},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=1mjsP8RYAw}
}
```

## Contact

For feedback, questions, or press inquiries, please contact Adrián Bazaga.