Where to get the pretrained model with max-seq-length over 512? #7
Comments
I have the same concern when using the LTP model on long documents, where the token length is greater than 512. Have you figured out how to resolve this? Thanks!
Sorry, I haven't found a solution either.
Please find the comment at #8, thank you.
OK, thank you for your reply!!
I am trying to train an LTP model to handle long documents, but where can I get a pretrained model with a max-seq-length over 512? As far as I know, the pretrained models provided by Hugging Face are all limited to a length of 512.
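One common workaround (not specific to LTP, and not something this thread confirms as the authors' approach) is to initialize a longer position-embedding table from the pretrained 512-entry one, e.g. by linearly interpolating its rows, and then fine-tune the model at the new length. A minimal NumPy sketch of the interpolation idea — the shapes and the function name here are illustrative assumptions, not part of the LTP codebase:

```python
import numpy as np

def extend_position_embeddings(pe: np.ndarray, new_len: int) -> np.ndarray:
    """Stretch a (old_len, hidden) position-embedding table to new_len rows
    by linear interpolation along the position axis."""
    old_len, _hidden = pe.shape
    # Map each new position onto the old [0, old_len - 1] range.
    new_pos = np.linspace(0.0, old_len - 1, num=new_len)
    lo = np.floor(new_pos).astype(int)           # lower neighbor row
    hi = np.minimum(lo + 1, old_len - 1)         # upper neighbor row
    frac = (new_pos - lo)[:, None]               # blend weight per new row
    return (1.0 - frac) * pe[lo] + frac * pe[hi]

# Example: extend a pretrained 512-position table to 1024 positions.
pe512 = np.random.randn(512, 768).astype(np.float32)
pe1024 = extend_position_embeddings(pe512, 1024)
print(pe1024.shape)  # (1024, 768)
```

With a Hugging Face checkpoint one would then write the extended table back into the model's position-embedding module and raise `max_position_embeddings` in the config before fine-tuning. Alternatively, models pretrained for long inputs (e.g. Longformer or BigBird) ship with checkpoints that already support sequences beyond 512.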