SocratiQ AI
#535
Replies: 2 comments
-
I can't help you! Sorry.
-
This is a good ask. @kai4avaya is still working on the documentation, and we will be sure to capture these requests there, and possibly provide the ability to run things locally. I would think a simple local endpoint API would suffice.
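
For concreteness, here is a minimal sketch of what such a local endpoint could look like, serving a quantized Llama behind an OpenAI-style chat route via llama-cpp-python. Note the assumptions: that SocratiQ's frontend can be pointed at a configurable base URL (not confirmed in this thread), and that `llama-cpp-python`, `fastapi`, and `uvicorn` are installed with a local GGUF model file (the path below is a placeholder):

```python
# Hypothetical local endpoint for SocratiQ-style chat requests.
# Assumes: pip install llama-cpp-python fastapi uvicorn
# and a quantized GGUF model on disk (path below is a placeholder).
from fastapi import FastAPI
from pydantic import BaseModel
from llama_cpp import Llama

app = FastAPI()

# Load a quantized model once at startup; substitute your own GGUF file.
llm = Llama(model_path="models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=4096)

class ChatRequest(BaseModel):
    # OpenAI-style message list: [{"role": "user", "content": "..."}]
    messages: list[dict]

@app.post("/v1/chat/completions")
def chat(req: ChatRequest):
    # llama-cpp-python returns a dict shaped like an OpenAI
    # chat-completions response, so the frontend would not need
    # to change its response parsing.
    return llm.create_chat_completion(messages=req.messages)

# Run with: uvicorn server:app --port 8000
```

Alternatively, llama.cpp's built-in `llama-server` and Ollama already expose OpenAI-compatible endpoints, which might make a custom server unnecessary; the open question would then just be making the frontend's base URL configurable.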
-
I would like to know how I can run SocratiQ AI locally. When rendering the book locally, it does not work. Also, is it possible to use a smaller open model (a quantized LLaMA, for example)? Thanks