Flowise integration (working) #335
cnndabbler started this conversation in Show and tell
Really happy to see the llama-cpp-python server (port 8000) working well with Flowise using the ChatLocalAI node!!!

Note: so much happier than filing a bug report lol.

Thanks to llama.cpp for the latest updates too.
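For anyone reproducing this: the server side is llama-cpp-python's built-in OpenAI-compatible server, which listens on port 8000 by default. A minimal launch sketch (the model path below is a placeholder, not the one used in this setup):

```shell
python3 -m llama_cpp.server --model ./models/model.gguf
```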
Setting up the Flowise node to point to the llama-cpp-python server
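Exact node settings vary by Flowise version, but with the ChatLocalAI node the Base Path would presumably point at the llama-cpp-python endpoint, e.g. http://127.0.0.1:8000/v1. A quick sanity check that the server is reachable before wiring it into Flowise:

```python
import requests

# llama-cpp-python exposes an OpenAI-compatible API; /v1/models lists the loaded model
resp = requests.get("http://127.0.0.1:8000/v1/models")
print(resp.json())
```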
Running code to execute the Flowise pipeline
```python
import requests
import textwrap

# Flowise prediction endpoint for the chatflow (ID redacted)
API_URL = "http://127.0.0.1:3000/api/v1/prediction/xxxxxxxxxxxxxxxxxxxxxxxxxxx"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "who is the king of France?",
    "max_Tokens": 20
})

# coerce to str: the endpoint may return JSON rather than a plain string
print(textwrap.fill(str(output)))
```
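Depending on the Flowise version, the prediction endpoint returns either a bare string or a JSON object with the answer under a "text" key (the key name here is an assumption, so check the actual response shape). A slightly more defensive way to pull out just the answer:

```python
# handle both shapes: plain string or {"text": "..."} (assumed key)
answer = output.get("text", str(output)) if isinstance(output, dict) else output
print(textwrap.fill(answer))
```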
Results (screenshot)