This repository has been archived by the owner on Nov 13, 2024. It is now read-only.

Revert flag to no rag #106

Merged · 6 commits · Oct 29, 2023
2 changes: 1 addition & 1 deletion README.md
@@ -188,8 +188,8 @@ INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
> **_📝 NOTE:_**
>
> The `canopy start` command will keep the terminal occupied. To proceed with the next steps, please open a new terminal window
> and make sure all the environment variables described in the [installation](#how-to-install) section are set.
> If you want to run the service in the background, you can use the following command: **```nohup canopy start &```**.
> However, this is not recommended.


### 4. Chat with your data
10 changes: 5 additions & 5 deletions src/canopy_cli/cli.py
@@ -377,7 +377,7 @@ def _chat(
RAG-infused ChatBot will respond. You can continue the conversation by entering
more messages. Hit Ctrl+C to exit.

-To compare RAG-infused ChatBot with the original LLM, run with the `--baseline`
+To compare RAG-infused ChatBot with the original LLM, run with the `--no-rag`
flag, which would display both models' responses side by side.
"""

@@ -387,11 +387,11 @@ def _chat(
help="Stream the response from the RAG chatbot word by word")
@click.option("--debug/--no-debug", default=False,
help="Print additional debugging information")
-@click.option("--baseline/--no-baseline", default=False,
-              help="Compare RAG-infused Chatbot with baseline LLM",)
+@click.option("--rag/--no-rag", default=True,
+              help="Compare RAG-infused Chatbot with vanilla LLM",)
@click.option("--chat-service-url", default="http://0.0.0.0:8000",
help="URL of the Canopy service to use. Defaults to http://0.0.0.0:8000")
-def chat(chat_service_url, baseline, debug, stream):
+def chat(chat_service_url, rag, debug, stream):
check_service_health(chat_service_url)
note_msg = (
"🚨 Note 🚨\n"
@@ -431,7 +431,7 @@ def chat(chat_service_url, baseline, debug, stream):
print_debug_info=debug,
)

-if baseline:
+if not rag:
_ = _chat(
speaker="Without Context (No RAG)",
speaker_color="yellow",
Expand Down
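The diff above replaces Click's `--baseline/--no-baseline` option (default `False`) with a paired `--rag/--no-rag` flag defaulting to `True`, and inverts the check to `if not rag:`. As a minimal stand-alone sketch of the same paired-flag pattern, here is the standard library's `argparse` equivalent rather than Click itself; the program name `canopy-chat-sketch` is hypothetical, not part of the project:

```python
import argparse

# Hypothetical stdlib-only sketch of the "--rag/--no-rag" paired flag.
# argparse.BooleanOptionalAction (Python 3.9+) auto-generates the "--no-rag"
# negation, mirroring @click.option("--rag/--no-rag", default=True).
parser = argparse.ArgumentParser(prog="canopy-chat-sketch")
parser.add_argument(
    "--rag",
    action=argparse.BooleanOptionalAction,
    default=True,
    help="Compare RAG-infused Chatbot with vanilla LLM",
)

args = parser.parse_args(["--no-rag"])
print(args.rag)  # False
```

With no flag, `args.rag` stays `True`; passing `--no-rag` flips it to `False`, which is the condition (`if not rag:`) that triggers the side-by-side "Without Context (No RAG)" chat in `cli.py`.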