An example chat CLI with some agent prompts (#1)
* An example chat CLI with some agent prompts
* Document example pizzeria chat
* Update pizzeria agent and example
Showing 5 changed files with 189 additions and 0 deletions.
@@ -0,0 +1,51 @@
# Examples

## User Chat

Chat with a pre-defined agent using the following command:

```bash
bundle exec examples/groq-user-chat.rb
# or
bundle exec examples/groq-user-chat.rb --agent-prompt examples/agent-prompts/helloworld.yml
```

There are two example agent prompts available:

- `examples/agent-prompts/helloworld.yml` (the default)
- `examples/agent-prompts/pizzeria-sales.yml`

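Both prompt files are small YAML documents read by `examples/groq-user-chat.rb`. As a rough sketch of the format (the persona text below is only a placeholder; see the real files in this commit), an agent prompt looks like:

```yaml
---
# Placeholder persona; replace with the agent's actual instructions.
system: |-
  Describe the agent's persona and any rules it should follow here.
agent_emoji: "🤖"   # printed before each agent reply
user_emoji: "👤"    # printed as the input prompt
can_go_first: true  # optional: let the agent send the opening message
```

The script also accepts `system_prompt` as an alternative key for the persona text.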
At the prompt, either talk to the AI agent or use one of the special commands:

- `exit` to exit the conversation
- `summary` to get a summary of the conversation so far

### Pizzeria

Run the pizzeria example with the following command:

```bash
bundle exec examples/groq-user-chat.rb --agent-prompt examples/agent-prompts/pizzeria-sales.yml
```

> 🍕 Hello! Thank you for calling our pizzeria. I'm happy to help you with your inquiry. Do you have a question about our menu or would you like to place an order?
>
> 😋 What's your cheapest?
>
> 🍕 Our cheapest menu item is the Garlic Knots (6 pieces), which is served with marinara sauce and priced at $5. Would you like to add it to your order or is there something else that catches your attention?
>
> 😋 Yes. And a cheap pizza.
>
> 🍕 Our cheapest pizza option is the Veggie Pizza. It comes with tomato sauce, mozzarella, and a mix of bell peppers, onions, and mushrooms, all for $13. Would you like to order the Veggie Pizza and the Garlic Knots together?
>
> 😋 Yep
>
> 🍕 I've taken note of your order. You're ordering a Veggie Pizza and 6 Garlic Knots. Your total comes out to be $18. Would you like to pay by credit card, cash, or take any other payment method?
>
> 😋 summary
>
> Here is a summary of the current conversation:
>
> The conversation started with a customer calling the pizzeria and speaking with an AI assistant. The assistant offered to help with menu inquiries or taking an order. The customer was considering ordering from the menu. The assistant presented the cheapest menu item, Garlic Knots, and asked if the customer wanted to add it to their order. The customer may have been interested in other options as well. The assistant then presented the cheapest pizza option, the Veggie Pizza, and asked if the customer wanted to order it along with the Garlic Knots. The customer agreed, and the assistant took note of the order, which consisted of a Veggie Pizza and 6 Garlic Knots for a total of $18. The assistant asked how the customer would like to pay for their order.
>
> 😋 exit

@@ -0,0 +1,6 @@
---
system: |-
  I am a friendly agent who always replies to any prompt
  with a pleasant "Hello" and wishing them well.
agent_emoji: "🤖"
user_emoji: "👤"

@@ -0,0 +1,19 @@
---
system: |-
  You are a phone operator at a busy pizzeria. Your responsibilities include answering calls and online chats from customers who may ask about the menu, wish to place or change orders, or inquire about opening hours.
  Here are some of our popular menu items:
  <menu>
  Margherita Pizza: Classic with tomato sauce, mozzarella, and basil - $12
  Pepperoni Pizza: Tomato sauce, mozzarella, and a generous layer of pepperoni - $14
  Veggie Pizza: Tomato sauce, mozzarella, and a mix of bell peppers, onions, and mushrooms - $13
  BBQ Chicken Pizza: BBQ sauce, chicken, onions, and cilantro - $15
  Garlic Knots (6 pieces): Served with marinara sauce - $5
  Cannoli: Classic Sicilian dessert filled with sweet ricotta cream - $4 each
  </menu>
  Your goal is to provide accurate information, confirm order details, and ensure a pleasant customer experience. Please maintain a polite and professional tone, be prompt in your responses, and ensure accuracy in order transmission.
agent_emoji: "🍕"
user_emoji: "😋"
can_go_first: true

@@ -0,0 +1,109 @@
#!/usr/bin/env ruby

require "optparse"
require "groq"
require "yaml"

include Groq::Helpers

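# Default options; each can be overridden by the command-line flags parsed below.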
@options = {
  model: "llama3-8b-8192",
  # model: "llama3-70b-8192",
  agent_prompt_path: File.join(File.dirname(__FILE__), "agent-prompts/helloworld.yml"),
  timeout: 20
}
OptionParser.new do |opts|
  opts.banner = "Usage: ruby script.rb [options]"

  opts.on("-m", "--model MODEL", "Model name") do |v|
    @options[:model] = v
  end

  opts.on("-a", "--agent-prompt PATH", "Path to agent prompt file") do |v|
    @options[:agent_prompt_path] = v
  end

  opts.on("-t", "--timeout TIMEOUT", "Timeout in seconds") do |v|
    @options[:timeout] = v.to_i
  end

  opts.on("-d", "--debug", "Enable debug mode") do |v|
    @options[:debug] = v
  end
end.parse!

raise "Missing --model option" if @options[:model].nil?
raise "Missing --agent-prompt option" if @options[:agent_prompt_path].nil?

def debug?
  @options[:debug]
end

# Read the agent prompt from the file
agent_prompt = YAML.load_file(@options[:agent_prompt_path])
user_emoji = agent_prompt["user_emoji"]
agent_emoji = agent_prompt["agent_emoji"]
system_prompt = agent_prompt["system_prompt"] || agent_prompt["system"]
can_go_first = agent_prompt["can_go_first"]

# Initialize the Groq client
@client = Groq::Client.new(model_id: @options[:model], request_timeout: @options[:timeout]) do |f|
  if debug?
    require "logger"

    # Create a logger instance
    logger = Logger.new($stdout)
    logger.level = Logger::DEBUG

    f.response :logger, logger, bodies: true # Log request and response bodies
  end
end

puts "Welcome to the AI assistant! I'll respond to your queries."
puts "You can quit by typing 'exit'."

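# Flatten the conversation into a plain-text transcript and ask the model to summarise it.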
def produce_summary(messages)
  combined = messages.map do |message|
    if message["role"] == "user"
      "User: #{message["content"]}"
    else
      "Assistant: #{message["content"]}"
    end
  end.join("\n")
  response = @client.chat([
    S("You are excellent at reading a discourse between a human and an AI assistant and summarising the current conversation."),
    U("Here is the current conversation:\n\n------\n\n#{combined}")
  ])
  puts response["content"]
end

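# Seed the conversation history with the agent's system prompt.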
messages = [S(system_prompt)]

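# If the agent prompt sets can_go_first, let the agent send the opening message.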
if can_go_first
  response = @client.chat(messages)
  puts "#{agent_emoji} #{response["content"]}"
  messages << response
end

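# Main chat loop: read input, handle the special "exit" and "summary" commands,
# otherwise send the conversation to the model and print the reply.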
loop do
  print "#{user_emoji} "
  user_input = gets.chomp

  break if user_input.downcase == "exit"

  # produce summary
  if user_input.downcase == "summary"
    produce_summary(messages)
    next
  end

  messages << U(user_input)

  # Use Groq to generate a response
  response = @client.chat(messages)

  message = response.dig("content")
  puts "#{agent_emoji} #{message}"

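  # Keep the assistant's reply in the history so the next turn has full context.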
  messages << response
end