System message doesn't seem to be sent to LLM when using LiteLLMModel #302

Open
dadaphl opened this issue Jan 22, 2025 · 0 comments
dadaphl commented Jan 22, 2025

Version: 1.4.1

I'm using the code from the documentation for the "leopard-Pont des Arts" question. When using the LiteLLMModel with ollama and Qwen2.5-Instruct, the system message with the instructions about code calling doesn't seem to actually be used by the LLM. I made several experiments, and the answers I can inspect in Phoenix are always exactly the same as if I had just asked the task question directly in the ollama console without any instructions. If I manually copy-paste the system message, I get the expected response with Python code. I cannot investigate this any further.

```py
from smolagents import CodeAgent, DuckDuckGoSearchTool, LiteLLMModel

model_id = "ollama/qwen2.5-coder:7b-instruct-q4_K_M"
model = LiteLLMModel(model_id=model_id)
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=model)
agent.run("How many seconds would it take for a leopard at full speed to run through Pont des Arts?")
```

I'm also getting these errors. I tried with LiteLLMModel connected to ollama and tabbyAPI, with different quantisations of Qwen2.5-Instruct and DeepseekR1. Judging from the "Trace Details" in Phoenix, the system message ("You are an expert assistant who can solve any task using code blobs...") doesn't seem to be sent to the LLM. At least the answer makes me believe that.
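One way to check what actually goes over the wire would be a LiteLLM custom callback that prints the message roles of each completion call. The callback signature below follows my understanding of litellm's custom-callback interface, so treat it as an assumption; the registration lines are commented out so the sketch runs standalone against a simulated payload:

```python
# Sketch, assuming litellm's custom-callback signature
# (kwargs, completion_response, start_time, end_time). It prints the roles
# of the messages that were sent, so you can see whether "system" is there.
def log_messages(kwargs, completion_response, start_time, end_time):
    roles = [m["role"] for m in kwargs.get("messages", [])]
    print("roles sent to backend:", roles)
    return roles

# Registration (requires litellm installed):
# import litellm
# litellm.success_callback = [log_messages]

# Simulated payload showing the shape litellm passes to callbacks:
demo = {"messages": [
    {"role": "system", "content": "You are an expert assistant..."},
    {"role": "user", "content": "New task: ..."},
]}
print(log_messages(demo, None, None, None))
```

If the printed roles never include `system` for the ollama backend, that would narrow the bug down to how LiteLLMModel (or litellm's ollama adapter) flattens the message list.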

Example:

system:
...
created variables or imported modules, these will all persist.
10. Don't give up! You're in charge of solving the task, not providing directions to solve it.

Now Begin! If you solve the task correctly, you will receive a reward of $1,000,000.

user:

New task:
How many seconds would it take for a leopard at full speed to run through Pont des Arts?

Assistant:

To answer this question, we need to make some assumptions and estimates:

1. Length of Pont des Arts: Approximately 200 meters (656 feet)
2. Average running speed of a leopard: About 30-40 kilometers per hour (18.6-24.9 miles per hour)

First, let's convert the length from meters to kilometers:
200 meters = 0.2 kilometers

Next, we'll calculate the time it would take for a leopard to run this distance at an average speed of 35 km/h:

Time = Distance / Speed
     = 0.2 km / 35 km/h
     ≈ 0.005714 hours

To convert this time into seconds:
0.005714 hours × (60 minutes/hour) × (60 seconds/minute)
= 20 seconds (rounded to the nearest second)

Therefore, it would take a leopard approximately 20 seconds to run through Pont des Arts at full speed.

It's worth noting that this is an estimate based on average speeds and real-world factors may affect actual performance.
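As a side note, the arithmetic in that reply can be checked directly. Under the reply's own assumptions (200 m bridge, 35 km/h) the result is about 20.6 seconds, which rounds to 21, not 20:

```python
# Recomputing the model's estimate: time (s) = distance (km) / speed (km/h) * 3600
bridge_length_km = 0.2    # ~200 m, the length assumed in the reply
leopard_speed_kmh = 35    # midpoint of the 30-40 km/h range in the reply

time_seconds = bridge_length_km / leopard_speed_kmh * 3600
print(round(time_seconds, 2))  # 20.57, i.e. 21 s to the nearest second
```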

When I copy-paste the system message together with the task into ollama directly, I get a proper response:

... ', 'collections']
...   9. The state persists between code executions: so if in one step you've created variables or imported modules, these will all persist.
...   10. Don't give up! You're in charge of solving the task, not providing directions to solve it.
... 
...   Now Begin! If you solve the task correctly, you will receive a reward of $1,000,000.
... 
... Task "How many seconds would it take for a leopard at full speed to run through Pont des Arts?"
... 
Thought: To determine how many seconds it would take for a leopard at full speed to run through the Pont des Arts, we need to know the length of the bridge and the maximum speed of a 
leopard.

1. Use `web_search` to find the length of the Pont des Arts.
2. Use `web_search` to find the maximum speed of a leopard.
3. Convert the length of the bridge to meters if it's not already in meters.
4. Calculate the time using the formula: time = distance / speed.

Code:
```py
bridge_length_result = web_search(query="length of Pont des Arts")
print("Bridge length result:", bridge_length_result)

leopard_speed_result = web_search(query="maximum speed of a leopard")
print("Leopard speed result:", leopard_speed_result)
```<end_code>
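Steps 3 and 4 of that plan would then be something like the following. The numbers below are illustrative assumptions on my part (in the real run they would be parsed from the `web_search` results):

```python
# Hypothetical follow-up step: compute the crossing time once the search
# results have been parsed. Both values are assumptions, not search output.
bridge_length_m = 155        # Pont des Arts is roughly 155 m long (assumption)
leopard_speed_kmh = 58       # often-cited leopard top speed (assumption)

leopard_speed_ms = leopard_speed_kmh * 1000 / 3600   # convert km/h -> m/s
crossing_time_s = bridge_length_m / leopard_speed_ms
print(f"{crossing_time_s:.1f} seconds")
```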

Originally posted by @dadaphl in #201
