
Expected tool format of LiteLLM for ollama is not followed by smolagents example tools #264

Open
Johnz86 opened this issue Jan 18, 2025 · 2 comments

Comments

@Johnz86

Johnz86 commented Jan 18, 2025

I tried to follow this tutorial with ollama models:

https://huggingface.co/docs/smolagents/tutorials/inspect_runs

This is the error tracked by Phoenix:

    APIConnectionError: litellm.APIConnectionError: 'name'
    Traceback (most recent call last):
      File "C:\GIT\python\hello-smolagents\.venv\Lib\site-packages\litellm\main.py", line 2690, in completion
        response = base_llm_http_handler.completion(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\GIT\python\hello-smolagents\.venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 334, in completion
        return provider_config.transform_response(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\GIT\python\hello-smolagents\.venv\Lib\site-packages\litellm\llms\ollama\completion\transformation.py", line 264, in transform_response
        "name": function_call["name"],
                ~~~~~~~~~~~~~^^^^^^^^
    KeyError: 'name'

This is what is happening in transformation.py around line 264:

            function_call = json.loads(response_json["response"])
            message = litellm.Message(
                content=None,
                tool_calls=[
                    {
                        "id": f"call_{str(uuid.uuid4())}",
                        "function": {
                            "name": function_call["name"],
                            "arguments": json.dumps(function_call["arguments"]),
                        },
                        "type": "function",
                    }
                ],
            )

LiteLLM tries to use the keys 'name' and 'arguments', but unfortunately 'function_call' contains this response:

 {'tool_name': 'web_search', 'tool_arguments': {'query': 'projected GDP growth rate United States 2024'}}

This does not match those keys. I had a look at the smolagents prompts.py, which has examples like:

Task: "Which city has the highest population, Guangzhou or Shanghai?"

Action:
{
    "tool_name": "search",
    "tool_arguments": "Population Guangzhou"
}
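The mismatch can be reproduced without any litellm internals; the dict literal below is the response shown in the error above, and the lookup mirrors what transformation.py does:

```python
import json

# Simulated ollama text response containing a smolagents-style tool call,
# using the payload from the traceback above.
response_json = {
    "response": json.dumps(
        {
            "tool_name": "web_search",
            "tool_arguments": {"query": "projected GDP growth rate United States 2024"},
        }
    )
}

function_call = json.loads(response_json["response"])

# LiteLLM's transformation expects OpenAI-style keys, so this lookup fails:
try:
    name = function_call["name"]
except KeyError as exc:
    print(f"KeyError: {exc}")  # prints: KeyError: 'name'
```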

So please fix the example tools to match the LiteLLM 'litellm/llms/ollama/completion/transformation.py' implementation.
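Until the prompts and the parser agree, a workaround sketch is to translate the keys before building the tool-call dict. The key names come from the error above; `to_openai_tool_call` is a hypothetical helper, not part of either library:

```python
import json
import uuid


def to_openai_tool_call(function_call: dict) -> dict:
    """Map smolagents-style keys ('tool_name'/'tool_arguments') to the
    OpenAI-style keys ('name'/'arguments') that LiteLLM's ollama
    transformation expects. Hypothetical adapter for illustration only."""
    name = function_call.get("name") or function_call.get("tool_name")
    args = function_call.get("arguments") or function_call.get("tool_arguments")
    if name is None or args is None:
        raise KeyError("no tool name/arguments found in model response")
    return {
        "id": f"call_{uuid.uuid4()}",
        "type": "function",
        "function": {"name": name, "arguments": json.dumps(args)},
    }


call = to_openai_tool_call(
    {
        "tool_name": "web_search",
        "tool_arguments": {"query": "projected GDP growth rate United States 2024"},
    }
)
print(call["function"]["name"])  # prints: web_search
```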

@RolandJAAI
Contributor

@aymeric-roucher Indeed, this happens when you use the ToolCallingAgent with a LiteLLM model. The problem is not limited to ollama; it also happens with Claude via LiteLLM. The CodeAgent works fine with the same code, tools, and LiteLLM models. Since the ToolCallingAgent works fine with the HfApi models, and the tool calling mechanism is the same in both model classes as far as I can see, maybe LiteLLM has changed something that needs to be adapted on the smolagents side, assuming it worked before. Tested with 1.4.1.

Is anyone already working on this or should I investigate?

@aymeric-roucher
Collaborator

1 week ago, ToolCallingAgent worked for me via LiteLLM for Claude-3.5 Sonnet and GPT-4o, and 2 weeks ago with ollama running llama 3.2.
Indeed, maybe fixing the prompts could help, @RolandJAAI!
