With current source code getting error openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}} on running the example_userproxy.py
#4936 · Open · karun19 opened this issue on Jan 8, 2025 · 3 comments
What happened?
I am getting the error below with the current (master) source code, whereas it works fine with the source downloaded on 18 Nov 2024.
My environment: Windows 11
Example used: example_userproxy.py
Error: openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
Steps used:
git clone https://github.com/microsoft/autogen.git
cd autogen/python
uv sync --all-extras
set CHAT_COMPLETION_PROVIDER=azure
set CHAT_COMPLETION_KWARGS_JSON={"model":"gpt-3.5-turbo-16k", "api_key":"**********************", "api_version":"2024-08-01-preview", "base_url":"https://**************.openai.azure.com", "api_type":"azure", "azure_deployment":"gpt-35-turbo-16k", "model_capabilities": {"function_calling": true,"json_output": true,"vision": true}}
cd .venv/Scripts
activate
cd ..
cd ..
cd packages/autogen-magentic-one
pip install -e .
cd examples
python example_userproxy.py
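Side note for anyone reproducing this: one quick way to rule out a shell-quoting problem with the JSON environment variable is to parse it back before launching the example. This is a minimal sketch using only the standard library; the required key names are taken from the value set above:

```python
import json
import os

# Sanity check (illustrative, not part of the example): parse the env var and
# confirm the Azure-specific fields shown above survived the shell's quoting.
raw = os.environ.get("CHAT_COMPLETION_KWARGS_JSON", "{}")
kwargs = json.loads(raw)  # raises json.JSONDecodeError if the shell mangled it

required = ["api_key", "api_version", "base_url", "azure_deployment"]
missing = [k for k in required if k not in kwargs]
print("missing keys:", missing)
```

Note that `set VAR={"a": 1}` behaves differently in cmd.exe and PowerShell, so a silently mangled value is easy to produce.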
Detailed error:
(python) C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\examples>python example_userproxy.py
User input ('exit' to quit): who is modi

[2025-01-08T12:07:12.214499], UserProxy:

who is modi

[2025-01-08T12:07:12.215653], orchestrator (thought):

Next speaker Coder
Error processing publish message for orchestrator/default
Traceback (most recent call last):
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 409, in _on_message
    return await agent.on_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 149, in wrapper
    return_value = await func(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 81, in handle_incoming_message
    await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 47, in _process
    await self._handle_broadcast(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_orchestrator.py", line 95, in _handle_broadcast
    await self.send_message(request_reply_message, next_agent.id, cancellation_token=ctx.cancellation_token)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 130, in send_message
    return await self._runtime.send_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 232, in send_message
    return await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_single_threaded_agent_runtime.py", line 321, in _process_send
    response = await recipient_agent.on_message(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-core\src\autogen_core\_routed_agent.py", line 149, in wrapper
    return_value = await func(self, message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 81, in handle_incoming_message
    await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_agent.py", line 45, in _process
    await self._handle_request_reply(message, ctx)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\base_worker.py", line 42, in _handle_request_reply
    request_halt, response = await self._generate_reply(ctx.cancellation_token)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-magentic-one\src\autogen_magentic_one\agents\coder.py", line 55, in _generate_reply
    response = await self._model_client.create(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\packages\autogen-ext\src\autogen_ext\models\openai\_openai_client.py", line 494, in create
    result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\resources\chat\completions.py", line 1720, in create
    return await self._post(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1843, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1537, in request
    return await self._request(
  File "C:\Karun\Documents\LUCA\AgenticAI\autogen\python\.venv\lib\site-packages\openai\_base_client.py", line 1638, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
Exception ignored in: <function _ProactorBasePipeTransport.__del__ at 0x0000025BC91FBE20>
Traceback (most recent call last):
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\proactor_events.py", line 116, in __del__
    self.close()
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\proactor_events.py", line 108, in close
    self._loop.call_soon(self._call_connection_lost, None)
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 745, in call_soon
    self._check_closed()
  File "C:\Users\KarunakaranGN\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 510, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
What did you expect to happen?
The example should run without errors, as it did with the 18 Nov 2024 source.
How can we reproduce it (as minimally and precisely as possible)?
git clone https://github.com/microsoft/autogen.git
cd autogen/python
uv sync --all-extras
set CHAT_COMPLETION_PROVIDER=azure
set CHAT_COMPLETION_KWARGS_JSON={"model":"gpt-3.5-turbo-16k", "api_key":"**********************", "api_version":"2024-08-01-preview", "base_url":"https://**************.openai.azure.com", "api_type":"azure", "azure_deployment":"gpt-35-turbo-16k", "model_capabilities": {"function_calling": true,"json_output": true,"vision": true}}
cd .venv/Scripts
activate
cd ..
cd ..
cd packages/autogen-magentic-one
pip install -e .
cd examples
python example_userproxy.py
AutoGen version
Current
Which package was this bug in
Magentic One
Model used
gpt-3.5-turbo-16k
Python version
Python 3.10.1
Operating system
Windows 11
Any additional info you think would be helpful for fixing this bug
No response
When you get a 404 error from the model API endpoint, it means the resource you are accessing is not available. Can you make sure you have passed in the correct deployment name and the right endpoint URL?
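To expand on why a wrong deployment name surfaces as a 404 rather than an authentication error: for Azure OpenAI, the deployment name is part of the request path. A sketch of the URL the SDK ultimately calls (the endpoint and names below are placeholders, not the reporter's real values):

```python
# Illustrative only: Azure OpenAI routes chat completions through the
# deployment name, so a typo in azure_deployment (or an endpoint pointing at
# the wrong resource) returns 404 "Resource not found".
endpoint = "https://my-resource.openai.azure.com"  # placeholder resource
deployment = "gpt-35-turbo-16k"  # must exactly match the name in the portal
api_version = "2024-08-01-preview"

url = (
    f"{endpoint}/openai/deployments/{deployment}"
    f"/chat/completions?api-version={api_version}"
)
print(url)
```

Checking that this URL's deployment segment matches the name shown under "Deployments" in the Azure portal is usually the fastest way to diagnose this class of 404.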