TypeError: cannot unpack non-iterable NoneType object #193

Open
Guunh opened this issue Jan 17, 2025 · 2 comments

Comments


Guunh commented Jan 17, 2025

ComfyUI Error Report

Error Details

  • Node ID: 21
  • Node Type: ailab_OmniGen
  • Exception Type: TypeError
  • Exception Message: cannot unpack non-iterable NoneType object

Stack Trace

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\ailab_OmniGen.py", line 387, in generation
    raise e

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\ailab_OmniGen.py", line 353, in generation
    output = pipe(
             ^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
                    ^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
                                       ^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
    ^^^^^^^^
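The unpack that fails is the last frame above: transformers' Phi-3 attention expects position_embeddings to be a (cos, sin) tuple of rotary-embedding tensors, but it receives None — plausibly because the vendored OmniGen transformer code and the installed transformers version disagree about which layer computes the rotary embeddings. A minimal sketch of the failure mode (plain Python, no transformers required):

```python
# position_embeddings should be a (cos, sin) tuple of rotary-embedding tensors;
# here the caller passed None, so tuple unpacking raises the reported TypeError.
position_embeddings = None

try:
    cos, sin = position_embeddings
except TypeError as e:
    print(e)  # cannot unpack non-iterable NoneType object
```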

System Information

  • ComfyUI Version: 0.3.12
  • Arguments: ComfyUI\main.py --windows-standalone-build
  • OS: nt
  • Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.5.1+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4070 SUPER : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 12877955072
    • VRAM Free: 10882901468
    • Torch VRAM Total: 8858370048
    • Torch VRAM Free: 8212178396

Logs

2025-01-17T14:38:11.754755 - [START] Security scan
2025-01-17T14:38:12.266951 - [DONE] Security scan
2025-01-17T14:38:12.337951 - ## ComfyUI-Manager: installing dependencies done.
2025-01-17T14:38:12.337951 - ** ComfyUI startup time: 2025-01-17 14:38:12.337
2025-01-17T14:38:12.337951 - ** Platform: Windows
2025-01-17T14:38:12.337951 - ** Python version: 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
2025-01-17T14:38:12.337951 - ** Python executable: A:\ComfyUI\ComfyUI_windows_portable\python_embeded\python.exe
2025-01-17T14:38:12.337951 - ** ComfyUI Path: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI
2025-01-17T14:38:12.337951 - ** User directory: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\user
2025-01-17T14:38:12.338951 - ** ComfyUI-Manager config path: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Manager\config.ini
2025-01-17T14:38:12.338951 - ** Log path: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\user\comfyui.log
2025-01-17T14:38:12.913665 - 
Prestartup times for custom nodes:
2025-01-17T14:38:12.913665 -    0.0 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
2025-01-17T14:38:12.914665 -    1.3 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-01-17T14:38:12.914665 - 
2025-01-17T14:38:13.874979 - Checkpoint files will always be loaded safely.
2025-01-17T14:38:13.951226 - Total VRAM 12281 MB, total RAM 65345 MB
2025-01-17T14:38:13.951226 - pytorch version: 2.5.1+cu124
2025-01-17T14:38:13.951226 - Set vram state to: NORMAL_VRAM
2025-01-17T14:38:13.952229 - Device: cuda:0 NVIDIA GeForce RTX 4070 SUPER : cudaMallocAsync
2025-01-17T14:38:14.562349 - Using pytorch attention
2025-01-17T14:38:15.480330 - [Prompt Server] web root: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\web
2025-01-17T14:38:16.070198 - Bmad-DirtyUndoRedo Loaded.
2025-01-17T14:38:16.070348 - Skip A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Bmad-DirtyUndoRedo module for custom nodes due to the lack of NODE_CLASS_MAPPINGS.
2025-01-17T14:38:16.228305 - [Crystools INFO] Crystools version: 1.21.0
2025-01-17T14:38:16.242734 - [Crystools INFO] CPU: 13th Gen Intel(R) Core(TM) i9-13900K - Arch: AMD64 - OS: Windows 10
2025-01-17T14:38:16.248707 - [Crystools INFO] Pynvml (Nvidia) initialized.
2025-01-17T14:38:16.248707 - [Crystools INFO] GPU/s:
2025-01-17T14:38:16.266103 - [Crystools INFO] 0) NVIDIA GeForce RTX 4070 SUPER
2025-01-17T14:38:16.266103 - [Crystools INFO] NVIDIA Driver: 566.36
2025-01-17T14:38:16.273103 - ### Loading: ComfyUI-Manager (V3.7.6)
2025-01-17T14:38:16.362121 - ### ComfyUI Version: v0.3.12-1-g7fc3ccdc | Released on '2025-01-16'
2025-01-17T14:38:16.524121 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-01-17T14:38:16.552121 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-01-17T14:38:16.561121 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-01-17T14:38:16.578121 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-01-17T14:38:16.594121 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-01-17T14:38:16.594121 - FETCH DATA from: https://api.comfy.org/nodes?page=1&limit=1000
2025-01-17T14:38:16.615213 - [rgthree-comfy] Loaded 42 epic nodes. 🎉
2025-01-17T14:38:16.616213 - 
2025-01-17T14:38:16.616213 - 
Import times for custom nodes:
2025-01-17T14:38:16.616213 -    0.0 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
2025-01-17T14:38:16.616213 -    0.0 seconds (IMPORT FAILED): A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Bmad-DirtyUndoRedo
2025-01-17T14:38:16.616213 -    0.0 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen
2025-01-17T14:38:16.616213 -    0.0 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
2025-01-17T14:38:16.616213 -    0.1 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-videohelpersuite
2025-01-17T14:38:16.616213 -    0.2 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-crystools
2025-01-17T14:38:16.616213 -    0.2 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-01-17T14:38:16.616213 -    0.3 seconds: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-advancedliveportrait
2025-01-17T14:38:16.616213 - 
2025-01-17T14:38:16.622216 - Starting server

2025-01-17T14:38:16.622216 - To see the GUI go to: http://127.0.0.1:8188
2025-01-17T14:38:17.339108 - FETCH DATA from: A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager\extension-node-map.json [DONE]
2025-01-17T14:38:27.101848 - got prompt
2025-01-17T14:38:27.111848 - OmniGen code already exists
2025-01-17T14:38:27.112847 - OmniGen models verified successfully
2025-01-17T14:38:27.430852 - Auto selecting FP16 (Available VRAM: 12.0GB)
2025-01-17T14:38:27.430852 - Current model instance: None
2025-01-17T14:38:27.430852 - Current model precision: None
2025-01-17T14:38:49.262148 - Loading safetensors
2025-01-17T14:38:54.154306 - Warning: Pipeline.to(device) returned None, using original pipeline
2025-01-17T14:38:54.154306 - VRAM usage after pipeline creation: 15102.28MB
2025-01-17T14:38:54.179467 - Processing with prompt: the woman from <img><|image_1|></img> is sitting in a armchair, cinematic photo, christmas mood, beautiful christmas photo, award winning photography
2025-01-17T14:38:54.179467 - Model will be kept during generation
2025-01-17T14:38:55.246788 -   0%|          | 0/50 [00:00<?, ?it/s]
2025-01-17T14:38:55.395409 - Error during generation: cannot unpack non-iterable NoneType object
2025-01-17T14:38:55.425408 - !!! Exception during processing !!! cannot unpack non-iterable NoneType object
2025-01-17T14:38:55.427409 - Traceback (most recent call last):
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\ailab_OmniGen.py", line 387, in generation
    raise e
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\ailab_OmniGen.py", line 353, in generation
    output = pipe(
             ^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
                    ^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
                                       ^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "A:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
    ^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object

2025-01-17T14:38:55.428409 - Prompt executed in 28.33 seconds
@staoxiao (Contributor) commented
You can try updating transformers: pip install transformers==4.45.2


Guunh commented Jan 20, 2025

It doesn't work, everything is updated
