Open WebUI config resets on restart #93
Open
dkodr opened this issue Dec 31, 2024 · 2 comments
Labels: question (Further information is requested)

Comments

dkodr commented Dec 31, 2024

From the wiki:

You can configure Open WebUI in three ways:

  • Via WebUI itself: changes are saved in the webui/config.json file, Harbor may override them on restart
    • Copy config changes to the webui/configs/config.override.json in order to persist them over Harbor's default config

Is this really how it should work, or is there something I don't understand (which absolutely may be the case 😉)? If I want the changes made via the UI to persist after a restart, I have to copy my settings manually first? If so, that's really inconvenient, as I'd like to regularly change some settings in the UI and I can't be trusted to remember to merge the files 😉 Is there a reason it's designed this way in Harbor?

Besides, I think the path has changed, as there isn't a webui/config.json file.

av (Owner) commented Dec 31, 2024

Hey 👋

Thanks for trying out Harbor. The overwrite only applies to the portions that are pre-configured by Harbor (Web RAG, TTS, Image Generation, embeddings); in those cases, using an override config is the current workaround. For API connections, see openai in the config.

You can find the final config created by Harbor in the webui workspace for reference (WebUI started renaming it to config.old.json a few months ago; I'll refresh the docs soon), to check whether it contains anything you also want to adjust manually.

Ideally, we'd have bidirectional config bindings, but that would require pretty deep involvement with WebUI internals, which is hard to justify at this stage.
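
For reference, persisting one of those pre-configured sections in webui/configs/config.override.json could look roughly like the minimal sketch below. This is a sketch, not a verified example: the keys mirror the structure of the generated config, and the override is assumed to be merged key-by-key over Harbor's defaults.

{
    "image_generation": {
        "enable": true,
        "engine": "comfyui"
    }
}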

av added the question label Dec 31, 2024
cartergrobinson commented

What format should be used for the override.env file to persist ComfyUI running on a remote host?

I tried this, but it doesn't seem to show up in webui:

# This file can be used for additional environment variables
# specifically for the "webui" service.
# You can also use the "harbor env" command to set these variables.
{
    "image_generation": {
        "engine": "comfyui",
        "enable": true,
        "model": "flux1-dev.safetensors",
        "size": "768x512",
        "steps": 20,
        "comfyui": {
            "base_url": "http://192.168.1.201:8188",
            "nodes": [
                {
                    "type": "prompt",
                    "key": "text",
                    "node_ids": [
                        "6"
                    ]
                },
                {
                    "type": "model",
                    "key": "unet_name",
                    "node_ids": [
                        "12"
                    ]
                },
                {
                    "type": "width",
                    "key": "width",
                    "node_ids": [
                        "27"
                    ]
                },
                {
                    "type": "height",
                    "key": "height",
                    "node_ids": [
                        "27"
                    ]
                },
                {
                    "type": "steps",
                    "key": "steps",
                    "node_ids": [
                        "17"
                    ]
                },
                {
                    "type": "seed",
                    "key": "seed",
                    "node_ids": [
                        "25"
                    ]
                }
            ],
            "workflow": "{\n  \"6\": {\n    \"inputs\": {\n      \"text\": \"a cat holding up a sign that says \"FLUX DEV\"\",\n      \"clip\": [\n        \"11\",\n        0\n      ]\n    },\n    \"class_type\": \"CLIPTextEncode\",\n    \"_meta\": {\n      \"title\": \"CLIP Text Encode (Positive Prompt)\"\n    }\n  },\n  \"8\": {\n    \"inputs\": {\n      \"samples\": [\n        \"13\",\n        0\n      ],\n      \"vae\": [\n        \"10\",\n        0\n      ]\n    },\n    \"class_type\": \"VAEDecode\",\n    \"_meta\": {\n      \"title\": \"VAE Decode\"\n    }\n  },\n  \"9\": {\n    \"inputs\": {\n      \"filename_prefix\": \"ComfyUI\",\n      \"images\": [\n        \"8\",\n        0\n      ]\n    },\n    \"class_type\": \"SaveImage\",\n    \"_meta\": {\n      \"title\": \"Save Image\"\n    }\n  },\n  \"10\": {\n    \"inputs\": {\n      \"vae_name\": \"ae.safetensors\"\n    },\n    \"class_type\": \"VAELoader\",\n    \"_meta\": {\n      \"title\": \"Load VAE\"\n    }\n  },\n  \"11\": {\n    \"inputs\": {\n      \"clip_name1\": \"t5xxl_fp16.safetensors\",\n      \"clip_name2\": \"clip_l.safetensors\",\n      \"type\": \"flux\",\n      \"device\": \"default\"\n    },\n    \"class_type\": \"DualCLIPLoader\",\n    \"_meta\": {\n      \"title\": \"DualCLIPLoader\"\n    }\n  },\n  \"12\": {\n    \"inputs\": {\n      \"unet_name\": \"flux1-dev.safetensors\",\n      \"weight_dtype\": \"default\"\n    },\n    \"class_type\": \"UNETLoader\",\n    \"_meta\": {\n      \"title\": \"Load Diffusion Model\"\n    }\n  },\n  \"13\": {\n    \"inputs\": {\n      \"noise\": [\n        \"25\",\n        0\n      ],\n      \"guider\": [\n        \"22\",\n        0\n      ],\n      \"sampler\": [\n        \"16\",\n        0\n      ],\n      \"sigmas\": [\n        \"17\",\n        0\n      ],\n      \"latent_image\": [\n        \"27\",\n        0\n      ]\n    },\n    \"class_type\": \"SamplerCustomAdvanced\",\n    \"_meta\": {\n      \"title\": \"SamplerCustomAdvanced\"\n    }\n  },\n  \"16\": {\n    \"inputs\": {\n      \"sampler_name\": \"euler\"\n    },\n    \"class_type\": \"KSamplerSelect\",\n    \"_meta\": {\n      \"title\": \"KSamplerSelect\"\n    }\n  },\n  \"17\": {\n    \"inputs\": {\n      \"scheduler\": \"simple\",\n      \"steps\": 20,\n      \"denoise\": 1,\n      \"model\": [\n        \"30\",\n        0\n      ]\n    },\n    \"class_type\": \"BasicScheduler\",\n    \"_meta\": {\n      \"title\": \"BasicScheduler\"\n    }\n  },\n  \"22\": {\n    \"inputs\": {\n      \"model\": [\n        \"30\",\n        0\n      ],\n      \"conditioning\": [\n        \"26\",\n        0\n      ]\n    },\n    \"class_type\": \"BasicGuider\",\n    \"_meta\": {\n      \"title\": \"BasicGuider\"\n    }\n  },\n  \"25\": {\n    \"inputs\": {\n      \"noise_seed\": 1008873145850579\n    },\n    \"class_type\": \"RandomNoise\",\n    \"_meta\": {\n      \"title\": \"RandomNoise\"\n    }\n  },\n  \"26\": {\n    \"inputs\": {\n      \"guidance\": 3.5,\n      \"conditioning\": [\n        \"6\",\n        0\n      ]\n    },\n    \"class_type\": \"FluxGuidance\",\n    \"_meta\": {\n      \"title\": \"FluxGuidance\"\n    }\n  },\n  \"27\": {\n    \"inputs\": {\n      \"width\": 1024,\n      \"height\": 1024,\n      \"batch_size\": 1\n    },\n    \"class_type\": \"EmptySD3LatentImage\",\n    \"_meta\": {\n      \"title\": \"EmptySD3LatentImage\"\n    }\n  },\n  \"30\": {\n    \"inputs\": {\n      \"max_shift\": 1.15,\n      \"base_shift\": 0.5,\n      \"width\": 1024,\n      \"height\": 1024,\n      \"model\": [\n        \"12\",\n  
      0\n      ]\n    },\n    \"class_type\": \"ModelSamplingFlux\",\n    \"_meta\": {\n      \"title\": \"ModelSamplingFlux\"\n    }\n  }\n}"
        },
        "openai": {
            "api_base_url": "https://api.openai.com/v1",
            "api_key": ""
        },
        "automatic1111": {
            "base_url": "",
            "api_auth": ""
        }
    }
}

I also tried putting the same JSON into config.comfyui.json, but I don't think that does anything.
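
For comparison: override.env is described in its own header as a file for environment variables, so raw JSON placed there is unlikely to be read. The JSON above follows Open WebUI's persistent config structure, which, per the wiki excerpt quoted earlier, would go into webui/configs/config.override.json instead. A trimmed, unverified sketch for pointing at the remote ComfyUI host:

{
    "image_generation": {
        "enable": true,
        "engine": "comfyui",
        "model": "flux1-dev.safetensors",
        "size": "768x512",
        "steps": 20,
        "comfyui": {
            "base_url": "http://192.168.1.201:8188"
        }
    }
}

The nodes and workflow entries from the generated config could presumably be carried over in the same way if the defaults need changing.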

