From 868a94de6e65dad2e0c990c7649203848fdb9658 Mon Sep 17 00:00:00 2001
From: ClashSAN <98228077+ClashSAN@users.noreply.github.com>
Date: Sun, 17 Dec 2023 19:09:29 +0000
Subject: [PATCH] add tip for loading full weights on 4gb

---
 Troubleshooting.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/Troubleshooting.md b/Troubleshooting.md
index 73dfb92..82dde8c 100644
--- a/Troubleshooting.md
+++ b/Troubleshooting.md
@@ -38,10 +38,11 @@ pause
 # Low VRAM Video-cards
 When running on video cards with a low amount of VRAM (<=4GB), out of memory errors may arise.
 Various optimizations may be enabled through command line arguments, sacrificing some/a lot of speed in favor of using less VRAM:
-- Use `--opt-sdp-attention` OR the optional dependency `--xformers` to cut the gpu memory usage down by half on many cards.
-- If you have 4GB VRAM and want to make 512x512 (or maybe up to 640x640) images, use `--medvram`.
-- If you have 4GB VRAM and want to make 512x512 images, but you get an out of memory error with `--medvram`, use `--lowvram --always-batch-cond-uncond` instead.
+- Use `--opt-sdp-no-mem-attention` OR the optional dependency `--xformers` to cut the GPU memory usage down by half on many cards.
+- If you have 4GB VRAM and want to make ~1.3x larger images, use `--medvram`.
+- If you have 4GB VRAM but you get an out of memory error with `--medvram`, use `--lowvram --always-batch-cond-uncond` instead.
 - If you have 4GB VRAM and want to make images larger than you can with `--medvram`, use `--lowvram`.
+- If you have 4GB VRAM and get an out of memory error when loading a full-weight model, use `--disable-model-loading-ram-optimization` (added in v1.6.0).
 
 # Torch is not able to use GPU
 This is one of the most frequently mentioned problems, but it's usually not a WebUI fault, there are many reasons for it.
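
For readers applying this patch's advice: these switches are passed through the `COMMANDLINE_ARGS` variable in `webui-user.bat` (or `webui-user.sh` on Linux). Below is a minimal sketch of a `webui-user.bat` for a 4GB card, assuming the stock file layout; the particular flag combination is a starting point, not a prescription.

```bat
@echo off

set PYTHON=
set GIT=
set VENV_DIR=
rem Halve attention memory and split the model for <=4GB cards.
rem Swap --medvram for --lowvram if out-of-memory errors persist.
set COMMANDLINE_ARGS=--opt-sdp-no-mem-attention --medvram

call webui.bat
```

Per the new tip in the patch, appending `--disable-model-loading-ram-optimization` to the same variable (on v1.6.0 or later) addresses out-of-memory errors that occur while loading a full-weight checkpoint rather than during generation.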