Alexander Ljungberg d9cc0910c8
Fix upcast attention dtype error.
Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

```
  File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
```

The fix is to upcast the value tensor as well, so that q, k, and v all reach `scaled_dot_product_attention` in the same dtype. A sketch follows.
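
For illustration, here is a standalone sketch of the failure mode and the fix. The tensor shapes are arbitrary and the snippet mirrors the call on line 612 of `sd_hijack_optimizations.py`; it is not the webui code verbatim.

```python
import torch

# Standalone sketch of the failure mode. With "Upcast cross attention
# layer to float32" enabled, q and k arrive as float32 while v is still
# float16 (c10::Half).
q = torch.randn(1, 8, 64, dtype=torch.float16).float()
k = torch.randn(1, 8, 64, dtype=torch.float16).float()
v = torch.randn(1, 8, 64, dtype=torch.float16)

# This mixed-dtype call raises:
# RuntimeError: Expected query, key, and value to have the same dtype ...
# out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)

# The fix: upcast v alongside q and k so all three share a dtype, then
# cast the result back to the model's working precision afterwards.
out = torch.nn.functional.scaled_dot_product_attention(q, k, v.float(), dropout_p=0.0, is_causal=False)
out = out.to(torch.float16)
```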
2023-06-06 21:45:30 +01:00