Thanks for your impressive work and clear code!

I find that setting `training_method` to 'selfattn' or 'xattn' leads to a failure:
create LoRA for U-Net: 0 modules.
Traceback (most recent call last):
  File "/home/notebook/code/personal/S9049723/LECO/./train_lora.py", line 343, in <module>
    main(args)
  File "/home/notebook/code/personal/S9049723/LECO/./train_lora.py", line 330, in main
    train(config, prompts)
  File "/home/notebook/code/personal/S9049723/LECO/./train_lora.py", line 89, in train
    optimizer = optimizer_module(network.prepare_optimizer_params(), lr=config.train.lr, **optimizer_kwargs)
  File "/home/notebook/code/personal/S9049723/Anaconda3/envs/leco/lib/python3.10/site-packages/torch/optim/adamw.py", line 50, in __init__
    super().__init__(params, defaults)
  File "/home/notebook/code/personal/S9049723/Anaconda3/envs/leco/lib/python3.10/site-packages/torch/optim/optimizer.py", line 187, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
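For context, this `ValueError` is raised by `torch.optim.Optimizer` whenever it is constructed with no parameters, so the failure can be reproduced in isolation once `prepare_optimizer_params()` returns nothing (a minimal sketch, independent of the LECO code):

```python
import torch

# Simulates what happens when the LoRA network finds 0 modules:
# prepare_optimizer_params() then yields an empty list, and AdamW
# refuses to be constructed with no parameters.
empty_params = []  # stand-in for network.prepare_optimizer_params()

try:
    optimizer = torch.optim.AdamW(empty_params, lr=1e-4)
except ValueError as e:
    print(e)  # "optimizer got an empty parameter list"
```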
According to the LoRA implementation, extra LoRA modules are only attached to Linear and Conv modules, so the attention blocks themselves have no LoRA modules associated with them. Perhaps the related code could be removed or refined in a later update.
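To illustrate the module-selection point, here is a hedged sketch (not the repository's actual code; the names `TRAINING_METHOD_FILTERS` and `create_lora_modules` are hypothetical) of how filtering by `training_method` can end up with zero targets: since LoRA can only wrap Linear/Conv leaves, the attention filter has to be matched against the parent module path rather than requiring the wrapped module itself to be an attention block.

```python
import torch.nn as nn

# Hypothetical filters keyed by training_method; the real project may
# use different names or prefixes.
TRAINING_METHOD_FILTERS = {
    "selfattn": "attn1",  # self-attention blocks
    "xattn": "attn2",     # cross-attention blocks
}

def create_lora_modules(unet: nn.Module, training_method: str):
    """Collect modules eligible for LoRA under the given training method."""
    prefix = TRAINING_METHOD_FILTERS[training_method]
    targets = []
    for name, module in unet.named_modules():
        # Only Linear / Conv2d leaves can carry LoRA weights, so the
        # attention filter is applied to the module *path* (name). If
        # it were applied to the module type instead, nothing would
        # match and the optimizer would later see an empty parameter list.
        if prefix in name and isinstance(module, (nn.Linear, nn.Conv2d)):
            targets.append((name, module))
    return targets
```

Under this assumption, the fix is simply to make sure the filter picks up the Linear projections nested inside the attention blocks instead of returning an empty list.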