Replies: 1 comment
Hi @Aitejiu, I believe it has been fixed with #1146. Can you please double-check by installing PEFT from source?
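For reference, installing PEFT from source usually means pulling the current main branch straight from the official huggingface/peft repository, along these lines (assuming a standard pip setup):

```shell
# Install the development version of PEFT from GitHub so the fix
# merged in the referenced PR is included before the next release.
pip install -U git+https://github.com/huggingface/peft.git
```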
Package version: peft==0.6.2. Using QLoRA to fine-tune BaiChuan-13B reports AttributeError: 'Parameter' object has no attribute 'weight'. The error is raised in /peft/tuners/adalora/bnb.py, line 144.
After looking through the source code, I found that lora_A[active_adapter] is treated as an nn.Module in peft==0.6.2, since the code accesses its weight attribute. But after the model is loaded, lora_A[active_adapter] is actually an nn.Parameter; it is already the weight of the corresponding module. I would like to know why this is the case?
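The mismatch described above can be sketched as follows. The names lora_A and active_adapter follow the question; storing the adapter matrices in an nn.ParameterDict is an assumption about the AdaLoRA layout, and get_adapter_weight is a hypothetical helper, not part of PEFT:

```python
import torch
import torch.nn as nn

# Sketch of the reported situation: the A matrices live in an
# nn.ParameterDict, so lora_A[active_adapter] is an nn.Parameter
# (the weight tensor itself), not a Linear-like module with .weight.
lora_A = nn.ParameterDict({"default": nn.Parameter(torch.zeros(8, 16))})
active_adapter = "default"

entry = lora_A[active_adapter]
print(isinstance(entry, nn.Parameter))  # True: it is already the weight

try:
    _ = entry.weight  # reproduces the reported AttributeError
except AttributeError as err:
    print(err)

# A defensive accessor (hypothetical, for illustration) that handles both
# layouts: a bare nn.Parameter, or a module exposing a .weight attribute.
def get_adapter_weight(obj):
    return obj.weight if hasattr(obj, "weight") else obj

print(get_adapter_weight(entry).shape)  # torch.Size([8, 16])
```

This is why the line in bnb.py fails: code written for the module layout (where .weight must be dereferenced) breaks once the stored object is the parameter itself.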