
Prevent LoRA alpha scalars from being skipped over during model loading #263

Closed · wants to merge 1 commit

Conversation

grauho (Contributor) commented May 12, 2024

Adjusted model.cpp so that LoRA alpha scalars are no longer skipped because they have an encoded dimension of zero.
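A minimal sketch of the kind of change involved. The names here (`TensorStorage`, `n_dims`, `is_lora_alpha`) and the `.alpha` suffix check are illustrative assumptions, not the actual model.cpp identifiers:

```cpp
#include <string>

struct TensorStorage {
    std::string name;
    int n_dims;  // LoRA alpha entries are typically stored as 0-dimensional scalars
};

// Hypothetical helper: does this tensor hold a LoRA alpha scalar?
static bool is_lora_alpha(const TensorStorage& ts) {
    const std::string suffix = ".alpha";
    return ts.name.size() >= suffix.size() &&
           ts.name.compare(ts.name.size() - suffix.size(), suffix.size(), suffix) == 0;
}

// Before: tensors with a dimension count of zero were skipped unconditionally.
// After: alpha scalars are exempt from that skip so they reach the LoRA loader.
static bool should_skip(const TensorStorage& ts) {
    return ts.n_dims == 0 && !is_lora_alpha(ts);
}
```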

grauho mentioned this pull request May 12, 2024
grauho (Contributor, Author) commented May 13, 2024

Interestingly enough, the problem this addresses doesn't affect most LoRAs, because the LoRA loading logic only needs to access the "alpha" scalars when the "scale" tensor is missing. That said, I believe this change is still valuable for two reasons. First, edge cases that lack the scale tensor may yield undesirable results if the alpha scalar tensor isn't loaded, as observed with the SDXL Lightning LoRA. Second, it keeps the dimension check from spamming the user's terminal with 'ERROR' log messages that might cause confusion.
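A sketch of the fallback described above, showing where a skipped alpha scalar would bite. The function name, parameters, and the alpha/rank formula are assumptions about typical LoRA scaling, not the project's actual loader code:

```cpp
// Returns the multiplier applied to a LoRA delta for one weight matrix.
float lora_scale(float strength, int rank,
                 const float* scale_tensor,   // value of "...scale" if present, else nullptr
                 const float* alpha_tensor) { // value of "...alpha" if present, else nullptr
    if (scale_tensor != nullptr) {
        // Most LoRAs ship an explicit scale tensor, so alpha is never consulted.
        return strength * (*scale_tensor);
    }
    if (alpha_tensor != nullptr) {
        // Edge cases like SDXL Lightning only carry alpha; if it was skipped at
        // load time this branch is never taken and the result is wrong.
        return strength * (*alpha_tensor) / (float)rank;
    }
    return strength;  // neither tensor available: fall back to strength alone
}
```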

leejet (Owner) commented May 14, 2024

Thank you for your contribution, but I believe the latest master branch has already addressed this issue and is more versatile, even functioning in Debug mode.

grauho closed this May 14, 2024