[bug]: #8008
Comments
When did it last work?
I can't say for sure when it last worked, but after getting detailed information from the LoRA's developer I can explain what causes this error. Here is the developer's message: "InvokeAI's model manager relies on specific metadata within LoRA files to identify and load them appropriately. Other interfaces, like Automatic1111 and ComfyUI, have implemented more flexible or updated mechanisms for parsing and loading various LoRA formats, including slider LoRAs." He also kindly provided a workaround: "Try this - make a file lora-name.json, copy this in there, and save - {
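The developer's explanation above turns on the metadata embedded in the LoRA file itself. A `.safetensors` file begins with an 8-byte little-endian header length followed by a JSON header, and any user metadata lives under the `"__metadata__"` key; that is the block a model manager would typically read to classify a LoRA. A minimal stdlib-only sketch for inspecting it (the function name and file path are illustrative, not part of InvokeAI's API):

```python
import json
import struct

def read_safetensors_metadata(path: str) -> dict:
    """Return the optional __metadata__ dict from a .safetensors header.

    Layout per the safetensors format: 8-byte little-endian unsigned
    header length, then that many bytes of JSON. User metadata, when
    present, is stored under the "__metadata__" key.
    """
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return header.get("__metadata__", {})
```

Running this on a LoRA that loads correctly versus one that doesn't can show whether the failing file is simply missing the keys the loader expects.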
Please link to the LoRA that isn't working.
Here: https://civitai.green/models/1410317/amateur-style-slider
Is there an existing issue for this problem?
Operating system
Windows
GPU vendor
Nvidia (CUDA)
GPU model
No response
GPU VRAM
No response
Version number
5.11
Browser
Edge
Python dependencies
No response
What happened
My LoRA doesn't work anymore.
What you expected to happen
N/A
How to reproduce the problem
No response
Additional context
No response
Discord username
No response