[bug]: #8008

Open
coveral opened this issue May 16, 2025 · 4 comments
Labels
bug Something isn't working

Comments


coveral commented May 16, 2025

Is there an existing issue for this problem?

  • I have searched the existing issues

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

No response

GPU VRAM

No response

Version number

5.11

Browser

Edge

Python dependencies

My LoRA doesn't work anymore.

What happened

Image

What you expected to happen

N/A

How to reproduce the problem

No response

Additional context

No response

Discord username

No response

@coveral coveral added the bug Something isn't working label May 16, 2025
@psychedelicious
Collaborator

When did it last work?


coveral commented May 20, 2025

I can't say for sure when this last worked, but after getting detailed information from the LoRA developer I can tell you what the problem is. Here's the message from the developer: "InvokeAI's model manager relies on specific metadata within LoRA files to identify and load them appropriately. Other interfaces like Automatic1111 and ComfyUI have implemented more flexible or updated mechanisms for parsing and loading various LoRA formats, including slider LoRAs."
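To make the "specific metadata" point concrete: a .safetensors file starts with an 8-byte little-endian length followed by a JSON header, and training tools commonly store keys like ss_network_module and ss_network_alpha under the header's optional "__metadata__" object. A minimal stdlib-only sketch of reading that metadata (the function name and demo values are illustrative, not InvokeAI's actual loader code):

```python
import json
import struct

def parse_safetensors_metadata(data: bytes) -> dict:
    """Return the __metadata__ dict from a safetensors byte stream.

    The safetensors format begins with an 8-byte little-endian
    unsigned integer giving the length of a JSON header; trainers
    often put keys such as ss_network_module under "__metadata__".
    """
    header_len = struct.unpack("<Q", data[:8])[0]
    header = json.loads(data[8:8 + header_len].decode("utf-8"))
    return header.get("__metadata__", {})

# Demo: build a minimal header like a slider LoRA might carry.
demo_header = json.dumps({
    "__metadata__": {
        "ss_network_module": "networks.lora",
        "ss_network_dim": "1",
        "ss_network_alpha": "1",
    }
}).encode("utf-8")
demo_file = struct.pack("<Q", len(demo_header)) + demo_header

print(parse_safetensors_metadata(demo_file)["ss_network_module"])  # networks.lora
```

If a LoRA's header lacks these keys entirely, a loader that identifies model types by them has nothing to go on, which would be consistent with the behavior described above.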

Here is the solution he kindly provided:

"Try this: create a file named lora-name.json, copy the following into it, and save:

    {
      "metadata": {
        "ss_network_module": "networks.lora",
        "ss_network_dim": "1",
        "ss_network_alpha": "1",
        "ss_base_model": "Pony"
      }
    }

Then try again. If that doesn't work, save the JSON file as metadata.json and run the Python script in the same location as the JSON file and the LoRA:

    python safetensors_util.py writemd your_model.safetensors metadata.json output_model.safetensors

That will fix it. If there is already a JSON card in the folder, edit the existing one. If it's an ILXL model, add ILXL in the JSON."
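For reference, safetensors_util.py is a third-party script, so its exact behavior is an assumption here; conceptually, a "writemd"-style step merges a metadata dict into the file's JSON header and rewrites the file. A stdlib-only sketch of that idea (function name and demo values are illustrative):

```python
import json
import struct

def write_safetensors_metadata(data: bytes, metadata: dict) -> bytes:
    """Return a new safetensors byte stream with extra keys merged
    into the header's __metadata__ object, leaving the tensor
    payload after the header untouched."""
    header_len = struct.unpack("<Q", data[:8])[0]
    header = json.loads(data[8:8 + header_len].decode("utf-8"))
    merged = dict(header.get("__metadata__", {}))
    # safetensors metadata values are strings, so stringify inputs.
    merged.update({k: str(v) for k, v in metadata.items()})
    header["__metadata__"] = merged
    new_header = json.dumps(header).encode("utf-8")
    return struct.pack("<Q", len(new_header)) + new_header + data[8 + header_len:]

# Demo: a toy file with sparse metadata and no tensor payload.
old_header = json.dumps({"__metadata__": {"ss_network_dim": "1"}}).encode("utf-8")
original = struct.pack("<Q", len(old_header)) + old_header
patched = write_safetensors_metadata(
    original, {"ss_network_module": "networks.lora", "ss_base_model": "Pony"}
)

# Read the patched header back to confirm the merge.
new_len = struct.unpack("<Q", patched[:8])[0]
meta = json.loads(patched[8:8 + new_len].decode("utf-8"))["__metadata__"]
print(meta["ss_base_model"])  # Pony
```

Existing keys survive the merge, so running this kind of patch on a file that already has partial metadata only adds or overwrites the keys you supply.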
Feel free to find the InvokeAI/invokeai/frontend/web/src/features/lora/components/LoRACard.tsx file and edit the following:

    <CompositeNumberInput
      value={lora.weight}
      onChange={handleChange}
      min={-5}
      max={5}
      step={0.01}
      w={20}
    />

@psychedelicious
Collaborator

Please link to the lora that isn't working.


coveral commented May 20, 2025

Here: https://civitai.green/models/1410317/amateur-style-slider
This error happens with all of his LoRA sliders.
