Supported sft for vlm #499


Merged
merged 6 commits into LazyAGI:main on May 29, 2025

Conversation

JingofXin
Contributor

Supported SFT for the following VLMs:

  • Qwen2.5-VL-3B-Instruct
  • InternVL2-3
    (attached image: training_loss curve)


JingofXin commented May 8, 2025

ENV:

git clone https://github.com/hiyouga/LLaMA-Factory.git      (SHA: 52f25651)
cd LLaMA-Factory
pip install --no-deps -e .

pip install peft==0.14.0
pip install transformers==4.51.3
pip install qwen_vl_utils
pip install --upgrade lmdeploy --user      (installs lmdeploy-0.8.0, pynvml-12.0.0)
pip install accelerate==1.6.0
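Because the setup above relies on exact version pins, it can help to confirm they actually took effect in the active environment. Below is a minimal stdlib sketch (not part of this PR; the `PINNED` dict simply mirrors the pinned packages from the commands above) that reports any missing or mismatched package:

```python
from importlib import metadata

# Version pins taken from the install commands above (illustrative only).
PINNED = {
    "peft": "0.14.0",
    "transformers": "4.51.3",
    "accelerate": "1.6.0",
}

def check_pins(pins):
    """Return {package: (expected, found)} for every missing or
    mismatched package; an empty dict means all pins are satisfied.
    A found version of None means the package is not installed."""
    problems = {}
    for pkg, expected in pins.items():
        try:
            found = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            problems[pkg] = (expected, None)
            continue
        if found != expected:
            problems[pkg] = (expected, found)
    return problems

if __name__ == "__main__":
    for pkg, (expected, found) in check_pins(PINNED).items():
        print(f"{pkg}: expected {expected}, found {found}")
```

Running it after the installs should print nothing if the environment matches the pins.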

@JingofXin JingofXin changed the title [WIP]Supported sft for vlm Supported sft for vlm May 23, 2025
lwj-st pushed a commit to LazyAGI/LazyLLM-Env that referenced this pull request May 28, 2025
@mergify mergify bot added the lint_pass label May 28, 2025
@lwj-st lwj-st removed the lint_pass label May 29, 2025
@mergify mergify bot added the lint_pass label May 29, 2025
@wzh1994 wzh1994 merged commit ca8010b into LazyAGI:main May 29, 2025
15 of 27 checks passed
4 participants