[Dev Hybrid] Distill with loss_mask (SFT dataset) and sequence-TP #350

Merged
oleksost merged 6 commits into hybrid_dev from distill_sft on Aug 13, 2025

Conversation

@oleksost (Contributor) commented on Aug 8, 2025

✨ Description

Adds masked distillation with a reverse-KL loss that works under sequence-tensor parallelism, respecting the loss_mask from SFT datasets.

See the comparison of loss and grad norm here for the sequence-TP and non-TP runs.
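For reference, a minimal sketch of a token-masked reverse-KL distillation term (this is illustrative, not the code in this PR; the function name `masked_reverse_kl` and the tensor shapes are assumptions). Under sequence-TP, each rank would apply this to its local sequence shard, and the mask-weighted numerator and denominator would be reduced across the tensor-parallel group before dividing.

```python
import torch
import torch.nn.functional as F

def masked_reverse_kl(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      loss_mask: torch.Tensor) -> torch.Tensor:
    """Reverse-KL distillation loss over masked tokens.

    student_logits, teacher_logits: [batch, seq, vocab]
    loss_mask: [batch, seq], 1 for completion tokens, 0 for prompt/padding.
    """
    log_p_s = F.log_softmax(student_logits, dim=-1)
    log_p_t = F.log_softmax(teacher_logits.detach(), dim=-1)  # no grad through the teacher
    p_s = log_p_s.exp()
    # Reverse KL per token: KL(p_student || p_teacher) = sum_v p_s * (log p_s - log p_t)
    kl_per_token = (p_s * (log_p_s - log_p_t)).sum(dim=-1)
    # Zero out prompt/padding tokens and normalize by the number of kept tokens.
    mask = loss_mask.to(kl_per_token.dtype)
    return (kl_per_token * mask).sum() / mask.sum().clamp(min=1)
```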

Select all that apply:

  • 🐛 Bug fix (non-breaking change that addresses a specific issue)
  • 🚀 New feature (non-breaking change that adds functionality)
  • ⚠️ Breaking change (a change that could affect existing functionality)
  • 📈 Performance improvement/optimization (improves speed, memory usage, or efficiency)
  • 🛠️ Code refactor (non-functional changes that improve code readability, structure, etc.)
  • 📦 Dependency bump (updates dependencies, including Dockerfile or package changes)
  • 📝 Documentation change (updates documentation, including new content or typo fixes)
  • 🔧 Infrastructure/Build change (affects build process, CI/CD, or dependencies)

📝 Changes

List the key changes introduced in this PR:

  1. Change A
  2. Change B

✅ Checklist

Make sure the following tasks are completed before submitting the PR:

General

  • 📜 I have read and followed the contributing guidelines.
  • 🏷️ I am using a clear and descriptive PR title that summarizes the key change or feature introduced.
  • 🎉 The functionality is complete, and I have tested the changes.
  • 📝 I have updated the documentation if needed.
  • ⚠️ The change does not introduce any new issues (e.g., runtime warnings, type checker errors, linting problems, unhandled edge cases).
  • 🧩 I have commented my code, especially in hard-to-understand areas.

Dependencies and Configuration

  • 🐋 I have updated the Docker configuration or dependencies, if applicable.
  • 🔄 I have ensured compatibility with the existing setup after dependency changes.

Testing

  • 🧪 I have added or updated tests to cover my changes.
  • ✔️ New and existing tests pass locally with my changes.
  • 🚦 I have tested these changes on GPUs and verified training stability.
  • 🏋️ I have tested the changes on realistic training workloads, if applicable.

Performance Impact

  • 📊 I have run benchmarks where applicable to evaluate the performance impact.
  • ✅ The benchmarks show no performance regression.
  • 🚀 The benchmarks indicate a potential performance improvement.
  • ⚠️ The benchmarks indicate a potential performance degradation.
  • 📈 I have provided benchmark results and detailed any performance impact below, if applicable.

📊 Performance Impact Details

If there is any impact on performance, describe it and provide benchmark results, if applicable:


🗒️ Additional Notes

Include any additional context, information, or considerations here, such as known issues, follow-up tasks, or backward compatibility concerns.

@oleksost requested a review from @RaymondLi0 on August 8, 2025 20:20
@oleksost changed the title from "Distill sft" to "Distill with loss_mask (SFT dataset) and sequence-TP" on Aug 8, 2025
@oleksost changed the title to "[Dev hybrid] Distill with loss_mask (SFT dataset) and sequence-TP" on Aug 11, 2025
@oleksost changed the title to "[Dev] Distill with loss_mask (SFT dataset) and sequence-TP" on Aug 11, 2025
@oleksost changed the title to "[Dev Hybrid] Distill with loss_mask (SFT dataset) and sequence-TP" on Aug 11, 2025
@oleksost merged commit 5263996 into hybrid_dev on Aug 13, 2025
@oleksost deleted the distill_sft branch on August 13, 2025 13:00