
Can you further unify Consistency Models and Rectified Diffusion? #4

@YouJiacheng

Description


https://x.com/YouJiacheng/status/1852680900397633976

CMs generate $$\hat{x}_0$$ from $$x_{t-\Delta t}$$ (less noisy), while RDs pre-generate $$\hat{x}_0$$ from pure noise $$\epsilon$$ (more noisy).

Importantly, $$x_{t-\Delta t}$$, $$\epsilon$$, and the model input $$x_t$$ all share the same source of randomness $$\epsilon$$.

Diffusion distillation methods also make the target and the input share the same source of randomness $$\epsilon$$.

This encourages consistency (and reduces variance) of the model along the same trajectory, which should be crucial for few-step sampling.
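
To make the shared-randomness point concrete, here is a minimal sketch (not from the original post), assuming a flow-matching-style interpolation $$x_t = (1 - t)\,x_0 + t\,\epsilon$$ with $$t \in [0, 1]$$ ($$t = 1$$ is pure noise), an $$x_0$$-prediction student, and a stop-gradient/EMA teacher; the names `f_ema`, `cm_pair`, and `rd_pair` are illustrative.

```python
import torch

def cm_pair(f_ema, x0, eps, t, dt):
    """Consistency-model pair: the input sits at time t; the target is the
    x0-prediction of a stop-gradient / EMA teacher at the *less* noisy point
    t - dt on the same trajectory (same eps)."""
    x_t = (1 - t) * x0 + t * eps                 # model input x_t
    x_s = (1 - (t - dt)) * x0 + (t - dt) * eps   # less noisy point x_{t - dt}
    with torch.no_grad():
        target = f_ema(x_s, t - dt)              # \hat{x}_0 predicted from x_{t - dt}
    return x_t, target

def rd_pair(x0_hat, eps, t):
    """Rectified-diffusion pair: x0_hat was pre-generated from pure noise eps
    by a teacher sampler (offline); the input reuses the *same* eps, so the
    input and the target share one source of randomness."""
    x_t = (1 - t) * x0_hat + t * eps             # interpolate pre-generated \hat{x}_0 and eps
    return x_t, x0_hat                           # target is the pre-generated \hat{x}_0
```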

If we allow $$\Delta t < 0$$, then CMs are a strict generalization of online RDs.
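
Under the same assumed parameterization (and reusing `torch` from the sketch above), a signed $$\Delta t$$ collapses the two constructions into one rule; this is only an illustration of the claim, not code from the thread.

```python
def unified_pair(f_ema, x0, eps, t, dt):
    """Signed-dt generalization of the consistency target.

    dt > 0: the usual consistency target, predicted from the less noisy x_{t - dt}.
    dt < 0: the target comes from a *noisier* point; in the extreme t - dt = 1 the
            target point is pure noise, so f_ema(eps, 1) plays the role of an
            *online* rectified-diffusion target (generated on the fly by the EMA
            model instead of being pre-generated by a teacher).
    """
    s = t - dt                            # target time on the same trajectory
    x_t = (1 - t) * x0 + t * eps          # model input
    x_s = (1 - s) * x0 + s * eps          # equals eps when s = 1
    with torch.no_grad():
        target = f_ema(x_s, s)            # \hat{x}_0 predicted at time s
    return x_t, target
```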

Cheng Lu & Yang Song have shown that continuous-time ($$\Delta t \to 0$$) CMs can work very well. This strongly hints that $$\Delta t < 0$$ might also be a valid option.