Conversation

A-transformer (Owner)

Correct the BACKWARD_PREFETCH_SUBMIT mismatch
FORWARD_PREFETCH_SUBMIT = 'forward_prefetch_submit'
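
For illustration, a hedged sketch of the kind of change the title above describes; the corrected string value and where these constants live are assumptions, not a quote of the actual diff.

```python
# Hypothetical constant pair after the fix: BACKWARD_PREFETCH_SUBMIT no longer
# reuses the 'forward_prefetch_submit' string (corrected value is an assumption).
FORWARD_PREFETCH_SUBMIT = 'forward_prefetch_submit'
BACKWARD_PREFETCH_SUBMIT = 'backward_prefetch_submit'
```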

raza-sikander and others added 2 commits March 9, 2025 16:30
Signed-off-by: Shaik Raza Sikander <[email protected]>
Signed-off-by: Olatunji Ruwase <[email protected]>
Signed-off-by: Max Kovalenko <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Signed-off-by: shaomin <[email protected]>
Signed-off-by: Stas Bekman <[email protected]>
Signed-off-by: siqi <[email protected]>
Signed-off-by: Logan Adams <[email protected]>
Signed-off-by: Wei Wu <[email protected]>
Signed-off-by: ShellyNR <[email protected]>
Signed-off-by: Lai, Yejing <[email protected]>
Signed-off-by: Hongwei <[email protected]>
Co-authored-by: Olatunji Ruwase <[email protected]>
Co-authored-by: Max Kovalenko <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
Co-authored-by: inkcherry <[email protected]>
Co-authored-by: wukong1992 <[email protected]>
Co-authored-by: shaomin <[email protected]>
Co-authored-by: Hongwei Chen <[email protected]>
Co-authored-by: loadams <[email protected]>
Co-authored-by: Stas Bekman <[email protected]>
Co-authored-by: siqi654321 <[email protected]>
Co-authored-by: siqi <[email protected]>
Co-authored-by: Wei Wu <[email protected]>
Co-authored-by: Masahiro Tanaka <[email protected]>
Co-authored-by: Shelly Nahir <[email protected]>
Co-authored-by: snahir <[email protected]>
Co-authored-by: Yejing-Lai <[email protected]>
Signed-off-by: A-transformer <[email protected]>
Correct the BACKWARD_PREFETCH_SUBMIT mismatch
FORWARD_PREFETCH_SUBMIT = 'forward_prefetch_submit'

Signed-off-by: A-transformer <[email protected]>
A-transformer and others added 10 commits March 9, 2025 16:39
Reapply deepspeedai#6846.
FYI @oelayan7

---------

Signed-off-by: inkcherry <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
Support training multiple models, such as in
[HF Accelerate's multiple-model use case](https://huggingface.co/docs/accelerate/en/usage_guides/deepspeed_multiple_model)

Here is an update on supporting multiple DS engines with a single
loss.backward(). The main message is that I think we can support this.
First, some context. The backward pass in ZeRO is complicated because its
optimizations/features require special handling of gradients, such as:

1. Gradient partitioning
2. Overlapping backward and reduction
3. Upcasting for fp32 grad accumulation

So we created engine.backward(loss) as a wrapper function to give us
fine-grained control over the backward pass, as shown below:

```python
def backward(loss):
    backward_prologue()  # setup logic for special gradient handling
    loss.backward()
    backward_epilogue()  # cleanup/teardown logic
```

As demonstrated by @muellerzr, this approach breaks down when the loss
originates from multiple DS engines. Our proposed solution is to use
backward hooks on the module to launch backward_prologue() and
backward_epilogue(). Specifically:

1. A backward pre-hook on engine.module launches backward_prologue()
before any module gradient is created.
2. A backward post-hook on engine.module launches backward_epilogue()
after all module gradients are created (see the sketch after this list).
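
A minimal sketch of this hook-based approach, using PyTorch's full-backward hook APIs; the helper name and the engine._backward_prologue()/_backward_epilogue() method names are assumptions based on the description above, not the final DeepSpeed implementation.

```python
def attach_backward_hooks(engine):
    """Hypothetical helper: drive the prologue/epilogue from backward hooks."""

    def pre_hook(module, grad_output):
        # Fires before any gradient of engine.module is computed
        # (the pre-hook variant requires PyTorch >= 2.0).
        engine._backward_prologue()

    def post_hook(module, grad_input, grad_output):
        # Fires after the gradients of engine.module have been computed.
        engine._backward_epilogue()

    engine.module.register_full_backward_pre_hook(pre_hook)
    engine.module.register_full_backward_hook(post_hook)
```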

We plan for this solution to preserve backward compatibility, i.e.,
engine.backward() will remain correct for single-engine scenarios.
The current status is that (1) is complete and (2) is in progress.
To unblock e2e testing of multi-engine scenarios, since there are
probably other issues, we have temporarily added
engine._backward_prologue(). You can try it out via the following
artifacts:

1. Simple multi-engine test code:
https://gist.github.com/tjruwase/f1adccf087b8fa269ffce2ab91c4f1c6#file-multi_engine-py
2. DS branch:
https://github.com/microsoft/DeepSpeed/tree/olruwase/zero_multi_models
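
For context, a minimal sketch of the multi-engine pattern the test code above exercises; the toy models, the ds_config contents, and the exact call sequence around engine._backward_prologue() are assumptions drawn from the description above, not the final API.

```python
import torch
import deepspeed

# Hypothetical toy setup; the real test lives in the gist linked above.
ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
    "zero_optimization": {"stage": 3},
}
engine_a, _, _, _ = deepspeed.initialize(model=torch.nn.Linear(8, 8), config=ds_config)
engine_b, _, _, _ = deepspeed.initialize(model=torch.nn.Linear(8, 8), config=ds_config)

x = torch.randn(1, 8, device=engine_a.device)
loss = engine_a(x).sum() + engine_b(x).sum()  # one loss spanning two DS engines

# Temporary workaround from the branch above: run each engine's backward
# prologue by hand, then call the shared loss.backward() directly instead
# of engine.backward(loss).
engine_a._backward_prologue()
engine_b._backward_prologue()
loss.backward()
engine_a.step()
engine_b.step()
```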

---------

Signed-off-by: Olatunji Ruwase <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
Co-authored-by: Stas Bekman <[email protected]>
…i#7135)

Copy changes from deepspeedai/DeepSpeed-MII#558.
Fixes an issue where the docs still referenced the CLA.

---------

Signed-off-by: Logan Adams <[email protected]>
Keeps lines within PEP 8 length limits.
Enhances readability with a single, concise expression.
Preserves original functionality.

---------

Signed-off-by: Shaik Raza Sikander <[email protected]>
Signed-off-by: Olatunji Ruwase <[email protected]>
Signed-off-by: Max Kovalenko <[email protected]>
Signed-off-by: inkcherry <[email protected]>
Signed-off-by: shaomin <[email protected]>
Signed-off-by: Stas Bekman <[email protected]>
Signed-off-by: siqi <[email protected]>
Signed-off-by: Logan Adams <[email protected]>
Signed-off-by: Wei Wu <[email protected]>
Signed-off-by: ShellyNR <[email protected]>
Signed-off-by: Lai, Yejing <[email protected]>
Signed-off-by: Hongwei <[email protected]>
Signed-off-by: Liang Cheng <[email protected]>
Signed-off-by: A-transformer <[email protected]>
Co-authored-by: Raza Sikander <[email protected]>
Co-authored-by: Olatunji Ruwase <[email protected]>
Co-authored-by: Max Kovalenko <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
Co-authored-by: inkcherry <[email protected]>
Co-authored-by: wukong1992 <[email protected]>
Co-authored-by: shaomin <[email protected]>
Co-authored-by: Hongwei Chen <[email protected]>
Co-authored-by: loadams <[email protected]>
Co-authored-by: Stas Bekman <[email protected]>
Co-authored-by: siqi654321 <[email protected]>
Co-authored-by: siqi <[email protected]>
Co-authored-by: Wei Wu <[email protected]>
Co-authored-by: Masahiro Tanaka <[email protected]>
Co-authored-by: Shelly Nahir <[email protected]>
Co-authored-by: snahir <[email protected]>
Co-authored-by: Yejing-Lai <[email protected]>
Co-authored-by: A-transformer <[email protected]>
Unpin transformers version for all workflows except
`nv-torch-latest-v100` as this still has a tolerance issue with some
quantization tests.

Signed-off-by: Logan Adams <[email protected]>