
support training, fix some inplace func of nn #118


Merged
merged 2 commits into main Dec 12, 2022

Conversation

@Gy-Lu (Contributor) commented Dec 1, 2022

This PR should be merged after the other two.

  1. Add the training code and its shell script; DDP is now supported.
  2. Add an lr_scheduler and move the loss functions to hub.
  3. Fix some bugs introduced by OpenFold: the inplace func in dropout and the unused out_product_mean in the Evoformer.
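The dropout fix in item 3 presumably replaces in-place tensor mutation, which breaks autograd during training, with out-of-place ops. A minimal sketch of that kind of fix, with a hypothetical `MaskedDropout` module (not FastFold's actual class):

```python
import torch
import torch.nn as nn


class MaskedDropout(nn.Module):
    """Hypothetical dropout-style module illustrating the inplace fix.

    Mutating an activation in place (e.g. x.mul_(mask)) can corrupt
    tensors that autograd saved for the backward pass, or raise a
    RuntimeError when x is a leaf tensor that requires grad. The fix
    is to compute the result out of place.
    """

    def __init__(self, p: float):
        super().__init__()
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training or self.p == 0.0:
            return x
        mask = (torch.rand_like(x) > self.p).to(x.dtype)
        # Out-of-place: builds a new tensor, keeping the graph intact.
        # The broken in-place form would be x.mul_(mask).div_(1 - self.p).
        return x * mask / (1.0 - self.p)


x = torch.randn(4, 8, requires_grad=True)
drop = MaskedDropout(p=0.25)
drop.train()
drop(x).sum().backward()  # succeeds: no in-place op on a saved tensor
```

This matters for training (the point of this PR): inference-only code can often get away with in-place ops, but once gradients flow, autograd needs the original activations.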

As for unit tests: the full model causes OOM on the CI machine, so I am trying to develop a tiny model for testing.

@Gy-Lu Gy-Lu marked this pull request as draft December 1, 2022 13:48
@Shenggan (Contributor) commented Dec 8, 2022

There still seem to be some bugs:

  1. from fastfold.model.loss import * should be from fastfold.model.hub.loss import * in fastfold/model/nn/heads.py
  2. t = self.layer_norm(t[i])
     introduced by refactor chunk for training #117
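Item 2 points at a chunking bug: assigning `t = self.layer_norm(t[i])` replaces the whole tensor with the normalized result of a single chunk. A plausible corrected loop, as a simplified standalone sketch (the function name and chunking scheme here are hypothetical, not FastFold's actual code):

```python
import torch
import torch.nn as nn


def chunked_layer_norm(t: torch.Tensor, layer_norm: nn.LayerNorm,
                       chunk_size: int) -> torch.Tensor:
    """Apply layer_norm chunk-by-chunk along dim 0 to bound peak memory.

    The reported bug was equivalent to `t = layer_norm(t[i])`, which
    discards every chunk except the current one; each chunk's result
    must instead be written back to its own slice of the output.
    """
    out = torch.empty_like(t)
    for start in range(0, t.shape[0], chunk_size):
        sl = slice(start, start + chunk_size)
        out[sl] = layer_norm(t[sl])
    return out


ln = nn.LayerNorm(16)
x = torch.randn(10, 16)
# Chunked result matches the unchunked reference application.
assert torch.allclose(chunked_layer_norm(x, ln, 4), ln(x), atol=1e-6)
```

Since LayerNorm normalizes over the last dimension only, rows are independent and chunking over dim 0 is exact, which is why the chunked and unchunked results agree.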

@Shenggan Shenggan marked this pull request as ready for review December 12, 2022 03:10
@Shenggan Shenggan changed the title [Not merge]support training, fix some inplace func of nn support training, fix some inplace func of nn Dec 12, 2022
@Shenggan Shenggan merged commit ceee81d into main Dec 12, 2022
@Shenggan Shenggan deleted the training branch December 12, 2022 03:11