[video][r25d model update] fixing #1265 #1432


Closed
wants to merge 4 commits into from

Conversation

@bjuncek (Contributor) commented Oct 8, 2019

Initial PR for fixing issue #1265 regarding:

a) computing midplanes at the per-layer level rather than the per-block level, and
b) adding additional batchnorm parameters to match @daniel-j-h's suggestion (the impact on performance is negligible, but it allows easier weight transfer).
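For context, in the R(2+1)D factorization the number of intermediate ("mid") planes is chosen so that the factorized (2+1)D block has roughly the same parameter count as the full 3D convolution it replaces. A minimal sketch of that computation (the function name is hypothetical; the 3×3×3 kernel matches the one used by torchvision's video ResNets):

```python
def midplanes(in_planes: int, out_planes: int,
              spatial_kernel: int = 3, temporal_kernel: int = 3) -> int:
    """Intermediate channel count for a factorized (2+1)D block.

    Picked so that the 2D-spatial + 1D-temporal pair has roughly as many
    parameters as one full 3D convolution of the same kernel size.
    """
    # parameters of the full t x d x d 3D convolution
    full_3d = temporal_kernel * spatial_kernel * spatial_kernel * in_planes * out_planes
    # parameters contributed per midplane by the spatial conv plus the temporal conv
    per_midplane = spatial_kernel * spatial_kernel * in_planes + temporal_kernel * out_planes
    return full_3d // per_midplane

# e.g. a 64 -> 64 channel block gets 144 midplanes
print(midplanes(64, 64))
```

Computing this per layer (with each layer's actual in/out channel counts) rather than once per block is exactly the point (a) above.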


Leftover TODOs:

  • Uploading the newly trained models
  • Making sure that the old pretrained models can still be loaded (legacy compatibility)

@fmassa (Member) left a comment

The changes are good and simplify the code, which is great!

We just need to improve the backwards-compatibility story, and then this will be good to go.

@@ -333,6 +340,12 @@ def r2plus1d_18(pretrained=False, progress=True, **kwargs):
Returns:
nn.Module: R(2+1)D-18 network
"""
warnings.warn(

The warning should appear when trying to load a state dict, so you might need a custom _load_from_state_dict that checks the version and maybe builds a different model depending on it? Something similar to #1224.
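A sketch of the suggested approach, assuming a hypothetical module (the class name and warning text are made up; the hook signature is PyTorch's real `nn.Module._load_from_state_dict`, and `_version` is the mechanism `nn.Module.state_dict()` records in its `_metadata`):

```python
import warnings

import torch.nn as nn


class VideoStem(nn.Module):
    """Hypothetical module illustrating version-gated state-dict loading."""

    _version = 2  # bumped when the architecture changed

    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def _load_from_state_dict(self, state_dict, prefix, local_metadata,
                              strict, missing_keys, unexpected_keys, error_msgs):
        # state_dict() records this module's _version in the checkpoint metadata;
        # a checkpoint saved before the bump (or stripped of metadata) has none.
        version = local_metadata.get("version", None)
        if version is None or version < 2:
            # Checkpoint predates the change: warn here, and remap or drop
            # legacy keys before handing off to the default loader.
            warnings.warn("loading weights saved with an older version of the model")
        super()._load_from_state_dict(state_dict, prefix, local_metadata,
                                      strict, missing_keys, unexpected_keys, error_msgs)
```

With this, loading a freshly saved state dict is silent, while loading one whose metadata is missing or pre-dates version 2 emits the warning at load time rather than at model construction.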

@codecov-io commented Oct 24, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@44a5bae).
The diff coverage is 37.5%.

@@            Coverage Diff            @@
##             master    #1432   +/-   ##
=========================================
  Coverage          ?   63.77%           
=========================================
  Files             ?       83           
  Lines             ?     6525           
  Branches          ?     1009           
=========================================
  Hits              ?     4161           
  Misses            ?     2069           
  Partials          ?      295
Impacted Files Coverage Δ
torchvision/models/video/resnet.py 73.28% <37.5%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 44a5bae...71525c6.

@bjuncek (Contributor, Author) commented Oct 24, 2019

BC should be sorted now; all that's left is to train the models :)

@fmassa (Member) left a comment

Changes LGTM; let's just wait until we have the new pre-trained weights before merging this.

@bjuncek (Contributor, Author) commented Dec 31, 2019

Hi @fmassa - did we ever get the updated models?

@SunDoge commented Oct 15, 2020

Hi, are there any updates? This model is very important for many downstream tasks.

@bjuncek (Contributor, Author) commented Oct 15, 2020

I don't have access to the models anymore (I changed jobs).
Maybe we can get this done after the release, @fmassa? I agree it would be very useful.

@malfet malfet deleted the branch pytorch:master September 20, 2021 14:34
@malfet malfet closed this Sep 20, 2021
7 participants