Conversation


@covix covix commented Jul 26, 2018

Working with the code I noticed that data augmentation (i.e. random crop and random flip) was not performed, and that in val.ipynb data normalisation at test time was not working correctly. With these changes I was able to reach an MAE of 11.12 on ShanghaiTech Part B, not far from what is reported in the paper.
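A minimal sketch of the kind of augmentation and normalisation described above, assuming PyTorch/torchvision with a PIL image paired with a NumPy density map; the crop fraction and the ImageNet statistics are assumptions for illustration, not necessarily what this PR uses:

```python
import random
import numpy as np
import torch
from PIL import ImageOps
from torchvision import transforms

# ImageNet mean/std are an assumption here; at test time only this
# normalisation should be applied (no crop or flip).
normalize = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def augment(img, density, crop_frac=0.5):
    """Random crop plus random horizontal flip for a (PIL image, density map) pair.

    Image and density map are transformed together so the crowd count stays consistent.
    """
    w, h = img.size
    cw, ch = int(w * crop_frac), int(h * crop_frac)
    x = random.randint(0, w - cw)
    y = random.randint(0, h - ch)
    img = img.crop((x, y, x + cw, y + ch))
    density = density[y:y + ch, x:x + cw]
    if random.random() > 0.5:
        img = ImageOps.mirror(img)                # horizontal flip
        density = np.fliplr(density).copy()       # copy() avoids negative strides
    return normalize(img), torch.from_numpy(density)
```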

@leeyeehoo (Owner)

I'll check it today. Sorry for the delay.

@vlad3996

I've obtained 10.42 MAE and 16.89 MSE on Part B without augmentation, but it takes about 20 hours on a GTX 1080 Ti. So maybe augmentation is not needed at all.

@covix (Author) commented Aug 7, 2018

Part B is probably a large enough dataset that data augmentation does not make much difference.
I tried CSRNet on a smaller dataset, and there data augmentation does make a big difference: it improves MAE from ~20 to ~10.

@eain3314

@vlad3996 How did you add the MSE metric? Thanks.
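For reference, a minimal sketch of how both metrics are typically computed in crowd-counting evaluation (the "MSE" reported in these papers is the root of the mean squared count error); the `model`/`loader` interface below is hypothetical, not the repo's actual code:

```python
import math
import torch

@torch.no_grad()
def evaluate(model, loader, device="cuda"):
    # Assumes `loader` yields (image, ground-truth density map) pairs with batch size 1.
    model.eval()
    abs_err, sq_err, n = 0.0, 0.0, 0
    for img, gt_density in loader:
        pred_count = model(img.to(device)).sum().item()  # predicted count = sum of density map
        gt_count = gt_density.sum().item()
        diff = pred_count - gt_count
        abs_err += abs(diff)
        sq_err += diff ** 2
        n += 1
    mae = abs_err / n
    mse = math.sqrt(sq_err / n)  # crowd-counting "MSE" = root of mean squared count error
    return mae, mse
```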
