CNNCifar.weight_keys #8
Comments
I think this is for keeping the conv layers global: the logic of the code is to select the last N layer groups in `weight_keys` as the global layers.
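A minimal sketch of that selection logic, assuming `weight_keys` groups parameters per layer and a `num_layers_keep`-style argument controls how many trailing groups stay global (both names are my assumptions, not necessarily what the repo uses):

```python
# Hypothetical illustration: if the fc groups are listed before the conv
# groups, slicing off the last N groups makes the conv layers the global ones.
weight_keys = [['fc1.weight', 'fc1.bias'],
               ['fc2.weight', 'fc2.bias'],
               ['fc3.weight', 'fc3.bias'],
               ['conv2.weight', 'conv2.bias'],
               ['conv1.weight', 'conv1.bias']]

num_layers_keep = 2  # assumed knob for the number of global layer groups
w_glob_keys = weight_keys[len(weight_keys) - num_layers_keep:]
print(w_glob_keys)
# [['conv2.weight', 'conv2.bias'], ['conv1.weight', 'conv1.bias']]
```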
I think the local part extracts high-level, compact features like a feature extractor, and the global part acts like a classifier. If the code selects the conv layers as the global part, does that match the original meaning of the paper? Or am I misunderstanding the paper?
I have the same question. Has this been resolved?
You can try to reproduce their experiments, but when I modified the code, my results differed from what they report in the paper (I did this last year, so I may have forgotten the details).
Nice work!
But in Net.py, why are the fc layers listed ahead of the conv layers in `CNNCifar.weight_keys`?
If my understanding is correct, I would expect the conv layers to come first, since the forward pass runs the conv layers before the fully connected ones (see the sketch below).
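For concreteness, here is a minimal, hypothetical sketch of a `CNNCifar` with `weight_keys` ordered as described in the question (fc groups first); the layer sizes are illustrative, not necessarily the repo's:

```python
import torch.nn as nn
import torch.nn.functional as F

class CNNCifar(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, num_classes)
        # fc groups listed ahead of conv groups, so the conv layers land in
        # the trailing "last N" slice that is treated as global; listing them
        # in forward order (conv first) would instead leave the fc layers in
        # that trailing, global slice.
        self.weight_keys = [['fc1.weight', 'fc1.bias'],
                            ['fc2.weight', 'fc2.bias'],
                            ['fc3.weight', 'fc3.bias'],
                            ['conv2.weight', 'conv2.bias'],
                            ['conv1.weight', 'conv1.bias']]

    def forward(self, x):
        # The forward pass itself runs conv first, then the fc layers,
        # regardless of the ordering in weight_keys.
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(x.size(0), -1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)
```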