Hi Daniel, I have been watching your TensorFlow course and practicing for a while now. In transfer learning with EfficientNetB0 as the base model, we skip the normalization/resizing step since the base model already contains these layers, but at the same time we freeze all the layers first and only tune the last 5 or 10. Since these resizing/normalization layers are among the first few layers and they stay frozen while we use them, the images are still being resized/normalized in effect, right? I am a bit confused about this. Also, I am getting a validation accuracy of 75%, and when predicting, my model is biased towards a single class out of the 101 classes. Hope you will clear up these concerns. Regards
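For context, here is a minimal sketch of the setup being described, assuming the usual Keras Functional API pattern (the 224x224 input size and the 101-class output head are illustrative, not the course's exact code). Note that freezing only stops a layer's weights from being updated; the layer still runs on the forward pass, so the built-in preprocessing is always applied:

```python
import tensorflow as tf

# EfficientNetB0 ships with its own rescaling/normalization layers, so raw
# 0-255 images can be passed straight in without a separate preprocessing step.
base_model = tf.keras.applications.EfficientNetB0(include_top=False)
base_model.trainable = False  # feature extraction: freeze every base layer

inputs = tf.keras.layers.Input(shape=(224, 224, 3))
# Frozen layers still execute on the forward pass; freezing only stops
# their weights from being updated during training.
x = base_model(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(101, activation="softmax")(x)  # e.g. 101 Food101 classes
model = tf.keras.Model(inputs, outputs)

model.compile(loss="sparse_categorical_crossentropy",
              optimizer=tf.keras.optimizers.Adam(),
              metrics=["accuracy"])
```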
Hi @MkRakesh, good question! It's actually the reverse for the unfreezing: we unfreeze the last ~10 or so layers, the layers closest to the output of the model (so the initial layers stay frozen). Also, since the resizing/normalization layers often do not have any learnable parameters, these layers won't change even if they are frozen/unfrozen. Does this make sense?
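As a hedged sketch of that unfreezing step (continuing from the snippet above; the layer count of 10 and the learning rate are illustrative, not the course's exact values):

```python
# Fine-tuning: unfreeze only the last ~10 layers, the ones closest to the output.
base_model.trainable = True
for layer in base_model.layers[:-10]:
    layer.trainable = False  # initial layers (incl. rescaling/normalization) stay frozen

# Recompile so the trainable changes take effect, with a lower learning rate
# so the newly unfrozen layers are only adjusted slightly.
model.compile(loss="sparse_categorical_crossentropy",
              optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              metrics=["accuracy"])
```

In Keras, recompiling after changing `trainable` is required, otherwise the change has no effect on an already-compiled model.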