Dumping weights from transfer learning - inception v4 #7
Unfortunately I don't know a "simple" way to do it. I did it by dumping parameters from a loaded model in TensorFlow. Please have a look at: Then you can load the pretrained parameters and test if you get the same result with the PyTorch model:
Thank you for your answer! I was referring to those files, in particular tensorflow_dump. I see you download the model from the internet, but what if you only have a .pb file you want to dump the weights of, assuming the structure is the same as Inception v4 (or similar enough)?
You should definitely try. Beware that the TensorFlow API has changed a bit. You should have a look at the nasnet directory (in this repo), which uses the new API.
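As a starting point for dumping weights from a frozen .pb (a sketch under my own assumptions, not code from this repo): in a frozen graph the variables have been folded into Const nodes, so their values can be read straight from the node attributes. The helper name and the demo graph below are hypothetical.

```python
import tensorflow as tf

def extract_const_weights(graph_def):
    """Collect every Const node's value as a NumPy array, keyed by node name."""
    return {
        node.name: tf.make_ndarray(node.attr['value'].tensor)
        for node in graph_def.node
        if node.op == 'Const'
    }

# For a real frozen model you would parse the file first, e.g.:
#   graph_def = tf.compat.v1.GraphDef()
#   with tf.io.gfile.GFile('inception_v4_frozen.pb', 'rb') as f:  # hypothetical path
#       graph_def.ParseFromString(f.read())

# Tiny self-contained demo graph instead of a real .pb:
g = tf.compat.v1.Graph()
with g.as_default():
    tf.constant([[1.0, 2.0], [3.0, 4.0]], name='w')
params = extract_const_weights(g.as_graph_def())
print(sorted(params))  # ['w']
```

You would then match the dumped names against the PyTorch module names, the same way the dump scripts in this repo do.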
@Cadene Thanks a lot for your reply. I really need a clarification on a paper I read, related to the transfer learning portion. The subject is a classification problem for grading disease levels from 0 to 5. Inside the article there is a statement like this: "An ImageNet pre-trained ResNet-50 model was used, and the Inception's bottom layers were frozen to prevent their weights from getting updated during the training process, while the remaining top layers were trained with the pre-processed fundus images."
@berkinimamoglu It sounds like a mistake to me, because ResNet-50 doesn't have any "Inception bottom layers". The authors of the paper are probably describing fine-tuning, where most of the layers (starting from the input) are frozen. Unfortunately, they don't provide further details. What you should do is download resnet50 with pretrained weights and try to fine-tune specific layers:

import torch
import torch.nn as nn
import pretrainedmodels as pm

model = pm.__dict__['resnet50'](num_classes=1000, pretrained='imagenet')
nb_classes = 10  # number of classes in your new dataset
model.last_linear = nn.Linear(model.last_linear.in_features, nb_classes)

# freeze all parameters
for param in model.parameters():
    param.requires_grad = False
# unfreeze the last residual block before the classifier
for param in model.layer4.parameters():
    param.requires_grad = True
# unfreeze the classifier
for param in model.last_linear.parameters():
    param.requires_grad = True

# optimize only the parameters that still require gradients
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()),
    lr=3e-4)
# train the network
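A quick way to sanity-check the freeze/unfreeze pattern above is to count which parameters remain trainable. A self-contained illustration with a toy model (so it runs without downloading resnet50; the toy architecture is purely hypothetical):

```python
import torch.nn as nn

# toy stand-in: two "feature" layers and a final "classifier" layer
toy = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# freeze everything, then unfreeze only the classifier
for param in toy.parameters():
    param.requires_grad = False
for param in toy[2].parameters():
    param.requires_grad = True

trainable = [p for p in toy.parameters() if p.requires_grad]
# only the last Linear's weight and bias remain trainable
print(len(trainable))  # 2
```

The same list comprehension on the real model confirms that the optimizer will only see layer4 and last_linear.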
Hi,
Is there a way to load inception_v4 weights from a custom .pb TensorFlow model? The weights are different from those of the original Inception, but the architecture is the same.
It looks like it should be an easy modification, but I know nothing about TensorFlow :)
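One detail that trips people up when porting weights by hand (an assumption about your setup, not code from this repo): TensorFlow stores conv kernels as [H, W, in, out] while PyTorch expects [out, in, H, W], so each dumped kernel needs a transpose before it goes into the state_dict. The helper name below is hypothetical.

```python
import numpy as np
import torch

def tf_conv_kernel_to_torch(kernel):
    """Convert a TF conv kernel [H, W, in, out] to PyTorch layout [out, in, H, W]."""
    return torch.from_numpy(kernel.transpose(3, 2, 0, 1).copy())

# e.g. a 3x3 kernel with 3 input channels and 32 output channels
k_tf = np.zeros((3, 3, 3, 32), dtype=np.float32)
k_pt = tf_conv_kernel_to_torch(k_tf)
print(tuple(k_pt.shape))  # (32, 3, 3, 3)
```

After converting each array and renaming the keys to match the PyTorch model, model.load_state_dict(state_dict) completes the transfer.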