Change the output layer of AlexNet/GoogLeNet/ImageNet?


I have a question about changing the output layer of these nets (AlexNet/GoogLeNet/ImageNet). The standard output is a 1x1000 vector, i.e. one value for each class.

I know I can change the output size to, e.g., 5, so I would get a 1x5 vector if I only have 5 classes.

But what if I do not have classes at all? Is it possible to change the output to a matrix like 18x18? My net should output a density map, not a "class". And is it recommended to use a pre-trained net for my task, or should I train from scratch?

Thank you for your help :-)


1 Answer

Answered by lejlot:

But what if I do not have classes?

The concept of a "class" is not really tied to the architecture but rather to the loss function. In other words, if you have 1000 outputs, it does not matter whether you want to classify among 1000 disjoint classes, assign 1000 tags, or regress a 1000-dimensional real-valued output - the architecture still makes perfect sense.
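As a minimal sketch (assuming PyTorch/torchvision, which the question does not specify): the backbone and its 1000 outputs stay identical across all three tasks, and only the loss function changes.

```python
import torch.nn as nn
from torchvision import models

# Same backbone, same 1000 outputs; only the loss function
# encodes what those outputs mean.
net = models.alexnet(weights=None)      # final layer: Linear(4096, 1000)

loss_classes = nn.CrossEntropyLoss()    # 1000 disjoint classes
loss_tags    = nn.BCEWithLogitsLoss()   # 1000 independent tags
loss_regress = nn.MSELoss()             # 1000-dim real-valued regression
```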

Is it possible to change the output to a matrix like 18x18?

The "naive" approach would be to output 18*18 = 324 values and treat it as a 2-dim matrix. However, the 2-dim structure suggests there is some characteristics that can be exploited on the architecture side, one typical characteristic is translational invariance, which is exploited in convnets, and if the same is true for your output you might consider deconvolution (of any sort, since there are many) for your model.

And is it recommended to use a pretrained net for my task, or should I train from scratch?

This depends not on the architecture but on the task. If your task is similar enough to the one a given net was trained on, you can use the pretrained net as a starting point and just "fine-tune" it on the new task. In general, using a pretrained net as a starting point is a safe thing to do (it should not be worse than training from scratch). Remember to train the whole network, though, and not just the added parts (unless you do not have enough data to train the whole structure).
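A fine-tuning sketch, again assuming torchvision (the `IMAGENET1K_V1` weights enum requires torchvision 0.13+): start from ImageNet weights, swap the head for the new task, then train either everything or, with little data, only the head.

```python
import torch
from torchvision import models

# Start from ImageNet-pretrained weights and replace the head.
net = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
net.classifier[6] = torch.nn.Linear(4096, 18 * 18)  # new task head

# Default: fine-tune the whole network, not just the added part.
optimizer = torch.optim.SGD(net.parameters(), lr=1e-3, momentum=0.9)

# With little data: freeze the pretrained backbone, train only the head.
# for p in net.features.parameters():
#     p.requires_grad = False
# optimizer = torch.optim.SGD(net.classifier[6].parameters(), lr=1e-3)
```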