Fast Transfer Learning for GAN Style Transfer
In recent research, generative adversarial models have proven extremely powerful at transferring real photos into stylized images. However, the major hurdle in this approach is that whenever style transfer must be performed for a brand-new style the model has never seen, the whole network needs to be retrained from scratch.
In this project, we propose a novel transfer learning method for GAN style transfer. In particular, it is based on the assumption that the new weights are a combination of scaling and shifting the pre-trained weights. Combined with a Generator Decomposition strategy, our method consumes fewer computational resources and less training time. We further show that this approach achieves results comparable to naive fine-tuning while training faster and with far fewer trainable parameters.
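The scale-and-shift assumption can be illustrated with a minimal sketch: the pre-trained weights stay frozen, and only a per-channel scale and shift are learned for the new style. All names (`adapted_weights`, `gamma`, `beta`) and tensor shapes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pre-trained convolution weights from the source-style generator.
# Shape is illustrative: (out_channels, in_channels, k, k).
W_pre = rng.standard_normal((8, 4, 3, 3))

# Per-output-channel scale and shift: the only parameters trained for
# the new style under the scale-and-shift assumption.
gamma = np.ones((8, 1, 1, 1))
beta = np.zeros((8, 1, 1, 1))


def adapted_weights(W, gamma, beta):
    """New-style weights as an affine transform of the pre-trained ones."""
    return gamma * W + beta


W_new = adapted_weights(W_pre, gamma, beta)

# With gamma = 1 and beta = 0 the pre-trained weights are recovered exactly,
# so training starts from the source-style solution.
assert np.allclose(W_new, W_pre)

# Trainable-parameter comparison: full fine-tuning vs. scale-and-shift.
full_params = W_pre.size                 # 8 * 4 * 3 * 3 = 288
light_params = gamma.size + beta.size    # 8 + 8 = 16
print(full_params, light_params)
```

This also makes the parameter saving concrete: for this toy layer, fine-tuning would update 288 weights, while the scale-and-shift parameterization updates only 16.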