Stable and Refined Style Transfer Using Zigzag Learning Algorithm

  • Lingli Zhan
  • Yuanqing Wang

Abstract

Recently, style transfer based on convolutional neural networks has achieved remarkable results. In this paper, we extend the original neural style transfer algorithm to ameliorate the instability in reconstructing certain structural information and to reduce ghosting artefacts in image backgrounds with low texture and homogeneous areas. To that end, we adopt a zigzag learning strategy: the model parameters are first optimized toward an intermediate target, and the model is then allowed to converge to the final goal. We apply zigzag learning to a multi-sample model, constructed by resampling the style input, and to a loss function that is split into two parts. We demonstrate the effectiveness of the proposed algorithm experimentally and provide a theoretical analysis. Finally, we show how to integrate the zigzag learning strategy into a fast neural style transfer framework.
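
To make the two-stage idea concrete, the minimal Python/PyTorch sketch below optimizes an image first toward an intermediate target and only then toward the full objective. It illustrates the general zigzag schedule described above and is not the authors' implementation: intermediate_loss and final_loss are hypothetical stand-ins for the perceptual content and style losses (which in practice would be computed from VGG feature maps), and the step counts and weights are arbitrary.

import torch
import torch.nn.functional as F

def intermediate_loss(img, content_target):
    # Stage-1 stand-in: pull the image toward an intermediate target only
    # (e.g. a content-dominated reconstruction).
    return F.mse_loss(img, content_target)

def final_loss(img, content_target, style_target, style_weight=10.0):
    # Stage-2 stand-in: full objective combining a content term and a crude
    # "style" term (channel statistics); a real system would use Gram-matrix
    # losses on VGG features.
    content = F.mse_loss(img, content_target)
    style = F.mse_loss(img.mean(dim=(2, 3)), style_target.mean(dim=(2, 3)))
    return content + style_weight * style

def zigzag_transfer(content_target, style_target, steps=(200, 300), lr=0.05):
    img = content_target.clone().requires_grad_(True)
    opt = torch.optim.Adam([img], lr=lr)
    # Stage 1: optimize toward the intermediate target first.
    for _ in range(steps[0]):
        opt.zero_grad()
        intermediate_loss(img, content_target).backward()
        opt.step()
    # Stage 2: continue from that point and converge to the final goal.
    for _ in range(steps[1]):
        opt.zero_grad()
        final_loss(img, content_target, style_target).backward()
        opt.step()
    return img.detach()

if __name__ == "__main__":
    content = torch.rand(1, 3, 64, 64)  # placeholder content image
    style = torch.rand(1, 3, 64, 64)    # placeholder style image
    print(zigzag_transfer(content, style).shape)

The same schedule carries over to the feed-forward (fast) setting by training the transformer network against the intermediate objective before switching to the final loss, rather than optimizing a single image.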

Keywords

Style transfer · Neural networks · Deep learning · Painting transfer

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Electronic Science and Engineering / Key Laboratory of Intelligent Optical Sensing and Manipulation, Ministry of Education, Nanjing University, Nanjing, China