Cycle Generative Adversarial Network for Unpaired Sketch-to-Character Translation

  • Leena Alsaati
  • Siti Zaiton Mohd Hashim
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1073)

Abstract

This research investigated the capability of the Cycle-Consistent Adversarial Network (CycleGAN) to translate stick-figure sketches into cartoon characters. Few studies have focused on generating a variety of poses and facial expressions of cartoon characters from simple stick-figure sketches using unpaired dataset samples, and existing studies show low performance in detecting rare pose features. In this research, two datasets were created, consisting of paired and unpaired images of manually drawn sketches and cartoon characters. The performance of CycleGAN was compared against a paired-data model, Pix2Pix, using qualitative and quantitative evaluation measures. Results show that Pix2Pix outperforms CycleGAN in accurately mapping stick figures to cartoon characters; nevertheless, CycleGAN still produced competitive results.

Keywords

Cycle-consistent GAN · Pix2Pix · Image translation

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Universiti Teknologi Malaysia, Skudai, Malaysia