Robust Feature Extraction for Material Image Retrieval in Fashion Accessory Management

  • Yuyang Meng
  • Dongmei Mo
  • Xiaotang Guo
  • Yan Cui
  • Jiajun Wen
  • Wai Keung Wong
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 849)

Abstract

Fashion accessories play an important role in costume design. A well-designed accessory composed of different types of materials helps enhance the aesthetics of a dress. A key problem in accessory design is finding replacement materials that offer a comparable aesthetic at a lower price. At present, this search is performed manually in accessory factories, which makes it very inefficient. Material image retrieval is therefore an important technique for automating and facilitating accessory design and management. In this paper, a voting-based preprocessing method is proposed to locate the material within the image. A regression model is then built that uses the edge directions of neighboring points to estimate a robust edge direction at each point. Finally, both color and edge features are encoded as histogram-based features to represent the materials for image retrieval. Experiments conducted on real captured material images validate the effectiveness of the proposed locating and retrieval techniques.
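
The descriptor stage of the pipeline can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it only shows, under stated assumptions and with hypothetical function names, how concatenated color and edge-direction histograms (computed here with OpenCV and NumPy) could represent an already-located material patch, and how a query could be ranked against a gallery by histogram intersection. The voting-based localisation and the regression-based edge-direction refinement described in the paper are omitted.

# Minimal sketch of histogram-based colour + edge-direction descriptors
# for material image retrieval. Function names are hypothetical; the
# voting-based localisation and regression-based edge refinement steps
# from the paper are not reproduced here.
import cv2
import numpy as np


def material_descriptor(patch_bgr, color_bins=16, edge_bins=18):
    """Concatenate per-channel colour histograms with an edge-direction histogram."""
    # Colour part: one normalised histogram per BGR channel.
    color_hist = np.concatenate([
        cv2.calcHist([patch_bgr], [c], None, [color_bins], [0, 256]).ravel()
        for c in range(3)
    ])
    color_hist /= color_hist.sum() + 1e-8

    # Edge part: gradient orientations (0-180 degrees), weighted by gradient magnitude.
    gray = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)
    orientation = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)
    edge_hist, _ = np.histogram(orientation, bins=edge_bins,
                                range=(0.0, 180.0), weights=magnitude)
    edge_hist = edge_hist / (edge_hist.sum() + 1e-8)

    return np.concatenate([color_hist, edge_hist])


def rank_gallery(query_desc, gallery_descs):
    """Rank gallery items by histogram-intersection similarity to the query."""
    similarities = np.array([np.minimum(query_desc, d).sum() for d in gallery_descs])
    return np.argsort(similarities)[::-1]  # indices of gallery items, most similar first

Histogram intersection is used here only as a simple, common similarity measure for normalised histograms; any other histogram distance could be substituted.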

Keywords

Accessory · Material retrieval · Feature extraction

Notes

Acknowledgements

This work was supported in part by the Natural Science Foundation of China under Grants 61703283, 61773328, 61672358, 61703169, and 61573248; in part by a research grant of the Hong Kong Polytechnic University (Project Code: G-UA2B); in part by the China Postdoctoral Science Foundation under Projects 2016M590812, 2017T100645, and 2017M612736; in part by the Guangdong Natural Science Foundation under Project 2017A030310067, the project "Rough Sets-Based Knowledge Discovery for Hybrid Labeled Data", and the project "The Study on Knowledge Discovery and Uncertain Reasoning in Multi-Valued Decisions"; and in part by the Shenzhen Municipal Science and Technology Innovation Council under Grant JCYJ20160429182058044.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Yuyang Meng (1)
  • Dongmei Mo (2)
  • Xiaotang Guo (2)
  • Yan Cui (3)
  • Jiajun Wen (2, 4, 5)
  • Wai Keung Wong (4, 5)
  1. School of Information Science and Engineering, Shaoguan University, Shaoguan, China
  2. College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China
  3. School of Mathematics and Information Science, Nanjing Normal University of Special Education, Nanjing, China
  4. Institute of Textile and Clothing, The Hong Kong Polytechnic University, Kowloon, Hong Kong
  5. The Hong Kong Polytechnic University Shenzhen Research Institute, Shenzhen, China