
A mixed faster R-CNN and positioning coordinates method for recognition of suit button arrangement with small sample sizes

Yanwen Yang (Key Laboratory of Silk Culture Inheriting and Products Design Digital Technology-Ministry of Culture and Tourism, Zhejiang Sci-Tech University, Hangzhou, China)
Yuping Jiang (School of Fashion Design and Engineering, Zhejiang Sci-Tech University, Hangzhou, China)
Qingqi Zhang (School of Information Science and Technology, Zhejiang Sci-Tech University, Hangzhou, China)
Fengyuan Zou (School of Fashion Design and Engineering, Zhejiang Sci-Tech University, Hangzhou, China) (Zhejiang Provincial Research Center of Clothing Engineering Technology, Zhejiang Sci-Tech University, Hangzhou, China)
Lei Du (Key Laboratory of Silk Culture Inheriting and Products Design Digital Technology-Ministry of Culture and Tourism, Zhejiang Sci-Tech University, Hangzhou, China)

International Journal of Clothing Science and Technology

ISSN: 0955-6222

Article publication date: 1 March 2022

Issue publication date: 27 June 2022


Abstract

Purpose

Sorting suits by their button arrangement is an important way of classifying styles. However, because different ways of wearing a suit can leave the buttons occluded, traditional identification methods struggle to recognize these details and recognition accuracy is poor. The purpose of this paper is to solve the fine-grained classification of suits by button arrangement. Taking men's suits as an example, a method combining a coordinate position discrimination algorithm with the faster region-based convolutional neural network (R-CNN) algorithm is proposed to achieve accurate batch classification of suit styles under different dressing modes.

Design/methodology/approach

The suit-button detection algorithm proposed in this paper combines the faster R-CNN algorithm with a coordinate position discrimination algorithm. First, a small sample database was established comprising six suit styles in different dressing states. Second, the buttons and buttonholes in each image were annotated, image features were extracted by a residual network to identify the objects, and the regression coordinates of the anchors in each sample were obtained through convolution, pooling and other operations. Finally, the positional relation between button and buttonhole coordinates was used to distinguish suit styles under different dressing ways, eliminating the erroneous results of direct classification by the network and achieving accurate classification.
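The abstract does not give the exact discrimination rule, but the final step above can be sketched as a post-processing pass over detector output. The following is a minimal, illustrative sketch assuming the faster R-CNN stage yields center coordinates for detected buttons and buttonholes; the distance-based pairing rule and the style labels are assumptions for illustration, not the paper's actual criterion.

```python
import math

def count_fastened_holes(buttons, holes, max_dist=20.0):
    """Count buttonholes whose nearest detected button lies within max_dist.

    buttons, holes: lists of (x, y) center coordinates taken from the
    detector's bounding boxes. A hole close to a button is treated as
    fastened over it (an assumed, illustrative criterion).
    """
    fastened = 0
    for hx, hy in holes:
        if buttons and min(math.hypot(hx - bx, hy - by)
                           for bx, by in buttons) <= max_dist:
            fastened += 1
    return fastened

def infer_style(buttons, holes):
    """Toy decision rule mapping detection counts to a coarse style label.

    Uses the larger of the two counts so that buttons hidden behind
    fastened holes (or vice versa) are still tallied.
    """
    n = max(len(buttons), len(holes))
    if n >= 4:
        return "double-breasted"
    return f"single-breasted ({n}-button)"
```

A rule of this kind explains why coordinate post-processing can correct the network: a direct classifier sees only pixels and may miss an occluded button, whereas the pairing step recovers it from the surviving buttonhole coordinate.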

Findings

The experimental results show that this method can accurately classify suits from small samples, reaching a recognition accuracy of 95.42%. It effectively resolves the machine misjudgment of suit style caused by occluded buttons and provides an effective method for the fine-grained classification of suit styles.

Originality/value

A method combining a coordinate position discrimination algorithm with a convolutional neural network is proposed for the first time to realize fine-grained classification of suit styles. It solves the machine-misreading problem easily caused by buttons being occluded under different dressing states.

Acknowledgements

Funding: This study is financially supported by the National Natural Science Foundation of China (61702460), the General Scientific Research Projects of the Zhejiang Provincial Education Department (Y201840287), the Open Project Program of Zhejiang Provincial Research Center of Clothing Engineering Technology (2018FZKF13), the National Undergraduate Innovation and Entrepreneurship Training Program (201910338011), and the Clothing Culture Innovation Team of Zhejiang Sci-Tech University.

Citation

Yang, Y., Jiang, Y., Zhang, Q., Zou, F. and Du, L. (2022), "A mixed faster R-CNN and positioning coordinates method for recognition of suit button arrangement with small sample sizes", International Journal of Clothing Science and Technology, Vol. 34 No. 4, pp. 532-548. https://doi.org/10.1108/IJCST-10-2020-0165

Publisher

Emerald Publishing Limited

Copyright © 2022, Emerald Publishing Limited