Abstract
With the advent and popularity of e-commerce and clothing image-sharing websites, clothing image search and annotation have become active research topics in recent years. Clothing image annotation is a challenging task due to large variations in clothing appearance, human body pose, and background. In this paper, we explore part-based clothing image annotation in a search-and-mining framework. Similar image search is first conducted to discover visual neighbors of a query image. The impact of large variations in clothing is alleviated by pose detection and part-based feature alignment. Both tag relevance and tag saliency are taken into consideration to obtain the candidate tags. The relevance of candidate tags is identified by mining the visual neighbors of a query image, while the saliency is determined according to the relationship between query image parts and part clusters over the whole training set. Experiments on a dataset of 1.1 million clothing images demonstrate the effectiveness and efficiency of the proposed approach.
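As a rough illustration of the neighbor-based tag relevance step described above, the sketch below scores candidate tags by similarity-weighted voting over a query image's visual neighbors. All names, the cosine-similarity measure, and the voting scheme are illustrative assumptions for this sketch, not the paper's actual formulation:

```python
import numpy as np

def tag_relevance(query_feat, neighbor_feats, neighbor_tags, k=5):
    """Score candidate tags by voting over the query's top-k visual neighbors.

    query_feat     : 1-D feature vector of the query image (hypothetical).
    neighbor_feats : 2-D array, one row per candidate neighbor image.
    neighbor_tags  : list of tag lists, aligned with neighbor_feats rows.
    Returns tags sorted by descending relevance score.
    """
    # Cosine similarity between the query and every candidate neighbor.
    sims = neighbor_feats @ query_feat / (
        np.linalg.norm(neighbor_feats, axis=1) * np.linalg.norm(query_feat) + 1e-12
    )
    # Keep only the k most similar images as visual neighbors.
    top = np.argsort(-sims)[:k]
    # Each neighbor votes for its tags, weighted by its similarity.
    scores = {}
    for i in top:
        for tag in neighbor_tags[i]:
            scores[tag] = scores.get(tag, 0.0) + float(sims[i])
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

In practice the neighbor set would come from a large-scale similar-image search index, and these relevance scores would then be combined with a part-based saliency term before final tags are selected.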