Improving the Accuracy of Virtual Try-On with Real-World Dataset Using Genetic Algorithm

Kosei Takaki, Takumi Ikenaga, Taiyo Sato, Shudai Ishikawa


Virtual try-on refers to trying on clothing virtually on a computer. Recent research has proposed highly accurate try-on methods that can handle arbitrary postures and clothing features (such as texture and logos). However, it is difficult to perform try-on for an input image that differs significantly from the training data. In other words, images supplied by users (images taken with a web camera or smartphone that include the background) cannot be processed appropriately. This research aims to realize virtual try-on under natural-environment conditions, using images taken with web cameras or smartphones as input. Creating a dataset large enough for training requires a great deal of time and effort. To learn efficiently from little training data, we apply transfer learning optimized by a genetic algorithm (GA), which selects which layers of the pre-trained model and the additional models to optimize. By using the GA to decide which network weights are updated and which are fixed during training, we aim to realize learning with less data. Virtual try-on systems consist of several networks responsible for segmenting human images, deforming clothes, and generating the final output image. If all networks are retrained, the number of networks is disproportionate to the amount of data, leading to overfitting. Using the GA, we select networks in advance and verify the accuracy of the virtual try-on. In conventional GA methods, the try-on results were compared and evaluated against teacher data using SMD. However, preparing teacher data is costly and time-consuming, because it requires images not included in the existing training dataset, similar to a natural-environment dataset. We therefore developed a method that evaluates only the try-on results, compared it with the method using teacher data, and verified its effectiveness.
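The GA-driven selection of which layers to retrain and which to freeze can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the chromosome is a binary vector (1 = update the layer's weights, 0 = keep them fixed), and the fitness function here is a toy stand-in for what would, in the real system, fine-tune the selected layers and score the try-on output.

```python
import random

NUM_LAYERS = 12      # assumed number of freezable layers across the networks
POP_SIZE = 20
GENERATIONS = 30
MUTATION_RATE = 0.05

def fitness(chromosome):
    # Toy objective (assumption): reward unfreezing later layers while
    # penalizing retraining too many layers, mimicking the overfitting
    # risk of updating everything with little data. A real system would
    # instead train the selected layers and evaluate the try-on result.
    trained = sum(chromosome)
    late_bias = sum(i * g for i, g in enumerate(chromosome))
    return late_bias - 2.0 * trained

def crossover(a, b):
    # Single-point crossover between two parent chromosomes.
    point = random.randrange(1, NUM_LAYERS)
    return a[:point] + b[point:]

def mutate(c):
    # Flip each gene (train <-> freeze) with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in c]

def run_ga(seed=0):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(NUM_LAYERS)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Elitist selection: keep the better half, breed the rest.
        pop.sort(key=fitness, reverse=True)
        elite = pop[: POP_SIZE // 2]
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(POP_SIZE - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = run_ga()
```

The resulting `best` vector would then determine, layer by layer, whether weights are updated or held fixed during fine-tuning on the small natural-environment dataset.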


Deep Learning; Virtual Try-On; Human Segmentation; GA
