At its annual Google I/O developer conference today, Google announced an advanced AI-powered virtual try-on feature that aims to transform online clothes shopping.
The feature, pitched as the next leap for e-commerce, lets customers see how clothes would look on their own body before making a purchase, combining the convenience of online shopping with the confidence of an in-store fitting.
The virtual try-on feature, launched in the US through Google Search Labs, lets a user upload a full-length photo of themselves and then try on a variety of clothes, such as shirts, pants, skirts, or dresses, directly in the search results. The AI model behind the feature has a deep understanding of the human body and of how different fabrics fold, stretch, and drape on it.
This gives the user a realistic sense of how a garment will look on them as an individual, taking into account factors such as body size and shape. Many online shoppers report returning clothes bought online because items look different on them than they did on the models.
Speaking during the announcement at I/O, Vidhya Srinivasan, VP and GM for Ads and Commerce at Google, said, “A lot of users are dissatisfied with their purchase, saying the clothes do not fit the model exactly.” The virtual try-on feature is integrated into Google Search, where eligible products appear with a ‘Try it on’ icon.
By clicking the icon, a user can upload their own photo and see themselves in the selected garment. They can also share these virtual try-on looks with friends, who can help pick the best look to purchase.
In the coming months, the tool will expand to more retailers and reach more users. The move marks a significant step toward reducing fashion waste and rethinking how clothes are shopped for online as AI becomes a bigger part of e-commerce.