Google Launches AI Tool That Helps Shoppers Pick And Try Clothing

Google has launched a new shopping tool that lets people see what clothes might look like on their own bodies without going near a changing room. All it takes is a full-length photo. Once uploaded, the tool places clothing on the image, giving a realistic view of how something could fit and hang on different body shapes.

This feature started out earlier in the year as a test through Search Labs. It’s now rolling out to people in the US across Search, Google Shopping and Google Images. To use it, shoppers tap the “try it on” icon next to any supported clothing item and upload their picture.

The tech behind it was trained to understand how different fabrics stretch, fold and drape across bodies. That means it tries to give a more accurate picture than those cut-and-paste apps from the past. It works with billions of items listed in Google’s Shopping Graph, so most items from known shops should be covered.

What Makes This Tool Useful Besides The Visuals?

Google has also added new price alert features. US shoppers can now set alerts that go beyond the product itself: they can pick their preferred size, colour and even how much they’re willing to spend.

If that black jacket someone’s been eyeing goes down to the right price, they’ll get a heads up. The system uses product and price data from across the web, pulled from Google’s Shopping Graph, which now covers around 50 billion items.

Danielle Buckley, Google’s Director of Product for Consumer Shopping, said, “No more constantly checking to see if that bag you’re eyeing is finally at the right price for you or forgetting to come back to a product you loved!”

Is It Just Clothes Or More Than That?

From this autumn, US users will also be able to search for outfit and home decor ideas through AI Mode, Google’s new AI chat tab launched in May. People will be able to type in things like “soft pink dress for a picnic” or “living room with light wood and plants” and get back visual suggestions with links to actual products.

These suggestions are created using vision match tech. It works by taking what the person types and turning it into visual ideas, then matching those ideas to products listed in the Shopping Graph. The idea is for people to get inspiration and options in one go, rather than going through multiple apps and websites one by one.

Who Is This Tool For?

At this stage, it’s mostly for US shoppers, and there’s no word yet on when it’ll roll out to other countries. But it’s clearly being built for people who want less back and forth when it comes to online shopping. That could be anyone from students doing back-to-school shopping to people redoing their bedrooms or buying clothes for an event.

Shoppers who don’t want the uncertainty that comes with guessing sizes, or who don’t trust model photos, could find this useful, because seeing the item on a body that looks like theirs could make a difference. The same goes for shoppers who’ve missed deals because they waited too long or didn’t save a link.

It still depends on people uploading their own photos, and not all clothes will display perfectly. But it鈥檚 a sign that Google is trying to fold more of the shopping process into its own search tools, and make it feel a bit less annoying.