Practical AI: On Color Synonym Mapping in E-Commerce
Rana el Kaliouby mentioned Amazon Alexa while rehearsing a speech about artificial intelligence (AI) for an upcoming conference.
This shouldn’t have been a big deal, but when her personal Alexa woke up and said: “Playing Selena Gomez” it quickly grew into a situation that broke her focus.
After trying several times to make Alexa stop, Rana recognized what is perhaps AI’s most severe limitation: it can’t recognize and respond to what we’re feeling.
Her company, Affectiva, grew out of MIT’s Media Lab and is seeking to address this limitation through emotion measurement technology.
Still, the example highlights how far AI remains from anything close to true human understanding, which is the assumption underpinning most fear-based discussions on the topic.
Take Sophia, for example. She’s a robot from Hong Kong’s Hanson Robotics, and she terrifies people precisely because she appears as though she is self-aware and can understand human emotions.
But she’s not and she can’t. It’s all appearances.
When her eyebrows furrow it may appear to us as though she’s thinking, but it’s simply a gesture she’s been programmed to do while she processes information.
And she can’t tell if you’re upset or happy; she’s capable only of processing language input and responding, thanks to encyclopedic knowledge of a variety of topics, similar to many other devices that use Natural Language Processing.
However, one thing AI can do is color synonym mapping.
At Reflektion, we’ve been working to improve our e-commerce personalization solution, in part, by building out the world’s most comprehensive color knowledge base on fashion and apparel. We pair this with our proprietary algorithms and CB Insights-recognized AI technology so that our clients, including Ann Taylor and DXL, can provide an individualized customer experience in each moment of their customers’ journey (more on that later).
This is all to say: using AI for color synonym mapping is a practical AI use case that we know a thing or two about.
So, in this latest installment of our Practical AI series (see our introduction), we’ll answer some of the most common questions about color synonym mapping and provide practical e-commerce examples that should tie everything together for you.
What is color synonym mapping for e-commerce?
Color synonym mapping is the process of collecting all of the world’s potential color names and calculating the visual distance between them so that one individual color becomes part of a neighborhood of related colors.
When a potential customer types “violet” into an on-site search bar, for example, color synonym mapping is what allows “violet” to also be equated with all colors that the algorithm has determined are on the spectrum of violet.
This means that a search for “violet” will display violet as well as orchid, plum, magenta, and fuchsia, but it will also include colors such as “pretty princess purple” that aren’t officially recognized colors but are names in the broad spectrum of violet that have been used in a retailer’s catalog or on the web somewhere.
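The neighborhood idea can be sketched in a few lines of Python. The mini color table below is hypothetical (a real knowledge base would hold thousands of entries, including retailer-specific names like “pretty princess purple”), and the distance threshold is an illustrative tuning knob, not a value from our system:

```python
import math

# Hypothetical mini color table: name -> (R, G, B).
COLORS = {
    "violet":       (238, 130, 238),
    "orchid":       (218, 112, 214),
    "plum":         (221, 160, 221),
    "magenta":      (255, 0, 255),
    "fuchsia":      (255, 0, 255),
    "forest green": (34, 139, 34),
    "navy":         (0, 0, 128),
}

def synonyms(name, max_distance=150):
    """Return (color, distance) pairs whose RGB Euclidean distance
    from `name` falls under `max_distance`, nearest first."""
    base = COLORS[name]
    out = []
    for other, rgb in COLORS.items():
        if other == name:
            continue
        d = math.dist(base, rgb)  # straight-line distance in RGB space
        if d <= max_distance:
            out.append((other, round(d, 1)))
    return sorted(out, key=lambda pair: pair[1])

print(synonyms("violet"))
```

Running this puts orchid and plum closest to violet, with magenta and fuchsia farther out in the neighborhood, while forest green and navy fall outside the threshold entirely.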
How are color synonyms compiled?
Compiling an exhaustive list of color synonyms is challenging for several reasons. Color naming is language dependent, and people typically use only a handful of names for what they see: light blue, blue, and dark blue, for example, to cover what could be hundreds of distinct hues of blue.
To establish a baseline at Reflektion, we started with a dataset of 140 colors. That obviously wasn’t enough, so we incorporated more than 1,000 colors listed on Wikipedia, but that wasn’t enough either.
From there we moved on to Pantone, gathering another 1,800+ colors.
After combing through the web more, and assessing color datasets across all types of merchandising, we grew the algorithm to respond to well over 4,000 colors and color names—and at the time of writing we’ve far surpassed that figure.
Despite our own progress, however, we’re not even in the ballpark of what the human eye can do. Researchers have not settled on how many colors a typical person can distinguish, but this piece at the BBC puts the figure at about one million.
How are all of those colors mapped by distance?
This is where things get technical.
The first step is to define every color numerically by coordinates or additional features. There are multiple ways to do this, and because each has its own strengths, we blend them to get the best of each.
One relatively well-known and straightforward approach is the RGB representation, which divides a color into three basic elements (red, green, blue). With RGB, each element takes a value from 0 to 255 (8 bits per primary).
The additive combination of the three elements yields a color. This representation is simple and useful because it lets graphic designers pick colors via a hexadecimal code (#FF0000 for pure red, for example), which reduces the effort of identifying and comparing colors.
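The hexadecimal codification is just the three 8-bit values written in base 16, so converting between the two forms is mechanical. A minimal sketch:

```python
def hex_to_rgb(code):
    """Convert a #RRGGBB hex string into an (R, G, B) tuple of 0-255 ints."""
    code = code.lstrip("#")
    # Each pair of hex digits is one 8-bit primary.
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(rgb):
    """Convert an (R, G, B) tuple back into a #RRGGBB hex string."""
    return "#{:02X}{:02X}{:02X}".format(*rgb)

print(hex_to_rgb("#FF0000"))        # pure red -> (255, 0, 0)
print(rgb_to_hex((238, 130, 238)))  # -> "#EE82EE"
```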
Other methods take into account other properties, such as hue, luminosity, chroma, combinations of colors, etc. Some examples of the latter are HSL/HSV (hue/saturation/luminosity), XYZ, LCH, or LAB, among others.
From there we use formulas to convert colors from one classification approach to another, to compute distances between colors (we use Euclidean or Manhattan distance measures), and to increase overall accuracy.
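Both distance measures, and a conversion out of RGB, can be sketched with the standard library. The color values below are illustrative; note that Python’s `colorsys` module uses the HLS ordering (hue, lightness, saturation) with channels scaled to the 0–1 range:

```python
import colorsys

def euclidean(c1, c2):
    """Straight-line distance between two colors in RGB space."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def manhattan(c1, c2):
    """Sum of per-channel absolute differences."""
    return sum(abs(a - b) for a, b in zip(c1, c2))

violet, orchid = (238, 130, 238), (218, 112, 214)
print(euclidean(violet, orchid))  # ~36.06
print(manhattan(violet, orchid))  # 62

# Converting RGB (scaled to 0-1) to hue/lightness/saturation;
# hue comes back as a fraction of a full turn.
h, l, s = colorsys.rgb_to_hls(*(v / 255 for v in violet))
print(round(h * 360))  # 300 degrees, i.e. the violet/magenta region
```

Euclidean distance penalizes a large difference in one channel more heavily than Manhattan distance does, which is one reason blending measures can help accuracy.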
Then, finally, we can compile a list of color names and map their respective distances, denoting similar colors that can be used as synonyms:
As shown in the snapshots below, computing color synonyms produces a list of related colors roughly comparable to what a human would perceive with their own eyes:
Still with me? Let’s move on to some practical examples.
3 ways AI-powered color synonym mapping helps retailers create better customer experiences
There are of course a variety of applications for color synonym mapping in the retail space, but here are a few we’ve had particular success with over the years.
1. The end of all-or-nothing color searches
Elite retailers have catalogs of their colors, but they are often filled with the retailers’ own diverse, creative color names. These, of course, must be accounted for and they are part of the reason why our own color catalog has grown so vast.
Not all search queries are as smooth-sailing as this one from our client TOMS:
Some retailers may in fact have purple shoes, or at least shoes in purple’s neighborhood of colors, but the search comes back with zero results because they used creative color names and failed to include common ones in their metadata and product descriptions.
We’ve worked to combat this challenge by applying the color synonym mapping described earlier to our clients’ product catalogs. This ensures that products in similar color neighborhoods will be displayed to potential customers—and this, as you can imagine, can lead to fairly dramatic revenue gains.
Removing this unfortunately common friction point in the buyer’s journey is a surefire way to improve digital merchandising conversion rates. After all, showing a customer something remarkably similar is far better than showing them nothing at all.
2. More accurate search display results
It’s one thing to simply display a retailer’s products based on a search for color, but our color synonym mapping takes accuracy to the next level.
To us, the products themselves provide another rich source of color attributes.
By combining our AI image analysis with our color synonym mapping, we can scan client product image files to understand how much of an image each color occupies, then weight each color by its share of the image. In other words, we can use the image itself (not just the text-based search query) to map colors back to RGB values.
Take the watch shown above as an example. The bezel and band are both types of dark blue, so if a shopper were looking for a “dark blue watch,” we would take into account that this watch contains dark blue in those areas, weight it, and display it based on its color relevancy relative to the other products in the catalog.
Similarly, there’s red in the bezel and band. So, although it’s unlikely to be the watch a potential customer would want displayed first when they search for “red watch,” it would still show up in the results (but likely towards the bottom of the search results depending on the available products).
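The weighting idea boils down to counting how much of the image each (coarsened) color covers. Here’s a minimal sketch, with a hypothetical four-pixel “image” standing in for real image analysis, and the bucket size as an illustrative quantization knob:

```python
from collections import Counter

def color_weights(pixels, bucket=64):
    """Quantize each pixel's RGB channels into coarse buckets and
    return each bucket's share of the image as a fraction (0-1)."""
    counts = Counter(
        tuple(channel // bucket * bucket for channel in px) for px in pixels
    )
    total = len(pixels)
    return {color: n / total for color, n in counts.most_common()}

# Hypothetical 4-pixel "image": mostly dark blue with a sliver of red.
pixels = [(10, 10, 120), (12, 8, 118), (15, 12, 125), (200, 20, 20)]
print(color_weights(pixels))
# Dark blue dominates at 0.75; red contributes the remaining 0.25.
```

A product with a 0.75 weight for dark blue would rank high for a “dark blue watch” query, while its 0.25 red share would still let it surface, lower down, for a “red watch” query.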
Here’s a more visualized look into how this works behind-the-scenes for our client Ann Taylor:
Our own comprehensive color catalog ensures that if a potential customer searches for, say, “green shorts,” they’ll see results ranked by how close each is to green, whether or not “green” was a term the retailer actually used to describe those shorts.
Here’s another example, this time from O’Neill, another client of ours:
3. Photo search is here, are you ready?
In October 2017, eBay launched two AI-based photo search capabilities.
One allows customers to essentially begin their search anywhere by sharing a product image (from social, another website, etc.) with eBay’s mobile app. The other allows users to start their search on eBay’s site or app with a photo they took with their phone.
It lends credence to the “a picture is worth a thousand words” cliche.
But while adoption of photo search capabilities hasn’t yet taken off the way voice-based search (such as Alexa) has, the stage is set. With players like eBay and Amazon making big moves into this space, it’s only a matter of time before customers expect the same from their favorite brands.
Powered by our world-class AI and obsession with color, here’s a glimpse into our Photo Search solution:
Customers are equipped with excellent cameras in their smartphones; they won’t remain content with voice commerce forever. When they see something, a dress for example, or maybe even a sliver of a color in a painting that they’d love to have as their watch face, they’ll be able to capture the image with their phone and immediately begin their path to purchase.
If retailers are equipped to handle such moves.
If you’ve made it this far we’d love to hear from you. What practical application of AI would you like to see us write about next? Let us know in the comment section below.
Note: Special thanks to Dr. Alejandro Rago.