TRANSCRIPT
CASE STUDY OF A MOBILE APP FOR SEASONAL COLOR ANALYSIS
Audibilities, LLC
January 2017
Be more attractive with the right colors for you!
This is a case study of a mobile app that identifies a user's color season from a selfie and then guides them toward appropriate clothing color choices.
This presentation takes you through the art, science (including computational photography, color space manipulation and artificial intelligence) and engineering required to create this solution.
AGENDA
• Seasonal Color Analysis
• Concept – Selfie to Season to Color Selection
• The Art
• The Science
• The Engineering
• Next Steps
All Eyes on Hue (in the Apple App Store)
SEASONAL COLOR ANALYSIS
The idea is that:
• A person's innate colors map them to one of four color 'seasons'
• Each season has colors that make a person more attractive, and colors that make them less attractive
Historically it has required trained practitioners to identify a person’s season and create for them a custom palette of colors.
Popularized in the 1970s by the NY Times best seller "Color Me Beautiful," the approach offered a somewhat systematic way to identify and use your colors.
More recent enhancements have added seasons – up to 16.
CONCEPT – DO IT YOURSELF!
1. Take a selfie in the app
2. Your season is identified – via image processing and machine-learning-driven algorithms
3. You find colors good (and bad) for you:
   • By browsing your palette
   • By using the camera to find good/bad colors ("Avoid" vs. "Great!")
THE ART
Historically, trained practitioners would identify seasons while sitting in front of a person.
We leveraged trained practitioners to identify the season for each of thousands of selfies; these labels then fed our machine learning analysis.
We also used trained practitioners to identify the color palettes for each of the four seasons.
It seems that even trained practitioners don’t always agree on season assignments.
A custom iPad app was developed to aid the experts in assigning seasons to selfies.
THE SCIENCE
Several areas to cover:
• Computational Photography
• Image Recognition
• Machine Learning

Many approaches were hypothesized and tested.
COMPUTATIONAL PHOTOGRAPHY
It turns out that getting accurate colors is important.
Key issues include:
• Camera variations across iPhones/iPads
• iOS software 'development' unknowns – from RAW to bitmaps
• Variations due to picture conditions – color temperature, exposure, white balance
• Variations due to poor selfies – backgrounds, size of face
Colors were calibrated to an iPhone reference camera.
The lack of a 'standard' color reference greatly limits accuracy. A gray card was used for a while but proved an inhibitor to adoption.
Similarly, restricting images to 'proper' ranges of light temperature and exposure also inhibited adoption, so those restrictions were dropped.
We tested an approach that adjusts for white balance by mapping the skin color to a 'reasonable skin color' (per the literature). We released an app showing this – see 'Better Selfie'.
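The skin-mapping idea can be sketched (outside the iOS app, in Python) as a simple per-channel gain correction. The reference skin color below is a hypothetical placeholder – the actual target values taken from the literature are not published here:

```python
import numpy as np

# Hypothetical reference skin color (RGB); a stand-in for the
# 'reasonable skin color' from the literature, not the app's actual value.
REFERENCE_SKIN_RGB = np.array([224.0, 172.0, 105.0])

def white_balance_via_skin(image, skin_mask):
    """Diagonal (von Kries-style) correction: scale each channel so the
    mean color of the detected skin region lands on the reference color.
    image: (H, W, 3) uint8 RGB; skin_mask: (H, W) boolean."""
    mean_skin = image[skin_mask].mean(axis=0)    # average skin RGB
    gains = REFERENCE_SKIN_RGB / mean_skin       # per-channel gains
    balanced = image.astype(np.float64) * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```

This is only a first-order correction; it assumes the skin region is detected reliably and that the cast is uniform across the frame.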
At the end of the day, we tried to take users through the process as best we could – and compensated for remaining variations at the machine learning stage.
Hair is the most problematic to capture correctly – backgrounds and face angles cause complications (as does baldness…)
IMAGE RECOGNITION
A key assumption is that a person's colors can be fully represented by three body parts – their skin, hair, and eyes.
Face recognition provided the location of the eyes; the eye color was found by identifying the black pupil and sampling the surrounding iris.
Skin was identified via face geometry – the best location was on the cheeks (less chance of makeup).
Hair was also identified using face geometry, with multiple passes of an algorithm known as 'GrabCut'.
[Figure: extracted regions – skin (with statistical outliers removed), hair, and eye (iris)]
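The "statistical outliers removed" step for skin samples could look like the following sketch; the z-score rule and threshold are illustrative assumptions, not the app's exact method:

```python
import numpy as np

def remove_color_outliers(pixels, z_thresh=2.5):
    """Drop sampled skin pixels whose color deviates strongly from the
    patch mean (e.g. specular highlights, shadows, stray makeup).
    pixels: (N, 3) array of RGB samples; z_thresh is a hypothetical value."""
    pixels = np.asarray(pixels, dtype=np.float64)
    mean = pixels.mean(axis=0)
    std = pixels.std(axis=0) + 1e-9            # guard against zero variance
    z = np.abs(pixels - mean) / std            # per-channel z-scores
    keep = (z < z_thresh).all(axis=1)          # inlier on every channel
    return pixels[keep]
```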
MACHINE LEARNING FOR SEASON IDENTIFICATION
As we are focused on colors, a traditional machine learning image recognition approach was not called for.
Instead, we mapped the three body part images – skin, hair, and eyes – into a normal image using the Munsell color space. This gave us a useful 2D image representing each person's colors.
That image was then fed through a more traditional convolutional neural network to identify the season.
At present the solution is above 80% accuracy.
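The mapping idea can be illustrated with a toy sketch that scatters sampled colors into a small 2D grid a CNN could consume. HSV is used here purely as a stand-in for the Munsell transform (whose conversion is more involved), and the grid size is an arbitrary choice:

```python
import colorsys
import numpy as np

def colors_to_hue_value_image(pixels, size=32):
    """Toy stand-in for the Munsell mapping: place each sampled color into
    a (hue, value)-indexed grid, averaging colors that share a cell. The
    result is a compact 'color signature' image. HSV replaces Munsell here
    for simplicity; size=32 is an assumption.
    pixels: (N, 3) RGB uint8 samples from skin, hair, and eyes."""
    grid = np.zeros((size, size, 3), dtype=np.float64)
    counts = np.zeros((size, size), dtype=np.int64)
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        row = min(int(h * size), size - 1)       # hue axis
        col = min(int(v * size), size - 1)       # value (lightness) axis
        grid[row, col] += (r, g, b)
        counts[row, col] += 1
    nonzero = counts > 0
    grid[nonzero] /= counts[nonzero, None]       # average colors per cell
    return grid.astype(np.uint8)
```

The point of such a representation is that it discards spatial layout (which is irrelevant for season identification) while preserving the distribution of colors.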
Our previous attempt – a random tree classifier operating on summary color values for the body parts (this is the version in the App Store) – achieves 70% accuracy.
Image colors mapped to a new image via a Munsell Transform.
Trained Deep Neural Net used for season identification.
Season, e.g., Winter
THE ENGINEERING
The challenge was to encapsulate all this functionality in a useful app. Key challenges included:
• User experience
• Implementation of image processing in the iOS environment – the OpenCV library was used
• Implementation of the random tree analysis and the DNN identification (TensorFlow is planned)
• Implementation of a real-time 'X-Ray' vision which translates the camera video into good/bad colors for that season
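The 'X-Ray' mapping could be sketched (outside iOS, in Python) as nearest-palette-color thresholding per pixel. The RGB distance metric and threshold are assumptions – a perceptual distance (e.g. in Lab space) would be a natural refinement:

```python
import numpy as np

def classify_frame_colors(frame, good_palette, threshold=60.0):
    """Mark each pixel 'good' if its nearest palette color (Euclidean RGB
    distance) is within a threshold; everything else is 'avoid'. Both the
    threshold value and the RGB metric are illustrative assumptions.
    frame: (H, W, 3) uint8; good_palette: (P, 3) season colors."""
    pix = frame.reshape(-1, 1, 3).astype(np.float64)
    pal = good_palette.reshape(1, -1, 3).astype(np.float64)
    dists = np.linalg.norm(pix - pal, axis=2).min(axis=1)  # nearest color
    return (dists <= threshold).reshape(frame.shape[:2])   # boolean mask
```

For real-time video this per-pixel computation would need to run on downsampled frames or on the GPU, which is part of why execution speed was a key engineering concern.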
Key areas of concern were speed of execution, especially for image processing.
Best coding practices were employed – as far as possible…
The user experience needs to be refreshed…
Xcode Storyboard
NEXT STEPS
Illustrative Potential Example
Reactions to the concept have been wildly positive.
Implementation issues limiting adoption are primarily:
1. Accuracy of selfie-to-season identification
2. Usefulness of identifying exact good/bad colors
A key theme is moving from 'Art' to 'Science' in the area of seasonal color palettes. This has the potential to yield not only more accurate palettes but, eventually, personalized palettes.
Commercialization has been discussed for personal use, trained advisor use, and in-person/web site store selection of proper clothing.
AUDIBILITIES, LLC
• Peter Gaston
• Cindi Lynch
• John Pries
• Bruce Cottman