

Realistic AR Makeup over Diverse Skin Tones on Mobile

Bruno Evangelista, Instagram · Houman Meshkin, Instagram · Helen Kim, Instagram · Anaelisa Aburto, Instagram · Ben Max Rubinstein, Instagram · Andrea Ho, Instagram

Figure 1: (Left) Results of AR makeup application. (Right) Different materials (Glitter, Gloss, Matte, Metallic) and respective rendering results.

CCS CONCEPTS
• Computing methodologies → Mixed / augmented reality;

KEYWORDS
Augmented Reality, Makeup, Cosmetic Rendering, Skin Rendering

ACM Reference Format:
Bruno Evangelista, Houman Meshkin, Helen Kim, Anaelisa Aburto, Ben Max Rubinstein, and Andrea Ho. 2018. Realistic AR Makeup over Diverse Skin Tones on Mobile. In Proceedings of SA '18 Posters. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/3283289.3283313

1 INTRODUCTION
We propose a novel approach to applying realistic makeup over a diverse set of skin tones on mobile phones using augmented reality. Our method mimics the real-world layering techniques and application that makeup artists use. We can accurately represent the five materials most commonly found in commercial makeup products: Matte, Velvet, Glossy, Glitter, and Metallic. We apply skin smoothing to even out the natural skin tone and tone mapping to further blend source and synthetic layers.

2 OUR APPROACH
Our makeup pipeline relies on a real-time mobile face tracker, which allows us to run GPU shaders over a face-aligned mesh for each frame of a live video stream, as shown in Figure 2.


Figure 2: Our pipeline. Makeup is applied to live video frames; accessories, such as eyelashes, are rendered; and the final image goes through skin smoothing and tone mapping.

We provide as input constructed maps that define face regions, such as lips, eyes, and cheeks, as well as makeup properties. Our light-responsive makeup is then applied, generating our target image. Makeup accessories, such as eyelashes, are optionally rigged to the face and rendered on top. Lastly, we apply skin smoothing and tone mapping to further blend source and synthetic layers, increasing realism.

2.1 Light Responsive Makeup
Our algorithm works in RGB and LAB color space, where the base albedo color is applied in RGB and shading is done in LAB. Previous works have applied the makeup base color in HSV [Kim and Choi 2008] or LAB color space [d. Campos and Morimoto 2014]. However,


Figure 3: Eyelashes attached to the face mesh's UV coordinates and rendered.

those approaches don’t retain a consistent color over diverse skintones (hues) or lighting conditions.

In our algorithm, we first desaturate the input image and extract mid and low luminance frequencies. Then, we combine the source makeup color with the extracted frequencies, using the mid frequency to highlight (by screening it on top) and the low frequency to darken (by multiplying it on top). This process is shown in Equation 1, which uses artist-provided frequencies.

$$
\begin{aligned}
L(v, \mathit{min}, \mathit{max}) &= \max(0,\, v - \mathit{min}) \,/\, \max(\epsilon,\, \mathit{max} - \mathit{min})\\
F_1 &= L(\varphi_{\mathrm{luma}}, 0.2, 0.8), \qquad F_2 = L(\varphi_{\mathrm{luma}}, 0.2, 0.5)\\
X_1 &= \mathrm{Lerp}(\mathrm{Screen}(F_1, I_{\mathrm{makeup}}),\, I_{\mathrm{makeup}},\, \alpha)\\
X_2 &= \mathrm{Lerp}(\mathrm{Multiply}(X_1, F_2),\, F_2,\, \beta)
\end{aligned}
\tag{1}
$$
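For concreteness, Equation 1 can be sketched in NumPy as follows; the helper names (`linstep`, `screen`, `lerp`) and array shapes are ours, not the paper's, and `alpha`/`beta` are the artist-controlled blend weights from the equation.

```python
import numpy as np

def linstep(v, lo, hi, eps=1e-5):
    """L(v, min, max): linearly remap v from [lo, hi] into [0, 1]."""
    return np.maximum(0.0, v - lo) / np.maximum(eps, hi - lo)

def screen(a, b):
    """Screen blend: brightens where either input is bright."""
    return 1.0 - (1.0 - a) * (1.0 - b)

def lerp(a, b, t):
    return a + (b - a) * t

def apply_base_color(luma, makeup_rgb, alpha, beta):
    """Combine the makeup color with mid/low luminance frequencies:
    screen the mid frequency on top to highlight, multiply the low
    frequency on top to darken (Equation 1)."""
    f1 = linstep(luma, 0.2, 0.8)   # mid-frequency band (highlights)
    f2 = linstep(luma, 0.2, 0.5)   # low-frequency band (shadows)
    x1 = lerp(screen(f1[..., None], makeup_rgb), makeup_rgb, alpha)
    x2 = lerp(x1 * f2[..., None], f2[..., None], beta)
    return x2
```

Here `luma` is the desaturated input image in [0, 1] and `makeup_rgb` the base albedo color; screening brightens where the mid band is strong, while multiplying darkens where the low band is weak.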

Our shading algorithm combines the makeup color with a precomputed ambient-occlusion map and converts the result to LAB space. To achieve the material looks shown in Figure 1, we propose an empirical Gloss and Shine model that works by transforming LAB's lightness; this model is shown in Equation 2.

$$
\begin{aligned}
\varphi_{\mathrm{shine}} &= \frac{\mathit{LabL}^2}{\mathit{shinePower} + \mathit{LabL}^2}\\
\varphi_{\mathrm{gloss}} &= H(0, 100, \varphi_{\mathrm{shine}}) \cdot 100\\
\mathit{LabL} &= \varphi_{\mathrm{gloss}} + \mathit{threshold} \cdot \mathit{glossAlpha} \cdot \mathit{luma}^{\,1 + \gamma \cdot \mathit{glossPower}}
\end{aligned}
\tag{2}
$$
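Read as a per-pixel transform, Equation 2 could be sketched as below. We assume H(a, b, x) is a smoothstep-style Hermite remap of x from [a, b] onto [0, 1]; the parameter semantics are our reading of the model, not the authors' reference implementation.

```python
import numpy as np

def smoothstep(lo, hi, x):
    """Assumed form of H: Hermite remap of x from [lo, hi] to [0, 1]."""
    t = np.clip((x - lo) / (hi - lo), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def gloss_shine_lightness(lab_l, luma, shine_power, threshold,
                          gloss_alpha, gloss_power, gamma):
    """Empirical Gloss and Shine transform of LAB lightness (Equation 2).
    All parameter names mirror the equation; their ranges and tuning
    are assumptions, not values from the paper."""
    phi_shine = lab_l**2 / (shine_power + lab_l**2)
    phi_gloss = smoothstep(0.0, 100.0, phi_shine) * 100.0
    return phi_gloss + threshold * gloss_alpha * luma**(1.0 + gamma * gloss_power)
```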

Lastly, we use an environment map to simulate reflections over very bright makeup areas. Our map contains a low-frequency studio light setup and is 3D-oriented according to the user's mobile device.
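A minimal sketch of such a reflection lookup; the equirectangular map layout and the use of a device rotation matrix are illustrative assumptions, since the paper does not specify its parameterization.

```python
import numpy as np

def sample_env_reflection(normal, view_dir, device_rot, env_map):
    """Sample a lat-long environment map along the reflection vector,
    rotated into world space by the device orientation (a sketch)."""
    r = view_dir - 2.0 * np.dot(view_dir, normal) * normal  # reflect view about normal
    r = device_rot @ r                                      # orient by device pose
    r /= np.linalg.norm(r)
    # Equirectangular (lat-long) lookup
    u = 0.5 + np.arctan2(r[0], -r[2]) / (2.0 * np.pi)
    v = 0.5 - np.arcsin(np.clip(r[1], -1.0, 1.0)) / np.pi
    h, w, _ = env_map.shape
    return env_map[int(v * (h - 1)), int(u * (w - 1))]
```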

2.2 Eyelashes
For eyelashes, we use a strip-like mesh with joints and a texture for lash patterns. We use 4 joints to reliably attach our mesh to UV coordinates in the face tracker's face mesh, which is transformed at runtime via a series of blend shapes, mimicking the user's facial expressions as shown in Figure 3. This allows us to control the length, curvature, and density of the eyelashes.

2.3 Skin Smoothing
We apply an edge-preserving blur filter to even out the natural skin tone, mimicking foundation makeup products. We explored a few algorithms, including the Bilateral Filter [Barash and Comaniciu 2004], a Low-pass Filter, and the Guided Filter [He and Sun 2015]. The Bilateral filter provided good visual results; however, its O(N²) complexity makes it computationally expensive for mobile devices, and although there is an approximate separable solution, it often generates artifacts [Yoshizawa et al. 2010]. The Low-pass filter was computationally efficient but didn't produce good visual results.

Figure 4: Tone mapping in different lighting environments with respective min, avg, and max log luma.

To achieve our desired visual look and performance, we used the Fast Guided Filter, which does the bulk of its computations in sub-sampled space, making it efficient. Our implementation further optimizes it by using the image's luma as the guiding image, allowing RGB and luma to be packed in a single texture.
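A single-scale guided-filter sketch with luma as the guide image; He and Sun's Fast Guided Filter computes the same box-filter statistics at a sub-sampled resolution, and the radius and epsilon values here are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter_luma(rgb, luma, radius=8, eps=1e-3):
    """Edge-preserving smoothing of each RGB channel, guided by luma.
    rgb: HxWx3 float array in [0, 1]; luma: HxW float array in [0, 1]."""
    size = 2 * radius + 1
    mean_g = uniform_filter(luma, size)
    var_g = uniform_filter(luma * luma, size) - mean_g**2
    out = np.empty_like(rgb)
    for c in range(3):
        p = rgb[..., c]
        mean_p = uniform_filter(p, size)
        cov = uniform_filter(luma * p, size) - mean_g * mean_p
        a = cov / (var_g + eps)          # local linear coefficients
        b = mean_p - a * mean_g
        # Average the coefficients, then apply the linear model per pixel
        out[..., c] = uniform_filter(a, size) * luma + uniform_filter(b, size)
    return out
```

Using luma as the guide avoids a separate guidance image and, as noted above, lets RGB and luma share a single GPU texture.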

2.4 Tone Mapping
We apply localized tone mapping similar to Reinhard's operator [Reinhard et al. 2002]. In our implementation, we first take advantage of GPU bilinear filtering to downsample the image to 1/16 of its size. To localize the tone mapping, we use a 4x8 grid and compute the min, max, and average luma in log space per region. This localization improves visual quality and better utilizes the GPU's parallelism. Finally, we remap the color of each pixel with an S-curve generated from the computed values. This results in rendered pixels that match the actual environment, as shown in Figure 4.
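A sketch of the localized statistics pass; the 4x8 grid and log-luma statistics follow the paper, while the luma weights and the exact S-curve shape are illustrative assumptions.

```python
import numpy as np

def local_luma_stats(img, grid_h=4, grid_w=8, eps=1e-4):
    """Per-region (min, avg, max) log-luma over a grid_h x grid_w grid.
    img: HxWx3 float array, assumed already downsampled to 1/16 size."""
    luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
    log_luma = np.log(luma + eps)
    h, w = log_luma.shape
    stats = np.empty((grid_h, grid_w, 3))
    for i in range(grid_h):
        for j in range(grid_w):
            cell = log_luma[i * h // grid_h:(i + 1) * h // grid_h,
                            j * w // grid_w:(j + 1) * w // grid_w]
            stats[i, j] = (cell.min(), cell.mean(), cell.max())
    return stats

def s_curve_remap(log_l, lo, avg, hi):
    """Illustrative S-curve anchored at the region's statistics:
    maps lo -> 0, avg -> 0.5, hi -> 1 in log-luma space."""
    t = np.where(log_l < avg,
                 0.5 * (log_l - lo) / max(avg - lo, 1e-4),
                 0.5 + 0.5 * (log_l - avg) / max(hi - avg, 1e-4))
    return np.clip(t, 0.0, 1.0)
```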

3 RESULTS
Our method was tested on a range of mobile devices, achieving over 45 fps. The table below shows our makeup rendering time.

Device          Render (ms)
2013 Nexus 5    20
2014 Galaxy S5  14
2015 Pixel 1    16
2017 Pixel 2    7

We have shown our method can realistically render materials commonly used by makeup artists (Figure 1), and based on it, we developed a platform for mobile users to try on commercial makeup products free of charge.

REFERENCES
Danny Barash and Dorin Comaniciu. 2004. A common framework for nonlinear diffusion, adaptive smoothing, bilateral filtering and mean shift. Image and Vision Computing 22, 1 (2004), 73–81. https://doi.org/10.1016/j.imavis.2003.08.005

F. M. S. d. Campos and C. H. Morimoto. 2014. Virtual Makeup: Foundation, Eye Shadow and Lipstick Simulation. In 2014 XVI Symposium on Virtual and Augmented Reality. 181–189. https://doi.org/10.1109/SVR.2014.32

Kaiming He and Jian Sun. 2015. Fast Guided Filter. CoRR abs/1505.00996 (2015). arXiv:1505.00996 http://arxiv.org/abs/1505.00996

Jeong-Sik Kim and Soo-Mi Choi. 2008. Interactive Cosmetic Makeup of a 3D Point-Based Face Model. IEICE Transactions on Information and Systems E91.D, 6 (2008), 1673–1680. https://doi.org/10.1093/ietisy/e91-d.6.1673

Erik Reinhard, Michael Stark, Peter Shirley, and James Ferwerda. 2002. Photographic Tone Reproduction for Digital Images. In Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '02). ACM, New York, NY, USA, 267–276. https://doi.org/10.1145/566570.566575

Shin Yoshizawa, Alexander Belyaev, and Hideo Yokota. 2010. Fast Gauss Bilateral Filtering. Computer Graphics Forum 29, 1 (2010), 60–74. https://doi.org/10.1111/j.1467-8659.2009.01544.x