
Adaptive Facial Image Beautification And Rendering

Posted on: 2015-02-23
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Y Liang
Full Text: PDF
GTID: 1268330422481517
Subject: Information and Communication Engineering

Abstract/Summary:
Facial image beautification and rendering are two rapidly developing computational photography techniques. They manipulate the attributes or content of an image (for example, the lighting, smoothness and color of facial skin), whereas classic image processing techniques aim to enhance image quality. Using image-based manipulation techniques, a novel image is synthesized from samples captured in the real world rather than by recreating the entire physical scene, which can enhance or extend the capabilities of digital photography. The development of facial image beautification and rendering has led to many useful applications in daily life (such as photographic post-production or entertainment) and in industry (such as advertising or movie production). However, existing methods of facial beautification and rendering may require tedious and time-consuming hand-crafted operations. Furthermore, good visual effects are hard to produce by hand-crafted manipulation because of the limitations of human visual perception and skill. It is therefore highly desirable to construct an automatic system for facial image beautification and rendering.

Building such an automatic system is challenging. Facial images vary with many factors, such as illumination, viewpoint and background. Facial image beautification and rendering involve assorted mathematical models, yet there is no mature unified framework for analyzing these models effectively. To produce images that look natural, the principles of human visual perception must also be taken into consideration when constructing the system. This thesis develops an adaptive edge-preserving energy minimization model that automatically adjusts its properties according to the input image or the manipulation task. Using this model, we can analyze and construct novel edge-preserving smoothing or edit propagation models under a unified framework and develop an automatic image manipulation system with good reliability, accuracy, error tolerance and stability. Based on the adaptive edge-preserving energy minimization model, we explore the specific problems of facial skin beautification, face relighting and ink-painting rendering. The contributions of the thesis are as follows:

First, we develop a general adaptive edge-preserving energy minimization framework to improve the performance of edge-preserving smoothing and edit propagation methods and to achieve adaptive facial image beautification and rendering. A general edge-preserving energy minimization (GEEM) model is presented to explain the connections and properties of bilateral filtering, anisotropic diffusion and the weighted least squares filter by means of nonparametric point estimation and the calculus of variations. To overcome the shortcomings of the GEEM model, an adaptive edge-preserving energy minimization (AEEM) model is proposed, with an adaptive fidelity term, adaptive model parameters and a high-dimensional guided feature space. From the AEEM model we can derive novel models with better edge-preserving smoothing or edit propagation behavior, which further improve the performance of the specific automatic systems for facial skin beautification, face relighting and ink-painting rendering.
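For readers unfamiliar with this family of models, the following is a minimal sketch of the kind of energy such a framework unifies; the notation is illustrative and is not the thesis's exact formulation. Given an input image g, a guidance feature f and an output u, a generic edge-preserving energy can be written as

    E(u) = \sum_p (u_p - g_p)^2 + \lambda \sum_p \sum_{q \in N(p)} w_{pq}(f)\,(u_p - u_q)^2

where N(p) is the neighborhood of pixel p and the affinity w_{pq}(f) becomes small across strong edges in the guidance feature. With quadratic terms and gradient-based weights this reduces to a weighted least squares filter, and related choices recover bilateral-filter-like or diffusion-like behavior; letting the fidelity term, the smoothness weight \lambda and the guided feature space f adapt to the image or task is, roughly, what an adaptive variant of such a model adds.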
Second, we propose a novel image editing tool called the adaptive region-aware mask and construct a unified framework for facial skin beautification that enhances skin lighting, smoothness and color automatically. The region-aware mask is generated from AEEM and integrates facial structure and appearance features, adaptive model parameters and a guided feature space constructed from lighting and color features. Using the region-aware mask, we can automatically select the skin regions to edit and perform inhomogeneous local adjustments with great precision, especially in regions with complex boundaries. The proposed skin beautification framework consists of three major steps: image layer decomposition, region-aware mask generation and image layer manipulation. Under this framework, a user can perform facial beautification simply by adjusting the skin parameters. Furthermore, the combinations of parameters can be optimized automatically, based on the average-face assumption and related psychological findings. We performed both qualitative and quantitative evaluations of our method using faces of different genders, races, ages, poses and backgrounds from various databases. The experimental results demonstrate that our technique is superior to previous methods and comparable to commercial systems such as PicTreat, Portrait+ and Portraiture.

Third, we present a novel automatic lighting template generation method to relight faces against complex backgrounds. Based on the principles of Retinex theory and the quotient image, a face relighting framework driven by a single reference image is presented, in which the lighting template is the key component (a minimal code sketch of the quotient-image idea is given after the fourth contribution). Face relighting within the skin region is performed with a lighting template generated by an adaptive edge-preserving smoothing model derived from AEEM with an adaptive smoothness parameter. To handle relighting against complex backgrounds, the lighting within the skin region is diffused smoothly into the background using an edit propagation model derived from AEEM with an adaptive propagation parameter.

Fourth, we propose an image-based ink-painting rendering framework with a novel ink diffusion simulation method that can mimic diverse ink painting styles. We construct a specific edit propagation model, derived from AEEM with edge detectors and a guided feature space, to simulate ink diffusion. Different ink diffusion effects, varying in abstraction, diffusion scope and diffusion pattern, are obtained by adjusting the model features, parameters and guided features. The proposed ink-painting rendering framework, which consists of line feature extraction, adaptive ink diffusion and absorbent-paper background simulation, can generate distinctive ink painting styles through different combinations of image abstraction, ink diffusion patterns and absorbent-paper backgrounds.
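To make the quotient-image idea from the third contribution concrete, here is a minimal Python sketch. It is an assumption-laden stand-in rather than the thesis's implementation: an OpenCV bilateral filter substitutes for the AEEM-derived edge-preserving smoother, the reference face is assumed to be already aligned to the input, the soft skin mask is assumed to be given, and all function and parameter names are hypothetical.

    import cv2
    import numpy as np

    def estimate_lighting(gray_u8):
        # Stand-in for the AEEM-derived smoother: a strong edge-aware blur
        # approximates the large-scale lighting layer of a grayscale face.
        smoothed = cv2.bilateralFilter(gray_u8, d=15, sigmaColor=50, sigmaSpace=15)
        return smoothed.astype(np.float32) + 1.0   # +1 avoids division by zero

    def relight_skin(face_bgr, reference_bgr, skin_mask):
        """Quotient-image style relighting inside a soft skin mask (illustrative only)."""
        face = face_bgr.astype(np.float32)
        face_gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
        ref_gray = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)  # reference aligned to input
        template = estimate_lighting(ref_gray)       # lighting template from the reference
        lighting = estimate_lighting(face_gray)      # lighting layer of the input face
        ratio = (template / lighting)[..., None]     # per-pixel quotient of the two lighting layers
        mask = skin_mask.astype(np.float32)[..., None]   # soft mask in [0, 1]
        relit = face * (ratio * mask + (1.0 - mask))     # apply the quotient only in the skin region
        return np.clip(relit, 0, 255).astype(np.uint8)

In the thesis the lighting is additionally propagated smoothly into the background via an AEEM-derived edit propagation model; in this sketch the mask blend simply leaves the background untouched.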
Keywords/Search Tags: Energy Minimization Model, Edge-preserving Smoothing, Edit Propagation, Photorealistic Rendering, Non-photorealistic Rendering, Adaptive, Facial Beautification, Face Relighting, Ink Diffusion