Color Transfer Brush

Qing Luan 1,2 *    Fang Wen 1    Ying-Qing Xu 1

1 Microsoft Research Asia    2 University of Science and Technology of China

* This work was done while Qing Luan was a visiting student at Microsoft Research Asia.
Abstract

In this paper, we introduce an interactive tool for local color transfer. The technique is based on the observation that color transfer operations are local in nature while at the same time they should adhere to global consistency. We introduce a brush with which the user specifies the source and destination image regions for color transfer. Color statistics in the source region are transferred to the destination region. A global optimization is then applied to eliminate visual discontinuities that may result from the local operations. We demonstrate that our tool is easy to use yet effective in quickly generating diverse artistic effects.
1. Introduction

Recently, much research has been dedicated to transferring style from example images to destination images [10] [12] [8] [1] [3] [2] [13] [4]. The transfer operation is attractive because the example image offers a good preview of the final effect. Several works [10] [12] address automatic color transfer. Although these algorithms perform well in transferring global color styles, it can be difficult for them to fully capture the intention or preferences of each individual user. Interactive control over the result is desirable to fully reflect the user's personal taste. Moreover, incorporating user supervision effectively addresses the problem of establishing region correspondence during the transfer, which is a difficult task for automatic algorithms. In this paper, we present an interactive image editing tool that lets users locally manipulate the colors in an image. With our tool, the user edits the image color style based on a collection of reference photos obtained from the Internet or from professional photographers. The color style of an image can be progressively modified by transferring the desired color statistics from source regions to destination regions that the user specifies with the designed brush
tool. Since separated local operations inevitably introduce discontinuities across different regions, we design a global optimization to resolve this issue. Given the known boundaries between regions, the optimization stage enhances cross-region coherence by naturally propagating colors beyond the boundaries while preserving the gradients of the remaining parts. This approach leads to the following advantages: 1. In contrast to conventional global methods, our method makes color transfer an intuitive and controllable task by giving users local control. 2. The interaction process explicitly makes use of user input and tackles the difficulty of identifying appropriate source and destination regions, which has long perplexed traditional automatic algorithms. 3. The local transfer strategy allows multiple example images to be used for editing, which substantially enlarges the range of available reference information. The rest of the paper is organized as follows. Section 2 reviews related work. Section 3 describes our color transfer brush. Section 4 shows various effects produced with our tool. We conclude in Section 5.
2. Related works

Reinhard et al. [10] report a simple and successful global color transfer method, which is effective in transferring the image color style when the two images have similar color composition and simple color statistics. Tai et al. [12] propose an automatic local color transfer method. The source and destination images are first probabilistically segmented into regions of simple color statistics, and then region correspondences in the image pair are set up based on designed rules for local color transfer. This method analyzes the image statistics, so it can handle images with more complex color distributions. Many other style transfer methods have been introduced in recent years. Bae et al. [1] introduce an automatic style transfer method that transfers both the histogram and the textureness of images; Chang et al. [2] introduce a method to transfer the color and texture of images.
Figure 1. Example of make-up transfer using our tool. The user brushes regions in the example image, as shown in a1 and a2, and applies the brushes to the destination regions in image b. After local color transfer and optimization, we get the result in c1. We show another result in c2, transferred using the eye style shown in a3; other regions are still transferred using the strokes in a2.

Welsh et al. [13] and Irony et al. [4] introduce colorization methods that transfer color from an example color image to a destination grayscale image. Direct and local control is an important property of the image editing process. In [6], an interactive local tone adjustment system lets the user adjust parameters in stroke-indicated regions. Compared to parameter adjustment, transfer may be a more intuitive interface for common users, since it previews the effect through examples. Our work is also related to recent colorization and recoloring works [5] [9] [14]. Unlike those recoloring works, which assign a single color to each region, our method transfers color statistics and can therefore obtain rich color variations in the final results.
3. Color Transfer Brush

Color statistics are important visual features in human perception. In the work of Reinhard et al. [10], an effective automatic color transfer method is introduced to transfer the color statistics from source to destination images. The core of the paper is the following equation:

$$g(C_d) = \mu_s + \frac{\sigma_s}{\sigma_d}\,(C_d - \mu_d), \qquad (1)$$
where $\mu_s$ and $\mu_d$ are the means of the underlying Gaussian distributions in the source and destination images, and $\sigma_s$ and $\sigma_d$ are the corresponding standard deviations. Thus, for each pixel with color $C_d$ in the destination image, the transferred color value $g(C_d)$ is obtained using Equation 1. The $l\alpha\beta$ color space is used in the algorithm. For Equation 1 to succeed, the two images should have similar composition and simple color statistics. To address these issues, Tai et al. [12] segment the images into regions of simple statistics for local color transfer.
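To make the per-region operation concrete, the sketch below applies Equation 1 to a single brushed region pair. It is only an illustration of the statistics remapping: the function name and the mask representation are our own, the conversion to and from $l\alpha\beta$ space is assumed to happen outside the function, and each channel is treated independently.

```python
import numpy as np

def transfer_region_pair(dst, src, dst_mask, src_mask):
    """Remap the brushed destination pixels using Equation 1 (illustrative sketch).

    dst, src           -- H x W x 3 float arrays, assumed already in l-alpha-beta space
    dst_mask, src_mask -- boolean masks produced by the user's brush strokes
    Returns a copy of dst with the masked pixels remapped.
    """
    out = dst.copy()
    for c in range(3):                          # each channel is handled independently
        s = src[..., c][src_mask]               # source samples under the brush
        d = dst[..., c][dst_mask]               # destination samples under the brush
        mu_s, sigma_s = s.mean(), s.std()
        mu_d, sigma_d = d.mean(), d.std()
        # g(C_d) = mu_s + (sigma_s / sigma_d) * (C_d - mu_d)
        out[..., c][dst_mask] = mu_s + sigma_s / max(sigma_d, 1e-6) * (d - mu_d)
    return out
```

Each transfer region pair would be processed this way in turn; the global optimization described next then removes the seams that such independent local operations leave behind.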
The problem with their method is that it relies on a set of designed rules to establish region correspondence between the source and destination images, which may lead to correspondence errors. The design of our color transfer brush is inspired by the fact that the conditions for Equation 1 are naturally approached when the correspondence is set up interactively by users. Since users intuitively select regions that are similar in content, the source and destination regions are likely to have similar color statistics. And when transfer takes place in a local region, the color statistics involved are considerably simpler than those of the entire image. Thus, color transfer using Equation 1 usually performs well in local regions. In our interface, the user first brushes a region in an example image to indicate the source region. Then, in the destination image, he brushes the region that requires adjustment. After this interaction, a pair of regions covered by the brushes is ready for color transfer; we call these regions a transfer region pair. After setting up the region pairs, color transfer using Equation 1 is applied locally to each pair. Since separated local operations inevitably introduce discontinuities across different regions, we design a global optimization to resolve this issue. The aim of the optimization is to eliminate the discontinuities on the transfer region boundaries while maintaining the gradients in the rest of the image. The following energy term is minimized:

$$E = \sum_{\{p,q\}\,\cap\,\Omega \neq \emptyset} (u_{pq} - v_{pq})^2, \qquad (2)$$
where $u_{pq} = f_p - f_q$, with $f$ the unknown function of the optimization, and $v_{pq} = g_p - g_q$, with $g$ the value of the original image; $p, q$ are neighboring pixels, and $\Omega$ represents the regions not covered by the transfer brush. On the boundary of a region, $f_p = g_p$. Equation 2 forms a sparse, symmetric, positive-definite system. In our implementation, we use the Gauss-Seidel SOR algorithm [11] to minimize the energy in Equation 2, since it has the most stable performance. Our optimization framework resembles that of Poisson image editing [7], since both methods aim to maintain the gradients, and thus the salient features important for visual quality, as argued in the work of Perez et al. The difference is that our optimization is not guaranteed to converge, since no Dirichlet boundary condition is required; in practice, however, we still obtain reasonable solutions. Besides the brush that transfers color from an example region, we also provide a brush that maintains the color of the original region. We call this brush the color maintain brush. In the region under this brush, the color statistics are maintained so that it will not be influenced by nearby color transfer brushes. Boundaries of the brushed regions are also involved in the optimization. With both the color transfer brush and the color maintain brush, the user can better control the final result.
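As a rough sketch of how this stage might be realized (not the authors' exact solver), the code below performs Gauss-Seidel sweeps with over-relaxation over one channel. It assumes one interpretation of Equation 2: brushed pixels keep their locally transferred values, while the remaining pixels are updated so that their Laplacian matches that of the original image. The function name, the fixed iteration count, and the 4-neighbour stencil are our own choices.

```python
import numpy as np

def smooth_unbrushed(original, transferred, brush_mask, omega=1.5, iters=200):
    """One interpretation of the optimization behind Equation 2 (a sketch only).

    original    -- H x W float array, one channel of the original image g
    transferred -- same shape, after the local per-region transfers
    brush_mask  -- True where a color transfer / color maintain brush was applied
    """
    f = transferred.astype(np.float64).copy()
    h, w = original.shape
    for _ in range(iters):
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if brush_mask[y, x]:
                    continue                      # brushed pixels stay fixed
                # Gauss-Seidel update with over-relaxation: the Laplacian of f
                # should match the Laplacian of the original image g.
                lap_g = (4.0 * original[y, x]
                         - original[y - 1, x] - original[y + 1, x]
                         - original[y, x - 1] - original[y, x + 1])
                nbrs = f[y - 1, x] + f[y + 1, x] + f[y, x - 1] + f[y, x + 1]
                new = (nbrs + lap_g) / 4.0
                f[y, x] = (1.0 - omega) * f[y, x] + omega * new
    return f
```

In practice such a pass would be run per channel in $l\alpha\beta$ space; image-border pixels are simply left at their transferred values in this sketch.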
4. Results

In this section, we show that our interactive color transfer tool is effective in editing local colors in an image. In all the examples, regions of the same transfer region pair are shown with the same color, and color maintain brushes are shown in yellow. Figure 1 shows an example of make-up transfer using our method. Make-up transfer is an attractive application for demonstrating cosmetic effects. With the Internet, it is easy to obtain demo images for various make-up styles. Customers may find it more convincing to try an effect on their own face instead of viewing it on some perfect model. With our color transfer brush, customers can virtually apply a make-up style to a picture of themselves and preview the effect before purchasing the products. They can also try the effects of many different make-up styles using more than one example image. The process is shown in Figure 1. Our tool is also effective in editing portrait photographs. In the first row of Figure 2, after transferring the eye and lip regions from the example image in the first row of Figure 2(a), the portrait acquires a brighter color style, like that of the model image. We compare our result with the result of the global color transfer method (shown in the upper right corner of the image). In the second row of Figure 2, we show another interesting application: transferring the style of an artist's drawing (Figure 2(a), second row) to a portrait picture (Figure 2(b), second row) using our tool. Sometimes, users only want to modify the appearance of some regions instead of the entire image. For example, in Figure 3(a), the user would like the color style of the sky region to be similar to the sky region in Figure 3(b). Global transfer using [10] is not effective enough for the color of the sky
Figure 2. Our tool is effective in editing the color style of portraits. Using the brushes in (a) and (b), we get the color-transferred results in (c). For the image in the first row, we also show the result of [10] in the upper right for comparison.

region, as shown in Figure 3(d). Color transfer using [12] suffers from errors in setting up the region correspondence (shown in Figure 3(e)). Our method benefits from the correspondence set by the user (shown in the lower row of Figure 3(a)(b)) and achieves a better effect. Compared with existing colorization or recoloring methods, our tool can obtain much richer colors, as shown in Figure 4. Since a limited number of strokes is used, the result of stroke-based colorization looks flat and unnatural. Instead of carefully specifying multiple colors, the user can take an example from example images, as shown in Figure 4(a), which implies "I'd like the flower in (b) to be similar to the rose in (a)". In our result in Figure 4(c), the left flower possesses richer color variation than the one in Figure 4(e).
5. Conclusion and future work

In this paper, we present the color transfer brush. Color transfer is locally applied to user-specified region pairs, and a global optimization is then introduced to reduce the artifacts caused by the local transfers. Our tool is intuitive to use and effective in producing various effects. The direct
Figure 3. Regional color editing. Using the brushes in (a) and (b), we get the transferred result in (c). The sky region is modified to be similar to the sky in (a). Color transfer using [10] or [12] cannot achieve the desired effect in the sky region, as shown in (d) and (e) respectively.
Figure 4. Our method can obtain rich color variations. Using the brushes in (a) and (b), we get the result in (c). The colorization result of [5] is shown in (e), with the strokes shown in (d). We can see that the flower region in (c) has richer colors than that in (e).

control over the result enhances the user experience in the image editing process. A limitation of the current tool lies in handling textured regions in images. Future work will be to develop a region selection method that considers texture features, so that a textured region can be grouped together for further transfer.
References

[1] S. Bae, S. Paris, and F. Durand. Two-scale tone management for photographic look. In ACM Transactions on Graphics, volume 25, 2006.
[2] Y. Chang, K. Uchikawa, and S. Saito. Example-based color stylization based on categorical perception. In Proceedings of the 1st Symposium on Applied Perception in Graphics and Visualization, 2004.
[3] A. Hertzmann, C. E. Jacobs, N. Oliver, B. Curless, and D. H. Salesin. Image analogies. In ACM Transactions on Graphics, pages 327–340, 2001.
[4] R. Irony, D. Cohen-Or, and D. Lischinski. Colorization by example. In Eurographics Symposium on Rendering, pages 277–280, 2005.
[5] A. Levin, D. Lischinski, and Y. Weiss. Colorization using optimization. In ACM Transactions on Graphics, pages 689–694, 2004.
[6] D. Lischinski, Z. Farbman, M. Uyttendaele, and R. Szeliski. Interactive local adjustment of tonal values. In ACM Transactions on Graphics, volume 25, 2006.
[7] P. Perez, M. Gangnet, and A. Blake. Poisson image editing. In ACM Transactions on Graphics, volume 22, pages 313–318, 2003.
[8] F. Pitie, A. Kokaram, and R. Dahyot. N-dimensional probability density function transfer and its application to colour transfer. In International Conference on Computer Vision, 2005.
[9] Y. Qu, T. Wong, and P. Heng. Manga colorization. In ACM Transactions on Graphics, pages 1214–1220, 2006.
[10] E. Reinhard, M. Ashikhmin, B. Gooch, and P. Shirley. Color transfer between images. IEEE Computer Graphics and Applications, volume 21, pages 34–41, 2001.
[11] Y. Saad. Iterative Methods for Sparse Linear Systems (1st edition). PWS, 1996.
[12] Y.-W. Tai, J. Jia, and C.-K. Tang. Local color transfer via probabilistic segmentation by expectation-maximization. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), volume 1, 2005.
[13] T. Welsh, M. Ashikhmin, and K. Mueller. Transferring color to greyscale images. In ACM Transactions on Graphics, pages 341–346, 2002.
[14] L. Yatziv and G. Sapiro. Fast image and video colorization using chrominance blending. IEEE Transactions on Image Processing, 2006.