
Image Recoloring


Presentation Transcript


  1. Image Recoloring Ron Yanovich & Guy Peled

  2. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  3. Grayscale coloring background • Colorization definition: 'the process of adding color to a monochrome image.'

  4. Grayscale coloring background • Colorization is a term introduced by Wilson Markle in 1970 to describe the computer-assisted process he invented for adding color to black and white movies or TV programs.

  5. Grayscale coloring background • Black Magic (PC tool) • Motion video and film colorization • “Color transfer between images” (Reinhard et al.) • Transfers the color palette from one color image to another • “Transferring color to greyscale images” (Welsh et al.) • Colorizes an image by matching small pixel neighborhoods in the image to those in a reference image • “Unsupervised colorization of black-and-white cartoons” (Sykora et al.) • Colorization of (segmented) black-and-white cartoons using patch-based sampling and probabilistic reasoning

  6. Example results: Black Magic (tool); Reinhard et al.

  7. Example results: Welsh et al.; Sykora et al.

  8. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  9. Luminance / Luminance channel • Luminance • The amount of light that passes through, or is emitted from, a particular area • Luminance channel • Y - a full-resolution plane that carries only the luminance information • U, V - full-resolution (or subsampled) planes that carry only the chroma (color) information (a conversion sketch follows below)
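A minimal sketch of splitting an RGB image into a luminance plane and two chroma planes, using the standard BT.601 weights; the function name rgb_to_yuv and the random input are illustrative, not part of the original slides:

```python
# A minimal sketch of splitting an RGB image into luminance (Y) and
# chroma (U, V) planes with the BT.601 weights; the input image here is
# an illustrative random array in [0, 1].
import numpy as np

def rgb_to_yuv(img):
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance: weighted mean of R, G, B
    u = 0.492 * (b - y)                    # chroma: blue-difference
    v = 0.877 * (r - y)                    # chroma: red-difference
    return y, u, v

img = np.random.rand(64, 64, 3)            # stand-in for a real RGB image
y, u, v = rgb_to_yuv(img)
print(y.shape, u.shape, v.shape)           # three full-resolution planes
```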

  10–12. Luminance / Luminance channel (example figures)

  13. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  14. Segmentation • The process of partitioning a digital image into multiple segments (sets of pixels, also known as superpixels)

  15. Segmentation • Makes the image more meaningful and easier to analyze • Locates objects and boundaries • Assigns a label to every pixel in the image

  16. Segmentation • ‘Superpixel’ - A polygonal part of a digital image, larger than a normal pixel, that is rendered in the same color and brightness

  17. Segmentation • A possible implementation is mean-shift segmentation (a sketch follows below)
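A possible mean-shift sketch using OpenCV's pyrMeanShiftFiltering; the file names and the spatial/color window radii (sp, sr) are illustrative choices, not values prescribed by the slides:

```python
# A sketch of mean-shift segmentation using OpenCV's pyrMeanShiftFiltering;
# the file names and window radii below are illustrative choices.
import cv2

img = cv2.imread("input.png")              # hypothetical input image
# sp: spatial window radius, sr: color window radius. Larger values merge
# more neighboring pixels into the same flat-colored region.
filtered = cv2.pyrMeanShiftFiltering(img, sp=21, sr=30)
cv2.imwrite("segmented.png", filtered)     # flat color regions ~ superpixels
```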

  18. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  19. Discrete Cosine Transform • Expresses a finite sequence of data points as a sum of cosine functions oscillating at different frequencies • The DCT is a Fourier-related transform similar to the discrete Fourier transform (DFT), but it uses only real numbers
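For reference, the standard 1D DCT-II (not spelled out on the slide) is X_k = Σ_{n=0}^{N−1} x_n · cos[(π/N)(n + 1/2)k], for k = 0, …, N−1.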

  20. Discrete Cosine Transform (example figure)

  21. Discrete Cosine Transform • Can be used for compression: most of the signal energy concentrates in a few low-frequency coefficients, so the high-frequency coefficients can be discarded with little visible loss (see the sketch below)
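A small compression sketch, assuming SciPy's scipy.fft module: take the 2D DCT of a block, zero out the high-frequency coefficients, and invert. The 8×8 block and the cutoff are illustrative:

```python
# A small DCT-based compression sketch, assuming SciPy; keeping only the
# low-frequency coefficients of an 8x8 block and inverting approximates
# the original block. The block and cutoff are illustrative.
import numpy as np
from scipy.fft import dctn, idctn

block = np.random.rand(8, 8)               # stand-in for an image block
coeffs = dctn(block, norm="ortho")         # 2D DCT-II
coeffs[4:, :] = 0                          # drop high vertical frequencies
coeffs[:, 4:] = 0                          # drop high horizontal frequencies
approx = idctn(coeffs, norm="ortho")       # inverse 2D DCT
print(np.abs(block - approx).max())        # worst-case reconstruction error
```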

  22. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  23. K-nearest-neighbor (Knn) • In pattern recognition, the k-nearest neighbor algorithm (k-NN) is a non-parametric method for classifying objects based on closest training examples in the feature space.

  24. K-nearest-neighbor (k-NN) • All instances are points in n-dimensional space • “Closeness” between points is determined by some distance measure (e.g., Euclidean distance) • Classification is made by majority vote among the k nearest neighbors

  25. K-nearest-neighbor – 2D example • Given n labeled points in the plane (classes a and b) and a new point, the new point is assigned the majority class among its k nearest neighbors; the figure showed how the decision can differ for k = 2 and k = 5 • A minimal implementation sketch follows below
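A minimal k-NN classifier sketch in NumPy; the points and labels mirror the two-class flavor of the slide's example but are otherwise illustrative:

```python
# A minimal k-NN classifier sketch (NumPy only); the points and labels
# are illustrative stand-ins for the slide's two-class example.
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=5):
    d = np.linalg.norm(X_train - x, axis=1)      # Euclidean distances
    nearest = np.argsort(d)[:k]                  # indices of k closest points
    return Counter(y_train[nearest]).most_common(1)[0][0]  # majority vote

X = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]], dtype=float)
y = np.array(["a", "a", "a", "b", "b", "b"])
print(knn_classify(X, y, np.array([4.5, 5.0]), k=3))  # -> "b"
```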

  26. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  27. Linear discriminant analysis (LDA) Background • In the field of machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to

  28. Linear discriminant analysis (LDA) Background • A linear classifier achieves this by making a classification decision based on the value of a linear combination of the characteristics • An object's characteristics are also known as feature values and are typically presented to the machine in a vector called a feature vector.

  29. Linear discriminant analysis (LDA) Background • There are two broad classes of methods for determining the parameters of a linear classifier • Generative models (conditional density functions) • LDA (or Fisher’s linear discriminant) • Discriminative models • Support vector machine (SVM)

  30. Linear discriminant analysis (LDA) Background • Discriminative training often yields higher accuracy than modeling the conditional density functions. • However, handling missing data is often easier with conditional density models

  31. Linear discriminant analysis (LDA) • LDA seeks to reduce dimensionality while preserving as much of the class discriminatory information as possible • LDA finds a linear subspace that maximizes class separability among the feature vector projections in this space

  32. LDA – two classes • Given a set of D-dimensional samples x1, …, xN • The samples are divided into two groups: N1 samples belong to class ω1 and N2 samples belong to class ω2 • We seek a scalar y by projecting the samples x onto a line: y = wᵀx (source: http://research.cs.tamu.edu)

  33. LDA – two classes • Of all the possible lines, we would like to select the one that maximizes the separability of the projected scalars

  34. LDA – two classes • Try to separate the two classes by projecting the samples onto different lines: an unsuccessful separation (figure)

  35. LDA – two classes • Try to separate the two classes by projecting the samples onto different lines: a successful separation (figure), reducing the problem dimensionality from two features (x1, x2) to a single scalar y

  36. LDA – two classes • In order to find a good projection vector, we need to define a measure of separation • A naive measure is the distance between the projected mean vectors; however, the axis with the larger distance between means is not necessarily the axis that yields better class separability (figure)

  37. LDA – two classes - Fisher’s solution • Fisher suggested maximizing the difference between the projected means, normalized by a measure of the within-class scatter • For each class we define the scatter, an equivalent of the variance: si² = Σ_{y∈ωi} (y − mi)², where mi is the projected mean of class ωi; s1² + s2² is the within-class scatter • The Fisher linear discriminant is the linear function y = wᵀx that maximizes the criterion function J(w) = (m1 − m2)² / (s1² + s2²)

  38. LDA – two classes - Fisher’s solution • Therefore, we are looking for a projection where samples from the same class are projected very close to each other while, at the same time, the projected means are as far apart as possible (figure: projection onto the direction w)

  39. Two Classes - Example • Two sample classes, X1 and X2

  40. Two Classes - Example • μ1 and μ2 are the mean vectors of each class: μi = (1/Ni) Σ_{x∈Xi} x • S1 and S2 are the scatter (covariance) matrices of X1 and X2: Si = Σ_{x∈Xi} (x − μi)(x − μi)ᵀ

  41. Two Classes - Example • Sb is the between-class scatter matrix: Sb = (μ1 − μ2)(μ1 − μ2)ᵀ • Sw is the within-class scatter matrix: Sw = S1 + S2

  42. Two Classes - Example • The projection direction is found from the eigenvalues and eigenvectors of Sw⁻¹Sb, i.e. by solving Sw⁻¹Sb w = λw

  43. Two Classes - Example • The projection vector associated with the largest eigenvalue provides the highest discrimination power between the classes (figure: LDA projection found by Fisher’s linear discriminant); a code sketch follows below
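A minimal two-class Fisher LDA sketch in NumPy, under the definitions above. For two classes, the leading eigenvector of Sw⁻¹Sb is proportional to Sw⁻¹(μ1 − μ2), so w can be solved for directly; the data below is an illustrative stand-in, not the slides' worked example:

```python
# A minimal two-class Fisher LDA sketch (NumPy only). For two classes the
# leading eigenvector of inv(Sw) @ Sb is proportional to inv(Sw) @ (mu1 - mu2),
# so w can be solved for directly. The data is an illustrative stand-in.
import numpy as np

def fisher_lda(X1, X2):
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - mu1).T @ (X1 - mu1)      # scatter matrix of class 1
    S2 = (X2 - mu2).T @ (X2 - mu2)      # scatter matrix of class 2
    Sw = S1 + S2                        # within-class scatter
    w = np.linalg.solve(Sw, mu1 - mu2)  # w ∝ Sw^{-1} (mu1 - mu2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
X1 = rng.normal([0, 0], 1.0, size=(50, 2))   # class ω1 samples
X2 = rng.normal([4, 3], 1.0, size=(50, 2))   # class ω2 samples
w = fisher_lda(X1, X2)
print("projected means:", (X1 @ w).mean(), (X2 @ w).mean())  # well separated
```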

  44. LDA Limitation • LDA is a parametric method, since it assumes Gaussian conditional density models • Therefore, if the sample distributions are non-Gaussian, LDA will have difficulty classifying data with complex structure

  45. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization

  46. Colorization using optimization, Levin et al. • The user scribbles desired colors inside regions • Colors are propagated to all remaining pixels • The algorithm works in YUV space • Key assumption: neighboring pixels with similar intensities should have similar colors

  47. Colorization using optimization, Levin et al. • Input: intensity volume Y(x, y, t) • Output: color volumes U(x, y, t) and V(x, y, t) • Minimize the difference between each pixel’s color and the weighted average of its neighbors’ colors: J(U) = Σ_r (U(r) − Σ_{s∈N(r)} w_rs U(s))² • w_rs is a weighting function that sums to one and is large when Y(r) and Y(s) are similar, e.g. w_rs ∝ exp(−(Y(r) − Y(s))² / (2σ_r²)) • μ_r and σ_r² are the mean and variance of the intensities in a window around pixel r • A code sketch follows below
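A compact single-image sketch of this optimization, assuming SciPy's sparse solver. Scribbled pixels are fixed to their given chroma values; every other pixel is required to equal the weighted average of its neighbors, which yields a sparse linear system. The function name, window size, and scribble representation are illustrative, not the authors' implementation:

```python
# A compact single-image sketch of the optimization, assuming SciPy.
# Y is a grayscale image in [0, 1]; has_color marks scribbled pixels and
# scribble holds their chroma values (one channel). All names are
# illustrative, not the authors' implementation.
import numpy as np
import scipy.sparse as sparse
from scipy.sparse.linalg import spsolve

def colorize_channel(Y, has_color, scribble, win=1, eps=1e-6):
    h, w = Y.shape
    idx = np.arange(h * w).reshape(h, w)
    rows, cols, vals = [], [], []
    b = np.zeros(h * w)
    for yy in range(h):
        for xx in range(w):
            r = idx[yy, xx]
            if has_color[yy, xx]:
                # Constrained pixel: fix its chroma to the scribbled value.
                rows.append(r); cols.append(r); vals.append(1.0)
                b[r] = scribble[yy, xx]
                continue
            ys = slice(max(0, yy - win), min(h, yy + win + 1))
            xs = slice(max(0, xx - win), min(w, xx + win + 1))
            nbr = idx[ys, xs].ravel()
            nbr = nbr[nbr != r]
            # Weights are large for neighbors with similar intensity.
            sigma2 = max(Y[ys, xs].var(), eps)
            wgt = np.exp(-(Y.ravel()[nbr] - Y[yy, xx]) ** 2 / (2 * sigma2))
            wgt /= wgt.sum()
            # Row encodes U(r) - sum_s w_rs * U(s) = 0.
            rows.append(r); cols.append(r); vals.append(1.0)
            rows.extend([r] * len(nbr)); cols.extend(nbr); vals.extend(-wgt)
    A = sparse.csr_matrix((vals, (rows, cols)), shape=(h * w, h * w))
    return spsolve(A, b).reshape(h, w)
```

Solving the same system a second time with the V-channel scribbles yields the other chroma plane; combined with Y, this gives the colorized image.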

  48–49. Colorization using optimization (example results)

  50. Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (k-NN) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization
