Photo

Objective-C

@interface Photo : NSObject

Swift

class Photo : NSObject

The Photo module

Member classes: Tonemap, TonemapDrago, TonemapReinhard, TonemapMantiuk, AlignExposures, AlignMTB, CalibrateCRF, CalibrateDebevec, CalibrateRobertson, MergeExposures, MergeDebevec, MergeMertens, MergeRobertson

Class Constants

  • Declaration

    Objective-C

    @property (class, readonly) int INPAINT_NS

    Swift

    class var INPAINT_NS: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int INPAINT_TELEA

    Swift

    class var INPAINT_TELEA: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int LDR_SIZE

    Swift

    class var LDR_SIZE: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int NORMAL_CLONE

    Swift

    class var NORMAL_CLONE: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int MIXED_CLONE

    Swift

    class var MIXED_CLONE: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int MONOCHROME_TRANSFER

    Swift

    class var MONOCHROME_TRANSFER: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int RECURS_FILTER

    Swift

    class var RECURS_FILTER: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int NORMCONV_FILTER

    Swift

    class var NORMCONV_FILTER: Int32 { get }

Methods

  • Creates AlignMTB object

    Declaration

    Objective-C

    + (nonnull AlignMTB *)createAlignMTB:(int)max_bits
                           exclude_range:(int)exclude_range
                                     cut:(BOOL)cut;

    Swift

    class func createAlignMTB(max_bits: Int32, exclude_range: Int32, cut: Bool) -> AlignMTB

    Parameters

    max_bits

    logarithm to the base 2 of maximal shift in each dimension. Values of 5 and 6 are usually good enough (31 and 63 pixels shift respectively).

    exclude_range

    range for exclusion bitmap that is constructed to suppress noise around the median value.

    cut

    if true cuts images, otherwise fills the new regions with zeros.

  • Creates AlignMTB object

    Declaration

    Objective-C

    + (nonnull AlignMTB *)createAlignMTB:(int)max_bits
                           exclude_range:(int)exclude_range;

    Swift

    class func createAlignMTB(max_bits: Int32, exclude_range: Int32) -> AlignMTB

    Parameters

    max_bits

    logarithm to the base 2 of maximal shift in each dimension. Values of 5 and 6 are usually good enough (31 and 63 pixels shift respectively).

    exclude_range

    range for exclusion bitmap that is constructed to suppress noise around the median value.

  • Creates AlignMTB object

    Declaration

    Objective-C

    + (nonnull AlignMTB *)createAlignMTB:(int)max_bits;

    Swift

    class func createAlignMTB(max_bits: Int32) -> AlignMTB

    Parameters

    max_bits

    logarithm to the base 2 of maximal shift in each dimension. Values of 5 and 6 are usually good enough (31 and 63 pixels shift respectively).

  • Creates AlignMTB object

    Declaration

    Objective-C

    + (nonnull AlignMTB *)createAlignMTB;

    Swift

    class func createAlignMTB() -> AlignMTB
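
    A minimal usage sketch follows (the opencv2 module name and the process(src:dst:) call on AlignMTB are assumptions taken from the wider OpenCV Swift bindings, not from this page): it creates an aligner and uses it to register a bracketed exposure stack before merging.

    import opencv2

    /// Sketch: align a bracketed exposure stack (assumed AlignMTB API).
    /// `exposures` are the differently exposed 8-bit frames of one scene.
    func alignStack(_ exposures: [Mat]) -> [Mat] {
        // max_bits = 6 allows shifts of up to 63 pixels in each dimension.
        let aligner = Photo.createAlignMTB(max_bits: 6, exclude_range: 4, cut: true)
        let aligned = exposures.map { _ in Mat() }
        aligner.process(src: exposures, dst: aligned)   // assumed process(src:dst:) signature
        return aligned
    }
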
  • Creates CalibrateDebevec object

    Declaration

    Objective-C

    + (nonnull CalibrateDebevec *)createCalibrateDebevec:(int)samples
                                                  lambda:(float)lambda
                                                  random:(BOOL)random;

    Swift

    class func createCalibrateDebevec(samples: Int32, lambda: Float, random: Bool) -> CalibrateDebevec

    Parameters

    samples

    number of pixel locations to use

    lambda

    smoothness term weight. Greater values produce smoother results, but can alter the response.

    random

    if true sample pixel locations are chosen at random, otherwise they form a rectangular grid.

  • Creates CalibrateDebevec object

    Declaration

    Objective-C

    + (nonnull CalibrateDebevec *)createCalibrateDebevec:(int)samples
                                                  lambda:(float)lambda;

    Swift

    class func createCalibrateDebevec(samples: Int32, lambda: Float) -> CalibrateDebevec

    Parameters

    samples

    number of pixel locations to use

    lambda

    smoothness term weight. Greater values produce smoother results, but can alter the response.

  • Creates CalibrateDebevec object

    Declaration

    Objective-C

    + (nonnull CalibrateDebevec *)createCalibrateDebevec:(int)samples;

    Swift

    class func createCalibrateDebevec(samples: Int32) -> CalibrateDebevec

    Parameters

    samples

    number of pixel locations to use

  • Creates CalibrateDebevec object

    Declaration

    Objective-C

    + (nonnull CalibrateDebevec *)createCalibrateDebevec;

    Swift

    class func createCalibrateDebevec() -> CalibrateDebevec
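
    A calibration object is normally used together with a merge object, as in the sketch below. The process(...) signatures on CalibrateDebevec and MergeDebevec are assumptions taken from the wider OpenCV Swift bindings, not from this page; times is assumed to be a one-column 32-bit float Mat of exposure times.

    import opencv2

    /// Sketch: recover the camera response curve, then merge an aligned
    /// exposure stack into a single HDR image (assumed process(...) signatures).
    func mergeToHDR(exposures: [Mat], times: Mat) -> Mat {
        let calibrate = Photo.createCalibrateDebevec(samples: 70, lambda: 10.0, random: false)
        let response = Mat()
        calibrate.process(src: exposures, dst: response, times: times)   // assumed signature

        let merge = Photo.createMergeDebevec()
        let hdr = Mat()
        merge.process(src: exposures, dst: hdr, times: times, response: response)   // assumed signature
        return hdr
    }
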
  • Creates CalibrateRobertson object

    Declaration

    Objective-C

    + (nonnull CalibrateRobertson *)createCalibrateRobertson:(int)max_iter
                                                   threshold:(float)threshold;

    Swift

    class func createCalibrateRobertson(max_iter: Int32, threshold: Float) -> CalibrateRobertson

    Parameters

    max_iter

    maximal number of Gauss-Seidel solver iterations.

    threshold

    target difference between results of two successive steps of the minimization.

  • Creates CalibrateRobertson object

    Declaration

    Objective-C

    + (nonnull CalibrateRobertson *)createCalibrateRobertson:(int)max_iter;

    Swift

    class func createCalibrateRobertson(max_iter: Int32) -> CalibrateRobertson

    Parameters

    max_iter

    maximal number of Gauss-Seidel solver iterations.

  • Creates CalibrateRobertson object

    Declaration

    Objective-C

    + (nonnull CalibrateRobertson *)createCalibrateRobertson;

    Swift

    class func createCalibrateRobertson() -> CalibrateRobertson
  • Creates MergeDebevec object

    Declaration

    Objective-C

    + (nonnull MergeDebevec *)createMergeDebevec;

    Swift

    class func createMergeDebevec() -> MergeDebevec
  • Creates MergeMertens object

    Declaration

    Objective-C

    + (nonnull MergeMertens *)createMergeMertens:(float)contrast_weight
                               saturation_weight:(float)saturation_weight
                                 exposure_weight:(float)exposure_weight;

    Swift

    class func createMergeMertens(contrast_weight: Float, saturation_weight: Float, exposure_weight: Float) -> MergeMertens

    Parameters

    contrast_weight

    contrast measure weight. See MergeMertens.

    saturation_weight

    saturation measure weight

    exposure_weight

    well-exposedness measure weight

  • Creates MergeMertens object

    Declaration

    Objective-C

    + (nonnull MergeMertens *)createMergeMertens:(float)contrast_weight
                               saturation_weight:(float)saturation_weight;

    Swift

    class func createMergeMertens(contrast_weight: Float, saturation_weight: Float) -> MergeMertens

    Parameters

    contrast_weight

    contrast measure weight. See MergeMertens.

    saturation_weight

    saturation measure weight

  • Creates MergeMertens object

    Declaration

    Objective-C

    + (nonnull MergeMertens *)createMergeMertens:(float)contrast_weight;

    Swift

    class func createMergeMertens(contrast_weight: Float) -> MergeMertens

    Parameters

    contrast_weight

    contrast measure weight. See MergeMertens.

  • Creates MergeMertens object

    Declaration

    Objective-C

    + (nonnull MergeMertens *)createMergeMertens;

    Swift

    class func createMergeMertens() -> MergeMertens
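
    Because Mertens exposure fusion needs neither exposure times nor a response curve, the merge object can be used on its own, as sketched below. The process(src:dst:) call is an assumption taken from the wider OpenCV Swift bindings; the fused result is a floating-point image roughly in the [0, 1] range.

    import opencv2

    /// Sketch: fuse an exposure stack without HDR merging or tonemapping.
    func fuseExposures(_ exposures: [Mat]) -> Mat {
        let merge = Photo.createMergeMertens(contrast_weight: 1.0,
                                             saturation_weight: 1.0,
                                             exposure_weight: 0.0)
        let fused = Mat()
        merge.process(src: exposures, dst: fused)   // assumed MergeMertens process(src:dst:)
        return fused                                // float image, roughly in [0, 1]
    }
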
  • Creates MergeRobertson object

    Declaration

    Objective-C

    + (nonnull MergeRobertson *)createMergeRobertson;

    Swift

    class func createMergeRobertson() -> MergeRobertson
  • Creates simple linear mapper with gamma correction

    Declaration

    Objective-C

    + (nonnull Tonemap *)createTonemap:(float)gamma;

    Swift

    class func createTonemap(gamma: Float) -> Tonemap

    Parameters

    gamma

    positive value for gamma correction. Gamma value of 1.0 implies no correction, gamma equal to 2.2f is suitable for most displays. Generally gamma > 1 brightens the image and gamma < 1 darkens it.

  • Creates simple linear mapper with gamma correction

    Declaration

    Objective-C

    + (nonnull Tonemap *)createTonemap;

    Swift

    class func createTonemap() -> Tonemap
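
    As a usage sketch, a tonemap object converts a 32-bit float HDR image into a displayable-range result. Only the factory is documented on this page; the process(src:dst:) call and the opencv2 module name are assumptions taken from the wider OpenCV Swift bindings.

    import opencv2

    /// Sketch: map an HDR image into displayable range with gamma 2.2.
    func tonemapSimple(hdr: Mat) -> Mat {
        let mapper = Photo.createTonemap(gamma: 2.2)
        let ldr = Mat()
        mapper.process(src: hdr, dst: ldr)   // assumed Tonemap process(src:dst:)
        return ldr
    }
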
  • Creates TonemapDrago object

    Declaration

    Objective-C

    + (nonnull TonemapDrago *)createTonemapDrago:(float)gamma
                                      saturation:(float)saturation
                                            bias:(float)bias;

    Swift

    class func createTonemapDrago(gamma: Float, saturation: Float, bias: Float) -> TonemapDrago

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    saturation

    positive saturation enhancement value. 1.0 preserves saturation, values greater than 1 increase saturation and values less than 1 decrease it.

    bias

    value for bias function in [0, 1] range. Values from 0.7 to 0.9 usually give best results, default value is 0.85.

  • Creates TonemapDrago object

    Declaration

    Objective-C

    + (nonnull TonemapDrago *)createTonemapDrago:(float)gamma
                                      saturation:(float)saturation;

    Swift

    class func createTonemapDrago(gamma: Float, saturation: Float) -> TonemapDrago

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    saturation

    positive saturation enhancement value. 1.0 preserves saturation, values greater than 1 increase saturation and values less than 1 decrease it.

  • Creates TonemapDrago object

    Declaration

    Objective-C

    + (nonnull TonemapDrago *)createTonemapDrago:(float)gamma;

    Swift

    class func createTonemapDrago(gamma: Float) -> TonemapDrago

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

  • Creates TonemapDrago object

    Declaration

    Objective-C

    + (nonnull TonemapDrago *)createTonemapDrago;

    Swift

    class func createTonemapDrago() -> TonemapDrago
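
    The adaptive operators follow the same create-then-process pattern; only the factory call differs, as in the sketch below (process(src:dst:) is again an assumption from the wider bindings).

    import opencv2

    /// Sketch: Drago tonemapping with the commonly used bias of 0.85.
    func tonemapDrago(hdr: Mat) -> Mat {
        let mapper = Photo.createTonemapDrago(gamma: 1.0, saturation: 1.0, bias: 0.85)
        let ldr = Mat()
        mapper.process(src: hdr, dst: ldr)   // assumed Tonemap process(src:dst:)
        return ldr
    }
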
  • Creates TonemapMantiuk object

    Declaration

    Objective-C

    + (nonnull TonemapMantiuk *)createTonemapMantiuk:(float)gamma
                                               scale:(float)scale
                                          saturation:(float)saturation;

    Swift

    class func createTonemapMantiuk(gamma: Float, scale: Float, saturation: Float) -> TonemapMantiuk

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    scale

    contrast scale factor. HVS response is multiplied by this parameter, thus compressing dynamic range. Values from 0.6 to 0.9 produce best results.

    saturation

    saturation enhancement value. See createTonemapDrago

  • Creates TonemapMantiuk object

    Declaration

    Objective-C

    + (nonnull TonemapMantiuk *)createTonemapMantiuk:(float)gamma
                                               scale:(float)scale;

    Swift

    class func createTonemapMantiuk(gamma: Float, scale: Float) -> TonemapMantiuk

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    scale

    contrast scale factor. HVS response is multiplied by this parameter, thus compressing dynamic range. Values from 0.6 to 0.9 produce best results.

  • Creates TonemapMantiuk object

    Declaration

    Objective-C

    + (nonnull TonemapMantiuk *)createTonemapMantiuk:(float)gamma;

    Swift

    class func createTonemapMantiuk(gamma: Float) -> TonemapMantiuk

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

  • Creates TonemapMantiuk object

    Declaration

    Objective-C

    + (nonnull TonemapMantiuk *)createTonemapMantiuk;

    Swift

    class func createTonemapMantiuk() -> TonemapMantiuk
  • Creates TonemapReinhard object

    Declaration

    Objective-C

    + (nonnull TonemapReinhard *)createTonemapReinhard:(float)gamma
                                             intensity:(float)intensity
                                           light_adapt:(float)light_adapt
                                           color_adapt:(float)color_adapt;

    Swift

    class func createTonemapReinhard(gamma: Float, intensity: Float, light_adapt: Float, color_adapt: Float) -> TonemapReinhard

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    intensity

    result intensity in [-8, 8] range. Greater intensity produces brighter results.

    light_adapt

    light adaptation in [0, 1] range. If 1 adaptation is based only on pixel value, if 0 it’s global, otherwise it’s a weighted mean of these two cases.

    color_adapt

    chromatic adaptation in [0, 1] range. If 1 channels are treated independently, if 0 adaptation level is the same for each channel.

  • Creates TonemapReinhard object

    Declaration

    Objective-C

    + (nonnull TonemapReinhard *)createTonemapReinhard:(float)gamma
                                             intensity:(float)intensity
                                           light_adapt:(float)light_adapt;

    Swift

    class func createTonemapReinhard(gamma: Float, intensity: Float, light_adapt: Float) -> TonemapReinhard

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    intensity

    result intensity in [-8, 8] range. Greater intensity produces brighter results.

    light_adapt

    light adaptation in [0, 1] range. If 1 adaptation is based only on pixel value, if 0 it’s global, otherwise it’s a weighted mean of these two cases.

  • Creates TonemapReinhard object

    Declaration

    Objective-C

    + (nonnull TonemapReinhard *)createTonemapReinhard:(float)gamma
                                             intensity:(float)intensity;

    Swift

    class func createTonemapReinhard(gamma: Float, intensity: Float) -> TonemapReinhard

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

    intensity

    result intensity in [-8, 8] range. Greater intensity produces brighter results.

  • Creates TonemapReinhard object

    Declaration

    Objective-C

    + (nonnull TonemapReinhard *)createTonemapReinhard:(float)gamma;

    Swift

    class func createTonemapReinhard(gamma: Float) -> TonemapReinhard

    Parameters

    gamma

    gamma value for gamma correction. See createTonemap

  • Creates TonemapReinhard object

    Declaration

    Objective-C

    + (nonnull TonemapReinhard *)createTonemapReinhard;

    Swift

    class func createTonemapReinhard() -> TonemapReinhard
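
    For completeness, the Reinhard operator is used the same way; the sketch below sets light_adapt to 1 so that adaptation is based on the pixel value only, as described above (process(src:dst:) is assumed from the wider bindings).

    import opencv2

    /// Sketch: Reinhard tonemapping with per-pixel light adaptation.
    func tonemapReinhard(hdr: Mat) -> Mat {
        let mapper = Photo.createTonemapReinhard(gamma: 1.0,
                                                 intensity: 0.0,
                                                 light_adapt: 1.0,
                                                 color_adapt: 0.0)
        let ldr = Mat()
        mapper.process(src: hdr, dst: ldr)   // assumed Tonemap process(src:dst:)
        return ldr
    }
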
  • Given an original color image, two differently colored versions of this image can be mixed seamlessly.

    Multiplication factor is between 0.5 and 2.5.

    Declaration

    Objective-C

    + (void)colorChange:(nonnull Mat *)src
                   mask:(nonnull Mat *)mask
                    dst:(nonnull Mat *)dst
                red_mul:(float)red_mul
              green_mul:(float)green_mul
               blue_mul:(float)blue_mul;

    Swift

    class func colorChange(src: Mat, mask: Mat, dst: Mat, red_mul: Float, green_mul: Float, blue_mul: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src .

    red_mul

    R-channel multiply factor.

    green_mul

    G-channel multiply factor.

    blue_mul

    B-channel multiply factor.

  • Given an original color image, two differently colored versions of this image can be mixed seamlessly.

    Multiplication factor is between 0.5 and 2.5.

    Declaration

    Objective-C

    + (void)colorChange:(nonnull Mat *)src
                   mask:(nonnull Mat *)mask
                    dst:(nonnull Mat *)dst
                red_mul:(float)red_mul
              green_mul:(float)green_mul;

    Swift

    class func colorChange(src: Mat, mask: Mat, dst: Mat, red_mul: Float, green_mul: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src .

    red_mul

    R-channel multiply factor.

    green_mul

    G-channel multiply factor.

  • Given an original color image, two differently colored versions of this image can be mixed seamlessly.

    Multiplication factor is between 0.5 and 2.5.

    Declaration

    Objective-C

    + (void)colorChange:(nonnull Mat *)src
                   mask:(nonnull Mat *)mask
                    dst:(nonnull Mat *)dst
                red_mul:(float)red_mul;

    Swift

    class func colorChange(src: Mat, mask: Mat, dst: Mat, red_mul: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src .

    red_mul

    R-channel multiply factor.

  • Given an original color image, two differently colored versions of this image can be mixed seamlessly.

    Multiplication factor is between 0.5 and 2.5.

    Declaration

    Objective-C

    + (void)colorChange:(nonnull Mat *)src
                   mask:(nonnull Mat *)mask
                    dst:(nonnull Mat *)dst;

    Swift

    class func colorChange(src: Mat, mask: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src .
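
    The call itself is straightforward, as sketched below; the colorChange signature is the one documented above, and only the opencv2 module name is an assumption. The mask selects the region whose colors are re-mixed.

    import opencv2

    /// Sketch: boost the red channel inside the masked region.
    func tintRegion(src: Mat, mask: Mat) -> Mat {
        let dst = Mat()
        Photo.colorChange(src: src, mask: mask, dst: dst,
                          red_mul: 1.5, green_mul: 1.0, blue_mul: 1.0)
        return dst
    }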

  • Transforms a color image to a grayscale image. It is a basic tool in digital printing, stylized black-and-white photograph rendering, and in many single-channel image processing applications CITE: CL12.

    This function is to be applied on color images.

    Declaration

    Objective-C

    + (void)decolor:(nonnull Mat *)src
          grayscale:(nonnull Mat *)grayscale
        color_boost:(nonnull Mat *)color_boost;

    Swift

    class func decolor(src: Mat, grayscale: Mat, color_boost: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    grayscale

    Output 8-bit 1-channel image.

    color_boost

    Output 8-bit 3-channel image.
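
    A short usage sketch (the decolor signature is documented above; only the opencv2 module name is an assumption):

    import opencv2

    /// Sketch: contrast-preserving decolorization. Returns the grayscale
    /// result together with the color-boosted companion image.
    func decolorize(src: Mat) -> (grayscale: Mat, colorBoost: Mat) {
        let gray = Mat()
        let boost = Mat()
        Photo.decolor(src: src, grayscale: gray, color_boost: boost)
        return (gray, boost)
    }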

  • Primal-dual algorithm is an algorithm for solving special types of variational problems (that is, finding a function to minimize some functional). As image denoising, in particular, may be seen as a variational problem, the primal-dual algorithm can be used to perform denoising, and this is exactly what is implemented.

    It should be noted that this implementation was taken from the July 2013 blog entry CITE: MA13, which also contained (slightly more general) ready-to-use source code in Python. Subsequently, that code was rewritten in C++ using OpenCV by Vadim Pisarevsky at the end of July 2013 and finally it was slightly adapted by later authors.

    Although a thorough discussion and justification of the algorithm involved may be found in CITE: ChambolleEtAl, it might make sense to skim over it here, following CITE: MA13. To begin with, we consider the 1-byte gray-level images as functions from the rectangular domain of pixels (it may be seen as the set \left\{(x,y)\in\mathbb{N}\times\mathbb{N}\mid 1\leq x\leq n,\;1\leq y\leq m\right\} for some m,\;n\in\mathbb{N}) into \{0,1,\dots,255\}. We shall denote the noised images as f_i and, with this view, given some image x of the same size, we may measure how bad it is by the formula

    \left\|\left\|\nabla x\right\|\right\| + \lambda\sum_i\left\|\left\|x-f_i\right\|\right\|

    Here \|\|\cdot\|\| denotes the L_2-norm and, as you see, the first addend states that we want our image to be smooth (ideally, having zero gradient, thus being constant) and the second states that we want our result to be close to the observations we’ve got. If we treat x as a function, this is exactly the functional we seek to minimize, and here the primal-dual algorithm comes into play.

    Declaration

    Objective-C

    + (void)denoise_TVL1:(nonnull NSArray<Mat *> *)observations
                  result:(nonnull Mat *)result
                  lambda:(double)lambda
                  niters:(int)niters;

    Swift

    class func denoise_TVL1(observations: [Mat], result: Mat, lambda: Double, niters: Int32)
  • Primal-dual algorithm is an algorithm for solving special types of variational problems (that is, finding a function to minimize some functional). As image denoising, in particular, may be seen as a variational problem, the primal-dual algorithm can be used to perform denoising, and this is exactly what is implemented.

    It should be noted that this implementation was taken from the July 2013 blog entry CITE: MA13, which also contained (slightly more general) ready-to-use source code in Python. Subsequently, that code was rewritten in C++ using OpenCV by Vadim Pisarevsky at the end of July 2013 and finally it was slightly adapted by later authors.

    Although a thorough discussion and justification of the algorithm involved may be found in CITE: ChambolleEtAl, it might make sense to skim over it here, following CITE: MA13. To begin with, we consider the 1-byte gray-level images as functions from the rectangular domain of pixels (it may be seen as the set \left\{(x,y)\in\mathbb{N}\times\mathbb{N}\mid 1\leq x\leq n,\;1\leq y\leq m\right\} for some m,\;n\in\mathbb{N}) into \{0,1,\dots,255\}. We shall denote the noised images as f_i and, with this view, given some image x of the same size, we may measure how bad it is by the formula

    \left\|\left\|\nabla x\right\|\right\| + \lambda\sum_i\left\|\left\|x-f_i\right\|\right\|

    Here \|\|\cdot\|\| denotes the L_2-norm and, as you see, the first addend states that we want our image to be smooth (ideally, having zero gradient, thus being constant) and the second states that we want our result to be close to the observations we’ve got. If we treat x as a function, this is exactly the functional we seek to minimize, and here the primal-dual algorithm comes into play.

    Declaration

    Objective-C

    + (void)denoise_TVL1:(nonnull NSArray<Mat *> *)observations
                  result:(nonnull Mat *)result
                  lambda:(double)lambda;

    Swift

    class func denoise_TVL1(observations: [Mat], result: Mat, lambda: Double)
  • Primal-dual algorithm is an algorithm for solving special types of variational problems (that is, finding a function to minimize some functional). As image denoising, in particular, may be seen as a variational problem, the primal-dual algorithm can be used to perform denoising, and this is exactly what is implemented.

    It should be noted that this implementation was taken from the July 2013 blog entry CITE: MA13, which also contained (slightly more general) ready-to-use source code in Python. Subsequently, that code was rewritten in C++ using OpenCV by Vadim Pisarevsky at the end of July 2013 and finally it was slightly adapted by later authors.

    Although a thorough discussion and justification of the algorithm involved may be found in CITE: ChambolleEtAl, it might make sense to skim over it here, following CITE: MA13. To begin with, we consider the 1-byte gray-level images as functions from the rectangular domain of pixels (it may be seen as the set \left\{(x,y)\in\mathbb{N}\times\mathbb{N}\mid 1\leq x\leq n,\;1\leq y\leq m\right\} for some m,\;n\in\mathbb{N}) into \{0,1,\dots,255\}. We shall denote the noised images as f_i and, with this view, given some image x of the same size, we may measure how bad it is by the formula

    \left\|\left\|\nabla x\right\|\right\| + \lambda\sum_i\left\|\left\|x-f_i\right\|\right\|

    Here \|\|\cdot\|\| denotes the L_2-norm and, as you see, the first addend states that we want our image to be smooth (ideally, having zero gradient, thus being constant) and the second states that we want our result to be close to the observations we’ve got. If we treat x as a function, this is exactly the functional we seek to minimize, and here the primal-dual algorithm comes into play.

    Declaration

    Objective-C

    + (void)denoise_TVL1:(nonnull NSArray<Mat *> *)observations
                  result:(nonnull Mat *)result;

    Swift

    class func denoise_TVL1(observations: [Mat], result: Mat)
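
    In practice the call reduces to passing one or more noisy observations of the same scene, as sketched below (the denoise_TVL1 signature is documented above; only the opencv2 module name is an assumption). Per the formula above, a larger lambda keeps the result closer to the observations, while a smaller one smooths more.

    import opencv2

    /// Sketch: TV-L1 denoising of several noisy 8-bit grayscale observations
    /// of the same scene.
    func denoiseTVL1(observations: [Mat]) -> Mat {
        let result = Mat()
        Photo.denoise_TVL1(observations: observations, result: result,
                           lambda: 1.0, niters: 30)
        return result
    }
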
  • This filter enhances the details of a particular image.

    Declaration

    Objective-C

    + (void)detailEnhance:(nonnull Mat *)src
                      dst:(nonnull Mat *)dst
                  sigma_s:(float)sigma_s
                  sigma_r:(float)sigma_r;

    Swift

    class func detailEnhance(src: Mat, dst: Mat, sigma_s: Float, sigma_r: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

    sigma_r

    Range between 0 and 1.

  • This filter enhances the details of a particular image.

    Declaration

    Objective-C

    + (void)detailEnhance:(nonnull Mat *)src
                      dst:(nonnull Mat *)dst
                  sigma_s:(float)sigma_s;

    Swift

    class func detailEnhance(src: Mat, dst: Mat, sigma_s: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

  • This filter enhances the details of a particular image.

    Declaration

    Objective-C

    + (void)detailEnhance:(nonnull Mat *)src dst:(nonnull Mat *)dst;

    Swift

    class func detailEnhance(src: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.
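
    A short usage sketch (the detailEnhance signature is documented above; only the opencv2 module name is an assumption):

    import opencv2

    /// Sketch: enhance fine detail with moderate smoothing parameters.
    func enhanceDetails(src: Mat) -> Mat {
        let dst = Mat()
        Photo.detailEnhance(src: src, dst: dst, sigma_s: 10, sigma_r: 0.15)
        return dst
    }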

  • Filtering is the fundamental operation in image and video processing. Edge-preserving smoothing filters are used in many different applications CITE: EM11 .

    Declaration

    Objective-C

    + (void)edgePreservingFilter:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                           flags:(int)flags
                         sigma_s:(float)sigma_s
                         sigma_r:(float)sigma_r;

    Swift

    class func edgePreservingFilter(src: Mat, dst: Mat, flags: Int32, sigma_s: Float, sigma_r: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output 8-bit 3-channel image.

    flags

    Edge preserving filters: cv::RECURS_FILTER or cv::NORMCONV_FILTER

    sigma_s

    Range between 0 and 200.

    sigma_r

    Range between 0 and 1.

  • Filtering is the fundamental operation in image and video processing. Edge-preserving smoothing filters are used in many different applications CITE: EM11 .

    Declaration

    Objective-C

    + (void)edgePreservingFilter:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                           flags:(int)flags
                         sigma_s:(float)sigma_s;

    Swift

    class func edgePreservingFilter(src: Mat, dst: Mat, flags: Int32, sigma_s: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output 8-bit 3-channel image.

    flags

    Edge preserving filters: cv::RECURS_FILTER or cv::NORMCONV_FILTER

    sigma_s

    Range between 0 and 200.

  • Filtering is the fundamental operation in image and video processing. Edge-preserving smoothing filters are used in many different applications CITE: EM11 .

    Declaration

    Objective-C

    + (void)edgePreservingFilter:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                           flags:(int)flags;

    Swift

    class func edgePreservingFilter(src: Mat, dst: Mat, flags: Int32)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output 8-bit 3-channel image.

    flags

    Edge preserving filters: cv::RECURS_FILTER or cv::NORMCONV_FILTER

  • Filtering is the fundamental operation in image and video processing. Edge-preserving smoothing filters are used in many different applications CITE: EM11 .

    Declaration

    Objective-C

    + (void)edgePreservingFilter:(nonnull Mat *)src dst:(nonnull Mat *)dst;

    Swift

    class func edgePreservingFilter(src: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output 8-bit 3-channel image.
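
    A short usage sketch combining the filter with the flag constants defined at the top of this page (only the opencv2 module name is an assumption):

    import opencv2

    /// Sketch: edge-preserving smoothing using the recursive filter.
    func smoothPreservingEdges(src: Mat) -> Mat {
        let dst = Mat()
        Photo.edgePreservingFilter(src: src, dst: dst,
                                   flags: Photo.RECURS_FILTER,
                                   sigma_s: 60, sigma_r: 0.4)
        return dst
    }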

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                               h:(float)h
              templateWindowSize:(int)templateWindowSize
                searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, h: Float, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    src

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for given pixel. Should be odd. Affect performance linearly: greater searchWindowsSize - greater denoising time. Recommended value 21 pixels

    h

    Parameter regulating filter strength. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                               h:(float)h
              templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, h: Float, templateWindowSize: Int32)

    Parameters

    src

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    h

    Parameter regulating filter strength. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                               h:(float)h;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, h: Float)

    Parameters

    src

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    h

    Parameter regulating filter strength. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src dst:(nonnull Mat *)dst;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                         hVector:(nonnull FloatVector *)hVector
              templateWindowSize:(int)templateWindowSize
                searchWindowSize:(int)searchWindowSize
                        normType:(int)normType;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, hVector: FloatVector, templateWindowSize: Int32, searchWindowSize: Int32, normType: Int32)

    Parameters

    src

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for given pixel. Should be odd. Affect performance linearly: greater searchWindowsSize - greater denoising time. Recommended value 21 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

    normType

    Type of norm used for weight calculation. Can be either NORM_L2 or NORM_L1

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                         hVector:(nonnull FloatVector *)hVector
              templateWindowSize:(int)templateWindowSize
                searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, hVector: FloatVector, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    src

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for given pixel. Should be odd. Affect performance linearly: greater searchWindowsSize - greater denoising time. Recommended value 21 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                         hVector:(nonnull FloatVector *)hVector
              templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, hVector: FloatVector, templateWindowSize: Int32)

    Parameters

    src

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Perform image denoising using Non-local Means Denoising algorithm http://www.ipol.im/pub/algo/bcm_non_local_means_denoising/ with several computational optimizations. Noise is expected to be Gaussian white noise.

    This function is expected to be applied to grayscale images. For colored images look at fastNlMeansDenoisingColored. Advanced usage of this function can be manual denoising of a colored image in different colorspaces. Such an approach is used in fastNlMeansDenoisingColored by converting the image to CIELAB colorspace and then separately denoising the L and AB components with different h parameters.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoising:(nonnull Mat *)src
                             dst:(nonnull Mat *)dst
                         hVector:(nonnull FloatVector *)hVector;

    Swift

    class func fastNlMeansDenoising(src: Mat, dst: Mat, hVector: FloatVector)

    Parameters

    src

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel image.

    dst

    Output image with the same size and type as src .

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise
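
    A short usage sketch for the scalar-h variant documented above (only the opencv2 module name is an assumption):

    import opencv2

    /// Sketch: non-local means denoising of an 8-bit grayscale image,
    /// using the recommended 7/21 window sizes.
    func denoiseGray(src: Mat) -> Mat {
        let dst = Mat()
        Photo.fastNlMeansDenoising(src: src, dst: dst, h: 10,
                                   templateWindowSize: 7, searchWindowSize: 21)
        return dst
    }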

  • Modification of fastNlMeansDenoising function for colored images

    The function converts the image to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoising function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColored:(nonnull Mat *)src
                                    dst:(nonnull Mat *)dst
                                      h:(float)h
                                 hColor:(float)hColor
                     templateWindowSize:(int)templateWindowSize
                       searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoisingColored(src: Mat, dst: Mat, h: Float, hColor: Float, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for given pixel. Should be odd. Affect performance linearly: greater searchWindowsSize - greater denoising time. Recommended value 21 pixels

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

    hColor

    The same as h but for color components. For most images value equals 10 will be enough to remove colored noise and do not distort colors

  • Modification of fastNlMeansDenoising function for colored images

    The function converts the image to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoising function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColored:(nonnull Mat *)src
                                    dst:(nonnull Mat *)dst
                                      h:(float)h
                                 hColor:(float)hColor
                     templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoisingColored(src: Mat, dst: Mat, h: Float, hColor: Float, templateWindowSize: Int32)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src .

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

    hColor

    The same as h but for color components. For most images value equals 10 will be enough to remove colored noise and do not distort colors

  • Modification of fastNlMeansDenoising function for colored images

    The function converts the image to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoising function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColored:(nonnull Mat *)src
                                    dst:(nonnull Mat *)dst
                                      h:(float)h
                                 hColor:(float)hColor;

    Swift

    class func fastNlMeansDenoisingColored(src: Mat, dst: Mat, h: Float, hColor: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src .

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

    hColor

    The same as h but for color components. For most images value equals 10 will be enough to remove colored noise and do not distort colors

  • Modification of fastNlMeansDenoising function for colored images

    The function converts the image to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoising function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColored:(nonnull Mat *)src
                                    dst:(nonnull Mat *)dst
                                      h:(float)h;

    Swift

    class func fastNlMeansDenoisingColored(src: Mat, dst: Mat, h: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src .

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Modification of fastNlMeansDenoising function for colored images

    The function converts the image to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoising function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColored:(nonnull Mat *)src dst:(nonnull Mat *)dst;

    Swift

    class func fastNlMeansDenoisingColored(src: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src .
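
    A short usage sketch for the colored variant (the signature is documented above; only the opencv2 module name is an assumption):

    import opencv2

    /// Sketch: denoise an 8-bit 3-channel image; h controls the luminance
    /// component and hColor the color components.
    func denoiseColor(src: Mat) -> Mat {
        let dst = Mat()
        Photo.fastNlMeansDenoisingColored(src: src, dst: dst, h: 10, hColor: 10,
                                          templateWindowSize: 7, searchWindowSize: 21)
        return dst
    }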

  • Modification of fastNlMeansDenoisingMulti function for colored image sequences

    The function converts images to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoisingMulti function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColoredMulti:(nonnull NSArray<Mat *> *)srcImgs
                                         dst:(nonnull Mat *)dst
                           imgToDenoiseIndex:(int)imgToDenoiseIndex
                          temporalWindowSize:(int)temporalWindowSize
                                           h:(float)h
                                      hColor:(float)hColor
                          templateWindowSize:(int)templateWindowSize
                            searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoisingColoredMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float, hColor: Float, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 3-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for given pixel. Should be odd. Affect performance linearly: greater searchWindowsSize - greater denoising time. Recommended value 21 pixels

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise.

    hColor

    The same as h but for color components.

  • Modification of fastNlMeansDenoisingMulti function for colored image sequences

    The function converts images to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoisingMulti function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColoredMulti:(nonnull NSArray<Mat *> *)srcImgs
                                         dst:(nonnull Mat *)dst
                           imgToDenoiseIndex:(int)imgToDenoiseIndex
                          temporalWindowSize:(int)temporalWindowSize
                                           h:(float)h
                                      hColor:(float)hColor
                          templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoisingColoredMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float, hColor: Float, templateWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 3-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise.

    hColor

    The same as h but for color components.

  • Modification of fastNlMeansDenoisingMulti function for colored image sequences

    The function converts images to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoisingMulti function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColoredMulti:(nonnull NSArray<Mat *> *)srcImgs
                                         dst:(nonnull Mat *)dst
                           imgToDenoiseIndex:(int)imgToDenoiseIndex
                          temporalWindowSize:(int)temporalWindowSize
                                           h:(float)h
                                      hColor:(float)hColor;

    Swift

    class func fastNlMeansDenoisingColoredMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float, hColor: Float)

    Parameters

    srcImgs

    Input 8-bit 3-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise.

    hColor

    The same as h but for color components.

  • Modification of fastNlMeansDenoisingMulti function for colored image sequences

    The function converts images to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoisingMulti function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColoredMulti:(nonnull NSArray<Mat *> *)srcImgs
                                         dst:(nonnull Mat *)dst
                           imgToDenoiseIndex:(int)imgToDenoiseIndex
                          temporalWindowSize:(int)temporalWindowSize
                                           h:(float)h;

    Swift

    class func fastNlMeansDenoisingColoredMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float)

    Parameters

    srcImgs

    Input 8-bit 3-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    h

    Parameter regulating filter strength for luminance component. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise.

  • Modification of fastNlMeansDenoisingMulti function for colored image sequences

    The function converts images to CIELAB colorspace and then separately denoises the L and AB components with given h parameters using the fastNlMeansDenoisingMulti function.

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingColoredMulti:(nonnull NSArray<Mat *> *)srcImgs
                                         dst:(nonnull Mat *)dst
                           imgToDenoiseIndex:(int)imgToDenoiseIndex
                          temporalWindowSize:(int)temporalWindowSize;

    Swift

    class func fastNlMeansDenoisingColoredMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 3-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                                    h:(float)h
                   templateWindowSize:(int)templateWindowSize
                     searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for a given pixel. Should be odd. Affects performance linearly: greater searchWindowSize - greater denoising time. Recommended value 21 pixels

    h

    Parameter regulating filter strength. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise
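
    A minimal Swift sketch for the grayscale variant above, using the recommended window sizes. The opencv2 module name and the placeholder single-channel frames are assumptions for illustration.

    import opencv2

    // Placeholder 8-bit 1-channel frames; real input would be consecutive video frames.
    let frames: [Mat] = (0..<5).map { _ in Mat(rows: 480, cols: 640, type: CvType.CV_8UC1) }
    let denoised = Mat()

    Photo.fastNlMeansDenoisingMulti(srcImgs: frames,
                                    dst: denoised,
                                    imgToDenoiseIndex: 2,
                                    temporalWindowSize: 3,
                                    h: 3,
                                    templateWindowSize: 7,
                                    searchWindowSize: 21)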

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                                    h:(float)h
                   templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float, templateWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    h

    Parameter regulating filter strength. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                                    h:(float)h;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, h: Float)

    Parameters

    srcImgs

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    h

    Parameter regulating filter strength. Bigger h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                              hVector:(nonnull FloatVector *)hVector
                   templateWindowSize:(int)templateWindowSize
                     searchWindowSize:(int)searchWindowSize
                             normType:(int)normType;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, hVector: FloatVector, templateWindowSize: Int32, searchWindowSize: Int32, normType: Int32)

    Parameters

    srcImgs

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for a given pixel. Should be odd. Affects performance linearly: greater searchWindowSize - greater denoising time. Recommended value 21 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

    normType

    Type of norm used for weight calculation. Can be either NORM_L2 or NORM_L1
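
    A sketch of the per-channel variant above, assuming the FloatVector wrapper can be built from a Swift [Float] array and that the norm constants are exposed on the Core class; check the bindings in your project for the exact spellings.

    import opencv2

    let frames: [Mat] = (0..<5).map { _ in Mat(rows: 480, cols: 640, type: CvType.CV_8UC3) }
    let denoised = Mat()
    // One filter-strength value per channel (assumed FloatVector initializer).
    let strengths = FloatVector([3, 3, 3])

    Photo.fastNlMeansDenoisingMulti(srcImgs: frames,
                                    dst: denoised,
                                    imgToDenoiseIndex: 2,
                                    temporalWindowSize: 3,
                                    hVector: strengths,
                                    templateWindowSize: 7,
                                    searchWindowSize: 21,
                                    normType: Core.NORM_L2)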

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                              hVector:(nonnull FloatVector *)hVector
                   templateWindowSize:(int)templateWindowSize
                     searchWindowSize:(int)searchWindowSize;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, hVector: FloatVector, templateWindowSize: Int32, searchWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    searchWindowSize

    Size in pixels of the window that is used to compute weighted average for a given pixel. Should be odd. Affects performance linearly: greater searchWindowSize - greater denoising time. Recommended value 21 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                              hVector:(nonnull FloatVector *)hVector
                   templateWindowSize:(int)templateWindowSize;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, hVector: FloatVector, templateWindowSize: Int32)

    Parameters

    srcImgs

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    templateWindowSize

    Size in pixels of the template patch that is used to compute weights. Should be odd. Recommended value 7 pixels

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Modification of fastNlMeansDenoising function for image sequences where consecutive images have been captured in a small period of time, for example video. This version of the function is for grayscale images or for manual manipulation with colorspaces. For more details see http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.6394

    Declaration

    Objective-C

    + (void)fastNlMeansDenoisingMulti:(nonnull NSArray<Mat *> *)srcImgs
                                  dst:(nonnull Mat *)dst
                    imgToDenoiseIndex:(int)imgToDenoiseIndex
                   temporalWindowSize:(int)temporalWindowSize
                              hVector:(nonnull FloatVector *)hVector;

    Swift

    class func fastNlMeansDenoisingMulti(srcImgs: [Mat], dst: Mat, imgToDenoiseIndex: Int32, temporalWindowSize: Int32, hVector: FloatVector)

    Parameters

    srcImgs

    Input 8-bit or 16-bit (only with NORM_L1) 1-channel, 2-channel, 3-channel or 4-channel images sequence. All images should have the same type and size.

    imgToDenoiseIndex

    Target image to denoise index in srcImgs sequence

    temporalWindowSize

    Number of surrounding images to use for target image denoising. Should be odd. Images from imgToDenoiseIndex - temporalWindowSize / 2 to imgToDenoiseIndex + temporalWindowSize / 2 from srcImgs will be used to denoise srcImgs[imgToDenoiseIndex] image.

    dst

    Output image with the same size and type as srcImgs images.

    hVector

    Array of parameters regulating filter strength, either one parameter applied to all channels or one per channel in dst. Big h value perfectly removes noise but also removes image details, smaller h value preserves details but also preserves some noise

  • Applying an appropriate non-linear transformation to the gradient field inside the selection and then integrating back with a Poisson solver locally modifies the apparent illumination of an image.

    This is useful to highlight under-exposed foreground objects or to reduce specular reflections.

    Declaration

    Objective-C

    + (void)illuminationChange:(nonnull Mat *)src
                          mask:(nonnull Mat *)mask
                           dst:(nonnull Mat *)dst
                         alpha:(float)alpha
                          beta:(float)beta;

    Swift

    class func illuminationChange(src: Mat, mask: Mat, dst: Mat, alpha: Float, beta: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

    alpha

    Value ranges between 0-2.

    beta

    Value ranges between 0-2.
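
    A minimal Swift sketch for illuminationChange, using placeholder images (the opencv2 module name and sizes are assumptions). The mask should be non-zero over the region whose illumination should change.

    import opencv2

    let src = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)
    // Non-zero pixels select the under-exposed region to brighten.
    let mask = Mat(rows: 480, cols: 640, type: CvType.CV_8UC1)
    let dst = Mat()

    Photo.illuminationChange(src: src, mask: mask, dst: dst, alpha: 0.2, beta: 0.4)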

  • Applying an appropriate non-linear transformation to the gradient field inside the selection and then integrating back with a Poisson solver locally modifies the apparent illumination of an image.

    This is useful to highlight under-exposed foreground objects or to reduce specular reflections.

    Declaration

    Objective-C

    + (void)illuminationChange:(nonnull Mat *)src
                          mask:(nonnull Mat *)mask
                           dst:(nonnull Mat *)dst
                         alpha:(float)alpha;

    Swift

    class func illuminationChange(src: Mat, mask: Mat, dst: Mat, alpha: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

    alpha

    Value ranges between 0-2.

  • Applying an appropriate non-linear transformation to the gradient field inside the selection and then integrating back with a Poisson solver locally modifies the apparent illumination of an image.

    This is useful to highlight under-exposed foreground objects or to reduce specular reflections.

    Declaration

    Objective-C

    + (void)illuminationChange:(nonnull Mat *)src
                          mask:(nonnull Mat *)mask
                           dst:(nonnull Mat *)dst;

    Swift

    class func illuminationChange(src: Mat, mask: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

  • Restores the selected region in an image using the region neighborhood.

    The function reconstructs the selected image area from the pixels near the area boundary. The function may be used to remove dust and scratches from a scanned photo, or to remove undesirable objects from still images or video. See http://en.wikipedia.org/wiki/Inpainting for more details.

    @note An example using the inpainting technique can be found at opencv_source_code/samples/cpp/inpaint.cpp, and a Python example at opencv_source_code/samples/python/inpaint.py

    Declaration

    Objective-C

    + (void)inpaint:(nonnull Mat *)src
          inpaintMask:(nonnull Mat *)inpaintMask
                  dst:(nonnull Mat *)dst
        inpaintRadius:(double)inpaintRadius
                flags:(int)flags;

    Swift

    class func inpaint(src: Mat, inpaintMask: Mat, dst: Mat, inpaintRadius: Double, flags: Int32)

    Parameters

    src

    Input 8-bit, 16-bit unsigned or 32-bit float 1-channel or 8-bit 3-channel image.

    inpaintMask

    Inpainting mask, 8-bit 1-channel image. Non-zero pixels indicate the area that needs to be inpainted.

    dst

    Output image with the same size and type as src .

    inpaintRadius

    Radius of a circular neighborhood of each point inpainted that is considered by the algorithm.

    flags

    Inpainting method that could be cv::INPAINT_NS or cv::INPAINT_TELEA
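
    A minimal Swift sketch for inpaint, assuming the opencv2 module and placeholder Mats. The mask marks the defects to remove; INPAINT_TELEA is one of the two documented methods.

    import opencv2

    let photo = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)
    // 8-bit 1-channel mask: non-zero pixels mark scratches or unwanted objects.
    let defects = Mat(rows: 480, cols: 640, type: CvType.CV_8UC1)
    let restored = Mat()

    Photo.inpaint(src: photo,
                  inpaintMask: defects,
                  dst: restored,
                  inpaintRadius: 3,
                  flags: Photo.INPAINT_TELEA)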

  • Pencil-like non-photorealistic line drawing

    Declaration

    Objective-C

    + (void)pencilSketch:(nonnull Mat *)src
                    dst1:(nonnull Mat *)dst1
                    dst2:(nonnull Mat *)dst2
                 sigma_s:(float)sigma_s
                 sigma_r:(float)sigma_r
            shade_factor:(float)shade_factor;

    Swift

    class func pencilSketch(src: Mat, dst1: Mat, dst2: Mat, sigma_s: Float, sigma_r: Float, shade_factor: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst1

    Output 8-bit 1-channel image.

    dst2

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

    sigma_r

    Range between 0 and 1.

    shade_factor

    Range between 0 and 0.1.
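
    A minimal Swift sketch for pencilSketch with mid-range parameter values chosen for illustration (within the documented ranges); the opencv2 module name and the input Mat are assumptions.

    import opencv2

    let src = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)
    let graySketch = Mat()   // receives the 8-bit 1-channel pencil drawing
    let colorSketch = Mat()  // receives the color rendering, same size and type as src

    Photo.pencilSketch(src: src,
                       dst1: graySketch,
                       dst2: colorSketch,
                       sigma_s: 60,
                       sigma_r: 0.07,
                       shade_factor: 0.02)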

  • Pencil-like non-photorealistic line drawing

    Declaration

    Objective-C

    + (void)pencilSketch:(nonnull Mat *)src
                    dst1:(nonnull Mat *)dst1
                    dst2:(nonnull Mat *)dst2
                 sigma_s:(float)sigma_s
                 sigma_r:(float)sigma_r;

    Swift

    class func pencilSketch(src: Mat, dst1: Mat, dst2: Mat, sigma_s: Float, sigma_r: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst1

    Output 8-bit 1-channel image.

    dst2

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

    sigma_r

    Range between 0 and 1.

  • Pencil-like non-photorealistic line drawing

    Declaration

    Objective-C

    + (void)pencilSketch:(nonnull Mat *)src
                    dst1:(nonnull Mat *)dst1
                    dst2:(nonnull Mat *)dst2
                 sigma_s:(float)sigma_s;

    Swift

    class func pencilSketch(src: Mat, dst1: Mat, dst2: Mat, sigma_s: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst1

    Output 8-bit 1-channel image.

    dst2

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

  • Pencil-like non-photorealistic line drawing

    Declaration

    Objective-C

    + (void)pencilSketch:(nonnull Mat *)src
                    dst1:(nonnull Mat *)dst1
                    dst2:(nonnull Mat *)dst2;

    Swift

    class func pencilSketch(src: Mat, dst1: Mat, dst2: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst1

    Output 8-bit 1-channel image.

    dst2

    Output image with the same size and type as src.

  • Image editing tasks concern either global changes (color/intensity corrections, filters, deformations) or local changes confined to a selection. Here we are interested in achieving local changes, ones that are restricted to a region manually selected (ROI), in a seamless and effortless manner. The extent of the changes ranges from slight distortions to complete replacement by novel content [PM03].

    Declaration

    Objective-C

    + (void)seamlessClone:(nonnull Mat *)src
                      dst:(nonnull Mat *)dst
                     mask:(nonnull Mat *)mask
                        p:(nonnull Point2i *)p
                    blend:(nonnull Mat *)blend
                    flags:(int)flags;

    Swift

    class func seamlessClone(src: Mat, dst: Mat, mask: Mat, p: Point2i, blend: Mat, flags: Int32)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    p

    Point in dst image where object is placed.

    blend

    Output image with the same size and type as dst.

    flags

    Cloning method that could be cv::NORMAL_CLONE, cv::MIXED_CLONE or cv::MONOCHROME_TRANSFER
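
    A minimal Swift sketch for seamlessClone, assuming the opencv2 module; the Mat(rows:cols:type:scalar:), Scalar and Point2i initializer spellings are assumed from the bindings, and the mask here simply covers the whole patch.

    import opencv2

    let patch = Mat(rows: 100, cols: 100, type: CvType.CV_8UC3)       // object to insert
    let background = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)  // destination image
    // Mask covering the region of the patch to clone (all-on here for simplicity).
    let mask = Mat(rows: 100, cols: 100, type: CvType.CV_8UC1, scalar: Scalar(255, 255, 255))
    let blended = Mat()
    let center = Point2i(x: 320, y: 240)  // where the patch centre lands in the destination

    Photo.seamlessClone(src: patch,
                        dst: background,
                        mask: mask,
                        p: center,
                        blend: blended,
                        flags: Photo.NORMAL_CLONE)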

  • Stylization aims to produce digital imagery with a wide variety of effects not focused on photorealism. Edge-aware filters are ideal for stylization, as they can abstract regions of low contrast while preserving, or enhancing, high-contrast features.

    Declaration

    Objective-C

    + (void)stylization:(nonnull Mat *)src
                    dst:(nonnull Mat *)dst
                sigma_s:(float)sigma_s
                sigma_r:(float)sigma_r;

    Swift

    class func stylization(src: Mat, dst: Mat, sigma_s: Float, sigma_r: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

    sigma_r

    Range between 0 and 1.
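
    A minimal Swift sketch for stylization; the opencv2 module name and the parameter values are illustrative assumptions within the documented ranges.

    import opencv2

    let src = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)
    let stylized = Mat()

    // Larger sigma_s smooths over larger neighbourhoods; sigma_r controls how strongly edges are kept.
    Photo.stylization(src: src, dst: stylized, sigma_s: 60, sigma_r: 0.45)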

  • Stylization aims to produce digital imagery with a wide variety of effects not focused on photorealism. Edge-aware filters are ideal for stylization, as they can abstract regions of low contrast while preserving, or enhancing, high-contrast features.

    Declaration

    Objective-C

    + (void)stylization:(nonnull Mat *)src
                    dst:(nonnull Mat *)dst
                sigma_s:(float)sigma_s;

    Swift

    class func stylization(src: Mat, dst: Mat, sigma_s: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.

    sigma_s

    Range between 0 and 200.

  • Stylization aims to produce digital imagery with a wide variety of effects not focused on photorealism. Edge-aware filters are ideal for stylization, as they can abstract regions of low contrast while preserving, or enhancing, high-contrast features.

    Declaration

    Objective-C

    + (void)stylization:(nonnull Mat *)src dst:(nonnull Mat *)dst;

    Swift

    class func stylization(src: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    dst

    Output image with the same size and type as src.

  • By retaining only the gradients at edge locations, before integrating with the Poisson solver, one washes out the texture of the selected region, giving its contents a flat aspect. Here the Canny edge detector is used.

    @note The algorithm assumes that the color of the source image is close to that of the destination. This assumption means that when the colors don’t match, the source image color gets tinted toward the color of the destination image.

    Declaration

    Objective-C

    + (void)textureFlattening:(nonnull Mat *)src
                         mask:(nonnull Mat *)mask
                          dst:(nonnull Mat *)dst
                low_threshold:(float)low_threshold
               high_threshold:(float)high_threshold
                  kernel_size:(int)kernel_size;

    Swift

    class func textureFlattening(src: Mat, mask: Mat, dst: Mat, low_threshold: Float, high_threshold: Float, kernel_size: Int32)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

    low_threshold

    Range from 0 to 100.

    high_threshold

    Value > 100.

    kernel_size

    The size of the Sobel kernel to be used.
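
    A minimal Swift sketch for textureFlattening, with threshold values picked to match the documented ranges; the opencv2 module and the placeholder Mats are assumptions.

    import opencv2

    let src = Mat(rows: 480, cols: 640, type: CvType.CV_8UC3)
    // Non-zero pixels select the region whose texture should be flattened.
    let mask = Mat(rows: 480, cols: 640, type: CvType.CV_8UC1)
    let flattened = Mat()

    Photo.textureFlattening(src: src,
                            mask: mask,
                            dst: flattened,
                            low_threshold: 50,
                            high_threshold: 110,
                            kernel_size: 3)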

  • By retaining only the gradients at edge locations, before integrating with the Poisson solver, one washes out the texture of the selected region, giving its contents a flat aspect. Here the Canny edge detector is used.

    @note The algorithm assumes that the color of the source image is close to that of the destination. This assumption means that when the colors don’t match, the source image color gets tinted toward the color of the destination image.

    Declaration

    Objective-C

    + (void)textureFlattening:(nonnull Mat *)src
                         mask:(nonnull Mat *)mask
                          dst:(nonnull Mat *)dst
                low_threshold:(float)low_threshold
               high_threshold:(float)high_threshold;

    Swift

    class func textureFlattening(src: Mat, mask: Mat, dst: Mat, low_threshold: Float, high_threshold: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

    low_threshold

    Range from 0 to 100.

    high_threshold

    Value > 100.

  • By retaining only the gradients at edge locations, before integrating with the Poisson solver, one washes out the texture of the selected region, giving its contents a flat aspect. Here the Canny edge detector is used.

    @note The algorithm assumes that the color of the source image is close to that of the destination. This assumption means that when the colors don’t match, the source image color gets tinted toward the color of the destination image.

    Declaration

    Objective-C

    + (void)textureFlattening:(nonnull Mat *)src
                         mask:(nonnull Mat *)mask
                          dst:(nonnull Mat *)dst
                low_threshold:(float)low_threshold;

    Swift

    class func textureFlattening(src: Mat, mask: Mat, dst: Mat, low_threshold: Float)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.

    low_threshold

    Range from 0 to 100.

  • By retaining only the gradients at edge locations, before integrating with the Poisson solver, one washes out the texture of the selected region, giving its contents a flat aspect. Here the Canny edge detector is used.

    @note The algorithm assumes that the color of the source image is close to that of the destination. This assumption means that when the colors don’t match, the source image color gets tinted toward the color of the destination image.

    Declaration

    Objective-C

    + (void)textureFlattening:(nonnull Mat *)src
                         mask:(nonnull Mat *)mask
                          dst:(nonnull Mat *)dst;

    Swift

    class func textureFlattening(src: Mat, mask: Mat, dst: Mat)

    Parameters

    src

    Input 8-bit 3-channel image.

    mask

    Input 8-bit 1 or 3-channel image.

    dst

    Output image with the same size and type as src.