EM

Objective-C

@interface EM : StatModel

Swift

class EM : StatModel

The class implements the Expectation Maximization algorithm.

See

REF: ml_intro_em

Member of Ml

Class Constants

  • Declaration

    Objective-C

    @property (class, readonly) int DEFAULT_NCLUSTERS

    Swift

    class var DEFAULT_NCLUSTERS: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int DEFAULT_MAX_ITERS

    Swift

    class var DEFAULT_MAX_ITERS: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int START_E_STEP

    Swift

    class var START_E_STEP: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int START_M_STEP

    Swift

    class var START_M_STEP: Int32 { get }
  • Declaration

    Objective-C

    @property (class, readonly) int START_AUTO_STEP

    Swift

    class var START_AUTO_STEP: Int32 { get }

Methods

  • Returns the cluster centers (means of the Gaussian mixture)

     Returns a matrix with the number of rows equal to the number of mixtures and the number of
     columns equal to the space dimensionality.
    

    Declaration

    Objective-C

    - (nonnull Mat *)getMeans;

    Swift

    func getMeans() -> Mat
  • Returns weights of the mixtures

     Returns a vector with the number of elements equal to the number of mixtures.
    

    Declaration

    Objective-C

    - (nonnull Mat *)getWeights;

    Swift

    func getWeights() -> Mat
  • Creates an empty EM model. The model should then be trained using the StatModel::train(traindata, flags) method. Alternatively, you can use one of the EM::train* methods or load it from a file using Algorithm::load<EM>(filename).

    Declaration

    Objective-C

    + (nonnull EM *)create;

    Swift

    class func create() -> EM
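
    A minimal end-to-end sketch of the workflow described above, using only methods documented on this page plus basic Mat helpers. This is an illustration, not a definitive recipe: the `opencv2` module name, `Core.randu`, and the `Mat` initializer are assumed from the broader OpenCV Swift API, and the sample count, dimensionality, and cluster count are arbitrary toy values.

```swift
import opencv2

// Hypothetical toy data: 100 two-dimensional samples with uniformly random
// values. EM expects a one-channel matrix with one sample per row; CV_64F
// avoids an internal conversion.
let samples = Mat(rows: 100, cols: 2, type: CvType.CV_64F)
Core.randu(dst: samples, low: 0, high: 1)

// create() returns an untrained model; configure it before training.
let em = EM.create()
em.setClustersNumber(val: 3)

// trainEM initializes the parameters with k-means, then iterates E/M steps.
let logLikelihoods = Mat(), labels = Mat(), probs = Mat()
let trained = em.trainEM(samples: samples, logLikelihoods: logLikelihoods,
                         labels: labels, probs: probs)

// After training, the mixture parameters can be read back:
// getMeans() is nclusters x dims, getWeights() is 1 x nclusters.
if trained {
    print(em.getMeans(), em.getWeights())
}
```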
  • Loads and creates a serialized EM from a file

    Use EM::save to serialize and store an EM to disk. Load the EM from this file again by calling this function with the path to the file. Optionally, specify the node of the file containing the classifier.

    Declaration

    Objective-C

    + (nonnull EM *)load:(nonnull NSString *)filepath
                nodeName:(nonnull NSString *)nodeName;

    Swift

    class func load(filepath: String, nodeName: String) -> EM

    Parameters

    filepath

    path to serialized EM

    nodeName

    name of node containing the classifier

  • Loads and creates a serialized EM from a file

    Use EM::save to serialize and store an EM to disk. Load the EM from this file again by calling this function with the path to the file.

    Declaration

    Objective-C

    + (nonnull EM *)load:(nonnull NSString *)filepath;

    Swift

    class func load(filepath: String) -> EM

    Parameters

    filepath

    path to serialized EM

  • Declaration

    Objective-C

    - (nonnull TermCriteria *)getTermCriteria;

    Swift

    func getTermCriteria() -> TermCriteria
  • Returns a likelihood logarithm value and an index of the most probable mixture component for the given sample.

    Declaration

    Objective-C

    - (nonnull Double2 *)predict2:(nonnull Mat *)sample probs:(nonnull Mat *)probs;

    Swift

    func predict2(sample: Mat, probs: Mat) -> Double2

    Parameters

    sample

    A sample for classification. It should be a one-channel matrix of

    1 \times dims
    or
    dims \times 1
    size.

    probs

    Optional output matrix that contains posterior probabilities of each component given the sample. It has

    1 \times nclusters
    size and CV_64FC1 type.

    The method returns a two-element double vector. The zero-th element is the likelihood logarithm value for the sample, and the first element is the index of the most probable mixture component for the given sample.
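
    In the notation used throughout this page (weights \pi_k, means a_k, covariances S_k), the two returned quantities follow the standard Gaussian-mixture form; this is the textbook identity, not quoted from the OpenCV source:

```latex
% predict2 for a sample x returns:
%   element 0: the log-likelihood of x under the mixture
%   element 1: the index of the most probable component
\log L(x) = \log \sum_{k=1}^{\text{nclusters}} \pi_k\, \mathcal{N}(x \mid a_k, S_k),
\qquad
k^{*}(x) = \arg\max_k \; \pi_k\, \mathcal{N}(x \mid a_k, S_k)
```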

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples
                means0:(nonnull Mat *)means0
                 covs0:(nonnull Mat *)covs0
              weights0:(nonnull Mat *)weights0
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels
                 probs:(nonnull Mat *)probs;

    Swift

    func trainE(samples: Mat, means0: Mat, covs0: Mat, weights0: Mat, logLikelihoods: Mat, labels: Mat, probs: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    covs0

    The vector of initial covariance matrices

    S_k
    of mixture components. Each of covariance matrices is a one-channel matrix of
    dims \times dims
    size. If the matrices do not have CV_64F type they will be converted to the inner matrices of such type for further computing.

    weights0

    Initial weights

    \pi_k
    of mixture components. It should be a one-channel floating-point matrix with
    1 \times nclusters
    or
    nclusters \times 1
    size.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

    probs

    The optional output matrix that contains posterior probabilities of each Gaussian mixture component given each sample. It has

    nsamples \times nclusters
    size and CV_64FC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples
                means0:(nonnull Mat *)means0
                 covs0:(nonnull Mat *)covs0
              weights0:(nonnull Mat *)weights0
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels;

    Swift

    func trainE(samples: Mat, means0: Mat, covs0: Mat, weights0: Mat, logLikelihoods: Mat, labels: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    covs0

    The vector of initial covariance matrices

    S_k
    of mixture components. Each of covariance matrices is a one-channel matrix of
    dims \times dims
    size. If the matrices do not have CV_64F type they will be converted to the inner matrices of such type for further computing.

    weights0

    Initial weights

    \pi_k
    of mixture components. It should be a one-channel floating-point matrix with
    1 \times nclusters
    or
    nclusters \times 1
    size.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples
                means0:(nonnull Mat *)means0
                 covs0:(nonnull Mat *)covs0
              weights0:(nonnull Mat *)weights0
        logLikelihoods:(nonnull Mat *)logLikelihoods;

    Swift

    func trainE(samples: Mat, means0: Mat, covs0: Mat, weights0: Mat, logLikelihoods: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    covs0

    The vector of initial covariance matrices

    S_k
    of mixture components. Each of covariance matrices is a one-channel matrix of
    dims \times dims
    size. If the matrices do not have CV_64F type they will be converted to the inner matrices of such type for further computing.

    weights0

    Initial weights

    \pi_k
    of mixture components. It should be a one-channel floating-point matrix with
    1 \times nclusters
    or
    nclusters \times 1
    size.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples
            means0:(nonnull Mat *)means0
             covs0:(nonnull Mat *)covs0
          weights0:(nonnull Mat *)weights0;

    Swift

    func trainE(samples: Mat, means0: Mat, covs0: Mat, weights0: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    covs0

    The vector of initial covariance matrices

    S_k
    of mixture components. Each of covariance matrices is a one-channel matrix of
    dims \times dims
    size. If the matrices do not have CV_64F type they will be converted to the inner matrices of such type for further computing.

    weights0

    Initial weights

    \pi_k
    of mixture components. It should be a one-channel floating-point matrix with
    1 \times nclusters
    or
    nclusters \times 1
    size.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples
            means0:(nonnull Mat *)means0
             covs0:(nonnull Mat *)covs0;

    Swift

    func trainE(samples: Mat, means0: Mat, covs0: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    covs0

    The vector of initial covariance matrices

    S_k
    of mixture components. Each of covariance matrices is a one-channel matrix of
    dims \times dims
    size. If the matrices do not have CV_64F type they will be converted to the inner matrices of such type for further computing.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. You need to provide initial means a_k of
     mixture components. Optionally you can pass initial weights \pi_k and covariance matrices
     S_k of mixture components.
    

    Declaration

    Objective-C

    - (BOOL)trainE:(nonnull Mat *)samples means0:(nonnull Mat *)means0;

    Swift

    func trainE(samples: Mat, means0: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    means0

    Initial means

    a_k
    of mixture components. It is a one-channel matrix of
    nclusters \times dims
    size. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. Initial values of the model parameters will be
     estimated by the k-means algorithm.

     Unlike many of the ML models, EM is an unsupervised learning algorithm and it does not take
     responses (class labels or function values) as input. Instead, it computes the *Maximum
     Likelihood Estimate* of the Gaussian mixture parameters from an input sample set, stores all the
     parameters inside the structure: p_{i,k} in probs, a_k in means, S_k in
     covs[k], \pi_k in weights, and optionally computes the output "class label" for each
     sample: \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N (indices of the most
     probable mixture component for each sample).

     The trained model can be used further for prediction, just like any other classifier. The
     trained model is similar to the NormalBayesClassifier.
    

    Declaration

    Objective-C

    - (BOOL)trainEM:(nonnull Mat *)samples
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels
                 probs:(nonnull Mat *)probs;

    Swift

    func trainEM(samples: Mat, logLikelihoods: Mat, labels: Mat, probs: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

    probs

    The optional output matrix that contains posterior probabilities of each Gaussian mixture component given each sample. It has

    nsamples \times nclusters
    size and CV_64FC1 type.
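
    The two alternating steps that trainEM iterates can be written compactly. These are the standard EM updates for a Gaussian mixture, expressed in this page's symbols; they are a reference sketch, not a quotation of the OpenCV implementation:

```latex
% E-step: responsibilities p_{i,k} from the current parameters
p_{i,k} = \frac{\pi_k \,\mathcal{N}(x_i \mid a_k, S_k)}
               {\sum_j \pi_j \,\mathcal{N}(x_i \mid a_j, S_j)}

% M-step: re-estimate weights, means, and covariances from p_{i,k}
\pi_k = \frac{1}{N}\sum_i p_{i,k},\qquad
a_k = \frac{\sum_i p_{i,k}\, x_i}{\sum_i p_{i,k}},\qquad
S_k = \frac{\sum_i p_{i,k}\,(x_i - a_k)(x_i - a_k)^{\top}}{\sum_i p_{i,k}}
```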

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. Initial values of the model parameters will be
     estimated by the k-means algorithm.

     Unlike many of the ML models, EM is an unsupervised learning algorithm and it does not take
     responses (class labels or function values) as input. Instead, it computes the *Maximum
     Likelihood Estimate* of the Gaussian mixture parameters from an input sample set, stores all the
     parameters inside the structure: p_{i,k} in probs, a_k in means, S_k in
     covs[k], \pi_k in weights, and optionally computes the output "class label" for each
     sample: \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N (indices of the most
     probable mixture component for each sample).

     The trained model can be used further for prediction, just like any other classifier. The
     trained model is similar to the NormalBayesClassifier.
    

    Declaration

    Objective-C

    - (BOOL)trainEM:(nonnull Mat *)samples
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels;

    Swift

    func trainEM(samples: Mat, logLikelihoods: Mat, labels: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. Initial values of the model parameters will be
     estimated by the k-means algorithm.

     Unlike many of the ML models, EM is an unsupervised learning algorithm and it does not take
     responses (class labels or function values) as input. Instead, it computes the *Maximum
     Likelihood Estimate* of the Gaussian mixture parameters from an input sample set, stores all the
     parameters inside the structure: p_{i,k} in probs, a_k in means, S_k in
     covs[k], \pi_k in weights, and optionally computes the output "class label" for each
     sample: \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N (indices of the most
     probable mixture component for each sample).

     The trained model can be used further for prediction, just like any other classifier. The
     trained model is similar to the NormalBayesClassifier.
    

    Declaration

    Objective-C

    - (BOOL)trainEM:(nonnull Mat *)samples
        logLikelihoods:(nonnull Mat *)logLikelihoods;

    Swift

    func trainEM(samples: Mat, logLikelihoods: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Expectation step. Initial values of the model parameters will be
     estimated by the k-means algorithm.

     Unlike many of the ML models, EM is an unsupervised learning algorithm and it does not take
     responses (class labels or function values) as input. Instead, it computes the *Maximum
     Likelihood Estimate* of the Gaussian mixture parameters from an input sample set, stores all the
     parameters inside the structure: p_{i,k} in probs, a_k in means, S_k in
     covs[k], \pi_k in weights, and optionally computes the output "class label" for each
     sample: \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N (indices of the most
     probable mixture component for each sample).

     The trained model can be used further for prediction, just like any other classifier. The
     trained model is similar to the NormalBayesClassifier.
    

    Declaration

    Objective-C

    - (BOOL)trainEM:(nonnull Mat *)samples;

    Swift

    func trainEM(samples: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Maximization step. You need to provide initial probabilities
     p_{i,k} to use this option.
    

    Declaration

    Objective-C

    - (BOOL)trainM:(nonnull Mat *)samples
                probs0:(nonnull Mat *)probs0
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels
                 probs:(nonnull Mat *)probs;

    Swift

    func trainM(samples: Mat, probs0: Mat, logLikelihoods: Mat, labels: Mat, probs: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    probs0

    Initial probabilities p_{i,k} of each sample belonging to each mixture component.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

    probs

    The optional output matrix that contains posterior probabilities of each Gaussian mixture component given each sample. It has

    nsamples \times nclusters
    size and CV_64FC1 type.
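
    A hypothetical sketch of the Maximization-step entry point. The uniform initial responsibilities are purely illustrative (any row-stochastic probs0 is a valid start), and `Core.randu`, the `Mat` initializers, and the toy sizes are assumptions drawn from the broader OpenCV Swift API rather than from this page:

```swift
import opencv2

// Toy data: 90 two-dimensional samples with uniformly random values.
let samples = Mat(rows: 90, cols: 2, type: CvType.CV_64F)
Core.randu(dst: samples, low: 0, high: 1)

// probs0 must be nsamples x nclusters; each row holds the initial
// probabilities p_{i,k}. A uniform 1/nclusters per component is a valid,
// if uninformative, starting point.
let nclusters: Int32 = 3
let probs0 = Mat(rows: 90, cols: nclusters, type: CvType.CV_64F,
                 scalar: Scalar(1.0 / 3.0))

let em = EM.create()
em.setClustersNumber(val: nclusters)

// trainM runs the M-step first, treating probs0 as the current
// responsibilities, then alternates E and M steps as usual.
let trained = em.trainM(samples: samples, probs0: probs0)
print(trained)
```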

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Maximization step. You need to provide initial probabilities
     p_{i,k} to use this option.
    

    Declaration

    Objective-C

    - (BOOL)trainM:(nonnull Mat *)samples
                probs0:(nonnull Mat *)probs0
        logLikelihoods:(nonnull Mat *)logLikelihoods
                labels:(nonnull Mat *)labels;

    Swift

    func trainM(samples: Mat, probs0: Mat, logLikelihoods: Mat, labels: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    probs0

    Initial probabilities p_{i,k} of each sample belonging to each mixture component.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

    labels

    The optional output “class label” for each sample:

    \texttt{labels}_i=\texttt{arg max}_k(p_{i,k}), i=1..N
    (indices of the most probable mixture component for each sample). It has
    nsamples \times 1
    size and CV_32SC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Maximization step. You need to provide initial probabilities
     p_{i,k} to use this option.
    

    Declaration

    Objective-C

    - (BOOL)trainM:(nonnull Mat *)samples
                probs0:(nonnull Mat *)probs0
        logLikelihoods:(nonnull Mat *)logLikelihoods;

    Swift

    func trainM(samples: Mat, probs0: Mat, logLikelihoods: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    probs0

    Initial probabilities p_{i,k} of each sample belonging to each mixture component.

    logLikelihoods

    The optional output matrix that contains a likelihood logarithm value for each sample. It has

    nsamples \times 1
    size and CV_64FC1 type.

  • Estimate the Gaussian mixture parameters from a sample set.

     This variation starts with the Maximization step. You need to provide initial probabilities
     p_{i,k} to use this option.
    

    Declaration

    Objective-C

    - (BOOL)trainM:(nonnull Mat *)samples probs0:(nonnull Mat *)probs0;

    Swift

    func trainM(samples: Mat, probs0: Mat) -> Bool

    Parameters

    samples

    Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type it will be converted to the inner matrix of such type for further computing.

    probs0

    Initial probabilities p_{i,k} of each sample belonging to each mixture component.

  • Returns posterior probabilities for the provided samples

    Declaration

    Objective-C

    - (float)predict:(nonnull Mat *)samples
             results:(nonnull Mat *)results
               flags:(int)flags;

    Swift

    func predict(samples: Mat, results: Mat, flags: Int32) -> Float

    Parameters

    samples

    The input samples, floating-point matrix

    results

    The optional output

    nSamples \times nClusters
    matrix of results. It contains posterior probabilities for each sample from the input

    flags

    This parameter will be ignored

  • Returns posterior probabilities for the provided samples

    Declaration

    Objective-C

    - (float)predict:(nonnull Mat *)samples results:(nonnull Mat *)results;

    Swift

    func predict(samples: Mat, results: Mat) -> Float

    Parameters

    samples

    The input samples, floating-point matrix

    results

    The optional output

    nSamples \times nClusters
    matrix of results. It contains posterior probabilities for each sample from the input

  • Returns posterior probabilities for the provided samples

    Declaration

    Objective-C

    - (float)predict:(nonnull Mat *)samples;

    Swift

    func predict(samples: Mat) -> Float

    Parameters

    samples

    The input samples, floating-point matrix

  • Declaration

    Objective-C

    - (int)getClustersNumber;

    Swift

    func getClustersNumber() -> Int32
  • Declaration

    Objective-C

    - (int)getCovarianceMatrixType;

    Swift

    func getCovarianceMatrixType() -> Int32
  • Returns covariance matrices

     Returns a vector of covariance matrices. The number of matrices is the number of Gaussian mixtures;
     each matrix is a square floating-point matrix NxN, where N is the space dimensionality.
    

    Declaration

    Objective-C

    - (void)getCovs:(nonnull NSMutableArray<Mat *> *)covs;

    Swift

    func getCovs(covs: NSMutableArray)
  • getClustersNumber - see: -getClustersNumber:

    Declaration

    Objective-C

    - (void)setClustersNumber:(int)val;

    Swift

    func setClustersNumber(val: Int32)
  • getCovarianceMatrixType - see: -getCovarianceMatrixType:

    Declaration

    Objective-C

    - (void)setCovarianceMatrixType:(int)val;

    Swift

    func setCovarianceMatrixType(val: Int32)
  • getTermCriteria - see: -getTermCriteria:

    Declaration

    Objective-C

    - (void)setTermCriteria:(nonnull TermCriteria *)val;

    Swift

    func setTermCriteria(val: TermCriteria)