Xfeatures2d
Objective-C
@interface Xfeatures2d : NSObject
Swift
class Xfeatures2d : NSObject
The Xfeatures2d module
Member classes: FREAK, StarDetector, BriefDescriptorExtractor, LUCID, LATCH, DAISY, MSDDetector, VGG, BoostDesc, PCTSignatures, PCTSignaturesSQFD, HarrisLaplaceFeatureDetector, SURF
Member enums: NormalizationType, DistanceFunction, PointDistribution, SimilarityFunction, KeypointLayout
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:withScale:thresholdFactor:
GMS (Grid-based Motion Statistics) feature matching strategy described in [Bian2017gms].
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation withScale:(BOOL)withScale thresholdFactor:(double)thresholdFactor;
Parameters
size1: Input size of image1.
size2: Input size of image2.
keypoints1: Input keypoints of image1.
keypoints2: Input keypoints of image2.
matches1to2: Input 1-nearest neighbor matches.
matchesGMS: Matches returned by the GMS matching strategy.
withRotation: Take rotation transformation into account.
withScale: Take scale transformation into account.
thresholdFactor: The higher, the fewer matches.
Note: Since GMS works well when the number of features is large, we recommend using the ORB detector with FastThreshold set to 0 to obtain as many features as possible quickly. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
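The following is a minimal Swift sketch of calling this overload from the opencv2 framework, not a complete pipeline: the keypoints and 1-NN matches are placeholders for results you would obtain from your own detector and matcher (for example ORB with FastThreshold 0, as recommended above), the image sizes and the thresholdFactor value of 6.0 are illustrative, and the Swift argument labels are assumed to mirror the Objective-C selector shown in the declaration.
Swift
import Foundation
import opencv2

// Placeholders: in practice these come from your own detection and
// 1-nearest-neighbor matching step.
let keypoints1: [KeyPoint] = []    // keypoints detected in image1
let keypoints2: [KeyPoint] = []    // keypoints detected in image2
let matches1to2: [DMatch] = []     // 1-NN matches from image1 to image2
let matchesGMS = NSMutableArray()  // output: matches accepted by GMS

Xfeatures2d.matchGMS(
    size1: Size2i(width: 640, height: 480),   // size of image1 (illustrative)
    size2: Size2i(width: 640, height: 480),   // size of image2 (illustrative)
    keypoints1: keypoints1,
    keypoints2: keypoints2,
    matches1to2: matches1to2,
    matchesGMS: matchesGMS,
    withRotation: false,    // set to true for large rotation changes
    withScale: false,       // set to true for large scale changes
    thresholdFactor: 6.0)   // higher values keep fewer matches

let kept = matchesGMS.compactMap { $0 as? DMatch }
print("GMS kept \(kept.count) of \(matches1to2.count) matches")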
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:withScale:
GMS (Grid-based Motion Statistics) feature matching strategy described in [Bian2017gms].
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation withScale:(BOOL)withScale;
Parameters
size1: Input size of image1.
size2: Input size of image2.
keypoints1: Input keypoints of image1.
keypoints2: Input keypoints of image2.
matches1to2: Input 1-nearest neighbor matches.
matchesGMS: Matches returned by the GMS matching strategy.
withRotation: Take rotation transformation into account.
withScale: Take scale transformation into account.
Note: Since GMS works well when the number of features is large, we recommend using the ORB detector with FastThreshold set to 0 to obtain as many features as possible quickly. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:
GMS (Grid-based Motion Statistics) feature matching strategy described in [Bian2017gms].
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation;
Parameters
size1: Input size of image1.
size2: Input size of image2.
keypoints1: Input keypoints of image1.
keypoints2: Input keypoints of image2.
matches1to2: Input 1-nearest neighbor matches.
matchesGMS: Matches returned by the GMS matching strategy.
withRotation: Take rotation transformation into account.
Note: Since GMS works well when the number of features is large, we recommend using the ORB detector with FastThreshold set to 0 to obtain as many features as possible quickly. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:
GMS (Grid-based Motion Statistics) feature matching strategy described in [Bian2017gms].
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS;
Parameters
size1: Input size of image1.
size2: Input size of image2.
keypoints1: Input keypoints of image1.
keypoints2: Input keypoints of image2.
matches1to2: Input 1-nearest neighbor matches.
matchesGMS: Matches returned by the GMS matching strategy.
Note: Since GMS works well when the number of features is large, we recommend using the ORB detector with FastThreshold set to 0 to obtain as many features as possible quickly. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
LOGOS (Local geometric support for high-outlier spatial verification) feature matching strategy described in [Lowry2018LOGOSLG].
Declaration
Parameters
keypoints1: Input keypoints of image1.
keypoints2: Input keypoints of image2.
nn1: Index of the closest BoW centroid for each descriptor of image1.
nn2: Index of the closest BoW centroid for each descriptor of image2.
matches1to2: Matches returned by the LOGOS matching strategy.
Note: This matching strategy is suitable for matching features against a large-scale database. The first step consists in constructing the bag-of-words (BoW) vocabulary from a representative image database. Image descriptors are then represented by their closest codevector (nearest BoW centroid).
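To make the meaning of nn1 and nn2 concrete, here is a small self-contained Swift sketch, deliberately using plain arrays rather than OpenCV types, that assigns each descriptor to its nearest BoW centroid by squared Euclidean distance. The toy vocabulary and descriptors are invented for illustration; in a real pipeline the vocabulary would come from clustering descriptors of a representative image database and the descriptors from your feature extractor.
Swift
import Foundation

// Each descriptor and each BoW centroid is represented as a [Float].
func nearestCentroidIndices(descriptors: [[Float]],
                            vocabulary: [[Float]]) -> [Int32] {
    descriptors.map { descriptor in
        var bestIndex: Int32 = 0
        var bestDistance = Float.greatestFiniteMagnitude
        for (index, centroid) in vocabulary.enumerated() {
            // Squared Euclidean distance to this centroid.
            let distance = zip(descriptor, centroid)
                .reduce(Float(0)) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) }
            if distance < bestDistance {
                bestDistance = distance
                bestIndex = Int32(index)
            }
        }
        return bestIndex
    }
}

// Toy data: a 2-word vocabulary and two descriptors of image1.
let vocabulary: [[Float]] = [[0, 0], [1, 1]]
let descriptors1: [[Float]] = [[0.1, 0.2], [0.9, 1.0]]
let nn1 = nearestCentroidIndices(descriptors: descriptors1, vocabulary: vocabulary)
print(nn1)  // [0, 1]: one centroid index per descriptor of image1

nn2 is computed the same way from the descriptors of image2; these index lists are what the LOGOS matching strategy consumes.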