Xfeatures2d
Objective-C
@interface Xfeatures2d : NSObject
Swift
class Xfeatures2d : NSObject
The Xfeatures2d module
Member classes: FREAK, StarDetector, BriefDescriptorExtractor, LUCID, LATCH, DAISY, MSDDetector, VGG, BoostDesc, PCTSignatures, PCTSignaturesSQFD, HarrisLaplaceFeatureDetector, SURF
Member enums: NormalizationType, DistanceFunction, PointDistribution, SimilarityFunction, KeypointLayout
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:withScale:thresholdFactor:
GMS (Grid-based Motion Statistics) feature matching strategy described in CITE: Bian2017gms.
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation withScale:(BOOL)withScale thresholdFactor:(double)thresholdFactor;
Parameters
size1
Input size of image1.
size2
Input size of image2.
keypoints1
Input keypoints of image1.
keypoints2
Input keypoints of image2.
matches1to2
Input 1-nearest neighbor matches.
matchesGMS
Matches returned by the GMS matching strategy.
withRotation
Take rotation transformation into account.
withScale
Take scale transformation into account.
thresholdFactor
The higher, the fewer matches.
@note Since GMS works well when the number of features is large, we recommend using the ORB feature and setting FastThreshold to 0 to quickly obtain as many features as possible. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
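A minimal Objective-C sketch of calling this overload. Only the matchGMS: call itself follows the declaration above; the helper function, its name, and the assumption that keypoints and 1-nearest-neighbor matches were produced beforehand (for example with ORB and a Hamming-norm brute-force matcher from the core opencv2 bindings) are illustrative, not part of this API.

// A sketch, not part of OpenCV: wraps the matchGMS: call declared above.
// Adjust the import to however the opencv2 Objective-C framework is exposed in your project.
#import <opencv2/opencv2.h>

// Assumes keypoints1/keypoints2 and the 1-nearest-neighbor matches1to2 were computed
// beforehand, e.g. with ORB (many features, FastThreshold = 0, per the note above) and a
// Hamming-norm brute-force matcher from the core opencv2 bindings.
static NSArray<DMatch *> *filterMatchesWithGMS(Size2i *size1, Size2i *size2,
                                               NSArray<KeyPoint *> *keypoints1,
                                               NSArray<KeyPoint *> *keypoints2,
                                               NSArray<DMatch *> *matches1to2) {
    NSMutableArray<DMatch *> *matchesGMS = [NSMutableArray array];
    [Xfeatures2d matchGMS:size1
                    size2:size2
               keypoints1:keypoints1
               keypoints2:keypoints2
              matches1to2:matches1to2
               matchesGMS:matchesGMS
             withRotation:NO          // YES if the image pair has large rotation changes
                withScale:NO          // YES if the image pair has large scale changes
          thresholdFactor:6.0];       // higher values keep fewer (stricter) matches
    return matchesGMS;
}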
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:withScale:
GMS (Grid-based Motion Statistics) feature matching strategy described in CITE: Bian2017gms.
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation withScale:(BOOL)withScale;
Parameters
size1
Input size of image1.
size2
Input size of image2.
keypoints1
Input keypoints of image1.
keypoints2
Input keypoints of image2.
matches1to2
Input 1-nearest neighbor matches.
matchesGMS
Matches returned by the GMS matching strategy.
withRotation
Take rotation transformation into account.
withScale
Take scale transformation into account.
@note Since GMS works well when the number of features is large, we recommend using the ORB feature and setting FastThreshold to 0 to quickly obtain as many features as possible. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:withRotation:
GMS (Grid-based Motion Statistics) feature matching strategy described in CITE: Bian2017gms.
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS withRotation:(BOOL)withRotation;
Parameters
size1
Input size of image1.
size2
Input size of image2.
keypoints1
Input keypoints of image1.
keypoints2
Input keypoints of image2.
matches1to2
Input 1-nearest neighbor matches.
matchesGMS
Matches returned by the GMS matching strategy.
withRotation
Take rotation transformation into account.
@note Since GMS works well when the number of features is large, we recommend using the ORB feature and setting FastThreshold to 0 to quickly obtain as many features as possible. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
+matchGMS:size2:keypoints1:keypoints2:matches1to2:matchesGMS:
GMS (Grid-based Motion Statistics) feature matching strategy described in CITE: Bian2017gms.
Declaration
Objective-C
+ (void)matchGMS:(nonnull Size2i *)size1 size2:(nonnull Size2i *)size2 keypoints1:(nonnull NSArray<KeyPoint *> *)keypoints1 keypoints2:(nonnull NSArray<KeyPoint *> *)keypoints2 matches1to2:(nonnull NSArray<DMatch *> *)matches1to2 matchesGMS:(nonnull NSMutableArray<DMatch *> *)matchesGMS;
Parameters
size1
Input size of image1.
size2
Input size of image2.
keypoints1
Input keypoints of image1.
keypoints2
Input keypoints of image2.
matches1to2
Input 1-nearest neighbor matches.
matchesGMS
Matches returned by the GMS matching strategy.
@note Since GMS works well when the number of features is large, we recommend using the ORB feature and setting FastThreshold to 0 to quickly obtain as many features as possible. If the matching results are not satisfactory, add more features (we use 10000 for 640 x 480 images). If your images have large rotation and scale changes, set withRotation or withScale to true.
-
+matchLOGOS:keypoints2:nn1:nn2:matches1to2:
LOGOS (Local geometric support for high-outlier spatial verification) feature matching strategy described in CITE: Lowry2018LOGOSLG.
Declaration
Parameters
keypoints1
Input keypoints of image1.
keypoints2
Input keypoints of image2.
nn1
Index of the closest BoW centroid for each descriptor of image1.
nn2
Index of the closest BoW centroid for each descriptor of image2.
matches1to2
Matches returned by the LOGOS matching strategy.
@note This matching strategy is suitable for matching features against a large-scale database. The first step consists of constructing the bag-of-words (BoW) from a representative image database. Image descriptors are then represented by their closest codevector (nearest BoW centroid).
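The note above describes how the nn1 and nn2 inputs are meant to be produced: each descriptor is quantized to the index of its nearest BoW centroid. Purely as an illustration of that quantization step, here is a self-contained Objective-C sketch using plain Foundation arrays and squared Euclidean distance; the function name and data layout are hypothetical, and a real pipeline would compute these indices from descriptor Mats and a trained vocabulary using the core opencv2 bindings.

#import <Foundation/Foundation.h>
#include <float.h>

// Hypothetical helper: for every descriptor, return the index of the nearest BoW centroid.
// descriptors and centroids are arrays of equal-length float vectors (NSArray<NSNumber *> *).
static NSArray<NSNumber *> *nearestCentroidIndices(NSArray<NSArray<NSNumber *> *> *descriptors,
                                                   NSArray<NSArray<NSNumber *> *> *centroids) {
    NSMutableArray<NSNumber *> *indices = [NSMutableArray arrayWithCapacity:descriptors.count];
    for (NSArray<NSNumber *> *d in descriptors) {
        NSUInteger best = 0;
        double bestDist = DBL_MAX;
        for (NSUInteger c = 0; c < centroids.count; c++) {
            NSArray<NSNumber *> *centroid = centroids[c];
            double dist = 0.0;
            for (NSUInteger i = 0; i < d.count; i++) {
                double diff = d[i].doubleValue - centroid[i].doubleValue;
                dist += diff * diff;            // squared Euclidean distance suffices for the argmin
            }
            if (dist < bestDist) { bestDist = dist; best = c; }
        }
        [indices addObject:@(best)];            // this index plays the role of nn1[i] / nn2[i]
    }
    return indices;
}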