FindMatchesMetrics
The evaluation metrics for the find matches algorithm. The quality of your machine learning transform is measured by having the transform predict matches and comparing its predictions to known matches from the same dataset. The quality metrics are computed on a subset of your data, so they are estimates rather than exact values.
Properties
The area under the precision/recall curve (AUPRC) is a single number that measures the overall quality of the transform, independent of the precision/recall tradeoff you choose. Higher values indicate a more favorable precision/recall tradeoff.
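To make the metric concrete, the sketch below computes AUPRC from scored record pairs by sweeping thresholds from the highest score down and summing precision over each recall step. This illustrates the metric itself, not AWS Glue's internal computation; the function name and inputs are assumptions for this example.

```python
def auprc(scores, labels):
    """Area under the precision/recall curve, by step-wise summation.

    scores: predicted match confidences for candidate record pairs.
    labels: 1 for a known match, 0 for a known non-match.
    Illustrative only; not AWS Glue's internal implementation.
    """
    # Sweep thresholds from the highest score downward.
    pairs = sorted(zip(scores, labels), key=lambda p: p[0], reverse=True)
    total_pos = sum(labels)
    tp = fp = 0
    area = 0.0
    prev_recall = 0.0
    for _score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        # Add a rectangle of height `precision` for each recall step.
        area += precision * (recall - prev_recall)
        prev_recall = recall
    return area
```

A perfect ranking (every known match scored above every non-match) yields an AUPRC of 1.0; mixing errors into the ranking lowers it.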
A list of ColumnImportance structures containing column importance metrics, sorted in order of descending importance.
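For illustration, the snippet below sorts column importance entries in descending order, as described above. The dictionary shape with `ColumnName` and `Importance` keys mirrors the ColumnImportance structure, but the sample column names and values are invented for this example.

```python
# Hypothetical ColumnImportance entries; real values come from the service.
column_importances = [
    {"ColumnName": "zip_code", "Importance": 0.09},
    {"ColumnName": "email", "Importance": 0.62},
    {"ColumnName": "last_name", "Importance": 0.29},
]

# Rank columns by how much each contributed to the match predictions.
ranked = sorted(column_importances, key=lambda c: c["Importance"], reverse=True)
```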
The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making.
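The confusion matrix counts (true positives, false positives, true negatives, false negatives) are also what determine precision and recall. As a sketch under the usual definitions (not a Glue API call), the standard derivation is:

```python
def metrics_from_confusion(tp, fp, tn, fn):
    """Derive precision, recall, and F1 from confusion-matrix counts.

    tp: record pairs correctly predicted as matches (true positives)
    fp: non-matches incorrectly predicted as matches (false positives)
    tn: non-matches correctly predicted as non-matches (true negatives)
    fn: known matches the transform missed (false negatives)
    """
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

A high false-positive count pulls precision down, while a high false-negative count pulls recall down; the confusion matrix tells you which kind of error dominates.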