Include confusion matrix threshold details in Performance and Prediction Section

Description

Details:
Let's update the Performance and Prediction section to include details on how the confusion matrix threshold is applied. From the code you can see that we predict the positive class (1, True, or the categorical label that is second in the lexicographical ordering of the labels) when the predicted probability is *greater than or equal to* the threshold, and that the threshold is chosen to produce the best F1 score.
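For reference, here is a minimal Python sketch of the behavior described above. It is not the actual product code; the function name `f1_max_threshold` and the `probs`/`labels` arrays are illustrative, and the key detail is the `>=` comparison against the F1-maximizing threshold.

```python
import numpy as np

def f1_max_threshold(probs, labels):
    """Pick the threshold that maximizes the F1 score over the observed probabilities."""
    best_thr, best_f1 = 0.5, -1.0
    for thr in np.unique(probs):
        preds = probs >= thr                        # greater than OR EQUAL to the threshold
        tp = np.sum(preds & (labels == 1))
        fp = np.sum(preds & (labels == 0))
        fn = np.sum(~preds & (labels == 1))
        denom = 2 * tp + fp + fn
        f1 = 2 * tp / denom if denom else 0.0       # F1 = 2TP / (2TP + FP + FN)
        if f1 > best_f1:
            best_thr, best_f1 = thr, f1
    return best_thr

# Rows whose predicted probability meets or exceeds the threshold get the positive class.
probs = np.array([0.15, 0.40, 0.62, 0.90])
labels = np.array([0, 0, 1, 1])
thr = f1_max_threshold(probs, labels)
predicted_class = (probs >= thr).astype(int)
```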

Documentation Update:
In the Confusion Matrix section, below the paragraph and image, we should add:

Binary Classification
All predictions with a probability greater than or equal to the F1 Max threshold are labeled with the positive class (e.g., 1, True, or the second label in lexicographical order). The F1 Max threshold is the threshold that maximizes the F1 score, which is calculated from confusion matrix counts (true positives, false positives, and false negatives).

Multiclass Classification
The predicted class label is the class with the highest predicted probability.
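As a hypothetical illustration of the multiclass rule (the class names and probability values below are made up), the label is simply the argmax over the per-class probabilities:

```python
import numpy as np

# Each row holds per-class predicted probabilities; the label is the column with the maximum.
class_names = ["setosa", "versicolor", "virginica"]   # illustrative labels
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]])
predicted = [class_names[i] for i in probs.argmax(axis=1)]  # -> ["setosa", "virginica"]
```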

Assignee

Angela Bartz

Fix versions

Reporter

Lauren DiPerna

Support ticket URL

None

Labels

None

Affected Spark version

None

Customer Request Type

None

Task progress

None

CustomerVisible

No

Priority

Major