Test Your Knowledge
Question 1 of 4
In a binary classification problem, what does the False Positive (FP) represent in a confusion matrix?
Positive instances correctly predicted as positive
Negative instances incorrectly predicted as positive
Positive instances incorrectly predicted as negative
Negative instances correctly predicted as negative
Question 2 of 4
Which metric is calculated as (TP + TN) / (TP + TN + FP + FN)?
Precision
Recall
Accuracy
F1 Score
Question 3 of 4
If a model has high precision but low recall for the positive class, what does this indicate?
The model is correctly identifying most positive instances
The model is missing many positive instances but is reliable when it predicts positive
The model is predicting too many instances as positive
The model is equally balanced in its predictions
Question 4 of 4
What is the primary purpose of the F1 Score?
To provide a single metric that balances precision and recall
To measure the overall accuracy of the model
To determine the number of false positives
To calculate the percentage of correct negative predictions
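The metrics the questions above test (accuracy, precision, recall, F1) can all be derived from the four confusion-matrix counts. A minimal sketch in plain Python, using made-up toy labels (1 = positive, 0 = negative) purely for illustration:

```python
# Hypothetical example data: 4 positives, 6 negatives
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0, 0, 0, 0, 0]

# Confusion-matrix counts
TP = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
TN = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
FP = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # negatives wrongly flagged positive
FN = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # positives the model missed

accuracy = (TP + TN) / (TP + TN + FP + FN)        # fraction of all predictions that are correct
precision = TP / (TP + FP)                        # how reliable a positive prediction is
recall = TP / (TP + FN)                           # how many actual positives were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(TP, TN, FP, FN)   # 2 5 1 2
print(accuracy)         # 0.7
print(recall)           # 0.5
```

Note how this toy model illustrates the high-precision/low-recall situation from Question 3: when it does predict positive it is usually right (precision 2/3), but it misses half the actual positives (recall 0.5), and F1 (about 0.571) summarizes that trade-off in a single number.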