(Solved):
We trained a logistic regression model, a decision tree model, and an XGBoost model on a dataset with 5,155 observations. The label variable is binary. The Receiver Operating Characteristic (ROC) curves are shown below. If I need to correctly classify at least 80% of True Positive observations, what is the maximum percentage of False Positive observations that I need to tolerate if I choose the Logistic Regression model over XGBoost? Multiple Choice: (A) Approximately 5% (B) Approximately 10% (C) Approximately 15% (D) Approximately 20% (E) Approximately 25%
To determine the maximum percentage of False Positive (FP) observations that we can tolerate while achieving at least an 80% True Positive (TP) rate using the Logistic Regression (LR) model instead of XGBoost, locate the point on the LR model's ROC curve where the TP rate first reaches 0.80 and read off the corresponding FP rate on the horizontal axis. That FP rate is the maximum percentage of False Positive observations that must be tolerated; comparing it with the FP rate of the XGBoost curve at the same TP rate shows the cost of choosing LR.
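If you prefer to read the value programmatically rather than off the plot, the sketch below shows one way to find the FP rate at a required TP rate with scikit-learn. This is a minimal sketch, assuming a held-out test set is available; the names `y_test`, `lr_scores`, and `xgb_scores` are placeholders for your own labels and predicted probabilities, not variables from the original question.

```python
import numpy as np
from sklearn.metrics import roc_curve

def fpr_at_tpr(y_true, y_score, required_tpr=0.80):
    """Return the smallest false positive rate whose true positive rate
    meets or exceeds ``required_tpr`` on the ROC curve."""
    fpr, tpr, _ = roc_curve(y_true, y_score)
    # tpr is non-decreasing, so the first index satisfying the condition
    # gives the lowest achievable FPR at that sensitivity level.
    idx = np.argmax(tpr >= required_tpr)
    return fpr[idx]

# Hypothetical usage: y_test holds the binary labels, lr_scores and
# xgb_scores hold predicted probabilities from the two fitted models.
# lr_fpr = fpr_at_tpr(y_test, lr_scores)
# xgb_fpr = fpr_at_tpr(y_test, xgb_scores)
# print(f"LR needs FPR of about {lr_fpr:.0%}; XGBoost about {xgb_fpr:.0%}")
```

The answer choice then corresponds to the FP rate read from the LR curve (or computed as above) at a TP rate of 0.80.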