Calculates the confusion matrix of observed and predicted classes, first converting both columns into factors (a step not currently performed by conf_mat() from {yardstick}).
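Conceptually, the function can be sketched as below. This is a simplified illustration under assumed behaviour, not the package's actual source: it coerces both columns to factors with a shared level set and then delegates to yardstick::conf_mat(), with input validation omitted.

# Minimal sketch (assumed behaviour, not the real implementation):
# coerce truth and prediction to factors sharing one level set,
# then hand them to yardstick::conf_mat().
sketch_confusion_matrix <- function(x, target_col_name, target_pred_col_name, ...) {
  truth <- x[[target_col_name]]
  estimate <- x[[target_pred_col_name]]
  all_levels <- sort(unique(c(truth, estimate)))  # shared levels keep the matrix square
  df <- data.frame(
    truth = factor(truth, levels = all_levels),
    estimate = factor(estimate, levels = all_levels)
  )
  yardstick::conf_mat(df, truth = truth, estimate = estimate, ...)
}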

calc_confusion_matrix(x, target_col_name, target_pred_col_name, ...)

Arguments

x

A data frame containing a column with the actual classes and a column with the predicted classes. Any other columns are ignored.

target_col_name

A string with the column name of the target variable.

target_pred_col_name

A string with the column name of the predictions for the target variable.

...

Further arguments passed to or from other methods.

Value

An object of class conf_mat (see conf_mat() from {yardstick}).
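Because the returned object is a regular yardstick conf_mat, the usual yardstick methods apply to it. For example (cm is just an illustrative name for a stored result; autoplot() requires ggplot2):

cm <- mtcars %>%
  dplyr::mutate(carb_pred = sample(carb, size = nrow(.))) %>% # Mock predictions column
  calc_confusion_matrix(
    target_col_name = "carb",
    target_pred_col_name = "carb_pred"
  )

summary(cm)                              # Tibble of accuracy, kappa, sensitivity, etc.
ggplot2::autoplot(cm, type = "heatmap")  # Heatmap of the confusion matrix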

Examples

library(experienceAnalysis)

mtcars %>%
  dplyr::mutate(carb_pred = sample(carb, size = nrow(.))) %>% # Mock predictions column
  calc_accuracy_per_class(
    target_col_name = "carb",
    target_pred_col_name = "carb_pred"
  )
#> # A tibble: 6 x 2
#>    carb accuracy
#>   <dbl>    <dbl>
#> 1     1    0.429
#> 2     2    0.1
#> 3     3    0
#> 4     4    0.3
#> 5     6    0
#> 6     8    0
# Confusion matrix of observed vs. predicted classes
mtcars %>%
  dplyr::mutate(carb_pred = sample(carb, size = nrow(.))) %>% # Mock predictions column
  calc_confusion_matrix(
    target_col_name = "carb",
    target_pred_col_name = "carb_pred"
  )
#>           Truth
#> Prediction 1 2 3 4 6 8
#>          1 0 3 1 2 0 1
#>          2 2 2 2 4 0 0
#>          3 0 2 0 1 0 0
#>          4 5 2 0 2 1 0
#>          6 0 1 0 0 0 0
#>          8 0 0 0 1 0 0