Quick Start Guide: Optimal Regression Trees

This is an R version of the corresponding OptimalTrees quick start guide.

In this example we will use Optimal Regression Trees (ORT) on the yacht hydrodynamics dataset. First we load in the data and split it into training and test datasets:

df <- read.table(
    "yacht_hydrodynamics.data",
    col.names = c("position", "prismatic", "length_displacement",
                  "beam_draught", "length_beam", "froude", "resistance")
)
  position prismatic length_displacement beam_draught length_beam froude
1     -2.3     0.568                4.78         3.99        3.17  0.125
2     -2.3     0.568                4.78         3.99        3.17  0.150
3     -2.3     0.568                4.78         3.99        3.17  0.175
4     -2.3     0.568                4.78         3.99        3.17  0.200
5     -2.3     0.568                4.78         3.99        3.17  0.225
6     -2.3     0.568                4.78         3.99        3.17  0.250
7     -2.3     0.568                4.78         3.99        3.17  0.275
8     -2.3     0.568                4.78         3.99        3.17  0.300
  resistance
1       0.11
2       0.27
3       0.47
4       0.78
5       1.18
6       1.82
7       2.61
8       3.76
 [ reached 'max' / getOption("max.print") -- omitted 300 rows ]
X <- df[, 1:6]
y <- df[, 7]
split <- iai::split_data("regression", X, y, seed = 1)
train_X <- split$train$X
train_y <- split$train$y
test_X <- split$test$X
test_y <- split$test$y
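
As a quick sanity check, we can confirm the sizes of the resulting sets; assuming the default training proportion of 0.7 in iai::split_data, we expect 216 training and 92 test rows out of the 308 in the dataset (matching the n = 216 shown at the root of the tree fitted below):

# Sizes of the training and test sets
nrow(train_X)  # expected: 216
nrow(test_X)   # expected: 92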

Optimal Regression Trees

We will use a grid_search to fit an optimal_tree_regressor:

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 123
    ),
    max_depth = 1:5
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
Optimal Trees Visualization

[Interactive tree diagram: a depth-5 tree with root node (mean 10.32, n = 216) splitting on froude; deeper splits use froude, prismatic, beam_draught, length_displacement, and position, ending in 15 prediction leaves.]
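
If you are working outside an interactive session, you can also save the visualization to a standalone file and inspect the parameters the grid search selected. This is a sketch assuming the iai::get_best_params and iai::write_html helpers from the same interface; "ort_tree.html" is just an illustrative filename:

# Inspect the max_depth chosen by the grid search
iai::get_best_params(grid)

# Save the interactive tree visualization to a standalone HTML file
iai::write_html("ort_tree.html", iai::get_learner(grid))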

We can make predictions on new data using predict:

iai::predict(grid, test_X)
 [1]  0.5655128  0.5655128  0.5655128  2.2640000  5.3933333 13.3566667
 [7] 21.3078571  0.5655128  0.5655128  0.5655128  2.2640000  3.8573684
[13]  5.3933333 13.3566667 55.2866667  0.5655128  0.5655128  5.3933333
[19] 21.3078571 33.3050000  0.5655128  0.5655128  0.5655128  2.2640000
[25]  5.3933333  7.9833333  2.2640000  5.3933333 33.3050000  2.2640000
[31]  2.2640000  5.3933333 13.3566667 33.3050000  5.3933333  3.8573684
[37]  0.5655128  0.5655128  0.5655128 13.3566667 33.3050000 48.8955556
[43]  0.5655128  0.5655128  2.2640000  5.3933333 33.3050000  7.9833333
[49] 13.3566667  0.5655128  5.3933333 13.3566667  0.5655128  0.5655128
[55]  0.5655128  2.2640000  2.2640000 33.3050000  0.5655128 13.3566667
 [ reached getOption("max.print") -- omitted 32 entries ]
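
Before turning to score, we can get a quick feel for accuracy by comparing these predictions against the held-out values directly in base R (pred_y is a name introduced here for illustration):

# Compare predictions with the held-out resistance values
pred_y <- iai::predict(grid, test_X)
mean((pred_y - test_y)^2)  # out-of-sample mean squared error
cor(pred_y, test_y)^2      # squared correlation, closely related to the R^2 reported below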

We can evaluate the quality of the tree using score with any of the supported loss functions. For example, the R^2 on the training set:

iai::score(grid, train_X, train_y, criterion = "mse")
[1] 0.9965539

Or on the test set:

iai::score(grid, test_X, test_y, criterion = "mse")
[1] 0.9923405
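
Beyond a single score, it can be helpful to see where the test points end up in the tree. The sketch below assumes iai::apply, which returns the index of the leaf into which each row falls:

# Count how many test points fall into each leaf of the fitted tree
leaf <- iai::apply(iai::get_learner(grid), test_X)
table(leaf)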

Optimal Regression Trees with Hyperplanes

To use Optimal Regression Trees with hyperplane splits (ORT-H), you should set the hyperplane_config parameter:

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 123,
        hyperplane_config = list(sparsity = "all")
    ),
    max_depth = 1:4
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
Optimal Trees Visualization

[Interactive tree diagram: the ORT-H tree. The root (mean 10.32, n = 216) splits on the hyperplane 0.4704 * prismatic + 1.952 * froude; the remaining splits use froude alone, apart from one hyperplane split on -0.04365 * position + 0.3009 * beam_draught, ending in 8 prediction leaves.]

Now we can find the performance on the test set with hyperplanes:

iai::score(grid, test_X, test_y, criterion = "mse")
[1] 0.9861183

It looks like the addition of hyperplane splits did not help much here: the test score is slightly lower than for the tree with axis-aligned splits. The main variable affecting the target seems to be froude, so allowing multiple variables per split is of limited use on this dataset.
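
One way to check this intuition is to look at the variable importance of the fitted learner. This sketch assumes the iai::variable_importance helper; the exact scores depend on the fitted tree, but froude should dominate:

# Rank features by their importance in the fitted ORT-H tree
iai::variable_importance(iai::get_learner(grid))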

Optimal Regression Trees with Linear Predictions

To use Optimal Regression Trees with linear regression in the leaves (ORT-L), you should set the regression_features parameter to list(All = c()) and use the regression_lambda parameter to control the degree of regularization.

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 123,
        max_depth = 2,
        regression_features = list(All = c())
    ),
    regression_lambda = c(0.005, 0.01, 0.05)
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
Optimal Trees Visualization

[Interactive tree diagram: the ORT-L tree. The root (mean 10.32, n = 216) splits on froude at 0.3625, giving two leaves that each fit a linear regression (means 2.508 with n = 157, and 31.09 with n = 59).]

We can find the performance on the test set:

iai::score(grid, test_X, test_y, criterion = "mse")
[1] 0.9842528

We can see that the ORT-L model is much smaller than the models with constant predictions and has similar performance.
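
To make the size comparison concrete, we can count the nodes in the fitted tree. This sketch assumes the iai::get_num_nodes helper from the same interface:

# The ORT-L tree has a single split and two leaves
iai::get_num_nodes(iai::get_learner(grid))  # 3 nodes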
