Regression

Quick Start Guide: Optimal Regression Trees

This is an R version of the corresponding OptimalTrees quick start guide.

In this example we will use Optimal Regression Trees (ORT) on the yacht hydrodynamics dataset. First we load the data and split it into training and test datasets:

df <- read.table(
    "yacht_hydrodynamics.data",
    col.names = c("position", "prismatic", "length_displacement",
                  "beam_draught", "length_beam", "froude", "resistance")
)
  position prismatic length_displacement beam_draught length_beam froude
1     -2.3     0.568                4.78         3.99        3.17  0.125
2     -2.3     0.568                4.78         3.99        3.17  0.150
3     -2.3     0.568                4.78         3.99        3.17  0.175
4     -2.3     0.568                4.78         3.99        3.17  0.200
5     -2.3     0.568                4.78         3.99        3.17  0.225
6     -2.3     0.568                4.78         3.99        3.17  0.250
7     -2.3     0.568                4.78         3.99        3.17  0.275
8     -2.3     0.568                4.78         3.99        3.17  0.300
  resistance
1       0.11
2       0.27
3       0.47
4       0.78
5       1.18
6       1.82
7       2.61
8       3.76
 [ reached 'max' / getOption("max.print") -- omitted 300 rows ]
X <- df[, 1:6]
y <- df[, 7]
split <- iai::split_data("regression", X, y, seed = 1)
train_X <- split$train$X
train_y <- split$train$y
test_X <- split$test$X
test_y <- split$test$y
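
As a quick sanity check, we can confirm the sizes of the resulting splits using plain base R (this is not part of the iai API):

# Number of observations in each split
nrow(train_X)
nrow(test_X)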

Optimal Regression Trees

We will use a grid_search to fit an optimal_tree_regressor:

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 1
    ),
    max_depth = 1:5
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
[Optimal Trees Visualization of the fitted tree]
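
We can also check which parameter combination the grid search selected; assuming your version of the iai package exposes get_best_params, this returns the chosen values:

# Inspect the best parameters found by the grid search
iai::get_best_params(grid)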

We can make predictions on new data using predict:

iai::predict(grid, test_X)
 [1]  0.5922368  2.2875000 22.0675000  0.5922368  0.5922368  4.6739286
 [7] 57.5000000  0.5922368  0.5922368 12.9800000  4.6739286  8.0806667
[13] 12.9800000  0.5922368  2.2875000  4.6739286  8.0806667 22.0675000
[19] 33.5576923 48.3922222  0.5922368  0.5922368  4.6739286  4.6739286
[25] 33.5576923  2.2875000  4.6739286  4.6739286  8.0806667  0.5922368
[31]  4.6739286  4.6739286  0.5922368  2.2875000  2.2875000  4.6739286
[37] 33.5576923  0.5922368  0.5922368  2.2875000 12.9800000 22.0675000
[43]  0.5922368  2.2875000  4.6739286 22.0675000 48.3922222  0.5922368
[49]  0.5922368  0.5922368  2.2875000  0.5922368  2.2875000  2.2875000
[55]  4.6739286  4.6739286  0.5922368  0.5922368  2.2875000  4.6739286
 [ reached getOption("max.print") -- omitted 32 entries ]
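
To illustrate what these predictions look like against the truth, here is a manual out-of-sample $R^2$ computed in base R from the same predictions (a sketch for illustration only, not part of the iai API):

# Compare predictions to the held-out resistance values
pred_y <- iai::predict(grid, test_X)
# R^2 = 1 - SSE(model) / SSE(mean-only baseline)
1 - sum((test_y - pred_y)^2) / sum((test_y - mean(test_y))^2)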

We can evaluate the quality of the tree using score with any of the supported loss functions. For example, the $R^2$ on the training set (the score reported for the "mse" criterion):

iai::score(grid, train_X, train_y, criterion = "mse")
[1] 0.9941445

Or on the test set:

iai::score(grid, test_X, test_y, criterion = "mse")
[1] 0.9917007

Optimal Regression Trees with Hyperplanes

To use Optimal Regression Trees with hyperplane splits (ORT-H), you should set the hyperplane_config parameter:

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 1,
        hyperplane_config = list(sparsity = "all")
    ),
    max_depth = 1:5
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
[Optimal Trees Visualization of the fitted tree]

Now we can find the performance on the test set with hyperplanes:

iai::score(grid, test_X, test_y, criterion = "mse")
[1] 0.9877064

It looks like the addition of hyperplane splits did not help much here; the test score is in fact slightly lower than with parallel splits. The main variable affecting the target seems to be froude, so allowing multiple variables per split in the tree may not add much value for this dataset.
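
One way to check this intuition is to inspect the relative importance of each feature in the fitted learner; assuming your version of the iai package provides variable_importance, this is a one-liner:

# Rank the features used by the fitted tree by importance
iai::variable_importance(iai::get_learner(grid))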

Optimal Regression Trees with Linear Predictions

To use Optimal Regression Trees with linear regression in the leaves (ORT-L), you should set the regression_sparsity parameter to "all" and use the regression_lambda parameter to control the degree of regularization.

grid <- iai::grid_search(
    iai::optimal_tree_regressor(
        random_seed = 1,
        max_depth = 2,
        regression_sparsity = "all"
    ),
    regression_lambda = c(0.0005, 0.001, 0.005)
)
iai::fit(grid, train_X, train_y)
iai::get_learner(grid)
[Optimal Trees Visualization of the fitted tree]
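
As before, we could then evaluate the ORT-L fit on the test set in the same way:

# Out-of-sample performance of the tree with linear predictions
iai::score(grid, test_X, test_y, criterion = "mse")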