Quick Start Guide: Optimal Regression Trees

This is a Python version of the corresponding OptimalTrees quick start guide.

In this example, we will use Optimal Regression Trees (ORT) on the yacht hydrodynamics dataset. First, we load the data and split it into training and test datasets:

import pandas as pd
df = pd.read_csv(
    "yacht_hydrodynamics.data",
    sep='\s+',
    header=None,
    names=['position', 'prismatic', 'length_displacement', 'beam_draught',
           'length_beam', 'froude', 'resistance'],
)
     position  prismatic  length_displacement  ...  length_beam  froude  resistance
0        -2.3      0.568                 4.78  ...         3.17   0.125        0.11
1        -2.3      0.568                 4.78  ...         3.17   0.150        0.27
2        -2.3      0.568                 4.78  ...         3.17   0.175        0.47
3        -2.3      0.568                 4.78  ...         3.17   0.200        0.78
4        -2.3      0.568                 4.78  ...         3.17   0.225        1.18
5        -2.3      0.568                 4.78  ...         3.17   0.250        1.82
6        -2.3      0.568                 4.78  ...         3.17   0.275        2.61
..        ...        ...                  ...  ...          ...     ...         ...
301      -2.3      0.600                 4.34  ...         2.73   0.300        4.15
302      -2.3      0.600                 4.34  ...         2.73   0.325        6.00
303      -2.3      0.600                 4.34  ...         2.73   0.350        8.47
304      -2.3      0.600                 4.34  ...         2.73   0.375       12.27
305      -2.3      0.600                 4.34  ...         2.73   0.400       19.59
306      -2.3      0.600                 4.34  ...         2.73   0.425       30.48
307      -2.3      0.600                 4.34  ...         2.73   0.450       46.66

[308 rows x 7 columns]
from interpretableai import iai
X = df.iloc[:, 0:-1]
y = df.iloc[:, -1]
(train_X, train_y), (test_X, test_y) = iai.split_data('regression', X, y,
                                                      seed=1)
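Before fitting, it can be worth confirming the split sizes. This is a quick sanity check; the expected counts in the comment assume split_data's default train/test proportions:

# Sanity check on the split sizes. With 308 rows and the default
# proportions, we expect about 216 training rows (matching the
# n = 216 at the root of the trees fitted below) and 92 test rows.
print(train_X.shape, test_X.shape)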

Optimal Regression Trees

We will use a GridSearch to fit an OptimalTreeRegressor:

grid = iai.GridSearch(
    iai.OptimalTreeRegressor(
        random_seed=123,
    ),
    max_depth=range(1, 6),
)
grid.fit(train_X, train_y)
grid.get_learner()
[Optimal Trees Visualization: a depth-4 tree with root mean 10.32 (n = 216). The splits are almost entirely on froude, with one split on beam_draught; leaf predictions range from 0.7884 (n = 94) to 57.07 (n = 4).]
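You can also ask the grid search which parameter combination it selected, using the GridSearch get_best_params accessor (the value shown in the comment is illustrative, not verified output):

# Report the parameter combination chosen by the grid search,
# e.g. the max_depth value that validated best.
print(grid.get_best_params())  # e.g. {'max_depth': 4}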

We can make predictions on new data using predict:

grid.predict(test_X)
array([ 0.78840426,  0.78840426,  0.78840426, ..., 13.35666667,
       34.57538462, 49.91583333])
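For a quick visual check of these predictions, you can line them up against the true resistances with plain pandas (nothing IAI-specific here):

import pandas as pd

# Side-by-side view of actual vs. predicted resistance on the
# test set, for a quick sanity check of the fitted tree.
comparison = pd.DataFrame({
    'actual': test_y,
    'predicted': grid.predict(test_X),
})
print(comparison.head())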

We can evaluate the quality of the tree using score with any of the supported loss functions. For example, the $R^2$ on the training set:

grid.score(train_X, train_y, criterion='mse')
0.9912939792003822

Or on the test set:

grid.score(test_X, test_y, criterion='mse')
0.9885237962078779
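As a cross-check on this number, you can compute the standard R² by hand. This sketch assumes score with criterion='mse' reports 1 - SSE/SST; if the library uses a different baseline, the values may differ slightly:

import numpy as np

# Manual R^2 on the test set: 1 - SSE/SST, with SST taken around
# the mean of the test targets. This should be close to the value
# returned by grid.score above.
pred = grid.predict(test_X)
sse = np.sum((test_y - pred) ** 2)
sst = np.sum((test_y - np.mean(test_y)) ** 2)
print(1 - sse / sst)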

Optimal Regression Trees with Hyperplanes

To use Optimal Regression Trees with hyperplane splits (ORT-H), you should set the hyperplane_config parameter:

grid = iai.GridSearch(
    iai.OptimalTreeRegressor(
        random_seed=123,
        hyperplane_config={'sparsity': 'all'},
    ),
    max_depth=range(1, 5),
)
grid.fit(train_X, train_y)
grid.get_learner()
[Optimal Trees Visualization: a depth-4 tree with root mean 10.32 (n = 216). The root and most other splits are on froude, plus two hyperplane splits: 0.4704 * prismatic + 1.952 * froude, and -0.04365 * position + 0.3009 * beam_draught.]

Now we can find the performance on the test set with hyperplanes:

grid.score(test_X, test_y, criterion='mse')
0.9861182667312003

It looks like the addition of hyperplane splits did not help much here. The main variable affecting the target appears to be froude, so allowing multiple variables per split is probably of limited use on this dataset.
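One way to check this intuition is to inspect the learner's variable importance; a minimal sketch using the variable_importance method on the fitted learner:

# Variable importance for the fitted ORT-H learner. If froude
# dominates as suspected, it should carry most of the importance.
lnr = grid.get_learner()
print(lnr.variable_importance())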

Optimal Regression Trees with Linear Predictions

To use Optimal Regression Trees with linear regression in the leaves (ORT-L), you should set the regression_sparsity parameter to 'all' and use the regression_lambda parameter to control the degree of regularization.

grid = iai.GridSearch(
    iai.OptimalTreeRegressor(
        random_seed=123,
        max_depth=2,
        regression_sparsity='all',
    ),
    regression_lambda=[0.005, 0.01, 0.05],
)
grid.fit(train_X, train_y)
grid.get_learner()
[Optimal Trees Visualization: a depth-1 tree with root mean 10.32 (n = 216), splitting on froude at 0.3625; each leaf contains a linear regression (left leaf mean 2.508, n = 157; right leaf mean 31.09, n = 59).]

We can find the performance on the test set:

grid.score(test_X, test_y, criterion='mse')
0.984222547936994

We can see that the ORT-L model is much smaller than the models with constant predictions and has similar performance.
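To make the size comparison concrete, you can count the nodes in each fitted tree; a sketch assuming the get_num_nodes accessor on the tree learner:

# The ORT-L tree above has one split and two leaves (3 nodes),
# versus 15 nodes for the depth-4 constant-prediction ORT earlier.
lnr = grid.get_learner()
print(lnr.get_num_nodes())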
