Visualization

Decision Tree Visualization

This page contains a guide to the many options available for visualizing tree learners.

Interactive Visualizations

Tree Visualization

The write_html function allows you to construct an interactive browser visualization of a tree learner:

IAI.write_html("tree.html", lnr)
[Interactive Optimal Trees visualization]

In a Jupyter notebook, tree learners are automatically visualized in this way. If you are working from a terminal, you can open a browser window with a visualization of any tree learner using show_in_browser.
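
For example, assuming lnr is a previously fitted tree learner, a call along these lines opens the visualization directly:

IAI.show_in_browser(lnr)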

Questionnaires

The write_questionnaire function allows you to create an interactive questionnaire from a tree learner.
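
For example, a call of the following form (with the output path chosen purely for illustration) saves the questionnaire to an HTML file:

IAI.write_questionnaire("questionnaire.html", lnr)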

You can open a browser window with the questionnaire for any tree learner using show_questionnaire.
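
For example, assuming lnr is a fitted tree learner:

IAI.show_questionnaire(lnr)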

Multi-learner Interactive Visualizations

It is possible to combine multiple learners into a single interactive visualization that allows the user to switch between the trees.
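
For context, the learners referenced in the example below might be fitted along the following lines (a minimal sketch; the training data X and y, the learner names, and the parameter choices are purely illustrative):

# Tree with hyperplane splits enabled (assumed parameter choice)
lnr_hyper = IAI.OptimalTreeClassifier(max_depth=2, hyperplane_config=(sparsity=:all,))
IAI.fit!(lnr_hyper, X, y)

# Axis-aligned trees of depth 1 and 2
lnr_nohyper_depth1 = IAI.OptimalTreeClassifier(max_depth=1)
IAI.fit!(lnr_nohyper_depth1, X, y)

lnr_nohyper_depth2 = IAI.OptimalTreeClassifier(max_depth=2)
IAI.fit!(lnr_nohyper_depth2, X, y)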

The following example prepares a series of questions to choose between a group of learners:

questions = ("Use tree" => [
    "with hyperplanes" => lnr_hyper,
    "without hyperplanes" => ("and maximum depth" => [
        "1" => lnr_nohyper_depth1,
        "2" => lnr_nohyper_depth2,
    ]),
])

We pass this to MultiTreePlot or MultiQuestionnaire to construct the visualization, which can then be saved to file with write_html or opened in the browser with show_in_browser as desired:

IAI.MultiTreePlot(questions)
[Interactive Optimal Trees visualization]
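
For instance, the combined visualization might be saved or opened as follows (the file name is chosen purely for illustration):

IAI.write_html("multi_tree.html", IAI.MultiTreePlot(questions))
IAI.show_in_browser(IAI.MultiQuestionnaire(questions))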

Static Images

The write_png function allows you to visualize a tree learner as a PNG image:

IAI.write_png("tree.png", lnr)

Note that this requires that you have GraphViz installed and on the system PATH. If you do not, you can use write_dot to export the tree in .dot format and then use GraphViz to render it as a PNG image later.
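
For instance, a workflow along these lines (file names chosen for illustration) exports the tree and then invokes GraphViz from within Julia to render it:

IAI.write_dot("tree.dot", lnr)
run(`dot -Tpng -o tree.png tree.dot`)  # requires GraphViz on the PATH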

Miscellaneous

Missing values

When displaying the tree in the terminal or as an image, if the observations with missing values go to the lower child of a split, the split message will say "is missing" in addition to the split criterion. Otherwise, no explicit message is displayed regarding the direction taken by missing data.

Fitted OptimalTreeClassifier:
  1) Split: XS3obtfU < 39.34
    2) Split: Fr8w5mB0 ≤ 3 or is missing
      3) Predict: 1 (92.66%), [8,101], 109 points, error 8
      4) Predict: 0 (100.00%), [43,0], 43 points, error 0
    5) Split: zAPikvQB is missing
      6) Predict: 1 (91.11%), [8,82], 90 points, error 8
      7) Predict: 0 (91.86%), [237,21], 258 points, error 21

In the interactive tree visualization, the child node that receives data with missing values is denoted by a small dot on top of the node.

[Interactive Optimal Trees visualization]