API Reference
Documentation for the IAITrees public interface.
Index
IAI.ClassificationTreeLearner · IAI.MultiQuestionnaire · IAI.MultiTreePlot · IAI.PolicyTreeLearner · IAI.PrescriptionTreeLearner · IAI.Questionnaire · IAI.RegressionTreeLearner · IAI.SurvivalTreeLearner · IAI.TreeLearner · IAI.TreePlot · IAI.apply · IAI.apply_nodes · IAI.decision_path · IAI.get_classification_label · IAI.get_classification_proba · IAI.get_depth · IAI.get_lower_child · IAI.get_num_nodes · IAI.get_num_samples · IAI.get_parent · IAI.get_policy_treatment_rank · IAI.get_prescription_treatment_rank · IAI.get_regression_constant · IAI.get_regression_weights · IAI.get_split_categories · IAI.get_split_feature · IAI.get_split_threshold · IAI.get_split_weights · IAI.get_survival_curve · IAI.get_upper_child · IAI.is_categoric_split · IAI.is_hyperplane_split · IAI.is_leaf · IAI.is_mixed_ordinal_split · IAI.is_mixed_parallel_split · IAI.is_ordinal_split · IAI.is_parallel_split · IAI.missing_goes_lower · IAI.print_path · IAI.reset_display_label! · IAI.set_display_label! · IAI.set_threshold! · IAI.show_in_browser · IAI.show_questionnaire · IAI.variable_importance · IAI.write_dot · IAI.write_html · IAI.write_png · IAI.write_questionnaire
Types
IAI.TreeLearner — Type
Abstract type encompassing all tree-based learners.

IAI.ClassificationTreeLearner — Type
Abstract type encompassing all tree-based learners with classification leaves.

IAI.RegressionTreeLearner — Type
Abstract type encompassing all tree-based learners with regression leaves.

IAI.SurvivalTreeLearner — Type
Abstract type encompassing all tree-based learners with survival leaves.

IAI.PrescriptionTreeLearner — Type
Abstract type encompassing all tree-based learners with prescription leaves.

IAI.PolicyTreeLearner — Type
Abstract type encompassing all tree-based learners with policy leaves.
Tree Structure
These functions can be used to query the structure of a TreeLearner. The examples make use of the following tree:
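As an illustrative sketch of how a queryable learner lnr might be obtained (the dataset, parameters, and resulting tree below are assumptions and will not reproduce the exact tree used in these examples), one could fit an OptimalTreeClassifier and then apply the functions that follow:

using DataFrames

# Hypothetical training data using the feature names that appear in the
# examples below: three numeric scores and a categoric region with levels A-E
X = DataFrame(score1=100 .* rand(200), score2=rand(200), score3=rand(200),
              region=rand(["A", "B", "C", "D", "E"], 200))
y = rand(["low", "high"], 200)

# Fit an optimal classification tree; the resulting learner can be queried
# with the structure functions documented in this section
lnr = IAI.OptimalTreeClassifier(max_depth=2, random_seed=1)
IAI.fit!(lnr, X, y)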
IAI.get_num_nodes — Function
get_num_nodes(lnr::TreeLearner)
Return the number of nodes in the trained lnr.
Example
IAI.get_num_nodes(lnr)
7

IAI.is_leaf — Function
is_leaf(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a leaf.
Example
IAI.is_leaf(lnr, 1)
false

IAI.get_depth — Function
get_depth(lnr::TreeLearner, node_index::Int)
Return the depth of node node_index in the trained lnr.
Example
IAI.get_depth(lnr, 6)
2

IAI.get_num_samples — Function
get_num_samples(lnr::TreeLearner, node_index::Int)
Return the number of training points contained in node node_index in the trained lnr.
Example
IAI.get_num_samples(lnr, 6)
72

IAI.get_parent — Function
get_parent(lnr::TreeLearner, node_index::Int)
Return the index of the parent of node node_index in the trained lnr.
Example
IAI.get_parent(lnr, 2)
1

IAI.get_lower_child — Function
get_lower_child(lnr::TreeLearner, node_index::Int)
Return the index of the lower child of node node_index in the trained lnr.
Example
IAI.get_lower_child(lnr, 1)
2

IAI.get_upper_child — Function
get_upper_child(lnr::TreeLearner, node_index::Int)
Return the index of the upper child of node node_index in the trained lnr.
Example
IAI.get_upper_child(lnr, 1)
5

IAI.is_parallel_split — Function
is_parallel_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a parallel split.
Example
IAI.is_parallel_split(lnr, 1)
true

IAI.is_hyperplane_split — Function
is_hyperplane_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a hyperplane split.
Example
IAI.is_hyperplane_split(lnr, 2)
true

IAI.is_categoric_split — Function
is_categoric_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a categoric split.
Example
IAI.is_categoric_split(lnr, 5)
true

IAI.is_ordinal_split — Function
is_ordinal_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is an ordinal split.
Example
IAI.is_ordinal_split(lnr, 1)
false

IAI.is_mixed_parallel_split — Function
is_mixed_parallel_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a mixed categoric/parallel split.
Example
IAI.is_mixed_parallel_split(lnr, 2)
false

IAI.is_mixed_ordinal_split — Function
is_mixed_ordinal_split(lnr::TreeLearner, node_index::Int)
Return true if node node_index in the trained lnr is a mixed categoric/ordinal split.
Example
IAI.is_mixed_ordinal_split(lnr, 5)
false

IAI.missing_goes_lower — Function
missing_goes_lower(lnr::TreeLearner, node_index::Int)
Return true if missing values take the lower branch at node node_index in the trained lnr.
Applies to non-leaf nodes.
Example
IAI.missing_goes_lower(lnr, 1)
false

IAI.get_split_feature — Function
get_split_feature(lnr::TreeLearner, node_index::Int)
Return the feature used in the split at node node_index in the trained lnr.
Applies to categoric, ordinal, parallel, categoric/ordinal, and categoric/parallel splits.
Example
IAI.get_split_feature(lnr, 1)
:score1

IAI.get_split_threshold — Function
get_split_threshold(lnr::TreeLearner, node_index::Int)
Return the threshold used in the split at node node_index in the trained lnr.
Applies to hyperplane, parallel, and categoric/parallel splits.
Example
IAI.get_split_threshold(lnr, 1)
59.980015

IAI.get_split_categories — Function
get_split_categories(lnr::TreeLearner, node_index::Int)
Return a Dict containing the categoric/ordinal information used in the split at node node_index in the trained lnr, where the keys are the levels used in the split and the values are true if that level follows the lower branch and false if that level follows the upper branch.
Applies to categoric, ordinal, categoric/ordinal, and categoric/parallel splits.
Example
IAI.get_split_categories(lnr, 5)
Dict{Any,Bool} with 5 entries:
"B" => true
"A" => true
"C" => false
"D" => false
"E" => falseIAI.get_split_weights — Functionget_split_weights(lnr::TreeLearner, node_index::Int)Return two Dicts containing the weights for numeric and categoric features, respectively, used in the hyperplane split at node node_index in the trained lnr.
The numeric Dict has key-value pairs of feature names and their corresponding weights in the hyperplane split.
The categoric Dict has key-value pairs of feature names and a corresponding Dict that maps the categoric levels for that feature to their weights in the hyperplane.
Any features not included in either Dict have zero weight in the hyperplane, and similarly, any categoric levels that are not included have zero weight.
Applies to hyperplane splits.
Example
numeric_weights, categoric_weights = IAI.get_split_weights(lnr, 2)
numeric_weights
Dict{Symbol,Float64} with 2 entries:
  :score3 => 0.0980674
  :score2 => 0.00123692

categoric_weights
Dict{Symbol,Dict{Any,Float64}} with 1 entry:
  :region => Dict{Any,Float64}("E"=>0.105715)

Classification
These functions can be used to query the structure of a ClassificationTreeLearner. The examples make use of the following tree:
IAI.get_classification_label — Function
get_classification_label(lnr::ClassificationTreeLearner, node_index::Int)
Return the predicted label at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_classification_label(lnr, 2)
"setosa"

IAI.get_classification_proba — Function
get_classification_proba(lnr::ClassificationTreeLearner, node_index::Int)
Return the predicted probabilities of class membership at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_classification_proba(lnr, 4)
Dict{String,Float64} with 3 entries:
"virginica" => 0.0925926
"setosa" => 0.0
"versicolor" => 0.907407Regression
These functions can be used to query the structure of a RegressionTreeLearner. The examples make use of the following tree:
IAI.get_regression_constant — Method
get_regression_constant(lnr::RegressionTreeLearner, node_index::Int)
Return the constant term in the regression prediction at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_regression_constant(lnr, 2)
30.88
IAI.get_regression_constant(lnr, 3)
26.56192

IAI.get_regression_weights — Method
get_regression_weights(lnr::RegressionTreeLearner, node_index::Int)
Return the weights for each feature in the regression prediction at node node_index in the trained lnr. The weights are returned as two Dicts in the same format as described for get_split_weights.
Applies to leaf nodes.
Example
numeric_weights, categoric_weights = IAI.get_regression_weights(lnr, 3)
numeric_weights
Dict{Symbol,Float64} with 2 entries:
  :Disp => -0.0210445
  :HP => -0.0188614

categoric_weights
Dict{Symbol,Dict{Any,Float64}} with 0 entries

Survival
These functions can be used to query the structure of a SurvivalTreeLearner. The examples make use of the following tree:
IAI.get_survival_curve — Function
get_survival_curve(lnr::SurvivalTreeLearner, node_index::Int)
Return the SurvivalCurve fitted at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_survival_curve(lnr, 2)
SurvivalCurve with 22 breakpoints

Prescription
These functions can be used to query the structure of a PrescriptionTreeLearner. The examples make use of the following tree:
IAI.get_prescription_treatment_rank — Function
get_prescription_treatment_rank(lnr::PrescriptionTreeLearner, node_index::Int)
Return a Vector containing the treatments ordered from most effective to least effective at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_prescription_treatment_rank(lnr, 5)
2-element Array{String,1}:
 "B"
 "A"

IAI.get_regression_constant — Method
get_regression_constant(lnr::PrescriptionTreeLearner, node_index::Int, treatment::Any)
Return the constant in the regression prediction for treatment at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_regression_constant(lnr, 5, "A")18.507454IAI.get_regression_weights — Methodget_regression_weights(lnr::PrescriptionTreeLearner, node_index::Int,
treatment::Any)Return the weights for each feature in the regression prediction for treatment at node node_index in the trained lnr. The weights are returned as two Dicts in the same format as described for get_split_weights.
Applies to leaf nodes.
Example
numeric_weights, categoric_weights = IAI.get_regression_weights(lnr, 5, "A")
numeric_weights
Dict{Symbol,Float64} with 2 entries:
  :DiastolicBP => -0.00853409
  :AM => 1.31641

categoric_weights
Dict{Symbol,Dict{Any,Float64}} with 0 entries

Policy
These functions can be used to query the structure of a PolicyTreeLearner. The examples make use of the following tree:
IAI.get_policy_treatment_rank — Function
get_policy_treatment_rank(lnr::PolicyTreeLearner, node_index::Int)
Return a Vector containing the treatments ordered from most effective to least effective at node node_index in the trained lnr.
Applies to leaf nodes.
Example
IAI.get_policy_treatment_rank(lnr, 3)
3-element Array{String,1}:
 "A"
 "C"
 "B"

Learners
IAI.apply — Function
apply(lnr::TreeLearner, X::FeatureInput)
Return a Vector{Int} that contains the leaf index in lnr into which each point in the features X falls.
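A minimal usage sketch, assuming the learner lnr and feature DataFrame X from the earlier sketch:

# Leaf index into which each point in X falls
leaf_indices = IAI.apply(lnr, X)

# For example, select the rows of X that fall into leaf 2
X_leaf2 = X[leaf_indices .== 2, :]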
IAI.apply_nodes — Function
apply_nodes(lnr::TreeLearner, X::FeatureInput)
Return a Vector with one entry for each node in lnr. The tth element is a Vector{Int} containing the indices of the points from the features X that fall into node t or its children.
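A corresponding sketch under the same assumptions:

# One entry per node; entry t holds the row indices of X that reach node t
node_members = IAI.apply_nodes(lnr, X)

# Node 1 is the root, so every point reaches it
length(node_members[1])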
IAI.decision_path — Function
decision_path(lnr::TreeLearner, X::FeatureInput)
Return a SparseMatrixCSC{Bool,Int64} where entry (i, j) is true if the ith point in the features X passes through the jth node in lnr.
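A sketch of inspecting the decision path matrix (the sample index is arbitrary):

using SparseArrays  # only needed to work with the sparse return type directly

paths = IAI.decision_path(lnr, X)

# Indices of the nodes visited by the first sample
findall(paths[1, :])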
IAI.print_path — Function
print_path(lnr::TreeLearner, X::FeatureInput)
Print the decision path for each sample in the features X. The output displays the value of the relevant features for the specified sample and the rules for the path that it takes through the tree.
print_path(lnr::TreeLearner, X::FeatureInput, i::Int)
Print the decision path for the ith sample in the features X.
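For example, under the same assumptions:

# Print the decision path for every row of X
IAI.print_path(lnr, X)

# Print the decision path for the third sample only
IAI.print_path(lnr, X, 3)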
IAI.variable_importance — Method
variable_importance(lnr::TreeLearner)
For tree learners, the importance of each variable is measured as the total decrease in the loss function as a direct result of each split in the trees of lnr that use this variable.
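A minimal call on the trained learner from the earlier sketch:

# Importance of each feature, measured by total loss decrease across splits
IAI.variable_importance(lnr)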
Task-specific Functions
Classification
IAI.set_threshold! — Function
set_threshold!(lnr::ClassificationTreeLearner, label::Any, threshold::Real, simplify::Bool=false)
For a binary classification problem, update the predicted labels in the leaves of lnr. After running, a leaf will predict label only if the predicted probability for this label is at least threshold; otherwise, the other label will be predicted.
If simplify is true, the tree will be simplified so that there is no split that has two leaves with the same label prediction as children. This means that if both sides of a split are leaf nodes with the same label prediction, the split will be deleted from the tree and replaced with a single leaf node. This simplification is applied recursively throughout the tree.
Refer to the documentation on setting the threshold for more information.
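A hedged sketch, assuming a binary classification learner with labels "low" and "high" as in the earlier example (the label and threshold values are arbitrary):

# Only predict "high" in a leaf if its predicted probability is at least 0.9,
# and simplify the tree afterwards (the fourth positional argument)
IAI.set_threshold!(lnr, "high", 0.9, true)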
Visualization
Interactive Visualizations
IAI.write_html — Method
write_html(f, lnr::TreeLearner; keyword_arguments...)
write_html(f, grid::GridSearch; keyword_arguments...)
Write interactive browser visualization of lnr or grid to f in HTML format.
Keyword Arguments
show_node_id=true: whether to show the ID label for each node
data: specify data to be shown in the visualization, passed as a Tuple in the same order as passed to fit!, i.e.:
  data=(X, y) for classification and regression problems
  data=(X, deaths, times) for survival problems
  data=(X, treatments, outcomes) for prescription problems
  data=(X, rewards) for policy problems
Refer to the Tree Visualization documentation for more information.
Example
Save tree to mytree.html:
IAI.write_html("mytree.html", lnr)IAI.show_in_browser — Methodshow_in_browser(lnr::TreeLearner; keyword_arguments...)
show_in_browser(grid::GridSearch; keyword_arguments...)Show interactive visualization of lnr or grid in default browser.
Supports the same keyword arguments as write_html.
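A sketch of the data keyword described under write_html, assuming the features X and labels y from the earlier example (the file name is arbitrary):

# Attach the training data so individual points can be explored interactively
IAI.write_html("mytree_with_data.html", lnr, data=(X, y))

# Or open the same visualization directly in the default browser
IAI.show_in_browser(lnr, data=(X, y))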
IAI.write_questionnaire — Function
write_questionnaire(f, lnr::TreeLearner; keyword_arguments...)
write_questionnaire(f, grid::GridSearch; keyword_arguments...)
Write interactive questionnaire based on lnr or grid to f in HTML format.
Keyword Arguments
Supports the same keyword arguments as write_html.
Example
Save questionnaire to myquestionnaire.html:
IAI.write_questionnaire("myquestionnaire.html", lnr)IAI.show_questionnaire — Functionshow_questionnaire(lnr::TreeLearner; keyword_arguments...)
show_questionnaire(grid::GridSearch; keyword_arguments...)Show interactive questionnaire based on lnr or grid in default browser.
Supports the same keyword arguments as write_questionnaire.
IAI.TreePlot — Type
TreePlot(lnr::TreeLearner; keyword_arguments...)
Specifies an interactive tree visualization of lnr.
Keyword Arguments
feature_renames, level_renames and label_renames allow renaming different aspects of the data
extra_content allows including additional output at each node in the visualization
Refer to the documentation on advanced visualization for more information on using these keyword arguments.
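A hedged sketch of TreePlot with feature_renames (the rename mapping is illustrative, and writing the resulting plot object with write_html is an assumption based on the visualization documentation):

# Rename selected features for display purposes only
plot = IAI.TreePlot(lnr, feature_renames=Dict(:score1 => "Test score 1",
                                              :region => "Sales region"))
IAI.write_html("mytree_renamed.html", plot)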
IAI.Questionnaire — Type
Questionnaire(lnr::TreeLearner; keyword_arguments...)
Specifies an interactive questionnaire based on lnr.
Supports the same keyword arguments as TreePlot.
IAI.MultiTreePlot — Type
MultiTreePlot(questions::Pair; keyword_arguments...)
Specifies an interactive tree visualization of multiple tree learners as specified by questions. Refer to the documentation on multi-learner visualizations for more details. The keyword arguments are the same as for TreePlot.
IAI.MultiTreePlot — Method
MultiTreePlot(grid::GridSearch; keyword_arguments...)
Constructs an interactive tree visualization containing the final fitted learner as well as the learner found for each parameter combination. The keyword arguments are the same as for TreePlot.
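A sketch using a grid search (the grid parameters are arbitrary, and showing the resulting plot object with show_in_browser is an assumption based on the visualization documentation):

# Fit a grid search over tree depth, then visualize all fitted learners together
grid = IAI.GridSearch(IAI.OptimalTreeClassifier(random_seed=1), max_depth=1:3)
IAI.fit!(grid, X, y)
IAI.show_in_browser(IAI.MultiTreePlot(grid))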
IAI.MultiQuestionnaire — Type
MultiQuestionnaire(questions::Pair; keyword_arguments...)
Specifies an interactive questionnaire using multiple tree learners as specified by questions. Refer to the documentation on multi-learner visualizations for more details. The keyword arguments are the same as for Questionnaire.
IAI.MultiQuestionnaire — Method
MultiQuestionnaire(grid::GridSearch; keyword_arguments...)
Constructs an interactive questionnaire containing the final fitted learner as well as the learner found for each parameter combination. The keyword arguments are the same as for Questionnaire.
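A hedged sketch of the Pair-based form, assuming the nested question => [answer => learner] structure described in the multi-learner visualization documentation (lnr1 and lnr2 stand for two previously trained tree learners):

# Each answer to the top-level question maps to one trained learner
multi_q = IAI.MultiQuestionnaire("Which model would you like to use?" => [
    "Shallow tree" => lnr1,
    "Deep tree" => lnr2,
])

The resulting object can then be rendered in the same way as the other visualization objects above.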
Static Images
IAI.write_png — Function
write_png(filename::AbstractString, lnr::TreeLearner; keyword_arguments...)
Write lnr to filename as a PNG image.
Requires GraphViz be installed and on the system PATH.
Keyword Arguments
node_style=Dict(): Dict of styles to apply to all split nodes in tree (see GraphViz docs for possible keys/values)
leaf_style=Dict(): Dict of styles to apply to all leaf nodes in tree (see GraphViz docs for possible keys/values)
simple_layout=false: whether to suppress printing of extra information in each node
Example
Save tree to mytree.png:
IAI.write_png("mytree.png", lnr)IAI.write_dot — Functionwrite_dot(f, lnr::TreeLearner; keyword_arguments...)Write the trained tree of lnr into .dot format to the stream f.
Supports the same keyword arguments as write_png.
Example
Save tree to mytree.dot:
IAI.write_dot("mytree.dot", lnr)You can then convert mytree.dot to PNG image at the command line (requires GraphViz be installed):
$ dot -Tpng -o mytree.png mytree.dotMiscellaneous
IAI.set_display_label! — Function
set_display_label!(lnr::ClassificationTreeLearner, display_label::Any)
Changes which predicted probability is displayed when visualizing lnr to show the probability of display_label.
set_display_label!(grid::GridSearch{<:ClassificationTreeLearner}, display_label::Any)
Changes which predicted probability is displayed when visualizing grid to show the probability of display_label.
IAI.reset_display_label! — Function
reset_display_label!(lnr::ClassificationTreeLearner)
Resets the predicted probability displayed for lnr to be that of the predicted label.
reset_display_label!(grid::GridSearch{<:ClassificationTreeLearner})
Resets the predicted probability displayed for grid to be that of the predicted label.
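For example, assuming the classification learner and labels from the earlier sketch:

# Display the probability of the "high" label in each node when visualizing lnr
IAI.set_display_label!(lnr, "high")

# Revert to displaying the probability of the predicted label in each node
IAI.reset_display_label!(lnr)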