Differences between R and Julia

The IAI R interface matches the Julia API very closely, so you can refer to the Julia documentation for information on most tasks. On this page we note the main differences between the R and Julia interfaces.

Conversion of Julia data types to R

To determine the types to pass to an IAI function from the R interface, refer to the equivalent function in the Julia API and translate the types to their R equivalents. Most literal data types convert in a straightforward manner, for example:

  • Int to integer (can also pass as a round-number double, e.g., 1.0)
  • Float64 to double
  • String to character
  • Dict to list

Other Julia types can be passed as follows:

  • nothing can be passed using NULL
  • a Symbol can be passed as a character
  • a Vector can be passed as an atomic vector
  • a Matrix can be passed as a matrix
  • a DataFrame can be passed as a data.frame
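As an illustration of these conversions (a hypothetical translation; the parameter names come from the Julia OptimalTreeClassifier documentation), a Julia call such as IAI.OptimalTreeClassifier(max_depth=2, criterion=:gini) becomes:

```r
lnr <- iai::optimal_tree_classifier(
  max_depth = 2,       # Julia Int, passed here as a round-number double
  criterion = "gini"   # Julia Symbol, passed as a character
)
```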

Specifying Feature Set in R

We list the R input types for specifying a set of features in a data frame as learner parameters. Refer to IAI.FeatureSet for the Julia equivalents:

  • All: use all columns, e.g. list(All = c())
  • Integer or a vector of integers: specify indices of columns to use, e.g. 1, c(1, 3, 4)
  • String or a vector of strings: specify names of columns to use, e.g. "x1", c("x1", "x3")
  • Not: specify columns not to use, e.g. list(Not = 1), list(Not = c("x2", "x4"))
  • Between: specify a range of columns to use, e.g. list(Between = c("x1", "x4"))
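For instance, these forms can be supplied wherever a learner parameter accepts a feature set (a sketch using the Optimal Trees split_features parameter; the same forms apply to any such parameter):

```r
# Restrict splitting to the range of columns from x1 through x4
lnr <- iai::optimal_tree_classifier(
  split_features = list(Between = c("x1", "x4"))
)

# Exclude x2 and x4 from splitting
lnr <- iai::optimal_tree_classifier(
  split_features = list(Not = c("x2", "x4"))
)
```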

Interactive Visualizations

The write_html and show_in_browser functions work the same in R as in Julia for saving visualizations to file or displaying in an external browser, respectively. Additionally, visualizations will be automatically shown in the viewer pane when using RStudio, similar to how visualizations are automatically displayed in Jupyter notebooks.
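For example, given a fitted learner lnr (a minimal sketch):

```r
iai::write_html("tree.html", lnr)  # save the visualization to file
iai::show_in_browser(lnr)          # open it in the default browser
```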

To include data in the tree visualization, the data keyword argument should be passed as an unnamed list:

iai::write_html("tree.html", lnr, data = list(X, y))

Below are examples showing the equivalent R code for the advanced visualization examples in Julia. In these examples we work with the following tree learner:

[Optimal Trees visualization]

We can rename the features with a list that maps from the original names to more descriptive names:

vis_renamed_features <- iai::tree_plot(lnr, feature_renames = list(
  "disp" = "Displacement",
  "hp" = "Horsepower",
  "wt" = "Weight"
))

[Optimal Trees visualization]

We can also have a finer-grained control of what is displayed for each node, such as adding summary statistics. We create a list of lists with the parameters controlling what you want to show in each node and pass this as extra_content:

node_inds <- iai::apply_nodes(lnr, X)
extras <- lapply(node_inds, function(inds) {
  list(node_details_extra = paste0("<b>Mean horsepower in node:</b> ",
                                   round(mean(X[inds, "hp"]), digits = 2)))
})
vis_extra_text <- iai::tree_plot(lnr, extra_content = extras)

[Optimal Trees visualization]

Finally, we can combine multiple learners into a single visualization as described in the Julia documentation. In R, a question is a single-entry named list of the form list(question = responses), where question is the string for the question and responses is itself a list of possible responses:

questions <- list("Use learner with" = list(
  "renamed features" = vis_renamed_features,
  "extra text output" = vis_extra_text
))

[Optimal Trees visualization]
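The questions structure is then passed to the multi-visualization constructor (assuming iai::multi_tree_plot, the R counterpart of the Julia MultiTreePlot described in that documentation):

```r
iai::multi_tree_plot(questions)
```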

Tree Stability

Below are examples showing the equivalent R code for the tree stability examples in Julia. In these examples we work with the following tree learner:

[Optimal Trees visualization]

Stability Analysis

We conduct the stability analysis using stability_analysis:

stability <- iai::stability_analysis(lnr, train_X, train_y, criterion = "gini")

We can plot a summary of the analysis using plot.stability_analysis:

plot(stability)
We can use get_stability_results to extract the trees in order of training objective, along with the importance of each feature in each tree:

iai::get_stability_results(stability)
   train_error tree_index  variance  skewness    curtosis entropy
1    0.1276390         38 0.6017131 0.3954914 0.002795447       0
2    0.1276773         21 0.6159242 0.2375321 0.146543689       0
3    0.1276773         22 0.6159242 0.2375321 0.146543689       0
4    0.1276773         29 0.6159242 0.2375321 0.146543689       0
5    0.1276773         39 0.6159242 0.2375321 0.146543689       0
6    0.1276773         55 0.6159242 0.2375321 0.146543689       0
7    0.1276773         59 0.6159242 0.2375321 0.146543689       0
8    0.1276773         70 0.6159242 0.2375321 0.146543689       0
9    0.1276773         96 0.6159242 0.2375321 0.146543689       0
10   0.1276773         97 0.6159242 0.2375321 0.146543689       0
 [ reached 'max' / getOption("max.print") -- omitted 90 rows ]

We can use get_cluster_details to summarize the clustering of the first 10 trees:

iai::get_cluster_details(stability, 10)
  train_error_mean  variance  skewness    curtosis entropy
1        0.1276390 0.6017131 0.3954914 0.002795447       0
2        0.1276773 0.6159242 0.2375321 0.146543689       0

We can use get_cluster_distances to get the relative distances between each pair of clusters:

iai::get_cluster_distances(stability, 10)
          [,1]      [,2]
[1,] 0.0000000 0.2140483
[2,] 0.2140483 0.0000000

We can use get_cluster_assignments to see which trees comprise each cluster:

iai::get_cluster_assignments(stability, 10)
[1] 38

[1] 21 22 29 39 55 59 70 96 97

We see that Tree 38 is in its own cluster, and the other trees are all grouped together. Given this, we might want to inspect how Tree 38 differs from the others. We can use get_tree to construct a new learner that uses the tree at a specified index:

iai::get_tree(lnr, 38)
[Optimal Trees visualization]