Decision trees are among the most common and easily understood decision support tools.
Decision tree learning automatically finds the most important decision criteria to consider and presents them in an intuitive, explicit visual form.
This visual implements the popular and widely used recursive partitioning approach to decision tree construction. Each leaf of the tree is labeled with a class and a probability distribution over the classes. In addition, cross-validation is used to estimate the statistical performance of the decision tree.
If the target variable is categorical or has only a few possible values, a classification tree is constructed; if the target variable is numeric, the visual produces a regression tree.
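A minimal sketch of this behavior using rpart directly (not the visual's actual source; the dataset and parameter values below are illustrative assumptions):

```r
# Build a classification or a regression tree depending on the target type,
# mirroring the behavior described above.
library(rpart)
library(rpart.plot)

data(iris)

# Categorical target -> classification tree ("class" method),
# with 10-fold cross-validation for performance estimation
class_tree <- rpart(Species ~ ., data = iris, method = "class",
                    control = rpart.control(cp = 0.01, xval = 10))

# Numeric target -> regression tree ("anova" method)
reg_tree <- rpart(Sepal.Length ~ ., data = iris, method = "anova",
                  control = rpart.control(cp = 0.01, xval = 10))

# Each leaf shows the predicted class and the class probability distribution
rpart.plot(class_tree, type = 2, extra = 104)

# Cross-validated error by complexity parameter, useful for pruning decisions
printcp(class_tree)
```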
You can control the algorithm parameters and the visual attributes to suit your needs.
Here is how it works:
R package dependencies (auto-installed): rpart, rpart.plot, RColorBrewer
Supports R versions: R 3.3.1, R 3.3.0, MRO 3.3.1, MRO 3.3.0, MRO 3.2.2
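The dependencies above are installed automatically on first use. A minimal sketch of a typical install-if-missing pattern (the visual's own bootstrap code may differ):

```r
# Install any missing dependencies before loading them (illustrative pattern)
required <- c("rpart", "rpart.plot", "RColorBrewer")
missing <- required[!sapply(required, requireNamespace, quietly = TRUE)]
if (length(missing) > 0) {
  install.packages(missing, repos = "https://cran.r-project.org")
}
invisible(lapply(required, library, character.only = TRUE))
```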
This is an open source visual. Get the code from GitHub: https://github.com/microsoft/PowerBI-visuals-decision-tree