A decision tree is built by recursive partitioning. Gini Index = 1 - sum(proportion^2 of each category); for two classes it ranges from 0 (pure node) to 0.5 (maximally mixed). At each step, select the attribute with the minimum weighted Gini Index as the split - the first such split becomes the root node - then repeat the same steps for the next nodes.
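The Gini calculation and attribute selection above can be sketched in plain Python. This is a minimal illustration, not a full tree builder; the toy rows/labels and the helper names `gini` and `weighted_gini` are my own.

```python
from collections import Counter

def gini(labels):
    # Gini Index = 1 - sum(p_i^2) over the class proportions.
    n = len(labels)
    return 1 - sum((count / n) ** 2 for count in Counter(labels).values())

def weighted_gini(rows, labels, attr_index):
    # Split the rows by one categorical attribute and weight each
    # branch's Gini by the fraction of rows that fall into it.
    n = len(labels)
    branches = {}
    for row, label in zip(rows, labels):
        branches.setdefault(row[attr_index], []).append(label)
    return sum(len(b) / n * gini(b) for b in branches.values())

# Toy data: attribute 0 separates the classes perfectly, attribute 1 does not.
rows = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
labels = ["yes", "yes", "no", "no"]

print(gini(labels))                    # 0.5 -> maximum impurity for 2 classes
print(weighted_gini(rows, labels, 0))  # 0.0 -> pure split, chosen as root
print(weighted_gini(rows, labels, 1))  # 0.5 -> no improvement over the parent
```

The attribute with the lowest weighted Gini (here attribute 0) is chosen, matching the "select min GI" rule in the notes.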
Drawback of a single tree model: a slight change in the data can change the tree a lot, i.e. the model has high variance.
Ensemble modelling: a group of models combined to reduce that variance.
- Bagging (e.g. Random Forest): models run in parallel, each trained on a bootstrap sample of the same data (random sampling with replacement, so duplicates occur); their predictions are combined by voting or averaging.
- Boosting (e.g. Gradient Boosting): models run sequentially; each next model tries to correct the previous models' misclassifications (errors).
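The bagging idea can be sketched with a hypothetical stand-in learner: each "model" here just memorises the majority label of its bootstrap sample (a real Random Forest would fit a full decision tree per sample). The data, `train_stub`, and other names are illustrative assumptions.

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    # Sampling WITH replacement: the same row can appear more than once.
    return [rng.choice(data) for _ in data]

def majority_vote(votes):
    return Counter(votes).most_common(1)[0][0]

def train_stub(sample):
    # Toy learner standing in for a decision tree: predicts the
    # majority label of the bootstrap sample it was trained on.
    label = majority_vote([y for _, y in sample])
    return lambda x: label

rng = random.Random(42)
data = [(i, "yes") for i in range(8)] + [(8, "no"), (9, "no")]

# Bagging: train many models in parallel on different bootstrap samples,
# then aggregate their predictions by majority vote.
models = [train_stub(bootstrap_sample(data, rng)) for _ in range(25)]
print(majority_vote([m(0) for m in models]))
```

Because each model sees a slightly different resample, their individual errors tend to cancel out in the vote, which is how bagging lowers variance.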
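The boosting idea - each next model predicting the previous models' remaining error - can be shown with the weakest possible learner, a single constant fitted to the residuals. The targets, learning rate, and `fit_constant` helper are illustrative assumptions, not a real Gradient Boosting implementation.

```python
def fit_constant(residuals):
    # Weakest possible "model": predicts one constant, the mean residual.
    return sum(residuals) / len(residuals)

y = [3.0, 5.0, 7.0, 9.0]     # regression targets
learning_rate = 0.5
prediction = [0.0] * len(y)  # ensemble starts by predicting nothing

for _ in range(20):
    # Sequential execution: each round fits the errors left over
    # from all previous rounds, then adds its (scaled) correction.
    residuals = [target - p for target, p in zip(y, prediction)]
    step = fit_constant(residuals)
    prediction = [p + learning_rate * step for p in prediction]

# A constant learner can only drive the prediction toward mean(y) = 6.0;
# a real booster uses trees, so each stage can correct per-example errors.
print(prediction)
```

The key contrast with bagging: the models are not independent, because each one is defined by the mistakes of the ones before it.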