ID3 operator - decision tree gets more complex when increasing minimal gain

Ike3000 Member Posts: 5 Contributor I
edited November 2018 in Help

Hi,

I have a small example set of data which was provided to us in a study assignment. It's an Excel file with two sheets. The first sheet is the training set. It has 25 instances of animals with these columns: class, animal, respiration, reproduction, habitat, body hair, limbs front, limbs back, mammal (yes/no).

 

Out of these, the following are actual attributes: respiration, reproduction, habitat, body hair, limbs front, limbs back.

I use these in a chain of operators to process them with the ID3 operator. The ID3 operator is set to: criterion (information gain), minimal size for split (4), minimal leaf size (2).

 

The second sheet is equivalent to the first sheet, but only contains 5 instances of animals.

 

When I now vary the minimal gain setting of the ID3 operator, counterintuitively the tree gets more complex the higher the minimal gain is set.

The tree is simplest when minimal gain is set to 0.1, more complex when set to 1.0, and most complex when set to 10.0. How is this possible?
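
For reference, this is the relationship I expected. The sketch below is not RapidMiner's ID3: it uses scikit-learn's min_impurity_decrease threshold and the iris data as stand-ins for the minimal gain parameter and the animal sheets, so the library, dataset and parameter choice are my own assumptions.

```python
# Minimal sketch (assumption: scikit-learn's min_impurity_decrease as a stand-in
# for ID3's minimal gain, iris data as a stand-in for the animal sheets).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# A higher gain/impurity threshold should allow fewer splits, i.e. a smaller tree.
for threshold in (0.0, 0.05, 0.2):
    tree = DecisionTreeClassifier(criterion="entropy",
                                  min_impurity_decrease=threshold,
                                  random_state=0).fit(X, y)
    print(f"threshold={threshold}: {tree.get_n_leaves()} leaves")
# The number of leaves should shrink (or stay the same) as the threshold grows.
```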

 

Excuse my English!


Answers

  • Ike3000 Member Posts: 5 Contributor I

    Does anyone have an idea? Am I correct in assuming that increasing the minimal gain parameter in the ID3 operator should result in a more compact decision tree?

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,503 RM Data Scientist

    I agree, the usual behaviour should be: the smaller the min_gain, the bigger the tree.

    But I would also expect the min_gain to be in [0, 1] and not > 1.

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • Ike3000 Member Posts: 5 Contributor I

    minimal gain

    Description: The gain of a node is calculated before splitting it. The node is split if its Gain is greater than the minimal gain. Higher value of minimal gain results in fewer splits and thus a smaller tree. A too high value will completely prevent splitting and a tree with a single node is generated.

     

    With a training set of only 25 instances and an example set (to be classified) of only 5 instances - what happens if I change the minimal gain parameter but there aren't enough examples in the set for the minimal gain value to have any effect? I mean, the tree would still be constructed, right? Would another parameter take over instead? I don't see the behaviour that the parameter description promises (= a too high value will prevent splitting), even if I set the minimal gain parameter to 1000.
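
    To make the question concrete, here is a minimal sketch in Python (my own toy rows, not the real sheet, and not the RapidMiner implementation) of how information gain and a minimal gain threshold interact. With a yes/no label the gain of a split can never exceed 1 bit, so values like 10 or 1000 should block every split rather than grow the tree:

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy (in bits) of a list of class labels."""
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def information_gain(examples, attribute, label="mammal"):
        """Label entropy minus the weighted entropy after splitting on `attribute`."""
        before = entropy([e[label] for e in examples])
        after = 0.0
        for value in {e[attribute] for e in examples}:
            subset = [e[label] for e in examples if e[attribute] == value]
            after += len(subset) / len(examples) * entropy(subset)
        return before - after

    # Invented toy rows in the spirit of the assignment data (not the real sheet).
    examples = [
        {"respiration": "lungs", "reproduction": "live birth", "mammal": "yes"},
        {"respiration": "lungs", "reproduction": "live birth", "mammal": "yes"},
        {"respiration": "gills", "reproduction": "eggs",       "mammal": "no"},
        {"respiration": "lungs", "reproduction": "eggs",       "mammal": "no"},
        {"respiration": "gills", "reproduction": "eggs",       "mammal": "no"},
    ]

    for min_gain in (0.1, 1.0, 10.0):
        splittable = [a for a in ("respiration", "reproduction")
                      if information_gain(examples, a) > min_gain]
        print(f"min_gain={min_gain}: attributes that still qualify for a split -> {splittable}")

    # The label is binary (yes/no), so every gain is at most 1 bit. A minimal gain
    # above 1 should therefore prevent any split and leave a single-node tree.
    ```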
