
Decision Tree - only one attribute per branch?

DBenkertDBenkert Member Posts: 2 Newbie
Hello, I hope everyone here is doing great!

I have a question regarding decision trees. Is it possible to set up the decision tree so that the model uses each attribute at most once per branch? I need this for a project for my studies, and it would mean a lot if someone here could help me :smile: .

Thanks in advance!

Answers

  • MartinLiebigMartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,525 RM Data Scientist
    Hi,
    not with the normal decision trees. The interactive ones may be grown like this.
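    For illustration (not RapidMiner itself), here is a minimal plain-Python sketch of an ID3-style learner that removes each attribute from the candidate set once it has been used for a split, so no attribute can appear twice on any branch. With nominal attributes, classic ID3 behaves this way by construction; the dataset and names below are made up for the example:

    ```python
    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def build_tree(rows, labels, attributes):
        """Grow a tree recursively; a used attribute is dropped from the
        candidate set, so it appears at most once per branch."""
        if len(set(labels)) == 1 or not attributes:
            return Counter(labels).most_common(1)[0][0]  # leaf: majority class

        def gain(attr):
            # information gain of splitting on attr
            g = entropy(labels)
            for value in set(row[attr] for row in rows):
                subset = [l for row, l in zip(rows, labels) if row[attr] == value]
                g -= len(subset) / len(labels) * entropy(subset)
            return g

        best = max(attributes, key=gain)
        remaining = [a for a in attributes if a != best]  # enforce once-per-branch
        branches = {}
        for value in set(row[best] for row in rows):
            sub_rows = [row for row in rows if row[best] == value]
            sub_labels = [l for row, l in zip(rows, labels) if row[best] == value]
            branches[value] = build_tree(sub_rows, sub_labels, remaining)
        return (best, branches)

    # Toy example
    rows = [{"outlook": "sunny", "wind": "weak"},
            {"outlook": "rain",  "wind": "weak"},
            {"outlook": "sunny", "wind": "strong"},
            {"outlook": "rain",  "wind": "strong"}]
    labels = ["yes", "no", "yes", "no"]
    print(build_tree(rows, labels, ["outlook", "wind"]))
    ```

    The key line is the one building `remaining`: each recursive call only sees the attributes not yet used on the current path.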

    BR,
    Martin
    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • machbrown2machbrown2 Member Posts: 3 Contributor I

    Hello,

    Certainly! A One Rule (OneR) decision tree is built like this:

    1. Build a Rule per Attribute:

      • For each attribute, map every one of its values to the majority class of the training instances with that value.
    2. Evaluate Each Rule:

      • Count how many training instances each attribute's rule misclassifies.
    3. Choose the Best Rule:

      • Keep the attribute whose rule has the lowest classification error.
    4. Apply the Rule:

      • Classify instances using the winning rule.
    5. Tree Structure:

      • The resulting decision tree has a single level: one split on the chosen attribute, with leaf nodes representing the predicted classes. Each attribute therefore appears at most once per branch by construction.

    Example in Python (using scikit-learn):

    ```python
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Load a sample dataset
    iris = load_iris()
    X_train, X_test, y_train, y_test = train_test_split(
        iris.data, iris.target, random_state=42)

    # Create a one-rule decision tree: a single split on one attribute
    tree_clf = DecisionTreeClassifier(max_depth=1)

    # Train the classifier
    tree_clf.fit(X_train, y_train)

    # Display the selected rule
    tree_rules = export_text(tree_clf, feature_names=iris.feature_names)
    print(tree_rules)
    ```
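    For comparison, the OneR steps above can also be sketched directly in plain Python, without scikit-learn. The toy dataset and names here are made up for illustration:

    ```python
    from collections import Counter, defaultdict

    def one_rule(rows, labels):
        """OneR: for each attribute, build a rule mapping each value to the
        majority class, then keep the attribute whose rule errs least."""
        best_attr, best_rule, best_errors = None, None, None
        for attr in rows[0]:
            # collect the labels seen for each value of this attribute
            by_value = defaultdict(list)
            for row, label in zip(rows, labels):
                by_value[row[attr]].append(label)
            # rule: each attribute value -> its majority class
            rule = {v: Counter(ls).most_common(1)[0][0] for v, ls in by_value.items()}
            # error: training instances the rule misclassifies
            errors = sum(rule[row[attr]] != label for row, label in zip(rows, labels))
            if best_errors is None or errors < best_errors:
                best_attr, best_rule, best_errors = attr, rule, errors
        return best_attr, best_rule

    # Toy example
    rows = [{"outlook": "sunny", "wind": "weak"},
            {"outlook": "rain",  "wind": "weak"},
            {"outlook": "sunny", "wind": "strong"},
            {"outlook": "rain",  "wind": "strong"}]
    labels = ["yes", "no", "yes", "no"]
    print(one_rule(rows, labels))
    ```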




  • DBenkertDBenkert Member Posts: 2 Newbie
    Ahh, thank you so much!!! :) I really appreciate your help!