Text Mining for user comments from different business units

usct01 Member Posts: 10 Contributor II
edited November 2018 in Help
Hello All,
I have data in Excel with 1000 rows and 10 columns.
Each row represents a record about a customer, and the columns hold customer attributes such as ID, Business Unit, Location, Revenue Group, Product Subscribed, etc. One column contains the customer's feedback.
I want to analyze the Feedback column using text processing and gain insights by Product, Location, Business Unit, etc.

I am able to create a frequency count of words in the Feedback column if I use a new data set containing only that column.
That way, however, I can't distinguish the impact by location. For example, there are 4 locations (Loc1, Loc2, Loc3, Loc4), and for each location I want to know which words appear most often in the feedback.

Please help me achieve this.
I am new to RapidMiner and still in the learning phase.
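For reference, the goal here, word frequencies computed separately per location, can be sketched in a few lines of plain Python. The column names "Location" and "Feedback" and the sample rows below are assumptions for illustration, not the original data:

```python
from collections import Counter

# Hypothetical records mirroring the described layout: each row has a
# Location attribute and a free-text Feedback field (assumed names).
records = [
    {"Location": "Loc1", "Feedback": "great product fast delivery"},
    {"Location": "Loc1", "Feedback": "great support"},
    {"Location": "Loc2", "Feedback": "slow delivery"},
]

# Accumulate a separate word counter for each location.
counts = {}
for rec in records:
    words = rec["Feedback"].lower().split()
    counts.setdefault(rec["Location"], Counter()).update(words)

print(counts["Loc1"].most_common(1))  # [('great', 2)]
```

This is only the counting idea; inside RapidMiner the same effect is achieved with the Text Processing operators applied per subgroup, as described in the answer below.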



    MariusHelf RapidMiner Certified Expert, Member Posts: 1,869 Unicorn
    Hi Deepak,

    You can do this with the Loop Values operator and a Filter Examples operator inside it.
    Loop Values iterates over the values of an attribute and sets a macro to the current value. Inside the loop you can filter your example set so that it contains only the examples where the chosen attribute has the current value.

    In the example below I loop over the values of the label, but you can choose any other nominal or integer attribute. Just be sure to change it in both the Loop Values operator and in the expression of Filter Examples.

    Best, Marius
    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <process version="5.2.009">
      <operator activated="true" class="process" compatibility="5.2.009" expanded="true" name="Process">
        <process expanded="true" height="428" width="924">
          <operator activated="true" class="generate_nominal_data" compatibility="5.2.009" expanded="true" height="60" name="Generate Nominal Data" width="90" x="112" y="30"/>
          <operator activated="true" class="loop_values" compatibility="5.2.009" expanded="true" height="76" name="Loop Values" width="90" x="246" y="30">
            <parameter key="attribute" value="label"/>
            <process expanded="true" height="446" width="942">
              <operator activated="true" breakpoints="after" class="filter_examples" compatibility="5.2.009" expanded="true" height="76" name="Filter Examples" width="90" x="112" y="30">
                <parameter key="condition_class" value="attribute_value_filter"/>
                <parameter key="parameter_string" value="label=%{loop_value}"/>
              </operator>
              <operator activated="true" class="subprocess" compatibility="5.2.009" expanded="true" height="76" name="Do whatever you want :)" width="90" x="313" y="30">
                <process expanded="true">
                  <portSpacing port="source_in 1" spacing="0"/>
                  <portSpacing port="source_in 2" spacing="0"/>
                  <portSpacing port="sink_out 1" spacing="0"/>
                  <portSpacing port="sink_out 2" spacing="0"/>
                </process>
              </operator>
              <connect from_port="example set" to_op="Filter Examples" to_port="example set input"/>
              <connect from_op="Filter Examples" from_port="example set output" to_op="Do whatever you want :)" to_port="in 1"/>
              <connect from_op="Do whatever you want :)" from_port="out 1" to_port="out 1"/>
              <portSpacing port="source_example set" spacing="0"/>
              <portSpacing port="sink_out 1" spacing="0"/>
              <portSpacing port="sink_out 2" spacing="0"/>
            </process>
          </operator>
          <connect from_op="Generate Nominal Data" from_port="output" to_op="Loop Values" to_port="example set"/>
          <connect from_op="Loop Values" from_port="out 1" to_port="result 1"/>
          <portSpacing port="source_input 1" spacing="0"/>
          <portSpacing port="sink_result 1" spacing="0"/>
          <portSpacing port="sink_result 2" spacing="0"/>
        </process>
      </operator>
    </process>
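    For readers outside RapidMiner: the Loop Values / Filter Examples pattern boils down to iterating over the distinct values of an attribute and, for each value, keeping only the rows that match it before running the sub-analysis. A rough Python sketch of the same control flow (the rows and the column name "label" are made up for illustration):

```python
# Rough equivalent of Loop Values (outer loop over distinct values)
# plus Filter Examples (keep rows where the attribute equals the
# current loop value). Not RapidMiner code; names are assumptions.
rows = [
    {"label": "A", "text": "good"},
    {"label": "B", "text": "bad"},
    {"label": "A", "text": "fine"},
]

subsets = {}
for value in sorted({r["label"] for r in rows}):               # Loop Values
    subsets[value] = [r for r in rows if r["label"] == value]  # Filter Examples

print({v: len(s) for v, s in subsets.items()})  # {'A': 2, 'B': 1}
```

    Each subset would then be fed to the "Do whatever you want :)" subprocess, e.g. the text-processing operators that compute the word frequencies.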
    usct01 Member Posts: 10 Contributor II
    Thanks Marius, will try and post the results.