"[SOLVED] Tokenize - Generate n-grams and Filters"
I need to perform the following procedure:
1) Read a text document
2) Tokenize it
3) Generate compound words (n-grams)
4) Delete all compound words that are not in a given list
I was able to tokenize and generate the compound words. Using the "Text: Filter Tokens (by Content)" operator, I added one compound word in the "string" parameter and the filtering works.
The problem is that I don't know how to add more than one word, so that I can filter several compounds at once.
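For reference, here is a minimal Python sketch of the intended procedure outside RapidMiner (the function and variable names are my own, and the `_` separator mirrors what the n-gram operator produces by default):

```python
def ngrams(tokens, n=2, sep="_"):
    """Join every run of n consecutive tokens with sep,
    producing 'compound words' such as 'big_data'."""
    return [sep.join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# 1) + 2) read and tokenize (a plain whitespace split for illustration)
text = "big data and machine learning on big data"
tokens = text.split()

# 3) generate the compound words (bigrams here)
compounds = ngrams(tokens, n=2)

# 4) keep only the compounds that appear in a given list
whitelist = {"big_data", "machine_learning"}
kept = [c for c in compounds if c in whitelist]
print(kept)
```

Inside RapidMiner itself, if I remember correctly, "Filter Tokens (by Content)" also offers a regular-expression condition, where several compounds could be combined into one pattern (e.g. `big_data|machine_learning`), but please treat that as a suggestion to verify, not a confirmed answer.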
Thank you very much in advance.