Two simple Questions concerning Stem (Snowball) and Tokenize (Linguistic Tokens)
Hello,
I'm writing a thesis on a text mining problem and I use the Stem (Snowball) operator as well as the Tokenize operator (mode: Linguistic Tokens) in my process.
Are there any resources that explain exactly what Stem (Snowball) and Tokenize (mode: Linguistic Tokens) do, and how they work?
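For context, my current (rough) understanding is sketched below in Python, using NLTK's SnowballStemmer as an analogue of the Stem (Snowball) operator; the simple regex tokenizer is only my guess at what the Linguistic Tokens mode does, not the actual RapidMiner implementation:

```python
# Rough illustration of my understanding, not RapidMiner's own code.
import re
from nltk.stem.snowball import SnowballStemmer

text = "The miners were mining and processing documents."

# Tokenize (mode: Linguistic Tokens) -- as I understand it, this splits the
# text into word tokens and drops punctuation and numbers (approximated here
# with a simple letters-only regex).
tokens = re.findall(r"[A-Za-z]+", text)
print(tokens)
# ['The', 'miners', 'were', 'mining', 'and', 'processing', 'documents']

# Stem (Snowball) -- the Snowball (Porter2) algorithm strips suffixes so that
# inflected forms of a word map to the same stem.
stemmer = SnowballStemmer("english")
print([stemmer.stem(t) for t in tokens])
# ['the', 'miner', 'were', 'mine', 'and', 'process', 'document']
```

Is this roughly what the two operators do internally, or is there more to it (e.g. language-specific rules)?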
Thank you.