[SOLVED] Concatenation/grouping of rows based on a shared ID
I have been looking for a while but I couldn't find a solution to my problem, so maybe you guys know.
I have performed a web crawl of a forum (through a separate web crawler), and the data from this crawl has been written to a CSV.
My problem now is that every entry (original post & replies) is written on a separate row.
The format of my CSV is as follows: one column contains the title of the topic, the next column the title of the post, and the last column the actual text of each post.
How can I either combine all the text of one topic in one row, or create separate files per topic, with all the text of the separate posts in them?
I have added a link to a Google Docs spreadsheet with a small sample of my data: http://bit.ly/VDXeNU
Thanks a lot in advance!
Answers
Supposing your first column is called topic_name, you can use the following process to split the big file into smaller ones, each containing only rows from the same topic.
Best,
Marius
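The RapidMiner process referenced above is not included in this thread. As an illustrative alternative only, a minimal Python/pandas sketch of the same idea (one output file per topic) might look like the following; the column name topic_name comes from the answer, while the file names forum_crawl.csv and the topics/ output folder are assumptions for the example.

```python
# Illustrative sketch (not the original RapidMiner process): split a crawl CSV
# into one file per topic. File names and the column name are assumptions.
import os
import pandas as pd

df = pd.read_csv("forum_crawl.csv")          # hypothetical input file
os.makedirs("topics", exist_ok=True)

for topic, group in df.groupby("topic_name"):
    # Build a filesystem-safe file name from the topic title
    safe_name = "".join(c if c.isalnum() else "_" for c in str(topic))
    group.to_csv(os.path.join("topics", safe_name + ".csv"), index=False)
```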
If you are not familiar with macros, you should experiment a bit with operators such as Set Macro.
Happy Mining!
~Marius
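For the other half of the original question (combining all the text of one topic into a single row rather than writing separate files), a hedged sketch of the same grouping idea could concatenate the text column per topic; the column names topic_name and text and the file names are assumptions, not part of the original answer.

```python
# Illustrative sketch: collapse all posts of a topic into a single row by
# concatenating the text column. Column and file names are assumptions.
import pandas as pd

df = pd.read_csv("forum_crawl.csv")          # hypothetical input file

combined = (
    df.groupby("topic_name", as_index=False)
      .agg({"text": lambda texts: " ".join(texts.astype(str))})
)
combined.to_csv("topics_combined.csv", index=False)
```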