Memory problems

Viktor_Meyer Member Posts: 6 Contributor II
edited November 2018 in Help
Hello,

I'm using RapidMiner 4.6 with the ExampleSource operator to load 1.2 million data rows with 65 attributes. I have 4 GB of RAM on my Mac. Is there any way to read in this amount of data? I always get an out-of-memory error, and even before that it already takes 12 or more to read the data. Is there a way, ideally a faster one, to do this?

Greetings, Viktor

Answers

  • dragoljub Member Posts: 241 Contributor II
    Update to the latest release, RapidMiner 5.0.009; the old version is no longer supported. A simple comparison of the file's size (in binary) against the RAM you have available should tell you whether it can fit in memory. If it can't, use a database and process the data on the fly without loading it all into memory.
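
    For the numbers in this thread, a back-of-the-envelope estimate might look like this (a minimal Python sketch; the 8 bytes per value is an assumption, and RapidMiner's actual per-value overhead in the JVM is larger):

        rows = 1_200_000
        attributes = 65
        bytes_per_value = 8  # assumption: each value stored as an 8-byte double

        raw = rows * attributes * bytes_per_value
        print(f"raw table size: {raw / 1024**2:.0f} MiB")  # prints ~595 MiB

    So the raw values alone come to roughly 0.6 GiB; with JVM object overhead and the working copies some operators make, that can easily exhaust the heap available on a 4 GB machine.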

    -Gagi
  • fischer Member Posts: 439 Maven
    Hi,

    let me add that we also fixed some memory problems in 5.0.009, so it may actually be worth updating.

    Best,
    Simon
  • Viktor_Meyer Member Posts: 6 Contributor II
    Okay, and how do I get the .dat file into a database? I tried to load it with RapidMiner (using an .aml attribute description) and export it to Ingres, but that also fails because of insufficient memory.

    Cheers,
    Viktor
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi Viktor,
    if you exported it as an .aml file, you probably generated it with RapidMiner in the first place? In that case you could export the data in chunks, reload the chunks with a Loop Files operator, and append each one to the database table. Since it's a more complex process, I'll save the design until you confirm that this is actually possible :)
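
    The same chunked load can also be sketched outside RapidMiner (a minimal Python sketch, assuming a comma-separated .dat file with a header row, and using SQLite as a stand-in for Ingres; the file name, table name, and batch size are all hypothetical):

        import csv
        import sqlite3

        BATCH = 10_000  # rows per transaction; keeps memory use bounded

        con = sqlite3.connect("data.db")
        with open("data.dat", newline="") as f:
            reader = csv.reader(f)
            header = next(reader)  # assumes the first line names the columns
            cols = ", ".join(f'"{c}"' for c in header)
            marks = ", ".join("?" for _ in header)
            con.execute(f"CREATE TABLE IF NOT EXISTS examples ({cols})")
            batch = []
            for row in reader:  # streams the file row by row
                batch.append(row)
                if len(batch) == BATCH:
                    con.executemany(f"INSERT INTO examples VALUES ({marks})", batch)
                    con.commit()
                    batch = []
            if batch:  # flush the final partial chunk
                con.executemany(f"INSERT INTO examples VALUES ({marks})", batch)
                con.commit()
        con.close()

    Once the data sits in the database, RapidMiner can read and process it from there instead of holding the whole table in RAM.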

    Greetings,
    Sebastian