
Freezing after running a process

Sadegh Member Posts: 2 Newbie
edited June 2021 in Help
My RapidMiner stops working and freezes, so I can't get any output. I have checked my RAM, CPU, and everything else; they all work properly. Setup: Linux Ubuntu 20.04 64-bit, 12 GB of RAM, i7 processor, GTX 850M graphics. I downloaded the latest version of RapidMiner today and the problem is the same.
A 300×12 CSV file was used with an Auto Model build.

Let me add some info to the question

This is a screenshot from the moment my model finished. It freezes right here. No resources are being used at this point; it just stops working.

This is the last few lines of Terminal:

[ForkJoinPool-1-worker-1] INFO io.jhdf.GroupImpl - Lazy loading children of '//'
[ForkJoinPool-1-worker-1] INFO io.jhdf.HdfFile - Closed HDF file '/home/sadegh/Documents/RapidMiner/Temporary Repository/heart_failure_clinical_records_dataset Temp 29012.rmhdf5table'
Jun 28, 2021 11:25:44 AM logWarning
WARNING: Example set has no nominal label: using shuffled partition instead of stratified partition
Jun 28, 2021 11:25:44 AM com.rapidminer.Process execute
INFO: Process finished successfully after 0 s

Also, this is what happens if I try to do anything with it.

Here is some information about my configuration:

sadegh@black-pearl:~$ which java
sadegh@black-pearl:~$ java -version
openjdk version "1.8.0_292"
OpenJDK Runtime Environment (build 1.8.0_292-8u292-b10-0ubuntu1~20.04-b10)
OpenJDK 64-Bit Server VM (build 25.292-b10, mixed mode)


    kayman Member Posts: 662 Unicorn
    Use the resource screen; it helps you get an idea of where things go wrong. Start your process with plenty of breakpoints, and at each stop look at memory consumption. If you see a sudden increase, optimise the previous step.
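    The memory check at each breakpoint can also be done from a terminal alongside the resource screen. A minimal sketch, assuming a POSIX shell on Linux; the process pattern "RapidMiner" in the usage line is an assumption, so adjust it to match your install:

    ```shell
    # Print a process's resident set size (RSS, in kB) every 2 seconds until
    # it exits, so memory jumps can be correlated with breakpoints.
    watch_mem() {
        pid="$1"
        while kill -0 "$pid" 2>/dev/null; do
            ps -o rss= -p "$pid"   # RSS in kilobytes
            sleep 2
        done
    }

    # Hypothetical usage: watch the first process whose command line matches
    # "RapidMiner" (adjust the pattern to your installation):
    # watch_mem "$(pgrep -f RapidMiner | head -n 1)"
    ```

    A steadily growing RSS right after one particular operator points at the step worth optimising first.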

    If you have loops and parallel processing, try without them first. If you have filters, use the 'remove unused...' operator right after your filter operator to get rid of unnecessary clutter.

    12 GB is on the low side, but it should be enough. Also, in your settings, limit the total amount of memory RapidMiner can use, say to 6 GB, so you ensure the rest of your machine has some resources when needed. RapidMiner tends to take whatever it can, which is not the same as what it needs. I had the same issue on Ubuntu.
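    One rough way to pick such a cap is half of physical RAM, which gives 6 GB on the poster's 12 GB machine. A small sketch for Linux (matching the Ubuntu 20.04 setup above), assuming `/proc/meminfo` is available:

    ```shell
    # Compute roughly half of physical RAM, in MB, as a candidate memory
    # limit to enter in RapidMiner's settings; the other half stays free
    # for the OS and other applications.
    total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
    half_mb=$(( total_kb / 2 / 1024 ))
    echo "Suggested RapidMiner memory limit: ${half_mb} MB"
    ```

    The value is only a starting point; if the process still swaps or freezes, lower it further.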

    Also, make use of the Release Memory and Materialize Data operators after heavy transformation steps; this also has a positive impact on memory consumption.
    Sadegh Member Posts: 2 Newbie
    edited June 2021
    Please consider the details given in the question above.