
Importing Large Data Sets into the Cloud

User13 Member Posts: 155 Maven

When reading large files into the cloud from inside a process, you may run out of local memory on your machine, causing the read to fail. You may see this error:

main memory limit reached

This happens because the system loads the entire file into local memory before streaming it to the cloud.
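The memory problem can be illustrated with a short, hypothetical Python sketch (the function names, `send` callback, and chunk size are all assumptions for illustration, not part of any RapidMiner API): loading a whole file before uploading makes memory use grow with file size, while streaming it in fixed-size chunks keeps memory bounded.

```python
def upload_all_at_once(path, send):
    # Loads the entire file into local memory first. Memory use grows
    # with file size, which is what triggers the
    # "main memory limit reached" error on large files.
    with open(path, "rb") as f:
        data = f.read()
    send(data)

def upload_streaming(path, send, chunk_size=8 * 1024 * 1024):
    # Reads and sends fixed-size chunks. Memory use stays bounded by
    # chunk_size no matter how large the file is.
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            send(chunk)
```

The import wizard described below effectively takes the second approach for you, which is why uploading through it succeeds where reading the file inside a process fails.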

Reading the file inside the process is not the right approach here. Instead, load the file into your cloud repository manually.

To do this, select File > Import Data, then choose whichever import type is appropriate.

The import wizard works the same as usual until Step 5, where you should select the cloud repository instead of the local one.

Once the data is in the cloud repository, you will be able to execute processes against it without error.
