"Storing data throws batch update error"
mario_playing_w
Member Posts: 5 Contributor II
Hello,
currently I am building a few processes on RapidAnalytics. When storing larger data sets I encounter a strange problem. As soon as I try to store more than 400k cases (5 attributes), the process fails on RapidAnalytics and throws
"javax.ejb.EJBException: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not execute JDBC batch update."
Is my data simply too large, or is my RAM insufficient? I have already worked with larger data sets, so this error while storing data seems strange to me.
Does anybody know this error and maybe a workaround for it?
Thanks!
Mario
Answers
1. What database are you using? What is the version?
2. How are you saving the data? Are you using a Write Database operator, or are you storing it in the repository? Providing a sample process would also help in finding the cause of the issue.
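As a general note on the error itself: a JDBC batch update fails or succeeds as a whole, so one oversized write can abort the entire store. A common workaround, independent of the exact backend, is to split a large write into smaller chunks with a commit per chunk. The sketch below illustrates the idea with Python's stdlib sqlite3 (the chunk size, table layout, and column names are illustrative assumptions, not RapidAnalytics internals):

```python
# Illustrative sketch: write 400k rows (5 attributes) in chunks so each
# transaction stays small, rather than one giant batch insert.
# Uses sqlite3 only to demonstrate the chunking pattern.
import sqlite3


def store_in_chunks(rows, chunk_size=50_000):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE cases (a1, a2, a3, a4, a5)")
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        conn.executemany("INSERT INTO cases VALUES (?, ?, ?, ?, ?)", chunk)
        conn.commit()  # commit per chunk keeps each transaction small
    return conn


rows = [(i, i, i, i, i) for i in range(400_000)]
conn = store_in_chunks(rows)
count = conn.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
print(count)  # 400000
```

In RapidAnalytics terms, the analogous approach would be looping over subsets of the example set and appending each subset to the target table, instead of writing all 400k cases in one go.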