"Storing data throws batch update error"
Hello,
currently I am building a few processes on RapidAnalytics. When storing larger data sets I encounter a strange problem: as soon as I try to store more than 400k cases (5 attributes), the process fails on RapidAnalytics and throws
"javax.ejb.EJBException: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not execute JDBC batch update."
Is my data simply too large, or is my RAM insufficient? I have already worked with larger datasets, so this error while storing data seems strange to me.
Does anybody know this error and maybe a workaround for it?
Thanks!
Mario
Answers
1. What database are you using? What is its version?
2. How are you saving the data? Are you using a Write Database operator, or are you storing it in the repository? Providing a sample process would also help us find the cause of the issue.
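If the failure turns out to come from one oversized JDBC batch or transaction, a common workaround is to write the rows in smaller chunks and commit after each one. Below is a minimal JDBC sketch of that idea; the table name `example_table`, the column names, and the batch size of 10,000 are illustrative assumptions, not anything RapidAnalytics actually uses internally:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchedWriter {
    // Chunk size is an assumption; tune it to your database's packet/transaction limits.
    static final int BATCH_SIZE = 10_000;

    // Inserts each row (5 attributes) in fixed-size batches, committing per batch so a
    // single huge batch does not trigger "Could not execute JDBC batch update".
    static void writeInBatches(Connection conn, List<Object[]> rows) throws SQLException {
        String sql = "INSERT INTO example_table (a1, a2, a3, a4, a5) VALUES (?, ?, ?, ?, ?)";
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (Object[] row : rows) {
                for (int i = 0; i < row.length; i++) {
                    ps.setObject(i + 1, row[i]);
                }
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();
                    conn.commit(); // flush this chunk before starting the next one
                    pending = 0;
                }
            }
            if (pending > 0) { // flush the final partial batch
                ps.executeBatch();
                conn.commit();
            }
        }
    }

    // Pure helper: how many batches a given row count needs at a given batch size.
    static int batchCount(int rows, int batchSize) {
        return (rows + batchSize - 1) / batchSize;
    }
}
```

With 400k rows and a batch size of 10,000 this performs 40 small batch executions instead of one giant one, which also keeps each transaction's memory footprint bounded on the server side.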