Applying a model to a large data set
Hi, I'm trying to apply a model to a database with 10 million records. The "Read Database" operator copies all the data from the database into memory on my machine, which causes an out-of-memory exception; on top of that, the database connection times out. "Stream Database" looks promising, but it seems to work only for building a model, not for applying one (I get an error when applying a model with this operator). I'm considering building a loop that fetches data with a parameterized SQL LIMIT; restricting the query to, e.g., 10,000 records at a time works very well when applying the model. Please help: I think there must be a smarter way than writing loops. Most ETL tools have a streaming database read.
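As a workaround, the loop-with-LIMIT idea you describe can be sketched outside RapidMiner. The sketch below (Python with stdlib sqlite3; table and column names are illustrative, not from your database) uses keyset pagination, i.e. `WHERE key > last_seen` on an indexed key, instead of a growing `OFFSET`. This keeps each query cheap and bounds memory to one batch, which matters at 10 million rows:

```python
import sqlite3

def iter_batches(conn, table, key, batch_size):
    """Yield rows in batches via keyset pagination on an indexed key.

    Unlike LIMIT/OFFSET, the WHERE key > last_seen form does not rescan
    already-consumed rows, so batch N costs the same as batch 1.
    Assumes `key` is the first column of the SELECT result.
    """
    last = None
    while True:
        if last is None:
            rows = conn.execute(
                f"SELECT * FROM {table} ORDER BY {key} LIMIT ?",
                (batch_size,),
            ).fetchall()
        else:
            rows = conn.execute(
                f"SELECT * FROM {table} WHERE {key} > ? ORDER BY {key} LIMIT ?",
                (last, batch_size),
            ).fetchall()
        if not rows:
            break
        yield rows
        last = rows[-1][0]  # remember the last key we saw

# Demo: 25 rows, batch size 10 -> batches of 10, 10, 5.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(1, 26)])

batch_sizes = [len(b) for b in iter_batches(conn, "records", "id", 10)]
print(batch_sizes)  # → [10, 10, 5]
```

Inside the loop you would apply the model to each batch and write the scores back, so only one batch is ever in memory. The same pattern works with any DB-API driver (e.g. a JDBC/ODBC connection), as long as the table has an indexed, monotonically ordered key.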