We are downloading a huge number of rows from SQL via ThingWorx using the export widget. Since the result sets exceed 1–1.5 million rows (10–15 lakhs), Tomcat runs out of memory and ThingWorx goes blank abruptly. We have tried many of the solutions PTC suggested, including increasing the Tomcat memory allocation, but it did not work, and the reporting server goes down frequently. Could you suggest an ideal number of rows to query at a time, so that we avoid fetching the full data set at once and instead split the query into batches of that size?
E.g., if the suggested batch size is 100,000 rows and we have to pull 1M rows, we would query 100,000 rows at a time and run the query 10 times to fetch all 1M rows (1M / 100,000).
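For illustration, the batch-count arithmetic described above can be sketched as follows (the row counts are just the numbers from the example; `plan_batches` is a hypothetical helper, not a ThingWorx API):

```python
import math

def plan_batches(total_rows: int, batch_size: int) -> int:
    """Number of queries needed to fetch total_rows in chunks of batch_size.

    math.ceil covers the case where the total is not an exact multiple
    of the batch size (the last query returns a partial batch)."""
    return math.ceil(total_rows / batch_size)

print(plan_batches(1_000_000, 100_000))  # 10 queries of 100,000 rows each
print(plan_batches(1_500_000, 100_000))  # 15 queries
```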
Please advise, as this is critically impacting our production server.
Thanks in advance,
I recommend you do this export directly in SQL Server if possible.
If not, I recommend you have a ThingWorx server-side process that pulls records in batches and writes them to a CSV file in a ThingWorx File Repository.
Then you provide download links for those files.
Remember that pulling the records server side also impacts Tomcat memory, so I suggest a smaller number of rows per query and more iterations.
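A minimal sketch of that pull-in-batches loop, in plain Python for clarity (in practice this would be a ThingWorx service; `fetch_batch` is a stand-in for a paged SQL query such as `ORDER BY id OFFSET ... FETCH NEXT ... ROWS ONLY`, and the batch size is an assumption to be tuned against your Tomcat heap):

```python
import csv
import io

BATCH_SIZE = 50_000  # assumption: tune down if Tomcat memory is still under pressure

# Small in-memory stand-in for the SQL result set, for illustration only.
SAMPLE_ROWS = [(i, f"name-{i}") for i in range(7)]

def fetch_batch(offset, limit):
    """Stand-in for one paged query, e.g.
    SELECT id, name FROM t ORDER BY id
    OFFSET :offset ROWS FETCH NEXT :limit ROWS ONLY."""
    return SAMPLE_ROWS[offset:offset + limit]

def export_in_batches(writer, batch_size=BATCH_SIZE):
    """Pull rows page by page and append each page to the CSV immediately,
    so only one batch is ever held in memory at a time."""
    offset = 0
    total = 0
    while True:
        rows = fetch_batch(offset, batch_size)
        if not rows:
            break
        writer.writerows(rows)
        total += len(rows)
        offset += batch_size
    return total

buf = io.StringIO()  # in a real service, an appendable file in the File Repository
w = csv.writer(buf)
w.writerow(["id", "name"])  # write the header once, then append batch by batch
count = export_in_batches(w, batch_size=3)
print(count)  # 7 rows exported across 3 partial queries
```

The point of the design is that the full result set never exists in server memory at once: each batch is written out and released before the next query runs, which is what keeps the Tomcat heap flat regardless of total row count.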
There are many factors that would need to be considered for determining optimal settings. What is your use case? How often does this need to occur? How many users have this ability?
There's no easy answer to this question, but we have consultants that would be happy to assist with sizing and design. You can engage with a consultant via the Sales team.