Hello all,
I am trying to import a large amount of historical sensor data.
I am using a self-written service to import stream entries from CSV files.
The largest CSV file has 2.6 million rows. The import takes a few minutes and the service execution completes successfully.
But counting the stream entries returns only 2.1 million entries.
I see the same issue when deleting stream entries: calling the "PurgeStreamEntries" service on the stream to delete all 2.1 million entries returns a successful result, but it only deletes about 1/5 of the entries.
Could this be a Tomcat memory issue?
My test system runs Tomcat with a maximum of 8 GB of memory.
I found the solution to this issue.
I increased the "Max Queue Size" and the "Max Wait Time Before Flushing Stream Buffer" in the StreamProcessingSubsystem configuration.
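For context on why an import can report success while entries go missing: if rows arrive faster than the stream buffer is flushed, the queue fills up and further entries can be lost without the service raising an error. Below is a toy simulation of that failure mode; the queue model, function name, and all numbers are illustrative assumptions, not the actual StreamProcessingSubsystem implementation:

```javascript
// Toy model of a bounded stream-entry queue. Entries that arrive while
// the queue is full are silently dropped, so the import still "succeeds"
// even though only part of the data is persisted.
function importRows(rowCount, maxQueueSize, flushEvery, flushBatch) {
  let queued = 0;    // entries waiting in the buffer
  let persisted = 0; // entries actually written to the stream
  let dropped = 0;   // entries lost because the queue was full

  for (let i = 1; i <= rowCount; i++) {
    if (queued < maxQueueSize) {
      queued++;
    } else {
      dropped++;     // queue full: entry is lost, no error is reported
    }
    // every `flushEvery` rows, the background flush drains a batch
    if (i % flushEvery === 0) {
      const n = Math.min(queued, flushBatch);
      persisted += n;
      queued -= n;
    }
  }
  persisted += queued; // final flush once the import finishes
  return { persisted, dropped };
}
```

With a small queue the simulation drops most rows once the queue saturates, while a large enough queue (or a more aggressive flush) lets every row through. That matches the effect of raising "Max Queue Size" and tuning the flush wait time in the subsystem configuration.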