Loading / Deleting Millions of Stream Entries

SOLVED

Hello all,

I am trying to import a large amount of historical sensor data.

I am using a self-written service to import stream entries from CSV files.

The largest CSV file has 2.6 million rows. The import takes a few minutes and completes successfully.

But counting the stream entries afterwards returns only 2.1 million entries.
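
For illustration, a simplified sketch of the kind of import service I mean (the entity names "SensorFileRepository", "SensorStream", and "SensorEntry" are placeholders, not my actual entities, and the CSV layout is assumed to be timestamp,sensorId,value with a header row):

// Simplified sketch of a CSV-to-Stream import service (placeholder entity names).

// Read the CSV from a file repository (assuming LoadText is available on it)
var csvText = Things["SensorFileRepository"].LoadText({ path: "/import/sensors.csv" });
var lines = csvText.split("\n");

for (var i = 1; i < lines.length; i++) {       // skip the header row
    var cols = lines[i].split(",");
    if (cols.length < 3) {
        continue;                              // skip empty or malformed lines
    }

    // Build a one-row infotable matching the stream's Data Shape
    var values = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
        infoTableName: "SensorEntry",
        dataShapeName: "SensorEntry"
    });
    values.AddRow({
        sensorId: cols[1],
        value: parseFloat(cols[2])
    });

    // Each call queues an entry that the StreamProcessingSubsystem writes
    // to the database asynchronously.
    Things["SensorStream"].AddStreamEntry({
        values: values,
        timestamp: new Date(cols[0]),
        source: me.name
    });
}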

I see the same issue when deleting stream entries: calling the "PurgeStreamEntries" service on the stream to delete all 2.1 million entries returns a successful result, but it only deletes about a fifth of the entries.
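
The purge call itself is essentially just the following (again with a placeholder stream name and an arbitrary date range that covers everything):

// Placeholder sketch of the purge call; "SensorStream" is not the real entity name.
Things["SensorStream"].PurgeStreamEntries({
    startDate: new Date(0),   // earliest possible entry
    endDate: new Date(),      // now
    immediate: false          // not forcing an immediate purge
});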

Could it be a Tomcat memory issue?

My test system runs Tomcat with a maximum of 8 GB of memory.

1 ACCEPTED SOLUTION


Re: Loading / Deleting Millions of Stream Entries

I found the solution to this issue.

I increased the "Max Queue Size" and the "Max Wait Time Before Flushing Stream Buffer" settings in the StreamProcessingSubsystem configuration. As far as I can tell, the subsystem queues stream writes and purges and processes them asynchronously in the background; when that queue fills up during a bulk operation, the overflow entries are dropped, which explains the missing rows.


