I'm facing an issue with Stream logging.
We are storing asset property data to the stream at a 1-minute interval by running a subscription on a 1-minute timer event. While debugging, we found that the subscription runs properly every minute; we also wrote the same data to a temporary data table and confirmed it is logged there minute by minute.
However, certain entries are missing when we check the stream.
I'm sharing images of the Stream log and of the stream processing and event processing subsystem status.
Thanks in advance.
Hey,
I just have some ideas:
- The result may not be sorted by timestamp, so the value may exist but at a different position?
- You may want to specify the "maxItems" parameter with a high value such as "99999"; if it is not defined, it defaults to 500 (see the sketch after this list).
- The code showing how you insert the entry into the stream may be interesting. At least one thing we almost missed once was the primary key (https://support.ptc.com/help/thingworx/platform/r9/en/index.html#page/ThingWorx/Help/Composer/DataStorage/Streams.html):
* The timestamp is a key field for streams. If you add an entry with the same timestamp, it will overwrite the previous entry (upsert). To avoid overwriting, specify milliseconds for the timestamp.
* In PostgreSQL and H2, stream entries are keyed by the unique combination of timestamp and source.
But with a 1-minute interval you are unlikely to run into duplicate timestamps.
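To make the two points above concrete, here is a rough sketch of the insert and read-back sides in a ThingWorx service script. The stream name "AssetPropertyStream" and the fields "assetId"/"temperature" are just placeholders for whatever your stream's data shape actually uses:

    // Build a values infotable matching the stream's data shape (placeholder fields).
    var values = Things["AssetPropertyStream"].CreateValues();
    values.AddRow({
        assetId: "Asset-01",
        temperature: 72.5
    });

    // Write the entry with an explicit timestamp that keeps millisecond precision,
    // so two writes from the same source never collide on the timestamp/source key.
    Things["AssetPropertyStream"].AddStreamEntry({
        values: values,
        source: me.name,
        timestamp: new Date()
    });

    // Read back with maxItems set explicitly (the default cap is 500) and sort
    // oldest-first so a gap in the minute-by-minute entries is easy to spot.
    var result = Things["AssetPropertyStream"].QueryStreamEntriesWithData({
        maxItems: 99999,
        oldestFirst: true
    });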
Hi @UN_10218888.
Value streams are more suited to time series data. With value streams, as property updates arrive, any property you mark as a logged property will get logged to the value stream automatically. This eliminates the need for timers and is therefore more efficient.
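If you do switch, a quick way to check what is being captured is to query the logged history directly on the asset Thing; the Thing name "Asset-01" below is just a placeholder, and the history comes back for whichever properties you have marked as logged:

    // Pull the logged property history for this Thing from its value stream.
    var history = Things["Asset-01"].QueryPropertyHistory({
        maxItems: 99999,
        startDate: new Date(Date.now() - 24 * 60 * 60 * 1000),  // last 24 hours
        oldestFirst: true
    });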
Have you given any thought to using a value stream? Here are a couple of links to our Help Center with more information:
Regards.
--Sharon