Hello,
I need your help with a problem in my project (ThingWorx 9.1).
1- I have an MQTT subscriber (SubscriberThing, plus a value stream to store data).
2- Received messages are parsed into an infotable (ParserThing, plus a value stream to store data). I have a subscription on ParserThing that is triggered when SubscriberThing receives a new MQTT message.
I log all properties of SubscriberThing and ParserThing.
When I run QueryPropertyHistory, I get a logical result on SubscriberThing (the timestamp changes every second on average).
However, on ParserThing the data looks as if it is held in temporary memory and, after some minutes, written all at once to ParserThing's value stream.
Logically, I should get a new row each time the ParserThing subscription is triggered, but I don't.
The attached graphic shows the evolution of the data on SubscriberThing. You can see that, at each period, all the data held in temporary memory is stored at once.
I find no explanation for this behavior!
Any help, please?
Thank you.
Are you explicitly providing VTQ based on the received data when writing to the value stream or are you applying 'many updates at once' to the property value?
The behavior looks like you received a batch of values and wrote them consecutively to the property value, versus doing a direct value stream write with VTQ.
The property is updated when a new MQTT message is received. The Logged checkbox is checked to trace the changes.
Example:
1- An MQTT message is published on the network.
2- ThingSubscribe receives the message on the property MqttMsg.
When I run QueryPropertyHistory, I see the row in the value stream linked to ThingSubscribe.
3- When the property MqttMsg changes, a subscription on ThingParser is triggered. The subscription contains a service that converts the JSON (MqttMsg) to an infotable stored in the property InfotableMsg (a simplified sketch of this service is shown below).
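To illustrate, this is roughly what the subscription service does (the data shape MqttPayloadShape and its fields are placeholders, not my real names):

    // Subscription on ThingParser for the DataChange event of ThingSubscribe.MqttMsg.
    // "MqttPayloadShape" and its fields are placeholders for the real data shape.
    var payload = JSON.parse(eventData.newValue.value); // the newly received MQTT message (JSON string)

    var table = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
        infoTableName: "InfoTable",
        dataShapeName: "MqttPayloadShape"
    });

    table.AddRow({
        sensor: payload.sensor,
        value: payload.value
    });

    // Writing the logged INFOTABLE property; this is the step that should produce a value stream row
    me.InfotableMsg = table;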
So logically, if I receive an MQTT message every second, I should get a new row every second in both the ThingSubscriber value stream and the ThingParser value stream.
In reality, I do get a row every second on ThingSubscriber (which is correct). On ThingParser, however, it starts well, but after some time, instead of one row per property update, I get several rows stored with the same timestamp at each period. It's as if there were a temporary memory that then bulk-copies all the rows with the same timestamp.
Maybe the service I developed gets blocked by some issue and the thread freezes for a while; when it is released, it processes all the pending events at once, so it inserts the same timestamps. But I don't know how to check that. Is there a way to log the duration of a service, or is there another problem?
Well, the easiest would be just to do logger.debug(new Date()) in that service and see if you are getting duplicates in the log. Also, keep in mind that the writes to streams and value streams are buffered with the default flushing delay of 10s. See ThingworxPersistenceProvider > Configuration > Stream Settings.
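For example, a quick sketch wrapping your existing conversion logic (only indicated by a comment here):

    // At the very start of the ThingParser subscription code
    var start = new Date();
    logger.debug("Parser subscription started: " + start);

    // ... existing JSON-to-infotable conversion and property write ...

    // At the very end, log how long this execution actually took
    var end = new Date();
    logger.debug("Parser subscription finished, duration (ms): " + (end.getTime() - start.getTime()));

If the logged durations jump, or the start timestamps arrive in bursts, the service is being blocked somewhere.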
I think the issue you're having is related to your use of an InfoTable property. Changes to an InfoTable property do not trigger a data change event; however, by default, logged properties are logged on change. Check the property settings at the bottom, under Advanced Settings, and make sure Data Change Type is set to ALWAYS.
This answer probably makes the most sense; I would still recommend using a VTQ write for the best possible fidelity.
Problem solved.
Explanation: I have other services in the project that execute some REST API requests.
The IP address of the REST API is unreachable, so the service retries again and again until it times out. In the meantime, the other services are paused. When the service executing the REST API request is finally released, all the other standby services execute and insert all the received data at the same time. That's why I see vertical lines in the time-series graphics.
Thank you for your help.
Thanks for sharing the solution to your problem and closing the thread!
I'm not sure I understand your explanation of the vertical lines 100%, but it makes me wonder whether you are inserting the data with the correct timestamps. You should always use the data's VTQ (value, timestamp, quality), as it is applied early in the collection process and lives throughout the rest of the data processing pipeline. Using UpdatePropertiesVTQ() allows you to do batched updates that include the source timestamp, which is then used when the data is logged, so you shouldn't see data anomalies like spikes or holes once things resume normal operation.
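Roughly like the sketch below; the NamedVTQ data shape and the "values" parameter name are my assumptions, so check the actual service definition on your Thing, and the property name and the parsedTable variable are just placeholders for what your parser service produces:

    // Build a batch of updates that carries the source timestamps (name / value / time / quality)
    var updates = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
        infoTableName: "InfoTable",
        dataShapeName: "NamedVTQ"           // assumed data shape for name/value/time/quality rows
    });

    updates.AddRow({
        name: "InfotableMsg",               // placeholder: the logged property to update
        value: parsedTable,                 // placeholder: the infotable built by your parser service
        time: eventData.newValue.time,      // source timestamp taken from the triggering MQTT event
        quality: "GOOD"
    });

    // Batched update mentioned above; the provided time is used when the values are logged
    me.UpdatePropertiesVTQ({ values: updates });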
By the way, ContentLoader services allow you to specify a REST timeout. To avoid thread starvation in the case you described, you can set it to a small value, for example 2 seconds.
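For example, something like this (the URL is a placeholder, and in the versions I have used the timeout parameter of GetJSON is expressed in seconds, so please verify in your release):

    // REST call with an explicit timeout so an unreachable endpoint cannot block the thread for long
    var result = Resources["ContentLoaderFunctions"].GetJSON({
        url: "http://192.168.1.50/api/status",  // placeholder endpoint
        timeout: 2                              // fail fast instead of holding up other services
    });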