Hi Team,
I am looking to better understand the best practice for storing PLC tag data from a machine.
I am using Azure SQL as a persistence provider.
Current architecture:
With the above configuration, we are receiving around 3,500 tags per minute for one machine. When we scale to more machines and more tags, this number will grow significantly, and a Value Stream backed by a single table isn't able to handle it.
Please suggest!
Are you using Kepserver to obtain this data, or just Azure IoT Hub? If you're using Kepserver, it has a Data Logger add-on that allows you to push data directly from KepServer to the database / table of your choice.
If you're using Azure IoT Hub, you have several options on how you process the data. You can have up to 20 consumer groups on an IoT Hub that another application can subscribe to. Additionally, you can add extra endpoints to send data directly to an Azure Storage account.
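For example, a second application can read the same IoT Hub telemetry through its own consumer group without disturbing the one ThingWorx uses. Here's a minimal sketch, assuming the azure-eventhub Python SDK and the Event Hubs-compatible connection string from the IoT Hub's "Built-in endpoints" blade; the connection string, hub name, and consumer group name below are placeholders.

```python
# Sketch: read device-to-cloud messages from an IoT Hub's built-in
# Event Hubs-compatible endpoint using a dedicated consumer group.
from azure.eventhub import EventHubConsumerClient

EVENTHUB_COMPATIBLE_CONN_STR = "<Event Hubs-compatible connection string>"
EVENTHUB_COMPATIBLE_NAME = "<Event Hubs-compatible name>"
CONSUMER_GROUP = "secondary-ingest"  # a consumer group created on the IoT Hub for this app

def on_event(partition_context, event):
    # Each event is one device-to-cloud message; parse it and hand it to your sink.
    payload = event.body_as_json()
    print(partition_context.partition_id, payload)
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    EVENTHUB_COMPATIBLE_CONN_STR,
    consumer_group=CONSUMER_GROUP,
    eventhub_name=EVENTHUB_COMPATIBLE_NAME,
)

with client:
    # Reads from all partitions; "-1" starts at the beginning of the retained stream.
    client.receive(on_event=on_event, starting_position="-1")
```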
On the ThingWorx side, I typically stay away from Value Streams for IoT Hub messages because I don't have as much control over the data that gets inserted. For example, the read time on your IoT Hub messages will be different from the time they arrive at ThingWorx, yet your Value Stream will record the time the value arrived at your Thing.
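To illustrate the timestamp point: if you push values into ThingWorx yourself rather than letting ingestion stamp them, one hedged way to keep the original read time is the built-in UpdatePropertyValues service, which accepts NamedVTQ-style rows (name/value/time/quality). This is only a sketch; the host, appKey, Thing name, and property name are placeholders, and your server may require the InfoTable's dataShape to be supplied explicitly.

```python
# Sketch: push a tag value into ThingWorx with the device-side read time,
# so the logged timestamp reflects when the tag was read, not when it arrived.
import requests

TWX_URL = "https://<thingworx-host>/Thingworx/Things/<ThingName>/Services/UpdatePropertyValues"
HEADERS = {
    "appKey": "<application key>",
    "Content-Type": "application/json",
    "Accept": "application/json",
}

def push_tag(prop_name, value, read_time_ms):
    # read_time_ms: the PLC/IoT Hub read time in epoch milliseconds.
    # Note: some servers may also want a "dataShape" block alongside "rows".
    body = {
        "values": {
            "rows": [
                {"name": prop_name, "value": value, "time": read_time_ms, "quality": "GOOD"}
            ]
        }
    }
    resp = requests.post(TWX_URL, json=body, headers=HEADERS, timeout=10)
    resp.raise_for_status()

push_tag("Temperature", 72.4, 1700000000000)
```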
As far as best practices go, are you trying to create the whole solution within ThingWorx or are you willing to use other Azure services to accomplish your goal? ThingWorx by itself might be able to get you there, but some of the Azure solutions like ServiceBus, Functions, Stream Analytics, etc. can really help when it's time to scale up.
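At 3,500+ tags per minute, whatever service lands the data in Azure SQL (an Azure Function, Stream Analytics output, or a consumer app like the one above) should write in batches rather than row by row. A minimal sketch with pyodbc and fast_executemany; the connection string and the TagHistory table are assumptions, not part of your current schema.

```python
# Sketch: batch-insert tag readings into Azure SQL in one round trip.
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<server>.database.windows.net,1433;"
    "Database=<database>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

def write_batch(rows):
    # rows: list of (machine_id, tag_name, tag_value, read_time) tuples
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # send the whole batch at once instead of per-row inserts
        cursor.executemany(
            "INSERT INTO TagHistory (MachineId, TagName, TagValue, ReadTime) VALUES (?, ?, ?, ?)",
            rows,
        )
        conn.commit()

write_batch([("M1", "Temperature", 72.4, "2024-01-01 00:00:00")])
```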
Hope that helps
- Nick
Thank you @nmilleson for the response!
I would like to explore both options: start with ThingWorx and then, in parallel, use Azure services, since we will definitely be scaling up and a large volume of data may come in. We are already using an Azure data store for some other kinds of data. If you can guide me with more info, it would help us make a decision.
Regards,
Vipul
I would start with the Azure IoT reference architecture documentation. There are suggestions and examples straight from Azure on how to architect your solution. ThingWorx will be useful for things like visuals, device state, alerts, orchestrations, integrations to other devices, integrations to 3rd-party applications, etc. Additionally, if you're using Azure VMs to host ThingWorx, you can utilize the VM's Managed Identity to authenticate to all of your Azure services (see the sketch below). Let me know if you have other questions.
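As a concrete example of the Managed Identity point, here's a hedged sketch of connecting to Azure SQL from a VM without a stored password, using azure-identity plus pyodbc; the server and database names are placeholders, and SQL_COPT_SS_ACCESS_TOKEN is the ODBC driver's pre-connect attribute for passing an Azure AD access token.

```python
# Sketch: authenticate to Azure SQL with the VM's Managed Identity.
import struct
import pyodbc
from azure.identity import DefaultAzureCredential

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC Driver for SQL Server token attribute

# On an Azure VM, DefaultAzureCredential picks up the Managed Identity automatically.
credential = DefaultAzureCredential()
token = credential.get_token("https://database.windows.net/.default").token

# The driver expects the token as a length-prefixed UTF-16-LE byte string.
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<server>.database.windows.net,1433;Database=<database>;Encrypt=yes;",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
print(conn.execute("SELECT 1").fetchone())
```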
- Nick