What are the best practices for archiving InfluxDB data to Azure or elsewhere?
@slangley
Thanks,
Shashi.
Hi,
After a couple of architecture meetings with PTC, the flow we settled on is:
InfluxDB Cloud ---> Python ---> Azure Data Lake (Blob Storage).
A Python script runs monthly and stores the previous month's data in Blob Storage. Containers are created in Blob Storage per Thing.
Given our business use case, this was the best practice I came up with for archival; from there, Azure Cognitive Services can be used for data analytics and predictive maintenance, or the archived data can feed ChatGPT.
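For anyone looking to implement something similar, here is a minimal sketch of that monthly job. It assumes the `influxdb-client` and `azure-storage-blob` packages, a bucket named "thingworx", and a measurement per Thing — the URL, token, org, and connection string are placeholders you would replace with your own values:

```python
# Sketch of a monthly archival job: InfluxDB Cloud -> Azure Blob Storage.
# Bucket name, measurement layout, and all credentials below are assumptions.
from datetime import datetime, timezone


def previous_month_range(now=None):
    """Return (start, stop) datetimes covering the previous calendar month."""
    now = now or datetime.now(timezone.utc)
    stop = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
    if stop.month == 1:
        start = stop.replace(year=stop.year - 1, month=12)
    else:
        start = stop.replace(month=stop.month - 1)
    return start, stop


def container_name(thing_name):
    """Azure container names must be lowercase letters, digits, hyphens (3-63 chars)."""
    return thing_name.lower().replace("_", "-")[:63]


def archive_previous_month(thing_name):
    # Third-party imports kept local so the helpers above stay dependency-free.
    from influxdb_client import InfluxDBClient          # pip install influxdb-client
    from azure.storage.blob import BlobServiceClient    # pip install azure-storage-blob

    start, stop = previous_month_range()
    flux = (
        'from(bucket: "thingworx")'
        f' |> range(start: {start.isoformat()}, stop: {stop.isoformat()})'
        f' |> filter(fn: (r) => r._measurement == "{thing_name}")'
    )

    # Pull the month's rows as CSV from InfluxDB Cloud.
    with InfluxDBClient(url="https://<region>.cloud2.influxdata.com",
                        token="<token>", org="<org>") as influx:
        rows = influx.query_api().query_csv(flux)
        payload = "\n".join(",".join(row) for row in rows)

    # Upload into a per-Thing container, one blob per month.
    service = BlobServiceClient.from_connection_string("<azure-connection-string>")
    container = service.get_container_client(container_name(thing_name))
    if not container.exists():
        container.create_container()
    container.upload_blob(f"{start:%Y-%m}.csv", payload, overwrite=True)
```

The pure helpers (`previous_month_range`, `container_name`) carry the scheduling and naming logic, so they can be unit-tested without touching either cloud service; in production you would schedule `archive_previous_month` via cron, Azure Functions, or similar.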
Thanks,
Hi @psp316r.
You may need to check the Influx site for guidance around archiving data if you're hosting ThingWorx on-premise. If you're inquiring about a PTC Cloud instance, we would need to open a case for discussion with the Cloud team.
Regards.
--Sharon
Hi @psp316r
Were you able to find an answer to your question? If so, please feel free to share on the community.
Regards.
--Sharon