Best Approach to Move Large Data from ThingWorx DataTable to SQL Server
Hello ThingWorx Community,
I am currently working with a ThingWorx DataTable that contains approximately 150,000 entries. Since ThingWorx DataTables are not well suited to datasets of this size, I have set up SQL Server as a more efficient storage back end.
Now, I need to migrate this large dataset from ThingWorx DataTable to SQL Server, but I am facing challenges in doing so efficiently. Here are the key concerns:
- Row-by-Row Insert: When I insert the data one row at a time, the service runs into execution-time issues.
- Bulk Insert Using a SQL File: I generated one large SQL INSERT script, but some entries end up missing or cause errors during execution.
- Performance & Best Practices: What is the most efficient, scalable way to migrate a dataset of this size to SQL Server?
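For context, the direction I have been considering is to read the DataTable in pages and push the rows to SQL Server in parameterized batches rather than one giant script or one row at a time. Below is a rough Python sketch of just the batching side, assuming the rows have already been read out of the DataTable (e.g. via `GetDataTableEntries` or the REST API, not shown here). The table name, column names, row keys, and the `migrate` helper are placeholders, not working code from my project:

```python
from itertools import islice

def chunked(rows, size):
    """Yield successive batches of `size` rows from any iterable."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def migrate(rows, cursor, batch_size=1000):
    """Insert rows (dicts with placeholder keys id/value/ts) into
    SQL Server in parameterized batches via a DB-API cursor."""
    # With pyodbc, fast_executemany sends each batch as a single
    # bulk round trip instead of one statement per row.
    cursor.fast_executemany = True
    # Placeholder table and column names -- adjust to the real schema.
    sql = "INSERT INTO dbo.MyTable (id, value, ts) VALUES (?, ?, ?)"
    for batch in chunked(rows, batch_size):
        cursor.executemany(sql, [(r["id"], r["value"], r["ts"]) for r in batch])
```

With ~150,000 entries and a batch size of 1,000 this would be 150 round trips instead of 150,000, and parameterized statements avoid the escaping problems I hit when generating a raw SQL file. Is this roughly the right pattern, or is there a better-supported way from inside ThingWorx?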
Has anyone successfully handled a similar data migration in ThingWorx? I would appreciate any insights, best practices, or recommendations for moving large datasets from ThingWorx DataTables to SQL Server efficiently.
Looking forward to your suggestions! 🚀
Thanks in advance! 😊

