MA8731174
16-Pearl
March 19, 2025
Solved

Best Approach to Move Large Data from ThingWorx DataTable to SQL Server

  • March 19, 2025
  • 2 replies
  • 1244 views

Hello ThingWorx Community,

 

I am currently working with a ThingWorx DataTable that contains approximately 150,000 entries. Since handling large datasets in ThingWorx DataTables is not optimal, I have implemented SQL Server as a more efficient storage solution.

 

Now, I need to migrate this large dataset from ThingWorx DataTable to SQL Server, but I am facing challenges in doing so efficiently. Here are the key concerns:

 

  1. Row-by-Row Insert: When I try to insert data row by row, I run into execution time issues.
  2. Bulk Insert Using SQL File: I attempted to create a large SQL insert file, but some data entries are missing or cause issues during execution.
  3. Performance & Best Practices: What is the most efficient and scalable way to migrate this large dataset to SQL Server? (A sketch of the approach that eventually worked is shown after the accepted answer below.)

Has anyone successfully handled a similar data migration in ThingWorx? I would appreciate any insights, best practices, or recommendations for moving large datasets from ThingWorx DataTables to SQL Server efficiently.

 

Looking forward to your suggestions! 🚀

Thanks in advance! 😊


2 replies

MA8731174 (Author)
16-Pearl
March 20, 2025

 

I have been working on this migration and wanted to share my progress so far.

When using a for loop to read from the ThingWorx DataTable and insert records into SQL Server, I encountered the following error:

 

 
Error executing service. Message ::Execution of Script terminated after : 41 seconds. Timeout configured for 30 seconds. - See Script Error Log for more details.
 

I have already increased the SQL query timeout in the Service Info settings to 7200 seconds, but the issue persists. Based on my observations, I suspect this might be related to the Apache server timeout settings rather than ThingWorx itself.

Has anyone faced a similar issue before? Could this be an Apache timeout limitation, and if so, is there a way to adjust it? Or should I break the data migration into smaller batches to prevent timeouts? Splitting it into smaller batches is not really practical, though, since the DataTable has 150K entries and that would be very time consuming...

 

Any insights or suggestions would be greatly appreciated!

 

Thanks in advance!

18-Opal
March 20, 2025

Hi @MA8731174 ,

 

I recently encountered this error after upgrading the Postgres DB schema to the latest version; in my case it was caused by database user permission issues.

 

DataTable data is stored in a table in the database itself. I suggest writing an SQL script or Python code to move the data from the table where the DataTable's data lives to a new table in the same database.
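
As a generic illustration of that idea (the table and column names here are hypothetical, and this assumes the source table exposes plain columns that map directly onto the target table):

INSERT INTO dbo.AssetData (AssetId, Temperature, ReadingTime)
SELECT AssetId, Temperature, ReadingTime
FROM dbo.SourceDataTable;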

 

Thanks,

wcui (Answer)
16-Pearl
March 21, 2025

Hi @MA8731174 

 

JavaScript services have a default execution timeout of 30 seconds. This is configured in platform-settings.json, and changing the value requires a Tomcat restart.

https://support.ptc.com/help/thingworx/platform/r9.7/en/index.html#page/ThingWorx/Help/Composer/System/Logs/ScriptErrorLog.html
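
For example, the relevant key is normally ScriptTimeout (in seconds) under BasicSettings in platform-settings.json; a minimal sketch, to be verified against your platform version:

{
    "PlatformSettingsConfig": {
        "BasicSettings": {
            "ScriptTimeout": 600
        }
    }
}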

 

However, you can set the service's "Async" option to allow long-running execution.

https://support.ptc.com/help/thingworx/platform/r9.7/en/index.html#page/ThingWorx/Help/Composer/Things/ThingServices.html

 

It might be a good idea to loop through the data of the large DataTable, write INSERT SQL statements to a file, and finally run those statements on the database side.
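
To illustrate that approach, here is a rough service sketch. The Thing name MyDataTable, the SystemRepository export path, the table dbo.AssetData, and the field names are placeholders to adapt; for 150K rows you would also want to run the service as Async or raise the script timeout.

// Rough sketch: export a DataTable to a .sql file of batched INSERT statements
var entries = Things["MyDataTable"].GetDataTableEntries({ maxItems: 200000 });

var batchSize = 1000; // SQL Server allows at most 1000 row value expressions per INSERT ... VALUES
var statements = [];
var values = [];

for (var i = 0; i < entries.rows.length; i++) {
    var row = entries.rows[i];
    // Escape single quotes so the generated SQL stays valid
    var assetId = String(row.assetId).replace(/'/g, "''");
    // dateFormat is the ThingWorx date helper; adjust the pattern to your column type
    var readingTime = dateFormat(row.timestamp, "yyyy-MM-dd HH:mm:ss");
    values.push("('" + assetId + "', " + row.temperature + ", '" + readingTime + "')");

    if (values.length === batchSize || i === entries.rows.length - 1) {
        statements.push("INSERT INTO dbo.AssetData (AssetId, Temperature, ReadingTime) VALUES\n" + values.join(",\n") + ";");
        values = [];
    }
}

// Write the generated script to a file repository; the GO separators work when the file is run with SSMS or sqlcmd
Things["SystemRepository"].CreateTextFile({
    path: "/export/datatable_inserts.sql",
    data: statements.join("\nGO\n"),
    overwrite: true
});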

 

MA8731174 (Author)
16-Pearl
March 21, 2025

Thank you @wcui  for your response! Your suggestion helped a lot.

 

Now, I’m facing another challenge: when generating an SQL file with more than 1000 rows, SQL Server returns an error stating that the number of row value expressions in the INSERT statement exceeds the maximum allowed limit of 1000. This limitation is preventing me from executing bulk inserts efficiently.

Would you have any recommendations on how to handle this? Should I split the INSERT statements into batches of 1000 rows, or would using BULK INSERT be a more efficient approach? And how can I split it on the SQL Server side? Any recommendations would be welcome.

 

I’ve also attached a screenshot of the error for reference. Looking forward to your insights!

 

MA8731174_0-1742553743518.png

 

wcui
16-Pearl
March 21, 2025

Hi @MA8731174 

Glad to know it helped.

I'm not very proficient with SQL insert scripts, but based on the error message, the following link may help you:

https://dba.stackexchange.com/questions/82921/the-number-of-row-value-expressions-in-the-insert-statement-exceeds-the-maximum
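
For reference, the two usual workarounds from that thread are to keep each INSERT ... VALUES statement at or below 1000 rows, or to export the data to a flat file and load it with BULK INSERT. The table, column names, and file path below are placeholders:

-- Option 1: chunk the generated INSERTs so no statement exceeds 1000 row value expressions
INSERT INTO dbo.AssetData (AssetId, Temperature, ReadingTime) VALUES
('A-0001', 21.5, '2025-03-01 10:00:00'),
('A-0002', 22.1, '2025-03-01 10:05:00');
-- ... repeat with the next batch of up to 1000 rows
GO

-- Option 2: export the DataTable to CSV and load it in a single operation
BULK INSERT dbo.AssetData
FROM 'C:\export\datatable_export.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);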