Best Practice for Large Data Migration – Script Timeout Issues
Hi ThingWorx Community,
I'm working on a large data migration project in ThingWorx and would love your advice or best practices!
Context:
I have around 100 DataTables, each with a significant number of rows.
For each DataTable, I read all entries, transform the data (sometimes with nested loops and extra logic), and then either:
Build a large InfoTable result, or
Update the transformed entries back into a DataTable.
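To make this concrete, here is a simplified, runnable sketch of the kind of per-table transform I mean. It is plain JavaScript with an array standing in for DataTable rows; in the real service the rows would come from a ThingWorx call such as GetDataTableEntries, and all names below are illustrative, not my actual code:

```javascript
// Stand-in for one DataTable's rows (in ThingWorx these would come
// from something like Things["MyDataTable"].GetDataTableEntries(...)).
var sourceRows = [
  { id: 1, value: 10, tags: ["a", "b"] },
  { id: 2, value: 20, tags: ["b"] }
];

// Transform every row in one pass; the nested loop over tags is
// where the extra per-row logic (and most of the time) goes.
function transformAll(rows) {
  var result = [];
  for (var i = 0; i < rows.length; i++) {
    var row = rows[i];
    var out = { id: row.id, value: row.value * 2, tagCount: 0 };
    for (var j = 0; j < row.tags.length; j++) {
      out.tagCount++; // placeholder for real nested logic per tag
    }
    result.push(out); // builds one large in-memory result
  }
  return result;
}

var migrated = transformAll(sourceRows);
```

The whole table is read and transformed inside a single service call, which is where the timeout bites once row counts grow.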
This logic works fine for small datasets, but with real data volumes, Iโm consistently hitting this error:
Execution of Script terminated after: 30 seconds. Timeout configured for 30 seconds.
I understand the ThingWorx ScriptTimeout is there for a reason, but:
I've seen loops that just insert thousands of rows run much longer without hitting this.
Now that I'm reading + transforming + updating, I'm hitting the limit quickly.
Questions:
How do others handle large migrations like this without hitting the script timeout?
Is there a proven batching or chunking pattern that works well inside ThingWorx services?
Should I switch to a Timer/Scheduler Thing approach? Any examples or pitfalls?
Any other general tips for making heavy transformations scalable and safe?
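For question 2, the chunking pattern I had in mind looks roughly like the sketch below, again as plain runnable JavaScript. My assumption is that in ThingWorx the offset would live in a persisted Thing property and each chunk would run in its own short service invocation (e.g. fired by a Timer), but I'd like confirmation that this is the proven shape:

```javascript
// Persisted migration state; in ThingWorx this would be a Thing
// property so the offset survives between service invocations.
var state = { offset: 0 };
var CHUNK_SIZE = 2;

var allRows = [
  { id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }
];

// Process one chunk per call and advance the offset.
// Returns true while there is more work left to do.
function processNextChunk(rows, state) {
  var chunk = rows.slice(state.offset, state.offset + CHUNK_SIZE);
  for (var i = 0; i < chunk.length; i++) {
    chunk[i].migrated = true; // placeholder for the real transform/update
  }
  state.offset += chunk.length;
  return state.offset < rows.length;
}

// A Timer/Scheduler would call this repeatedly; here a loop stands
// in for the repeated short-lived service runs.
var pending = true;
var runs = 0;
while (pending) {
  pending = processNextChunk(allRows, state);
  runs++;
}
```

Each invocation stays well under the timeout because it only touches CHUNK_SIZE rows; the open question for me is how to pick the chunk size and handle a chunk that fails halfway.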
I'd really appreciate any real-world lessons or suggestions.
Thanks in advance for your help!
Best regards,

