

Big datatable Iteration problem

AliBenBelgacem


Hi,

I have a DataTable containing 100,000 rows.

I need to iterate over this DataTable and perform an operation on each row.

Obviously I cannot do that in a single service call (it will certainly hit the timeout), and I cannot increase the timeout value.

 

The solution I am thinking of:

 

1. Take the rows slice by slice (50 rows at a time).

2. When the 50 iterations are done, trigger processing of the next 50 rows via a subscription on a property (sketched below).

3. Finish when the total row count of the DataTable is reached.
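Roughly what I have in mind, as a sketch only: it assumes a DataTable Thing named "MyDataTable" and a persisted NUMBER property "lastProcessedIndex" on the Thing running the service (both names are placeholders), with a DataChange subscription on that property calling the service again for the next slice.

    var CHUNK_SIZE = 50;

    // Total number of rows in the DataTable.
    var total = Things["MyDataTable"].GetDataTableEntryCount();

    // Fetch enough rows to cover the next slice. GetDataTableEntries does not
    // guarantee a stable ordering, so for strict paging it may be safer to use
    // QueryDataTableEntries with a sort on a key field.
    var entries = Things["MyDataTable"].GetDataTableEntries({
        maxItems: me.lastProcessedIndex + CHUNK_SIZE
    });

    var start = me.lastProcessedIndex;
    var end = Math.min(start + CHUNK_SIZE, entries.rows.length);

    for (var i = start; i < end; i++) {
        var row = entries.rows[i];
        // ... per-row operation goes here ...
    }

    if (end < total) {
        // Writing the property fires the DataChange subscription,
        // which calls this service again for the next 50 rows.
        me.lastProcessedIndex = end;
    }
    // Otherwise all rows are processed; lastProcessedIndex would need to be
    // reset (or a separate "done" flag used) before starting a new run.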

 

What do you think about this solution?

Is there a better alternative?

Also, how do I get the 50 rows I need to process? There is no auto-incrementing id, so I don't see how to build the query that returns exactly the next 50 rows.

 

Thank you for your help

3 Replies

Hi @AliBenBelgacem, can you take a look at the reference below and let us know how it goes?

 

https://community.ptc.com/t5/ThingWorx-Developers/Iterate-Datatable/m-p/646167#M42203

Hi @AliBenBelgacem,

 

I suggest the following (if you have the CSV Parser extension):

  1. Get the data into a CSV file: bind GetDataTableEntries to the Export widget.
  2. Once the CSV is downloaded, read the data back with the CSV Parser.
  3. Iterate over it, manipulating the data on each iteration.
  4. Write the data back, either batch-wise or all at once, with AddOrUpdateDataTableEntries or UpdateDataTableEntries (a sketch of the batch write-back is below).

This will at least reduce the time spent in the initial GetDataTableEntries call.
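Something like this, as a rough illustration of the batch write-back in step 4 (the DataTable name "MyDataTable" and DataShape name "MyDataShape" are placeholders, and on a table this size the loop would still need to be split into chunks as you describe):

    // Build an InfoTable with the modified rows and write it back in one call.
    var updates = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
        infoTableName: "updates",
        dataShapeName: "MyDataShape"
    });

    var entries = Things["MyDataTable"].GetDataTableEntries({ maxItems: 100000 });

    for (var i = 0; i < entries.rows.length; i++) {
        var row = entries.rows[i];
        // ... manipulate the row's fields here ...
        updates.AddRow(row);
    }

    // One bulk write instead of 100,000 single-row updates.
    Things["MyDataTable"].AddOrUpdateDataTableEntries({ values: updates });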

 

Just a thought off the top of my head.

 

Thanks,

Shashi Preetham,
Email: psp316r@outlook.com,
Mobile: +91 8099838001.

Hi @AliBenBelgacem 

 

If you found either of the responses helpful, please mark the appropriate one as the Accepted Solution for the benefit of others in the community.

 

Regards.

 

--Sharon
