
How to create custom DataTable in Thingworx

MG_2602
9-Granite


Hi, 

We have a requirement to store large amounts of data in a DataTable dynamically. We are facing an indexing issue while deleting the data. Does anybody know how to configure a custom DataTable in ThingWorx?

4 REPLIES

Hi @MG_2602,

First, I saw you mentioned "large data" in a DataTable. You should not use a DataTable for more than 100,000 rows, as per this Help Center section.

Please note that even if you store fewer rows, you might still run into issues depending on how you access the data (frequency, data size, etc.).

You mentioned "we are facing an indexing issue while deleting the data". What exactly is the issue?

I also do not fully understand your question about "how to configure a custom DataTable in ThingWorx", because all DataTables are defined using a custom DataShape (hence, all of them are custom). Have you perhaps created your own Java-based DataTable implementation?
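To illustrate the point about DataTables always being "custom": a DataTable is just a Thing based on the DataTable Thing Template, whose columns come from the DataShape you assign to it, and you write rows to it by calling its built-in services. The sketch below stubs `Things[...]` with a minimal in-memory stand-in so the call pattern can run outside the platform; the Thing name and fields are hypothetical.

```javascript
// Illustrative sketch only. In a real ThingWorx service, Things["MyDataTable"]
// is provided by the platform; here we stub just enough of it so the
// AddDataTableEntry call pattern is runnable standalone.
const Things = {
  "MyDataTable": {
    rows: new Map(),                // keyed by the DataShape's primary key ("id")
    AddDataTableEntry({ values }) { // mimics the built-in AddDataTableEntry service
      this.rows.set(values.id, values);
    },
    GetDataTableEntryCount() {      // mimics GetDataTableEntryCount
      return this.rows.size;
    }
  }
};

// Equivalent of calling AddDataTableEntry with an infotable row that
// matches the custom DataShape (id, timestamp, fileName are assumptions).
Things["MyDataTable"].AddDataTableEntry({
  values: { id: 1, timestamp: Date.now(), fileName: "File1.csv" }
});

console.log(Things["MyDataTable"].GetDataTableEntryCount()); // 1
```

The real service names and parameter shapes should be checked against the service list on your own DataTable Thing.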

 

Hi @VladimirRosu ,

 

Can we use generic DataTable services like AddConfigurationTableDefinition and GetConfigurationTable?

Can you please help me understand what these configuration tables are? How can we use them, and are there any limitations?

 

Thanks,

Meghna

The information you asked for is available in the Help Center, here.

In the future, I suggest you bookmark the ThingWorx Help Center website and try the search function there first. If you do not find the information in the Help Center, we would be happy to help here in the community.
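For context, a standalone sketch of the pattern (not the real platform API; service and parameter names here are assumptions to be verified against the Help Center): a configuration table is a DataShape-backed table of setup values attached to a Thing, created via AddConfigurationTableDefinition and read back via GetConfigurationTable. The stub below only mimics that shape so the flow is runnable.

```javascript
// Illustrative stub of the configuration-table pattern. In ThingWorx these
// services live on a Thing; "me" is a minimal stand-in so the calls run
// standalone. All names, parameters, and the SetConfigurationTableRows
// helper are hypothetical.
const me = {
  configTables: {},
  AddConfigurationTableDefinition({ name, dataShapeName, isMultiRow }) {
    this.configTables[name] = { dataShapeName, isMultiRow, rows: [] };
  },
  SetConfigurationTableRows(name, rows) {   // hypothetical helper for the stub
    this.configTables[name].rows = rows;
  },
  GetConfigurationTable({ tableName }) {
    return this.configTables[tableName];
  }
};

me.AddConfigurationTableDefinition({
  name: "ConnectionSettings",       // hypothetical table name
  dataShapeName: "ConnSettingsDS",  // hypothetical DataShape
  isMultiRow: false                 // single-row table of settings
});
me.SetConfigurationTableRows("ConnectionSettings", [{ host: "example.local", port: 8443 }]);

const tbl = me.GetConfigurationTable({ tableName: "ConnectionSettings" });
console.log(tbl.rows[0].port); // 8443
```

One important distinction: configuration tables hold a Thing's setup values, not bulk runtime data, so they are not a substitute for DataTable row services when storing CSV contents.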

I noticed you did not answer my questions above - is that problem solved?

Hi @VladimirRosu,

 

We are storing CSV files (File1.csv and File2.csv) from a mashup into a DataTable, with combined data of nearly 100,000 rows, using ID (incremented by 1) and Timestamp as the primary key. When we upload new data for File1.csv, we delete the previous data and then upload the new data, which results in an indexing issue in the DB.
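One pattern worth considering for this "replace one file's data" flow (a sketch under stated assumptions, not a confirmed fix for the indexing issue): tag each row with its source file, remove only that file's rows, and bulk-insert the new rows with a single AddDataTableEntries call rather than one delete/add per row. The stub below mimics the DataTable Thing in memory so the logic is runnable outside ThingWorx; the Thing name, the `sourceFile` field, and the `DeleteRowsForFile` helper are hypothetical (in the platform you would use a query-based delete service from your DataTable Thing's service list).

```javascript
// Standalone sketch of replacing one CSV file's rows in a DataTable.
// Things[...] is stubbed so the flow runs outside the platform.
const Things = {
  "CsvDataTable": {
    rows: [],
    AddDataTableEntries({ values }) {   // bulk insert: one call, not N calls
      this.rows.push(...values);
    },
    DeleteRowsForFile(fileName) {       // hypothetical stand-in for a query-based delete
      this.rows = this.rows.filter(r => r.sourceFile !== fileName);
    },
    GetDataTableEntryCount() {
      return this.rows.length;
    }
  }
};

const table = Things["CsvDataTable"];

// Initial load: rows from both files, each tagged with its source file.
table.AddDataTableEntries({ values: [
  { id: 1, sourceFile: "File1.csv", value: 10 },
  { id: 2, sourceFile: "File1.csv", value: 20 },
  { id: 3, sourceFile: "File2.csv", value: 30 }
]});

// Re-upload of File1.csv: drop only that file's rows, then bulk-insert the
// new set, leaving File2.csv's rows untouched.
table.DeleteRowsForFile("File1.csv");
table.AddDataTableEntries({ values: [
  { id: 4, sourceFile: "File1.csv", value: 11 },
  { id: 5, sourceFile: "File1.csv", value: 21 }
]});

console.log(table.GetDataTableEntryCount()); // 3
```

Keeping deletes scoped to one file per upload, and batching inserts, reduces the number of individual index updates compared with deleting and re-adding the whole table row by row.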

 

Thanks,

Meghna
