Hi Team,
In many places in our code we use services like QueryDataTableEntries and GetDataTableEntries. We expected that calling them with maxItems undefined would fetch all the records, but they return only 500 records (possibly the first 500, based on the primary key). So if we have logic like the following,
var params = {
    maxItems: undefined /* NUMBER */
};
// result: INFOTABLE
var RKTableEntries = Things["RkDataTable"].GetDataTableEntries(params);
var query = {
    "filters": {
        "fieldName": "isDeleted",
        "type": "EQ",
        "value": false
    }
};
var params = {
    t: resultOfCombinedAsset /* INFOTABLE */,
    query: query /* QUERY */
};
// result: INFOTABLE
var resultofSubQuery = Resources["InfoTableFunctions"].Query(params);
then resultofSubQuery operates on only those 500 records instead of the entire table. Please let me know if there is any number or keyword, similar to undefined, that we can use to avoid this problem.
Thanks in advance.
Hi,
There's a hardcoded limit of 500 behind the scenes on any query. If you want to get more rows, or the entire table/stream, the only option you have is to set maxItems to a big value (e.g. 1000000000). Nasty, but it's the only way.
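To make the effect concrete, here is a minimal plain-Node sketch (not ThingWorx code; getDataTableEntries and DEFAULT_MAX_ITEMS are hypothetical stand-ins for the platform's GetDataTableEntries and its built-in default) showing how an undefined maxItems silently caps the result at 500, while an explicit large value returns everything:

```javascript
// Hypothetical mock of ThingWorx's behavior: when maxItems is undefined,
// the platform falls back to a hardcoded default of 500 rows.
const DEFAULT_MAX_ITEMS = 500;

function getDataTableEntries(table, params) {
    // Undefined maxItems -> the default cap kicks in.
    const max = params.maxItems === undefined ? DEFAULT_MAX_ITEMS : params.maxItems;
    return table.slice(0, max);
}

// A fake data table with 1200 rows.
const table = Array.from({ length: 1200 }, (_, i) => ({ id: i }));

console.log(getDataTableEntries(table, { maxItems: undefined }).length);  // 500
console.log(getDataTableEntries(table, { maxItems: 1000000000 }).length); // 1200
```

The same pattern applies in the service snippet above: replace `maxItems: undefined` with an explicitly large number and the subsequent InfoTableFunctions.Query will see the whole table.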
Best Regards,
I have a similar issue where my output infotable is only returning 500 records. Let me know where and how I can set a maxItems limit for "return InfoTable".
var myRS = Things["Solar.Windchill.Services.PSECycleTime.FM.HRTest"].get_WC_ECR_Weekly_Input_Exit_Data();
Regards,
Hemant
Please refer to my comment on the thread Service output InfoTable only returning maxItems 500 records.
If your data tables are that large (over 10,000 rows), or you expect them to become that large, you should not use data tables; performance will be terrible. Hopefully you're using the Postgres version. If so, simply create a new database and build out a relational schema. Then create a new Database Thing and add SQL queries to get your data. You can look at your default persistence provider parameters to set up your Thing.
Your app and users will thank you.
My understanding is that the ceiling was closer to 100,000 (especially in the Postgres version). We're accessing 60K records with good success. Are you seeing issues with fewer than that?
Hi Christopher,
If you fine-tune with indexes I think you can go beyond the 100k limit; it also depends on the hardware configuration.
Carles.
Per ThingWorx documentation, performance degrades significantly for data tables with more than 150K rows. The documentation also recommends using an external database for large datasets.
Hi Wayne,
But under the hood, if you do a query using indexes (via the values parameter, or for instance the source parameter), you won't load the whole DataTable into memory, just the entries that match those values. So even if you have a one-million-row data table, as long as your query by index only returns a few rows (or a few thousand), it will still perform well.
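A rough plain-JavaScript sketch of that idea (the field names and the Map-based index are illustrative assumptions, not the ThingWorx API): once an index exists on the queried field, a lookup touches only the matching entries, so the total size of the table barely matters:

```javascript
// A fake 100,000-row table where "site" is the indexed field
// (100 distinct sites, so ~1,000 rows per site).
const rows = Array.from({ length: 100000 }, (_, i) => ({
    id: i,
    site: 'site' + (i % 100)
}));

// Build the index once, keyed on "site". ThingWorx maintains DataTable
// indexes for you; this Map just models that structure.
const siteIndex = new Map();
for (const row of rows) {
    if (!siteIndex.has(row.site)) siteIndex.set(row.site, []);
    siteIndex.get(row.site).push(row);
}

// An indexed query: returns only the matching ~1,000 rows,
// without scanning or loading the other 99,000.
const matches = siteIndex.get('site42');
console.log(matches.length); // 1000
```

The point is that the cost of the lookup scales with the result size, not the table size, which is why a well-indexed large DataTable can still answer narrow queries quickly.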
Using an external database should not be the only option, nor the recommended way of working with a platform like ThingWorx; why do we have DataTables, Streams and ValueStreams then? They are working on improving this, and we must push them to do it. Otherwise, just select another IoT platform.
Carles.
Hi Carles,
I'm actually involved in very specific conversations with the TWX product management team regarding this very issue, since it directly impacts a project I'm currently working on. The main takeaway is that TWX product management is aware of the issue with data tables, but the messaging is: for complex relational datasets that are expected to be very large, use an external database; for timeseries data, use native TWX data structures.
Hi Wayne,
Yes, but this is one project for which you have the exact requirements; a 100k limit is super vague and relative. And what's the definition of a "complex relational dataset"? If it's ERP-like, sure, but then maybe you have to build a whole separate system just to manage that data, not use TW for it at all, and just build a connection between both systems...
Carles.
Hello Chris,
I've seen pretty big systems with Data Tables as the main storage option, and they were working fine, but it takes a lot of resources from the system (especially memory) and, as Carles said, needs good indexing. With good hardware and configuration, even a ~200k-row DT can work fine.
But please keep in mind that it's not the officially preferred way.