[1,018] Data store unknown error: [Error occurred while accessing the model provider. ?

Re: [1,018] Data store unknown error: [Error occurred while accessing the model provider. ?

@Constantine ,

 

My case has lots of CRUD operations on time-independent data. Also, Streams don't have an index on "field_values", so a Stream isn't suitable. SQL is always the last choice, I think; I don't want the main business logic implemented in SQL, as it would cause more maintenance effort in the future.

 

The good news about using a Data Table in my case is that the data won't change once it has been confirmed/submitted a few weeks later, so I have the chance to archive the data to another schema/database. Meanwhile, I can compress the rows by transforming the data into a complex JSON object, so that the data volume in the DT won't grow too large over the years.
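
Just to make the idea concrete, here's a rough sketch of what such an archival service could look like in ThingWorx JavaScript. All Thing, field, and key names below (OrdersDataTable, OrdersArchiveTable, entryKey, compressedData, orderId, status) are made up for illustration; QueryDataTableEntries, CreateValues, and AddOrUpdateDataTableEntry are the standard out-of-the-box Data Table services.

    // Sketch only -- Thing and field names are illustrative.
    // Find entries that were confirmed more than a few weeks ago.
    var query = {
        filters: {
            type: "AND",
            filters: [
                { type: "EQ", fieldName: "status", value: "CONFIRMED" },
                { type: "LT", fieldName: "confirmedDate", value: dateAddDays(new Date(), -21) }
            ]
        }
    };

    var confirmed = Things["OrdersDataTable"].QueryDataTableEntries({
        maxItems: 500,
        query: query
    });

    // Group rows by a business key and collapse each group into one JSON
    // object, so many Data Table rows become a single archive row.
    var groups = {};
    for (var i = 0; i < confirmed.rows.length; i++) {
        var row = confirmed.rows[i];
        if (!groups[row.orderId]) {
            groups[row.orderId] = [];
        }
        groups[row.orderId].push({ field1: row.field1, field2: row.field2 });
    }

    // Write one compacted entry per group into the archive table.
    for (var key in groups) {
        var values = Things["OrdersArchiveTable"].CreateValues();
        values.entryKey = key;                // STRING, used as the entry key
        values.compressedData = groups[key];  // JSON base type field
        Things["OrdersArchiveTable"].AddOrUpdateDataTableEntry({
            values: values,
            sourceType: "Thing",
            source: me.name
        });
    }

Once the archive write succeeds, the original rows could be removed from the source table with DeleteDataTableEntry.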

Re: [1,018] Data store unknown error: [Error occurred while accessing the model provider. ?

@seanccc 

 

We had a somewhat similar use case (stored workflow instances in DTs, each ~20K when serialized to JSON) and also archived old instances. This setup started having performance issues at around ~20K rows, and while doing optimizations it became obvious that it wouldn't scale for even one more year, so we ended up rewriting it in SQL, reusing the same DB instance and putting all the DDL into ThingWorx services. It's actually not that bad in terms of maintenance until you attempt any refactoring. Luckily, we were exporting/importing the whole thing on each major release, which provided a simple and convenient way to "migrate" data between changing SQL table definitions.
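
Roughly, the idea looks like this (an illustrative sketch, not our actual code; all entity names are made up):

    // Assumes a Thing "WorkflowDatabase" based on the Database template, with
    // a user-defined SQL Command service "CreateWorkflowInstanceTable" whose
    // body is plain DDL, e.g.:
    //
    //   CREATE TABLE IF NOT EXISTS workflow_instance (
    //       instance_key VARCHAR(64) PRIMARY KEY,
    //       state        VARCHAR(32) NOT NULL,
    //       payload      TEXT      -- the ~20K serialized instance
    //   );
    //
    // A bootstrap service can then (re)create the schema on each major
    // release, right before the exported data is imported back in:
    Things["WorkflowDatabase"].CreateWorkflowInstanceTable();

That way the table definitions live in ThingWorx entities and travel with the rest of the export.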

 

/ Constantine

Re: [1,018] Data store unknown error: [Error occurred while accessing the model provider. ?

@Constantine ,

 

Thank you very much for sharing such precious experience. 20K rows wouldn't scale for my case either, even if the rows are compressed into JSON. I'll test my case as well.

 

The lucky thing is that the query conditions are relatively simple for the archived data, so maybe I can avoid querying on field_values and always query on the entity_key field instead (service: GetDataTableEntryByKey). I hope that will be fast even for 20K records.
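
For reference, the lookup itself would be a one-liner; here's a hedged sketch reusing the illustrative archive table from above (GetDataTableEntryByKey is the out-of-the-box service, everything else is made up):

    // Sketch: key-based lookup, avoiding any filter on field_values.
    // "OrdersArchiveTable" and the key value are illustrative.
    var result = Things["OrdersArchiveTable"].GetDataTableEntryByKey({
        key: "ORDER-2021-000123" // the entity key used when archiving
    });

    if (result.rows.length > 0) {
        var archived = result.rows[0].compressedData; // the compacted JSON object
        logger.info("Found archived entry: " + result.rows[0].entryKey);
    }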