I'm desperately trying to understand best practices for creating the platform described below. I can't find an answer on whether to use a remote or internal DB, a Thing-centered data structure or traditional tables, and whether to develop the front end as mashups or on an external server calling the ThingWorx API.
What we're trying to do is:
And one last question I can't wrap my head around: if ThingWorx is a Thing-centered system, wouldn't working with a traditional table-based relational DB 'degrade' its performance? Are we even utilizing any of ThingWorx's advantages in the field of data management here, or just using it as a portal for accessing the external DB?
Any help would be appreciated, thanks.
Your IoT use is fairly standard. The TWX thing model and associated value streams are all you need. You can build reports by issuing queries against your value streams. As for your front end, go with something like Angular and Node/Hapi. Hapi will be your controller layer and implement the REST calls back to your TWX services. Angular will give you the front-end flexibility you require.
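To make that concrete, the controller layer invokes TWX services over plain REST (POST to `/Thingworx/Things/{thing}/Services/{service}` with an application key). Here's a minimal Node sketch; the host, Thing name, and app key are placeholders, and `QueryPropertyHistory` is the built-in service on Things backed by a value stream:

```javascript
// Build the HTTP request for invoking a ThingWorx service over REST.
// Written as a pure function so it can be unit-tested without a live TWX server.
function buildTwxServiceRequest(baseUrl, thingName, serviceName, params, appKey) {
  return {
    url: `${baseUrl}/Thingworx/Things/${encodeURIComponent(thingName)}/Services/${encodeURIComponent(serviceName)}`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Accept': 'application/json',
      // ThingWorx accepts an application key in the appKey header
      'appKey': appKey,
    },
    body: JSON.stringify(params),
  };
}

// Example: query recent history from a Thing's value stream
// (host, Thing name, and key below are placeholders)
const req = buildTwxServiceRequest(
  'https://twx.example.com',
  'Sensor001',
  'QueryPropertyHistory',
  { maxItems: 500 },
  'REPLACE-WITH-APP-KEY'
);
console.log(req.url);
// → https://twx.example.com/Thingworx/Things/Sensor001/Services/QueryPropertyHistory
```

You'd hand the resulting options to whatever HTTP client your controller layer uses; the point is that a value-stream report is just one POST per query.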
Thank you for the reply Wayne.
Sorry if I misunderstood, but do you mean I don't need a dedicated DB server? I mean, where are those value streams stored?
As for the front end, we do intend to use Angular / React, but can it be coded directly against TWX, or is a separate server better?
TWX ships with support for various persistence providers, among them PostgreSQL, SQL Server, and Cassandra. Depending on the IoT use case and licensing limitations, one of the default providers will be more appropriate than the others. PostgreSQL is free and is the default persistence provider option for TWX under standard licensing. The default persistence provider is where TWX will store its representation of the thing model, data tables, and value streams.
So for a true IoT deployment where the incoming data consists of sensor values and similar attributes coming from edge devices, PostgreSQL is robust enough to handle the load and no additional DB is required. I've been told that PTC has tested TWX using PostgreSQL with up to 1,000,000 incoming connections with no issues. If you're going to be significantly above that number of edge devices (and depending on the amount of data being received per device), you may want to start considering something like Cassandra as the default persistence provider.
Thanks Wayne Posner for all these answers.
In our front end we don't view live data, only queried data. In addition, we want it to run smoothly and not requery the whole DB every time I change a field. With a standard DB we would have run the queries in the background, once in a while, to update auxiliary tables that are easier to draw data from. Are these features available with TWX? You said we can rely on value streams and queries; can we output these queries to other streams or tables, so reaction times for the user will be shorter?
Sorry, forgot to answer your question about Angular. It would be significantly easier to create a separate view layer and connect to TWX using REST APIs rather than trying to build your customizations directly into TWX. Each customization would essentially be a custom widget. If you go with a separate view layer, then your UI is just your UI. You're free to make it look and behave however you wish, all the way down to custom CSS files. You don't have to use the mashup builder or try to figure out how to make TWX display graphical widgets that might not currently exist. Your controller layer (via a Node-based framework) talks to TWX via the REST APIs and then passes data back to your Angular UIs.
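As a sketch of what one of those controller routes could look like (the route path, Thing name, and response shape are all hypothetical; in a real app you'd register the route object with `server.route(...)` from `@hapi/hapi` and back the client with real REST calls):

```javascript
// Sketch of a controller-layer route in the Hapi route-object style.
// The TWX client is injected so the handler can be exercised without a live server.
function makeReadingsRoute(twxClient) {
  return {
    method: 'GET',
    path: '/api/readings/{thingName}',   // hypothetical UI-facing path
    handler: async (request) => {
      // Forward the call to ThingWorx and reshape the result for the UI
      const result = await twxClient.invokeService(
        request.params.thingName,
        'QueryPropertyHistory',          // built-in value-stream query service
        { maxItems: 100 }
      );
      return { thing: request.params.thingName, rows: result.rows };
    },
  };
}

// Stubbed client standing in for the real REST calls back to TWX
const stubClient = {
  invokeService: async (thing, service, params) => ({
    rows: [{ timestamp: 1700000000000, temperature: 21.5 }],
  }),
};

const route = makeReadingsRoute(stubClient);
route.handler({ params: { thingName: 'Sensor001' } })
  .then((payload) => console.log(payload.rows.length)); // prints 1
```

The UI only ever sees the shape your controller returns, which is exactly the decoupling that makes the separate view layer attractive.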
Hope this helps.
First, check whether ThingWorx is really required for your use cases. For non-IoT use cases you can develop with the latest standard web technologies, of which there are already plenty on the market.
If ThingWorx is required, then given your data input frequency you should go for an external database. PostgreSQL may drop in performance once the database size exceeds 500 GB. If you use the default persistence provider for your needs, your data-ingestion Thing will always be busy inserting data into your Postgres database, and you will feel the slowness over time, since ThingWorx Composer itself is stored in and drawn from the default persistence provider.
An application built on PostgreSQL may respond quickly enough for a single user, but consider what happens when multiple users are logged in and performing time-consuming tasks in your application. So, given your data ingestion rate and volume, it is advisable to go for an external DB.