Software Design Patterns in ThingWorx

joserobertsE
5-Regular Member


 

Hi All,

I did a few quick searches on Google and here and didn't find anything that quite worked or explained how to use common design patterns, such as the Factory pattern, within ThingWorx. I understand that not all concepts from software development and OOP translate perfectly (or are even possible) in ThingWorx, but I do believe such a platform should offer something similar that would allow developers to create something a little more dynamic.

I tried the CloneThing and CreateThing methods, but these create actual instantiations of things (i.e. concrete, permanent things). What I am trying to do is closer to having a generic Thing, with all the properties and services defined, and creating a "copy" of it by calling a service and passing in parameters. This should not create an actual Thing but rather, in OOP terms, an instance that lives in memory until it is discarded. One example of how this could be used: allowing the user/application to pass the arguments needed to connect to a database, without having to store them as values in the Thing's properties.

I know you can do this by leaving the values undefined and creating a service. However, from a behavioral standpoint, it was difficult to tell whether the Thing was more of a Singleton and I was simply changing its property values by calling a set-property method, or whether I had in fact instantiated two distinct objects (i.e. different addresses in memory), as in the Factory design pattern. If there is some documentation or a tutorial, or if anyone has successfully applied the Factory design pattern in ThingWorx, I would appreciate some guidance.

Now for the reason: there are cases where we will need 300+ instantiations, and others where we will need 300 * 5000 instantiations, etc. Creating and maintaining individual actual Things is not desirable, even if we could use a script or routine to create them. I would rather instantiate my Things from database records and data models.
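
To make the idea concrete, here is a rough sketch of the kind of service I mean (the service, its inputs, and all names are made up for illustration; host, user and password would be inputs of the service, not properties of the Thing):

    // Sketch of a hypothetical "FetchData" service on a generic connector Thing.
    // 'host', 'user' and 'password' are assumed to be service input parameters,
    // so each invocation works on its own local variables and nothing is ever
    // written back to the Thing's properties.
    var connection = {
        host: host,
        user: user,
        password: password
    };

    // Build the per-call result as an infotable, i.e. a transient, in-memory
    // object rather than a persisted Thing.
    var result = Resources["InfoTableFunctions"].CreateInfoTable({
        infoTableName: "ConnectionInfo"
    });
    result.AddField({ name: "host", baseType: "STRING" });
    result.AddField({ name: "user", baseType: "STRING" });
    result.AddRow({ host: connection.host, user: connection.user });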

6 REPLIES

Hello @joserobertsE,

 

Here's how I approach those on my projects:

  • Factory pattern: I create a thing like AssetFactory, with services like CreateAsset, which creates thing instances (yes, they are persistent, so we only do it at the end, to commit the transaction, so to speak); see the sketch after this list;
  • "Transient" things or DTOs: I use INFOTABLEs for that purpose and pass their instances back and forth between services and mashups. I instantiate the actual thing at the very end, once I validated all inputs, etc.;
  • Each thing in the database has exactly one object loaded in memory on platform startup;
  • In your 300 x 5000 example different options are available:
    1. Create 1 500 000 things (not really feasible in 99% cases);
    2. Create 300 things with an INFOTABLE property, which has 5000 rows (usually not a good idea, because of the way infotables are protected from multithreaded access);
    3. Create a Data Table with 1 500 000 rows -- this is the closest to the "database records and data models" concept you mentioned, but most probably won't work well due to the default indexing configuration of data tables. The rule of thumb is to avoid inserting more than 50K rows into them;
    4. Same as (3), but use Streams for that. This should be your default option in most of the cases;
    5. Handle those 1.5M rows in some external storage, e.g. an RDBMS exposed to your services via a JDBC extension;
    6. Consider using NoSQL storage like InfluxDB (supported by ThingWorx out-of-the-box);
    7. You can always create a ThingWorx extension, which will simply expose something like a public static collection to your services, which you can use as a cache. It's easy to shoot yourself in the foot by doing that, so you need to make sure that it's the only viable option, and you need to know what you're doing.
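
To illustrate the first two bullets, here is a rough sketch of such a factory service (AssetFactory, CreateAsset, AssetTemplate and the field names are just example entities, not anything built-in):

    // Hypothetical "CreateAsset" service on an "AssetFactory" Thing.
    // 'assetName' and 'serialNumber' are assumed service inputs.

    // 1. Build a transient DTO as an infotable; it lives only in memory
    //    and can be passed between services and mashups for validation.
    var dto = Resources["InfoTableFunctions"].CreateInfoTable({
        infoTableName: "AssetDTO"
    });
    dto.AddField({ name: "name", baseType: "STRING" });
    dto.AddField({ name: "serialNumber", baseType: "STRING" });
    dto.AddRow({ name: assetName, serialNumber: serialNumber });

    // 2. Only at the very end, once everything is validated, "commit" by
    //    creating the persistent Thing from a template.
    Resources["EntityServices"].CreateThing({
        name: assetName,
        thingTemplateName: "AssetTemplate",   // assumed to exist
        description: "Created by AssetFactory"
    });
    Things[assetName].EnableThing();
    Things[assetName].RestartThing();

    // Copy the validated values onto the new Thing (assuming the template
    // defines a matching 'serialNumber' property).
    Things[assetName].serialNumber = dto.rows[0].serialNumber;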

It would be easier to give meaningful advice if you shed some light on your data model and use case.

 

Regards,

Constantine

joserobertsE
5-Regular Member
(To:Constantine)

Thanks @Constantine,

 

What I want to achieve is something close to option 5 as you described it. Instead of creating a persisted Thing within the ThingWorx platform, I want to make a callable FactoryThing that creates persisted things in the database. However, this FactoryThing would need to handle multiple concurrent calls. Once all Things have been created in the database, the FactoryThing would be used to fetch a record from the database for use in the program.

 

https://www.tutorialspoint.com/design_pattern/factory_pattern.htm

 

I have used the factory pattern described in the link above, and it is very convenient for loading the content and context of the object when a request is made, even when creating or calling multiple objects at the same time.

 

Suppose the 300 Things are transient things in ThingWorx and I use infotables. Since infotables are protected from multithreaded access, would this mean those transient things are not readable by multiple services? And would that mean that, if they are persisted Things created in ThingWorx, they are multithread capable?

@joserobertsE, can you tell us a bit more about your use case? What are those 300 / 5000? Factory machines and their readings? Airplanes and their sensors? The design pattern will depend on the nature of this data.

 

/ Constantine

joserobertsE
5-Regular Member
(To:Constantine)

@Constantine, sadly the company I work for is not using the tool as intended, and thus we don't really connect to actual IIoT devices. Our ThingWorx instance is used more as an application development platform. All of our data lives in databases located at the individual plants.

 

Sensor data (wired) > DCS (streams) > CIMIO (data buffer/cache) > Long Term Storage (data historian) > SOAP (API) > ThingWorx

 

I would like to avoid creating 300 Things to connect to the different APIs. Instead, I would prefer to create an interface that allows me to connect and fetch the relevant data that the user (service/mashup/BI tool) requests from a specific API by providing the API address: 300 (APIs) * 5000+ (searchable sensors). This is just one example of what we will need to do. In the future we will have more structure, with a more direct mapping from the simulated individual asset (pump, actuator, etc.) to its physical sensor (from the historian data), thus increasing the overall number of Things that we will have to create, maintain, and possibly synchronize between all the sources.

@joserobertsE, if you need an efficient search index for 1.5M sensors, then I would definitely advocate for an external database for keeping this index. Consider setting up an external server, which you connect to ThingWorx via JDBC extension, wrapping all access to it in SQL services. It doesn't have to be a relational database, something like Lucene exposed via JDBC might be a better option.
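
For illustration, the wrapper could look something like this (SensorIndexDatabase and FindSensorsByName are placeholders for a Database Thing and a SQL Query service you would define in Composer against the JDBC connection):

    // Hypothetical wrapper service on a facade Thing.
    // "SensorIndexDatabase" is assumed to be a Database Thing (JDBC extension)
    // that exposes a SQL Query service "FindSensorsByName" with a
    // 'namePattern' parameter; 'searchTerm' is an input of this wrapper.
    var sensors = Things["SensorIndexDatabase"].FindSensorsByName({
        namePattern: searchTerm
    });

    // The wrapper returns a plain infotable, so callers never deal with
    // SQL or JDBC details directly.
    var result = sensors;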

 

If you really work on a system aggregating data from hundreds of historians / factories and millions of sensors, then keeping a searchable index will probably be the least of your concerns, even if you're just proxying external APIs and acting as an API gateway.

 

/ Constantine

joserobertsE
5-Regular Member
(To:joserobertsE)

So I found what seems like a feasible solution, but I don't know the best way to test it to make sure it can handle the scale. Essentially, my approach consists of creating a generic Thing instance as my connector, with the right properties and services. The properties are what is necessary to connect to the specific historian source (host, user, password, etc.). I then created a Data Shape with the same properties and a Data Table based on it (for now the data is stored in the internal database). From the generic Thing instance, I can now create new entries for additional historian configurations (calling the AddOrUpdateDataTableEntry service), and I can fetch the right historian configuration (calling the QueryDataTableEntries service) by passing a key parameter to the query.
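
For reference, the two calls look roughly like this (HistorianConfigTable, HistorianConfigShape, the field names and the service inputs are my own placeholders):

    // Register (or update) a historian configuration in the Data Table.
    // 'historianKey', 'host', 'user' and 'pw' are assumed service inputs;
    // HistorianConfigShape is assumed to define matching fields, with 'key'
    // as the Data Table's primary key.
    var row = DataShapes["HistorianConfigShape"].CreateValues();
    row.AddRow({ key: historianKey, host: host, user: user, pw: pw });

    Things["HistorianConfigTable"].AddOrUpdateDataTableEntry({
        values: row,
        sourceType: "Thing",
        source: me.name
    });

    // Later, fetch the configuration for a given historian by its key.
    var query = {
        filters: {
            type: "EQ",
            fieldName: "key",
            value: historianKey
        }
    };
    var config = Things["HistorianConfigTable"].QueryDataTableEntries({
        maxItems: 1,
        query: query
    });
    // 'config' is a transient infotable, i.e. the in-memory "instance"
    // used for this particular request.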

 

This now gives me something like the Factory method, where I have a generator for different historian connections in a dynamic fashion. What I haven't been able to test is how many simultaneous connections it can support, and, if two or more users call the services at the same time, whether each request executes as a different object in memory or whether it reuses the same memory space and behaves more like a LIFO or FIFO queue under a race condition. In design pattern terminology: will it function as a Factory or as a Singleton?
