IoT Tips

Build, customize, and deploy IoT applications with ThingWorx.

NOTE: Complete the following guides in sequential order. The estimated time to complete this learning path is 270 minutes.

1. Create Your Application UI (Parts 1-5)
2. Basic Mashup Widgets (Parts 1-3)
3. Define Your UI Style (Parts 1-2)
4. Object-Oriented UI Design Tips
5. Deploy an Application
6. How to Display Data in Charts (Parts 1-3)
View full tip
Use the Collection Widget to organize visual elements of your application.

GUIDE CONCEPT

This project will introduce the Collection Widget.

Following the steps in this guide, you will learn to display different values from a single dataset in real-time.

We will teach you how to utilize the Collection Widget to generate a series of repeated Mashups for every row of data in a table.

YOU'LL LEARN HOW TO

Create a Data Shape to define the columns of a table
Create a Thing with an Info Table Property
Create a base Mashup to display data
Utilize a Collection Widget to display data from multiple rows of a table

NOTE: This guide's content aligns with ThingWorx 9.3. The estimated time to complete ALL 3 parts of this guide is 60 minutes.

Step 1: Create a Data Shape

The Collection Widget uses a Service to dynamically define visual content.

The data must be in a tabular format (for example: Data Table, Info Table, or external Database connection) in order for the Collection Widget to access it.

In this part of the lesson, we'll create a Data Shape to define the columns of the table.

In a subsequent step, we'll create an Info Table Property within a Thing in order to store the information.

1. In the ThingWorx Composer Browse tab, click Modeling > Data Shapes, + New.
2. In the Name field, enter cwht_datashape.
3. If Project is not already set, search for and select PTCDefaultProject.
4. At the top, click Field Definitions.

First Definition:

1. Click + Add to open the New Field Definition slide-out.
2. In the Name field, enter first_number.
3. Change the Base Type to NUMBER.
4. Check the Is Primary Key box. All Data Shapes must have a single Primary Key, and the first Field is as acceptable as any other for our purposes here.
5. At the top-right, click the "Check with a +" button for Done and Add.

Second Definition:

1. In the Name field, enter second_number.
2. Change the Base Type to NUMBER.
3. At the top-right, click the "Check with a +" button for Done and Add.

Third Definition:

1. In the Name field, enter third_number.
2. Change the Base Type to NUMBER.
3. At the top-right, click the "Check" button for Done.
4. At the top, click Save.

Step 2: Create a Thing

In the previous step, we created a Data Shape to define the Info Table Property columns.

Now, we will create a Thing, add an Info Table Property, format the Info Table Property with the Data Shape, and set some default values.

1. On the ThingWorx Composer Browse tab, click Modeling > Things, + New.
2. In the Name field, enter cwht_thing.
3. If Project is not already set, search for and select PTCDefaultProject.
4. Select GenericThing in the Base Thing Template field.

Add InfoTable Property

1. At the top, click Properties and Alerts.
2. Click the + Add button to open the New Property slide-out.
3. In the Name field, enter infotable_property.
4. Change the Base Type to INFOTABLE.
5. In the Data Shape field, select cwht_datashape.
6. Check the Persistent checkbox.

First Default

1. Check the Has Default Value checkbox. A cwht_datashape icon will appear underneath.
2. Under Has Default Value, click the cwht_datashape button to open the pop-up menu which sets the default values.
3. Click the + Add icon.
4. Enter values in each number field, such as 1, 2, and 3.
5. At the bottom-right, click the green Add button.

Second Default

1. Click the + Add icon.
2. Enter values in each number field, such as 10, 20, and 30.
3. At the bottom-right, click the green Add button.

Third Default

1. Click the + Add icon.
2. Enter values in each number field, such as 100, 200, and 300.
3. At the bottom-right, click the green Add button.
4. At the bottom-right, click Save to close the pop-up menu.
5. At the top-right, click the "Check" button for Done.
6. At the top, click Save.

Click here to view Part 2 of this guide.
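As an optional aside before moving on: the same three default rows can also be written programmatically from a ThingWorx Service. This is only a sketch, and it assumes the cwht_datashape, cwht_thing, and infotable_property entities created above already exist:

// Build an InfoTable based on cwht_datashape (entity names assume the guide above was followed)
var params = {
    infoTableName: "InfoTable",
    dataShapeName: "cwht_datashape"
};
var table = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// Add the same three rows used as default values in the guide
table.AddRow({ first_number: 1, second_number: 2, third_number: 3 });
table.AddRow({ first_number: 10, second_number: 20, third_number: 30 });
table.AddRow({ first_number: 100, second_number: 200, third_number: 300 });

// Write the table into the Info Table Property on cwht_thing
Things["cwht_thing"].infotable_property = table;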
View full tip
Original Post Date: June 6, 2016

Description: This is a video tutorial on creating a Stream, adding a Data Shape with properties, and writing values to the Stream.
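To complement the video, here is a minimal sketch of writing one entry to a Stream from a ThingWorx Service. The Stream name (MyStream) and field name (value) are placeholders; substitute the entities created in the video:

// Create an empty row that matches the Stream's Data Shape
var values = Things["MyStream"].CreateValues();
values.value = 42; // NUMBER field defined on the Data Shape

// Write the row to the Stream
Things["MyStream"].AddStreamEntry({
    values: values,
    sourceType: "Thing",
    source: me.name,
    timestamp: new Date(),
    tags: undefined,
    location: undefined
});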
View full tip
I have put together a small sample of how to get property values from a Windows PowerShell command into ThingWorx through an agent using the Java SDK. To use it, import the entities from ExampleExport.xml and then run SteamSensorClient.java, passing in the parameters shown in run-configuration.txt (the URL, port, and AppKey must be adapted for your system). ExampleExport.xml is a sample file distributed with the Java SDK, which I've also included in the zip file attached to this post. To import it, go in ThingWorx Composer to Import/Export … Import from File … Entities … Single File … Choose File … Import. Further instructions and details are given in this short video: Video Link : 2181
View full tip
It's critical to configure the correct parameters when running your application in a production environment, or even in development. While the GUI makes it user-friendly and easy to set the right values in the right fields, it's useful to know how to do the same programmatically, without the "Configure Tomcat" utility.

One way, if you're running Tomcat as a Windows service, is to adjust the JVM options by going to the bin directory and running:

tomcat8 //US//MYSERVICENAME ++JvmOptions=-Dexample.license.directory="C:\Program Files\example"

Turn the service off before you do this and restart it when you finish.

cd $CATALINA_HOME
.\bin\service.bat install tomcat
.\bin\tomcat8.exe //US//tomcat8 --JvmMs=512 --JvmMx=1024 --JvmSs=1024

Setting the --JvmXX parameters may not be enough; you may also need to specify the JVM memory values explicitly. From the command line it may look like this:

.\bin\tomcat8w.exe //US//tomcat8 --JavaOptions=-Xmx1024m;-Xms512m;..

Be careful not to override the other Java options.

But the best and recommended way is to use setenv.sh/setenv.bat (Linux/Windows respectively). It isn't in the as-downloaded Tomcat, but if you look in catalina.sh/catalina.bat, there's a check for a file called setenv; if it's there, it's run. That's where you set JAVA_OPTS, CATALINA_OPTS, etc. We use it to set JAVA_HOME, JAVA_OPTS, CATALINA_OPTS and JPDA_ADDR. Putting all your environment variables into this file is ideal because then you don't have to change the stock startup scripts. Then, when monitoring the log, we can see the parameters that were applied.
View full tip
Recently, mentor.axeda.com was retired. The content has not yet been fully migrated to the ThingWorx Community, though a plan is in place to do this over the coming weeks. Attached, please find OneNote 2016 and PDF attachments that contain the content that was previously available on the Axeda Mentor website.
View full tip
In this video you will see how to start using your already-created virtual image of ThingWorx Analytics with Oracle VirtualBox. This video is Part 1 of the series Getting Started with ThingWorx Analytics.   Updated link for access to this video: Getting Started with ThingWorx Analytics: Part 1 of 2
View full tip
I recently had a customer who wanted to run services on ThingWorx from Power BI to retrieve existing operational data, and we were a bit stumped on how to pass the API key over in the headers, so I did a bit of Googling and pieced together the solution. It's not quite intuitive on the Power BI side, so I thought it would be helpful to share. If you have any other experience with integrating ThingWorx with Power BI, feel free to add a comment.    Prepare ThingWorx Create an Application Key that has Run Time execution access to the services you need. Understand the inputs needed for the service you would like. I'll have examples of none, one, an InfoTable, and multiple inputs.   Power BI Following the following steps in Power BI: 1. In Power BI, create a new blank query   2. On the left, right click on Query1 and go to the Advanced Editor:   3. Replace all of the body content with the following, replacing your API key, appropriate end point, and base URL as needed (this is an example with NO input parameters, I'll follow with examples of other parameters):     let appKey = "your-application-key-here", endpoint = "Things/YourThingNameHere/Services/YourServiceNameHere", baseUrl = "https://YourServerNameHere/Thingworx/", url = Text.Combine({baseUrl,endpoint}), body = "", request = Web.Contents( url, [ Headers = [ appKey = appKey, #"Content-Type" = "application/json", Accept = "application/json" ], Content = Text.ToBinary(body) ] ), Source = Json.Document(request) in Source       4. Click "Done", and now you'll have a warning about how to connect. Click the "Edit Credentials" button. 5. Leave it on Anonymous and click "Connect":   6. You should now see the return data coming from ThingWorx.   Note that I had a little trouble with this authentication initially and it saved the wrong method. To clear that out, go to the ribbon bar item "Data source settings" and select the server and clear it out.   Other Examples Here is an example for sending a single string parameter:   let appKey = "your-application-key-here", endpoint = "Things/YourThingNameHere/Services/YourServiceNameHere", baseUrl = "https://YourServerNameHere/Thingworx/", url = Text.Combine({baseUrl,endpoint}), body = "{""InputParameter"": ""InputValue""}", request = Web.Contents( url, [ Headers = [ appKey = appKey, #"Content-Type" = "application/json", Accept = "application/json" ], Content = Text.ToBinary(body) ] ), Source = Json.Document(request) in Source     Here's an example of sending a string and an integer: let appKey = "your-application-key-here", endpoint = "Things/YourThingNameHere/Services/YourServiceNameHere", baseUrl = "https://YourServerNameHere/Thingworx/", url = Text.Combine({baseUrl,endpoint}), body = "{""InputString"": ""Hello, world!"", ""InputNumber"" : 42}", request = Web.Contents( url, [ Headers = [ appKey = appKey, #"Content-Type" = "application/json", Accept = "application/json" ], Content = Text.ToBinary(body) ] ), Source = Json.Document(request) in Source   Here is an example for sending an InfoTable. Note that you must supply the dataShape with fieldDefinitions. If you're using an existing Data Shape, you can get the JSON by using the service GetDataShapeMetadataAsJSON() that is on the data shape.     
let appKey = "your-application-key-here", endpoint = "Things/YourThingNameHere/Services/YourServiceNameHere", baseUrl = "https://YourServerNameHere/Thingworx/", url = Text.Combine({baseUrl,endpoint}), body = "{""propertyNames"": { ""rows"": [ { ""name"": ""FirstEntityName"", ""description"": ""The first entity"" }, { ""name"": ""SecondEntityName"", ""description"": ""The second entity"" }], ""dataShape"": { ""fieldDefinitions"": { ""name"": { ""name"": ""name"", ""aspects"": { ""isPrimaryKey"": true }, ""description"": ""Entity name"", ""baseType"": ""STRING"", ""ordinal"": 0 }, ""description"": { ""name"": ""description"", ""aspects"": {}, ""description"": ""Entity description"", ""baseType"": ""STRING"", ""ordinal"": 0 } } } }}", request = Web.Contents( url, [ Headers = [ appKey = appKey, #"Content-Type" = "application/json", Accept = "application/json" ], Content = Text.ToBinary(body) ] ), Source = Json.Document(request) in Source       If I find any more interesting ways to use Power BI with ThingWorx services, I'll add them on here.  
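On the ThingWorx side, the Service being called only needs to return an InfoTable for Power BI to parse. Below is a minimal sketch of such a Service body; the Data Shape name and fields are placeholders, not part of the original post:

// Example body for a ThingWorx Service with an INFOTABLE output (sketch)
var params = {
    infoTableName: "InfoTable",
    dataShapeName: "YourDataShapeNameHere" // placeholder Data Shape
};
var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// Add a sample row; Power BI will see these columns after expanding the "rows" records
result.AddRow({ name: "Asset-01", description: "Example row" });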
View full tip
Hi everyone,

In anticipation of ThingWorx 9.0's biggest feature, active-active clustering, we'd like to provide an architectural overview of a sample active-active configuration and its underlying components. If you haven't already seen it, we invite you to read our previous Community tech tip, where we introduce the concept of active-active clustering for ThingWorx Foundation, which enables you to:

significantly reduce unplanned downtime for your mission-critical services and apps
support horizontal scalability of the ThingWorx Server, where you can scale your services up and down based on your requirements
easily run, package, deploy, and operate advanced apps and services with the help of intuitive browser-based navigation, interactive monitoring and debugging tools, and more
deploy anywhere - public cloud, private datacenter, on-premise, hybrid, or even locally on your laptop, with deep optimizations for Azure

Now, before we go too deep, we'd like to let you know that you can continue to seamlessly upgrade from previous ThingWorx releases to the upcoming ThingWorx 9.0 releases. Previously, you could deploy ThingWorx Foundation in a "single server" mode and, for high availability, in an "active-passive cluster" mode (see here for details). From the ThingWorx 9.0 release onwards, you'll be able to continue to deploy ThingWorx Foundation in a "single server" mode, and for high-availability scenarios via our new "active-active cluster" mode. Please note that the active-passive clustering configuration will no longer be supported in ThingWorx 9.0 or onwards.

Let's start with a quick recap of what the ThingWorx Foundation 9.0 release looks like in a single-server deployment.

ThingWorx 9.0 Deployed in a "Single-Server" Architecture

Below is a high-level diagram depicting the main architectural layers and components of ThingWorx Foundation deployed in single-server mode.

Deployment Components

Below is a brief summary of all major architectural components and their purpose in the deployment architecture:

The Client Layer

This layer is comprised of everything that connects with, sends data to, and receives content from the ThingWorx platform. It can be broken down into two groups:

Devices/Things: Things, devices, agents, and other assets.
Users/Clients: People and the respective products (primarily web browsers) they use to access ThingWorx.

The Application Layer

This layer is where ThingWorx Foundation and other applications deployed with ThingWorx Foundation reside, such as ThingWorx Analytics, ThingWorx Connection Server, ThingWorx Azure IoT Hub Connector and others. This layer provides connectivity to the client layer, performs authentication and authorization checks, ingests/processes/analyzes content, and reacts to conditions by sending alerts. For a specific ThingWorx Foundation deployment that needs basic device data ingestion, processing and storage, you can set up only the ThingWorx Foundation server. In some cases, with a large number of device connections, you may want to set up a ThingWorx Connection Server with ThingWorx Foundation in a single server for further scalable connectivity.

ThingWorx Foundation: ThingWorx Foundation is a Java-based application that serves as a rapid, model-based application development platform.
Shared File Storage: Shared disk space to contain ThingWorx Storage repositories and to store and archive log files accessed by all ThingWorx Foundation servers. 
A NAS file storage, AWS Elastic File System, or Azure Files and others could be used for this purpose.   The Data Layer ThingWorx Foundation includes several persistence provider implementations that enable you to choose a database option that best fits your use case. A persistence provider enables the connection to a data store and the ability to perform a CRUD operation on that data. See here for more information. Currently, there are two basic variations of persistence providers: Model Provider – Responsible for ThingWorx model metadata and system data. Data Provider – Responsible for runtime data ingested against the model elements, including streams, value streams, data tables, etc. ThingWorx supports H2 (in-memory Database), PostgreSQL, MS SQL Server and AzureSQL as both model and data providers, and InfluxDB as only a data provider. Please see here for model and data best practices.   ThingWorx 9.0 Deployed in an Active-Active Clustering Reference Architecture Below is a reference architecture diagram for ThingWorx 9.0 with multiple ThingWorx Foundation servers configured in an active-active cluster deployment. Please note that this is only one reference example of how ThingWorx 9.0 can be deployed in an active-active clustered environment. There could be other architectural configurations dependent upon the needs of the specific deployment. Deployment Components Once you have developed an understanding of the basic architectural components in a single server mode, below are the additional components required to run ThingWorx in active-active cluster mode.   The Client Layer This will be similar to what has been mentioned in the above single server configuration.   The Application Layer In this layer, if you’re familiar with ThingWorx active-passive cluster configuration, then you may be aware of most of the components used below—with the exception of a new component: Apache Ignite that provides Distributed Caching for the horizontally scalable ThingWorx Foundation servers. Load Balancer: A third-party device that receives network traffic and distributes requests among available servers. In active-active cluster configuration, the load balancer is used to direct WebSocket-based traffic to the ThingWorx Connection Servers while user requests (http/https) traffic is directly distributed to the ThingWorx Foundation servers. Users can continue to use a load balancer with ThingWorx 9.0 that they might already be using for their existing active-passive or single server deployments with ThingWorx 8.X or previous releases.  Some example load balancers include, but are not limited to: HAProxy, Azure Application Gateway, and AWS Application Load Balancer. ThingWorx Connection Services: These services handle all messages routing to and from devices, providing scalable connectivity to the ThingWorx Foundation Server. With the ThingWorx 9.0 release, ThingWorx Connection Services have been upgraded with many additional features to support active-active clustering of the ThingWorx Foundation servers, where now they route all WebSocket traffic in a round robin fashion to the connected ThingWorx Foundation servers. Depending upon the various use cases, one could use multiple ThingWorx Connection Services available, such as ThingWorx Connection Server, ThingWorx Azure IoT Hub Connector, and ThingWorx Protocol Adapter Toolkit. Please see here for further details. 
Please note that for ThingWorx 9.0 releases, ThingWorx Connection Server would be required in an active-active configuration to support all the WebSocket-based traffic routing, including egress of files and device messages from multiple ThingWorx Foundation servers back to the devices, and it would also serve as WebSocket communication from ThingWorx Mashup-based applications to ThingWorx Composer. ThingWorx Foundation: With ThingWorx 9.0 and onwards, you can set up ThingWorx Foundation servers in an N-active-active cluster model to provide higher availability to your applications and horizontally scale the Foundation server nodes up and down based on your scalability needs. Apache Zookeeper: Apache ZooKeeper is a centralized service for maintaining configuration information and naming as well as providing distributed synchronization and group services. It is a coordination service for distributed applications that enables synchronization across a cluster. Specific to ThingWorx, ZooKeeper is used for distributed locking, selecting a singleton server during the server initialization, service discovery for Apache Ignite, allowing it to find instances of ThingWorx Foundation servers. Apache Ignite: This offers a distributed cache for the active-active cluster setup. It is used by ThingWorx Foundation Servers to share state. It may be embedded with each ThingWorx instance or can be run as a standalone cluster for larger scale. In this configuration, Ignite is set up in a standalone cluster but can be run embedded within the ThingWorx Foundation. Running Ignite in a standalone cluster is more ideal for larger scale, as it supports higher vertical scale of memory in the deployment setup.   The Data Layer ThingWorx is largely database agnostic. You can continue to use officially supported persistence providers that you may already be using in your existing deployments based off of ThingWorx 8.X or previous releases. Please look out for an upcoming ThingWorx update as well as enhanced installation documents to help with your upgrade and migration questions with the general availability of ThingWorx 9.0. Please note that this diagram does not make the distinction between model and data providers; depending on your data ingestion needs, separate model and data providers can be used. As a reminder, all databases should be deployed in a high-availability configuration to help eliminate any single point of failure.   In closing, we can't wait to launch active-active clustering in 9.0 to help you: dramatically further reduce application downtime scale your deployments and more efficiently manage your apps, regardless of where they’re deployed   If you have any questions about active-active clustering or its architecture, please do not hesitate to reach out!   Stay connected! Kaya
View full tip
Users of ThingWorx Analytics (TWA) may choose to create a predictive model using TWA or import a predictive model that was created using other software. When importing into or exporting out of TWA, this predictive model must be in a PMML (Predictive Model Markup Language) version 4.3+ format. This post describes how to complete the import and export processes.

Exporting:

The user may create a model in two main ways inside of TWA: using the Builder user interface, or by using the 'Create Job' service that exists on the Training Thing. Whichever method is used, a model Job Id is created automatically by TWA for that model. It is this model Job Id that is used to identify the model inside of TWA, regardless of what is being done with that model.

If a model is trained using Builder, the user may highlight that model, click 'Job Details', and then copy the Job ID. This is done as follows:

Next, the user will navigate to Browse --> Things --> …TrainingThing. This is the Training Microservice inside of TWA where all the functionality involved with training a model exists. Within the …TrainingThing, the user will execute the 'RetrieveModel' service under Services. When executing the service, the user will paste the model Job ID (ex. 49704f1a-7fcd-4e38-ab53-84ef46517d0a) they copied earlier, and press 'Execute'. The resulting text can then be highlighted and copied to Notepad or some other text editor, and saved in .pmml format (ex. 'ModelExport.pmml').

Importing Through the Results Microservice:

To import a model that has been saved in PMML 4.3+ format into TWA using the Results Microservice, the user will navigate to Manage --> Repositories (ex. AnalyticsUploadStorage) --> Actions --> Upload, and choose the PMML file. The user will then navigate to Browse --> Things --> …ResultsThing. This is the Results Microservice inside of TWA where all the functionality exists related to previously trained models. Within the …ResultsThing, the user will execute the 'UploadModel' service under Services. Alternatively, the user can upload the model from any repository using the 'UploadModelFromRepository' service.

To create a model from the uploaded PMML inside of TWA, the user will fill out the filePath and name, then execute the service. Note: This model will not show up in Builder, as that would require model validation information that is not part of the imported PMML file.

The resulting Job Id can be used to make predictions, such as by using the …PredictionThing's BatchScore or RealtimeScore services. At this point, the uploaded model acts the same way as if the model were created inside of that TWA environment.

Importing Through Analytics Manager:

To import a model that has been saved in PMML 4.3+ format into TWA using the Analytics Manager, the user will navigate to Analytics --> Analytics Manager --> Analysis Models, and click the green "New" button. Next the user will choose the provider name (or create a new one by navigating to Analytics --> Analytics Manager --> Analysis Providers). The user will also check the box to "Upload Model", and click the grey "Choose File" button to find the PMML file. Finally, the user will click the black "Upload" button, then the green "Save" button.

At this point, the model is uploaded into ThingWorx Analytics, and the user may progress through the subsequent steps to set up "Analysis Events" and "Analysis Jobs" that will be powered by the imported model.
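For reference, the export steps above can also be scripted. The sketch below is an assumption-heavy illustration: the Thing names follow the ...TrainingThing and repository names mentioned above, and the input parameter name jobId is a guess; check the actual Inputs tab of the RetrieveModel service on your system before using it.

// Programmatic version of the export steps above (sketch; names are assumptions)
var pmml = Things["AnalyticsServer_TrainingThing"].RetrieveModel({
    jobId: "49704f1a-7fcd-4e38-ab53-84ef46517d0a" // model Job ID copied from Builder
});

// Save the PMML text into a file repository for later import
Things["AnalyticsUploadStorage"].CreateTextFile({
    path: "/ModelExport.pmml",
    content: pmml,
    overwrite: true
});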
View full tip
Previously:
Installing & Connecting C SDK to Federated ThingWorx with VNC Tunneling to the Edge device
Installing and configuring Web Socket Tunnel Extension on ThingWorx Platform

Overview

Using the C SDK Edge client configuration we did earlier, we'll now create the setup illustrated above. In this C SDK client we'll push the data to the ThingWorx Publisher (server name: TW802Neo) and on to the ThingWorx Subscriber (server name: TW81). Notice that the SteamSensor2 on the publisher server is the one binding to the C SDK client, and the FederatedSteamThing on the subscriber is only getting data from the SteamSensor2. Let's crack on!

Content

Configure ThingWorx to publish
Configure ThingWorx to subscribe
Publish entities from the Publisher to the Subscriber
Create a Mashup to view data published to the subscriber

Pre-requisite

The minimum requirement is to have two ThingWorx servers running. Note that both ThingWorx systems can be publisher & subscriber at the same time.

Configure ThingWorx Publisher

Configuring the Federation Subsystem

1. Navigate to ThingWorx Composer > Subsystems > Federation Subsystems and configure the following highlighted sections.

Essentially it's required to configure:
the Server Identification, i.e. My Server name (FQDN / IP), My Server Description (optional)
Federation subscribers this server publishes to, i.e. all the servers you want to publish to from this server

Refer to the Federation Subsystem doc in the Help Center to check the detailed description of each configurable parameter.

2. Save the Federation Subsystem.

Configuring a Thing to be published

1. Navigate back to the Composer home page and select the entity which you'd like to publish.
2. In this case I'm using SteamSensor2, which is created to connect to the C SDK client.
3. To publish, edit the entity and click on the Publish checkbox, like so.
4. Save the entity.

Configure ThingWorx Subscriber

Configuring the Federation Subsystem

1. Navigate to ThingWorx Composer > Subsystems > Federation Subsystems and configure the following highlighted sections.

Essentially it's required to configure the Server Identification, i.e. My Server name (FQDN / IP), My Server Description (optional). Refer to the Help Center doc on Federation Subsystems should you need more detail on the configurable parameters. If you only want to use this server as a subscriber of entities from the publishing ThingWorx server, you don't have to configure the section Federation subscribers this server publishes to; I've configured it here anyway to show that both servers can work as publishers and subscribers.

2. Save the Federation Subsystem.

Configuring a Thing to subscribe to a published Thing

1. Subscribing to an entity is fairly straightforward; I'll demonstrate by utilizing the C SDK client which is currently pushing values to my remote thing called SteamSensor2 on server https://tw802neo:443/Thingworx.
2. I have already published the SteamSensor2, see the above section Configuring a Thing to be published.
3. Create a Thing called FederatedSteamThing with RemoteThingWithTunnels as the ThingTemplate.
4. Browse for the Identifier and select the required entity to which binding must be done, like so.
5. Navigate to the Properties section for the entity, click Manage Bindings to search for the remote properties like so, for adding them to this thing.
6. Save the entity and then we can see the values that were pushed from the C SDK client.
7. Finally, we can also quickly see the values pulled via a Mashup created in the subscriber ThingWorx; below is a simple mashup with a grid widget pulling values using the QueryPropertyHistory service.
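For reference, the grid in that mashup is typically bound to the federated Thing's QueryPropertyHistory service. A minimal equivalent call from a ThingWorx Service (using the Thing name from the example above) looks like this:

// Pull recently logged property values from the federated Thing (sketch)
var history = Things["FederatedSteamThing"].QueryPropertyHistory({
    maxItems: 100,      // number of rows to return
    oldestFirst: false  // newest values first
});
// 'history' is an InfoTable that can be returned from a Service or bound to a grid widget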
View full tip
There are many choices in life and ThingWorx offers some persistence provider options as well. As of ThingWorx release 8.2, five database options are provided:

1. PostgreSQL 9.4.5 minimum
2. DataStax Enterprise Edition 4.6.3, 5
3. SAP HANA SPS 11, 12
4. Microsoft SQL Server 2014 and later
5. H2 (version info is not available, maybe because it's embedded?)

H2 is for small scale, mainly for testing purposes; PostgreSQL and Microsoft SQL Server are for medium scale; and finally DataStax Enterprise Edition is for large scale. I don't have enough information about SAP HANA so would like to leave it untouched in my comment... I don't have a number as to how many customers are using which database, but my gut feeling tells me that PostgreSQL is a popular option, especially cost-wise. PostgreSQL offers powerful tools, such as logging and utilities, to troubleshoot issues.

In this post I would like to cover some useful information you can retrieve by using pgstattuple and pgstatindex from the contrib module. By default, PostgreSQL takes good care of fragmentation and reindexing by itself. But in some cases there's a situation where you want to review the status of the database to narrow down the cause of the issue you're troubleshooting. There are many ways to achieve it, but the contrib module is provided to review the stats of tables and indexes. As explained in this article, it is recommended to keep the number of records in value_stream and stream less than 100,000. That means you'll insert and delete many records when running ThingWorx. What happens then?

If you delete (or update) a record in a table, PostgreSQL keeps the previous record in a page but marks it as deleted (and inserts a new record when it's an update operation).
If the number of those logically deleted records increases, PostgreSQL needs to access many pages of the table to obtain the records which meet the criteria, and users might experience slow performance because of this.
Those logically deleted records will ultimately be removed from pages when vacuum is run.

If you have installed the contrib module and enabled it, you can review the stats of tables with the commands below:

select * from pgstattuple('stream');                   //This returns the stats of the stream table
select * from pgstatindex('stream_id_time_index');     //This returns the stats of an index on the stream table

pgstattuple returns the information below (I modified the format to make it more readable in this post); the meaning of each item is explained in the documentation.

table_len: 8192
tuple_count: 1
tuple_len: 33
tuple_percent: 0.4
dead_tuple_count: 3
dead_tuple_len: 97
dead_tuple_percent: 1.18
free_space: 8004
free_percent: 97.71

Before obtaining the stats, I inserted 4 records and deleted 3 records; therefore it shows a tuple_count of 1 (the active record), a dead_tuple_count of 3 (the logically deleted records), and a dead_tuple_percent of 1.18. If dead_tuple_percent is high, that means the table has not been vacuumed or many DML operations were executed after the last vacuum, and this could be the cause of slow system performance.

* IMPORTANT: pgstattuple and pgstatindex consume resources, so it's recommended to run them during the maintenance window.

Takaaki
View full tip
This example shows how to update objects in Windchill through extensions. It is really hard to find a resource for the Windchill extension's services to take advantage of them, so I wrote a simple example to update objects in Windchill from ThingWorx.

There are three Data Shapes needed to do this. One is "PTC.PLM.WindchillPartUfids", which has only a "value" field (String) in it, and another is "PTC.PLM.WindchillPartCheckedOutDS", which has a "ufid" field (String). The last one is "PTC.PLM.WindchillPartPropertyDS", which has a "ufid" field (String) and fields for "attributes". For an instance of the last Data Shape, there might be three fields, such as "ufid", "partPrice" and "quantity", to update parts. In this example, this Data Shape has two fields, which are "ufid" and "almProjectId".

The service needs two input parameters: ufid (String) and almProjectId (String). If you need to update multiple objects at once, you can use an InfoTable type as the "ufid" input parameter instead of String.

Note that this is example code; add exception handling as needed.

// Build an InfoTable of UFIDs for the object(s) to update
var params = {
    infoTableName : "InfoTable",
    dataShapeName : "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var ufids = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// PTC.PLM.WindchillPartUfids entry object
var newValue = new Object();
newValue.value = ufid; // STRING

ufids.AddRow(newValue);

// Check out
var params = {
    ufids: ufids /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// checkedOutObjs: INFOTABLE dataShape: "undefined"
var checkedOutObjsFromService = me.CheckOut(params);

var params = {
    infoTableName : "InfoTable",
    dataShapeName : "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var checkedOutObjs = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

try {
    var tableLength = checkedOutObjsFromService.rows.length;

    for (var x = 0; x < tableLength; x++) {
        var row = checkedOutObjsFromService.rows[x];

        // PTC.PLM.WindchillPartUfids entry object
        var checkedOutObj = new Object();
        checkedOutObj.value = row.ufid.substring(0, row.ufid.lastIndexOf(":")); // STRING

        //logger.warn("UFID : " + checkedOutObj.value);
        checkedOutObjs.AddRow(checkedOutObj);

        /* Update Objects in Windchill */
        var params = {
            infoTableName : "InfoTable",
            dataShapeName : "PTC.PLM.WindchillPartPropertyDS"
        };

        // CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartPropertyDS)
        var wcInfoTable = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

        // PTC.PLM.WindchillPartPropertyDS entry object
        var newEntry = new Object();
        newEntry.ufid = checkedOutObj.value; // STRING
        newEntry.almProjectId = almProjectId; // STRING

        wcInfoTable.AddRow(newEntry);

        var params = {
            objects: wcInfoTable /* INFOTABLE */,
            modification: "REPLACE" /* STRING */,
            dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
        };

        // result: INFOTABLE dataShape: "undefined"
        var result = me.Update(params);
    }

} catch (err) {
    logger.warn("ERROR caught");
    var params = {
        ufids: ufids /* INFOTABLE */,
        dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
    };

    // result: INFOTABLE dataShape: "undefined"
    var result = me.CancelCheckOut(params);
}

var params = {
    ufids: checkedOutObjs /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// result: INFOTABLE dataShape: "undefined"
var result = me.CheckIn(params);
View full tip
Original Post Date: September 30, 2016

Description: This tutorial video will walk you through the installation process for the PostgreSQL-based version of the ThingWorx Platform (7.2) in a RHEL environment. All required software components will be covered in this video.
View full tip
The video below walks through how to publish a Model from Analytics Builder into Analytics Manager using the connector named TW.AnalysisServices.AnalyticsServer.AnalyticsServerConnector.
View full tip
  Remotely administer Windows Edge IoT Devices without coding.   GUIDE CONCEPT   Learn how to download, install, and configure the Edge Microserver (EMS) to create an AlwaysOn (TM) connection between Edge IoT Devices and ThingWorx Foundation.       YOU'LL LEARN HOW TO   Install the Edge MicroServer (EMS) Configure the EMS Connect the EMS to ThingWorx Foundation   NOTE: The estimated time to complete all parts of this guide is 30 minutes.     Step 1: Description   The Web Socket Edge MicroServer (WSEMS... or just EMS for short) is a pre-compiled application based on the C SDK.     Typically, the EMS is used on devices "smart" enough to have their own operating system, such as a Raspberry Pi or personal computer.   Rather than editing code and compiling into a custom binary (as with the SDKs), the EMS allows you to simply edit some configuration files to point the Edge IoT device towards the appropriate ThingWorx Foundation instance.   In addition, the EMS utilizes the PTC-proprietary AlwaysOn protocol to "phone-home", rather than having Foundation reach out to it. As such, the EMS will typically not require port forwarding/opening, and can easily communicate from a more-secure Edge environment to the Foundation server.     Step 2: Install EMS   In this step, you'll download and extract the EMS onto your personal Windows computer.   Versions of the EMS are available for Linux running on both x86 and ARM processors. Those are outside the scope of this guide, but require only minor modifications versus the instructions presented here.   Download the EMS.   Navigate to the directory where you downloaded the .zip file.     Extract the .zip file and explore into the extracted folder.     Navigate into and Copy all contents inside the \microserver directory.     Navigate to the C:\ root directory.     Create a C:\CDEMO_EMS folder. Note that this directory is not mandatory, but will be used throughout the rest of this guide.      Paste the contents of the extracted \microserver directory into C:\CDEMO_EMS.       Create Additional Directories   New folders may be added to the \CDEMO_EMS directory for various purposes.   Some of these will be utilized within this guide, while others may be utilized in future guides using the EMS.   Note that these particular names are not mandatory, and are simply the names used within this guide.    Create a C:\CDEMO_EMS\other directory. Create a C:\CDEMO_EMS\tw directory. Create a C:\CDEMO_EMS\updates directory.         Create Test Files   It can also be helpful during testing to have some small files in these folders to further demonstrate connectivity.   As these files were custom-created for the guide, seeing them within ThingWorx Foundation ensures that the connection between Foundation and the EMS is real and current.   In the C:\CDEMO_EMS\tw directory, create a text file named tw_test_01.txt. In the C:\CDEMO_EMS\other directory, create a text file named other_test_01.txt.     Click here to view Part 2 of this guide.
View full tip
  Step 8: Call Custom Service   In order to execute a Service of a specific Thing with the REST API, you can use the POST verb.   Required Parameters:   AppKey created by your ThingWorx server Name of the Thing that implements a custom Service Name of the custom Service Names of inputs, if any, required by the Service Request   Construct the URL. To call a custom Service of an existing Thing, make an HTTP POST to this endpoint: <server_ip:port>/Thingworx/Things/<name of Thing>/Services/<name of Service> Substitute <name of Thing> with the actual name of a Thing that exists on the ThingWorx server, and <name of Service> with an existing Service. Send request parameters The names of the inputs along with their values are sent in the body of the POST as a JSON object. For example, the JSON object below will send a parameter named 'firstNumber' with a value of 35 and a parameter named secondNumber with a value of 711. { "firstNumber": "35", "secondNumber": "711" } NOTE: The full request must include a header with the appKey for your specific ThingWorx server.   Response   A successful call to a Service will return a JSON object in the body of the response containing both a DataShape object and an array named rows. Inside the array, an object named result will have the value returned by the custom Service. Here is an example response:   { "dataShape": { "fieldDefinitions": { "result": { "aspects": {}, "baseType": "NUMBER", "description": "", "name": "result", "ordinal": 0 } } }, "rows": [ { "result": 746.0 } ] } WARNING for other HTTP clients: Most HTTP clients do not set a Content-Type header by default, without this header set the server will return an error message. The POST request to the Service endpoint has a JSON body so the header must be set to match the format of the request body.   Step 9: Import and Export Entities   Collections of Entities that perform a function can be grouped then shared by exporting from a server. These entity collections are called Extensions and can be uploaded using the REST API. You can create custom Extensions or download Extensions created by other developers. You can use the REST API to automate the process of uploading an Extension to a ThingWorx server.   Required Parameters   AppKey created by your Foundation server Path to properly formatted zip file containing extension Entities Request   Construct the URL. Upload an Extension by making an HTTP POST to the endpoint: <Server IP:port〉Thingworx/ExtensionPackageUploader Send request parameters. The zip file that contains the extension entities is uploaded as a multi-part POST. The name of the file parameter is upload. Use a library to properly format the multi-part POST request You must also send this header: X-XSRF-TOKEN:TWX-XSRF-TOKEN-VALUE Authenticate the Request. All API requests to the ThingWorx server must be authenticated either with a username and password or with an appKey. For this example we will authenticate by passing the appKey as a URL query string parameter. The parameter appKey is recognized by the ThingWorx server as an authentication credential in requests, it can be passed either as a URL query string parameter .../CreateThing?appKey=64b87... , or as request header appKey: 64b87...   Response   A successful call to upload an Extension will return a description of the Entities that were successfully uploaded in the body of the response.   
HTTPie example: http -f POST iotboston.com:8887/Thingworx/ExtensionPackageUploader upload@/home/ec2-user/extension.zip X-XSRF-TOKEN:TWX-XSRF-TOKEN-VALUE appKey:d0a68eff-2cb4-4327-81ea-7e71e26bb645 cURL example: curl -v --header X-XSRF-TOKEN:TWX-XSRF-TOKEN-VALUE --header appKey:d0a68eff-2cb4-4327-81ea-7e71e26bb645 -F upload=@extension.zip iotboston.com:8887/Thingworx/ExtensionPackageUploader?purpose=import&validate=false     Download Things By Name   The REST API can be used to export a file representation of Things on a ThingWorx Foundation server. The downloaded file can be imported to another ThingWorx server making the Thing available for use.   Required Parameters   AppKey created by your Foundation server Name of the Thing Request   Construct the URL. Retrieve the components of a Thing by making an HTTP GET to the endpoint. Substitute <name of Thing> with the actual name of a Thing that exists on the ThingWorx server that wil be downloaded. <Server IP:port>/Thingworx/Exporter/Things/<name of Thing> Send request parameters. No parameters are sent. Authenticate the Request. All API requests to the ThingWorx server must be authenticated either with a username and password or with an appKey. For this example we will authenticate by passing the appKey as a URL query string parameter. The parameter appKey is recognized by the ThingWorx server as an authentication credential in requests, it can be passed either as a URL query string parameter .../CreateThing?appKey=64b87... , or as request header appKey: 64b87...   Response   It is possible for the content to be returned in two different formats by sending an Accept header with the request.   Desired Response Type  Accept Header Values JSON application/json XML text/xml HTML text/html (or omit Accept Header) CSV text/csv   A successful call to download a Thing will return a file in the body of the response suitable for importing into a ThingWorx Foundation server.   HTTPie example:   http -v GET iotboston.com:8081/Thingworx/Exporter/Things/PiThing appKey==d0a68eff-2cb4-4327-81ea-7e71e26bb645 Accept:text/xml     Download Things By Tag   The REST API can be used to export a file representation of Things on a ThingWorx Foundation server. This file can be imported to another ThingWorx server making the Thing available for use.   Required Parameters   AppKey created by your Foundation server Name of the Tag Request   Construct the URL. Retrieve the components of a Thing by making an HTTP GET to the endpoint <Server IP:port〉/Thingworx/Exporter/Things Send request parameters. The Tag name is sent as a request parameter named: searchTags Authenticate the Request. All API requests to the ThingWorx server must be authenticated either with a username and password or with an appKey. For this example we will authenticate by passing the appKey as a URL query string parameter. The parameter appKey is recognized by the ThingWorx server as an authentication credential in requests, it can be passed either as a URL query string parameter .../CreateThing?appKey=64b87... , or as request header appKey: 64b87...   Response   It is possible for the content to be returned in two different formats by sending an Accept header with the request.   
Desired Response Type / Accept Header Value:
JSON: application/json
XML: text/xml
HTML: text/html (or omit the Accept header)
CSV: text/csv

A successful call to download a Thing will return a file in the body of the response suitable for importing into a ThingWorx Foundation server.

HTTPie example:

http -v GET iotboston.com:8081/Thingworx/Exporter/Things searchTags==Applications:Raspberry_light appKey==d0a68eff-2cb4-4327-81ea-7e71e26bb645 Accept:text/xml

Step 10: Authentication Tags

A Tag is composed of two parts: a Vocabulary, and a specific vocabulary term. A Tag is shown as Vocabulary:VocabularyTerm. Almost every ThingWorx entity can be tagged. Tags can be used to create a relationship between many different ThingWorx Entities.

Create New Tag

You can use the REST API to create a new dynamic Tag vocabulary.

Required Parameters

AppKey created by your Foundation server
Name of Tag Vocabulary

Request

Construct the URL. Create a new Tag Vocabulary by making an HTTP PUT to this endpoint: <Server IP:port>/Thingworx/ModelTags
Send Request Parameters. The name of the new Tag Vocabulary and whether it is dynamic are sent in the body of the PUT as a JSON object. For example, the JSON object below will create a new dynamic Tag Vocabulary named SecondTest. { "name": "SecondTest", "isDynamic": "true" }
Authenticate the Request. All API requests to the ThingWorx server must be authenticated either with a username and password or with an appKey. For this example we will authenticate by passing the appKey as a URL query string parameter. The parameter appKey is recognized by the ThingWorx server as an authentication credential in requests; it can be passed either as a URL query string parameter .../CreateThing?appKey=64b87... , or as a request header appKey: 64b87...

Response

A successful call to the ModelTags Service does not return any content in the body of the response, only an HTTP 200 is returned.

HTTPie example:

http -v -j PUT http://52.201.57.6/Thingworx/ModelTags name=SecondTest isDynamic=true appKey==64b879ae-2455-4d8d-b840-5f5541a799ae

Warning for other HTTP clients: Most HTTP clients do not set a Content-Type header by default; without this header set the server will return an error message. The PUT request to the ModelTags endpoint has a JSON body, so the header must be set to match the format of the request body. The Content-Type header does not appear in the sample HTTPie call because HTTPie sets the Accept and Content-Type request headers to application/json by default. Below is an example cURL call that explicitly sets the Content-Type header to application/json.

curl -v -H "Content-Type: application/json" -X PUT -d '{"name": "SecondTest", "isDynamic":"true"}' http://52.201.57.6/Thingworx/ModelTags?appKey=d0a68eff-2cb4-4327-81ea-7e71e26bb645

Add Tag to Thing

You can use the REST API to add a Tag to a Thing. There must be a Thing and a Dynamic Tag Vocabulary already created on your Foundation Server before you can add a Tag.

Required Parameters

AppKey created by your Foundation server
Name of the Thing to be tagged
Name of Dynamic Tag Vocabulary
Name of the Tag to be assigned to the Thing

Request

Construct the URL. Substitute <name of Thing> with the actual name of a Thing that exists on the ThingWorx server that will have the Tag added. Add a new Tag to an existing Thing by making an HTTP POST to this endpoint: <Server IP:port>/Thingworx/Things/<name of Thing>/Services/AddTags
Send request parameters. The Tag to be added is sent in the body of the POST as a JSON object, in the form Vocabulary:VocabularyTerm. For example, the JSON object below will add the Tag RaspberryTest from the SecondlightTest vocabulary. Include a header in the full request with the appKey for your specific ThingWorx server. { "tags" : "SecondlightTest:RaspberryTest" }

Response

A successful call to the AddTags Service does not return any content in the body of the response. Only an HTTP 200 is returned.

HTTPie example:

http -v -j http://52.201.57.6/Thingworx/Things/SomeTestThing/Services/AddTags appKey==64b879ae-2455-4d8d-b840-5f5541a799ae tags=SecondTest:RaspberryTest

Warning for other HTTP clients: Most HTTP clients do not set a Content-Type header by default; without this header set the server will return an error message. The POST request to the AddTags endpoint has a JSON body, so the header must be set to match the format of the request body. The Content-Type header does not appear in the sample HTTPie call because HTTPie sets the Accept and Content-Type request headers to application/json by default. Below is an example cURL call that explicitly sets the Content-Type header to application/json.

curl -v -H "Content-Type: application/json" -X POST -d '{"tags": "SecondlightTest:RaspberryTest"}' http://52.201.57.6/Thingworx/Things/PiThing/Services/AddTags?appKey=d0a68eff-2cb4-4327-81ea-7e71e26bb645

Click here to view Part 4 of this guide.
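As a companion to the HTTPie and cURL examples above, the same Step 8-style custom Service call can be made from JavaScript. This is a minimal sketch using Node 18+'s built-in fetch; the server URL, Thing name, Service name, appKey, and parameter names are placeholders taken from the examples above:

// Call a custom Service on a Thing through the ThingWorx REST API (sketch)
const url = "https://YourServerNameHere/Thingworx/Things/YourThingNameHere/Services/YourServiceNameHere";

(async () => {
    const response = await fetch(url, {
        method: "POST",
        headers: {
            "appKey": "your-application-key-here",
            "Content-Type": "application/json",
            "Accept": "application/json"
        },
        body: JSON.stringify({ firstNumber: "35", secondNumber: "711" })
    });

    const data = await response.json();
    // The Service result is returned in the first row of the "rows" array
    console.log(data.rows[0].result);
})();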
View full tip
This zip file contains my slides and buildable project from my Liveworx 207 developer chat presentation featuring the Thing-e Robot and raspberry pi zero integration with ThingWorx using the C SDK.
View full tip
As of Dec. 23, 2021, ThingWorx 9.3.0 is available for download, and on Jan. 7, 2022, it will be available for Cloud Services.

What's new in ThingWorx 9.3?

Composer and Mashup Builder
Enhanced ability to find entities, code references, and project dependencies with a new Referenced by report feature.
New Grid widget allows for improved visualization of row/columnar type data, with improved performance, styling, and configuration experiences.
Composer enhancements allow for faster configuration and application management in the areas of test execution and dynamic use of Master Mashups.

Analytics
New configuration parameters (binningStrategy and allowOverlap) available for profiles to tailor application of the algorithm to the needs of domain use cases.
New option to utilize K-Fold cross validation to improve the quality of predictive models created on limited data sets using an industry-standard technique.
Automated treatment of missing values to reduce data preparation before application of advanced analytics algorithms.

Foundation
Query performance improvements with indexing enable quick "search" queries across thousands of connected assets and pre-filtering of query result sets for better overall query performance.
Upgrade script improvements for logging, script execution and error handling, which simplify the upgrade process.

Remote Access and Control
Improvements to Remote Access and Control include Auto & Click to Launch capabilities that streamline the remote access startup process by starting the support application you need for you. New connectivity options enable user control and selection of the local IP and port addresses used in session creation.

Software Content Management
Instructions can now be configured to execute either Synchronously or Asynchronously for a streamlined execution process. Added support for migrating older audit information from the audit data table into the format used by the new audit database.

View release notes here and be sure to upgrade to 9.3!

Stay Connected,
Rachel
View full tip
In our interactions with PTC customers we often learn they have previously performed Analytics modeling in Python, Matlab, R, or even built home-grown analyses in languages such as Java or C++. As expected, when adopting an Industrial Innovation Platform such as ThingWorx that also has its own ThingWorx Analytics module, customers do not want to reimplement everything from scratch and would rather integrate their previous work into the Smart Applications built in ThingWorx, leveraging a combination of their existing toolset together with ThingWorx Analytics modeling. That is certainly possible, and there are multiple ways to do it. In this article we will focus on several general ways to make that happen, but it is important to keep in mind that language-specific approaches are also possible and we are happy to discuss those in the specific context of the customer.

Here are five different ways to bring existing Analytics into ThingWorx:

1. If the task is to reuse an existing predictive model developed in a language such as Python/R/Matlab, typically one can export that model in PMML (Predictive Model Markup Language), an XML format, and import it in ThingWorx Analytics using the AnalyticsServer_ResultsThing -> UploadModel service. Libraries such as sklearn2pmml & r2pmml can be utilized towards that goal. The imported model can then be used in the same fashion as a ThingWorx Analytics developed model to power smart applications built in ThingWorx. If the Analysis involves more complex tasks than Predictive Modeling, such as custom data normalizations, non-standard Machine Learning models, or home-grown algorithms, one can use the options below.
2. Call the ThingWorx exposed REST Web API from Python/Matlab/R/Java/JavaScript. Every service from ThingWorx can be called that way, and the API can also be used to push analysis results into ThingWorx for further consumption, perhaps together with other sources of data such as sensor readings, in the smart applications built there. The documentation for the ThingWorx REST API can be found here.
3. Expose the existing Analytics using a thin layer of REST Web Services. For example, in Python, this can be done using Flask with a few lines of code. Then, the orchestration can happen from ThingWorx by calling the exposed Web Service and weaving the results back into smart applications (a sketch is shown at the end of this post).
4. Often our customers' current architecture involves a relational database (e.g. SQL Server, Oracle, etc.) that is powering the existing Analytics and stores the end results (predictions, correlations, etc.). In this scenario, we can connect ThingWorx directly to that database to read these results.
5. Finally, in the case of complex Analytics, where a tighter integration with ThingWorx is desired, existing Analytics / algorithms can be wrapped into a ThingWorx Extension or an Analytics Provider using the corresponding PTC SDKs.

When choosing an integration option, customers need to carefully balance complexity of integration, constraints of their architecture, Analytics modeling complexity, as well as end-user consumption requirements.
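To illustrate the third option, here is a minimal ThingWorx Service sketch that calls an externally exposed analytics web service (for example, one wrapped with Flask) and returns its JSON result. The endpoint URL and payload fields are placeholder assumptions, not part of the original article:

// Call an external analytics REST endpoint from a ThingWorx Service (sketch)
var response = Resources["ContentLoaderFunctions"].PostJSON({
    url: "http://analytics-host:5000/score",        // placeholder endpoint
    content: { temperature: 72.5, vibration: 0.8 }, // placeholder input features
    timeout: 60
});

// 'response' holds the JSON returned by the external service (for example, a prediction),
// which can be stored on a Thing property or surfaced in a mashup
var result = response; // return value, assuming a JSON output named 'result'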
View full tip
Automate business processes with Services, Events and Subscriptions.

Guide Concept

This project will introduce Services, Events, and Subscriptions inside of the ThingWorx platform. Following the steps in this guide, you will be able to expand your data model using Services, Events, and Subscriptions.

We will teach you how to make a more robust and enjoyable experience for users simply by using the resources inside of the ThingWorx Composer.

You'll learn how to

Create Events, Services and Subscriptions in Composer
Utilize the ThingWorx Edge SDK platforms with Services, Subscriptions, and Events

NOTE: The estimated time to complete this guide is 60 minutes.

Step 1: Completed Example

This guide references the attached EventsServicesSubscription.zip. The sample application is based on a company needing to make deliveries. The rules engine will handle much of the work outside of the transportation workflow.

Unzip and utilize this file to see a finished example and return to it as a reference if you become stuck during this guide and need some extra help or clarification.

Keep in mind, this download uses the exact names for entities used in this tutorial. If you would like to import this example and also create entities on your own, change the names of the entities you create.

What's Inside (Name - Type)

PTCDelivers - Thing
PTCDeliversBusinessLogic - Thing
PackageStream - Stream
OrdersDatabase - Database
DeliveryDataTable - Database
PackageDataTable - Database
MerchantDatabase - Database
PlaneThingTemplate - ThingTemplate
MerchantThingTemplate - ThingTemplate
TruckThingTemplate - ThingTemplate
ClientThingShape - ThingShape
VehicleDeliveryShape - ThingShape
PackageDataShape - DataShape
CustomerDataShape - DataShape
OrderDataShape - DataShape
PackageDeliveryDataShape - DataShape
MerchantOrderDataShape - DataShape
MerchantDataShape - DataShape
default_user - User
TestDashboardMashup1 - Mashup
TestGadget - Mashup

Step 2: Functionality

You can combine Services, Subscriptions, and Events to automate business processes or trigger notifications. See the definitions below to gain a better understanding of the role each plays.

Services: Methods and functions to be used within a ThingWorx application. There are Services provided by the system, but custom Services need to be created to meet the application's functional requirements. All Services in the ThingWorx Composer can be accessed by a user with the appropriate permissions. Remote Services can be created with the use of the ThingWorx SDKs.

Events: An Event represents a change in state of a property or changes in a running application. Changes can be handled in different ways based on the method used to fire or queue them. They are a great resource for property value changes and alerts.

Subscriptions: The implementation for a Subscription is the same as that of a Service. Subscriptions are activated when the Event they listen for is triggered.

Step 3: Create Events

Events are used to mark situations that can occur within an application. They can be used for scenarios ranging from a dangerous situation that is imminent and needs to be checked by personnel to just a system notification. Each Event is based on a Data Shape that will hold the necessary information about the Event.

For example, it wouldn't be enough to just know that a rain Event has occurred. It might be helpful to also know the location, the time the rain started, and the expected amount of rain to fall.

Follow the steps below to create an Event for the TruckThingTemplate Entity.

1. Open the TruckThingTemplate Entity and click the Events tab. Click Add.
2. Enter an Event name, such as PackageDelivered. For the Data Shape, in the Search Data Shape field, filter and select PackageDataShape.

Events can be queued or fired based on updates done in Composer, property changes of a running application, or programmatically. Triggering or firing an Event can be done in Services utilizing JavaScript (a sketch is shown at the end of this post).

Step 4: Services Interface

To create Services within ThingWorx, go to the Services tab of the Entity that will house the Service. Our PTCDeliversBusinessLogic Thing will contain several Services for us.

Within the Services tab under the Entity Information section of a Thing, you will see built-in functionality for all the ThingTemplates and ThingShapes the Thing inherits from. You will also see the section that allows you to create new Services.

Service Info Tab

This tab is for the general information of the Service. This is where you will add the Service name, description, category, whether the Service is asynchronous, and whether the Service can be overridden.

Inputs Tab

This tab is for the parameters of the Service. For input parameters, a parameter can be required and default values are allowed.

Outputs Tab

This tab is for the return type of the Service. The resulting output of the Service can range from nothing to user-defined Entities.

Snippets Tab

This tab allows users to pick from reusable JavaScript code that is specific to ThingWorx. These snippets are grouped by resources used, functionality, and type. You are also provided a search bar to look for keywords or titles.

When a snippet is found, click the arrow. This will insert the JavaScript code into the Script section wherever the cursor is located. From here, edit the inserted snippet to work with the rest of your code. This section is often helpful in getting to know how to perform Service calls and code for the ThingWorx environment.

Me/Entities Tab

A listing of all Properties, Events, and Services belonging to custom and provided Entities in ThingWorx. As with the Snippets section, clicking the blue and white arrow will insert the reusable code wherever the cursor is located. After the code is inserted, update the code to your liking. This is often a great resource for building up payloads for Service calls.

Click here to view Part 2 of this guide.
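As referenced in Step 3, here is a minimal sketch of one common pattern for firing the PackageDelivered Event from a Service on a TruckThingTemplate-based Thing. The row fields are assumptions standing in for whatever PackageDataShape actually defines:

// Build the Event payload using the Event's Data Shape
var params = {
    infoTableName: "InfoTable",
    dataShapeName: "PackageDataShape"
};
var eventData = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// Field names below are placeholders; use the fields defined on PackageDataShape
eventData.AddRow({ packageId: "PKG-001", deliveredAt: new Date() });

// Queue the PackageDelivered Event defined on this Thing; any Subscriptions
// listening for it will then execute
me.PackageDelivered(eventData);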
View full tip
Announcements