
IoT Tips

Use Subsystems to retrieve User/Thing-count and License-expiration information

Guide Concept

In this guide, you'll learn how a Foundation Administrator can access important license-accounting Subsystems. In particular, we'll look at the User Management and Licensing Subsystems, and how you can use them to ensure that you're not close to running out of Users, Things, or time before your license expires. We'll also explore how you can import a new license_capability_response.bin file when your original license is running low on time.

You'll learn how to:

Access the Foundation Subsystems
Execute built-in Services to retrieve User counts, Thing counts, and License expiration (days remaining)
Create a "License Dashboard" Mashup
Update to a new License

NOTE: The estimated time to complete ALL parts of this guide is 30 minutes.

Step 1: Introduction

ThingWorx Foundation uses a licensing system based around a file named license_capability_response.bin. However, if you're using the downloadable installer, it's possible that you've never even touched this part of the system before.

Besides doing the initial install, you'll also have to check that your license has enough Users, Things, and time left before expiration. You need to ensure that a new license is acquired and put in place beforehand... or else your Foundation installation may become unusable. Therefore, this guide will walk you through how to access that information, as well as how to update to a new license_capability_response.bin file when the time is right.

Step 2: Get User Count

The first item we'll investigate is our number of Foundation Users.

When first developing an IoT application, low User counts are typically the norm. Only your team really needs access to Foundation itself, so a few more Users than the size of your R&D team will likely be sufficient. And if your application is something along the lines of factory monitoring, then your User counts may stay relatively low even after deployment.

However, many IoT applications involve a tremendous number of Users, as your end customers will generate a Foundation User whenever they sign up for your application. Think of a ride-sharing app, or even a Smart Cities deployment... either of those can result in thousands (if not tens of thousands) of Users.

As such, a Foundation system administrator needs to keep close track of User counts so that, whenever the upper threshold is approached, there is enough warning to receive and install a new license_capability_response.bin with a larger User count.

1. Navigate to Foundation Composer's Browse > All.
2. On the left-side Navigation, scroll down until you see the System section. Note that you will likely be unable to even see the System section unless you are an Administrator-level User.
3. Click Subsystems.
4. Click UserManagementSubsystem.
5. At the top, click Services.
6. Scroll down and find the built-in GetUserCount Service.
7. On the line for GetUserCount, click the "play" icon for Execute Service.
8. At the bottom-right of the pop-up, click Execute.
9. Click Done to close the pop-up.

The return value from GetUserCount is one way to reveal how many current Users are provisioned for your Foundation system. Moving forward, we'll explore yet another way, while also looking into our Thing counts.
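As an aside, the same Service can also be called from any server-side script via the Subsystems entity collection, which is handy if you want to log or chart the value over time. The sketch below assumes a custom Service on a helper Thing with a NUMBER result; the exact return base type of GetUserCount may differ slightly between releases, so treat it as illustrative rather than definitive.

// Hedged sketch: call the Subsystem Service from a custom server-side Service.
// "Subsystems" is the standard entity collection for accessing Subsystems in scripts.
var userCount = Subsystems["UserManagementSubsystem"].GetUserCount();

// Log the value and return it as the Service result (assumes a NUMBER result base type).
logger.info("Provisioned Foundation Users: " + userCount);
result = userCount;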
In particular, you might find that the GetUserCount value doesn't completely match your expectations due to some internal system accounts. These system accounts are not counted against your license, however, so they may need to be subtracted when using the GetUserCount Service.

Step 3: Get Current License Info

While GetUserCount can be helpful for getting the current number of provisioned Users, it does nothing to compare that count to the total allowed by the license. Instead, we'll make use of a different Subsystem, i.e. the LicensingSubsystem.

1. Return to Browse > System > Subsystems.
2. Click LicensingSubsystem.
3. At the top, click Services.
4. Scroll down until you find GetCurrentLicenseInfo.
5. On the GetCurrentLicenseInfo line, click the "play" icon for Execute Service.
6. On the bottom-right of the pop-up, click Execute.
7. When you're done analyzing the counts, click Done to close the pop-up.

The first thing to notice is that the InUseFeatureCount for twx_named_user possibly does not match the return of GetUserCount. As already mentioned, this is because of system accounts that are not counted against your license. For example, on a fresh installation, GetUserCount may return 4, while InUseFeatureCount returns 2. The 2-count is more accurate, as it is the count measured against your total license amount. However, GetCurrentLicenseInfo is less convenient for a simple comparison between a stored "last user amount" and the "current user amount".

The solution is simple: compare the return of GetUserCount versus the return of GetCurrentLicenseInfo to determine the true total. In this case, the number of system accounts is 2, so some "GetUserCount - 2" custom Service could be very helpful (a sketch of such a Service appears at the end of this article).

In addition, GetCurrentLicenseInfo returns the very important twx_things value, i.e. how many Thing Entities have been created in the system. This is typically another hard limit in your license, and needs to be watched over. Finally, the DaysRemaining column shows how long you have before your license becomes inactive. This is something which needs to be constantly monitored to ensure that your Foundation system as a whole keeps running!

Next, we'll explore making a Mashup to reference these built-in Services in a more comfortable environment which auto-updates and can be used as, effectively, a Foundation system dashboard.

Step 4: Create Dashboard Mashup

While the User Management and Licensing Subsystems can be used as previously described to determine relevant admin information, traversing through the Foundation backend can be tedious. To make User, Thing, and License Expiration counts easier to monitor, we'll now create a "dashboard" which automatically pulls this information into a convenient Mashup.

Navigate to Browse > Visualization > Mashups.
Click + New.
Leave the defaults and click OK.
In the Name field, type Licensing_Mashup.
If Project is not set, search for and select PTCDefaultProject.
Click Save.
Click Design.

Set Layout

Now that we have a new Mashup, we'll start adding the items we'll need to display information from the Subsystems. First, we'll change the Layout.

In the top-left, ensure that the Layout tab is active.
Click Add Top.
Ensure the new top section is selected, and set Positioning to Static.
Scroll down in the Layout tab, and set Container Size to Fixed Size.
In the new Height field, type 100, and then hit your keyboard's Tab key to lock in your change. Note that the px will be automatically added after hitting the Tab key.
At the top, click Save.   Add Widgets   Now that we have the Layout we want, we can add Widgets to display data coming from the backend.   In the top-left, click the Widgets tab.   Drag-and-drop a Label Widget onto the top section.   With the Label still selected, in the bottom-left Properties section, change LabelText to User Count, and hit Tab.   Drag-and-drop a Text Field Widget, also in the top section, and below the Label.   In the top-left Widgets tab, change Category to Legacy. Drag-and-drop an Auto Refresh Widget in the top section as well.    In the top-left Widgets tab, change Category back to Standard. Drag-and-drop a Grid Advanced Widget onto the Bottom section.   At the top, click Save.   Click here to view Part 2 of this guide.
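If you prefer to feed those Widgets from a single custom Service rather than binding the Subsystem Services directly, the sketch below shows the kind of "GetUserCount - 2" helper suggested in Step 3. It is only a sketch: the helper Thing, the Service itself, the Licensing.Dashboard.DataShape Data Shape and its fields, and the FeatureName column name are illustrative assumptions rather than built-in names, so verify the GetCurrentLicenseInfo column names against your own Step 3 output and adjust the system-account offset to whatever your installation shows.

// Hedged sketch of a custom dashboard-backing Service (INFOTABLE result).
// Only the two Subsystem Services are built in; every other name here is illustrative.
var SYSTEM_ACCOUNT_OFFSET = 2; // the difference observed between GetUserCount and InUseFeatureCount

var licenseInfo = Subsystems["LicensingSubsystem"].GetCurrentLicenseInfo();
var userCount = Subsystems["UserManagementSubsystem"].GetUserCount() - SYSTEM_ACCOUNT_OFFSET;

var thingCount = 0;
var daysRemaining = 0;
for (var i = 0; i < licenseInfo.rows.length; i++) {
    var row = licenseInfo.rows[i];
    // Column names are assumptions based on the values shown in Step 3 -- confirm them locally.
    if (row.FeatureName === "twx_things") {
        thingCount = row.InUseFeatureCount;
    }
    if (row.DaysRemaining !== undefined) {
        daysRemaining = row.DaysRemaining;
    }
}

var dashboard = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "InfoTable",
    dataShapeName: "Licensing.Dashboard.DataShape" // hypothetical Data Shape with NUMBER fields
});
dashboard.AddRow({ userCount: userCount, thingCount: thingCount, daysRemaining: daysRemaining });
result = dashboard;

Bound to the Grid Advanced Widget and refreshed by the Auto Refresh Widget, a Service like this keeps the license picture visible without a trip through the Composer backend.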
Step 3: Creating Machine Processes

Having data and properties without moving parts would be useless. We already set up services for performing an inspection in the last guide. This time around, let's form processes to create our products and generate alerts whenever we've finished production. We won't reveal too much of our recipe (it's a family secret), but we will highlight some concepts to help with making our machine process smoother. We won't go into creating an SMT line; for a complete guide on how to create such an application, you can view our Monitor an SMT Assembly Line guide.

Events and Subscriptions

Let's create events for the cook time being completed and subscriptions to handle these scenarios.

1. Open the Fizos.BrautsMachine.ThingTemplate template and go to the Events tab.
2. Click the + Add button and create the below alerts.

Name | Data Shape | Description
CookCompleted | AlertEvent | This alert is fired when the machine has completed a batch of our product.
TemperatureReached | AlertEvent | This alert is fired when the machine has reached the desired temperature.

3. Save these changes and replicate them with the Fizos.SausageMachine.ThingTemplate template.

We now have alerts for the machines (alerts on properties and custom alerts). We can create subscriptions on the individual machines that will later inherit these alerts. One way to have a standard amongst these machines is to create a service in the template that will be called from subscriptions.

Let's create the simple services for now; later, we will call these services from subscriptions. We will also use Schedules to initiate the overall process. This can be a daily process, a timed process, or a process based on alerts and subscriptions. To keep this guide simple, we'll make this a process that runs every 3 hours. Create the below services in each of the Fizos.BrautsMachine.ThingTemplate and Fizos.SausageMachine.ThingTemplate templates:

Name | Input | Return Type | Override | Async | Description
StartCookingProcess | N/A | Nothing | Yes | Yes | Start the overall product cooking process. Triggered by schedule.
StartCookingIngredients | N/A | Nothing | Yes | Yes | Start cooking the ingredients for the product. Triggered by subscription.

1. Add the following code for the StartCookingProcess service:

// Mix our family secret recipe
// Add in meats
// Add heat
me.TemperatureReached({
    ackTimestamp: new Date(),
    alertType: "Completed",
    ackBy: "CookUser",
    ack: true,
    name: "TemperatureAlert",
    sourceProperty: "Service",
    description: "Optimal cooking temperature reached",
    message: "The desired cooking temperature has been reached. Begin next process.",
    priority: 1
});
// Perform top secret cooking techniques

2. Add the following code for the StartCookingIngredients service:

// Temperature has already been reached
if(true) { // IF COOK COMPLETED
    me.CookCompleted({
        ackTimestamp: new Date(),
        alertType: "Information",
        ackBy: "CookUser",
        ack: true,
        name: "CookingAlert",
        sourceProperty: "Service",
        description: "Cooking process has completed",
        message: "The cooking time has completed. Move to next machine.",
        priority: 1
    });
} else { // ELSE IF COOKING FAILED OR HALTED
    me.CookCompleted({
        ackTimestamp: new Date(),
        alertType: "Exception",
        ackBy: "CookUser",
        ack: true,
        name: "CookingAlert",
        sourceProperty: "Service",
        description: "Cooking process has failed",
        message: "The cooking process has stopped. See logs for failure cause.",
        priority: 1
    });
}

Keep in mind that these services will be triggered by a schedule or a subscription to an alert.
There will be no manual processes performed here. We also allowed these services to be overridden, in order to allow more customization for machines that might need to be treated differently. These services are also not on the main machine templates, which allows us to create templates for machines that might have a different role in the process, i.e., packaging or freeze drying.

Step 4: Automating Processes

Let's create the schedule that will start the cooking process.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Schedules in the dropdown.
3. Name the new Schedule Fizos.MachineCooking.Schedule and set the Project (i.e., PTCDefaultProject).
4. For the Run As User field, select the Fizos.Machine.User that was provided in the download.
5. Click Save, and your entity should match the below configuration.
6. For the Schedule field, set it to 0 0 0/3 * * ?. This will run the process every 3 hours.
7. Switch to the Subscriptions tab and add a new subscription.
8. Name this new subscription StartCooking and select ScheduledEvent as the event input.
9. Add the following code to the source section:

var index = 0;
var brauts = Resources["SearchFunctions"].SearchThingsByTemplate({ thingTemplate: "Fizos.BrautsMachine.ThingTemplate" });
var sausage = Resources["SearchFunctions"].SearchThingsByTemplate({ thingTemplate: "Fizos.SausageMachine.ThingTemplate" });

for(index = 0; index < brauts.rows.length; index++) {
    Things[brauts.rows[index].name].StartCookingProcess();
}
for(index = 0; index < sausage.rows.length; index++) {
    Things[sausage.rows[index].name].StartCookingProcess();
}

// This will begin each machine's cooking process.
// The process itself will then be run by alerts, subscriptions, and services.

So far, the process goes as follows:

1. A scheduler will search for all Things that inherit from the Fizos.BrautsMachine.ThingTemplate and Fizos.SausageMachine.ThingTemplate templates.
2. Each of these entities will have its StartCookingProcess service called.
3. When a temperature value is reached inside of the StartCookingProcess service, the TemperatureReached alert will be triggered.
4. A subscription waiting for this event at the Thing level (which we create in the next section) will call the StartCookingIngredients service.
5. This StartCookingIngredients service will finish the cooking part of the process and trigger a CookCompleted alert with a status update.
6. A subscription waiting for this event at the Thing level (which we create in the next section) can call for another machine to take over the process.

We now have our moving parts. Let's create some examples to see how it can unfold.

Step 5: Handling Cooking Machines

We have many of our main features completed. Now we need to handle situations where our machine status needs to be a part of how we handle the cooking process. While much of this work can be handled in a remote process using one of our SDKs or extensions, our events and subscriptions can be a great method to do this as well.

Creating Cooking Entities

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing in the dropdown.
3. Set the Project (i.e., PTCDefaultProject) and in the name field, enter Fizos.BrautsMachine.M02X35.
4. In the Base Thing Template field, enter Fizos.BrautsMachine.ThingTemplate.
5. Click Save.
6. In the ThingWorx Composer, click + New in the top left of the screen.
7. Select Thing in the dropdown.
8. In the name field, enter Fizos.SausageMachine.KGYX20.
9. In the Base Thing Template field, enter Fizos.SausageMachine.ThingTemplate.
10. Click Save.

Creating Cooking Subscriptions

These steps will need to be performed in both of the entities we just created.

1. Switch to the Subscriptions tab and add a new subscription.
2. Name this new subscription TemperatureReady and select TemperatureReached as the event input.
3. Add the following code to the source section:

//The temperature is high enough. Start cooking
me.StartCookingIngredients();

(A hedged sketch of the matching CookCompleted Subscription appears at the end of this post.)

You now have a system that (when enabled) will run every three hours. It will start our secret cooking process (which can be added in the service or created remotely). To make things more fun, add some logging or extra code to see how this works step by step. To make things more realistic to a real-world scenario, add subscriptions for our property alerts on the status or state properties.

Step 6: Next Steps

Congratulations! You've successfully completed the Factory Line Automation guide. In this guide, you learned how to:

Create automated machine processes
Use services, alerts, and subscriptions to handle processes without human interaction
Integrate complex logic with straightforward, step-by-step systems

The next guide in the Complex and Automatic Food and Beverage Systems learning path is Automated Distribution and Logistics.

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Build | ThingWorx Solutions in Food Industry
Build | Design Your Data Model
Build | Implement Services, Events, and Subscriptions

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
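As referenced in Step 5 above, the walkthrough wires up the TemperatureReached side but leaves the CookCompleted Subscription to you. A hedged sketch of that second Thing-level Subscription (added to both cooking Things, with CookCompleted selected as the event input) might look like the following; the downstream packaging Thing and its service are purely illustrative placeholders, and the field names come from the payload fired in Step 3.

// Hedged sketch of a CookCompleted Subscription body.
var data = eventData.rows[0];

if (data.alertType === "Information") {
    // Cooking finished normally: hand the batch to the next machine in the line.
    logger.info(me.name + ": batch complete, notifying the next machine.");
    // Things["Fizos.PackagingMachine.P01"].StartPackagingProcess(); // hypothetical next step
} else {
    // Cooking failed or was halted: surface it for an operator instead of continuing.
    logger.warn(me.name + ": cooking stopped - " + data.message);
}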
Step 3: Creating Customer Data

Now let's begin creating the customer data: just enough examples for us to understand what is happening at each step. Let's create at least 4 customers by following the steps below:

1. Open the Fizos.Customers.DataTable Data Table and go to the Services tab.
2. Open the AddDataTableEntries service to be executed. This service will allow us to create some general data to work with. You can create as many entries as you like for this test.
3. Click the values parameter to start creating entries.
4. Click + Add and enter data for your customers. Try to add at least 1 Factor data tag for each customer.
5. Save your entry and create a second entry with any location and tags you like. We aren't adding vehicles as of yet, but we will in the next section.
6. After saving, don't forget to execute the service with the two entries saved. If you did it correctly, the values parameter of the service should show at least 1 inside of the parentheses. See below for an example:

We just added customers manually. While convenient for our test, what we truly want is a system that is hands off. What we need is a way to add customers programmatically, whether a customer is ordering on a website you created for them or checking out as a guest (we still want to track this). Below, you'll see a quick service to add a new user. This service can be created inside of the Fizos.Customers.DataTable data table.

var customer = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName : "InfoTable",
    dataShapeName : "Fizos.Customers.DataShape"
});

var count = me.GetDataTableEntryCount();
var newEntry = {};
newEntry.ID = count + 1;
newEntry.UUID = generateGUID();
newEntry.Type = Type;
newEntry.Factors = "Fizos.CustomerTags:FirstTime";
newEntry.Name = Name;
newEntry.Email = Email;
newEntry.Address = Address;
newEntry.Phone = Phone;

customer.AddRow(newEntry);

me.AddDataTableEntry({
    sourceType: "Service",
    values: customer,
    source: "AddNewCustomer"
});

We can adapt this for the customers that would rather not have accounts and be considered guests. Instead of the FirstTime data tag, you might want to add a Guest tag. For the name, you could leave it empty. The other fields you'll likely still want. This can give you insight into who these customers are that prefer the guest checkout/ordering.

Step 4: Expanding Logistics Models

Let's do a quick review of what we have before we jump forward. In this Learning Path, we've set up scheduled factory inspections, machine automation, created customers, and set up order creation. What we're missing is the handling of deliveries.

In this learning path, we have talked about how to handle design aspects that could either be held in a data table or have entities created to model each one. While there are many pros and cons to each method, we will do a mixture of both. Having the logistics data in data tables provides us with an easy way of querying data. Having entities match up with vehicles/transportation allows us to have greater tracking and live updates.

Let's create the vehicle/transportation data model, come up with logic on how to do deliveries from the factories we created earlier in this learning path, then set up a schedule or timer to kickstart the process.

Vehicles Data Model

We already have our Data Table of vehicles. Let's create the templates and entities that will be a one-to-one mapping between Thing and vehicle.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing Shape in the dropdown.
3. In the name field, enter Fizos.Vehicles.ThingShape and select a Project (i.e., PTCDefaultProject). This Entity will have Services implemented by all types of vehicles.
4. Save your changes and create three Thing Templates which implement this Thing Shape. See below for examples:

Fizos.Vans.ThingTemplate: These are smaller vehicles used to make short or last-step deliveries.
Fizos.Trucks.ThingTemplate: These are trucks of different types making larger deliveries.
Fizos.Planes.ThingTemplate: These are planes used to deliver products to long-distance locations.

Handling Shipping and Deliveries

The cost of shipping and delivering goods is often the last thing people want to think about. Sometimes the cost of shipping goods is more expensive than the goods themselves. So how can we make this one of our strongest factors? By continually trying to make our design simpler and less costly. We all know that it won't be an easy feat. The best way to do this is to have a system with analytics that we can continuously improve on.

Let's start with the beginner steps of creating our straightforward delivery service. Then, we will add Value Streams and tracking to see where we can make improvements. Finally, the solutions get better as we repeat these steps. No one solution is perfect, and no logic will be without holes or issues. Nevertheless, you continuously work on it, so that you can save cost and improve customer experience.

1. Open Fizos.Vehicles.ThingShape and go to the Properties tab.
2. Create the following list of Properties. These properties are the generic concepts for a vehicle that can deliver a package.

Name | Base Type | Aspects | Details
FuelCapacity | Number | 0 minimum, unit: liters | logged, persistent
AverageFuelConsumption | Number | 0 minimum, unit: liters | logged, persistent
MaxMass | Number | 0 minimum, unit: kilograms | logged, persistent
MaxVolume | Number | 0 minimum, unit: cubic meters | logged, persistent
CurrentLocation | Location | N/A | logged, persistent
CurrentOrders | InfoTable (Fizos.Orders.DataShape) | N/A | logged, persistent

Your properties should look like the following:

3. Inside the Fizos.Vehicles.ThingShape entity, go to the Services tab.
4. Create the following list of Services. These services are also generic in nature and are based on the concept of a vehicle going to a pickup location, goods being loaded onto the vehicle, the vehicle traveling to a destination, then delivering goods.

Name | Input | Return Type | Override | Async | Description
PickUpGoods | PickUpLocation: Location | Nothing | Yes | Yes | Go to a pickup location (factory or otherwise) and pick up goods.
LoadGoods | Orders: InfoTable (Fizos.Orders.DataShape) | Nothing | Yes | Yes | Perform the task of loading goods onto a vehicle (adding rows to the CurrentOrders property).
Travel | Destination: Location | Nothing | Yes | Yes | Travel from destination A to destination B.
DeliverGoods | Orders: InfoTable (Fizos.Orders.DataShape) | Nothing | Yes | Yes | Perform the task of unloading goods at the current location.

As you can see, the goods are orders. In a real-world environment, we would create a separate Data Shape and Data Table for packages to hold a number of orders. We are doing this without the packages Data Table for simplicity in this example. One reason why our products have mass and volume properties is to help with the idea of loading a vehicle and the type of boxes or packaging to use. This could be another way to cut cost.
Your Services should look like the following:

In the same way we were able to create an automated system, we will create events that will notify subscribers when certain tasks are complete. This way, the next level of service can be performed instantly by our robot army.

1. Inside the Fizos.Vehicles.ThingShape entity, go to the Events tab.
2. Create the following list of Events. These Events will correspond to a task being completed. For example, when the vehicle has arrived at a location, an Event will be triggered and thus the next task can begin. For a bit more automation, add an Event you might think is needed, like when fuel is needed in the vehicle.

Name | Data Shape | Description
DestinationReached | AlertEvent | This alert is fired when the vehicle has reached a location (whether for delivery or pickup).
OrdersLoaded | AlertEvent | This alert is fired when all orders have been loaded onto a vehicle.
DeliveryCompleted | AlertEvent | This alert is fired when the vehicle has completed a delivery. This delivery might have been the last order delivery, and the vehicle needs to head back for more orders to be picked up.

Your Events should look like the following:

Let's take a quick break to go over how this will work.

1. A Service in the Fizos.Logistics Entity will search for all Things that implement the Fizos.Vehicles.ThingShape Entity.
2. Each of these Entities will have its PickUpGoods Service called with the desired pickup location.
3. When the destination is reached inside of the PickUpGoods Service, the DestinationReached Alert will be triggered.
4. A Subscription waiting for this Event at the Thing level will call the LoadGoods Service, based on the condition of no orders being in the vehicle's CurrentOrders Property.
5. The LoadGoods Service will finish and trigger an OrdersLoaded Event.
6. A Subscription waiting for this Event at the Thing level will call the Travel Service. The Service will be called with the customer location as the destination OR the location of another site to perform other tasks.
7. When the destination is reached inside of the Travel Service, the DestinationReached Alert will be triggered.
8. A Subscription waiting for this Event at the Thing level will call the DeliverGoods Service, based on the condition of orders being left in the CurrentOrders Property.
9. When the delivery is complete, the DeliveryCompleted Alert will be triggered.
10. A Subscription waiting for this Event at the Thing level will decide whether to go to a factory or pickup location to restart the process, or wait for more instructions.

You may have noticed a few things here. For starters, we are starting this from the Fizos.Logistics entity instead of a scheduler. You can start this process with a scheduler, but since we are a 24-hour company, we don't have a set schedule for starting deliveries. That being said, the click of a button would do the job.

You can also see that we haven't given you the service code for some of these services (a hedged sketch of the dispatch Service itself appears at the end of this post). Some of them are almost duplicates of prior services. What will be more challenging and fun is the logic for which orders go to which delivery method. This is a mixture of vehicle properties, order properties, customer type, and customer location.

Step 5: Next Steps

Congratulations! You've successfully completed the Automated Distribution and Logistics guide.
In this guide, you learned how to:

Create automated logistical processes
Use services, alerts, and subscriptions to handle processes without human interaction
Integrate complex logic with straightforward, step-by-step systems

The next guide in the Complex and Automatic Food and Beverage Systems learning path is Securing Industry Data.

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Build | ThingWorx Solutions in Food Industry
Build | Design Your Data Model
Build | Implement Services, Events, and Subscriptions

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
Support | Help Center
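As noted in Step 4 above, the dispatch Service on the Fizos.Logistics Entity is left for you to write. Below is a hedged sketch of what it could look like. It assumes the Fizos.Logistics Thing set up earlier in this Learning Path, that the ThingShape's GetImplementingThings Service is available to enumerate the vehicles, and that the pickup coordinates shown are placeholders; adapt the dispatch condition to your own order-assignment logic.

// Hedged sketch of a dispatch Service on Fizos.Logistics (not provided by the guide itself).
var pickupLocation = { latitude: 40.0, longitude: -83.0, elevation: 0 }; // placeholder factory location

// Enumerate every Thing that implements the vehicle Thing Shape.
var vehicles = ThingShapes["Fizos.Vehicles.ThingShape"].GetImplementingThings();

for (var i = 0; i < vehicles.rows.length; i++) {
    var vehicle = Things[vehicles.rows[i].name];
    // Only dispatch vehicles that are not already carrying orders.
    if (vehicle.CurrentOrders === undefined || vehicle.CurrentOrders.rows.length === 0) {
        vehicle.PickUpGoods({ PickUpLocation: pickupLocation });
    }
}

// From here, the DestinationReached, OrdersLoaded, and DeliveryCompleted Alerts and their
// Subscriptions drive the rest of the flow described in Step 4.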
Use the C SDK to build an app that connects to ThingWorx with persistent bi-directional communication.

GUIDE CONCEPT

This project will introduce more complex aspects of the ThingWorx C SDK and help you to get started with development. Following the steps in this guide, you will be ready to develop your own IoT application with the ThingWorx C SDK. We will teach you how to use the C programming language to connect and build IoT applications to be used with the ThingWorx Platform.

YOU'LL LEARN HOW TO

Establish and manage a secure connection with a ThingWorx server, including SSL negotiation and connection maintenance
Enable easy programmatic interaction with the Properties, Services, and Events that are exposed by Entities running on a ThingWorx server
Create applications that can be directly used with your device running the C programming language
Understand basic concepts of the C Edge SDK
Use the C Edge API to build a real-world application
Utilize resources provided in the Edge SDK to help create your own application

Note: The estimated time to complete ALL 4 parts of this guide is 60 minutes.

Step 1: Completed Examples

Download the completed files for this tutorial: ThingWorx C Edge SDK Sample Files.zip.

This tutorial will guide you through working with the C SDK at differing levels. Utilize this file to see a finished example and return to it as a reference if you become stuck creating your own fully fleshed-out application. Keep in mind, this download uses the exact names for Entities used in this tutorial. If you would like to import this example and also create Entities on your own, change the names of the Entities you create.

Step 2: Environment Setup

In order to compile C code, you need a C compiler and the ThingWorx C Edge SDK. It will be helpful to have CMake installed on your system. CMake is a build tool that will generate make or project files for many different platforms and IDEs.

Operating System | Notes
Windows | You will need a 3rd-party compiler such as MinGW GCC or Cygwin GCC, or you can follow these Microsoft instructions to download and use the Microsoft Visual C++ Build Tool.
Mac | Download the Apple Developer Tools.
Linux/Ubuntu | A compiler is included by default.

NOTE: You can use CMake, version 2.6.1 or later, to build projects or make files, which are then used to build the applications that you develop with the C SDK.

Before you can begin developing with the ThingWorx C SDK, you need to generate an Application Key and modify the source code file. You can use the Create an Application Key guide as a reference.

Modify Source File

1. Extract the files from the C SDK samples zip file. At the top level of the extracted files, you will see a folder called examples. This directory provides examples of how to utilize the C SDK.
2. Open a terminal, go to your workspace, and create a new directory. You can also just switch to the unzipped directory in your system.
3. After you've created this directory in your workspace, copy the downloaded files and folders into your new directory.
4. You can start creating your connection code or open the main.c source file in the examples\SteamSensor\src directory for an example.

Operating System | Code
Linux/Ubuntu | gedit main.c OR vi main.c
Mac | open -e main.c
Windows | start main.c

5. Modify the Server Details section at the top with the IP address for your ThingWorx Platform instance and the Application Key you would like to use.

Change the TW_HOST definition accordingly.
Change the TW_PORT definition accordingly.
Change the TW_APP_KEY definition to the keyId value saved from the last step.

/* Server Details */
#define TW_HOST "https://pp-XXXXXXXXX.devportal.ptc.i"
#define TW_PORT 80
#define TW_APP_KEY "e1d78abf-cfd2-47a6-92b7-37ddc6dd34618"

NOTE: Using the Application Key for the default Administrator is not recommended. If administrative access is absolutely necessary, create a User and place that User as a member of Admins.

Compile and Run Code

To test your connection, you will only need to update the main.c in the SteamSensor example folder. CMake can generate Visual Studio projects, make build files, or even target IDEs such as Eclipse or Xcode. CMake turns a general description into a build for your specific toolchain or IDE.

3. Go inside the specific example folder you would like to run, e.g., SteamSensor.
4. Create a directory to build in; for this example, call it bin.

mkdir bin
cd bin

5. Run the CMake command listed below. This assumes CMake is already on your PATH.

cmake ..

6. CMake has now produced a set of project files which should be compatible with your development environment.

Operating System | Command | Note
Unix | make | A set of make files
Windows | msbuild tw-c-sdk.sln /t:build | A Visual Studio solution

NOTE: CMake does its best to determine what version of Visual Studio you have, but you may wish to specify which version to use if you have more than one installed on your computer. Below is an example of forcing CMake to use a specific version of Visual Studio:

cmake -G "Visual Studio 15 2017" ..

If your version of Visual Studio or other IDE is unknown, use cmake -G to see a list of supported IDEs. You also have the alternative of opening the tw-c-sdk.sln from within Visual Studio and building in this IDE.

NOTE: By default, CMake will generate a build for the creation of a release binary. If you want to generate a debug build, use the command: cmake -DBUILD_DEBUG=ON ..

7. Once your build completes, you will find the build products in the CMake directory (see example below). From here, open the project in your IDE of choice.

NOTE: You should receive messages confirming successful binding, authentication, and connection after the main.c file edits have been made.

Operating System | Files | Description
Unix | ./bin/src/libtwCSdk_static.a | Static Library
Unix | ./bin/src/libtwCSdk.so | Shared Library
Unix | ./bin/examples/SteamSensor/SteamSensor | Sample Application
Windows | .\bin\src\<Debug/Release>\twCSdk_static.lib | Static Library
Windows | .\bin\src\<Debug/Release>\twCSdk.dll | Shared Library
Windows | .\bin\examples\<Debug/Release>\SteamSensor\SteamSensor.exe | Sample Application

Step 3: Run Sample Code

The C code in the sample download is configured to run and connect to the Entities provided in the ThingWorxEntitiesExport.xml file. Make note of the IP address of your ThingWorx Composer instance. The top level of the exported zip file will be referred to as [C SDK HOME DIR].

1. Navigate to the [C SDK HOME DIR]/examples/ExampleClient/src directory.
2. Open the main.c source file.

Operating System | Command
Linux/Ubuntu | gedit main.c OR vi main.c
Mac | open -e main.c
Windows | start main.c

3. Modify the Server Details section at the top with the IP address for your ThingWorx Platform instance and the Application Key you would like to use. Change the TW_HOST definition accordingly.

NOTE: By default, TW_APP_KEY has been set to the Application Key from the admin_key in the import step completed earlier.
Using the Application Key for the default Administrator is not recommended. If administrative access is absolutely necessary, create a User and place that User as a member of the Admins security group.

/* Server Details */
#define TW_HOST "127.0.0.1"
#define TW_APP_KEY "ce22e9e4-2834-419c-9656-e98f9f844c784c"

4. If you are working on a port other than 80, you will need to update the conditional statement within the main.c source file. Search for and edit the first line within the main function. Based on your settings, set the int16_t port to the ThingWorx Platform port.
5. Click Save and close the file.
6. Create a directory to build in; for this example, call it bin.

Operating System | Command
Linux/Ubuntu | mkdir bin
Mac | mkdir bin
Windows | mkdir bin

7. Change to the newly created bin directory.

Operating System | Command
Linux/Ubuntu | cd bin
Mac | cd bin
Windows | cd bin

8. Run the CMake command using your specific IDE of choice.

NOTE: Include the two periods at the end of the command as shown below. Use cmake -G to see a list of supported IDEs.

cmake ..

9. Once your build completes, you will find the build products in the bin directory, and you can open the project in your IDE of choice.

NOTE: You should receive messages confirming successful binding, authentication, and connection after building and running the application.

10. You should be able to see a Thing in your ThingWorx Composer called SimpleThing_1 with updated lastConnection and isConnected properties. SimpleThing_1 is bound for the duration of the application run time.

The below instructions will help to verify the connection.

1. Click Monitoring.
2. Click Remote Things from the list to see the connection status.

You will now be able to see and select the Entity within the list.

Step 4: ExampleClient Connection

The C code provided in the main.c source file is preconfigured to initialize the ThingWorx C Edge SDK API with a connection to the ThingWorx Platform and register handlers. In order to set up the connection, a number of parameters must be defined. This can be seen in the code below.

#define TW_HOST "127.0.0.1"
#define TW_APP_KEY "ce22e9e4-2834-419c-9656-ef9f844c784c"

#if defined NO_TLS
#define TW_PORT 80
#else
#define TW_PORT 443
#endif

For the first step of connecting to the platform, Establish Physical Websocket, we call the twApi_Initialize function with the information needed to point to the websocket of the ThingWorx Composer. This function:

Registers messaging handlers
Allocates space for the API structures
Creates a secure websocket

err = twApi_Initialize(hostname, port, TW_URI, appKey, NULL, MESSAGE_CHUNK_SIZE, MESSAGE_CHUNK_SIZE, TRUE);
if (TW_OK != err) {
    TW_LOG(TW_ERROR, "Error initializing the API");
    exit(err);
}

To test against a server that uses a self-signed certificate, use the following line to accept it:

twApi_SetSelfSignedOk();

In order to disable HTTPS support and use HTTP only, call the twApi_DisableEncryption function. This is needed when using ports such as 80 or 8080. A call can be seen below:

twApi_DisableEncryption();

The following event handlers are all optional. The twApi_RegisterBindEventCallback function registers a function that will be called on the event of a Thing being bound or unbound to the ThingWorx platform.
The twApi_RegisterOnAuthenticatedCallback function registers a function that will be called when the SDK has been authenticated by the ThingWorx Platform. The twApi_RegisterSynchronizeStateEventCallback function registers a function that will be called after binding and is used to notify your application about fields that have been bound to the ThingWorx Platform.

twApi_RegisterOnAuthenticatedCallback(authEventHandler, TW_NO_USER_DATA);
twApi_RegisterBindEventCallback(NULL, bindEventHandler, TW_NO_USER_DATA);
twApi_RegisterSynchronizeStateEventCallback(NULL, synchronizeStateHandler, TW_NO_USER_DATA);

NOTE: Binding a Thing within the ThingWorx platform is not mandatory, but there are a number of advantages, including updating Properties while offline.

You can then start the client, which will establish the AlwaysOn protocol with the ThingWorx Composer. This protocol provides bi-directional communication between the ThingWorx Composer and the running client application. To start this connection, use the lines below:

err = twApi_Connect(CONNECT_TIMEOUT, RETRY_COUNT);
if(TW_OK != err){
    exit(-1);
}

Click here to view Part 2 of this guide
Using the Solution Central API: Pitfalls to Avoid
by Victoria Firewind, IoT EDC

Introduction

The Solution Central API provides a new process for publishing ThingWorx solutions that are developed or modified outside of the ThingWorx Platform. For those building extensions, using third-party libraries, or who are just more comfortable developing in an IDE external to ThingWorx, the SC API makes it simple to still utilize Solution Central for all solution management and deployment needs, according to ThingWorx dev ops best practices. This article hones in on some pitfalls that may arise while setting up the infrastructure to use the SC API, and assumes that there is already AD integration and an OAuth token fetcher application configured for these requests.

CURL

One of the easiest ways to interface with the SC API is via cURL. In this way, publishing solutions to Solution Central really involves a series of cURL requests which can be scripted and automated as part of a mature dev ops process. In previous posts, the process of acquiring an OAuth token is demonstrated. This OAuth token is good for a few moments, for any number of requests, so the easiest thing to do is to request a token once before each step of the process.

1. GET info about a solution (shown) or all solutions (by leaving off everything after "solutions" in the URL)

$RESULT=$(curl -s -o test.zip --location --request GET "https://<your_sc_url>/sc/api/solutions/org.ptc:somethingoriginal12345:1.0.0/files/SampleTwxExtension.zip" `
--header "Authorization: Bearer $ACCESS_TOKEN" `
--header 'Content-Type: application/json' `
)

Shown here in the URL is the GAV ID (Group:Artifact_ID:Version). This is shown throughout the Swagger UI (found under Help within your Solution Central portal) as {ID}, and it includes the colons. To query for solutions, see the different parameter options available in the Swagger UI found under Help in the SC Portal (cURL syntax for providing such parameters is shown in the next example).

Potential Pitfall: if your solution is not published yet, then you can get the information about it, where it exists in the SC repo, and what files it contains, but none of the files will be downloadable until it is published. Any attempt to retrieve unpublished files will result in a 404.

2. Create a new solution using POST

$RESULT=$(curl -s --location --request POST "https://<your_sc_url>/sc/api/solutions" `
--header "Authorization: Bearer $ACCESS_TOKEN" `
--header 'Content-Type: application/json' `
-d '"{\"groupId\": \"org.ptc\", \"artifactId\": \"somethingelseoriginal12345\", \"version\": \"1.0.0\", \"displayName\": \"SampleExtProject\", \"packageType\": \"thingworx-extension\", \"packageMetadata\": {}, \"targetPlatform\": \"ThingWorx\", \"targetPlatformMinVersion\": \"9.3.1\", \"description\": \"\", \"createdBy\": \"vfirewind\"}"'
)

It will depend on your Powershell or Bash settings whether or not the escape characters are needed for the double quotes, and exact syntax may vary. If you get a 201 response, this was successful.

Potential Pitfalls: the group ID and artifact ID syntax are very particular, and despite other sources, the artifact ID often cannot contain capital letters. The artifact ID has to be unique relative to previously published solutions, unless those solutions are first deleted in the SC portal. The createdBy field does not need to be a valid ThingWorx username, and most of the parameters given here are required fields.
3. PUT the files into the project

$RESULT=$(curl -L -v --location --request PUT "<your_sc_url>/sc/api/solutions/org.ptc:somethingelseoriginal12345:1.0.0/files" `
--header "Authorization: Bearer $ACCESS_TOKEN" `
--header 'Accept: application/json' `
--header 'x-sc-primary-file:true' `
--header 'Content-MD5:08a0e49172859144cb61c57f0d844c93' `
--header 'x-sc-filename:SampleTwxExtension.zip' `
-d "@SampleTwxExtension.zip"
)

$RESULT=$(curl -L --location --request PUT "https://<your_sc_url>/sc/api/solutions/org.ptc:somethingelseoriginal12345:1.0.0/files" `
--header "Authorization: Bearer $ACCESS_TOKEN" `
--header 'Accept: application/json' `
--header 'Content-MD5:fa1269ea0d8c8723b5734305e48f7d46' `
--header 'x-sc-filename:SampleTwxExtension.sha' `
-d "@SampleTwxExtension.sha"
)

This is really TWO requests, because both the archive of source files and its hash have to be sent to Solution Central for verifying authenticity. In addition to the hash file being sent separately, the MD5 checksum of both the source file archive and the hash has to be provided, as shown here with the header parameter "Content-MD5". This will be a unique hex string that represents the contents of the file, and it will be calculated by Azure as well to ensure the file contains what it should.

There are a few ways to calculate the MD5 checksums and the hash: scripts can be created which use built-in Windows tools like certutil to run a few commands and manually save the hash string to a file:

certutil -hashfile SampleTwxExtension.zip MD5
certutil -hashfile SampleTwxExtension.zip SHA256
# By some means, save this SHA value to a file named SampleTwxExtension.sha
certutil -hashfile SampleTwxExtension.sha MD5

Another way is to use Java to generate the SHA file and calculate the MD5 values:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.IOException;
import java.security.NoSuchAlgorithmException;
import org.apache.commons.codec.digest.DigestUtils;

public class Main {
    private static String pathToProject = "C:\\Users\\vfirewind\\eclipse-workspace\\SampleTwxExtension\\build\\distributions";
    private static String fileName = "SampleTwxExtension";

    public static void main(String[] args) throws NoSuchAlgorithmException, FileNotFoundException {
        String zip_filename = pathToProject + "\\" + fileName + ".zip";
        String sha_filename = pathToProject + "\\" + fileName + ".sha";
        File zip_file = new File(zip_filename);
        FileInputStream zip_is = new FileInputStream(zip_file);
        try {
            // Calculate the MD5 of the zip file (this consumes the stream)
            String md5_zip = DigestUtils.md5Hex(zip_is);
            System.out.println("------------------------------------");
            System.out.println("Zip file MD5: " + md5_zip);
            System.out.println("------------------------------------");
        } catch(IOException e) {
            System.out.println("[ERROR] Could not calculate MD5 on zip file named: " + zip_filename + "; " + e.getMessage());
            e.printStackTrace();
        }
        try {
            // Re-open the zip file (the first stream was fully read above), calculate its hash, and write it to a file
            FileInputStream zip_is2 = new FileInputStream(zip_file);
            String sha = DigestUtils.sha256Hex(zip_is2);
            File sha_output = new File(sha_filename);
            FileWriter fout = new FileWriter(sha_output);
            fout.write(sha);
            fout.close();
            System.out.println("[INFO] SHA: " + sha + "; written to file: " + fileName + ".sha");
            // Now calculate the MD5 of the hash file
            FileInputStream sha_is = new FileInputStream(sha_output);
            String md5_sha = DigestUtils.md5Hex(sha_is);
            System.out.println("------------------------------------");
            System.out.println("SHA file MD5: " + md5_sha);
            System.out.println("------------------------------------");
        } catch (IOException e) {
            System.out.println("[ERROR] Could not calculate MD5 on file name: " + sha_filename + "; " + e.getMessage());
            e.printStackTrace();
        }
    }
}

This method requires the use
of a third party library called the commons codec. Be sure to add this not just to the class path for the Java project, but if building as a part of a ThingWorx extension, then to the build.gradle file as well:     repositories { mavenCentral() } dependencies { compile fileTree(dir:'twx-lib', include:'*.jar') compile fileTree(dir:'lib', include:'*.jar') compile 'commons-codec:commons-codec:1.15' }       Potential Pitfalls: Solution Central will only accept MD5 values provided in hex, and not base64. The file paths are not shown here, as the archive file and associated hash file shown here were in the same folder as the cURL scripts. The @ syntax in Powershell is very particular, and refers to reading the contents of the file, in this case, or uploading it to SC (and not just the string value that is the name of the file). Every time the source files are rebuilt, the MD5 and SHA values need to be recalculated, which is why scripting this process is recommended.   4. Do another PUT request to publish the project      $RESULT=$(curl -L --location --request PUT "https://<your_sc_url>/sc/api/solutions/org.ptc:somethingelseoriginal12345:1.0.0/publish" ` --header "Authorization: Bearer $ACCESS_TOKEN" ` --header 'Accept: application/json' ` --header 'Content-Type: application/json' ` -d '"{\"publishedBy\": \"vfirewind\"}"' )     The published by parameter is necessary here, but it does not have to be a valid ThingWorx user for the request to work. If this request is successful, then the solution will show up as published in the SC Portal:    Other Pitfalls Remember that for this process to work, the extensions within the source file archive must contain certain identifiers. The group ID, artifact ID, and version have to be consistent across a couple of files in each extension: the metadata.xml file for the extension and the project.xml file which specifies which projects the extensions belong to within ThingWorx. If any of this information is incorrect, the final PUT to publish the solution will fail.   Example Metadata File:     <?xml version="1.0" encoding="UTF-8"?> <Entities> <ExtensionPackages> <ExtensionPackage artifactId="somethingoriginal12345" dependsOn="" description="" groupId="org.ptc" haCompatible="false" minimumThingWorxVersion="9.3.0" name="SampleTwxExtension" packageVersion="1.0.0" vendor=""> <JarResources> <FileResource description="" file="sampletwxextension.jar" type="JAR"></FileResource> </JarResources> </ExtensionPackage> </ExtensionPackages> <ThingPackages> <ThingPackage className="SampleTT" description="" name="SampleTTPackage"></ThingPackage> </ThingPackages> <ThingTemplates> <ThingTemplate aspect.isEditableExtensionObject="false" description="" name="SampleTT" thingPackage="SampleTTPackage"></ThingTemplate> </ThingTemplates> </Entities>       Example Projects XML File:     <?xml version="1.0" encoding="UTF-8"?> <Entities> <Projects> <Project artifactId="somethingoriginal12345" dependsOn="{&quot;extensions&quot;:&quot;&quot;,&quot;projects&quot;:&quot;&quot;}" description="" documentationContent="" groupId="org.ptc" homeMashup="" minPlatformVersion="" name="SampleExtProject" packageVersion="1.0.0" projectName="SampleExtProject" publishResult="" state="DRAFT" tags=""> </Project> </Projects> </Entities>       Another large issue that may come up is that requests often fail with a 500 error and without any message. There are often more details in the server logs, which can be reviewed internally by PTC if a support case is opened. 
Common causes of 500 errors include missing required parameter values, invalid characters in the parameter strings, and using an API URL which is not the correct endpoint for the type of request. Another large cause of 500 errors is providing MD5 or hash values that are not valid (a mismatch will show differently).

Another common error is the 400 error, which happens if any of the code that SC uses to parse the request breaks. A 400 error will also occur if the files are not being opened or uploaded correctly due to some issue with the @ syntax (mentioned above). Another common 400 error is a mismatch between the provided MD5 value for the zip or SHA file and the one calculated by Azure ("message: Md5Mismatch"), which can indicate that there has been some corruption in the content of the upload, or simply that the MD5 values aren't being calculated correctly. The files will often report as 100% uploaded even when they aren't complete; errors may appear in the console, or the size of the file may be smaller than it should be for a complete upload (an issue with cURL).

Conclusion

Debugging with cURL can be a challenge. Note that adding "-v" to a cURL command provides additional information, such as the number of bytes in each request and a reprint of the parameters to ensure they were read correctly. Even still, it isn't always possible for SC to indicate what the real cause of an issue is. There are many things that can go wrong in this process, but when it goes right, it goes very right. The SC API can be entirely scripted and automated, allowing for seamless inclusion of externally-developed tools into a mature dev ops process.
We will host a live Expert Session: "Top 5 ThingWorx environment monitoring best practices" on March 25th, 10h00 EST.

Please find below the description of the expert session and the registration link.

Expert Session: Top 5 ThingWorx environment monitoring best practices
Date and Time: March 25th, 10h00 EST
Duration: 1 hour
Host: Tori Firewind, Tim Atwood and Dave Bernbeck from Enterprise Deployment Center
Registration Here: https://www.ptc.com/en/resources/iiot/webcast/top-5-thingworx-monitoring-best-practices

In this session, we will be reviewing the main monitoring practices to keep a healthy environment and discuss the main issues from the audience. Bring your questions!

Existing recorded sessions can be found on the support portal using the keyword 'Expert Sessions'. You can also suggest topics for upcoming sessions using this small form.

Here are some recorded sessions that might be of interest to you. You can find recordings for the full library of webinars using the keyword 'Expert Sessions' in the PTC support portal search.

ThingWorx Active-Active Clustering - This session will cover the main aspects of the High Availability Clustering feature launched with the ThingWorx 9.0 release. Recording Link
Upgrade to ThingWorx 9 - How to Plan / Evaluate Impacts - This session highlights the key points you should evaluate to properly plan your upgrade to ThingWorx 9. Recording Link
Top 5 items to check for ThingWorx Performance Troubleshooting - How to troubleshoot performance issues in a ThingWorx environment? Here we cover the top 5 investigation steps that will help you understand the source of your environment issues and allow better communication with PTC Technical Support. Recording Link
Hi All,

We will host a live Expert Session: "ThingWorx Active-Active Clustering" on January 21st, 8h00 EST. Please find below the description of the expert session and the registration link.

Expert Session: ThingWorx Active-Active Clustering
Date and Time: January 21st, 8h00 EST
Duration: 1 hour
Host: Ayush Tiwari - IoT Product Manager
Registration Here: https://www.ptc.com/en/customer-success/expert-sessions-for-thingworx-foundation-webcasts (scroll down, the session is at the bottom of the page)

Description: This session will cover the main aspects of the Active-Active Clustering feature for High Availability configurations, launched with the ThingWorx 9.0 release. Join us and bring your questions with you!

Existing recorded sessions can be found on the support portal using the keyword 'Expert Sessions'. You can also suggest topics for upcoming sessions using this small form.

Here are some recorded sessions that might be of interest to you. You can find recordings for the full library of webinars using the keyword 'Expert Sessions' in the PTC support portal search.

Upgrade to ThingWorx 9 - How to Plan / Evaluate Impacts - This session highlights the key points you should evaluate to properly plan your upgrade to ThingWorx 9. Recording Link
ThingWorx Flow Overview - Flow is a powerful component of the ThingWorx platform. This session will take the Flow discussion beyond basic applications and into more customized and complex solutions. It will focus on use cases, main features such as triggers, connector options, main enhancements for ThingWorx 9.0, and a short demonstration. Recording Link
We will host a live Expert Session: "ThingWorx Flow Overview" on December 10th, 8h00 EST.

Please find below the description of the expert session and the registration link.

Expert Session: ThingWorx Flow Overview
Date and Time: December 10th, 8h00 EST
Duration: 1 hour
Host: Antony Moffa; Vinay Vaidya - ThingWorx IoT Platform Senior Directors
Registration Here: https://www.ptc.com/en/customer-success/expert-sessions-for-thingworx-foundation-webcasts

Description: Overview of ThingWorx Flow, an application for integration and orchestration between systems. This will focus on use cases, main features such as triggers, connector options, main enhancements for ThingWorx 9.0, and a short demonstration.

Existing recorded sessions can be found on the support portal using the keyword 'Expert Sessions'. You can also suggest topics for upcoming sessions using this small form.

Here are some recorded and upcoming sessions that might be of interest to you. You can also find recordings for the full library of webinars using the keyword 'Expert Sessions' in the PTC support portal search.

Top 5 items to check for ThingWorx Performance Troubleshooting - How to troubleshoot performance issues in a ThingWorx environment? Here we cover the top 5 investigation steps that will help you understand the source of your environment issues and allow better communication with PTC Technical Support. Recording Link
Upgrade to ThingWorx 9 - How to Plan / Evaluate Impacts - This session will highlight the key points you should evaluate to properly plan your upgrade to ThingWorx 9. Register Here
Active-Active Clustering - This session will cover the main aspects of the High Availability Clustering feature launched with the ThingWorx 9.0 release. Register Here
Step 5: Properties

In the Delivery Truck application, there are three Delivery Truck Things. Each Thing has a number of Properties based on its location, speed, and the deliveries it has carried out. In this design, when a delivery is made or the truck is no longer moving, the Property values are updated.

The deliveryTruck.c helper C file is based on the DeliveryTruck Entities in the Composer. After calling the construct function, there are a number of steps necessary to get going. For the SimpleThing application, there are a number of methods for creating Properties, Events, Services, and Data Shapes for ease of use. Properties can be created in the client or just registered and utilized. In the SimpleThingClient application, Properties are created. In the DeliveryTruckClient application, Properties are bound to their ThingWorx Platform counterpart.

Two types of structures are used by the C SDK to define Properties; they can be found in [C SDK HOME DIR]/src/api/twProperties.h:

Name | Structure | Description
Property Definitions | twPropertyDef | Describes the basic information for the Properties that will be available to ThingWorx and can be added to a client application.
Property Values | twProperty | Associates the Property name with a value, timestamp, and quality.

NOTE: The C SDK provides a number of Macros located in [C SDK HOME DIR]/src/api/twMacros.h. This guide will use these Macros while providing input on the use of pure function calls.

The Macro example below can be found in the main source file for the SimpleThingClient application and the accompanying helper file simple_thing.c.

TW_PROPERTY("TempProperty", "Description for TempProperty", TW_NUMBER);
TW_ADD_BOOLEAN_ASPECT("TempProperty", TW_ASPECT_ISREADONLY, TRUE);
TW_ADD_BOOLEAN_ASPECT("TempProperty", TW_ASPECT_ISLOGGED, TRUE);

NOTE: The list of aspect configurations can be seen in [C SDK HOME DIR]/src/api/twConstants.h. Property values can be set with defaults using the aspect settings. Setting a default value in the client will affect the Property in the ThingWorx Platform after binding. It will not set a local value in the client application.

For the DeliveryTruckClient, we register, read, and update Properties without using the Property definitions. Which method of using Properties you choose depends on the application being built.

NOTE: Updating Properties in the ThingWorx Platform while the application is running will update the values in the client application. To update the values in the platform to match, end the Property read section of your property handler function with a function call to set the platform value.

The createTruckThing function in the deliveryTruck.c source code takes a truck name as a parameter and is used to register the Properties, functions, and handlers for each truck. The updateTruckThing function in the deliveryTruck.c source code takes a truck name as a parameter and is used to either initialize a struct for DeliveryTruck Properties, or simulate a truck stop Event, update Properties, then fire an Event for the ThingWorx Platform.

Connecting properties to be used on the platform is as easy as registering the property and optionally adding aspects. The following shows the properties that correlate to those in the DeliveryTruck Entities in the Composer. To do this within the code, you would use the TW_PROPERTY macro as shown in deliveryTruck.c.
This macro must be proceeded by either TW_DECLARE_SHAPE, TW_DECLARE_TEMPLATE or TW_MAKE_THING because these macros declare variables used by the TW_PROPERTY that follow them. //TW_PROPERTY(propertyName,description,type) TW_PROPERTY(PROPERTY_NAME_DRIVER, NO_DESCRIPTION, TW_STRING); TW_PROPERTY(PROPERTY_NAME_DELIVERIES_LEFT, NO_DESCRIPTION, TW_NUMBER); TW_PROPERTY(PROPERTY_NAME_TOTAL_DELIVERIES, NO_DESCRIPTION, TW_NUMBER); TW_PROPERTY(PROPERTY_NAME_DELIVERIES_MADE, NO_DESCRIPTION, TW_NUMBER); TW_PROPERTY(PROPERTY_NAME_LOCATION, NO_DESCRIPTION, TW_LOCATION); TW_PROPERTY(PROPERTY_NAME_SPEED, NO_DESCRIPTION, "TW_NUMBER); Read Properties Reading Properties from a ThingWorx platform Thing or the returned Properties of a Service can be done using the TW_GET_PROPERTY macro. Examples of its use can be seen in all of the provided applications. An example can be seen below: int flow = TW_GET_PROPERTY(thingName, "TotalFlow").number; int pressue = TW_GET_PROPERTY(thingName, "Pressure").number; twLocation location = TW_GET_PROPERTY(thingName, "Location").location; int temperature = TW_GET_PROPERTY(thingName, "Temperature").number; Write Properties Writing Properties to a ThingWorx platform Thing from a variable storing is value uses a similarly named method. Using the TW_SET_PROPERTY macro will be able to send values to the platform. Examples of its use can be seen in all of the provided applications. An example is shown below: TW_SET_PROPERTY(thingName, "TotalFlow", TW_MAKE_NUMBER(rand() / (RAND_MAX / 10.0))); TW_SET_PROPERTY(thingName, "Pressure", TW_MAKE_NUMBER(18 + rand() / (RAND_MAX / 5.0))); TW_SET_PROPERTY(thingName, "Location", TW_MAKE_LOC(gpsroute[location_step].latitude,gpsroute[location_step].longitude,gpsroute[location_step].elevation)); This macro utilizes the twApi_PushSubscribedProperties function call to pushe all property updates to the server. This can be seen in the updateTruckThing function in deliveryTruck.c. Property Change Listeners Using the Observer pattern, you can take advantage of the Property change listener functionality. With this pattern, you create functions that will be notified when a value of a Property has been changed (whether on the server or locally by your program when the TW_SET_PROPERTY macro is called). Add a Property Change Listener In order to add a Property change listener, call the twExt_AddPropertyChangeListener function using the: Name of the Thing (entityName) Property this listener should watch Function that will be called when the property has changed void simplePropertyObserver(const char * entityName, const char * thingName,twPrimitive* newValue){ printf("My Value has changed\n"); } void test_simplePropertyChangeListener() { { TW_MAKE_THING("observedThing",TW_THING_TEMPLATE_GENERIC); TW_PROPERTY("TotalFlow", TW_NO_DESCRIPTION, TW_NUMBER); } twExt_AddPropertyChangeListener("observedThing",TW_OBSERVE_ALL_PROPERTIES,simplePropertyObserver); TW_SET_PROPERTY("observedThing","TotalFlow",TW_MAKE_NUMBER(50)); } NOTE: Setting the propertyName parameter to NULL or TW_OBSERVE_ALL_PROPERTIES, the function specified by the propertyChangeListenerFunction parameter will be used for ALL properties.   Remove a Property Change Listener In order to release the memory for your application when done with utilizing listeners for the Property, call the twExt_RemovePropertyChangeListener function. 
void simplePropertyObserver(const char * entityName, const char * thingName,twPrimitive* newValue){ printf("My Value has changed\n"); } twExt_RemovePropertyChangeListener(simplePropertyObserver);   Step 6: Data Shapes Data Shapes are an important part of creating/firing Events and also invoking Services. Define With Macros In order to define a Data Shape using a macro, use TW_MAKE_DATASHAPE.   NOTE: The macros are all defined in the twMacros.h header file.   TW_MAKE_DATASHAPE("SteamSensorReadingShape", TW_DS_ENTRY("ActivationTime", TW_NO_DESCRIPTION ,TW_DATETIME), TW_DS_ENTRY("SensorName", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("Temperature", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("Pressure", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("FaultStatus", TW_NO_DESCRIPTION ,TW_BOOLEAN), TW_DS_ENTRY("InletValve", TW_NO_DESCRIPTION ,TW_BOOLEAN), TW_DS_ENTRY("TemperatureLimit", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("TotalFlow", TW_NO_DESCRIPTION ,TW_INTEGER) ); Define Without Macros In order to define a Data Shape without using a macro, use the twDataShape_CreateFromEntries function. In the example below, we are creating a Data Shape called SteamSensorReadings that has two numbers as Field Definitions. twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("a",NULL,TW_NUMBER)); twDataShape_AddEntry(ds, twDataShapeEntry_Create("b",NULL,TW_NUMBER)); /* Name the DataShape for the SteamSensorReadings service output */ twDataShape_SetName(ds, "SteamSensorReadings");   Step 7: Events and Services Events and Services provide useful functionality. Events are a good way to make a Service be asynchronous. You can call a Service, let it return, then your Entity can subscribe to your Event and not keep the original Service function waiting. Events are also a good way to allow the platform to respond to data when it arrives on the edge device without it having to poll the edge device for updates. Fire Events To fire an Event you first need to register the Event and load it with the necessary fields for the Data Shape of that Event using the twApi_RegisterEvent function. Afterwards, you would send a request to the ThingWorx server with the collected values using the twApi_FireEvent function. An example is as follows: twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("message", NULL,TW_STRING)); /* Event datashapes require a name */ twDataShape_SetName(ds, "SteamSensorFault"); /* Register the service */ twApi_RegisterEvent(TW_THING, thingName, "SteamSensorFault", "Steam sensor event", ds); …. struct { char FaultStatus; double Temperature; double TemperatureLimit; } properties; …. properties. TemperatureLimit = rand() + RAND_MAX/5.0; properties.Temperature = rand() + RAND_MAX/5.0; properties.FaultStatus = FALSE; if (properties.Temperature > properties.TemperatureLimit && properties.FaultStatus == FALSE) { twInfoTable * faultData = 0; char msg[140]; properties.FaultStatus = TRUE; sprintf(msg,"%s Temperature %2f exceeds threshold of %2f", thingName, properties.Temperature, properties.TemperatureLimit); faultData = twInfoTable_CreateFromString("message", msg, TRUE); twApi_FireEvent(TW_THING, thingName, "SteamSensorFault", faultData, -1, TRUE); twInfoTable_Delete(faultData); } Invoke Services In order to invoke a Service, you will use the twApi_InvokeService function. The full documentation for this function can be found in [C SDK HOME DIR]/src/api/twApi.h. Refer to the table below for additional information.   
Parameter         Type                   Description entityType Input The type of Entity that the service belongs to. Enumeration values can be found in twDefinitions.h. entityName Input The name of the Entity that the service belongs to. serviceName Input The name of the Service to execute. params Input A pointer to an Info Table containing the parameters to be passed into the Service. The calling function will retain ownership of this pointer and is responsible for cleaning up the memory after the call is complete. result Input/Output A pointer to a twInfoTable pointer. In a successful request, this parameter will end up with a valid pointer to a twInfoTable that is the result of the Service invocation. The caller is responsible for deleting the returned primitive using twInfoTable_Delete. It is possible for the returned pointer to be NULL if an error occurred or no data is returned. timeout Input The time (in milliseconds) to wait for a response from the server. A value of -1 uses the DEFAULT_MESSAGE_TIMEOUT as defined in twDefaultSettings.h. forceConnect Input A Boolean value. If TRUE and the API is in the disconnected state of the duty cycle, the API will force a reconnect to send the request.   See below for an example in which the Copy service from the FileTransferSubsystem is called:   twDataShape * ds = NULL; twInfoTable * it = NULL; twInfoTableRow * row = NULL; twInfoTable * transferInfo = NULL; int res = 0; const char * sourceRepo = "SimpleThing_1"; const char * sourcePath = "tw/hotfolder/"; const char * sourceFile = "source.txt"; const char * targetRepo = "SystemRepository"; const char * targetPath = "/"; const char * targetFile = "source.txt"; uint32_t timeout = 60; char asynch = TRUE; char * tid = 0; /* Create an infotable out of the parameters */ ds = twDataShape_Create(twDataShapeEntry_Create("sourceRepo", NULL, TW_STRING)); res = twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourcePath", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourceFile", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetRepo", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetPath", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetFile", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("async", NULL, TW_BOOLEAN)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("timeout", NULL, TW_INTEGER)); it = twInfoTable_Create(ds); row = twInfoTableRow_Create(twPrimitive_CreateFromString(sourceRepo, TRUE)); res = twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourcePath, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourceFile, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetRepo, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetPath, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetFile, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromBoolean(asynch)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromInteger(timeout)); twInfoTable_AddRow(it,row); /* Make the service call */ res = twApi_InvokeService(TW_SUBSYSTEM, "FileTransferSubsystem", "Copy", it, &transferInfo, timeout ? 
(timeout * 2): -1, FALSE); twInfoTable_Delete(it); /* Grab the tid */ res = twInfoTable_GetString(transferInfo,"transferId",0, &tid); Bind Event Handling You may want to track exactly when your edge Entities are successfully bound to or unbound from the server. The reason for this is that only bound items should be interacting with the ThingWorx Platform and the ThingWorx Platform will never send any requests targeted at an Entity that is not bound. A simple example that only logs the bound Thing can be seen below. After creating this function, it will need to be registered using the twApi_RegisterBindEventCallback function before the connection is made. void BindEventHandler(char * entityName, char isBound, void * userdata) { if (isBound) TW_LOG(TW_FORCE,"BindEventHandler: Entity %s was Bound", entityName); else TW_LOG(TW_FORCE,"BindEventHandler: Entity %s was Unbound", entityName); } …. twApi_RegisterBindEventCallback(thingName, BindEventHandler, NULL); OnAuthenticated Event Handling You may also want to know exactly when your Edge device has successfully authenticated and made a connection to the ThingWorx platform. Like the bind Event handling, this function will need to be made and registered. To register this handler, use the twApi_RegisterOnAuthenticatedCallback function before the connection is made. This handler form can also be used to do a delay bind for all Things. void simplePropertyObserver(const char * entityName, const char * thingName,twPrimitive* newValue){ printf("My Value has changed\n"); } twExt_RemovePropertyChangeListener(simplePropertyObserver);   Click here to view Part 4 of this guide. 
View full tip
Applicable Releases: ThingWorx Navigate 1.6.0 to 8.5.0

Description: How to use the PingFederate script, covering: Prerequisites, Configuration, Run the script, Generated artifacts, Live Demo

Associated documentation is available in the PTC Single Sign-on Architecture and Configuration Overview guide: PTC Single Sign-on Architecture and Configuration Overview
View full tip
Step 7: Add Grid

It might also be helpful to display the data you've used to build the ThingWorx Analytics model.

In the future, this might be split out into an entirely separate page/Mashup that is exclusively devoted to backend model creation, but that would be beyond the scope of this guide.

For now, we'll simply display that collected data via the Grid Widget.

1. On the EEFV_Mashup, drag-and-drop a Grid Advanced Widget onto the bottom-left Canvas section.
2. On the top-right Data tab, click the green </> button beside Things_EdgeThing. Note that this will open the Add Data pop-up, but with EdgeThing pre-selected.
3. In the Services Filter field, type getproperties.
4. Click the right-arrow beside GetProperties to add it to Selected Services on the right.
5. Check Execute on Load.
6. Click Done.
7. Under the Data tab, expand GetProperties to reveal the options.
8. Drag-and-drop Things_EdgeThing > GetProperties > infoTableProperty onto the Grid Advanced Widget.
9. On the Select Binding Target pop-up, click Data.
10. Click Save.
11. Click View Mashup.

Step 8: Add Controls

Throughout this Learning Path, it has been recommended that you stop Analysis Events when not actively using their functionality.

However, this requires going into the backend of ThingWorx Analytics to disable the event, which is not ideal.

Instead, let's enable or disable the Analysis Event from inside the Mashup by adding some Button Widgets to directly interface with the Analytics backend for us.

1. Click the bottom-right Canvas section to select it.
2. In Mashup Builder top-left, click the Layout tab.
3. Under Positioning, click the Static radio-button.
4. Click the Widgets tab.
5. Drag-and-drop a Button Widget onto the bottom-right section.
6. Drag-and-drop another Button Widget onto the bottom-right section.
7. Drag-and-drop a Text Field Widget onto the bottom-right section.
8. Click Save.

Bring in More Data

Now that we have Buttons to trigger enable/disable, as well as a Text Field to display information, we need to bring in some additional Mashup Data Services to interact with the ThingWorx Analytics backend.

1. Click the green + button at the top of the Data tab.
2. In the Entity Filter field, search for and select TW.AnalysisServices.EventManagementServicesAPI.
3. In the Services Filter field, search for and add QueryAnalysisEvents by clicking the right arrow.
4. Check Execute on Load.
5. In Services Filter, search for and select EnableAnalysisEvent by clicking the right arrow. Note that you should NOT check "Execute on Load", as we'll trigger this Service only when the Button is clicked.
6. In Services Filter, search for and select DisableAnalysisEvent by clicking the right arrow. Likewise, do NOT check "Execute on Load" here either.
7. Click Done.
8. Click Save.

Display Event Key

To enable or disable Analytics Events, we need to know the eventId, which is returned by the QueryAnalysisEvents Service as the parameter labeled key.

We'll bind that to the Text Field Widget for later use in enabling/disabling.

1. Change the top button's Label Property to Enable Analytics Event.
2. Change the bottom button's Label Property to Disable Analytics Event.
3. Under the Data tab, expand QueryAnalysisEvents > Returned Data > All Data to reveal the options.
4.
Drag-and-drop QueryAnalysisEvents > Returned Data > All Data > key to the Text Field Widget.
5. On the Select Binding Target pop-up, click Text.
6. Click Save.

Enable/Disable Analytics Events

Now that we know the key/eventId, we can call the EnableAnalysisEvent and DisableAnalysisEvent Services.

1. Under the Data tab, expand EnableAnalysisEvent > Parameters to reveal eventId.
2. Click the Text Field Widget to select it, and then click the top-left drop-down to reveal the options.
3. Drag-and-drop the Text Field's Text Property onto EnableAnalysisEvent > Parameters > eventId.
4. Repeat steps 1-3 for DisableAnalysisEvent.
5. Click the Enable Analytics Event Button Widget to select it, then click the top-left to reveal the drop-down options.
6. Drag-and-drop the Clicked Event onto the EnableAnalysisEvent Service under the Data tab.
7. Repeat steps 5-6 for DisableAnalysisEvent, using the other Button Widget.
8. Click Save. (A short server-side sketch of these enable/disable calls is included at the end of this guide.)

Step 9: View Mashup

Throughout this guide, we've added various pieces of additional functionality to our MVP Mashup. At this point, you could continue to update the Mashup as you see fit.

For instance, you could change the background color of the top-left section to better match the header. Or you could further modify the original Mashup shown in the Contained Mashup Widget so that it better fits in the allowed space. You could add another Label Widget to the Header section to also display the company's motto / tag-line.

Regardless, when you are done with modifications, Save and click View Mashup.

Note that you can left-click-and-drag on the Time Series Chart to select particular time ranges. Or you could add a Time Selector Widget to the bottom-right section to control it there.

Similarly, you could add controls for the Grid Widget to only show the Identifier ranges in which you were interested.

Or you could split out the Model-creation values to a completely separate Mashup as previously discussed.

The extent to which you develop your Mashup is entirely up to you.

Step 10: Next Steps

Congratulations! You've completed the Enhanced Engine Failure Visualization guide. In this guide, you learned how to:

Create a Mashup with a Header
Divide your Mashup into Sub-sections
Use a Contained Mashup to reuse development
Store historical data in a Value Stream
Display historical data in a Line Chart
Show spreadsheet data via a Grid Advanced Widget
Tie Mashup controls into the ThingWorx backend

This is the last guide in the Vehicle Predictive Pre-Failure Detection with ThingWorx Platform learning path.

Learn More

We recommend the following resources to continue your learning experience:

Capability    Guide
Build    Implement Services, Events, and Subscriptions Guide

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource    Link
Community    Developer Community Forum
Support    Analytics Manager Help Center
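As a footnote to the Enable/Disable Analytics Events section above, the same backend Services that the Buttons invoke can also be called from a server-side ThingWorx Service, which can be handy for scripted housekeeping. The snippet below is only a hedged sketch: it assumes TW.AnalysisServices.EventManagementServicesAPI is reachable through Things[], that QueryAnalysisEvents needs no input parameters (as implied by checking Execute on Load), and it uses only the eventId parameter and the key field named in this guide.

// Hedged server-side sketch: disable the first Analysis Event returned by QueryAnalysisEvents.
// Assumption: the API Entity is accessible via Things[] and the "key" field holds the eventId.
var api = Things["TW.AnalysisServices.EventManagementServicesAPI"];
var events = api.QueryAnalysisEvents();
if (events.rows.length > 0) {
    var eventId = events.rows[0].key;
    api.DisableAnalysisEvent({ eventId: eventId });
    // To turn it back on later: api.EnableAnalysisEvent({ eventId: eventId });
}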
View full tip
I had tested this on a Raspberry Pi 3, which was working fine. I'm not sure if there's any resource limitation on the Pi 2 that would cause potential issues. However, I had some performance issues when using the H2 version, so I stuck with the Neo4j version. As there were some changes introduced in ThingWorx 7.4, I would also stick to 7.2 or 7.3. If the ThingWorx folder exists in Tomcat's webapps directory, the initial deployment has been successful. I'd recommend checking the Tomcat logs and the ThingWorx ApplicationLog / ConfigurationLog for any potential issues.
View full tip
  Use the statistical calculation Thing Shape to execute useful analysis services   GUIDE CONCEPT   This project will introduce the Statistical Monitoring Thing Shape.   This guide demonstrates using Descriptive Analytics from ThingWorx Predictive Analytics to perform common statistical monitoring analysis on time-series data. You will learn to use the Statistical Monitoring ThingShape's built-in services to return the number of values in a data set that are: above or below a threshold, inside or outside a defined range, or following a trend that is: increasing, decreasing, or alternating.       YOU'LL LEARN HOW TO   Create a Thing with the Statistical Monitoring Thing Shape Create a Property and Value Stream to record changes in Property values Use Services that perform Standard Analytical Monitoring   NOTE: The estimated time to complete this guide is 30 minutes.     Step 1: Introduction   Descriptive analytics lets you perform common statistical monitoring calculations on changes in a Property value over time.   Output from monitoring Services can be used in IoT applications built with ThingWorx to provide trend and numerical limits feedback.   This guide introduces the Statistical Monitoring Thing Shape which adds Services to Things and Thing Templates for reporting Property values that are: above or below a threshold, inside or outside a defined range, or following a trend that is: increasing, decreasing, or alternating.     These Services analyze time-series data which is stored in ThingWorx Foundation as changes to a Property value logged to a Value Stream.   To ensure optimum performance, both a time range and a maximum number of value changes must be specified.     Step 2: Create Prerequisites   Statistical monitoring Services operate on Property values that change over time. To create this time series data, Property value changes are logged in a Value Stream. In this step, you will create a Value Stream, then a Thing with a Property that logs changes to that Value Stream.   Create Value Stream   Follow the steps below to create a value stream which you will later tie to a Thing.       1. On the ThingWorx Composer Browse tab, click Data Storage > Value Streams, then click the +New button         2. Select ValueStream and click OK.         3. In the Name field, enter scts_valuestream.         4. If Project is not already set, click the + in the Project text box and select the PTCDefaultProject.       5. At the top, click Save.   Create Thing   Now, you will create a Thing with a property and configure it to use the previously-created value stream. You will also apply the statistical monitoring Thing Shape to the Thing, which makes the built-in analytics services available.       1. On the ThingWorx Composer Browse tab, click MODELING > Things then click the + New button          2. In the Name field, enter scts_thing.       3. If Project is not already set, click the + in the Project text box and select the PTCDefaultProject.       4. In the Base Thing Template field, search for and select GenericThing.       5. In the Implemented Shapes field, search for and select StatisticalMonitoringThingShape.       6. In the Value Stream field, search for and select scts_valuestream.         7. At the top, click Save.    Add Property   You will now add a property to scts_thing.       1. At the top, click Properties and Alerts         2. Click + Add.       3. In the right slide-out's Name field, enter numbers.       4. Change the Base Type to NUMBER.       5. Click Persistent.       6. 
Click Logged.
7. Click Advanced Settings to open the bottom panel, then in the Data Change Type drop-down, select Always.
8. At the top-right, click the "check" icon for DONE.
9. At the top, click Save.

Step 3: Enter Sample Data

In this step, you will enter sample data that will illustrate the available Services.

This is the dataset: 2, 3, 4, 3, 2, 2, 1, 2, 1, 1, 2, 3, 3, 4, 3, 2.

Enter Data

Perform the steps below to enter the above values into the numbers Property.

1. Under the Value column and on the numbers property row, click the "pencil" icon for Set value of property.
2. In the right-side slide-out, enter 2.
3. At the top-right, click the "check" icon for Done.
4. At the top, click Save.
5. Repeat steps 1-4 above, but changing the value each time as per the table below:

Value Change Count    Entered Value
2nd    3
3rd    4
4th    3
5th    2
6th    2
7th    1
8th    2
9th    1
10th    1
11th    2
12th    3
13th    3
14th    4
15th    3
16th    2

Step 4: Test Monitoring Services

Now that value changes to the numbers Property have been logged in a Value Stream, you can use the built-in Services of the Statistical Monitoring Thing Shape to return how many values meet different monitoring criteria.

Trend Service

You will test the built-in GetNumberOfConsecutivePointsFollowingATrend Service. This Service returns how many points are in the largest group of values following one of the three trend types: INCREASING, DECREASING, or ALTERNATING.

1. Click the Services tab in the scts_thing Thing.
2. Click the GetNumberOfConsecutivePointsFollowingATrend Service.
3. In the PropertyName field enter numbers.
4. In the NumberOfPoints field enter 16 to check all entered points.
5. In the Trend field enter INCREASING to find the largest number of points with an increasing trend.
6. Click the green Execute button and note the Result property in the Output panel has the value 6.
7. Repeat the steps above changing the Trend to DECREASING and note the result is 5.

Range Service

We'll now test the built-in GetNumberOfPointsBasedOnARange Service. This Service will return the number of points INSIDE or OUTSIDE a range of values specified by a Min and Max value.

1. Click the Close button in the Services tab in the scts_thing Thing.
2. Click the GetNumberOfPointsBasedOnARange Service.
3. In the PropertyName field enter numbers.
4. In the NumberOfPoints field enter 16 to check all entered points.
5. In the Min field enter 1.5 to set the lower end of the range criteria.
6. In the Max field enter 2.5 to set the upper end of the range criteria.
7. In the RegionOfInterest field enter INSIDE to count the points that fall inside the range.
8. Keep the default IncludeMin and IncludeMax set to True.
9. Click the green Execute button and note the Result property in the Output panel has the value 6.
10. Repeat the steps above changing the RegionOfInterest to OUTSIDE and note the result is 10.

Threshold Service

You will now test the built-in GetNumberOfPointsBeyondAThreshold Service. This Service will return the number of points ABOVE or BELOW a specified value.

1. Click the Services tab in the scts_thing Thing.
2. Click the GetNumberOfPointsBeyondAThreshold Service.
3. In the PropertyName field enter numbers.
4. In the NumberOfPoints field enter 16 to check all entered points.
5.
In the Threshold field enter 3 to set the threshold criteria.
6. In the Direction field enter ABOVE to find the number of points above the threshold value.
7. Keep the default IncludeThreshold set to True.
8. Click the green Execute button and note the Result property in the Output panel has the value 7.
9. Repeat the steps above changing the Direction to BELOW and note the result is 14.

A short server-side sketch of calling these monitoring Services is shown below.

Click here to view Part 2 of this guide.
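As promised above, the same three monitoring Services can also be invoked from a server-side ThingWorx Service on scts_thing. The parameter names below are taken from the steps in this guide; treat the snippet as an illustration under those assumptions rather than a definitive implementation, and note that depending on your version the return value may be a plain number or an Info Table containing a Result field.

// Hedged server-side sketch: call the Statistical Monitoring Services on scts_thing.
// Parameter names mirror the fields filled in above.
var thing = Things["scts_thing"];

var increasing = thing.GetNumberOfConsecutivePointsFollowingATrend({
    PropertyName: "numbers",
    NumberOfPoints: 16,
    Trend: "INCREASING"
});

var insideRange = thing.GetNumberOfPointsBasedOnARange({
    PropertyName: "numbers",
    NumberOfPoints: 16,
    Min: 1.5,
    Max: 2.5,
    RegionOfInterest: "INSIDE",
    IncludeMin: true,
    IncludeMax: true
});

var aboveThreshold = thing.GetNumberOfPointsBeyondAThreshold({
    PropertyName: "numbers",
    NumberOfPoints: 16,
    Threshold: 3,
    Direction: "ABOVE",
    IncludeThreshold: true
});

// Log the three counts for a quick check against the manual results above.
logger.info("Increasing: " + increasing + ", inside range: " + insideRange + ", above threshold: " + aboveThreshold);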
View full tip
Step 4: Implement New Features

The SMT Assembly Line Data Model was built around a sample manufacturing facility that tracks critical data points, including diagnostic information for:

motherboards
assembly machines
assembly line performance

To make informed decisions based on the diagnostic and performance data, you can add features that will increase analytics capabilities.

In this optional step, we'll add a Line Chart to see the performance of any given assembly machine. Once completed, we can create Services that will be used to make calculations on the data we have generated from the assembly line.

For a final challenge, you can create a Service that will compare data points to identify what works best in an assembly machine: a larger internal queue or more placement heads.

Setup New Mashup

Create a Mashup that is Responsive and name it SMTTimeSeriesMashup.
Click the Layout tab and add a column to the canvas.
Drag-and-drop a List Widget onto the left column of the layout.
Drag-and-drop a Line Chart Widget onto the other column of the layout.

Configure List Widget

Add the GetImplementingThingsWithData Service of the AssemblyMachineTemplate Thing Template as a data source in the Mashup. Ensure the checkbox for Execute on Load is checked.
Drag-and-drop GetImplementingThingsWithData > Returned Data > All Data to the List Widget. On the Select Binding Target pop-up, select Data.
With the List Widget selected, in the DisplayField property dropdown, select name. In the ValueField property dropdown, select name.

Configure Time Series Data

Add the Dynamic QueryPropertyHistory Service of the AssemblyMachineTemplate Thing Template as a data source in the Mashup. Ensure the checkbox for Execute on Load is checked.
Drag-and-drop QueryPropertyHistory > Returned Data > All Data to the Line Chart Widget. On the Select Binding Target pop-up, select Data.
For the XAxisField property dropdown, select timestamp.
For the DataField1 property dropdown, select IdleTime OR MotherboardsCompleted.

Connect Widgets

Drag-and-drop the GetImplementingThingsWithData > Returned Data > Selected Row(s) > name property to the EntityName parameter for the Dynamic AssemblyMachineTemplate data source.
Select the GetImplementingThingsWithData Service, then drag-and-drop the SelectedRowChanged event onto QueryPropertyHistory.

Click Save. After the save is complete, click View Mashup.

You are now able to see what the idle time is for each assembly machine over the span of its use.

Create Service

You can add JavaScript code to calculate the average completion time for the motherboard assembly.

Open the MotherboardTemplate ThingTemplate in Composer.
Create a new Service titled CompletionTime.
For the Output type, select Number.
Enter the following code into the JavaScript window and save:

var diff = 1;
if (me.EndTime != null && me.StartTime != null) {
    diff = Math.abs(me.EndTime - me.StartTime);
}
//Seconds
var result = Math.round(diff/1000);
//Minutes
//var result = Math.round((diff/1000)/60);

You can now calculate the time it takes for an individual motherboard to be completed. Next, create a Service on the SMTAssemblyLineTemplate ThingTemplate that uses the GetCompletedMotherboards Service (which returns a list of all completed Raspberry Pi motherboards) to calculate the average time for your assembly line to complete a motherboard. Finish this Service and configure it to work with your new Mashup. (One possible sketch of such an averaging Service is included at the end of this guide.)

Step 5: Next Steps

Congratulations!
You've successfully completed the ThingWorx Monitor an SMT Assembly Line Guide, learning how to use ThingWorx to create an application that provides real-time insight into connected assets.   Learn More   We recommend the following resources to continue your learning experience:    Capability      Guide Build Design Your Data Model Build Implement Services, Events, and Subscriptions Connect Java SDK Tutorial   Additional Resources   If you have questions, issues, or need additional information, refer to:    Resource       Link Community Developer Community Forum Support Java Edge SDK Help Center
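Returning to the Step 4 challenge above, here is one hedged way the averaging Service on SMTAssemblyLineTemplate could look. It assumes that GetCompletedMotherboards returns an Info Table whose rows include a name field identifying each completed MotherboardTemplate Thing; that field name is an assumption for illustration, so adjust it to match your actual Data Shape.

// Hedged sketch: average completion time (in seconds) across completed motherboards.
// Assumption: GetCompletedMotherboards returns rows with a "name" column (hypothetical field name).
var completed = me.GetCompletedMotherboards();
var total = 0;
var count = 0;

for (var i = 0; i < completed.rows.length; i++) {
    var boardName = completed.rows[i].name;
    // CompletionTime is the Service created above on MotherboardTemplate.
    total += Things[boardName].CompletionTime();
    count++;
}

// Service output type NUMBER: 0 if nothing has completed yet.
var result = (count > 0) ? (total / count) : 0;

Displayed in the Mashup, this result gives a single line-level KPI alongside the per-machine idle-time chart.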
View full tip
Step 5: Contained Mashup

Our Minimum Viable Product (MVP) Mashup, which we created in the last guide, did contain valid information.

The ability to display the inputs coming from the engine, as well as the analytical results coming from ThingWorx Analytics, is certainly something we don't want to lose in this new, more complete Mashup.

Rather than recreating that work from scratch, we'll simply include that previous Mashup in one of our sub-sections via the Contained Mashup Widget.

1. Click on the top-left section to select it, and ensure that you're on the Widgets tab in the top-left.
2. Drag-and-drop a Contained Mashup Widget onto the top-left section.
3. With the Contained Mashup Widget selected, return to the Properties tab in the bottom-left.
4. Scroll down and locate the Name Property.
5. Search for and select EFPG_Mashup.
6. Click Save.

Add Column Labels

The original Mashup we created (and have now embedded in the new one) had some labels for the inputs and outputs. However, you had to know what things like "s1_fb1" meant to understand that it was an input.

We can go back to the original EFPG_Mashup, make some modifications for greater clarity, and those changes will also carry over to our new Mashup.

1. Reopen the old EFPG_Mashup on the Design tab.
2. Move all of the Widgets down to leave some extra room at the top.
3. Drag-and-drop two Divider Widgets onto the Canvas above both the Inputs and Results columns.
4. Select a Divider Widget, and go to its Style Properties.
5. Expand Base > Line to reveal the background Style Property.
6. Click on the default gray color to see the available options.
7. Choose the built-in black at the bottom, and click Select.
8. Make the same modification to the other Divider Widget.
9. Drag-and-drop two more Label Widgets onto the Canvas above the two columns.
10. Change their LabelText Properties to Inputs and Results, respectively.

Change Background and Size

1. From the Explorer tab in the top-left, select the container.
2. Select the Style Properties tab in the bottom-left and expand Base > Container.
3. Change the background Style Property to a color you prefer.
4. With the container still selected in the Explorer tab, drag the corners of the Mashup to reduce its size.
5. You could even move the Results column over, place the Auto Refresh Widget underneath, and then reduce the container size even further.
6. Click Save.

View Mashup Thus Far

With the changes to the previous EFPG_Mashup now complete, let's ensure that everything carried over to our new Mashup.

1. Return to EEFV_Mashup.
2. Click Save.
3. Click View Mashup.

Note how the various changes we made to the base Mashup are also being shown, via a Contained Mashup Widget, in our new Mashup.

Splitting out functionality to a separate Mashup that is then embedded where needed is a great way to re-use content and simplify development.

Step 6: Add Chart

Our original Mashup (which has now been embedded in our new one) shows the instantaneous analytical results based on the inputs coming from the Edge MicroServer (EMS).

However, when investigating remote customer issues, it might be helpful to see some historical trends. A temporary "blip" of a low-grease indication might be worrisome, but it may not require immediate intervention unless the issue was occurring consistently or for extended periods of time.
Fortunately, creating a historical record is relatively simple in ThingWorx Foundation.   All that is really needed is a place in which to store the past records.   One of the easiest such storage methods is a Value Stream.       1. In ThingWorx Foundation, click Browse > Data Storage > Value Streams.       2. Click + New.       3. On the Choose Template pop-up, select ValueStream and click OK.       4. In the Name field, type EEFV_ValueStream.       5. If Project is not already set, search for and select PTCDefaultProject.       6. At the top, click Save.     Link Value Stream and Begin Storage   Now that we have a Value Stream to act as a storage location, we want to link it to EdgeThing.   After EdgeThing knows where to store historical data, we can simply instruct it which Property we want to archive by setting it to Logged.       1. Return to EdgeThing and its General Information tab.       2. In the Value Stream field, search for and select EEFV_ValueStream.       3. Click Save.       4. Still on EdgeThing, click Properties and Alerts.       5. Click Result_low_grease_mo to trigger the slide-out from the right-side.         6. Check Logged.       7. Click the Check icon in the top-right to close the slide-out.       8. Click Save.     Add Line Chart and Data   As per most guides in this Learning Path, it is assumed that you have an active connection to the EMS Engine Simulator and have your Analytics Event currently set to active.   This provides both the engine-sensor inputs and the analytical results for our Mashup.   After adding the Value Stream above, you'll need to let it run for a bit for the historical data to be archived. After it's run for a while and we have a valid history build-up, you can display that history in a Line Chart.       1. Return to EEFV_Mashup on the Design tab.       2. Click on the top-right section to select it.         3. From the Widgets tab, drag-and-drop a Line Chart onto the top-right section.         4. In the top-right of Mashup Builder, ensure the Data tab is selected.         5. Click the green + button.         6. On the Add Data pop-up in Entity Filter, search for and select EdgeThing.       7. In Services Filter, type queryprop.       8. Click the right arrow button besides QueryPropertyHistory.       9. Check Execute on Load.         10. Click Done.       11. Expand Data > Things_EdgeThing > QueryPropertyHistory > Returned Data.       Bind Data and View Mashup   Now that we have both our method of displaying the historical data, i.e. a Line Chart, as well as a method to bring backend data into the Mashup, i.e. QueryPropertyHistory, we can bind them together and see how our Mashup is progressing.       1. From the right under the Data tab, drag-and-drop EdgeThing > QueryPropertyHistory > Returned Data > All Data onto the Line Chart in the top-right of the Canvas.         2. On the Select Binding Target pop-up, click Data.         3. With the Line Chart selected, explore its Properties in the bottom-left.       4. Change XAxisField to timestamp.         5. Click Save.       6. Click View Mashup.     Your own Line Chart will vary depending on what values your Engine Simulator is sending to Foundation and Analytics.   NOTE: Remember that the Analysis Event needs to be Enabled for new values to be fed into Result_low_grease_mo.     Click here to view Part 3 of this guide.  
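As a side note to the QueryPropertyHistory binding used above, the same Service can also be called from server-side JavaScript, for example to post-process the logged history before charting it. The snippet below is a minimal sketch; it assumes the standard parameters of the built-in QueryPropertyHistory Service (such as maxItems and oldestFirst) and only reads back the Result_low_grease_mo values logged through EEFV_ValueStream.

// Server-side sketch: read back the logged history for EdgeThing.
// Assumes the built-in QueryPropertyHistory parameters (maxItems, oldestFirst, etc.).
var history = Things["EdgeThing"].QueryPropertyHistory({
    maxItems: 100,        // cap the number of rows returned
    oldestFirst: false    // newest entries first
});

// Each row carries a timestamp plus the logged property values (e.g. Result_low_grease_mo).
for (var i = 0; i < history.rows.length; i++) {
    var row = history.rows[i];
    logger.info(row.timestamp + " -> " + row.Result_low_grease_mo);
}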
View full tip
Step 6: Record Data

Now that we have a place to permanently store the values coming from the EMS engine simulator, we'll write a Service to take samples and place them within the Info Table.

Part of that Service, though, will be incrementing the identifier, so we'll need to create one last Property.

Ensure that you're on the Properties and Alerts tab of EdgeThing.
At the top-left, click + Add.
In the Name field of the slide-out on the right, type identifier.
Change the Base Type to NUMBER.
Click Persistent.
Click Has Default Value.
In the Has Default Value field, type 0.
At the top-right, click the "Check" button for Done.
At the top, click Save.

Store the Property Values

With all the pieces in place, we can now create our Service to add entries to our Info Table Property.

At the top of EdgeThing, click Services.
At the top-left, click + Add.
On the "+ Add" drop-down, select Local (JavaScript).
In the Service Info > Name field, type recordService.
Expand Me/Entities > Properties.
Click the arrow beside infoTableProperty.
Type .AddRow({ after me.infoTableProperty to begin the process of calling the "AddRow()" function.

We have now called the function which will add a row of information to the Info Table Property, one entry for each column of the formatting Data Shape. We just need to specify which values go into which column.

Add the following line to store the individual identifier count in the first column of the Info Table Property:

identifier:me.identifier,

Because we want the identifier in the stored data to increment on each run, and we want to start the count at 1 (and the Default Value is 0), add the following line to the top of the Service:

me.identifier=me.identifier+1;

Add the low_grease value with the following line:

low_grease:me.low_grease,

Add the following lines to store the five frequency bands of the first sensor:

s1_fb1:me.s1_fb1,
s1_fb2:me.s1_fb2,
s1_fb3:me.s1_fb3,
s1_fb4:me.s1_fb4,
s1_fb5:me.s1_fb5,

Add the final lines to store the five frequency bands of the second sensor and close out the AddRow() function:

s2_fb1:me.s2_fb1,
s2_fb2:me.s2_fb2,
s2_fb3:me.s2_fb3,
s2_fb4:me.s2_fb4,
s2_fb5:me.s2_fb5
});

Your completed Service should look like the following:

me.identifier=me.identifier+1;
me.infoTableProperty.AddRow({
    identifier:me.identifier,
    low_grease:me.low_grease,
    s1_fb1:me.s1_fb1,
    s1_fb2:me.s1_fb2,
    s1_fb3:me.s1_fb3,
    s1_fb4:me.s1_fb4,
    s1_fb5:me.s1_fb5,
    s2_fb1:me.s2_fb1,
    s2_fb2:me.s2_fb2,
    s2_fb3:me.s2_fb3,
    s2_fb4:me.s2_fb4,
    s2_fb5:me.s2_fb5
});

At the top, click Done.
At the top, click Save.

Run the Service

With our Service completed, let's run it to store a sampling of the data coming from our EMS Engine Simulator.

Under the Execute column in the center, on the recordService row, click the "Play" icon for Execute service.
At the bottom-right, click Execute.
At the bottom-right, click Done.
At the top, click Properties and Alerts.
Under the Value column, on the infoTableProperty row, click the "Pencil" icon for Set value of property.

Note that the Service has captured a snapshot of the vibration data and grease condition and permanently stored it within the Info Table Property. You now have not only an Engine Simulator that is constantly sending data from a remote EMS, but also a way to permanently record data at points that you deem significant.

Feel free to return to the Service and call it several more times.
Each time, the values coming from the Engine Simulator will be stored in another entry in the Info Table Property.       Step 7: Next Steps   Congratulations! You've successfully completed the Use the EMS to Create an Engine Simulator guide, and learned how to:   Modify an EMS Template Provision Thing Properties and Values from an EMS rather than Foundation Send information from an EMS to Foundation Store large amounts of data in an InfoTable Property Create a simulator for testing   The next guide in the Vehicle Predictive Pre-Failure Detection with ThingWorx Platform learning path is Engine Simulator Data Storage. Learn More We recommend the following resources to continue your learning experience: Capability Guide Build Engine Simulator Data Storage Build Implement Services, Events and Subscriptions Additional Resources If you have questions, issues, or need additional information, refer to: Resource Link Community Developer Community Forum Support Analytics Builder Help Center  
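As an optional follow-up to calling recordService by hand, the snippet below sketches one way to capture snapshots automatically on a schedule. It assumes a Timer Thing, named RecordTimer here purely for illustration, with a Subscription to its Timer Event; the Subscription body is just the single call shown.

// Hedged sketch: Subscription body on a hypothetical Timer Thing ("RecordTimer"), subscribed to its Timer Event.
// Each firing stores another snapshot in the Info Table Property via the recordService built above.
Things["EdgeThing"].recordService();

This keeps the sampling cadence in one place (the Timer's Update Rate) while the snapshot logic stays on EdgeThing.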
View full tip
    Step 7: Access Data   Now that we have a basic display in-place, we need to access the backend data.   To do so, we'll make use of built-in Mashup Data Services.   At the top-right, ensure that the Data tab is selected.   Click the green + button.   In the Entities Filter field of the Add Data Pop-up, search for and select MDSD_Thing. In the Services Filter field, type getprop. Click the arrow beside GetProperties. Note that the GetProperties() Service has been added to the right section. Check the Execute on Load checkbox.   At the bottom-right of the pop-up, click Done. Note how the GetProperties() Mashup Data Service has been added to the top-right under the Data tab.   Bind Info Table Property   Now that we have access to the Data Service within the Mashup, we'll use it to pull backend data into the Mashup (specifically, the aggregated Info Table Property), and then bind it to the Grid Widget.   Expand the GetProperties Service.   Drag-and-drop MDSD_InfoTable_Property onto the Grid Advanced Widget.   On the Select Binding Target Pop-up, select Data.   Note how the bottom-center Connections Window indicates that the aggregated Info Table Property is now bound to the grid.   Access the Aggregation Service   Because we selected the Execute on Load option for GetProperties(), the Service will grab the information from the aggregated Info Table Property and propagate that into the Grid Advanced Widget on initially opening the Mashup in a web browser.   However, we currently have no way to call our custom aggregation Service besides diving into the backend as we did when we first created it and performed testing.   Instead, we want the Mashup itself to have the ability to call the aggregation Service.   We'll do so via the Button Widget's Clicked Event, but we first need to gain access to our custom Service in the Mashup.   In the top-right Data tab beside Things_MDSD_Thing, click the green </> button. Note that the Add Data Pop-up opens with the MDSD_Thing pre-selected.   On the left of the Add Data Pop-up, expand Select Service Category, and choose **Uncategorized**. Note our custom MDSD_Aggregation_Service.   Beside MDSD_Aggregation_Service, click the right arrow to add it to the Selected Services section. Note that we do NOT want to check "Execute on Load", because don't want the Service called upon initially opening the Mashup in a web browser.   At the bottom-right, click Done. Note the MDSD_Aggregation_Service has been added to the far-right Data tab.   Bind Button to Aggregation Service   Now that we can reference the aggregation Service in the Mashup, we'll bind it to pressing the Button Widget.   Click the Button Widget to select it.   Click the top-left of the Button Widget to reveal the Drop Down Menu.   Drag-and-drop the Clicked Event onto the MDSD_Aggregation_Service under the Data Tab.   On the bottom-right, with the MDSD_Aggregation_Service selected, drag-and-drop the MDSD_Aggregation_Service's ServiceInvokeCompleted Event onto GetProperties in the Data Tab. This causes the completion of the aggregation Service to re-call GetProperties, which updates the grid; as such, a new entry into the Info Table Property created by the custom Service will immediately show up in the Grid Widget. In the top-right, click the GetProperties Service to see all of its interactions in the bottom-center Connections window.   At the top, click Save.       Step 8: Test Application   Our MVP MRI Service Application is now complete.   Let's test it.   At the top, click View Mashup. 
Note the pre-existing single entry from our Service test which we executed directly from the backend.   Click Retrieve MRI Statistics.   Wait a moment and click Retrieve MRI Statistics again. Note that each click adds another entry to the Grid Widget.   Each time that you press the Button Widget, what you're really doing is calling our custom aggregation Service in the Foundation backend.   This Service then goes out and pulls in information from our various EMS-connected Edge sub-systems.   To add additional sub-systems (maybe a "friction" detection on the patient-bed indicating that it needs additional grease) all you would have to do is repeat the steps in this Learning Path, i.e. connect the additional sub-system with the EMS, add another Field Definition to the Data Shape, and modify the aggregation Service to retrieve that info and store it in the Info Table Property.   In addition, you may wish to improve the GUI. Rather than using a Positioning: Static Mashup, you could utilize a Responsive setup, sub-divide the Canvas into various sections, and then add items such as your company's logo. This would also make the GUI more friendly to different screen resolutions.   You can even add business logic to the Mashup itself. For instance, the Auto Refresh Widget (Legacy) can effectively be used as a "timer". In the same way that the Button Widget's Clicked Event calls the aggregation Service, the Auto Refresh Widget could be used to trigger the Service call at a set interval. Then, as long as the Mashup was open, the Button Widget would only be needed when you wanted an immediate status update.   For more information on implementing additional business logic, refer to the Create Custom Business Logic guide.   Or the Time Selector Widget could be used to restrict the information in the Grid Widget to only the timeframe you wanted to investigate.       Step 9: Next Steps   Congratulations! You've successfully completed the Medical Data Storage and Display guide, and learned how to:   Create a Data Shape and Info Table Property to store Medical Data Create a Service to combine data from multiple Edge devices into a single, logical Thing Create a Mashup to view and retrieve Medical data   This is the last guide in the Medical Device Service learning path.   Learn More   We recommend the following resources to continue your learning experience:    Capability     Guide Build Methods for Data Storage Experience Create Your Application UI   Additional Resources   If you have questions, issues, or need additional information, refer to:    Resource       Link Community Developer Community Forum Support ThingWorx Help Center  
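For reference, the work done by the Retrieve MRI Statistics Button can also be expressed as a few lines of server-side JavaScript, which is useful for testing from the backend. This is only a sketch using the Entity names from this guide; it assumes MDSD_Aggregation_Service takes no input parameters, as implied by binding it directly to the Button's Clicked Event.

// Hedged server-side sketch mirroring the Button behavior: aggregate, then read back the stored rows.
var thing = Things["MDSD_Thing"];
thing.MDSD_Aggregation_Service();            // pulls the latest values from the EMS-connected sub-systems

var table = thing.MDSD_InfoTable_Property;   // the aggregated Info Table Property bound to the Grid
logger.info("Aggregated rows so far: " + table.rows.length);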
View full tip
Step 5: Properties

In the Delivery Truck application, there are three Delivery Truck Things. Each Thing has a number of Properties based on its location, speed, and the deliveries it has carried out. In this design, when a delivery is made or the truck is no longer moving, the Property values are updated. The deliveryTruck.c helper C file is based on the DeliveryTruck Entities in the Composer. After calling the construct function, there are a number of steps necessary to get going. For the SimpleThing application, there are a number of methods for creating Properties, Events, Services, and Data Shapes for ease of use.

Properties can be created in the client or just registered and utilized. In the SimpleThingClient application, Properties are created. In the DeliveryTruckClient application, Properties are bound to their ThingWorx Platform counterparts. Two types of structures are used by the C SDK to define Properties when needed; they can be found in [C SDK HOME DIR]/src/api/twProperties.h:

Name                   Structure       Description
Property Definitions   twPropertyDef   Describes the basic information for the Properties that will be available to ThingWorx and can be added to a client application.
Property Values        twProperty      Associates the Property name with a value, timestamp, and quality.

NOTE: The C SDK provides a number of Macros located in [C SDK HOME DIR]/src/api/twMacros.h. This guide will use these Macros while providing input on the use of pure function calls.

The Macro example below can be found in the main source file for the SimpleThingClient application and the accompanying helper file simple_thing.c.

TW_PROPERTY("TempProperty", "Description for TempProperty", TW_NUMBER);
TW_ADD_BOOLEAN_ASPECT("TempProperty", TW_ASPECT_ISREADONLY,TRUE);
TW_ADD_BOOLEAN_ASPECT("TempProperty", TW_ASPECT_ISLOGGED,TRUE);

NOTE: The list of aspect configurations can be seen in [C SDK HOME DIR]/src/api/twConstants.h. Property values can be set with defaults using the aspects setting. Setting a default value in the client will affect the Property in the ThingWorx platform after binding. It will not set a local value in the client application.

For the DeliveryTruckClient, we register, read, and update Properties without using the Property definitions. Which method of using Properties you choose depends on the application being built.

NOTE: Updating Properties in the ThingWorx Platform while the application is running will update the values in the client application. To update the values in the platform to match, end the Property read section of your property handler function with a function to set the platform value.

The createTruckThing function in the deliveryTruck.c source code takes a truck name as a parameter and is used to register the Properties, functions, and handlers for each truck.

The updateTruckThing function in the deliveryTruck.c source code takes a truck name as a parameter and is used to either initialize a struct for DeliveryTruck Properties, or simulate a truck stop Event, update Properties, then fire an Event for the ThingWorx platform.

Connecting properties to be used on the platform is as easy as registering the property and optionally adding aspects. The following shows the properties that correlate to those in the DeliveryTruck Entities in the Composer. To do this within the code, you would use the TW_PROPERTY macro as shown in deliveryTruck.c.
This macro must be preceded by either TW_DECLARE_SHAPE, TW_DECLARE_TEMPLATE, or TW_MAKE_THING, because these macros declare variables used by the TW_PROPERTY macros that follow them.

//TW_PROPERTY(propertyName,description,type)
TW_PROPERTY(PROPERTY_NAME_DRIVER, NO_DESCRIPTION, TW_STRING);
TW_PROPERTY(PROPERTY_NAME_DELIVERIES_LEFT, NO_DESCRIPTION, TW_NUMBER);
TW_PROPERTY(PROPERTY_NAME_TOTAL_DELIVERIES, NO_DESCRIPTION, TW_NUMBER);
TW_PROPERTY(PROPERTY_NAME_DELIVERIES_MADE, NO_DESCRIPTION, TW_NUMBER);
TW_PROPERTY(PROPERTY_NAME_LOCATION, NO_DESCRIPTION, TW_LOCATION);
TW_PROPERTY(PROPERTY_NAME_SPEED, NO_DESCRIPTION, TW_NUMBER);

Read Properties

Reading Properties from a ThingWorx platform Thing, or the Properties returned by a Service, can be done using the TW_GET_PROPERTY macro. Examples of its use can be seen in all of the provided applications. An example can be seen below:

int flow = TW_GET_PROPERTY(thingName, "TotalFlow").number;
int pressure = TW_GET_PROPERTY(thingName, "Pressure").number;
twLocation location = TW_GET_PROPERTY(thingName, "Location").location;
int temperature = TW_GET_PROPERTY(thingName, "Temperature").number;

Write Properties

Writing Properties to a ThingWorx platform Thing from a variable storing its value uses a similarly named macro. Using the TW_SET_PROPERTY macro, you can send values to the platform. Examples of its use can be seen in all of the provided applications. An example is shown below:

TW_SET_PROPERTY(thingName, "TotalFlow", TW_MAKE_NUMBER(rand() / (RAND_MAX / 10.0)));
TW_SET_PROPERTY(thingName, "Pressure", TW_MAKE_NUMBER(18 + rand() / (RAND_MAX / 5.0)));
TW_SET_PROPERTY(thingName, "Location", TW_MAKE_LOC(gpsroute[location_step].latitude,gpsroute[location_step].longitude,gpsroute[location_step].elevation));

This macro utilizes the twApi_PushSubscribedProperties function call to push all property updates to the server. This can be seen in the updateTruckThing function in deliveryTruck.c.

Property Change Listeners

Using the Observer pattern, you can take advantage of the Property change listener functionality. With this pattern, you create functions that will be notified when the value of a Property has been changed (whether on the server or locally by your program when the TW_SET_PROPERTY macro is called).

Add a Property Change Listener

In order to add a Property change listener, call the twExt_AddPropertyChangeListener function using the:

Name of the Thing (entityName)
Property this listener should watch
Function that will be called when the property has changed

void simplePropertyObserver(const char * entityName, const char * thingName, twPrimitive* newValue){
    printf("My Value has changed\n");
}

void test_simplePropertyChangeListener() {
    {
        TW_MAKE_THING("observedThing",TW_THING_TEMPLATE_GENERIC);
        TW_PROPERTY("TotalFlow", TW_NO_DESCRIPTION, TW_NUMBER);
    }
    twExt_AddPropertyChangeListener("observedThing",TW_OBSERVE_ALL_PROPERTIES,simplePropertyObserver);
    TW_SET_PROPERTY("observedThing","TotalFlow",TW_MAKE_NUMBER(50));
}

NOTE: If the propertyName parameter is set to NULL or TW_OBSERVE_ALL_PROPERTIES, the function specified by the propertyChangeListenerFunction parameter will be used for ALL properties.

Remove a Property Change Listener

When you are done using listeners for the Property, release the memory for your application by calling the twExt_RemovePropertyChangeListener function.
void simplePropertyObserver(const char * entityName, const char * thingName,twPrimitive* newValue){ printf("My Value has changed\n"); } twExt_RemovePropertyChangeListener(simplePropertyObserver);       Step 6: Data Shapes   Data Shapes are an important part of creating/firing Events and also invoking Services.   Define With Macros   In order to define a Data Shape using a macro, use TW_MAKE_DATASHAPE.   NOTE: The macros are all defined in the twMacros.h header file.   TW_MAKE_DATASHAPE("SteamSensorReadingShape", TW_DS_ENTRY("ActivationTime", TW_NO_DESCRIPTION ,TW_DATETIME), TW_DS_ENTRY("SensorName", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("Temperature", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("Pressure", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("FaultStatus", TW_NO_DESCRIPTION ,TW_BOOLEAN), TW_DS_ENTRY("InletValve", TW_NO_DESCRIPTION ,TW_BOOLEAN), TW_DS_ENTRY("TemperatureLimit", TW_NO_DESCRIPTION ,TW_NUMBER), TW_DS_ENTRY("TotalFlow", TW_NO_DESCRIPTION ,TW_INTEGER) );   Define Without Macros   In order to define a Data Shape without using a macro, use the twDataShape_CreateFromEntries function. In the example below, we are creating a Data Shape called SteamSensorReadings that has two numbers as Field Definitions.   twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("a",NULL,TW_NUMBER)); twDataShape_AddEntry(ds, twDataShapeEntry_Create("b",NULL,TW_NUMBER)); /* Name the DataShape for the SteamSensorReadings service output */ twDataShape_SetName(ds, "SteamSensorReadings");     Step 7: Events and Services   Events and Services provide useful functionality. Events are a good way to make a Service be asynchronous. You can call a Service, let it return, then your Entity can subscribe to your Event and not keep the original Service function waiting. Events are also a good way to allow the platform to respond to data when it arrives on the edge device without it having to poll the edge device for updates.   Fire Events   To fire an Event you first need to register the Event and load it with the necessary fields for the Data Shape of that Event using the twApi_RegisterEvent function. Afterwards, you would send a request to the ThingWorx server with the collected values using the twApi_FireEvent function. An example is as follows:   twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("message", NULL,TW_STRING)); /* Event datashapes require a name */ twDataShape_SetName(ds, "SteamSensorFault"); /* Register the service */ twApi_RegisterEvent(TW_THING, thingName, "SteamSensorFault", "Steam sensor event", ds); …. struct { char FaultStatus; double Temperature; double TemperatureLimit; } properties; …. properties. TemperatureLimit = rand() + RAND_MAX/5.0; properties.Temperature = rand() + RAND_MAX/5.0; properties.FaultStatus = FALSE; if (properties.Temperature > properties.TemperatureLimit && properties.FaultStatus == FALSE) { twInfoTable * faultData = 0; char msg[140]; properties.FaultStatus = TRUE; sprintf(msg,"%s Temperature %2f exceeds threshold of %2f", thingName, properties.Temperature, properties.TemperatureLimit); faultData = twInfoTable_CreateFromString("message", msg, TRUE); twApi_FireEvent(TW_THING, thingName, "SteamSensorFault", faultData, -1, TRUE); twInfoTable_Delete(faultData); }   Invoke Services   In order to invoke a Service, you will use the twApi_InvokeService function. The full documentation for this function can be found in [C SDK HOME DIR]/src/api/twApi.h. Refer to the table below for additional information.    
Parameter | Type | Description
entityType | Input | The type of Entity that the Service belongs to. Enumeration values can be found in twDefinitions.h.
entityName | Input | The name of the Entity that the Service belongs to.
serviceName | Input | The name of the Service to execute.
params | Input | A pointer to an Info Table containing the parameters to be passed into the Service. The calling function retains ownership of this pointer and is responsible for cleaning up the memory after the call is complete.
result | Input/Output | A pointer to a twInfoTable pointer. In a successful request, this parameter will end up with a valid pointer to a twInfoTable that is the result of the Service invocation. The caller is responsible for deleting the returned Info Table using twInfoTable_Delete. The returned pointer may be NULL if an error occurred or no data was returned.
timeout | Input | The time (in milliseconds) to wait for a response from the server. A value of -1 uses the DEFAULT_MESSAGE_TIMEOUT as defined in twDefaultSettings.h.
forceConnect | Input | A Boolean value. If TRUE and the API is in the disconnected state of the duty cycle, the API will force a reconnect to send the request.

See below for an example in which the Copy service from the FileTransferSubsystem is called:

twDataShape * ds = NULL;
twInfoTable * it = NULL;
twInfoTableRow * row = NULL;
twInfoTable * transferInfo = NULL;
int res = 0;
const char * sourceRepo = "SimpleThing_1";
const char * sourcePath = "tw/hotfolder/";
const char * sourceFile = "source.txt";
const char * targetRepo = "SystemRepository";
const char * targetPath = "/";
const char * targetFile = "source.txt";
uint32_t timeout = 60;
char asynch = TRUE;
char * tid = 0;

/* Create an Info Table out of the parameters */
ds = twDataShape_Create(twDataShapeEntry_Create("sourceRepo", NULL, TW_STRING));
res = twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourcePath", NULL, TW_STRING));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourceFile", NULL, TW_STRING));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetRepo", NULL, TW_STRING));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetPath", NULL, TW_STRING));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetFile", NULL, TW_STRING));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("async", NULL, TW_BOOLEAN));
res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("timeout", NULL, TW_INTEGER));
it = twInfoTable_Create(ds);
row = twInfoTableRow_Create(twPrimitive_CreateFromString(sourceRepo, TRUE));
res = twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourcePath, TRUE));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourceFile, TRUE));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetRepo, TRUE));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetPath, TRUE));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetFile, TRUE));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromBoolean(asynch));
res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromInteger(timeout));
twInfoTable_AddRow(it, row);

/* Make the service call */
res = twApi_InvokeService(TW_SUBSYSTEM, "FileTransferSubsystem", "Copy", it, &transferInfo, timeout ? (timeout * 2) : -1, FALSE);
twInfoTable_Delete(it);

/* Grab the tid */
res = twInfoTable_GetString(transferInfo, "transferId", 0, &tid);

Bind Event Handling

You may want to track exactly when your edge Entities are successfully bound to, or unbound from, the server. Only bound items should interact with the ThingWorx platform, and the platform will never send a request targeted at an Entity that is not bound. A simple example that merely logs the bound Thing can be seen below. After creating this function, register it using the twApi_RegisterBindEventCallback function before the connection is made.

void BindEventHandler(char * entityName, char isBound, void * userdata) {
    if (isBound)
        TW_LOG(TW_FORCE, "BindEventHandler: Entity %s was Bound", entityName);
    else
        TW_LOG(TW_FORCE, "BindEventHandler: Entity %s was Unbound", entityName);
}
….
twApi_RegisterBindEventCallback(thingName, BindEventHandler, NULL);

OnAuthenticated Event Handling

You may also want to know exactly when your edge device has successfully authenticated and made a connection to the ThingWorx platform. As with bind Event handling, this function needs to be created and registered. To register this handler, use the twApi_RegisterOnAuthenticatedCallback function before the connection is made. This handler can also be used to perform a delayed bind for all Things.

void AuthEventHandler(char * credType, char * credValue, void * userdata) {
    if (!credType || !credValue) return;
    TW_LOG(TW_FORCE, "AuthEventHandler: Authenticated using %s = %s. Userdata = 0x%x", credType, credValue, userdata);
    /* Could do a delayed bind here */
    /* twApi_BindThing(thingName); */
}
…
twApi_RegisterOnAuthenticatedCallback(AuthEventHandler, NULL);


Click here to view Part 3 of this guide.
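As the parameter table notes, the result Info Table is owned by the caller and may be NULL if an error occurred or no data was returned. Below is a minimal sketch of a follow-up check, reusing the variables from the example above; it is not part of the original sample and assumes a return value of 0 indicates success:

if (transferInfo) {
    if (tid) TW_LOG(TW_FORCE, "Copy started, transferId = %s", tid);
    twInfoTable_Delete(transferInfo);   /* the caller is responsible for deleting the result */
} else {
    TW_LOG(TW_FORCE, "Copy service call returned no data (result code: %d)", res);
}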
Step 6: Create Event Router

What do you do when a user can trigger multiple events that generate data, and you want all of those outputs to go through the same process? An Events Router Function is your solution. The Events Router Function allows multiple data sources to be funneled to one location. Let's create a simple example in our MyFunctionsMashup Mashup. In this Mashup, we'll add two Text Field Widgets and a Label Widget. The two Text Field Widgets will take user input, and an Events Router will send the output to the Label. Let's start!

Open the MyFunctionsMashup Mashup to the Design tab. Click on the Widgets tab. Type "Text Field" in the Filter text box. Drag and drop TWO (2) Text Field Widgets to the Mashup Canvas. Type "Label" in the Filter text box. Drag and drop ONE (1) Label Widget to the Mashup Canvas. We now have all the Widgets we need for this example.

Let's get started on the Events Router Function. Click the + button in the Functions panel. Select Events Router in the dropdown. Set the Name to routeUserInput. Click Next. Set the Inputs field to 2. Click Done.

We have our Events Router set up. Now, we'll bind our new items together. Click the Bind (arrows) button on the routeUserInput Events Router. Click the down arrow next to Input1. Select Add Source. In the Widgets tab, scroll to the bottom and select the Text Property of the first of the two Text Fields we created (it should be third to last). Click Next. Click the down arrow next to Input2. Select Add Source. In the Widgets tab, scroll to the bottom and select the Text Property of the second of the two Text Fields we created (it should be second to last). Click Next. Click the down arrow next to Output. Select Add Target. In the Widgets tab, scroll to the bottom and select the LabelText Property of the Label we created (it should be last). Click Next. Click Done. Click Save for the Mashup.

You have just created an Events Router that will update a Label based on the typed input from two Text Fields. View your Mashup and play around with the bottom two text boxes. For a completed example, download, unzip, and then import the attached FunctionsGuide_Entities.zip.


Step 7: Next Steps

Congratulations! You've successfully completed the Explore UI Functions guide, and learned best practices for building a complex Mashup that combines navigation, multiple data inputs, and confirmations, all working together for an enhanced user experience.

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Experience | Object-Oriented UI Design Tips

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
Support | Mashup Builder Support Help Center
Step 7: Real World Model

We'll now rerun model creation with the Real World data.

Even though Signals and Profiles suggest that only Sensor 1 may be needed, the first Model you create will contain all the data, while the second Model will exclude Sensor 2. We'll then compare the Models to see which one will work best for predicting engine failures.

On the left, click Analytics Builder > Models. Click New…. In the Model Name field, enter vibration_model. In the Dataset field, select vibration_dataset. Click Submit. After ~60 seconds, the Model Status will change to COMPLETED. Select the model that was created in the previous step, i.e. vibration_model. Click View… to open the Model Information page. Note that your model may differ slightly from the results described below, as the automatically-withheld "test" data is chosen randomly.

Unlike our simulated dataset, this real-world data is not perfect. However, it's still pretty good, and it is much more representative of a real-world scenario.

The True Positive Rate shown on the Receiver Operating Characteristic (ROC) chart is much higher than the False Positive Rate. The curve is relatively high and to the left, which indicates a high accuracy level.

You may also click on the Confusion Matrix tab in the top-left, which shows the number of True Positives and True Negatives in comparison to False Positives and False Negatives (a short worked example of these metrics appears at the end of this step).

NOTE: The number of correct predictions is much higher than the number of incorrect predictions.

As such, we now know that our Sensors have a relatively good chance of predicting an impending failure by detecting low-grease conditions before they cause catastrophic engine failure.

Refined Model

We can now compare this first Model, which includes both Sensors, to a Model using only Sensor 1, since we suspect that Sensor 2 may not be necessary to achieve our goal.

On the left, click Analytics Builder > Models. Click New…. In the Model Name field, enter vibration_model_s1_only. In the Dataset field, select vibration_dataset. On the right, beside Excluded Fields from Model, click the Excluded Fields button. Select s2_fb1 through s2_fb5. While all the s2 values are selected, click the green "right-arrow", i.e. the > button, in the middle. At the bottom-left, click Save. Click Submit. After ~60 seconds, the Model Status will change to COMPLETED. With vibration_model_s1_only selected, click View….

The ROC chart is comparable to the original model (which included Sensor 2). Likewise, the Confusion Matrix (on the other tab) indicates a good ratio of correct predictions to incorrect predictions.

NOTE: Your final scores may vary slightly from those described here, as the data used for prediction versus evaluation is chosen randomly.

ThingWorx Analytics' Models have indicated that you are likely to see roughly the same accuracy in predicting a low-grease condition whether you use one sensor or two! If we can get an accurate early warning of the low-grease condition with just one sensor, it then becomes a business decision as to whether the extra cost of Sensor 2 is necessary.
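To make the Confusion Matrix numbers concrete, here is a small, self-contained sketch showing how the True Positive Rate, False Positive Rate, and overall accuracy are derived from the four cells. The counts below are hypothetical placeholders, not values from your model; substitute your own results.

#include <stdio.h>

int main(void) {
    /* Hypothetical confusion-matrix counts; replace with the values from your own model */
    double tp = 85, fn = 15;   /* actual low-grease rows: predicted correctly / missed */
    double tn = 90, fp = 10;   /* actual good-grease rows: predicted correctly / false alarms */

    double tpr      = tp / (tp + fn);                  /* True Positive Rate (y-axis of the ROC chart) */
    double fpr      = fp / (fp + tn);                  /* False Positive Rate (x-axis of the ROC chart) */
    double accuracy = (tp + tn) / (tp + tn + fp + fn); /* overall fraction of correct predictions */

    printf("TPR = %.2f, FPR = %.2f, Accuracy = %.2f\n", tpr, fpr, accuracy);
    return 0;
}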
Step 8: Next Steps

Congratulations! You've successfully completed the Build an Engine Analytical Model guide, and learned how to:

Load an IoT dataset
Generate machine learning predictions
Evaluate the analytics output to gain insight

The next guide in the Vehicle Predictive Pre-Failure Detection with ThingWorx Platform learning path is Manage an Engine Analytical Model.

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Analyze | Operationalize an Analytics Model
Build | Implement Services, Events, and Subscriptions

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
Support | Analytics Builder Help Center
Step 4: Simulated Model

Models are primarily used by Analytics Manager (which will be discussed in the next guide), but they can also be used to estimate the accuracy of predictions.

When Models are calculated, they inherently withhold a certain amount of the data (~20%). The prediction model is then run against the withheld data, which provides a form of "accuracy measure". Because the withheld data is selected randomly, you'll get a slightly different model and accuracy measure each time you create a Model against the same dataset.

On the left, click Analytics Builder > Models. Click New…. In the Model Name field, enter simulated_model. In the Dataset field, select simulated_dataset. Click Submit. After ~60 seconds, the Model Status will change to COMPLETED. Select the model that was created in the previous step, i.e. simulated_model. Click View… to open the Model Information page.

As with Signals and Profiles, our Model is once again "too good". In fact, it's perfect. The expected "Precision" is 1.0, i.e. 100%, and the True Positive vs. False Positive rate shown in the graph goes straight up to the top immediately. While you want a graph that is "high and left", you're very unlikely to ever see a real-world scenario like this one.

Still, you've been able to continue the process of using Foundation (and now Analytics) to build an analytical model of MotorCo's prototype engine. What needs to happen now is to receive real-world data from your R&D engineers.


Step 5: Upload Real World Data

In your process of using the EMS Engine Simulator, the idea has always been to get a head start on the engine developers. At some point, they would wire sensors into the EMS and start providing real-world data. In our scenario, that has now happened: real-world data is being fed from the EMS to Foundation, Foundation is collecting that data in an Info Table Property, and you've even exported the data as a .csv file.

This new dataset covers periods of both good and bad grease conditions. The engineers monitoring the process can flip a sensor switch connected to the EMS to log the current grease situation as either good or bad at the same time that the vibration sensors are taking readings. We will now load this real-world dataset into Analytics in the same manner as the simulated dataset earlier.

Download the attached analytics_vibration.zip file to your computer. Unzip the analytics_vibration.zip file to access the vibration_data_and_header.csv and vibration_metadata.json files. On the left, click Analytics Builder > Data. Under Datasets, click New.... In the Dataset Name field, enter vibration_dataset. In the File Containing Dataset Data section, search for and select vibration_data_and_header.csv. In the File Containing Dataset Field Configuration section, search for and select vibration_metadata.json. Click Submit.


Step 6: Real World Signals and Profiles

Now that the real-world vibration data has been uploaded, we'll re-run Signals and Profiles just as we did before. Hopefully, we'll start seeing some patterns.

On the left, click Analytics Builder > Signals. At the top, click New…. In the Signal Name field, enter vibration_signal. In the Dataset field, select vibration_dataset. Click Submit. Wait ~30 seconds for the Signal State to change to COMPLETED.

The results show that the five Frequency Bands for Sensor 1 are the most highly correlated with our goal of detecting a low-grease condition.
For Sensor 2, only bands one and four seem to be related, while bands two, three, and five are hardly relevant at all. This is a very different result from our earlier simulated data. Instead, it looks like the vibration frequencies picked up by our first sensor are distinctly more important.

Profiles

We'll now re-run Profiles with our real-world dataset.

On the left, click Analytics Builder > Profiles. Click New…. In the Profile Name field, enter vibration_profile. In the Dataset field, select vibration_dataset. Click Submit. After ~30 seconds, the Profile State will change to COMPLETED.

The results show several Profiles (combinations of data) that appear to be statistically significant. Only the first few Profiles, however, account for a significant percentage of the total number of records; the later Profiles can largely be ignored. Among those first Profiles, Frequency Bands from both Sensor 1 and Sensor 2 appear. But combined with the result from Signals (where Sensor 1 was always more important), this could indicate that Sensor 1 is still the most important overall. In other words, since Sensor 1 is statistically significant both by itself and in combination (while Sensor 2 is only significant in combination with Sensor 1), Sensor 2 may not be necessary.


Click here to view Part 3 of this guide.