IoT Tips

  Step 5: Import Extension   Now that we have a valid dataset, we want to export it as a .csv file, which can be imported into ThingWorx Analytics in a future guide to generate an analytical model.   An easy way to do this is with the CSV Parser Extension, which you’ll now import.       1. Download the CSV Parser Extension from our third party provider IQNOX.   Note:  An account is required but the download is free.     2. At the bottom-left, click Import/Export.       3. On the drop-down, click Import.       4. For Import Option, select Extension.       5. Click Browse and navigate to the extension you downloaded above.       6. Click Open.       7. Click Import.       8. Click Close.       9. On the Refresh Composer? pop-up, click Yes.      Step 6: Create File Repository   ThingWorx Foundation uses File Repositories to read and write files from disk (including .csv files created by the CSV Parser Extension).   In this step, we’ll create a File Repository Entity.       1. Return to Browse > All.       2. Click MODELING > Things.       3. Click + New.       4. In the Name field, type ESDS_File_Repository.       5. If Project is not already set, search for and select PTCDefaultProject.       6. In the Base Thing Template field, search for and select FileRepository.      7. At the top, click Save.        Step 7: Create .csv Export Service   We have imported an Extension which gives us tools to manipulate .csv files. We have created a File Repository to which the export can save the file. We'll now make use of some of this new functionality.    We’ll do so by creating a Service which calls built-in functions of the CSV Parser Extension.       1. Return to EdgeThing.       2. Click Services.       3. Click + Add.       4. On the drop-down, select Local (JavaScript).       5. In the Name field, type exportCSVservice.       6. In the blank JavaScript field, copy-and-paste the following code:           var sFile = "vibrationCSVfile.csv"; var paramsCSV = { path: sFile, data: me.infoTableProperty, fileRepository: "ESDS_File_Repository", withHeader: true }; Resources["CSVParserFunctions"].WriteCSVFile(paramsCSV);               7. Click Save and Continue. Note that you should NOT click the top Save button, as that will erase your Service.         Step 8: Export the Engine Data   We now have all the tools in place to export the infoTableProperty as a .csv file to our new File Repository.   All that’s left is to call the appropriate functions.       1. Ensure that you’re still on the Services tab of EdgeThing, and have the exportCSVservice open.       2. At the bottom, click Execute.       3. Return to ESDS_File_Repository.       4. Click Services.       5. Scroll down and find the GetFileListingsWithLinks Service.       6. Click the “Play” icon for Execute service.       7. At the bottom-right, click Execute.       8. On the right, click Thingworx/FileRepositories/ESDS_File_Repository/vibrationCSVfile.csv.     9. The .csv export of the vibration data will now be in your local folder to which your browser saves downloads.       Step 9: Next Steps   Congratulations! 
You've completed the Engine Simulator Data Storage guide, and learned how to: Create a Timer, Subscribe to a Timer to Trigger a Service, Generate Mass Amounts of Test Data, Import the CSV Parser Extension, Create a File Repository, Export the Test Data as a Comma-Separated Values (.csv) file, and Download from a File Repository.   The next guide in the Vehicle Predictive Pre-Failure Detection with ThingWorx Platform learning path is Build an Engine Analytical Model.   Learn More   We recommend the following resources to continue your learning experience:   Capability / Guide: Analyze - Build a Predictive Analytics Model; Build - Implement Services, Events, and Subscriptions   Additional Resources   If you have questions, issues, or need additional information, refer to:   Resource / Link: Community - Developer Community Forum; Support - Analytics Builder Help Center
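For reference, the exportCSVservice body from Step 7 above is reproduced here in a more readable layout with comments. Nothing is changed functionally; the Property, File Repository, and Resource names are exactly the ones used in the steps above.

// Write the contents of infoTableProperty to a .csv file in ESDS_File_Repository
var sFile = "vibrationCSVfile.csv";          // file name to create in the repository
var paramsCSV = {
    path: sFile,                             // target file name/path
    data: me.infoTableProperty,              // the Info Table holding the captured engine data
    fileRepository: "ESDS_File_Repository",  // the File Repository Thing created in Step 6
    withHeader: true                         // include column names as the first row
};
// WriteCSVFile is provided by the CSV Parser Extension imported in Step 5
Resources["CSVParserFunctions"].WriteCSVFile(paramsCSV);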
    Step 4: Create Thing   Now that we have a Data Shape to format the combination of data coming from the various sub-systems, we can now instantiate a Thing with an Info Table Property to hold all of said data.   Click Browse > Modeling > Things.   Click + New. In the Name field, type MDSD_Thing. If Project is not already set, search for and select PTCDefaultProject. In the Base Thing Template field, search for and select Generic Thing.   At the top, click Save.   Add Info Table Property   We now have a Thing to aggregate the MRI sub-system information, but we still need a Property to perform the actual storage.   We'll use an Info Table Property for this, with the columns of the Info Table formatted by the Data Shape we created in the previous step.   At the top, click Properties and Alerts.   Click + Add.   On the right in the Name field, type MDSD_InfoTable_Property. Change the Base Type to INFOTABLE. In the Data Shape field, search for and select MDSD_DataShape.   Check the box for Persistent.   At the top-right, click the "Check" button for Done. At the top, click Save.     Step 5: Create Service   Now that we have a Thing with an Info Table Property to store our aggregated data from multiple MRI sub-systems, we need to develop a Service which will grab said data and propagate that information into the Info Table Property.   At the top of MDSD_Thing, click Services.   Click + Add.   Under Service Info in the Name field, type MDSD_Aggregation_Service.     Access to MRI Sub-systems   We now need to access the various sub-systems of the MRI that are already talking to ThingWorx Foundation.   Once again, we'll only be doing so for two sub-systems in this MVP example. But the general premise will extend to as many remote devices as is necessary.   You will simply add more references as additional sub-systems are needed.   In the Javascript code window, copy-and-paste in var embedded_properties = Things["MDSD_Embedded_Thing"].GetPropertyValues(); This provides a reference to the embedded microcontroller's Properties. All Things are accessible in Foundation via the "Things" array, and you simply need to provide the Thing-name to index into the array; this functions similarly to a "global" variable, so that any Thing can reference any other Thing. The built-in GetPropertyValues Service simply returns the values of all Properties of the Thing being referenced. In the Javascript code window, copy-and-paste in var pc_properties = Things["MDSD_PC_Thing"].GetPropertyValues(); This provides a reference to the PC's Properties.     Add Values to Info Table   Now that we have references to the sub-systems, we'll add their individual Property values to each field of the Info Table Property.   We'll do this via the built-in AddRow() Service.   To begin an AddRow Service call, copy-and-paste me.MDSD_InfoTable_Property.AddRow({ The me reference is MDSD_Thing, since we're inside said Entity. The MDSD_InfoTable_Property is the Property we added in this guide's previous step. The built-in AddRow Service will add each following Property value to a field of the Info Table formatted by the previously-created Data Shape.   Copy-and-paste Coolant_Percent:embedded_properties.Coolant_Percent, This stores the embedded microcontroller's "Coolant Percent" in the first field of a row of the aggregated Info Table.   Copy-and-paste Field_Strength:embedded_properties.Field_Strength, Likewise, this references the second Property of the embedded microcontroller to store in the second field of the Info Table.   
Copy-and-paste Magnet_Temperature:embedded_properties.Magnet_Temperature,   Now that we have all the embedded microcontroller's values, copy-and-paste the following lines for the PC's values: Number_of_Scans:pc_properties.Number_of_Scans, SSD_Space_Open:pc_properties.SSD_Space_Open, Unused_RAM:pc_properties.Unused_RAM,   We also want to record the Timestamp (via the built-in Date Service) when these entries were added; copy-and-paste Timestamp:Date.now()   Finally, close off the AddRow Service with some braces, i.e. copy-and-paste });   Review the entire Service in Foundation and ensure that it matches the JavaScript code below.

var embedded_properties = Things["MDSD_Embedded_Thing"].GetPropertyValues();
var pc_properties = Things["MDSD_PC_Thing"].GetPropertyValues();
me.MDSD_InfoTable_Property.AddRow({
    Coolant_Percent: embedded_properties.Coolant_Percent,
    Field_Strength: embedded_properties.Field_Strength,
    Magnet_Temperature: embedded_properties.Magnet_Temperature,
    Number_of_Scans: pc_properties.Number_of_Scans,
    SSD_Space_Open: pc_properties.SSD_Space_Open,
    Unused_RAM: pc_properties.Unused_RAM,
    Timestamp: Date.now()
});

For the MDSD_Aggregation_Service, click Done. Click Save.   Test Service   Before going further, we should test the Service to ensure that it is correctly adding entries to the aggregate Info Table Property.   On the MDSD_Aggregation_Service row, under the Execute column, click the Play icon.   At the bottom-right of the Execute Service pop-up, click Execute.   Click Done, and return to Properties and Alerts. Notice under the Value column that the Info Table Property now has an entry.   Under the Value column, click the Pencil icon for Edit.   Review the values and confirm that every field has a valid entry. Note that your values will differ from those in the picture due to the random nature of the simulator. On the pop-up, click Cancel. At the top, click Save.     Step 6: Create Mashup   Now that we have a Thing that has logically aggregated the information into a single Info Table Property (and a Service to carry out said aggregation), we can start to visualize the data with a Mashup.   For more information on Mashups, reference the Create Your Application UI guide.   Click Browse > Visualization > Mashups.   Click + New.   On the New Mashup Pop-up, leave the defaults, and click OK. In the Name field, type MDSD_Mashup.   If Project is not already set, search for and select PTCDefaultProject. At the top, click Save.   At the top, click Design.   At the top-left, click the Layout tab.   For Positioning, select the Static radio-button.   At the top-left, click the Widgets tab. At the top, click Save.   Widgets   We now have a "Static Positioning" Mashup, which will let us drag-and-drop Widgets without them auto-expanding to fill the entire space. This will allow us to have multiple Widgets without worrying about sub-dividing the Mashup.   In particular, we're interested in the Grid Widget to display our aggregated data, as well as a Button Widget to call the Service to perform the aggregation.   On the left in the Filter Widgets field, type grid.   Drag-and-drop a Grid Advanced Widget onto the central Canvas area.   In the Filter Widgets field, type button.   Drag-and-drop a Button Widget onto the central Canvas area.   Re-size (by clicking-and-stretching) and move the two Widgets such that they look roughly like the picture below.   Click the Button Widget to select it. In the Filter Properties field of the bottom-left Properties section, type label.   
In the Label field, type Retrieve MRI Statistics, and then hit the Tab keyboard key to lock in the change.   At the top, click Save.     Click here to view Part 3 of this guide.
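When you bind data to the Grid Advanced Widget (presumably covered in the next part of this guide), it will need an Info Table as its data source. One simple way to prepare for that, as a minimal sketch under the assumption that you want a dedicated read Service (the guide itself may instead bind the Property through the built-in GetProperties Service), is a small wrapper Service on MDSD_Thing. The name MDSD_GetAggregatedData below is hypothetical; its Output would be set to INFOTABLE with Data Shape MDSD_DataShape:

// Hypothetical wrapper Service: return the aggregated rows
// so a Mashup Grid can bind directly to this Service's result.
var result = me.MDSD_InfoTable_Property;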
Step 5: Contained Mashup   Our Minimum Viable Product (MVP) Mashup which we created in the last guide did have valid information.   Being able to display the inputs coming from the engine, as well as the analytical results coming from ThingWorx Analytics, are certainly items we don't want to lose in this new, more complete Mashup.   Rather than recreating that work from scratch, we'll simply include that previous Mashup in one of our sub-sections via the Contained Mashup Widget.       1. Click on the top-left section to select it, and ensure that you're on the Widgets tab in the top-left.       2. Drag-and-drop a Contained Mashup Widget onto the top-left section.       3. With the Contained Mashup Widget selected, return to the Properties tab in the bottom-left.       4. Scroll down and locate the Name Property.       5. Search for and select EFPG_Mashup.       6. Click Save.     Add Column Labels   The original Mashup we created (and have now embedded in the new one) had some labels for the inputs and outputs. However, you had to know what things like "s1_fb1" meant to understand that that was an input.   We can go back to the original EFPG_Mashup, make some modifications for greater clarity, and those changes will also carry over to our new Mashup.       1. Reopen the old EFPG_Mashup on the Design tab.       2. Move all of the Widgets down to leave some extra room at the top.       3. Drag-and-drop two Divider Widgets onto the Canvas above both the Inputs and Results columns.       4. Select a Divider Widget, and go to its Style Properties.       5. Expand Base > Line to reveal the background Style Property.       6. Click on the default gray color to see the available options.       7. Choose the built-in black at the bottom, and click Select.       8. Make the same modification to the other Divider Widget.       9. Drag-and-drop two more Label Widgets onto the Canvas above the two columns.       10. Change their LabelText Properties to Inputs and Results, respectively.     Change Background and Size       1. From the Explorer tab in the top-left, select the container.       2. Select the Style Properties tab in the bottom-left and expand Base > Container.       3. Change the background Style Property to a color you prefer.       4. With the container still selected in the Explorer tab, drag the corners of the Mashup to reduce its size.       5. You could even move the Results column over, place the Auto Refresh Widget underneath, and then reduce the container size even further.       6. Click Save.     View Mashup Thus Far   With the changes to the previous EFPG_Mashup now complete, let's ensure that everything carried over to our new Mashup.       1. Return to EEFV_Mashup.       2. Click Save.       3. Click View Mashup.   Note how the various changes we made to the base Mashup are also being shown, via a Contained Mashup Widget, in our new Mashup.   Splitting out functionality into a separate Mashup that is then embedded where needed is a great way to re-use content and simplify development.       Step 6: Add Chart   Our original Mashup (which has now been embedded in our new one) shows the instantaneous analytical results based on the inputs coming from the Edge MicroServer (EMS).   However, when investigating remote customer issues, it might be helpful to see some historical trends. A temporary "blip" of a low-grease indication might be worrisome, but it may not require immediate intervention unless the issue was occurring consistently or for extended periods of time.   
Fortunately, creating a historical record is relatively simple in ThingWorx Foundation.   All that is really needed is a place in which to store the past records.   One of the easiest such storage methods is a Value Stream.       1. In ThingWorx Foundation, click Browse > Data Storage > Value Streams.       2. Click + New.       3. On the Choose Template pop-up, select ValueStream and click OK.       4. In the Name field, type EEFV_ValueStream.       5. If Project is not already set, search for and select PTCDefaultProject.       6. At the top, click Save.     Link Value Stream and Begin Storage   Now that we have a Value Stream to act as a storage location, we want to link it to EdgeThing.   After EdgeThing knows where to store historical data, we can simply instruct it which Property we want to archive by setting it to Logged.       1. Return to EdgeThing and its General Information tab.       2. In the Value Stream field, search for and select EEFV_ValueStream.       3. Click Save.       4. Still on EdgeThing, click Properties and Alerts.       5. Click Result_low_grease_mo to trigger the slide-out from the right-side.         6. Check Logged.       7. Click the Check icon in the top-right to close the slide-out.       8. Click Save.     Add Line Chart and Data   As per most guides in this Learning Path, it is assumed that you have an active connection to the EMS Engine Simulator and have your Analytics Event currently set to active.   This provides both the engine-sensor inputs and the analytical results for our Mashup.   After adding the Value Stream above, you'll need to let it run for a bit for the historical data to be archived. After it's run for a while and we have a valid history build-up, you can display that history in a Line Chart.       1. Return to EEFV_Mashup on the Design tab.       2. Click on the top-right section to select it.         3. From the Widgets tab, drag-and-drop a Line Chart onto the top-right section.         4. In the top-right of Mashup Builder, ensure the Data tab is selected.         5. Click the green + button.         6. On the Add Data pop-up in Entity Filter, search for and select EdgeThing.       7. In Services Filter, type queryprop.       8. Click the right arrow button besides QueryPropertyHistory.       9. Check Execute on Load.         10. Click Done.       11. Expand Data > Things_EdgeThing > QueryPropertyHistory > Returned Data.       Bind Data and View Mashup   Now that we have both our method of displaying the historical data, i.e. a Line Chart, as well as a method to bring backend data into the Mashup, i.e. QueryPropertyHistory, we can bind them together and see how our Mashup is progressing.       1. From the right under the Data tab, drag-and-drop EdgeThing > QueryPropertyHistory > Returned Data > All Data onto the Line Chart in the top-right of the Canvas.         2. On the Select Binding Target pop-up, click Data.         3. With the Line Chart selected, explore its Properties in the bottom-left.       4. Change XAxisField to timestamp.         5. Click Save.       6. Click View Mashup.     Your own Line Chart will vary depending on what values your Engine Simulator is sending to Foundation and Analytics.   NOTE: Remember that the Analysis Event needs to be Enabled for new values to be fed into Result_low_grease_mo.     Click here to view Part 3 of this guide.  
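If you want to double-check that the Value Stream is actually capturing history before building out the chart, one quick option (an optional sketch, not one of the guide's steps) is a small test Service on EdgeThing that calls the built-in QueryPropertyHistory Service directly:

// Optional check: pull the most recently logged property values for this Thing.
var result = me.QueryPropertyHistory({
    maxItems: 10,        // number of history rows to return
    oldestFirst: false   // newest entries first
});
// result is an InfoTable; its row count should grow while the Analysis Event is enabled
// and Result_low_grease_mo is being logged to EEFV_ValueStream.

This is the same Service the Mashup's Line Chart consumes, so seeing rows here confirms that the Logged checkbox and the Value Stream binding are working.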
Step 3: Important Factors   If the company cannot ship or deliver their products fast enough, that will cause food waste and less revenue. At the same time, having nonstop access to meaningful data about the logistics side of the company provides a new level of decision-making capabilities.   Let's first see what some of the pitfalls are that cause bad logistics or leave room for improvement. We can keep these items in mind as we work on our application.   Customer behavior – More attention can be put on how customers are shopping in certain areas. It's not enough to know what areas are buying the most products and send them more shipments. Deadhead miles and load management will help save unnecessary costs. Shipment tracking and route planning – The traveling salesman problem is one that scientists have been working on for ages. There is no one solution to this problem, but there are many bad ones. Planning methods and routes is almost magic, but the more methodical the process, the more can be saved here. Something as simple as selecting the routes based on the number of right turns to reduce gas can save millions on yearly gas expenses. Method of travel utilization – Sea. Air. Road. Train. Each method has its benefits and downsides. When you pick a method, also incorporate how to utilize all the space provided. This could be using smaller boxes, a different type of packaging material, or playing Tetris in a trailer.   Customer Models   Understanding your customers and their habits is of utmost importance. We'll start by creating some of the base models used for customers in this application. You will build on top of these models as you progress through this learning path.   In the ThingWorx Composer, click the + New in the top left of the screen.   Select Data Shape in the dropdown.   In the name field, enter Fizos.Customers.DataShape. All of our customers will be based off this Data Shape. Set the Project (ie, PTCDefaultProject) and click Save to store all changes. Click on the Field Definitions tab and click the + Add button to add new Field Definitions.   Add the list of Properties below:  Name       Base Type      Aspects                           Description ID Integer 0 minimum, primary key, default 0 Row identifier UUID String N/A String used as unique identifier across multiple platforms Type String N/A Type of customer (individual or another company) Factors Tags Data Tag This will hold the different type of data points or tags that will help to analyze a customer's characteristics and behavior Name String N/A Customer name Email String N/A Customer email Address String N/A Customer address Phone String N/A Customer phone number   The Properties for the Fizos.Customers.DataShape Data Shape should match the following:   7. In the ThingWorx Composer, click the + New in the top left of the screen.   8. Select Data Table in the dropdown and select Data Table in the prompt.   9. In the name field, enter Fizos.Customers.DataTable. Our differing types of customers will be stored in this Data Table.   10. For the Data Shape field, select Fizos.Customers.DataShape. 11. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   12. This entity will be used to house our data and provide assistance with our analytics. Vehicle Models   To build a plan for your logistics solutions, you first need to have the data necessary for your vehicles and factories. Let's begin housing this data to help us with our planning. 
In the ThingWorx Composer, click the + New in the top left of the screen.   Select Data Shape in the dropdown.   In the name field, enter Fizos.Vehicles.DataShape. All of our vehicles will be based off this Data Shape. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   Add the list of properties below: Name          Base Type       Aspects                               Description ID Integer 0 minimum, primary key, default 0 Row identifier FactoryID Integer 0 minimum, default 0 Factory row identifier Location Location N/A String used as unique identifier across multiple platforms Features Tags Data Tag This will hold the different type of data points or tags that will help to plan what this vehicle can and will carry Size String N/A Vehicle size   The properties for the Fizos.Vehicles.DataShape Data Shape are as follows:   In the ThingWorx Composer, click the + New in the top left of the screen.   Select Data Table in the dropdown and select Data Table in the prompt.   In the name field, enter Fizos.Vehicles.DataTable. Our differing types of vehicles will be inside of this Data Table. For the Data Shape field, select Fizos.Vehicles.DataShape. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   This entity will be used to house our data and provide assistance with our analytics.   Factory Models   In the ThingWorx Composer, click the + New in the top left of the screen.   Select Data Shape in the dropdown.   In the name field, enter Fizos.Factories.DataShape. All of our factories will be based off this Data Shape. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   Add the list of properties below: Name        Base Type       Aspects                                  Description ID Integer 0 minimum, primary key, default 0 Row identifier Location Location N/A String used as unique identifier across multiple platforms Features Tags Data Tag This will hold the different type of data points or tags that will help to plan what this factory can and will build Size String N/A Factory size   The properties for the Fizos.Factories.DataShape Data Shape are as follows:     In the ThingWorx Composer, click the + New in the top left of the screen.   Select Data Table in the dropdown.   In the name field, enter Fizos.Factories.DataTable. Our differing types of factories will be inside of this Data Table. For the Data Shape field, select Fizos.Factories.DataShape. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   This entity will be used to house our data and provide assistance with our analytics.   Centralized Logistics   Our application needs an efficient system of logistics. We already have sensors for our food entities, so see below how we work to move in the right direction. We'll be using a Thing Template to allow our new services to be overridden later if we so choose.   In the ThingWorx Composer, click the + New in the top left of the screen.   Select Thing Template in the dropdown.   In the name field, enter Fizos.Logstics. All of our product line will fit this abstract entity. For the Base Thing Template field, select GenericThing. Set the Project (ie, PTCDefaultProject) and click Save to store all changes.   Add the list of Services below. The level of complexity in these Services varies based on how you would like to start your daily routine, the number of employees, number of deliveries and facilities, etc. 
Name                                    Return Type   Override    Async     Description PerformDailyDeliveries Nothing Yes Yes Start process of regular product deliveries.   The list of services should look like the following:     Click here to view Part 3 of this guide.
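The Data Tables created above start out empty. When you are ready to load some test records into them (later guides in this Learning Path may handle this differently), the usual Foundation pattern is the built-in CreateValues and AddDataTableEntry Services. Below is a minimal sketch for Fizos.Customers.DataTable; the field values are hypothetical and the Factors tag field is left unset:

// Sketch: insert one sample customer row into Fizos.Customers.DataTable.
var table = Things["Fizos.Customers.DataTable"];
var values = table.CreateValues();   // empty InfoTable shaped by Fizos.Customers.DataShape
values.ID = 1;                       // hypothetical values from here down
values.UUID = "00000000-0000-0000-0000-000000000001";
values.Type = "Individual";
values.Name = "Sample Customer";
values.Email = "sample@example.com";
values.Address = "1 Example Street";
values.Phone = "555-0100";
table.AddDataTableEntry({
    values: values,                  // the row to insert
    source: me.name,                 // record which Thing added the entry
    sourceType: "Thing"
});

The same pattern applies to Fizos.Vehicles.DataTable and Fizos.Factories.DataTable, substituting their own Data Shape fields.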
  Step 4: Write Data to External Database   You’ve connected to the database, you’re able to query the database. Now let’s handle inserting new data into the database. The update statements and data shown below are based on the table scripts provided in the download. Examples of how the ThingWorx entity should look can be seen in the SQLServerDatabaseController and OracleDatabaseController entities that you've downloaded   Running an Insert   Follow the steps below to set up a helper service to perform queries for the database. While other services might generate the query to be used, this helper service will be your shared execution service. In the DatabaseController entity, go to the Services tab.     2. Create a new service of type SQL (Command) called RunDatabaseCommand. 3. Keep the Output as Integer. 4. Add the following parameter:   Name Base Type Required command String True         5. Add the following code to your new service:   <<command>>       6. Click Save and Continue. Your service signature should look like the below example.     You now have a service that can run commands to the database. Run your service with a simple insert.   There are two ways to go from here. You can either query the database using services that call this service, or you can create more SQL Command services that query the database directly. Let’s go over each method next, starting with a service to call the helper.   In the Services tab of the DatabaseController entity, create a new service of type JavaScript. Name the service JavaScriptInsert_PersonsTable. Set the Output as InfoTable, but do not set the DataShape for the InfoTable. Add the following code to your new service: try { var command = "INSERT INTO Persons (person_key, person_name_first, person_name_last, person_email, person_company_name, " + "person_company_position, person_addr1_line1, person_addr1_line2, person_addr1_line3, person_addr1_city, person_addr1_state, " + "person_addr1_postal_code, person_addr1_country_code, person_addr1_phone_number, person_addr1_fax_number, person_created_by, " + "person_updated_by, person_created_date, person_updated_date) VALUES ('" + key + "', '" + name_first + "', '" + name_last + "', '" + email + "', '" + company_name + "', '" + company_position + "', '" + addr1_line1 + "', '" + addr1_line2 + "', '" + addr1_line3 + "', '" + addr1_city + "', '" + addr1_state + "', '" + addr1_postal_code + "', '" + addr1_country_code + "', '" + addr1_phone_number + "', '" + addr1_fax_number + "', '" + created_by + "', '" + updated_by + "', '" + created_date + "', '" + updated_date + "')"; logger.debug("DatabaseController.JavaScriptInsert_PersonsTable(): Query - " + command); var result = me.RunDatabaseCommand({command: command}); } catch(error) { logger.error("DatabaseController.JavaScriptInsert_PersonsTable(): Error - " + error.message); }         5. Add the following parameter:   Name Base Type Required key String True name_first String True name_last String True company_name String True company_position String True addr1_line1 String True addr1_line2 String True addr1_line3 String True addr1_city String True addr1_state String True addr1_postal_code String True addr1_country_code String True addr1_phone_number String True addr1_fax_number String True created_by String True updated_by String True created_date String True updated_date String True         6. Click Save and Continue.   
Any parameter, especially those that were entered by users, that is being passed into a SQL Statement using the Database Connectors should be fully validated and sanitized before executing the statement! Failure to do so could result in the service becoming an SQL Injection vector.   Now, let's utilize a second method to create a query directly to the database. You can use open and close brackets for parameters for the insert. You can also use <> as a method to mark a value that will need to be replaced. As you build your insert statement, use [[Parameter Name]] for parameter/variable substitution and <<string replacement>> for string substitution.   1. In the Services tab of the DatabaseController entity, create a new service of type SQL (Command).   2. Name the service SQLInsert_PersonsTable. 3. Add the following code to your new service:

INSERT INTO Persons (person_key, person_name_first, person_name_last, person_email, person_company_name, person_company_position, person_addr1_line1, person_addr1_line2, person_addr1_line3, person_addr1_city, person_addr1_state, person_addr1_postal_code, person_addr1_country_code, person_addr1_phone_number, person_addr1_fax_number, person_created_by, person_updated_by, person_created_date, person_updated_date)
VALUES ([[key]], [[name_first]], [[name_last]], [[email]], [[company_name]], [[company_position]], [[addr1_line1]], [[addr1_line2]], [[addr1_line3]], [[addr1_city]], [[addr1_state]], [[addr1_postal_code]], [[addr1_country_code]], [[addr1_phone_number]], [[addr1_fax_number]], [[created_by]], [[updated_by]], [[created_date]], [[updated_date]]);

4. Add the following parameter:   Name Base Type Required key String True name_first String True name_last String True company_name String True company_position String True addr1_line1 String True addr1_line2 String True addr1_line3 String True addr1_city String True addr1_state String True addr1_postal_code String True addr1_country_code String True addr1_phone_number String True addr1_fax_number String True created_by String True updated_by String True created_date String True updated_date String True         5. Click Save and Continue.   Examples of insert services can be seen in the provided downloads.     Step 5: Executing Stored Procedures   There will be times when a singular query will not be enough to get the job done. This is when you'll need to incorporate stored procedures into your database design.   ThingWorx is able to use the same SQL Command when executing a stored procedure with no data return and a SQL query when executing a stored procedure with an expected result set. Before executing these services or stored procedures, ensure they exist in your database. They can be found in the example file provided.   Execute Stored Procedure   Now, let's create the service to handle calling/executing a stored procedure.   If you are expecting data from this stored procedure, use EXEC to execute the stored procedure. If you only need to execute the stored procedure and do not expect a result set, then using the EXECUTE statement is good enough. You're also able to use the string substitution similar to what we've shown you in the earlier steps.   In the DatabaseController entity, go to the Services tab. Create a new service of type SQL (Command) called RunAssignStudentStoredProcedure. Add the following parameter:   Name Base Type Required student_key String True course_key String True         4. 
Add the following code to your new service:

EXECUTE AddStudentsToCourse @student_key = N'<<student_key>>', @course_key = N'<<course_key>>';

You can also perform this execute in a service based on JavaScript using the following code:

try {
    var command = "EXECUTE AddStudentsToCourse " +
        " @student_key = N'" + student_key + "', " +
        " @course_key = N'" + course_key + "'";
    logger.debug("DatabaseController.RunAssignStudentStoredProcedure(): Command - " + command);
    var result = me.RunDatabaseCommand({command: command});
} catch(error) {
    logger.error("DatabaseController.RunAssignStudentStoredProcedure(): Error - " + error.message);
}

5. Click Save and Continue.   Execute Stored Procedure for Data   Let's create the entity you will use for both methods. This can be seen in the example below:     In the DatabaseController entity, go to the Services tab. Create a new service of type SQL (Query) called GetStudentCoursesStoredProcedure. Set the Output as InfoTable, but do not set the DataShape for the InfoTable. Add the following parameter:   Name Base Type Required course_key String True         5. Add the following code to your new service:

EXEC GetStudentsInCourse @course_key = N'<<course_key>>'

You can also perform this execute in a service based on JavaScript using the following code:

try {
    var query = "EXEC GetStudentsInCourse " +
        " @course_key = N'" + course_key + "'";
    logger.debug("DatabaseController.GetStudentCoursesStoredProcedure(): Query - " + query);
    var result = me.RunDatabaseQuery({query: query});
} catch(error) {
    logger.error("DatabaseController.GetStudentCoursesStoredProcedure(): Error - " + error.message);
}

6. Click Save and Continue.   You've now created your first set of services used to call stored procedures for data. Of course, these stored procedures will need to be in the database before they can successfully run.     Click here to view Part 3 of this guide.
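As the warning earlier in this guide notes, every value that ends up inside one of these string-built commands should be validated first. The exact rules depend on your schema and security requirements, but a minimal sketch of the kind of whitelist check you might add at the top of a JavaScript wrapper service (the assertSafe helper below is hypothetical, not part of the downloaded entities) looks like this:

// Sketch: reject obviously unsafe input before it is concatenated into a SQL string.
function assertSafe(paramName, value) {
    if (value === undefined || value === null || value === "") {
        throw "Missing required parameter: " + paramName;
    }
    // Whitelist letters, digits, spaces, and a few punctuation characters;
    // quotes and semicolons are rejected. Loosen this only deliberately.
    if (!/^[A-Za-z0-9 @._\-]+$/.test(value)) {
        throw "Unsafe characters in parameter: " + paramName;
    }
}

assertSafe("student_key", student_key);
assertSafe("course_key", course_key);

This does not replace proper parameterization, but it narrows the window for SQL injection when string substitution is unavoidable.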
  Learn how to store and display medical device data for a Service opportunity.   Guide Concept   In this guide, you’ll learn how to combine information from multiple Edge devices into a single, logical Thing.   You’ll then create a GUI to display this combined information (as well as retrieve new information on demand) to facilitate a “Medical Service Play”.     You'll learn how to   Create a Data Shape and Info Table Property to store Medical Data Create a Service to combine data from multiple Edge devices into a single, logical Thing Create a Mashup to view and retrieve Medical data   NOTE:  The estimated time to complete ALL parts of this guide is 60 minutes     Step 1: Medical Learning Path   So far in this Learning Path, you've been able to connect both an embedded controller (simulated by a Raspberry Pi) and a PC to ThingWorx Foundation.   This is important, as medical devices can be complicated pieces of technology controlled by multiple "intelligent" subsystems. It is not always practical (or desirable) to have these subsystems communicate with each other, even if they all need to work together to function optimally.   Fortunately, Foundation has the capability to combine relevant data from multiple Edge devices into a single, logical Thing.   In this step of the Learning Path, you will do just that to facilitate a "Service Play".   The scenario is that your company manufactures and services Magnetic Resonance Imaging (MRI) devices.     Rather than simply servicing the MRI when there is an issue (and after a very expensive device has been damaged from something going wrong), your company maintains a continuing service contract with the hospital to monitor the MRI and perform preventative maintenance on it BEFORE anything goes wrong.   The value proposition to the hospital is the ability to keep their expensive investment in perfect operating order rather than suffering an unexpected failure. In turn, your company reaps the benefits of receiving a steady source of income from said service contract.   In order to achieve this level of preventative maintenance, your company needs to constantly monitor the MRI's various functions.   The MRI is composed of multiple working parts, but for this scenario, we'll limit our Minimum Viable Product (MVP) Service Application to the following:   An embedded device which monitors various hardware elements (such as magnet temperature and remaining coolant) A Windows PC (common in hospital equipment) which is used by a Medical Technician to control the MRI In particular, it's important to note that this PC has access to patient medical data, which can be subject to Health Insurance Portability and Accountability Act (HIPAA) violations. Fortunately here again, it's possible to segregate this information into protected and non-protected sections to limit your company's liability. We'll only propogate non-protected, generalized information to Foundation, such as the total number of scans that have been run thus far. This can facilitate your company getting a connected device approved by a hospital worried about HIPAA-compliance.   To help you run through this guide, we'll also utilize a pair of "simulators" to mimic data coming from the EMS-es on the Pi and PC, rather than directly taking information from them. So if you had any issues with the previous steps getting the EMS running on these devices, you can still complete this guide without issue.     
Step 2: Import Simulators   As mentioned, we'll be using a pair of simulators to mimic connections to both an embedded microcontroller and a PC, both of which are part of the MRI.   Perform the following steps to import the simulators.   Download and unzip the MDSD_Entities.zip file attached to this article. In the bottom-left of Foundation Composer, click Import/Export.   Click Import.   On the Import pop-up, click Browse. Navigate to the download location and select the MDSD_Entities.twx file.   Click Open.   Click Import.   Click Close.   Note how there are now several MDSD Entities. These represent Things connected to different parts of the MRI which are communicating to Foundation via the EMS Agent (as per the previous guides in this Learning Path).   Investigate Simulators   Since there are multiple sub-systems which are all communicating directly to Foundation (but we really only want to check the status of the MRI as one logical entity), we need to know what exactly is being communicated back.   What is being communicated was likely determined by your company's Edge Developers when they implemented the EMS agent on said sub-systems.   Perform the following steps to investigate the simulators.   Click MDSD_Embedded_Thing.   Click Properties and Alerts.   So the embedded microcontroller has sensors which are tracking the following:    Property                           Units                       Description Coolant Percent Percent Amount of coolant left to refrigerate the super-conducting magnets Field Strength Tesla Strength of the magnetic field Magnet Temperature Degrees Celsius Temperature of the magnets   Next, let's look at the other simulator. Return to Browse > All. Click MDSD_PC_Thing. Click Properties and Alerts.   The PC is tracking information from the MRI controlling software, including the following:   Property Units  Description Number of Scans Scans Aggregate count of all scans the MRI has performed since last reset SSD Space Open Megabytes Amount of space left on the hard-drive Unused RAM Megabytes Amount of RAM still available to the system   Both of these remote devices are communicating valuable information.   The embedded microcontroller is feeding us information about the hardware of the MRI itself. Refilling coolant as needed is likely one of the service contract stipulations and will be a regular service need. And a Field Strength drop or Magnet Temperature rise could indicate more significant issues which could damage the MRI if not promptly addressed.   The PC gives us information about the operation of the MRI. Hard-drive or RAM running low could indicate that the hardware specs of the PC need to be upgraded, while the Number of Scans can give us a rough estimate of how much refrigerant should have been used versus the currently available level. If only a handful of Scans have been run, but the amount of coolant has dropped by a significant amount, then there could be a leak somewhere which would need to be addressed.   The combination of the Embedded and PC data is enough for us to begin working on a constantly-monitoring application which will facilitate our Service Play.       Step 3: Create Data Shape   Now that we're aware of what information is coming from the separate sub-systems of the MRI, we need to combine them into a single, logical Thing for the purposes of a Service Play.   To do so, we'll create a Thing which represents the MRI as a whole. Additional sub-systems can be added to to the collective information of this Thing as is necessary. 
But, as stated, this is simply an MVP, so we'll stick to the embedded and PC sub-systems at present.   A good way to aggregate multiple data points into a single item is via an Info Table Property. However, any time you create an Info Table, you also need a Data Shape to format the "columns" of the spreadsheet-like Property.   For more information on the storage of "mass-data", please refer to the Methods for Data Storage guide.   Perform the following steps to create a Data Shape.   Click Browse > Modeling > Data Shapes.   Click + New. In the Name field, type MDSD_DataShape.   If Project is not already set, search for and select PTCDefaultProject. At the top, click Field Definitions.   Click + Add.   Embedded Definitions   We now want to add Field Definitions to the Data Shape which map the information that we want to collate between the various sub-systems.   We'll start with the Embedded Microcontroller Properties.   On the far-right in the Name field, type Coolant_Percent. Change the Base Type to Number.   At the top-right, click the "Check with a +" for Done and Add. In the Name field, type Field_Strength. Change the Base Type to Number.   Click the "Check with a +" for Done and Add. In the Name field, type Magnet_Temperature. Change the Base Type to Number.   Click the "Check" button for Done. At the top, click Save.   PC Definitions   Additional Definitions can be added at any time as other sub-systems are included.   We'll now include the PC Properties.   Click + Add. In the Name field, type Number_of_Scans. Change the Base Type to Number.   Click the "Check with a +" for Done and Add. In the Name field, type SSD_Space_Open. Change the Base Type to Number.   Click the "Check with a +" for Done and Add. In the Name field, type Unused_RAM. Change the Base Type to Number.   Click the "Check" button for Done. At the top, click Save.   Timestamp   It might also be beneficial to include a Timestamp of when the values of both the embedded microcontroller and PC were added.   Click + Add. In the Name field, type Timestamp. Change the Base Type to DATETIME.   Click the Check button for Done. Click Save.     Click here to view Part 2 of this guide.
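If you would like to sanity-check the new Data Shape before continuing to Part 2, an optional test Service (a sketch only; it is not one of this guide's steps) can build an Info Table from MDSD_DataShape and add one row. The field names below are exactly the ones defined above; the values are made up:

// Optional sketch: build an InfoTable from MDSD_DataShape and add a single test row.
var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "InfoTable",
    dataShapeName: "MDSD_DataShape"
});
result.AddRow({
    Coolant_Percent: 97.5,        // hypothetical sensor values
    Field_Strength: 3.0,
    Magnet_Temperature: 4.2,
    Number_of_Scans: 120,
    SSD_Space_Open: 256000,
    Unused_RAM: 8192,
    Timestamp: new Date()         // DATETIME field
});
// Set the Service Output to INFOTABLE to see the row when you execute it.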
  Step 4: Simulated Model   Models are primarily used by Analytics Manager (which will be discussed in the next guide), but they can still be used to estimate the accuracy of predictions.   When Models are calculated, they inherently withhold a certain amount of data (~20%). The prediction model is then run against the withheld data. This provides a form of "accuracy measure".   The withheld-data is selected randomly, so you'll actually get a slightly different model and accuracy measure each time that you create a Model versus the same dataset.   On the left, click Analytics Builder > Models.   Click New….   In the Model Name field, enter simulated_model. In the Dataset field, select simulated_dataset.   Click Submit. After ~60 seconds, the Model Status will change to COMPLETED.     Select the model that was created in the previous step, i.e. simulated_model. Click View… to open the Model Information page.   As with Signals and Profiles, our Model is once again "too good". In fact, it's perfect.   The expected "Precision" is 1.0, i.e. 100%. The True vs False Positive rate shown in the graph goes straight up to the top immediately.   While you want a graph that is "high and left", you're very unlikely to ever see real-world scenarios such as shown here.   Still, you've been able to progress the process of using Foundation (and now Analytics) to build an analytical model of MotorCo's prototype engine.   What needs to happen now is to receive real world data from your R&D engineers.     Step 5: Upload Real World Data   In your process of using the EMS Engine Simulator, the idea has always been to get a headstart on the engine developers.   At some point, they would wire sensors into the EMS and start providing real world data.   In our scenario, that has now happened. Real world data is being fed from the EMS to Foundation, Foundation is collecting that data in an Info Table Property, and you've even exported the data as a .csv. file.   This new dataset is over periods of both good and bad grease conditions. The engineers monitoring the process can flip a sensor switch connected to the EMS to log the current grease situation as either good or bad at the same time that the vibration sensors are taking readings.   We will now load this real world dataset into Analytics in the same manner that we did earlier with the simulated dataset.   Download the attached analytics_vibration.zip file to your computer. Unzip the analytics_vibration.zip file to access the vibration_data_and_header.csv and vibration_metadata.json files. On the left, click Analytics Builder > Data. Under Datasets, click New....   In the Dataset Name field, enter vibration_dataset. In the File Containing Dataset Data section, search for and select vibration_data_and_header.csv. In the File Containing Dataset Field Configuration section, search for and select vibration_metadata.json.   Click Submit.     Step 6: Real World Signals and Profiles   Now that the real-world vibration data has been uploaded, we’ll re-run Signals and Profiles just as we did before.   Hopefully, we’ll start seeing some patterns.   On the left, click Analytics Builder > Signals. At the top, click New….   In the Signal Name field, enter vibration_signal. In the Dataset field, select vibration_dataset.   Click Submit. Wait ~30 seconds for Signal State to change to COMPLETED     The results show that the five Frequency Bands for Sensor 1 are the most highly correlated with determining our goal of detecting a low grease condition.   
For Sensor 2, only bands one and four seem to be related, while bands two, three, and five are hardly relevant at all.   This is a very different result than our earlier simulated data. Instead, it looks like it's possible that the vibration frequencies getting picked up by our first sensor are explicitly more important.   Profiles   We'll now re-run Profiles with our real-world dataset. On the left, click Analytics Builder > Profiles. Click New….   In the Profile Name field, enter vibration_profile. In the Dataset field, select vibration_dataset.   Click Submit. After ~30 seconds, the Profile State will change to COMPLETED.     The results show several Profiles (combinations of data) that appear to be statistically significant.   Only the first few Profiles, however, have a significant percentage of the total number of records. The later Profiles can largely be ignored.   Of those first Profiles, Frequency Bands from both Sensor 1 and Sensor 2 appear.   But in combination with the result from Signals (where Sensor 1 was always more important), this could possibly indicate that Sensor 1 is still the most important overall.   In other words, since Sensor 1 is statistically significant both by itself and in combination (but Sensor 2 is only significant in combination with Sensor 1), then Sensor 2 may not be necessary.     Click here to view Part 3 of this guide.
Generate engine-failure predictions and gain insight into your data with machine learning.   GUIDE CONCEPT   This guide will upload captured data from an Edge MicroServer (EMS) "Engine Simulator" to ThingWorx Analytics Builder.   Following the steps in this guide, you will create an analytical model, and then refine it based on further information from the Analytics platform.   We will teach you how to determine whether or not a model is accurate and how you can optimize both your data inputs and the model itself.   NOTE: This guide's content aligns with ThingWorx 9.3. The estimated time to complete ALL parts of this guide is 60 minutes.     YOU'LL LEARN HOW TO   Load an IoT dataset Generate machine learning predictions Evaluate the analytics output to gain insight     Step 1: Scenario   In this guide, we're continuing the same MotorCo scenario, where an engine can fail catastrophically in a low-grease condition.   In previous guides, you've gathered and exported engine vibration-data from an Edge MicroServer (EMS).   The goal of this guide is to now import that previously-exported Comma-Separated Values (.csv) data into ThingWorx Analytics, and then create an analytical model for predictive maintenance.   Analytical model creation can be extremely helpful for the automotive segment in particular. For instance, each car that comes off the factory line could have an EMS constantly sending data from which an analytical model could automatically detect engine trouble.   This could enable your company to offer an engine monitoring subscription service to your customers.   This guide will show you how to build an analytic model of your engine to facilitate this monitoring service.     Step 2: Upload Simulated Data   This guide assumes that you are using either the hosted trial (which has both Foundation and Analytics pre-installed) or a combination of the Foundation and Analytics downloadable installers.   To confirm that Foundation is communicating with Analytics, perform the following steps:   On the ThingWorx Foundation left-side navigation column, click Analytics > Analytics Builder > Settings.   At the top-right in the Analytics Server Version field, ensure that you see an appropriate version number.     NOTE:  If you use your own dataset, it's possible that your results in the following steps will differ from those created by the provided dataset. If you were unable to generate a 30,000+ entry dataset in the last guide, then you may download testCSVfile.csv attached here instead. You will also need to download and extract vibration_metadata.zip, which describes each column of the dataset. On the left, click Analytics Builder > Data.   Under Datasets, click New....   In the Dataset Name field, enter simulated_dataset. In the File Containing Dataset Data section, search for and select testCSVfile.csv. In the File Containing Dataset Field Configuration section, search for and select vibration_metadata.json.   Click Submit. Note that the time it takes to import the dataset is determined by its size.       Step 3: Simulated Signals and Profiles    The Signals section of ThingWorx Analytics looks for the most statistically correlated single field in the dataset which relates to your selected goal.   This doesn't necessarily indicate that it is the cause of your goal, whether maximizing or minimizing. It just means that the dataset indicates that this single field happens to correlate with the goal that you desire.   On the left, click Analytics Builder > Signals.   At the top, click New….   
In the Signal Name field, enter simulated_signal. In the Dataset field, select simulated_dataset.   Click Submit. Wait ~30 seconds for the Signal State to change to COMPLETED.     Unfortunately, our results aren't very good. Or, more accurately, they're too good.   Our simulated dataset has some noise in it from adding random values to our five frequency bands on each of our two sensors. However, ThingWorx Analytics has instantly seen through that noise and discarded it. Instead, it's only detected that s2_fb5 isn't relevant.   If you look back at the Use the EMS to Create an Engine Simulator guide, you'll see that s2_fb5 has the same base value between both a "good grease" and a "bad grease" condition, i.e. a base of 190.   This does show already that Analytics is working, though. Since s2_fb5 didn't change between good and bad grease conditions, our Signal analysis is indicating that it's not relevant to our model.   Profiles   Now, let's do the same for a Profile.   The Profiles section of ThingWorx Analytics looks for combinations of data which are highly correlated with your desired goal.   On the left, click Analytics Builder > Profiles.   Click New....   In the Profile Name field, enter simulated_profile. In the Dataset field, select simulated_dataset.   Click Submit. Wait ~30 seconds for the Profile State to change to COMPLETED.     Just like with Signals, our Profile is too good. In fact, Analytics is indicating that just s1_fb2 by itself is the primary indicator of good vs. bad grease conditions.   This is likely due to random chance. The random noise added to s1_fb2 just happened to be slightly less than the other frequency bands, so everything else was discarded.   Regardless, ThingWorx Analytics is quickly seeing through our simulated data.   Next, we'll actually create a Model using the simulated dataset.     Click here to view Part 2 of this guide.  
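For orientation, the uploaded .csv is simply the exported Info Table rows plus a header. Assuming the column naming used elsewhere in this Learning Path (two sensors with five frequency bands each, plus the low_grease goal), the first lines of such a file would look roughly like the sketch below; the exact column order and the numeric values shown are purely hypothetical:

low_grease,s1_fb1,s1_fb2,s1_fb3,s1_fb4,s1_fb5,s2_fb1,s2_fb2,s2_fb3,s2_fb4,s2_fb5
0,161,180,190,176,193,142,166,175,189,190

vibration_metadata.json then describes how Analytics should treat each of these columns, such as which field is the goal and which are continuous inputs.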
    Use a Timer to record mass amounts of test data, and then export it as a Comma-Separated Values file.   GUIDE CONCEPT   Having an Edge MicroServer (EMS) Engine Simulator has allowed you to begin work on using ThingWorx Foundation for instrumenting a prototype engine.   However, the end goal is not to inspect the data manually, but to have ThingWorx Analytics perform an automatic notification for any issues.   In this guide, you’ll create a Timer to generate thousands of data points, and then export the dataset as a Comma-Separated Values (.csv) file for future use in building an analytical model of the engine.     YOU'LL LEARN HOW TO   Create a Timer Subscribe to a Timer to Trigger a Service Generate Mass Amounts of Test Data Import the CSV Parser Extension Create a File Repository Export the Test Data as a Comma-Separated Values (.csv) file Download from a File Repository   NOTE:  The estimated time to complete all parts of this guide is 30 minutes.     Step 1: Scenario   In this guide, we're finishing up with the MotorCo scenario where an engine can fail catastrophically in a low-grease condition.   In previous guides, you've gathered and exported engine vibration-data from an Edge MicroServer (EMS) and used it to build an engine analytics model. You've even put that analytical model into service to give near-immediate results from current engine-vibration readings.   The goal of this guide is to create a GUI to visualize those predicted "low grease" conditions to facilitate customer warnings.     This is a necessary step, as the end-goal is to automate failure analysis by utilizing ThingWorx Analytics, which builds an analytical model by importing a .csv file with several thousand data points.   Data storage, export, and formatting in this manner can be extremely helpful for the automotive segment in particular. For instance, each car that comes off the factory line could have an EMS constantly sending data from which an analytical model could automatically detect engine trouble.   This could enable your company to offer an engine monitoring subscription service to your customers.   But to enable automatic comparison of engine data to an analytical model, you must first generate and format sample data to build said model, and this guide will show you exactly how to do that.     Step 2: Create a Timer   In the previous Use the EMS to Create an Engine Simulator guide, you ended up with an EMS engine simulator from which Foundation could capture individual readings and store them in an Info Table Property.   But for ThingWorx Analytics, we need thousands of data points, if not tens-of-thousands.   Manually triggering the Service to generate that many data points would be tedious.   Instead, we'll create a Timer Thing off which we can trigger the automatic calling of the data-capture Service.   This guide assumes that you have already completed the Use the Edge MicroServer (EMS) to Connect to ThingWorx and Use the EMS to Create an Engine Simulator guides and have a working, active connection from the EMS Engine Simulator to ThingWorx Foundation.       1. Return to the ThingWorx Foundation Browse > All navigation.          2. Click MODELING > Timers.       3, Click + New.       4. On the Choose Template pop-up, select Timer and click OK.     5. In the Name field, type ESDS_Timer.       6. If Project is not already set, search for and select PTCDefaultProject.        7. In the Run As User field, search for and select Administrator.       8. On the Warning pop-up, click Yes.   
Note that the Administrator user should only be utilized for testing and never in a production system.

9. Set the Update Rate to 2000. The EMS updates values around every second, i.e. 1000ms, so we want a timer interval longer than that.
10. At the top, click Save.

Step 3: Subscribe to the Timer

Now that we have a Timer, we can use its 2000ms (two-second) Event generation to trigger something else.

In this case, we'll use it to trigger the data-capture Service we created in the previous guide.

1. Click Browse > MODELING > Things.
2. Open EdgeThing and click Properties and Alerts.
3. Scroll down past the custom Properties to the Inherited Properties.
4. Under the Value column, ensure that isConnected is checked. If not, return to the previous guides and confirm that your EMS engine simulator is running.

Having ensured that the EMS engine simulator is still providing values to ThingWorx Foundation, we now want to create a Subscription, which will trigger off our earlier Timer.

1. At the top, click Subscriptions.
2. Click + Add.
3. In the Name field, type ESDS_Timer_Subscription.
4. Under Source, select the Other entity radio-button.
5. In the Search Entities field, search for and select ESDS_Timer.
6. Check the Enabled box.
7. Expand the Inputs section.
8. In the Select an Event field, search for and select Timer.
9. Expand the Me/Entities section.
10. Expand the Services sub-section.
11. Scroll down until you find the custom recordService, and click the right-arrow beside it.
12. Click Save and Continue. Note that you should NOT click the top "Save", as that will erase the Subscription.

Step 4: Data Acquisition

With the progress so far, another entry is captured and added to the Info Table Property every two seconds. We'll confirm that now.

The longer you let the Subscription run, the more entries will be automatically captured in the infoTableProperty. ThingWorx Analytics can use this information to build an analytical model.

To do so, though, it needs thousands of entries. For example, we've gotten good model results with 30,000 data points. In general, more is better.

As such, your Subscription would need to run until you have gathered 30,000 entries in the infoTableProperty. Unfortunately, this can take roughly 15-16 hours.

You can simply let the Timer run for a short time and then continue with this guide immediately.

1. At the top, click Properties and Alerts.
2. Click the Refresh button several times. Note that both the identifier Property and the count of the number of entries in the infoTableProperty are continually increasing.
3. Under the Value column, click the "pencil icon" for infoTableProperty to select Set value of property. It may take a few moments for the pop-up to load.
4. Note the various values coming from the EMS engine simulator.
5. At the top-right of the pop-up, click the X button.

Stop Data Gathering

After achieving the dataset size you desire, you should stop gathering to prevent your dataset from growing arbitrarily large.

1. At the top of EdgeThing, click Subscriptions.
2. If it is not already expanded, click ESDS_Timer_Subscription.
3. Expand Subscription Info.
4. Uncheck the Enabled box.
5. Click Save and Continue.

Click here to view Part 2 of this guide.
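For reference, the data-capture Service that this Subscription calls (the recordService created in the previous guide) follows a simple pattern: read the current Property values and append them as one new row of the Info Table Property. A minimal sketch of that pattern is shown below; the exact field names (identifier, low_grease, s1_fb1, and so on) are assumptions based on the engine-simulator guides, so substitute the fields of your own Data Shape.

// Append one row of current readings to the Info Table Property.
// Field names below are assumptions -- match them to your Data Shape.
var newRow = {
    identifier: me.identifier,
    low_grease: me.low_grease,
    s1_fb1: me.s1_fb1,
    s1_fb2: me.s1_fb2,
    s1_fb3: me.s1_fb3,
    s1_fb4: me.s1_fb4,
    s1_fb5: me.s1_fb5
};

// AddRow appends the row; the Timer Subscription calls this every 2000 ms.
me.infoTableProperty.AddRow(newRow);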
Build a Predictive Analytics Model Guide Part 2

Step 5: Profiles

The Profiles section of ThingWorx Analytics looks for combinations of data which are highly correlated with your desired goal.

1. On the left, click ANALYTICS BUILDER > Profiles.
2. Click New.... The New Profile pop-up will open. NOTE: Notice the Text Data Only section, which is new in ThingWorx 9.3.
3. In the Profile Name field, enter vibration_profile.
4. In the Dataset field, select vibration_dataset.
5. Leave the Goal field set to the default of low_grease.
6. Leave the Filter field set to the default of all_data.
7. Leave the Excluded Fields from Profile field set to the default of empty.
8. Click Submit.
9. After ~30 seconds, the Profile State will change to COMPLETED. The results will be displayed at the bottom.

The results show several Profiles (combinations of data) that appear to be statistically significant. Only the first few Profiles, however, cover a significant percentage of the total number of records. The later Profiles can largely be ignored.

Of those first Profiles, Frequency Bands from both Sensor 1 and Sensor 2 appear. But in combination with the result from Signals (where Sensor 1 was always more important), this could possibly indicate that Sensor 1 is still the most important overall. In other words, since Sensor 1 is statistically significant both by itself and in combination (but Sensor 2 is only significant in combination with Sensor 1), Sensor 2 may not be necessary.

Step 6: Create Model

Models are primarily used by Analytics Manager (which is beyond the scope of this guide), but they can still be used to measure the accuracy of predictions. When Models are calculated, they inherently withhold a certain amount of data. The prediction model is then run against the withheld data. This provides a form of "accuracy measure", which we'll use to determine whether Sensor 2 is necessary to the detection of a low grease condition by creating two different Models. The first Model (which you will create below) will contain all the data, while the second Model (in the next step) will exclude Sensor 2.

1. On the left, click ANALYTICS BUILDER > Models.
2. Click New…. The New Predictive Model pop-up will open.
3. In the Model Name field, enter vibration_model.
4. In the Dataset field, select vibration_dataset.
5. Leave the Goal field set to the default of low_grease.
6. Leave the Filter field set to the default of all_data.
7. Leave the Excluded Fields from Model section at its default of empty.
8. Click Submit.
9. After ~60 seconds, the Model Status will change to COMPLETED.

View Model

Now that the prediction model is COMPLETED, you can view the results.

1. Select the model that was created in the previous step, i.e. vibration_model.
2. Click View… to open the Model Information page.
3. Review the visualization of the validation results. Note that your results may differ slightly from the picture, as the automatically-withheld "test" portion of the dataset is randomly chosen.
4. Click on the ? icon to the right of the chart for details on the information displayed.

The desired outcome is for the model to have a relatively high level of accuracy. The True Positive Rate shown on the Receiver Operating Characteristic (ROC) chart is much higher than the False Positive Rate. The curve is relatively high and to the left, which indicates a high accuracy level.
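To make the connection between the ROC chart and the Confusion Matrix (described next) concrete, the chart's rates can be computed directly from the four Confusion Matrix counts. The sketch below uses made-up counts purely for illustration; your own model will report different numbers.

// Hypothetical Confusion Matrix counts, for illustration only
var truePositives = 1450;   // low_grease predicted true, actually true
var falseNegatives = 50;    // low_grease predicted false, actually true
var trueNegatives = 1400;   // low_grease predicted false, actually false
var falsePositives = 100;   // low_grease predicted true, actually false

// True Positive Rate (sensitivity): fraction of real low-grease cases the Model catches
var truePositiveRate = truePositives / (truePositives + falseNegatives);   // ~0.97

// False Positive Rate: fraction of healthy cases incorrectly flagged
var falsePositiveRate = falsePositives / (falsePositives + trueNegatives); // ~0.07

// Overall accuracy: fraction of all predictions that are correct
var accuracy = (truePositives + trueNegatives) /
    (truePositives + trueNegatives + falsePositives + falseNegatives);     // 0.95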
You may also click on the Confusion Matrix tab in the top-left, which will show you the number of True Positives and True Negatives in comparison to False Positives and False Negatives.

Note that the number of correct predictions is much higher than the number of incorrect predictions.

As such, we now know that our Sensors have a relatively good chance at predicting an impending failure by detecting low grease conditions before they cause catastrophic engine failure.

Step 7: Refine Model

We will now try comparing this first Model that includes both Sensors to a simpler Model using only Sensor 1. We do this because we suspect that Sensor 2 may not be necessary to achieve our goal.

1. On the left, click ANALYTICS BUILDER > Models.
2. Click New….
3. In the Model Name field, enter vibration_model_s1_only.
4. In the Dataset field, select vibration_dataset.
5. Leave the Goal field set to the default of low_grease.
6. Leave the Filter field set to the default of all_data.
7. On the right beside Excluded Fields from Model, click the Excluded Fields button. The Fields To Be Excluded From Job pop-up will open.
8. Click s2_fb1 to select the first Sensor 2 Frequency Band.
9. Select the rest of the Frequency Bands through s2_fb5 to choose all of the Sensor 2 frequencies.
10. While all the s2 values are selected, click the green "right arrow", i.e. the > button in the middle.
11. At the bottom-left, click Save. The Fields To Be Excluded From Job pop-up will close.
12. Click Submit.
13. After ~60 seconds, the Model Status will change to COMPLETED.
14. With vibration_model_s1_only selected, click View....

The ROC chart is comparable to the original model (including Sensor 2). Likewise, the Confusion Matrix (on the other tab) indicates a good ratio of correct predictions versus incorrect predictions.

NOTE: Your final scores may vary slightly from these Models, as the data withheld for evaluation versus used for prediction is chosen randomly.

The ThingWorx Analytics Models have indicated that you are likely to receive roughly the same accuracy in predicting a low-grease condition whether you use one sensor or two! If we can get an accurate early warning of the low grease condition with just one sensor, it then becomes a business decision as to whether the extra cost of Sensor 2 is necessary.

Step 8: Next Steps

Congratulations! You've successfully completed the Build a Predictive Analytics Model guide, and learned how to:

Load an IoT dataset
Generate machine learning predictions
Evaluate the analytics output to gain insight

This is the last guide in the Getting Started on the ThingWorx Platform learning path.

This is the last guide in the Monitor Factory Supplies and Consumables learning path.

The next guide in the Design and Implement Data Models to Enable Predictive Analytics learning path is Operationalize an Analytics Model.

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource Link
Support Analytics Builder Help Center
Step 4: Install and Configure

Before you install the plugin, ensure that the software requirements are met for proper installation of the plugin.

1. Open the Eclipse IDE and choose a suitable directory as a workspace.
2. Go to the menu bar of the Eclipse window and select Help->Install New Software…
3. After the Install window opens, click Add to add the Eclipse Plugin repository.
4. Click Archive… and browse to the directory where the Eclipse Plugin zip file is stored and click Open. NOTE: Do not extract this zip file.
5. Enter a name (for example, Eclipse Plugin).
6. Click OK.
7. Ensure that the Group items by category checkbox is not selected.
8. Select ThingWorx Extension Builder in the items list of the Install window.
9. Click Next and the items to be installed are listed.
10. Click Next and review the license agreement.
11. Accept the license agreement and click Finish to complete the installation process.
12. If a warning for unsigned content is displayed, click OK to complete the installation process.
13. Restart Eclipse.
14. When Eclipse starts again, ensure that you are in the ThingWorx Extension perspective. If not, select Window->Perspective->Open Perspective->Other->ThingWorx Extension, then click OK.

NOTE: Opening any item from File->New->Other…->ThingWorx will also change the perspective to ThingWorx Extension.

You are ready to start a ThingWorx Extension Project!

Step 5: Create Extension Project

In this tutorial, you will create a ThingWorx extension that performs authentication based on your security needs.

1. While in the ThingWorx Extension Perspective, go to File->New->Project.
2. Click ThingWorx->ThingWorx Extension Project.
3. Click Next.
4. Enter the Project Name (for example, AuthenticatorExample).
5. Select Gradle or Ant as your build framework.
6. Enter the SDK location by browsing to the directory where the Extension SDK is stored.
7. Enter the Vendor information (for example, ThingWorx Labs).
8. Set the JRE version to 1.8.
9. Click Finish.

Your newly created project is added to the Package Explorer tab. The information from ThingWorx Extension Properties is used to populate the metadata.xml file in your project. The metadata.xml file contains information about the extension and details for the various artifacts within the extension. The information in this file is used in the import process in ThingWorx to create and initialize the entities.

Create New Authenticator

1. Select your project and click New -> Authenticator to create a new Authenticator.
2. In the new window, enter AwesomeCustomAuthenticator for the Name.
3. If no Server is available, create a Server using any available option. You will not need it for this guide, but the Server option may be useful later.
4. Enter a description for your Authenticator, such as Sample Authenticator that validates against the Thingworx User.
5. Select Finish.

You will be able to check these settings within the metadata.xml file inside the configfiles directory.

You will now need to add the stubs for the authenticate, issueAuthenticationChallenge, and matchesAuthRequest methods. See below for sample code and descriptions.

Method Description
Constructor Needed to instantiate new objects of type AwesomeCustomAuthenticator. Instance member data/variable values in your Authenticator will not be available across requests.
initializeEntity This method is called when the Authenticator Thing is saved in ThingWorx Composer, e.g. saving configuration table updates.
authenticate The logic/implementation that is used to authenticate an HTTP request.
issueAuthenticationChallenge Handles logic which follows authentication failure (e.g. logging an error).
matchesAuthRequest This method determines if this Authenticator is valid for the authentication request type.

Below is more information about each of these methods and some example source code.

Constructor:

A new instance of the custom Authenticator class is created to handle each new HTTP request for authentication. Upon importing a custom Authenticator extension, that Authenticator is registered in the AuthenticatorManager and can be managed in the ThingWorx Composer with the other system authenticators. When that custom Authenticator is enabled, it will be used in conjunction with the other configured Authenticators to attempt to authenticate HTTP requests. Any static data shared across authentication instances should be thread safe. It is best to avoid putting much logic here, even calls to get configuration or instance data (use the authenticate method instead).

initializeEntity:

Read configuration data into properties as needed for Authenticator challenges. For example, write the LDAP server address to a static property so it is available to all future instances during Authenticator challenges. This would be a way to ensure the LDAP server location is configurable from within ThingWorx Composer. It is best to update this only once (e.g. when the first connection is made).

authenticate:

If the authentication logic/implementation fails to authenticate the HTTP request, due to an error in the logic or because the HTTP request contained invalid data that does not pass authentication, then this implementation should throw an exception.

Example code below:

@Override
public void authenticate(HttpServletRequest request, HttpServletResponse response) throws AuthenticatorException {
    String username = request.getHeader("User"), password = request.getHeader("Password");
    if(username.isEmpty() || password.isEmpty())
        throw new AuthenticatorException("User login info is empty");
    try {
        // This section logs the latest login time and login user to a thing called MyThing
        // Subscribing to these properties via DataChange event will allow this information to be stored
        Thing LoginHelper = (Thing) EntityUtilities.findEntity("MyThing", ThingworxRelationshipTypes.Thing);
        LoginHelper.setPropertyValue("LatestLoginUser", new StringPrimitive(username));
        LoginHelper.setPropertyValue("LatestLoginTime", new DatetimePrimitive(DateTime.now()));
        _logger.warn(DateTime.now() + " -- " + username + " login attempt");
        // Checks that user exists and is enabled; throws exception if can't validate
        // May want to create user in ThingWorx if they don't exist
        AuthenticationUtilities.validateEnabledThingworxUser(username);
        // Checks that user exists and validates credentials through all configured DirectoryServices
        // (one is the internal directory of ThingWorx users, one could be LDAP if configured);
        // throws exception if can't validate
        AuthenticationUtilities.validateCredentials(username, password);
        // REQUIRED: tells rest of ThingWorx which user is logged in for purposes of permissions, etc.
        this.setCredentials(username);
    } catch (Exception e) {
        // Wrap any validation or lookup failure so the request is rejected
        throw new AuthenticatorException("Unable to authenticate user " + username + ": " + e.getMessage());
    }
}

issueAuthenticationChallenge:

This may not be used at all, or it may be used for alerting or logging. It may also be used for constructing and sending a response to the client so the client can ask the user to enter credentials again (i.e. an authentication challenge).

matchesAuthRequest:

Example code below.
This sample code is for an Authenticator that automatically logs in the user with a default username/password when a specific URI is used in the web browser.

@Override
public boolean matchesAuthRequest(HttpServletRequest httpRequest) throws AuthenticatorException {
    String requestURI = httpRequest.getRequestURI();
    // Must access it from this URL and not from /Thingworx/Runtime/index.html#mashup=LogoutButtonMashup as
    // the Request URI in the latter case is always going to show as /Thingworx/Runtime/index.html
    if (requestURI.equals("/Thingworx/Mashups/LogoutButtonMashup"))
        return true;
    else
        return false;
}

Below is another example of how to implement the matchesAuthRequest method. Of course, it's not a safe method. Nevertheless, it provides insight into a different way to handle things.

@Override
public boolean matchesAuthRequest(HttpServletRequest request) throws AuthenticatorException {
    try {
        // DON'T DO THIS by itself -- getHeader returns null if it can't find the header,
        // so this is unsafe and may block other authenticators from attempting
        String value1 = request.getHeader("User");
        String value2 = request.getHeader("Password");
        // DO ADD THIS - this is safe
        // Optionally add some logging statement here to inform of missing headers
        if(value1 == null || value2 == null)
            return false;
        return true;
    } catch(Exception e) {
        // This won't normally hit. This is really for other, more complicated validation processes
        throw new AuthenticatorException("Missing headers");
    }
}

Step 6: Build Extension

You can use either Gradle or Ant to build your ThingWorx Extension Project.

Build Extension with Gradle

1. Right-click your project->Gradle (STS)->Tasks Quick Launcher.
2. Set Project from the drop-down menu to your project name and, in the Tasks field, type build. Press Enter. This will build the project, and any errors will be indicated in the console window.

Your console window will display BUILD SUCCESSFUL. This means that your extension was created and stored as a zip file in the your_project->build->distributions folder.

Build Extension with Ant

1. Go to the Package Explorer -> your_project.
2. Right-click build-extension.xml->Run As->Ant Build.

Your console output will indicate BUILD SUCCESSFUL, similar to the text below:

build:
[echo] Building AuthenticatorExample extension package...
BUILD SUCCESSFUL
Total time: 770 milliseconds

NOTE: This will build your project and create the extension zip in the AuthenticatorExample->build->distributions folder of your project.

Click here to view Part 3 of this guide.
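The guide only shows code for authenticate and matchesAuthRequest. For completeness, here is a minimal sketch of what an issueAuthenticationChallenge override could look like, assuming it receives the same HttpServletRequest/HttpServletResponse pair as authenticate; treat the exact method signature and the _logger field as assumptions and confirm them against the Extension SDK javadocs for your ThingWorx version.

@Override
public void issueAuthenticationChallenge(HttpServletRequest request, HttpServletResponse response)
        throws AuthenticatorException {
    // Called after authentication fails: log the failure and ask the client to retry.
    // The method signature and _logger field are assumptions -- verify against your SDK version.
    _logger.warn("Authentication failed for request to " + request.getRequestURI());

    // A basic-auth style challenge so the browser prompts for credentials again.
    response.setHeader("WWW-Authenticate", "Basic realm=\"ThingWorx\"");
    response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
}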
  Use the Collection Widget to organize visual elements of your application.   GUIDE CONCEPT   This project will introduce the Collection Widget.   Following the steps in this guide, you will learn to display different values from a single dataset in real-time.   We will teach you how to utilize the Collection Widget to generate a series of repeated Mashups for every row of data in a table.     YOU'LL LEARN HOW TO   Create a Datashape to define columns of a table Create a Thing with an Infotable Property Create a base Mashup to display data Utilize a Collection Widget to display data from multiple rows of a table   NOTE: This guide's content aligns with ThingWorx 9.3. The estimated time to complete ALL 3 parts of this guide is 60 minutes.      Step 1: Create a Datashape    Create a Datashape   The Collection Widget uses a Service to dynamically define visual content.   The data must be in a tabular format (for example: Data Table, Info Table, or external Database-connection) in order for the Collection Widget to access it.   In this part of the lesson, we'll create a Data Shape to define the columns of the table.   In a subsequent step, we'll create an Info Table Property within a Thing in order to store the information.   In the ThingWorx Composer Browse tab, click Modeling > Data Shapes, + New              2.  In the Name field, enter cwht_datashape.                      3. If Project is not already set, search for and select PTCDefaultProject.         4. At the top, click Field Definitions          First Definition:          1. Click + Add to open the New Field Definition slide-out.          2. In the Name field, enter first_number.          3. Change the Base Type to NUMBER.          4. Check the Is Primary Key box. All Datashapes must have a single Primary Key, and the first Field is as acceptable as any other for our purposes here.                5. At the top-right, click the "Check with a +" button for Done and Add.   Second Definition:   In the Name field, enter second_number.       2. Change the Base Type to NUMBER.               3. At the top-right, click the "Check with a +" button for Done and Add.     Third Definition:   In the Name field, enter third_number.         2. Change the Base Type to NUMBER.               3. At the top-right, click the "Check" button for Done         4. At the top, click Save.                   Step 2: Create a Thing   Create a Thing   In the previous step, we created a Data Shape to define the Info Table Property columns.   Now, we will create a Thing, add an Info Table Property, format the Info Table Property with the Data Shape, and set some default values.   On the ThingWorx Composer Browse tab, click Modeling > Things, + New.           2. In the Name field, enter cwht_thing.         3. If Project is not already set, search for and select PTCDefaultProject.         4. Select GenericThing in the Base Thing Template field       Add InfoTable Property         1. At the top, click Properties and Alerts.        2. Click the + Add button to open the New Property slide-out.        3. In the Name field, enter infotable_property.        4. Change the Base Type to INFOTABLE.        5. In the Data Shape field, select cwht_datashape.        6. Check the Persistent checkbox.       First Default   Check the Has Default Value checkbox. A cwht_datashape icon will appear underneath.            2. Under Has Default Value, click the cwht_datashape button to open the pop-up menu which  sets the default values               3. Click the + Add icon.         4. 
Enter values in each number field, such as 1, 2, and 3.
5. At the bottom-right, click the green Add button.

Second Default

1. Click the + Add icon.
2. Enter values in each number field, such as 10, 20, and 30.
3. At the bottom-right, click the green Add button.

Third Default

1. Click the + Add icon.
2. Enter values in each number field, such as 100, 200, and 300.
3. At the bottom-right, click the green Add button.
4. At the bottom-right, click Save to close the pop-up menu.
5. At the top-right, click the "Check" button for Done.
6. At the top, click Save.

Click here to view Part 2 of this guide.
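If you would rather fill infotable_property from code than rely only on the default values above, a small Service on cwht_thing can append rows at runtime. The sketch below is one possible approach, using the AddRow function available on Info Table Properties in Service scripts; the numbers are arbitrary examples.

// Append one row to infotable_property; each field matches a Field Definition from Step 1
var newRow = {
    first_number: 1000,
    second_number: 2000,
    third_number: 3000
};

me.infotable_property.AddRow(newRow);

// Return the current row count so the caller can confirm the insert
var result = me.infotable_property.getRowCount();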
Step 7: Real World Model

We'll now rerun model creation with the Real World data.

Even though Signals and Profiles are possibly telling us that only Sensor 1 is needed, the first Model you'll create will contain all the data, while the second Model will exclude Sensor 2. We'll then compare the Models to see which one is going to work the best for predicting engine failures.

1. On the left, click Analytics Builder > Models.
2. Click New….
3. In the Model Name field, enter vibration_model.
4. In the Dataset field, select vibration_dataset.
5. Click Submit.
6. After ~60 seconds, the Model Status will change to COMPLETED.
7. Select the model that was just created, i.e. vibration_model.
8. Click View… to open the Model Information page. Note that your model may differ slightly from the picture below, as the automatically-withheld "test" data is randomly chosen.

Unlike our simulated dataset, this real-world data is not perfect. However, it's still pretty good, and is much more representative of what a real-world scenario would indicate.

The True Positive Rate shown on the Receiver Operating Characteristic (ROC) chart is much higher than the False Positive Rate.

The curve is relatively high and to the left, which indicates a high accuracy level.

You may also click on the Confusion Matrix tab in the top-left, which will show you the number of True Positives and True Negatives in comparison to False Positives and False Negatives.

NOTE: The number of correct predictions is much higher than the number of incorrect predictions.

As such, we now know that our Sensors have a relatively good chance at predicting an impending failure by detecting low grease conditions before they cause catastrophic engine failure.

Refined Model

We can now compare this first Model that includes both Sensors to a Model using only Sensor 1, since we suspect that Sensor 2 may not be necessary to achieve our goal.

1. On the left, click Analytics Builder > Models.
2. Click New….
3. In the Model Name field, enter vibration_model_s1_only.
4. In the Dataset field, select vibration_dataset.
5. On the right beside Excluded Fields from Model, click the Excluded Fields button.
6. Select s2_fb1 through s2_fb5.
7. While all the s2 values are selected, click the green "right-arrow", i.e. the > button, in the middle.
8. At the bottom-left, click Save.
9. Click Submit.
10. After ~60 seconds, the Model Status will change to COMPLETED.
11. With vibration_model_s1_only selected, click View….

The ROC chart is comparable to the original model (including Sensor 2).

Likewise, the Confusion Matrix (on the other tab) indicates a good ratio of correct predictions versus incorrect predictions.

NOTE: Your final scores may vary slightly from these Models, as the data used for prediction versus evaluation is chosen randomly.

The ThingWorx Analytics Models have indicated that you are likely to receive roughly the same accuracy in predicting a low-grease condition whether you use one sensor or two!

If we can get an accurate early warning of the low grease condition with just one sensor, it then becomes a business decision as to whether the extra cost of Sensor 2 is necessary.

Step 8: Next Steps

Congratulations!
You've successfully completed the Build an Engine Analytical Model guide, and learned how to:   Load an IoT dataset Generate machine learning predictions Evaluate the analytics output to gain insight   The next guide in the Vehicle Predictive Pre-Failure Detection with ThingWorx Platform learning path is Manage an Engine Analytical Model.   Learn More   We recommend the following resources to continue your learning experience:   Capability Guide Analyze Operationalize an Analytics Model Build Implement Services, Events, and Subscriptions Additional Resources   If you have questions, issues, or need additional information, refer to:   Resource Link Community Developer Community Forum Support Analytics Builder Help Center
Step 8: Configure Template File (Service)

Services are implemented as Lua functions. In our Lua script, Services are divided into two pieces. The first is the Service definition, which consists of a Service name, inputs, and output. The second part of defining a Service is the Service code. The Service code is run when you execute the Service.

Create Service Definition

1. Open the PiTemplate.lua file.
2. Append the service definition to the file. Create a service named GetSystemProperties that gets the system properties (cpu temperature, clock frequencies, voltages) from your Raspberry Pi and updates the respective properties on the ThingWorx platform. Specify your output type but not the name, because the name of every output from a ThingWorx service is always result.

serviceDefinitions.GetSystemProperties(
  output { baseType="BOOLEAN", description="" },
  description { "updates properties" }
)

NOTE: This service has no input parameters and an output that results in True if the properties were successfully updated on ThingWorx.

Create Service Code

The Service code is run when you execute the Service. Functions in Lua are variables, therefore to define the Service code you will create a variable. The name of the Service has to match the name you specified in the Service definition.

1. Copy the service code below, with comments explaining the logic, and append it to your template file, or download and unzip the full PiTemplate.zip attached here.

services.GetSystemProperties = function(me, headers, query, data)
  log.trace("[PiTemplate]","########### in GetSystemProperties#############")
  queryHardware()
  -- if properties are successfully updated, return HTTP 200 code with a true service return value
  return 200, true
end

function queryHardware()
  -- use the vcgencmd shell command to get raspberry pi system values and assign to variables
  -- measure_temp returns value in Celsius
  -- measure_clock arm returns value in Hertz
  -- measure_volts returns value in Volts
  local tempCmd = io.popen("vcgencmd measure_temp")
  local freqCmd = io.popen("vcgencmd measure_clock arm")
  local voltCmd = io.popen("vcgencmd measure_volts core")

  -- set property temperature
  local s = tempCmd:read("*a")
  s = string.match(s,"temp=(%d+\.%d+)");
  log.debug("[PiTemplate]",string.format("temp %.1f",s))
  properties.cpu_temperature.value = s

  -- set property frequency
  s = freqCmd:read("*a")
  log.debug("[PiTemplate]",string.format("raw freq %s",s))
  s = string.match(s,"frequency%(..%)=(%d+)");
  s = s/1000000
  log.debug("[PiTemplate]",string.format("scaled freq %d",s))
  properties.cpu_freq.value = s

  -- set property volts
  s = voltCmd:read("*a")
  log.debug("[PiTemplate]",string.format("raw volts %s", s))
  s = string.match(s,"volt=(%d+\.%d+)");
  log.debug("[PiTemplate]",string.format("scaled volts %.1f", s))
  properties.cpu_volt.value = s
end

tasks.refreshProperties = function(me)
  log.trace("[PiTemplate]","~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In tasks.refreshProperties~~~~~~~~~~~~~ ")
  queryHardware()
end

2. Save the PiTemplate.lua file and exit nano (Ctrl+X, then Y, then Enter).

Step 9: Run LSR

1. Navigate from the installation directory to the microserver directory.
cd microserver
2. Ensure that wsems is running in a separate terminal session before you start running LuaScriptResource.
3. Ensure that you have ownership of the luaScriptResource executable and have execute privileges. To check ownership:
ls -la
-rwxrwxr-x 1 pi pi 769058 Jun 9 17:46 luaScriptResource
NOTE: The owner of luaScriptResource should be the user you are logged in as on the Raspberry Pi.
4.
Confirm you have executable privileges by running the following command: sudo chmod 775 luaScriptResource 5. Run LuaScriptResource by executing the following command: sudo ./luaScriptResource   6. The output will show an error until we create the corresponding Thing in the next step.     Step 10: Bind Remote Thing Properties   Now we need to register a Thing so your Raspberry Pi can bind to its Properties on the Thingworx Platform.   Create a Thing named PiThing that will bind the scripts.PiThing created in config.lua . Open your Composer screen. Click Things on the left-navigation and the + symbol. Enter PiThing in the Name field and click RemoteThing in the Thing Template field.   Click Save. Ensure that the Remote Thing Property is connected. Click Properties in the left-hand navigation. Verify that the isConnected Property has a value of true. This means that your Raspberry Pi is still connected and now bound to this Thing on Thingworx Platform.   Bind the remote Thing Properties. Make sure the Properties tab is selected and click Edit at the top of the PiThing. Click Manage Bindings. Select the Remote tab at the top.   Click Add All Above Properties or drag and drop the ones you need. Click Done. Click Save. Verify that the Properties were updated with readings from the Raspberry Pi. Both the Value and Default Value for the three Properties will be set to the current reading from the Raspberry Pi. Cover the Raspberry Pi and wait about a minute, then Select the Properties tab and click Refresh. You will see the cpu_temperature value change.     NOTE: The system properties from your Raspberry Pi are now being passed to the server every 30 seconds. Wait a couple of cycles to see if the values from the Raspberry Pi change. If you are impatient, manually change the value of the property using the Set button in the Composer then click Refresh to see the updated value. The value will be temporarily updated for about 30 seconds until the Raspberry Pi reports the current live value.   Troubleshooting Tips   Tip #1 If the properties are not updating, try to stop and start both the wsems and luaScriptResource services.   quit sudo ./wsems or ./luaScriptResource Tip #2 If a wsems and/or luaScriptResource is not shut down gracefully, sometimes the service is still running which can cause issues. You can search and kill any wsems/luaScriptResource services by using the following command. Re-run the GetSystemProperties to test if this fixed the issue.   ps -efl kill -9 <id#>   Step 11: View Data from Devices   In order to demonstrate how ThingWorx can render a visualization of data from connected devices, at this point in the lesson you will import a pre-configured Mashup.   On the ThingWorx server that the EMS is connected to, start on the Home tab of Composer. Import a pre-built Mashup. Download and save the pre-built Mashup XML file attached here: Mashups_PiThingMashup-v91.xml. In Composer, click the Import/Export drop-down at the bottom-left.   Click Import. Leave all default values and click Browse to select the Mashups_PiThingMashup-v91.xml file that you just downloaded. Click Open, then Import, and once you see the success message, click Close. View Mashup displaying live data. Select the home icon in the top left side of Composer, then click Mashups on the left-navigation panel. Click Mashups_PiThingMashup-v91 and you'll see the design view of the Mashup.   Click View Mashup, and you'll see the live Mashup.    
TIP: You will need to allow pop-ups in your browser for the Mashup to be displayed.     Click here to view Part 4 of this guide. 
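Besides checking the values in Composer or the Mashup, you can also read the bound Properties from any server-side Service or Subscription. A quick sketch, assuming the Thing name PiThing used above:

// Read the latest values pushed by the Raspberry Pi's Lua Script Resource
var temperature = Things["PiThing"].cpu_temperature; // degrees Celsius
var frequency = Things["PiThing"].cpu_freq;          // MHz (scaled in queryHardware)
var voltage = Things["PiThing"].cpu_volt;            // Volts

// Example: log a warning if the CPU is running hot
if (temperature > 70) {
    logger.warn("PiThing CPU temperature is high: " + temperature);
}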
Step 5: Create Services

Below, we create the GetCustomerPackages Service for the PTCDeliversBusinessLogic Thing.

Create Service Definition

1. On the home page, filter and select the PTCDeliversBusinessLogic Thing.
2. Switch to the Services tab.
3. Click + Add.
4. Enter the name of the Service, GetCustomerPackages.
5. Switch to the Output tab of the Service.
6. Select InfoTable from the list as the Base Type.
7. When prompted for the DataShape, select CustomerDataShape.
8. Switch to the Inputs tab of the Service.
9. Click the + Add button.
10. Enter the name CustomerId.
11. Check the Required checkbox.
12. Click Done.

Add Service Functionality

1. Switch to the Me/Entities tab.
2. Switch the radio option to Other Entity.
3. Filter and select PackageDataTable as the Entity. A list of all accessible Services for the PackageDataTable will appear.
4. In the search bar for Services, enter QueryDataTableEntries and click the arrow next to it.
5. Update the inserted code to use the input parameter, CustomerId, in the query. An example is below, and more information on queries can be found on our Query Help Page.
6. Click Done and save your work.

// Provide your filter using the format as described in the help topic "Query Parameter for Query Services"
var query = {
    "filters": {
        "type": "EQ",
        "fieldName": "CustomerId",
        "value": CustomerId
    }
};

// result: INFOTABLE dataShape: ""
var result = Things["PackageDataTable"].QueryDataTableEntries({
    maxItems: undefined /* NUMBER */,
    values: undefined /* INFOTABLE */,
    query: query /* QUERY */
});

After saving changes and closing the Service edit view, you can test the method by selecting the Execute play button.

The Service we created will query PackageDataTable for any packages with the CustomerId you entered. If no data has been added to the DataTable yet, open PackageDataTable's Services tab and execute the AddDataTableEntries Service with test data.

Step 6: Create Subscriptions

Subscriptions are based on tracking the firing of an Event. When the Event is triggered or fired, all entities with a Subscription to the Event will perform the script as defined. The JavaScript interface and script tabs are the same as those utilized for the Services interface.

Subscriptions are a great resource for making updates in other Entities or Databases, or even just logging this information to your liking.

1. On the home page, filter and select the PTCDeliversBusinessLogic Thing.
2. Switch to the Subscriptions tab.
3. Click + Add.
4. For Source, select the Other Entity option.
5. Filter and select PackageStream.
6. Check the Enabled checkbox. Using a Stream to track events in the application allows one source to watch for activity. The source for a Subscription can be another Entity if a single Entity is wanted. In order to capture all source data from the PackageStream, you will need to set it as the Stream for the Entity you would like to track.
7. Switch to the Inputs tab.
8. Select the Event drop-down and pick the PackageDelivered Event. This PackageDelivered Event only exists in the completed download. If you are not using that download, create your own Event based on the PackageDataShape.
9. Update the script area of the Subscription using the below code. This JavaScript code will take the information from the triggered Event and add it to the DeliveryDataTable.
// tags:TAGS
var tags = new Array();

// values:INFOTABLE(Datashape: PackageDataShape)
var values = Things["DeliveryDataTable"].CreateValues();
values.Customer = eventData.Customer; // THINGNAME
values.Content = eventData.Content; // STRING
values.ID = eventData.ID; // INTEGER [Primary Key]
values.Weight = eventData.Weight; // NUMBER

// location:LOCATION
var location = new Object();
location.latitude = 0;
location.longitude = 0;
location.elevation = 0;
location.units = "WGS84";

var params = {
    tags : tags,
    source : me.name,
    values : values,
    location : location
};

// AddOrUpdateDataTableEntry(tags:TAGS, source:STRING("me.name"), values:INFOTABLE(DeliveryDataTable), location:LOCATION):STRING
var id = Things["DeliveryDataTable"].AddOrUpdateDataTableEntry(params);

Step 7: Next Steps

Congratulations! You've successfully completed the Implement Services, Events and Subscriptions guide.

In this guide you learned how to create Events, Services, and Subscriptions you can utilize to monitor and optimize your IoT applications.

The next guide in the Design and Implement Data Models to Enable Predictive Analytics learning path is Build a Predictive Analytics Model.

The next guide in the Monitor Factory Supplies and Consumables learning path is Build a Predictive Analytics Model.

Learn More

We recommend the following resources to continue your learning experience:

Capability Guide
Build Create Custom Business Logic Guide
Secure Configure Permissions
Connect SDK Reference

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource Link
Community Developer Community Forum
Support Help Center
Step 2: Creating Machine Templates

Creating machine templates gives us a consistent baseline for all of our machinery, no matter the purpose of the machine. As a machine becomes more specialized, it will have its own unique features and properties.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Data Shape in the dropdown.
3. In the name field, enter Fizos.MachineInspections.DataShape and set a Project (ie, PTCDefaultProject).
4. Click Save. All of our machine inspections will be based on this Data Shape.
5. Add the list of fields below:

Name Base Type Aspects Description
GUID GUID Primary key String used as unique identifier for the inspection
FactoryID Integer 0 minimum Factory identifier at time of inspection
DateRequest Date N/A Date the inspection was requested
DateCompleted Date N/A Date the inspection was completed
Report JSON N/A This will hold the inspection report data

The properties for the Fizos.MachineInspections.DataShape Data Shape are as follows:

Create Machine Template

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing Template in the dropdown.
3. In the name field, enter Fizos.Machine.ThingTemplate. All of our machines will be based off this template.
4. In the Base Template field, enter GenericThing and set a Project (ie, PTCDefaultProject). In real world examples, you would likely use a RemoteThing.
5. Open the Properties section. Create the following list of properties.

Name Base Type Aspects Description
FactoryID Integer 0 minimum, default 0 The factory ID in which this machine is currently located
Type String N/A Type of machine
SerialNo String N/A Serial number of the machine
Model String N/A Machine make and model
State String Default: Idle Machine state (Idle, Working, Warning, Failed)
Status String Default: Active Machine status (Active, Inactive, etc)
Inspections InfoTable DataShape: Fizos.MachineInspections.DataShape List of inspection reports

These properties should match the following:

6. Open the Alerts section. Create the following list of alerts.

Name Property Configuration
StateFailedAlert State Equal Failed
StateWarningAlert State Equal Warning
StatusInactiveAlert Status Equal Inactive

These alerts should match the following:

Create Machine Template By Product

Here at Fizos, we specialize in brauts and regular sausages. That being said, we will have some machines that are specific to each product. We will also have machines that are generic in nature and shared between the two systems. The template we just created will work for the machines that are common to both product lines. We'll now create two templates that will be specific to brauts and regular sausages. We are doing this to show the levels of granularity that can be achieved. In some cases, you might not want to create another template level, based on your design.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing Template in the dropdown.
3. In the name field, enter Fizos.BrautsMachine.ThingTemplate. All of our brauts machines will be based off this template.
4. In the Base Template field, enter Fizos.Machine.ThingTemplate and set a Project (ie, PTCDefaultProject).
5. Open the Properties section. Create the following list of properties.
Name Base Type Aspects Description
CookTemperature Number default 155, units - degrees F The standard machine cooking temperature
CookTime Number default 78.5, units - minutes The standard machine cooking time
EggLevel Number 0 minimum, 100 maximum, default 0, % units The percentage of eggs left in the machine
CreamLevel Number 0 minimum, 100 maximum, default 0, % units The percentage of cream left in the machine

6. Open the Alerts section. Create the following list of alerts.

Name Property Configuration
EggLevelWarningAlert EggLevel Below 20
EggLevelDepletedAlert EggLevel Below 5
CreamLevelWarningAlert CreamLevel Below 20
CreamLevelDepletedAlert CreamLevel Below 5

Now for the more general sausages.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing Template in the dropdown.
3. In the name field, enter Fizos.SausageMachine.ThingTemplate. All of our sausage machines will be based off this template.
4. In the Base Template field, enter Fizos.Machine.ThingTemplate and set a Project (ie, PTCDefaultProject).
5. Open the Properties section. Create the following list of properties.

Name Base Type Aspects Description
CookTemperature Number default 150, units - degrees F The standard machine cooking temperature
CookTime Number default 72.5, units - minutes The standard machine cooking time

Next, we'll create our services for how these machines will work.

Click here to view Part 3 of this guide.
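As a preview of those Services, the Inspections Info Table Property defined above can be appended to with a few lines of Service script on Fizos.Machine.ThingTemplate. The sketch below is illustrative only: the Service and its assumed inputs (inspectionGUID as a GUID string and report as JSON) are not part of this guide.

// Illustrative Service body: append one inspection record to the Inspections
// Info Table Property (shaped by Fizos.MachineInspections.DataShape).
// Assumed inputs for this sketch: inspectionGUID (GUID), report (JSON).
var newInspection = {
    GUID: inspectionGUID,
    FactoryID: me.FactoryID,
    DateRequest: new Date(),
    DateCompleted: new Date(),
    Report: report
};

me.Inspections.AddRow(newInspection);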
Step 5: Launch the EMS

1. Navigate to the microserver directory.
cd ..
2. Ensure that you have ownership of the wsems executable and have execute privileges. To check ownership:
ls -la
-rwxrwxr-x 1 pi pi 769058 Jun 9 17:46 wsems
NOTE: The account that owns the wsems executable should be used to log into the Raspberry Pi.
3. Confirm that you have executable privileges by running the following command:
sudo chmod 775 wsems
4. Run the EMS.
sudo ./wsems
5. Validate that your EMS successfully connected. Depending on your logger level, your wsems execution should indicate Websocket connected in the log and the following INFO message:
[INFO ] 2016-10-11 14:22:54,770 Main: Succesfully connected. Saving .booted config file

Troubleshoot Connectivity Issues

If the websocket does not connect successfully, check the following:

Issue Solution
WEBSOCKET CLOSED - Warning seen immediately after Websocket gets connected. Ensure that the host IP address, port, and appKey of the ThingWorx Composer instance are accurately set. If in the config.json you have selected the option to validate certificates, then make sure the path to the certificate file is correctly set.
twWs_Connect - Error trying to connect. Ensure that the host IP address and port running the ThingWorx Composer are accurately set. Check whether the certificates parameter is set or not. By default, the WS EMS validates certificates. To ensure that the validation is performed correctly without errors, ensure that the certificates configuration parameters are set accurately with the correct path to the certificate file. If you do not wish to validate the certificate, you may explicitly set the validate parameter within the certificates parameter to false.
twTlsClient_Connect - Error intializing TLS connection. Invalid certificate. Check if the ws_encryption parameter is present in your config.json file. By default, WS EMS enables a TLS connection to the ThingWorx platform. Ensure that the certificate file mentioned in the config.json is valid and stored in the path specified in the config.json. For debugging purposes, you can set the ssl parameter to none in the ws_encryption configuration parameter.
[WARN ] ... Main - Unable to connect to server. Trying .booted config file. Ensure that the host is up and running at the IP address and port number mentioned in the config.json file. Ensure that ThingWorx is running on the host address at the correct port number. Ensure that all appropriate networking ports are open and available.

Step 6: Configure Lua Script Resource (LSR)

The Lua Script Resource (LSR) is used to implement Things on the Edge device. Using the Lua Script Resource, you can define for your Raspberry Pi:

Data Shapes
Properties
Services
Tasks

NOTE: The steps in this guide install the LSR on the same server (Raspberry Pi) as the EMS, but it could also be installed on another server.

1. Launch a new terminal session that will be used to configure and launch the LSR.
2. Navigate to the etc folder.
cd microserver/etc
3. Create the config.lua file.
sudo nano config.lua
4. Set the logger level.
scripts.log_level = "INFO"
5. Turn off encryption for the connection to the EMS. This should only be used for testing.
scripts.script_resource_ssl = false
scripts.script_resource_authenticate = false
6. Create the Edge RemoteThing.
scripts.PiThing = {
    file = "thing.lua",
    template = "PiTemplate",
    scanRate = 1000,
    taskRate = 30000
}

NOTE: This configuration tells the Lua Script Resource to create a Thing called PiThing whose template definition is in the PiTemplate.lua file at the path etc/custom/templates/PiTemplate.lua. You will create the template file PiTemplate.lua in the next section.

Property Description
scanRate Controls how frequently (in milliseconds) properties are evaluated and pushed to the server. In our example, the Pi will check every 1 second if the values have changed. If so, the values will be pushed to the server.
taskRate Controls how frequently the tasks specified in the Thing's template should be executed. In our example, the task will run every 30 seconds.

7. Set the IP address and port of the LSR host server.
scripts.rap_host = "127.0.0.1"
scripts.rap_port = 8080

Sample config.lua File

Here is an example of a complete config.lua that can be used to configure the Lua Script Resource.

scripts.log_level = "INFO"
scripts.script_resource_ssl = false
scripts.script_resource_authenticate = false
scripts.PiThing = {
    file = "thing.lua",
    template = "PiTemplate",
    scanRate = 1000,
    taskRate = 30000
}
scripts.rap_host = "127.0.0.1"
scripts.rap_port = 8080

Step 7: Configure Template File (Properties)

The template file is located in the microserver/etc/custom/templates directory. The template file provides a base configuration for defining Properties, Services, tasks, etc. This section will focus on defining the template file and adding Properties.

1. Navigate from the installation directory to the templates folder at microserver/etc/custom/templates.
cd microserver/etc/custom/templates
2. Create the file PiTemplate.lua.
sudo nano PiTemplate.lua
NOTE: This is the same template filename used in config.lua in the previous section.
3. Define the template. The module statement is used to define the template containing the configuration for the software component of the edge device.
module ("templates.PiTemplate", thingworx.template.extend)

Parameter Description
templates.PiTemplate refers to the name of the template file: PiTemplate.lua
thingworx.template.extend identifies the file as a template and provides the base ThingWorx template implementation

4. Define the Properties. For this guide, we are going to use the Raspberry Pi's system properties like CPU temperature, clock frequency and internal voltage as sensor readings for the Remote Thing. Add the properties for these in your PiTemplate.lua file.
properties.cpu_temperature={baseType="NUMBER", pushType="ALWAYS", value=0}
properties.cpu_freq={baseType="NUMBER", pushType="ALWAYS", value=0}
properties.cpu_volt={baseType="NUMBER", pushType="ALWAYS", value=0}
NOTE: This code defines the properties cpu_temperature, cpu_freq and cpu_volt with a baseType of NUMBER. Additionally, it sets each default value to 0 and sets the pushType to ALWAYS, which means that the property is always pushed to the ThingWorx server from the Raspberry Pi. The pushType can be set to ALWAYS, ON, OFF, NEVER or VALUE.

Sample PiTemplate.lua File

Here is an example of a complete PiTemplate.lua that can be used to configure the Lua Script Resource.
module ("templates.PiTemplate", thingworx.template.extend)
properties.cpu_temperature={baseType="NUMBER", pushType="ALWAYS", value=0}
properties.cpu_freq={baseType="NUMBER", pushType="ALWAYS", value=0}
properties.cpu_volt={baseType="NUMBER", pushType="ALWAYS", value=0}

Click here to view Part 3 of this guide.
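Building on this template, you could also add a task that runs at the taskRate set in config.lua (30000 ms in this example), following the same tasks.name = function(me) pattern used by these guides. The sketch below simply logs the most recent temperature reading; the task name logTemperature is just an example.

-- Runs at the taskRate from config.lua (every 30 seconds in this example)
-- and logs the last value held by the cpu_temperature property.
tasks.logTemperature = function(me)
  log.debug("[PiTemplate]", string.format("Current CPU temperature: %s",
    tostring(properties.cpu_temperature.value)))
end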
Step 6: Data Shapes

Data Shapes are an important part of creating/firing Events and also of invoking Services.

Define With Macros

In order to define a Data Shape using a macro, use TW_MAKE_DATASHAPE.

NOTE: The macros are all defined in the twMacros.h header file.

TW_MAKE_DATASHAPE("SteamSensorReadingShape",
    TW_DS_ENTRY("ActivationTime", TW_NO_DESCRIPTION ,TW_DATETIME),
    TW_DS_ENTRY("SensorName", TW_NO_DESCRIPTION ,TW_NUMBER),
    TW_DS_ENTRY("Temperature", TW_NO_DESCRIPTION ,TW_NUMBER),
    TW_DS_ENTRY("Pressure", TW_NO_DESCRIPTION ,TW_NUMBER),
    TW_DS_ENTRY("FaultStatus", TW_NO_DESCRIPTION ,TW_BOOLEAN),
    TW_DS_ENTRY("InletValve", TW_NO_DESCRIPTION ,TW_BOOLEAN),
    TW_DS_ENTRY("TemperatureLimit", TW_NO_DESCRIPTION ,TW_NUMBER),
    TW_DS_ENTRY("TotalFlow", TW_NO_DESCRIPTION ,TW_INTEGER)
);

Define Without Macros

In order to define a Data Shape without using a macro, create the Data Shape with twDataShape_Create and add Field Definitions with twDataShape_AddEntry. In the example below, we are creating a Data Shape called SteamSensorReadings that has two numbers as Field Definitions.

twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("a",NULL,TW_NUMBER));
twDataShape_AddEntry(ds, twDataShapeEntry_Create("b",NULL,TW_NUMBER));
/* Name the DataShape for the SteamSensorReadings service output */
twDataShape_SetName(ds, "SteamSensorReadings");

Step 7: Events and Services

Events and Services provide useful functionality. Events are a good way to make a Service asynchronous. You can call a Service, let it return, and then your Entity can subscribe to your Event rather than keep the original Service function waiting. Events are also a good way to allow the platform to respond to data when it arrives on the edge device without having to poll the edge device for updates.

Fire Events

To fire an Event, you first need to register the Event and load it with the necessary fields for the Data Shape of that Event using the twApi_RegisterEvent function. Afterwards, you send a request to the ThingWorx server with the collected values using the twApi_FireEvent function. An example is as follows:

twDataShape * ds = twDataShape_Create(twDataShapeEntry_Create("message", NULL,TW_STRING));
/* Event datashapes require a name */
twDataShape_SetName(ds, "SteamSensorFault");
/* Register the event */
twApi_RegisterEvent(TW_THING, thingName, "SteamSensorFault", "Steam sensor event", ds);
….
struct {
    char FaultStatus;
    double Temperature;
    double TemperatureLimit;
} properties;
….
properties.TemperatureLimit = rand() + RAND_MAX/5.0;
properties.Temperature = rand() + RAND_MAX/5.0;
properties.FaultStatus = FALSE;
if (properties.Temperature > properties.TemperatureLimit && properties.FaultStatus == FALSE) {
    twInfoTable * faultData = 0;
    char msg[140];
    properties.FaultStatus = TRUE;
    sprintf(msg,"%s Temperature %2f exceeds threshold of %2f", thingName, properties.Temperature, properties.TemperatureLimit);
    faultData = twInfoTable_CreateFromString("message", msg, TRUE);
    twApi_FireEvent(TW_THING, thingName, "SteamSensorFault", faultData, -1, TRUE);
    twInfoTable_Delete(faultData);
}

Invoke Services

In order to invoke a Service, you will use the twApi_InvokeService function. The full documentation for this function can be found in [C SDK HOME DIR]/src/api/twApi.h. Refer to the table below for additional information.

Parameter Type Description
entityType Input The type of Entity that the service belongs to. Enumeration values can be found in twDefinitions.h.
entityName Input The name of the Entity that the service belongs to.
serviceName Input The name of the Service to execute. params Input A pointer to an Info Table containing the parameters to be passed into the Service. The calling function will retain ownership of this pointer and is responsible for cleaning up the memory after the call is complete. result Input/Output A pointer to a twInfoTable pointer. In a successful request, this parameter will end up with a valid pointer to a twInfoTable that is the result of the Service invocation. The caller is responsible for deleting the returned primitive using twInfoTable_Delete. It is possible for the returned pointer to be NULL if an error occurred or no data is returned. timeout Input The time (in milliseconds) to wait for a response from the server. A value of -1 uses the DEFAULT_MESSAGE_TIMEOUT as defined in twDefaultSettings.h. forceConnect Input A Boolean value. If TRUE and the API is in the disconnected state of the duty cycle, the API will force a reconnect to send the request.   See below for an example in which the Copy service from the FileTransferSubsystem is called:   twDataShape * ds = NULL; twInfoTable * it = NULL; twInfoTableRow * row = NULL; twInfoTable * transferInfo = NULL; int res = 0; const char * sourceRepo = "SimpleThing_1"; const char * sourcePath = "tw/hotfolder/"; const char * sourceFile = "source.txt"; const char * targetRepo = "SystemRepository"; const char * targetPath = "/"; const char * targetFile = "source.txt"; uint32_t timeout = 60; char asynch = TRUE; char * tid = 0; /* Create an infotable out of the parameters */ ds = twDataShape_Create(twDataShapeEntry_Create("sourceRepo", NULL, TW_STRING)); res = twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourcePath", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("sourceFile", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetRepo", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetPath", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("targetFile", NULL, TW_STRING)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("async", NULL, TW_BOOLEAN)); res |= twDataShape_AddEntry(ds, twDataShapeEntry_Create("timeout", NULL, TW_INTEGER)); it = twInfoTable_Create(ds); row = twInfoTableRow_Create(twPrimitive_CreateFromString(sourceRepo, TRUE)); res = twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourcePath, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(sourceFile, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetRepo, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetPath, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromString(targetFile, TRUE)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromBoolean(asynch)); res |= twInfoTableRow_AddEntry(row, twPrimitive_CreateFromInteger(timeout)); twInfoTable_AddRow(it,row); /* Make the service call */ res = twApi_InvokeService(TW_SUBSYSTEM, "FileTransferSubsystem", "Copy", it, &transferInfo, timeout ? (timeout * 2): -1, FALSE); twInfoTable_Delete(it); /* Grab the tid */ res = twInfoTable_GetString(transferInfo,"transferId",0, &tid);   Bind Event Handling You may want to track exactly when your edge Entities are successfully bound to or unbound from the server. The reason for this is that only bound items should be interacting with the ThingWorx Platform and the ThingWorx Platform will never send any requests targeted at an Entity that is not bound. 
A simple example that only logs the bound Thing can be seen below. After creating this function, it will need to be registered using the twApi_RegisterBindEventCallback function before the connection is made.

void BindEventHandler(char * entityName, char isBound, void * userdata) {
    if (isBound)
        TW_LOG(TW_FORCE, "BindEventHandler: Entity %s was Bound", entityName);
    else
        TW_LOG(TW_FORCE, "BindEventHandler: Entity %s was Unbound", entityName);
}
….
twApi_RegisterBindEventCallback(thingName, BindEventHandler, NULL);

OnAuthenticated Event Handling

You may also want to know exactly when your Edge device has successfully authenticated and made a connection to the ThingWorx platform. As with bind Event handling, this handler must be written and then registered. To register it, call the twApi_RegisterOnAuthenticatedCallback function before the connection is made. This handler can also be used to perform a delayed bind for all Things.

void AuthEventHandler(char * credType, char * credValue, void * userdata) {
    if (!credType || !credValue) return;
    TW_LOG(TW_FORCE, "AuthEventHandler: Authenticated using %s = %s. Userdata = 0x%x", credType, credValue, userdata);
    /* Could do a delayed bind here */
    /* twApi_BindThing(thingName); */
}
…
twApi_RegisterOnAuthenticatedCallback(AuthEventHandler, NULL);

Step 8: Tasks

If you are using the built-in Tasker to drive data collection or other repetitive or periodic activities, create a function for the task. Task functions are registered with the Tasker and then called at the rate specified when they are registered. The Tasker is a very simple, cooperative multitasker, so these functions should return quickly and must never enter an infinite loop.

The signature for a task function is found in [C SDK HOME DIR]/src/utils/twTasker.h. The function is passed a DATETIME value with the current time and a void pointer that is passed into the Tasker when the task is registered. After creating this function, register it using the twApi_CreateTask function once the connection is created. The example below shows how to create and register the function, and how it can be used.

#define DATA_COLLECTION_RATE_MSEC 2000
void dataCollectionTask(DATETIME now, void * params) {
    /* TW_LOG(TW_TRACE,"dataCollectionTask: Executing"); */
    properties.TotalFlow = rand()/(RAND_MAX/10.0);
    properties.Pressure = 18 + rand()/(RAND_MAX/5.0);
    properties.Location.latitude = properties.Location.latitude + ((double)(rand() - RAND_MAX))/RAND_MAX/5;
    properties.Location.longitude = properties.Location.longitude + ((double)(rand() - RAND_MAX))/RAND_MAX/5;
    properties.Temperature = 400 + rand()/(RAND_MAX/40);
    /* Check for a fault.
       Only do something if we haven't already */
    if (properties.Temperature > properties.TemperatureLimit && properties.FaultStatus == FALSE) {
        twInfoTable * faultData = 0;
        char msg[140];
        properties.FaultStatus = TRUE;
        properties.InletValve = TRUE;
        sprintf(msg, "%s Temperature %2f exceeds threshold of %2f", thingName, properties.Temperature, properties.TemperatureLimit);
        faultData = twInfoTable_CreateFromString("message", msg, TRUE);
        twApi_FireEvent(TW_THING, thingName, "SteamSensorFault", faultData, -1, TRUE);
        twInfoTable_Delete(faultData);
    }
    /* Update the properties on the server */
    sendPropertyUpdate();
}
…
twApi_CreateTask(DATA_COLLECTION_RATE_MSEC, dataCollectionTask);
…
while (1) {
    char in = 0;
#ifndef ENABLE_TASKER
    DATETIME now = twGetSystemTime(TRUE);
    twApi_TaskerFunction(now, NULL);
    twMessageHandler_msgHandlerTask(now, NULL);
    if (twTimeGreaterThan(now, nextDataCollectionTime)) {
        dataCollectionTask(now, NULL);
        nextDataCollectionTime = twAddMilliseconds(now, DATA_COLLECTION_RATE_MSEC);
    }
#else
    in = getch();
    if (in == 'q') break;
    else printf("\n");
#endif
    twSleepMsec(5);
}

Click here to view Part 4 of this guide.
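As a supplement to the Invoke Services example in Step 7, the sketch below shows one way to wrap the FileTransferSubsystem Copy call with basic error checking and cleanup of the returned Info Table. It is only an illustration: the helper name invokeCopyAndLogTid is hypothetical, the parameters Info Table (it) is assumed to have been built exactly as shown earlier, and TW_FREE is assumed to be the SDK's memory macro for releasing the copied string.

/* Hypothetical helper: invoke FileTransferSubsystem.Copy and log the transfer id.
   Assumes "it" was populated as in the Invoke Services example above. */
#include "twApi.h"
#include "twLogger.h"
#include "twOSPort.h"

int invokeCopyAndLogTid(twInfoTable * it, int timeoutMsec) {
    twInfoTable * transferInfo = NULL;
    char * tid = NULL;
    int res = twApi_InvokeService(TW_SUBSYSTEM, "FileTransferSubsystem", "Copy",
                                  it, &transferInfo, timeoutMsec, FALSE);
    if (TW_OK != res || !transferInfo) {
        TW_LOG(TW_ERROR, "Copy invocation failed, error code %d", res);
        return res;
    }
    /* The caller owns the result Info Table and must delete it when done. */
    if (TW_OK == twInfoTable_GetString(transferInfo, "transferId", 0, &tid) && tid) {
        TW_LOG(TW_FORCE, "Copy started, transferId = %s", tid);
        TW_FREE(tid); /* assumed SDK free macro; the returned string is a copy owned by the caller */
    }
    twInfoTable_Delete(transferInfo);
    return TW_OK;
}

A call such as invokeCopyAndLogTid(it, 120000) could then stand in for the bare twApi_InvokeService line in the earlier example.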
Step 5: Collection Widget

A Collection widget is used to display information from a collection of Things. Similar to a Grid Widget, the Collection widget gives you complete control over how data is displayed by binding data to an embedded static Mashup.

In the first part of this step we will create a static Mashup with Parameters bound to its widgets. Next, we will configure a Collection widget to use the static Mashup we create. In the next step we will customize the Collection styling.

Create Static Mashup

Click the Browse tab on the far left of Composer; in the Visualizations section, click Mashups.
Click + New to create a Mashup. Select Static Mashup, then click OK.
Name your Mashup TractorListFormat.
If Project is not already set, click the + in the Project text box and select PTCDefaultProject. Click Save.
Click the Design tab.
In the Layout tab, under Container > Positioning, click Static, then scroll to Container Size and click Fixed Size.
NOTE: A Static Mashup maintains fixed widget sizes and spacing, as opposed to a Responsive Mashup that dynamically changes widget sizes and spacing to use the available window area.
Click the Width property and set it to 200, and set the Height property to 60.
In the lower left panel, click the Style tab and click the X to remove the default Style. Click the + icon, then select DefaultImageBorderStyle to remove all styling.
NOTE: The default Mashup styling is removed so it will not override the sidebar and parent Mashup styling.
Scroll to the Width property and set it to 200 and the Height property to 60.
Click the Widgets tab and type label in the filter text box.
Drag and drop two Label Widgets onto the upper left of your static Mashup.

Add Parameters to Mashup

Select the Explorer tab, then select the top-level Mashup.
Click Configure Mashup Parameters from the drop-down menu in the upper left of the Mashup canvas.
Click the Add Parameter button.
Type firstLine in the Name field and select a Base Type of STRING.
Click Add Parameter again and name this parameter secondLine, also with a Base Type of STRING.
Click Done to return to Mashup Builder.

Bind Parameters to Widgets

Click the drop-down in the upper left of the Mashup canvas, then select Configure Bindings.
Click firstLine in the Properties list on the left. Click the drop-down arrow, then click Add Source to display the Mashup entities that can be bound to the Mashup parameter named firstLine.
Select the checkbox next to the top LabelText property.
Click Done to return to the Configure Bindings pop-up.
Click secondLine, then Binding Targets, and select the checkbox next to the bottom LabelText property, then click Done.
Click Done to close the Configure Bindings pop-up and return to Mashup Builder.
NOTE: The Mashup parameters and bindings are displayed in the Connections panel at the bottom.
Click Save before continuing to the next step.

Bind Data to Collection

Return to the main Mashup, then drag and drop a Collection widget onto the top area of the left sidebar.
In the Collection Properties panel, set the View property to Table.
Scroll to the Mashup property, click the wand icon, and browse to the name of the static Mashup created above.
Drag the All Data source from the data panel on the right onto the Collection widget, then click Data in the Select Binding Target pop-up.
Set the UIDField and SortField properties to SerialNumber.
In the Collection Properties panel, scroll to MashupPropertyBinding and click + Add.
Enter the text below, then click Done:

{
  "SerialNumber": "firstLine",
  "ModelNumber": "secondLine"
}

NOTE: This JSON property binds the SerialNumber and ModelNumber properties in the data source to the firstLine and secondLine parameters in the embedded Mashup.

9. Save the Mashup and click View Mashup.

10. Test the Mashup and you will see that the navigation panel on the left is showing data and is linked to the Google Map widget in the center.

Step 6: Customize Collection

The Collection uses default styling and no images. In this part of the exercise, we will replace the blue bar that indicates the selected row with a custom icon and modify the default styles so that the left panel's background color is shown.

Right-click on each of the images below to download and save them for use in the next step.

We will upload these images to create new Media entities and apply them to the Collection widget.

Select the Browse folder icon on the far left of Composer; in the Visualization section, click Media.
Click + New to create a new Media entity and enter a name for the un-selected tractor image.
Click Change in the Image section, then browse to the saved image. Click Open, then Save.
Repeat these steps to create a Media entity for a selected tractor.
Open the static TractorListFormat Mashup that controls the Collection widget formatting.
Click and drag an Image widget onto the Mashup.
In the lower left panel, in the SourceURL property, click the wand icon to select the unselected tractor image.
Change both the Width and Height properties to 50, pressing Tab after each entry to record them.
Click the Explorer tab in the top left, click the top-most Mashup entity, then click the Style Properties tab.
Click the X in the Style property and select DefaultImageBorderStyle to remove all styling, then click Save.
Click the More drop-down at the top of Composer and click Duplicate.
Enter TractorListSelectFormat for the name and click Save, then click the Design tab.
Click on the tractor image, then, in the lower left, click the wand icon in the SourceURL property, select the selected tractor image, and click Save.
Open your original Mashup and click on the Collection widget in the Mashup Builder canvas.
Scroll to the SelectedMashupName property and click the + to select TractorListSelectFormat.
Click Save for the Mashup, then View Mashup to see your Mashup with customized icons.

The default black text on green is a little hard to read. The steps below will change the text colors to make the data more readable.

Open the TractorListSelectFormat Mashup, then click on the top Label widget to change the color of its text.
Click the Style Properties tab and expand Base and Label.
In the color property, select yellow, and select Bold in the font-weight property before clicking Save.
Select the other Label widget and assign a light grey color, then save the embedded Mashup.
Reload the runtime view of the Mashup to see the results.

Step 7: Detail Panel

The right sidebar has a simple Image of a tractor along with product-specific information shown in Gauges and Value Displays.

The right sidebar contains two tabs in a Tabs - Responsive widget. The tabs are used to selectively hide and display groups of functions and data.
The orange button labeled "View Vehicle Specs" is a Navigation widget that opens a pop-up window with other detailed product information. The colored range indications on the right Gauge were created by configuring the gauge's ValueFormatter property to use State Formatting.

Add Tab Widget

Open the original, main Mashup and enter tab in the Widget panel search field.
Drag and drop a Tabs widget onto the Right Sidebar.
Scroll to the Tab1Name property and enter Tractor Details.
NOTE: This guide only covers configuring one of the two tabs added to the Mashup. Using the skills you've practiced thus far, feel free to add additional information to the tabs on your Mashup.
Uncheck the RoundedCorners property.
Click the Layout tab and click the radio button under Container > Orientation > Vertical.

Add ValueDisplay Widgets

Type value in the Widget search box, then click and drag a Value Display widget onto Tab 1.
In the Property panel, scroll to the Label property and enter Serial Number.
In the Data panel, expand Selected Row(s), then drag the SerialNumber property onto the Value Display widget and click Data when the Select Binding Target pop-up is displayed.
Drag another Value Display widget onto the tab widget below the first one and enter Name for the Label property.
Drag name from Selected Row(s) onto the second Value Display, then click Data.
NOTE: Be sure to select data sources under Selected Row(s) so that the data displayed corresponds to the tractor selected from either the map or the left side menu.
Save the Mashup, then click View Mashup to see all three panels working together to show data.

Add Gauges

Enter gauge in the Widget search box, then click and drag a Gauge widget onto Tab 1.
In the Properties panel, enter 3000 for the MaxValue property and RPM in the Legend property.
In the Data panel, expand Selected Row(s), then drag CurrentRPM onto the Gauge widget and click Data when the Select Binding Target pop-up is displayed.
Drag another Gauge widget onto the canvas next to the first one and enter MPH for the Legend property.
Drag CurrentSpeed from Selected Row(s) onto the second Gauge, then click Data.

Click here to view Part 3 of this guide.
Use the C SDK to build an app that connects to ThingWorx with persistent bi-directional communication.

GUIDE CONCEPT

This project will introduce more complex aspects of the ThingWorx C SDK and help you get started with development.

Following the steps in this guide, you will be ready to develop your own IoT application with the ThingWorx C SDK.

We will teach you how to use the C programming language to connect and build IoT applications to be used with the ThingWorx Platform.

YOU'LL LEARN HOW TO

Establish and manage a secure connection with a ThingWorx server, including SSL negotiation and connection maintenance
Enable easy programmatic interaction with the Properties, Services, and Events that are exposed by Entities running on a ThingWorx server
Create applications that can be directly used with your device running the C programming language
Basic concepts of the C Edge SDK
How to use the C Edge API to build a real-world application
How to utilize resources provided in the Edge SDK to help create your own application

Note: The estimated time to complete ALL 4 parts of this guide is 60 minutes.

Step 1: Completed Examples

Download the completed files for this tutorial: ThingWorx C Edge SDK Sample Files.zip.

This tutorial will guide you through working with the C SDK at differing levels. Use this file to see a finished example, and return to it as a reference if you become stuck creating your own fully fleshed-out application.

Keep in mind that this download uses the exact names for Entities used in this tutorial. If you would like to import this example and also create Entities on your own, change the names of the Entities you create.

Step 2: Environment Setup

In order to compile C code, you need a C compiler and the ThingWorx C Edge SDK. It will be helpful to have CMake installed on your system. CMake is a build tool that generates make or project files for many different platforms and IDEs.

Operating System Notes
Windows You will need a 3rd-party compiler such as MinGW GCC or Cygwin GCC, or you can follow these Microsoft instructions to download and use the Microsoft Visual C++ Build Tool.
Mac Download the Apple Developer Tools.
Linux/Ubuntu A compiler is included by default.

NOTE: You can use CMake version 2.6.1 or later to build projects or make files, which are then used to build the applications that you develop with the C SDK.

Before you can begin developing with the ThingWorx C SDK, you need to generate an Application Key and modify the source code file. You can use the Create an Application Key guide as a reference.

Modify Source File

Extract the files from the C SDK samples zip file. At the top level of the extracted files, you will see a folder called examples. This directory provides examples of how to utilize the C SDK.
Open a terminal, go to your workspace, and create a new directory. You can also just switch to the unzipped directory in your system.
After you've created this directory in your workspace, copy the downloaded files and folders into your new directory.
You can start creating your connection code or open the main.c source file in the examples\SteamSensor\src directory for an example.

Operating System Code
Linux/Ubuntu gedit main.c OR vi main.c
Mac open –e main.c
Windows start main.c

5. Modify the Server Details section at the top with the IP address for your ThingWorx Platform instance and the Application Key you would like to use.

Change the TW_HOST definition accordingly.
Change the TW_PORT definition accordingly. Change the TW_APP_KEY definition to the keyId value saved from the last step.   /* Server Details */ #define TW_HOST "https://pp-XXXXXXXXX.devportal.ptc.i" #define TW_PORT 80 #define TW_APP_KEY "e1d78abf-cfd2-47a6-92b7-37ddc6dd34618" NOTE: Using the Application Key for the default Administrator is not recommended. If administrative access is absolutely necessary, create a User and place the user as a member of Admins.   Compile and Run Code   To test your connection, you will only need to update the main.c in the SteamSensor example folder. CMake can generate Visual Studio projects, make build files or even target IDEs such as Eclipse, or XCode. CMake generates a general description into a build for your specific toolchain or IDE.   Inside the specific example folder you would like to run, ie SteamSensor. Create a directory to build in, for this example call it bin. mkdir bin  cd bin      5. Run the CMake command listed below. This assumes CMake is already on your PATH. cmake ..      6. CMake has now produced a set of project files which should be compatible with your development environment.   Operating System Command Note Unix make A set of make files Windows msbuild tw-c-sdk.sln /t:build A visual studio solution   NOTE: CMake does its best to determine what version of Visual Studio you have but you may wish to specify which version to use if you have more than one installed on your computer. Below is an example of forcing CMake to use a specific version of Visual Studio: cmake -G "Visual Studio 15 2017" .. If your version of Visual Studio or other IDE is unknown, use cmake -G to see a list of supported IDEs.   You also have the alternative of opening the tw-c-sdk.sln from within Visual Studio and building in this IDE.   NOTE: By default, CMake will generate a build for the creation of a release binary. If you want to generate a debug build, use the command-> cmake -DBUILD_DEBUG=ON ..       7. Once your build completes you will find the build products in the CMake directory (see example below). From here, open the project in your IDE of choice.   NOTE: You should receive messages confirming successful binding, authentication, and connection after the main.c file edits have been made.   Operating System Files Description Unix ./bin/src/libtwCSdk_static.a  Static Library Unix ./bin/src/libtwCSdk.so  Shared Library Unix ./bin/examples/SteamSensor/SteamSensor   Sample Application Windows .\bin\src\<Debug/Release>\twCSdk_static.lib  Static Library Windows .\bin\src\<Debug/Release>\twCSdk.dll  Shared Library Windows .\bin\examples\<Debug/Release>\SteamSensor\SteamSensor.exe  Sample Application     Step 3: Run Sample Code   The C code in the sample download is configured to run and connect to the Entities provided in the ThingWorxEntitiesExport.xml file. Make note of the IP address of your ThingWorx Composer instance. The top level of the exported zip file will be referred to as [C SDK HOME DIR].   Navigate to the [C SDK HOME DIR]/examples/ExampleClient/src directory. Open the main.c source file.   Operating System Command Linux/Ubuntu gedit main.c OR vi main.c Mac open –e main.c Windows start main.c   Modify the Server Details section at the top with the IP address for your ThingWorx Platform instance and the Application Key you would like to use. Change the TW_HOST definition accordingly.   NOTE: By default, TW_APP_KEY has been set to the Application Key from the admin_key in the import step completed earlier. 
Using the Application Key for the default Administrator is not recommended. If administrative access is absolutely necessary, create a User and place the User as a member of the Admins security group.

/* Server Details */
#define TW_HOST "127.0.0.1"
#define TW_APP_KEY "ce22e9e4-2834-419c-9656-e98f9f844c784c"

If you are working on a port other than 80, you will need to update the conditional statement within the main.c source file. Search for and edit the first line within the main function. Based on your settings, set the int16_t port to the ThingWorx Platform port.
Click Save and close the file.
Create a directory to build in; for this example, call it bin.

Operating System Command
Linux/Ubuntu mkdir bin
Mac mkdir bin
Windows mkdir bin

Change to the newly created bin directory.

Operating System Command
Linux/Ubuntu cd bin
Mac cd bin
Windows cd bin

Run the CMake command using your specific IDE of choice.
NOTE: Include the two periods at the end of the command as shown below. Use cmake -G to see a list of supported IDEs.

cmake ..

Once your build completes, you will find the build products in the bin directory, and you can open the project in your IDE of choice.
NOTE: You should receive messages confirming successful binding, authentication, and connection after building and running the application.

10. You should be able to see a Thing in your ThingWorx Composer called SimpleThing_1 with updated lastConnection and isConnected properties. SimpleThing_1 is bound for the duration of the application run time.

The instructions below will help you verify the connection.

Click Monitoring. Click Remote Things from the list to see the connection status.

You will now be able to see and select the Entity within the list.

Step 4: ExampleClient Connection

The C code provided in the main.c source file is preconfigured to initialize the ThingWorx C Edge SDK API with a connection to the ThingWorx Platform and register handlers. In order to set up the connection, a number of parameters must be defined. This can be seen in the code below.

#define TW_HOST "127.0.0.1"
#define TW_APP_KEY "ce22e9e4-2834-419c-9656-ef9f844c784c"
#if defined NO_TLS
#define TW_PORT 80
#else
#define TW_PORT 443
#endif

The first step of connecting to the platform is to establish the physical websocket: we call the twApi_Initialize function with the information needed to point to the websocket of the ThingWorx Composer. This function:

Registers messaging handlers
Allocates space for the API structures
Creates a secure websocket

err = twApi_Initialize(hostname, port, TW_URI, appKey, NULL, MESSAGE_CHUNK_SIZE, MESSAGE_CHUNK_SIZE, TRUE);
if (TW_OK != err) {
    TW_LOG(TW_ERROR, "Error initializing the API");
    exit(err);
}

To test against a server that uses a self-signed certificate, use the following line:

twApi_SetSelfSignedOk();

In order to disable HTTPS support and use HTTP only, call the twApi_DisableEncryption function. This is needed when using ports such as 80 or 8080. A call can be seen below:

twApi_DisableEncryption();

The following event handlers are all optional. The twApi_RegisterBindEventCallback function registers a function that will be called whenever a Thing is bound to or unbound from the ThingWorx Platform.
The twApi_RegisterOnAuthenticatedCallback function registers a function that will be called when the SDK has been authenticated by the ThingWorx Platform. The twApi_RegisterSynchronizeStateEventCallback function registers a function that will be called after binding and is used to notify your application about fields that have been bound to the ThingWorx Platform.

twApi_RegisterOnAuthenticatedCallback(authEventHandler, TW_NO_USER_DATA);
twApi_RegisterBindEventCallback(NULL, bindEventHandler, TW_NO_USER_DATA);
twApi_RegisterSynchronizeStateEventCallback(NULL, synchronizeStateHandler, TW_NO_USER_DATA);

NOTE: Binding a Thing within the ThingWorx Platform is not mandatory, but there are a number of advantages, including updating Properties while offline.

You can then start the client, which will establish the AlwaysOn protocol with the ThingWorx Composer. This protocol provides bi-directional communication between the ThingWorx Composer and the running client application. To start this connection, use the lines below:

err = twApi_Connect(CONNECT_TIMEOUT, RETRY_COUNT);
if (TW_OK != err) {
    exit(-1);
}

Click here to view Part 2 of this guide.
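Before moving on, the sketch below pulls the Step 4 connection calls together into one minimal main(). It is an illustration, not the shipped ExampleClient: the host, port, Application Key, and Thing name are placeholders to replace with your own values, the TW_URI_EX and CHUNK_SIZE defines are assumptions mirroring the SteamSensor sample, the handler bodies are deliberately trivial, and the twApi_Disconnect call at the end is an assumption about the shutdown API in your SDK version.

/* A minimal sketch of the Step 4 connection sequence (not the shipped example). */
#include "twApi.h"
#include "twLogger.h"

#define HOST            "127.0.0.1"                            /* placeholder */
#define PORT            80                                     /* placeholder, plain HTTP */
#define APP_KEY         "00000000-0000-0000-0000-000000000000" /* placeholder */
#define TW_URI_EX       "/Thingworx/WS"                        /* assumed websocket endpoint */
#define CHUNK_SIZE      8192                                   /* assumed message chunk size */
#define CONNECT_TIMEOUT 10000                                  /* ms */
#define RETRY_COUNT     3

static char thingName[] = "SimpleThing_1";

/* Handlers use the signatures shown in the bind and authentication examples above. */
static void bindHandler(char * entityName, char isBound, void * userdata) {
    TW_LOG(TW_FORCE, "%s is now %s", entityName, isBound ? "bound" : "unbound");
}
static void authHandler(char * credType, char * credValue, void * userdata) {
    TW_LOG(TW_FORCE, "Authenticated using %s", credType);
}

int main(void) {
    int err = twApi_Initialize(HOST, PORT, TW_URI_EX, APP_KEY, NULL,
                               CHUNK_SIZE, CHUNK_SIZE, TRUE);
    if (TW_OK != err) return err;

    twApi_DisableEncryption();          /* only because this sketch uses plain HTTP on port 80 */

    twApi_RegisterBindEventCallback(thingName, bindHandler, NULL);
    twApi_RegisterOnAuthenticatedCallback(authHandler, NULL);
    twApi_BindThing(thingName);         /* optional, but enables offline Property updates */

    err = twApi_Connect(CONNECT_TIMEOUT, RETRY_COUNT);
    if (TW_OK != err) return err;

    /* ... application work (tasks, Property updates, Services) would run here ... */

    twApi_Disconnect("Shutting down");  /* assumed shutdown call; adjust to your SDK version */
    return 0;
}

Compared with the full ExampleClient, this keeps only the connection plumbing so the order of operations (initialize, register handlers, bind, connect) is easy to see.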
Step 5: Limiting Composer Access

If you would like to limit a User even more, there are a few things you can do. Go back to the Administrator account and open one of the accounts we created, such as User.OtherAgencies. You will notice the Enabled and Locked checkboxes. Enabled allows you to set whether an account can be used in ThingWorx during runtime. Locked dictates whether an account can be logged into at all.

Suppose we would like the user to see nothing but an empty Composer when they try to access it. Follow the steps below to limit ThingWorx Composer access even more.

1. Open one of the Users we created earlier, e.g., User.OtherAgencies, and click on the User Profile tab. The user profile configuration allows an administrator to control which categories and entities should be displayed for an individual User.
2. You will see various sections and checkboxes. Uncheck all of them to remove access to importing, exporting, creating new Entities, seeing existing Entities, and much more.
3. Click Save.

Now if you attempt to log into the ThingWorx Composer with that account, you will notice a very different experience, without the ability to see current Entities. Perform this update for all the Users we created, except for User.IT and User.AgencySuperUser.

Step 6: Creating Clearance Levels

ThingWorx does not include default security clearance levels for you. What it does include are Thing Groups. Thing Groups are a referenceable Entity type in ThingWorx that can have Things and other Thing Groups as members. They also give ThingWorx administrators the ability to manage, at scale, the exposure of Things to only those who require access.

Before we create our first Thing Group, let us create some Entities that will house resources. The first will be an image that is top secret (shown below). In ThingWorx, this would be of type Media. After that, we will create a file repository that will contain super-secret documents, a repository for job applications, and another repository for documents that are publicly accessible.

Our Top Secret Image:

Create the Media Entity

Let us store our image in ThingWorx. This image will need extra credentials to access it. This authentication can be performed with a basic username/password setup or with SSO, utilizing your own configurations.

1. In the ThingWorx Composer, click the + New button in the top left.
2. In the dropdown list, click Media.
3. In the Name field, use TopSecretImage.
4. Set the Project field to an existing Project (e.g., PTCDefaultProject) and click Save.
5. Click Change and add an image, or use the image above.
6. Click on the Configuration tab.
7. For the Authentication Type field, select basic. You can select other types based on your Single Sign-On and server-level configurations, but we will keep this scenario simple.
8. Set a Username and Password that would be used to access our top secret Media.
9. Click Save.

Create the File Repositories

Let us create the setup for our repositories.

1. In the ThingWorx Composer, click the + New button in the top left.
2. In the dropdown list, click Thing.
3. In the Name field, use TopSecretDocuments and select FileRepository as the Base Thing Template.
4. Click Save.
5. Repeat steps 1-4 to create two File Repositories titled JobApplications and PublicDocuments.

Security Levels and Resource Lockdown

We now have several resources and areas for differing levels of access. We will create 3 Thing Groups to mimic security levels.
Our top-secret image will exist independently on ThingWorx, but also inside a file repository for some level of redundancy. That file repository will belong to one Thing Group, while the other two file repositories will have their own separate Thing Groups.

1. Open the TopSecretDocuments File Repository Thing.
2. Click on the Services tab.
3. Scroll down to the SaveImage Service and click the play button.
4. Enter a path (such as /SecretImage.png) for the image to reside on the server and click Change to add an image.
5. Click the Execute button.

You now have your image in a File Repository. Let us add this Entity to a Thing Group, then configure the permissions at the Thing Group level.

1. In the ThingWorx Composer, click the + New button in the top left.
2. In the dropdown list, click Thing Group.
3. In the Name field, enter Clearance.Top.
4. Set the Project field to an existing Project (e.g., PTCDefaultProject) and click Save.
5. Click the Services tab and click the play button to execute the AddMembers Service.
6. Click on the members Input Info Table and click the + Add button.
7. Enter TopSecretDocuments as the name of the member and Thing as the type.
8. Click Add and Save. Set the Project field to an existing Project (e.g., PTCDefaultProject).
9. With your members set, click Execute.
10. Repeat steps 1-9 to create two more Thing Groups and add the other File Repository Entities that we created earlier. Name these Thing Groups Clearance.Public and Clearance.HumanResources. If we wanted to, we could create a Thing Group to add here as a member of another Thing Group's hierarchy.

Thing Group Permissions

Time to set the permissions. With the Clearance.Top Thing Group selected, follow the instructions below. As mentioned before, in a production system you would have more Users and User Groups to completely set up this scenario.

1. Click Permissions.
2. For Visibility, enter PTCDefenseDepartment into the filter.
3. Expand the Organization, select the Agents unit, and click Save.
4. Click the Run Time tab.
5. Set the permissions for the Agency.Agents User Group to have full access as shown below.
6. Click Save.
7. Repeat steps 1-6 for our other security clearance Thing Groups. Set the permissions to a department and User Group that you see fit.

Step 7: Next Steps

Congratulations! You've successfully completed the Securing Resources and Private Data guide. In this guide, you learned how to:

Secure data and private information
Use Services, Alerts, and Subscriptions to handle processes without human interaction
Handle group and organization permissions

The next guide in the Utilizing ThingWorx to Secure Your Aerospace and Defense Systems learning path is Connecting External Databases and Model.

Learn More

We recommend the following resources to continue your learning experience:

Capability Guide
Build ThingWorx Solutions in Food Industry
Build Design Your Data Model
Build Implement Services, Events, and Subscriptions

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource Link
Community Developer Community Forum