IoT & Connectivity Tips

Step 3: Create Validation and Status

With our MyFunctionsMashup Mashup open, let's add a Validation. A Validation is similar to an Expression, except that it can trigger Events based on the true or false outcome of your validation. We will use the Validation to confirm that the Text Field we created contains only the values we added in our Functions. Let's also add two Status Message Functions that will show whether or not a user has added any text outside of what we want.

1. Open the MyFunctionsMashup Mashup to the Design tab.
2. Click the green + button in the Functions area.
3. In the New Function modal, select Validator.
4. Set the Name to isDataClean.
5. Click Next.
6. Click Add Parameter.
7. Set the Name to text and ensure the Base Type is STRING.
8. Add the following code to the Expression area:

if (text === "NO") {
    result = true;
} else if (text === "YES") {
    result = true;
} else {
    let array = text.split("YES").join("");
    array = array.split(",").join("");
    let count = array.trim().length;
    if (count) {
        result = false;
    } else {
        result = true;
    }
}

9. Click Done.

We have our Validator in place; now we need our two Status Message Functions. Why two? You could set up one Status Message to perform the task, but for this case, we're keeping things simple.

1. Click the + button in the Functions area.
2. Select Status Message in the dropdown.
3. Set the Name to GoodInputProvided.
4. Click Next.
5. Ensure Message Type is Info.
6. In the Message field, enter Text is all good!.
7. Click Done.

Let's create another Status Message Function.

1. Set the Name to BadInputProvided.
2. Click Next.
3. Change Message Type to Error.
4. In the Message field, enter Text is BAD!.
5. Click Done.

We now have two Status Message Functions and a Validator to help with checking our text data. Let's connect everything together. This time, let's use the Bind button.

1. Expand the Validator section in the Functions tab.
2. Click the Bind (arrows) button on the isDataClean Validator. This window makes configuring connections a bit easier.
3. Click the down arrow by the True Event.
4. Click Add Trigger Service.
5. Click Functions.
6. Check the checkbox by GoodInputProvided.
7. Click Next.
8. Click the down arrow by the False Event.
9. Click Add Trigger Service.
10. Click Functions.
11. Check the checkbox by BadInputProvided.
12. Click Next.

You should currently have the following setup:

Let's add in our connections to the Text Field and decide when we'll run this Validation.

1. Click the down arrow by the text Property.
2. Click Add Source.
3. With the Widgets tab selected, scroll down and select the Text Property of our Text Field.
4. Click Next.
5. Click the down arrow by Evaluate Service.
6. Select Add Event Trigger.
7. With the Widgets tab selected, scroll down and select the Clicked Event of our Button.
8. Click Next.

You should currently have the following setup:

9. Click Done.
10. Click Save and view your updated Mashup.

Your Validator is complete. You now have a way to tell when a user has entered their own text into the text box. To try things out, add some crazy characters, hit the button, and see what happens. You might notice that your Expressions run at the same time as your Validator. Switch up the bindings to get it to run the way you want it to.
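As an aside, the validator expression above can be written more compactly. The one-liner below is a sketch that preserves the same logic; it is not part of the guide's official steps:

// Equivalent to the if/else chain above: accept "NO", or any string that is
// empty after removing every "YES" and "," and trimming whitespace.
result = text === "NO" || text.replace(/YES|,/g, "").trim().length === 0;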
Step 4: Create Confirmation Modal

With our MyFunctionsMashup Mashup open, let's add a Confirmation Function. A Confirmation Function provides a quick modal that gives users a way to confirm actions or events before they take place. If you've ever almost deleted a production database (don't judge me!), then you know how handy a confirmation screen can be. Let's add a button that will trigger a confirmation asking whether we would like to run the Validator we created in the last section.

1. Open the MyFunctionsMashup Mashup to the Design tab.
2. Click the + button in the Functions panel.
3. Select Confirmation in the dropdown.
4. Set the Name to confirmDataValidation.
5. Click Next.
6. Set the Title Text to Confirm Data Validation?.
7. Set the Message Text to Would you like to perform a data validation?.
8. Set the Cancel Button Label to No Thanks!.
9. Scroll down and set the Action Button Label to Yes Sir!.
10. Click Done.
11. Click the Widgets tab in the top left.
12. Filter for and select a Button Widget.
13. Drag and drop the Button Widget onto the canvas.
14. With the new Button selected, click the down arrow that appears on the Button.
15. Drag and drop the Clicked Event of the Button onto the OpenConfirmation Service of our Confirmation Function.

We now have our Confirmation Function and a Button that will open the Confirmation when clicked. Let's finish by connecting the Confirmation to our Validation Function.

1. Click the Bind (arrows) button for our isDataClean Validator.
2. Click the down arrow by the Evaluate Service.
3. Select Add Event Trigger.
4. Click the Functions tab.
5. Select the ActionClick Event of our Confirmation Function.
6. Click Next.
7. Click Done.
8. Click Save for our Mashup.

We now have a way to independently validate that the text in our text box does not contain random values added by the user. View the Mashup and test things out by clicking the second button and adding some crazy characters to our text box.

Step 5: Create Navigation

Thus far, we have been sticking to one Mashup. Let's venture out a bit by showing a different Mashup. Inside our MyFunctionsMashup Mashup, we will add a Navigation Function. This will allow us to go to, or just show, a different UI based on some kind of user input or event. For our example, when a user clicks No Thanks! in our Confirmation Function, let's send them to a different Mashup.

Follow the steps below to create a Navigation Function, a destination Mashup, and tie the two together.

Create Destination Mashup

1. Navigate to Browse > Visualization > Mashups.
2. Click + New.
3. Keep the defaults and click OK.
4. In the Name field, type MyNavigationDestination.
5. Click Save.
6. Click the Design tab at the top to open the Mashup canvas.
7. Click the Widgets tab.
8. Filter and search for the word Label.
9. Drag and drop a Label Widget onto the Mashup canvas. If you like, enlarge the Label sizing.
10. In the Label Widget Properties section, scroll down to the LabelType Property.
11. Click the dropdown and select Large Header.
12. In the LabelText Property field, type MY DESTINATION UI SCREEN.
13. Click Save.

You have now created a simple UI that we will go to when we click our navigation button. Next, we'll tie together our navigation button and our freshly created Mashup.

Create Navigation Function

1. Reopen the MyFunctionsMashup Mashup to the Design tab.
2. Click the + button in the Functions panel.
3. Select Navigation in the dropdown.
4. Set the Name to travelToDestination.
5. Click Next.
6. Set the Target Window Type to ModalPopup.
7. Set the Pop-up Title to New Popup Here.
8. Set the Pop-up Width to 400.
9. Set the Pop-up Height to 400.
10. Click the + button at Target Mashup.
11. Type MyNavigationDestination into the search bar.
12. Select the MyNavigationDestination Mashup when it appears.
13. Click Done.
1. Select the Bind (arrows) button for our new travelToDestination Navigation Function.
2. Click the down arrow next to the Navigate Service.
3. Click Add Event Trigger.
4. Click the Functions tab.
5. Select the CancelClick Event of our confirmDataValidation Confirmation Function.
6. Click Done.
7. Click Save for the Mashup.

We now have a modal that will appear after we click the No Thanks! button in our Confirmation Function. View the Mashup and try out what you've done by clicking the bottom button, then clicking No Thanks!

Click here to view Part 3 of this guide.
Step 3: Important Factors

If the company cannot ship or deliver its products fast enough, the result is food waste and lost revenue. At the same time, having nonstop access to meaningful data about the logistics side of the company provides a new level of decision-making capability.

Let's first look at some of the pitfalls that cause bad logistics or leave room for improvement. We can keep these items in mind as we work on our application.

Customer behavior - More attention can be put on how customers are shopping in certain areas. It's not enough to know which areas are buying the most products and send them more shipments. Managing deadhead miles and load will help save unnecessary costs.

Shipment tracking and route planning - The traveling salesman problem is one that scientists have been working on for ages. There is no one solution to this problem, but there are many bad ones. Planning methods and routes is almost magic, but the more methodical the process, the more can be saved here. Something as simple as selecting routes based on the number of right turns to reduce gas can save millions in yearly gas expenses.

Method of travel utilization - Sea. Air. Road. Train. Each method has its benefits and downsides. When you pick a method, also work out how to utilize all the space provided. This could mean using smaller boxes, a different type of packaging material, or playing Tetris in a trailer.

Customer Models

Understanding your customer and their habits is of utmost importance. We'll start by creating some of the base models used for customers in this application. You will build on top of these models as you progress through this learning path.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Data Shape in the dropdown.
3. In the Name field, enter Fizos.Customers.DataShape. All of our customers will be based on this Data Shape.
4. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.
5. Click the Field Definitions tab and click the + Add button to add new Field Definitions.
6. Add the list of fields below:

Name | Base Type | Aspects | Description
ID | Integer | 0 minimum, primary key, default 0 | Row identifier
UUID | String | N/A | String used as a unique identifier across multiple platforms
Type | String | N/A | Type of customer (individual or another company)
Factors | Tags | Data Tag | Holds the data points or tags that help analyze a customer's characteristics and behavior
Name | String | N/A | Customer name
Email | String | N/A | Customer email
Address | String | N/A | Customer address
Phone | String | N/A | Customer phone number

The Field Definitions for the Fizos.Customers.DataShape Data Shape should match the following:

7. In the ThingWorx Composer, click + New in the top left of the screen.
8. Select Data Table in the dropdown and select Data Table in the prompt.
9. In the Name field, enter Fizos.Customers.DataTable. Our differing types of customers will be stored in this Data Table.
10. For the Data Shape field, select Fizos.Customers.DataShape.
11. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.
12. This entity will be used to house our data and provide assistance with our analytics.
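To sanity-check the new Data Table, you can insert a test customer from a ThingWorx Service. This is a minimal sketch, not one of the guide's required steps; the field values are illustrative, and the Factors tag field is omitted for brevity:

// Build an InfoTable matching Fizos.Customers.DataShape and add one row.
var values = Things["Fizos.Customers.DataTable"].CreateValues();
values.AddRow({
    ID: 1,
    UUID: "00000000-0000-0000-0000-000000000001", // illustrative
    Type: "individual",
    Name: "Test Customer",
    Email: "test@example.com",
    Address: "1 Main Street",
    Phone: "555-0100"
});
// Store the row in the Data Table.
Things["Fizos.Customers.DataTable"].AddDataTableEntry({
    sourceType: "Service",
    values: values
});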
Vehicle Models

To build a plan for your logistics solutions, you first need the data for your vehicles and factories. Let's begin housing this data to help us with our planning.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Data Shape in the dropdown.
3. In the Name field, enter Fizos.Vehicles.DataShape. All of our vehicles will be based on this Data Shape.
4. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.
5. Add the list of fields below:

Name | Base Type | Aspects | Description
ID | Integer | 0 minimum, primary key, default 0 | Row identifier
FactoryID | Integer | 0 minimum, default 0 | Factory row identifier
Location | Location | N/A | Vehicle location
Features | Tags | Data Tag | Holds the data points or tags that help plan what this vehicle can and will carry
Size | String | N/A | Vehicle size

The fields for the Fizos.Vehicles.DataShape Data Shape are as follows:

6. In the ThingWorx Composer, click + New in the top left of the screen.
7. Select Data Table in the dropdown and select Data Table in the prompt.
8. In the Name field, enter Fizos.Vehicles.DataTable. Our differing types of vehicles will be stored in this Data Table.
9. For the Data Shape field, select Fizos.Vehicles.DataShape.
10. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.

This entity will be used to house our data and provide assistance with our analytics.

Factory Models

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Data Shape in the dropdown.
3. In the Name field, enter Fizos.Factories.DataShape. All of our factories will be based on this Data Shape.
4. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.
5. Add the list of fields below:

Name | Base Type | Aspects | Description
ID | Integer | 0 minimum, primary key, default 0 | Row identifier
Location | Location | N/A | Factory location
Features | Tags | Data Tag | Holds the data points or tags that help plan what this factory can and will build
Size | String | N/A | Factory size

The fields for the Fizos.Factories.DataShape Data Shape are as follows:

6. In the ThingWorx Composer, click + New in the top left of the screen.
7. Select Data Table in the dropdown.
8. In the Name field, enter Fizos.Factories.DataTable. Our differing types of factories will be stored in this Data Table.
9. For the Data Shape field, select Fizos.Factories.DataShape.
10. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.

This entity will be used to house our data and provide assistance with our analytics.
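With vehicles and factories stored, later Services can query fleet data directly. As a minimal sketch (the factory ID and query are illustrative, not part of the guide's steps), finding all vehicles assigned to one factory might look like this:

// Find vehicle rows whose FactoryID matches a given factory.
var query = Things["Fizos.Vehicles.DataTable"].CreateValues();
query.AddRow({ FactoryID: 42 }); // illustrative factory row identifier
var vehicles = Things["Fizos.Vehicles.DataTable"].FindDataTableEntries({
    values: query,
    maxItems: 500
});
logger.info("Found " + vehicles.rows.length + " vehicles for factory 42");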
Centralized Logistics

Our application needs an efficient system of logistics. We already have sensors for our food entities, so let's work on moving in the right direction. We'll use a Thing Template so that our new Services can be overridden later if we so choose.

1. In the ThingWorx Composer, click + New in the top left of the screen.
2. Select Thing Template in the dropdown.
3. In the Name field, enter Fizos.Logistics. All of our product line will fit this abstract entity.
4. For the Base Thing Template field, select GenericThing.
5. Set the Project (i.e., PTCDefaultProject) and click Save to store all changes.
6. Add the list of Services below. The complexity of these Services will vary based on how you would like to structure your daily routine, the number of employees, the number of deliveries and facilities, etc.

Name | Return Type | Override | Async | Description
PerformDailyDeliveries | Nothing | Yes | Yes | Start the process of regular product deliveries.

The list of Services should look like the following:

Click here to view Part 3 of this guide.
This Groovy script gets the weather forecast for a given lat/long by calling an external web service. Use it in an Expression rule like this:

If: something
Then: SetDataItem("precipitation", round(ExecuteCustomObject("GetPrecipitation", location)))

This sets the data item "precipitation" to the value returned by this script.

Parameters:
Variable Name | Display Name
location | location (lat, lon)

import org.apache.commons.httpclient.methods.*
import org.apache.commons.httpclient.*
import java.text.SimpleDateFormat

def location = parameters.location.toString()
def locparts = location.split(',')
def lat = locparts[0]
def lon = locparts[1]

def hostname = "www.weather.gov"
def url = "/forecasts/xml/sample_products/browser_interface/ndfdXMLclient.php"
String ndfdElement = "pop12" // see http://www.weather.gov/forecasts/xml/docs/elementInputNames.php
def perceptTimeFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")

// query a 12-hour window starting now
def cal = Calendar.getInstance()
Date startDate = cal.getTime()
cal.add(Calendar.HOUR, 12)
Date endDate = cal.getTime()

def client = new HttpClient()
HostConfiguration host = client.getHostConfiguration()
host.setHost(hostname, 80, "http")
GetMethod get = new GetMethod(url)
NameValuePair[] params = new NameValuePair[6]
params[0] = new NameValuePair("lat", lat)
params[1] = new NameValuePair("lon", lon)
params[2] = new NameValuePair("product", 'time-series')
params[3] = new NameValuePair("begin", perceptTimeFormat.format(startDate))
params[4] = new NameValuePair("end", perceptTimeFormat.format(endDate))
params[5] = new NameValuePair(ndfdElement, ndfdElement)
get.setQueryString(params)

client.executeMethod(host, get)
message = "Status:" + get.getStatusText()
content = get.getResponseBodyAsString()
get.releaseConnection()

// parse the result XML; readings holds each probability-of-precipitation value
def dwml = new XmlSlurper().parseText(content)
readings = dwml.data.parameters."probability-of-precipitation".value.collect { Integer.parseInt(it.toString()) }
average = readings.sum() / readings.size()
// logger.info "Expected precipitation for $location is $readings"
return readings[0] // return the first (most current) reading
Step 3: Test Services and Save Test Cases

In the previous step, you created an Entity with a customized Service. You can easily test and update the Service within the same editing window. This allows you to quickly modify a Service and confirm that its new behavior functions as expected.

Test Execution

You can execute a Service both while you are still developing it and after you have created it.

During Development

1. Open the LineCheckSystem Thing.
2. Select the Services tab.
3. Click the ListHotLineParts link.
4. If you scroll to the bottom of the page, you will see the Execute window. Enter a value for the TemperatureThreshold and click Execute.

After Development

1. Open the LineCheckSystem Thing.
2. Select the Services tab.
3. Click the ListHotLineParts Execute play button.
4. Enter a value for the TemperatureThreshold and click Execute.

Saving Test Cases

Storing input parameters makes testing during and after development much faster. Whether you are testing base cases, specific scenarios, or throwing everything you possibly can at your Service, this tab allows you to store and name test cases for later use.

During Development

1. Open the LineCheckSystem Thing.
2. Select the Services tab.
3. Click the ListHotLineParts link.
4. If you scroll to the bottom of the page, you will see the Execute window. Enter a value for the TemperatureThreshold and click Save Input Set.
5. When prompted, enter a name for this test case and click Save Input Set.

After Development

1. Open the LineCheckSystem Thing.
2. Select the Services tab.
3. Click the ListHotLineParts Execute play button.
4. Enter a value for the TemperatureThreshold and click Save Input Set.
5. When prompted, enter a name for this test case and click Save Input Set.

While this example is pretty simple and straightforward, it is also possible to store other values and other data types. You will be provided with a dropdown of your stored test cases; just select a case and click Execute.

Step 4: Utilize Code Auto-Complete Feature

1. If it is not open already, open the ListHotLineParts Service or create a new one.
2. On a new line, enter ThingTemplates["LinePartTemplate"].
3. Add a period (.) after the closing bracket to gain access to the Services and Properties of the Thing Template we created earlier.

You will now see the Services and Properties that were created inside the LinePartTemplate or passed down in its object-oriented model.

Step 5: Test Code at Design Time with Linting

Linting is a process by which your development environment warns you of possible issues even before any attempt to run or test the code. Instead, an in-editor warning pops up that alerts you to the issue as you're writing your code. Linting is yet another feature that many IDEs provide outside of the IoT realm.

1. Select the Lint checkbox at the top of the Service-editing section.
2. Type any statement that would cause a Lint warning or error to pop up. In this case, note that it is best programming practice to use semicolons at the end of JavaScript statements.

Debugging

Services within Composer are based on the Rhino JavaScript engine, which is built using Java. You cannot currently watch the JavaScript code running within the browser, but you can still log what is occurring at each step of development and runtime. ThingWorx provides logging that goes to the ScriptLog, under the Monitor section on the left. The logger uses different log levels that will appear differently in the ScriptLog screen.
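As a quick illustration before the reference table below (a minimal sketch; the messages are arbitrary, and TemperatureThreshold is the input parameter from this guide), a Service can emit entries at each level, and each will appear in the ScriptLog accordingly:

// Each call writes one entry to the ScriptLog at its level.
try {
    logger.info("Service started");
    logger.debug("TemperatureThreshold = " + TemperatureThreshold);
    logger.warn("Threshold is unusually high");
} catch (err) {
    logger.error("Something went wrong: " + err.message);
}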
The available levels and their syntax:

Level | Syntax | Example
info | logger.info | logger.info("X + 1 = " + 5);
trace | logger.trace | logger.trace("Print this InfoTable - " + table.toJSON());
warning | logger.warn | logger.warn("Print random JSON data - " + JSON.stringify(data));
debug | logger.debug | logger.debug("Adding debug information here.");
error | logger.error | logger.error("What kind of error took place? " + err.message);

Testing Tips

From the Lint website, here are some common mistakes that JavaScript Lint looks for:
- Missing semicolons at the end of a line.
- Curly braces without an if, for, while, etc.
- Code that is never run because of a return, throw, continue, or break.
- Case statements in a switch that do not have a break statement.
- Leading and trailing decimal points on a number.
- A leading zero that turns a number into octal (base 8).
- Comments within comments.
- Ambiguity whether two adjacent lines are part of the same statement.
- Statements that don't do anything.

JavaScript Lint also looks for the following less common mistakes:
- Regular expressions that are not preceded by a left parenthesis, assignment, colon, or comma.
- Statements that are separated by commas instead of semicolons.
- Use of increment (++) and decrement (--) except for simple statements such as "i++;" or "--i;".
- Use of the void type.
- Successive plus (e.g. x+++y) or minus (e.g. x---y) signs.
- Use of labeled for and while loops.
- if, for, while, etc. without curly braces. (This check is disabled by default.)

Step 6: Next Steps

Congratulations! You've successfully completed the Application Development Tools, Tips & Tricks guide, and learned how to:
- Use Snippets to generate code
- Execute and test Services
- Save Service test cases to facilitate the QA process
- Utilize the code auto-completion feature
- Test code at design time with Lint warnings and errors

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Build | Create Custom Business Logic
Experience | Create Your Application UI

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
Support | Help Center
The ThingWorx Manufacturing Apps download includes both a setup.exe (for installation) and an uninstall.exe (for uninstallation).
This article explains how to monitor concurrent user logins on the Axeda Platform. It does this by creating an asset that monitors the solution. This asset has a data item that tracks the number of users logged in. You can use this data item to trend usage during the day or calculate the maximum per day, or you could raise an alarm if the number of users goes over some limit.

The following needs to be created on your Platform: a model named "Monitor" containing an analog data item named "userlogins", and an asset named "Metrics". This asset will receive the values.

Expression Rule

Create two expression rules that update the number of users, triggered on user login or logout:

Name: UserLoginMonitor
Type: Userlogin and Userlogout (two rules required)
IF: true
THEN: ExecuteCustomObject("getUserLogins", User.total)

This calls a script and passes in User.total, the total number of users that are concurrently logged into the system.

Groovy Script (Custom Object)

The next step is to write a Groovy script (Custom Object). You can copy and paste the following code into your Groovy script. Give the script the name getUserLogins; it takes one parameter, named logins.

import com.axeda.drm.sdk.device.DataItem
import com.axeda.drm.sdk.device.Device
import com.axeda.drm.sdk.Context
import com.axeda.drm.sdk.device.DataItemFinder
import com.axeda.drm.sdk.device.ModelFinder
import com.axeda.drm.sdk.device.DeviceFinder
import com.axeda.drm.sdk.data.DataValueEntry
import com.axeda.drm.sdk.device.Model
import com.axeda.drm.sdk.data.CurrentDataFinder
import com.axeda.drm.sdk.data.DataValue

def logins = parameters.logins
def ctx = Context.create()
def mod = loadModel("Monitor", ctx)
def dev = loadDevice("Metrics", mod, ctx)

// store the login count on the "userlogins" data item of the Metrics asset
DataItemFinder dif = new DataItemFinder(ctx)
dif.setModel(mod)
dif.setDataItemName("userlogins")
DataItem di = dif.find()
DataValueEntry dve = new DataValueEntry(ctx, dev, di, logins)
dve.store()

public void setDataItem(String dataItemName, Integer dataItemValue, Device device, Context context) {
    DataItemFinder dif = new DataItemFinder(context)
    dif.setDataItemName(dataItemName)
    dif.setModel(device.getModel())
    DataItem di = dif.find()
    DataValueEntry dve = new DataValueEntry(context, device, di, dataItemValue)
    dve.store()
}

public DataValue findCurrentDataItemValue(Device device, String dataItemName, Context ctx) {
    CurrentDataFinder cdFinder = new CurrentDataFinder(ctx, device)
    return cdFinder.find(dataItemName)
}

public DataItem loadDataItem(Model model, String dataItemName, Context ctx) {
    DataItemFinder iFinder = new DataItemFinder(ctx)
    iFinder.setDataItemName(dataItemName)
    iFinder.setModel(model)
    return iFinder.find()
}

public Model loadModel(String modelName, Context ctx) {
    ModelFinder mf = new ModelFinder(ctx)
    mf.setName(modelName)
    return mf.find()
}

public Device loadDevice(String serialNumber, Model model, Context context) {
    DeviceFinder df = new DeviceFinder(context)
    df.setSerialNumber(serialNumber)
    df.setModel(model)
    return df.find()
}

Whenever a user logs into the Axeda Platform, the Metrics asset will show the concurrent number of logged-in users on the Platform. This data item can be graphed to see the pattern of usage. If you want to take action based on the number of users, create an expression rule that raises an alarm when a threshold is reached.
This rule should be associated with the model "Monitor".

Name: LoginsCheck
Type: Data
IF: userlogins > 40
THEN: CreateAlarm("Login limit", 100, str(userlogins) + " users")

Now an alarm is created each time too many users are logged in. The alarm can be used for notifications or viewed on the Monitor asset.
We will host a live Expert Session, "ThingWorx Mashup 101 - Do's and Don'ts", on February 24th, 13h30 EST.

Please find below the description of the expert session and the registration link.

Expert Session: ThingWorx Mashup 101 - Do's and Don'ts
Date and Time: February 24th, 13h30 EST
Duration: 1 hour
Host: Aanjan Ravi - Technical Product Manager
Registration Here: https://www.ptc.com/en/events/thingworx-mashup-101

Description: This session covers the most common and useful tips about how to correctly use the Mashup Builder, Widgets, and Layouts - and what to avoid - to create applications that follow good UI/UX principles and are easier to maintain.

Existing recorded sessions can be found on the support portal using the keyword 'Expert Sessions'. You can also suggest topics for upcoming sessions using this small form.

Here are some recorded sessions that might be of interest. You can find recordings for the full library of webinars using the keyword 'Expert Sessions' in the PTC support portal search.

ThingWorx Active-Active Clustering - This session covers the main aspects of the High Availability Clustering feature launched with the ThingWorx 9.0 release. Recording Link

Upgrade to ThingWorx 9 - How to Plan / Evaluate Impacts - This session highlights the key points you should evaluate to properly plan your upgrade to ThingWorx 9. Recording Link

Top 5 items to check for ThingWorx Performance Troubleshooting - How do you troubleshoot performance issues in a ThingWorx environment? Here we cover the top 5 investigation steps that will help you understand the source of your environment issues and allow better communication with PTC Technical Support. Recording Link
Step 8: Troubleshooting

Issue | Resolution
CSS does not seem to be applied to the Mashup. | Verify your CSS is included in the runtime TWX CSS. Clear the browser cache if your CSS is not merged into the combined TWX CSS (under debug mode: hold the refresh button -> Clear Cache).
TWX fails to import the extension. | If the extension is already installed but you made recent changes, you need to bump the version number in the metadata.xml.
Recent CSS changes are ignored. | Clear the browser cache if your CSS file has changed recently.

Step 9: Next Steps

Congratulations! You've successfully completed the Add Style to Your UI with CSS guide, and learned how to:
- Create custom CSS classes using the integrated CSS editor
- Bind CSS classes to a Mashup and to individual Widgets
- Use media queries to dynamically apply styling

Learn More

We recommend the following resources to continue your learning experience:

Capability | Guide
Build | Application Development Tips & Tricks
Experience | ThingWorx Application Development Reference

Additional Resources

If you have questions, issues, or need additional information, refer to:

Resource | Link
Community | Developer Community Forum
Support | Help Center
Hi, I have attached a Postman collection that can be used as a template and modified. Steps to import the collection into Postman:

1. In your Postman window, click Import.
2. Once you have clicked Import, choose your file.
3. The collection is now visible on the left side of the window.
Data is NOT free. It is easy to overlook the cost of data collection, but all data incurs some cost when it is collected. Data collection in and of itself does not bring business value. If you don't know why you're collecting the data, then you probably won't use it once you have it.

For a wireless product, the cost is felt in the bytes transferred, which makes for an expensive solution but happy telcos. Even for wired installations, data transfer isn't free. Imagine a supermarket with 20 checkout lanes and only a 56K DSL line, where the connection is shared with the credit card terminals, so it is important to upload only the necessary data during business hours. For the end user, too much data leads to information clutter; it increases the time necessary to locate and access critical data.

All enterprise applications have some associated "infrastructure tax", and the Axeda Platform is no exception. This is the cost of maintaining the existing infrastructure, as well as increasing capacity through the addition of new systems infrastructure. This includes:
- The cost of the physical hardware
- The additional software licenses
- The cost of the network bandwidth
- The cost of IT staff to maintain the servers
- The cost of attached storage

Optimizing your data profile will maximize the performance of your existing infrastructure. Scaling decisions should be based on load, because 50,000 well-defined Assets can yield less data than 2,000 extremely "chatty" Assets.

Types of Data

To develop your data profile, first identify the types of data you're collecting:
- "Actionable Data": Used to drive business logic. This is your most crucial data, and tends to be real-time.
- "Informational Data": Changes at a very low rate, and represents properties of your Assets as opposed to status.
- "Historical Data": Sometimes you need to step back to appreciate a work of art. Historical data is best viewed with a wide lens to identify trends.
- "Payload Data": Data which is being packaged and shipped to an external system.

Actionable Data

Actionable Data controls the flow of business logic and has three common attributes:
- It tends to represent the status of the Asset
- It is typically the highest-priority data you will receive
- It usually occurs more frequently than other data

Informational Data

Informational Data is typically system or software data, for example:
- OS version
- Firmware information
- Geographical region

Historical Data

Historical Data represents the results of long-term operations and is typically used for operational review of trends. It may be sourced from Data Items, file uploads, or Web Services operations, and may feed the Axeda integrated business intelligence solution or internal customer BI systems.

Payload Data

Payload data travels through the Cloud to your system of record. In this case, the Axeda Platform is a key actor in your system, but its presence is not directly visible to the end user.

Data Types Key Points

Understanding the nature of your data helps to inform your data collection strategy. The four primary attributes are:
- Frequency
- Quantity
- Storage
- Format

Knowing what to store, what to process, and what to pass through for storage is the first key to optimizing your data profile. The "everything first" approach is an easy choice, but a tough one from which to realize value.
A "bottom up" or use-case driven approach will add data incrementally, and will reveal the subset of data you actually need to be collecting.Knowing your target audience for the data is the next step. A best practice to better understand who is trying to innovate and how they are looking to do it begins with questions such as the following: Is marketing looking for trends to highlight? Is R&D looking for areas to improve the product? Is the Service team looking to pro-actively troubleshoot assets in the field? Is Sales looking to sell more consumables? Is Finance trying to resolve a billing dispute? Answers to these questions will help determine which data contributes to solving real business problems. Most Service technicians only access a handful of pieces of information about an Asset while troubleshooting, regardless of how many they have access to. It’s important to close the information loop when finding out which data is actually being used.In addition to understanding the correct target audience and their goals, milestone events are also opportunities to revisit your strategy, specifically times like: New Model rollouts Migration to the Cloud New program launch Once your data profile has been established, the next phase of optimization is to plan the way the data will be received. Strategies Data Item vs. File Upload A decision should be made as to the best way to transfer data to the Axeda Platform, whether that is data items, events, alarms or file transfers. Here's a Best Practice approach that's fairly universal: Choose a Data Item if: (a)You are sending Actionable Data, or (b)You are sending discreet Informational Data Choose a File Upload if: (a)You are sending bulk Data which does not need to trigger an immediate response, or (b)You intend to forward the Data to an external system Agent-Side Business Logic Keep in mind that the Axeda Platform allows business logic to be implemented before transmitting any data. The Agent can be configured to determine when Data needs to be sent via numerous mechanisms: Scripts provide the ability to trigger on-demand uploads of data, either via a human UI interaction or an automated process The "Black Box" configuration allows for a rolling sample window, and will only upload the data in the window based on a configured condition Agent Rules Agent Rules allow the Agent to monitor internal data values to decide when to send data to the Cloud. Data can be continuously sampled and compared against configured thresholds to determine when a value worthy of transmission is encountered. This provides a very powerful mechanism to filter outbound data. The example below shows a graphical representation of how an Agent might monitor a data flow and transmit only when it reaches an Absolute-high value of 1200: Axeda provides a versatile platform for managing the flow of data through your Asset ecosystem. It helps to cultivate an awareness not only of what the data set is but what it represents and to whom it has value. While data is cheap, the hidden costs of data transmission make it worthwhile to do your "data profiling homework" or risk paying a high price in the longer term.
This video begins Module 3: Data Profiling of the ThingWorx Analytics Training videos. It describes the process of examining your data to make sure that it is suitable for the use case you would like to explore.
With ThingWorx, we can already use univariate anomaly alerts (on a single sensor value). However, in many situations the readings from an individual sensor may not tell you much about the overall issue, and a multivariate anomaly detector can be more useful. This post is intended to provide an overview of the Azure Anomaly Detector and how it can be integrated with ThingWorx.

The attachment contains:
- A document with detailed instructions for the setup
- A .csv file with the multivariate time-series dataset
- A .twx file with some entities that need to be imported into ThingWorx, as well as the CSVParser extension that needs to be installed
- A .zip file that will need to be uploaded to an Azure Blob Container at some point in the setup
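For orientation, calling a REST endpoint such as the Anomaly Detector from a ThingWorx Service typically goes through the ContentLoaderFunctions resource. The sketch below is illustrative only; the URL, header, and payload are placeholders, and the real endpoint paths and values come from the attached setup document:

// Hedged sketch of posting a request to an Azure Cognitive Services endpoint.
// Everything in angle brackets is a placeholder, not a real value.
var result = Resources["ContentLoaderFunctions"].PostJSON({
    url: "https://<your-resource>.cognitiveservices.azure.com/<detector-endpoint>",
    headers: {
        "Ocp-Apim-Subscription-Key": "<your-subscription-key>"
    },
    content: {
        // request body as described in the setup document
    },
    timeout: 60
});
logger.info("Anomaly Detector response: " + JSON.stringify(result));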
For a recent project, I needed to find all of the children of a particular template type in a Network Hierarchy, so I put together a little script that I thought I'd share. Maybe this will be useful to others as well.

In my situation, this script lived in the Location template. This was useful so that I could find all the Sensor Things under any particular node, no matter how deep they are.

For example, given a network like this:

Location 1
  Sensor 1
  Location 1A
    Sensor 2
    Sensor 3
    Location 1AA
      Sensor 4
  Location 1B
    Sensor 5

If you run this service in Location 1, you'll get an InfoTable with these Things: Sensor 1, Sensor 2, Sensor 3, Sensor 4, Sensor 5. From Location 1A: Sensor 2, Sensor 3, Sensor 4. From Location 1AA: Sensor 4. From Location 1B: Sensor 5.

For this service, these are the inputs/outputs:
Inputs: none
Output: InfoTable of type NetworkConnection

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(AlertSummary)
let result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName : "InfoTable",
    dataShapeName : "NetworkConnection"
});

// since the hierarchy could contain locations or sensors, we need to recursively loop down to get all the sensors
function findChildrenSensors(thingName) {
    let childrenThings = Networks["Hierarchy_NW"].GetChildConnections({
        name: thingName /* STRING */
    });
    for each (var row in childrenThings.rows) {
        // row.to has the name of the child Thing
        if (Things[row.to].IsDerivedFromTemplate({thingTemplateName: "Location_TT"})) {
            findChildrenSensors(row.to);
        } else if (Things[row.to].IsDerivedFromTemplate({thingTemplateName: "Sensor_TT"})) {
            result.AddRow(row);
        }
    }
}

findChildrenSensors(me.name);
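If this were saved as a Service on the Location template - say, under the hypothetical name GetDescendantSensors - calling it from another Service might look like the following sketch (the Thing and Service names are illustrative):

// Call the hierarchy-walking service on a Location Thing and log each sensor found.
var sensors = Things["Location 1"].GetDescendantSensors();
for each (var row in sensors.rows) {
    // each row is a NetworkConnection; "to" holds the child Thing's name
    logger.info("Found sensor: " + row.to);
}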
import com.axeda.drm.sdk.Context import com.axeda.drm.sdk.device.ModelFinder import com.axeda.drm.sdk.device.Model import com.axeda.drm.sdk.device.DeviceFinder import com.axeda.drm.sdk.data.CurrentDataFinder import com.axeda.drm.sdk.device.Device import com.axeda.drm.sdk.data.HistoricalDataFinder import net.sf.json.JSONObject /* * DataItemEachDevice.groovy * * Find the current data item and historical data items for all assets in a given model. * * @param model_name        -   (REQ):Str name of the model. * @param data_item_name    -   (REQ):Str name of the data item to query on. * @param from_time         -   (REQ):Long millisecond timestamp to begin query from. * @param to_time           -   (REQ):Long millisecond timestamp to end query at. * * @note from_time and to_time should be provided because it limits the query size. * * @author Sara Streeter <sstreeter@axeda.com> */ def response = [:] // measure the script run time def timeProfiles = [:] def scriptStartTime = new Date() try { // getUserContext is supported as of release 6.1.5 and higher     final def CONTEXT = Context.getUserContext() // confirm that required parameters have been provided     validateParameters(actual: parameters, expected: ["model_name", "data_item_name", "from_time", "to_time"]) // find the model     def modelFinder = new ModelFinder(CONTEXT)     modelFinder.setName(parameters.model_name)     Model model = modelFinder.findOne() // throw exception if no model found     if (!model) {         throw new Exception("No model found for ${parameters.model_name}.")     } // find all assets of that model     def assetFinder = new DeviceFinder(CONTEXT)     assetFinder.setModel(model)     def assets = assetFinder.findAll() // find the current and historical data values for each asset //note: since device will be set on the datafinders going forward, a dummy device is set on instantiation which is not actually stored     def currentDataFinder = new CurrentDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))     def historicalDataFinder = new HistoricalDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))     historicalDataFinder.startDate = new Date(parameters.from_time as Long)     historicalDataFinder.endDate = new Date(parameters.to_time as Long) // assemble the response     assets = assets.collect { Device asset ->         currentDataFinder.device = asset         def currentValue = currentDataFinder.find(parameters.data_item_name)         historicalDataFinder.device = asset         def valueList = historicalDataFinder.find(currentValue?.dataItem)         [                 id: asset.id.value,                 name: asset.name,                 serialNumber: asset.serialNumber,                 model: [id: asset.model.id.value, name: asset.model.name],                 current_data: currentValue.asString(),                 historical_data: valueList.collect { [timestamp: it.getTimestamp().format("yyyyMMdd HH:mm"), value: it.asString()] }         ]     }     response = [result: [items: assets]] } catch (def ex) {     logger.error ex     response += [             error: [                     type: "Backend Application Error", msg: ex.getLocalizedMessage()             ]     ] } finally { // create and output the running time profile     timeProfiles << createTimeProfile("DataItemEachDevice", scriptStartTime, new Date())     response += [params: parameters, meta: [:], timeProfiles: timeProfiles] } return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(response).toString(2)] private Map 
createTimeProfile(String label, Date startTime, Date endTime) {     [             (label): [                     startTime: [timestamp: startTime.time, readable: startTime.toString()],                     endTime: [timestamp: endTime.time, readable: endTime.toString()],                     profile: [                             elapsed_millis: endTime.time - startTime.time,                             elapsed_secs: (endTime.time - startTime.time) / 1000                     ]             ]     ] } private validateParameters(Map args) {     if (!args.containsKey("actual")) {         throw new Exception("validateParameters(args) requires 'actual' key.")     }     if (!args.containsKey("expected")) {         throw new Exception("validateParameters(args) requires 'expected' key.")     }     def config = [             require_username: false     ]     Map actualParameters = args.actual.clone() as Map     List expectedParameters = args.expected     config.each { key, value ->         if (args.options?.containsKey(key)) {             config[key] = args.options[key]         }     }     if (!config.require_username) { actualParameters.remove("username") }     expectedParameters.each { paramName ->         if (!actualParameters.containsKey(paramName) || !actualParameters[paramName]) {             throw new IllegalArgumentException(                     "Parameter '${paramName}' was not found in the query; '${paramName}' is a reqd. parameter.")         }     } } Sample Output: {   "result": {     "items": [{       "id": 4240,       "name": "ASVM_9",       "serialNumber": "ASVM_9",       "model": {         "id": 1535,         "name": "SimVM4"       },       "current_data": "142.0",       "historical_data": [{         "timestamp": "20120331 17:00", "value": "142.0"       }, {         "timestamp": "20120331 16:59", "value": "143.0"       }, {         "timestamp": "20120331 16:59", "value": "144.0"       }, {         "timestamp": "20120331 16:58", "value": "145.0"       }, {         "timestamp": "20120331 16:58", "value": "146.0"       }, {         "timestamp": "20120331 16:57", "value": "147.0"       }, {         "timestamp": "20120331 16:57", "value": "148.0"       }, {         "timestamp": "20120330 19:30",         "value": "0.0"       }]     }, {       "id": 4246,       "name": "ASVM_12",       "serialNumber": "ASVM_12",       "model": {         "id": 1535,         "name": "SimVM4"       },       "current_data": "138.0",       "historical_data": [{         "timestamp": "20120331 17:00",        "value": "138.0"       }, {         "timestamp": "20120331 17:00",        "value": "139.0"       }, {         "timestamp": "20120331 16:59",        "value": "140.0"       }, {         "timestamp": "20120331 16:59",        "value": "141.0"       }, {         "timestamp": "20120331 16:59",        "value": "142.0"       }, {         "timestamp": "20120331 16:59",        "value": "143.0"       }, {         "timestamp": "20120330 19:32",         "value": "0.0"       }]      //      // MORE ASSETS HERE      //     }]   },   "params": {     "username": "sstreeter",     "from_time": "1332272219000",     "data_item_name": "CurrentStock",     "sessionid": "JOQ5I7ofRXYA-RnA37Vk93bRUH718yoFF5 9p0JbCnfyoHolFprf",     "model_name": "SimVM4",     "to_time": "1335469008000"   },   "meta": {},   "timeProfiles": {     "DataItemEachDevice": {       "startTime": {         "timestamp": 1335469168725,         "readable": "Thu Apr 26 19:39:28 GMT 2012"       },       "endTime": {         "timestamp": 1335469180569,         "readable": 
"Thu Apr 26 19:39:40 GMT 2012"       },       "profile": {         "elapsed_millis": 11844,         "elapsed_secs": 11.844       }     }   } }
While working with the Axeda Platform you will come across guard rails that limit the sizes, recurrence, and duration of certain actions. When you run into these limitations, it may be an opportunity to re-examine the architecture of your solution and improve efficiency.

What this tutorial covers

This tutorial discusses the kinds of limits that exist across the Platform; it does not include the exact values of the limits, as these may vary across instances. Skip to the last section on System Configuration to see how to determine the read-only properties of your Axeda instance. You can also contact your Axeda Support technician to find out more about how these properties are configured.

Types of limits discussed:
- Rule Sniper
- Domain Object Field Length Constraints
- File Store Limits
- System Configuration

Avoiding Rule Sniper Issues

There are two ways a rule can be sniped based on statistics (recursive rules are handled differently): frequency count and execution time. When a rule is killed, an email is sent explaining the statistics behind the event. Here is what those numbers actually mean:

- CurrentAverageExecTime = loadExecTime / frequencyCount. This determines which rule is sniped: the longest-running rule on average, NOT the one running most often per time period.
- FrequencyCount = how many times this rule ran in this period.
- TotalExecTime = total time this rule has executed for; this is for the rule in general, not this period.
- MaxExecTime = longest time this rule has ever taken to run.
- ExecCount = number of times this rule has ever run.
- MaxFrequencyCount = max number of times this rule has ever run in a period.

The Rule Sniper monitors all the rules as a unit. When the entire system is beyond the "load point", it chooses the heaviest-hitting rule and kills it.

Some definitions:

Execution count - how many times the rule has run since it was last enabled.

Maximum execution time - the maximum time a rule can run. This is controlled by the following setting in your DRMConfig.properties: com.axeda.drm.rules.statistics.rule-time-threshold

Total execution time - the time that the rule actually ran.

Frequency count - how many times the same expression rule runs in a set period of time. The period of time is set in DRMConfig.properties by: com.axeda.drm.rules.statistics.rule-frequency-period

Maximum frequency count - the maximum number of times the expression can run.

Recursive expression rules - rules can be triggered by actions such as file uploads, device registration, and data item changes. A scenario may occur in which an Expression Rule initiates a Then or Else action that triggers itself, such as a Data type Expression Rule setting a data item. This scenario has led to the existence of the Rule Sniper, which disables Expression Rules that are triggered several times in quick succession. At times an Expression Rule may be sniped simply for being triggered too many times in too short a period of time, even though the rule was not recursive.

Setting a Data Item from a Data type Rule

In one scenario, one data item comes in, say Temperature, and you need to set a different data item, Climate, based on the value of Temperature. Without any checking, a Data type Rule that sets a Data Item Value will trigger itself, leading to a recursive rule execution that will be shut down by the Rule Sniper.
A way to do this without the rule being sniped is to check in the If expression that the data item change triggering the rule is the one we are interested in, as opposed to the data item that changed because it was set by the rule.

If: Temperature.changed && Temperature.value > 75
Then: SetDataItem("Climate", "Hot")

Since it was Climate that changed as a result of the Then statement, the rule will not be triggered again.

***Update: In an ironic twist of fate, it turns out that the solution above only works for data items that are set to be stored On Change rather than Stored data items. Stored data items are updated whenever a new value is entered, even if it is the same value. In this case, Temperature.changed would not trigger, because the value would be the same; only the timestamp would be different. This would matter if the same value could occur twice consecutively and the rule needed to trigger both times, but not on any other data item. The correct solution is the following:

If: (!Temperature.changed || Temperature.changed) && Temperature.value > 75
Then: SetDataItem("Climate", "Hot")

Admittedly inelegant, this works because if any other data item is passed in, Temperature will not be passed in, so there will be no value for Temperature.changed. If Temperature is passed in, it will trigger in either case (not changed if the value is the same, changed if it isn't).

An alternate solution is to make use of the consecutive property of the Expression Rule. "Execute action each time rule evaluates to true" corresponds to the consecutive property, which determines whether the rule will fire every time the If expression evaluates to true. If the consecutive property is true, it will fire every time. If it is false, the rule will trigger once when the If expression evaluates to true, and then it won't be triggered again until the If expression evaluates to false and then to true again.

With the consecutive property set to true, in our scenario above, whenever the Temperature changes and is over 75 the rule will set Climate to Hot. With consecutive set to false, the rule will set Climate to Hot once, and then Temperature will have to fall below 75 and rise above 75 again to trigger the rule again.

Recurring Actions

Sometimes you may need a recurring action to take place. An example would be if you don't need to evaluate a temperature in real time as it changes, but can check its status periodically. If the recurrence either requires or can tolerate a set delay, the best practice is to use a Rule Timer. A Rule Timer allows you to execute an Expression Rule on a schedule, much like a cron job; in fact, the Rule Timer syntax is expressed in crontab format.

To use a Rule Timer, create an Expression Rule of type System Timer or Asset Timer. An Asset Timer allows you to scope the rule to a certain set of assets like other rules, while a System Timer is not scoped to assets. This makes a System Timer more appropriate for a rule that executes a Custom Object, as opposed to one that creates an alarm directly on an asset.

Then create the Timer itself, which will allow you to set the schedule: navigate to Configuration > New > Rule Timer.

With a Rule Timer, you can run a rule automatically with a preset delay and avoid the recurrence limit on the rule.
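For illustration, a System Timer rule that periodically re-checks temperatures might be configured like the sketch below. The rule name, schedule, and Custom Object name are hypothetical, and the exact crontab field layout for your Platform version is described in the product help:

Name: PeriodicTemperatureCheck
Type: System Timer
Schedule (crontab-style): 0 0/15 * * * (illustrative: every 15 minutes)
IF: true
THEN: ExecuteCustomObject("checkTemperatures")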
For more information on the Rule Sniper, there is a Salesforce Knowledgebase Solution article available to Axeda customers called "What are the Rule Sniper and Rule Executor Monitor Features For and How Do They Work?", as well as the Rules Design and Best Practices Guide.

Domain Object Field Length Constraints

Every stored object has limits on the length of its fields, such as name and value. If a script attempts to store a value for a field that exceeds the field length constraints, the value will be truncated to the maximum limit. The maximum size of a data item value in the database is 4000 bytes. Two additional constraints are a limit on the number of lines in a custom object (typically 1000 lines) and on the size of a stored data accumulation that can be read out as a string (1 MB).

The Help documentation available through the Axeda Applications Console contains information regarding field constraints (such as the Help pages on String Length Constraints at http://<<yourdomain>>.axeda.com/help/en/rule_action_data_entry_string.htm).

Limits on File Store

Configurable quota limits exist on files that can be uploaded to the Axeda File Store via the SDK v2 FileInfoBridge. These limits will prevent creating FileUploadSessions, creating or updating FileInfos, or uploading file data if they are exceeded:
- File count: maximum number of files that can be stored on the system
- Maximum file size: the maximum size of any one file
- Total stored bytes: the total bytes for all files that may be stored on the system

The configuration of these limits can be found on your system by navigating to Administration > System Configuration, as described below, and searching for "file" in the Read-Only Properties.

System Configuration

The System Configuration link under the Administration tab is a useful reference for viewing read-only properties of how your instance is configured. Check here when troubleshooting to determine any limit that may influence your app's implementation.

Common Question

An expression rule has a Data trigger, and in the Then statement it sets a data item. Why is it getting disabled?

Answer: The rule is being recursively triggered, so the Rule Sniper is disabling it.
Connect IoT data from devices to Widgets that display in your application UI.

GUIDE CONCEPT

This project will demonstrate how to bind a data source to a Widget. Following the steps in this guide, you will be able to show state-based changes resulting from data updates. We will teach you how to connect your backend data to the Widgets in your Mashup; ThingWorx facilitates this process with built-in functionality.

YOU'LL LEARN HOW TO

Bind data to Widgets in a ThingWorx Mashup

NOTE: This guide's content aligns with ThingWorx 9.3. The estimated time to complete this guide is 30 minutes.

Step 1: Completed Examples

Download the complete data binding example using the file BindDataEntities.zip attached to this guide. Within this file, you will find the Entities referenced in this lesson, including a finished application. Import and utilize this file to see a finished example, and return to it as a reference if you become stuck during this guide and need some extra help or clarification.

Keep in mind, this download uses the exact names for Entities used in this tutorial. If you would like to import this example and also create Entities on your own, change the names of the Entities you create.

Step 2: Bind Data to Widget

In order to display data from connected devices, each Widget must be connected to a data source. You should already be familiar with how to find the Widgets in the top-left panel of a Mashup screen.

Mashup Areas

In the top right of the screen in a Mashup Entity is the Data Panel. This is where Entities and Services are used to bring in data and added functionality. This area also includes the Session Tab, which contains data stored in the session; you can learn more about that in the Create Session Parameters guide. You can also filter for specific Properties.

In the lower right of the screen in a Mashup Entity is the Data Properties Panel. This is where you can configure how your Service calls react to different Events. For example, you might want to call a Service as soon as another Service call completes; you'll do that in this section. You will also notice the Functions Tab. This tab enables you to create custom functionality for your Mashup and Widgets, such as navigating to a different Mashup on the click of a button.

In the lower left of the screen in a Mashup Entity is the Widget Properties Panel. This is where you can not only customize your Widget but also connect data directly from Services and other Widgets. You will also notice the Style Properties Tab, which provides access to the styling and themes used for a Widget. You can also filter for specific Properties.

In the bottom middle of your screen, you'll notice the Bindings Panel. This panel shows you where your connections are in reference to Widgets, Services, and any Events that are being used to connect them. Whenever you have trouble thinking through the flow of your Mashup, look down at this panel to get a quick overview. You'll also notice the Reminder Tab in this area. This tab helps with things you might have forgotten to do when setting up your data binding, such as setting the display field for a Widget.

Let's now move forward with setting up our data binding.

Add Service

1. Open the HelloWorldPlayground Mashup, then drag and drop a Grid Advanced Widget onto the left-hand column of your canvas. NOTE: If a pop-up appears about adding a Panel, choose Yes.
2. Click + in the Data panel.
3. Search for and select the HelloWorldData Data Table from the search bar in the top left of the home screen.
4. Search for the GetFirstEntry Service and click the blue arrow. The GetFirstEntry Service is part of the DataMagicians.XML file you imported.
5. Check the Execute on Load checkbox. This makes the page load with content for a Widget automatically.
6. Click Done.

Bind Data to Widget

We will bind data to a Widget, but also make the Widget editable. When making a field Editable, keep in mind any connections or Entities involved. In this case, the World field is attached to an Entity.

In the Data panel, expand the Returned Data section. Drag and drop the All Data field of the GetFirstEntry Service to the Grid Advanced Widget. A pop-up will appear asking you to select the binding target; select Data.
In the Widget Properties panel, check the checkbox for IsEditable. This will allow users to edit the data in the Property fields.
Select the Configure (Gear) button. The Configure Grid Columns window will open.
Uncheck any fields you would like hidden (for instance, the boxes next to timestamp and key).
Click the source field. Check the Editable checkbox at the bottom.
Click Done to close the pop-up.

Data Panel Buttons

There is an assortment of helpful buttons in the Data panel to make our lives easier.

Top Row of Buttons

The + button that you used before adds more Entities as a resource to make Service calls. The circular button next to it provides reload functionality. This is useful when you've made a change to a Service and would like for it to appear here, for example, after adding a new parameter to a Service call.

Buttons by Entity

The i button provides information and access to the Entity in question. The Add Service button adds a Service that belongs to the same Entity, which helps save time. The last button is the Delete button. This will delete the Entity from the list of resources that can make Service calls.

Buttons by Service

The only button is the Delete button. This will delete the Service from the list of Service calls an Entity has available.

Adding More Functionality

1. Click the Add Service button on the HelloWorldData Entity in the Data panel.
2. Search for the UpdateDataTableEntry Service and click the arrow to select it.
3. Leave the Execute on Load checkbox unchecked and click Done. Leave Execute on Load deselected when you want a Service to run in response to user input rather than automatically at startup.
4. Click the UpdateDataTableEntry Service, then click the Data Binding button in the bottom right.
5. Click the arrow next to the Values Property under Parameters. Click Add Source.
6. When the list of Widgets appears, click the EditedTable Property of the Grid Advanced Widget that we added. Click Next and Done.
7. Select the Grid Advanced Widget on your canvas. Drag and drop the EditCellCompleted Event onto the UpdateDataTableEntry Service.
NOTE: The columns returned by a data service are shown under the All Data section of the data service. They are outbound-bindable, indicated by the outward-facing arrows. When a data service's All Data or an individual column is bound, the arrows become filled in. You can also bind the data from one Widget to another.
8. Click Save and View Mashup.

You have just bound data from a Service to a Widget. There are many different Widgets, and the process for binding data to a Widget is often similar.
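If you want to sanity-check the data behind the Widget, you can invoke the same Service outside the Mashup. The following is a minimal Groovy sketch, assuming a local Foundation server and a valid Application Key (both values below are placeholders); it calls GetFirstEntry on the HelloWorldData Data Table through the ThingWorx REST API and prints the same rows the Grid Advanced Widget displays.

import groovy.json.JsonSlurper

// Placeholder values -- substitute your Foundation URL and a real Application Key.
def host   = "https://localhost:8443"
def appKey = "<your-application-key>"

// Services are invoked with a POST to /Thingworx/Things/<Thing>/Services/<Service>.
def conn = new URL("${host}/Thingworx/Things/HelloWorldData/Services/GetFirstEntry")
        .openConnection() as HttpURLConnection
conn.requestMethod = "POST"
conn.setRequestProperty("appKey", appKey)
conn.setRequestProperty("Accept", "application/json")
conn.setRequestProperty("Content-Type", "application/json")
conn.doOutput = true
conn.outputStream.withWriter { it << "{}" }   // GetFirstEntry is assumed to take no inputs

// The rows field of the returned InfoTable is what the Grid's All Data binding carries.
new JsonSlurper().parse(conn.inputStream).rows.each { println it }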
Within Composer, you can simply drag and drop to bind configurations for Events and Properties.

TIP: As an extension to this lesson, edit the Property Display Widget to update the entry in the HelloWorldData Data Table. Note that editing the name of the World will result in no entries being updated. To add or update an entry when you update the Name property, use the AddOrUpdateDataTableEntry Service instead of UpdateDataTableEntry.

Step 3: Next Steps

If you have questions, issues, or need additional information, refer to:

Resource      Link
Community     Developer Community Forum
Support       Help Center
When an Expression Rule of type Data calls a Groovy script, the script is provided with the implicit object dataItems. This example shows how the dataItems object can be used to get data item information (value, name, type, and update time).

import com.axeda.drm.sdk.data.*
import com.axeda.drm.sdk.device.DataItem

try {
    def deviceName = context.device.name
    // The implicit object dataItems passes in a list of DataItem objects
    def dataItemsList = dataItems
    for (dio in dataItemsList) {
        logger.info("Checking " + dio.name + " Value: " + dio.value)
        if (dio.name == "updateTime") {
            // perceptType is analog, digital, or string
            logger.info("Found: " + dio.name + " Value: " + dio.value +
                    " Type: " + dio.perceptType + " Last Updated: " + new Date(dio.timeInMillis))
        }
    }
} catch (Exception e) {
    logger.error e.message
}
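For context, a script like this is typically saved as a Custom Object and invoked from the Then clause of a Data-type Expression Rule. A hypothetical rule definition might look like the following (the custom object name LogDataItems is an illustrative assumption):

Type: Data
If:   true
Then: ExecuteCustomObject("LogDataItems")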
Generate engine-failure predictions and gain insight into your data with machine learning.

GUIDE CONCEPT

This guide will upload captured data from an Edge MicroServer (EMS) "Engine Simulator" to ThingWorx Analytics Builder.

Following the steps in this guide, you will create an analytical model, and then refine it based on further information from the Analytics platform.

We will teach you how to determine whether or not a model is accurate and how you can optimize both your data inputs and the model itself.

NOTE: This guide's content aligns with ThingWorx 9.3. The estimated time to complete ALL parts of this guide is 60 minutes.

YOU'LL LEARN HOW TO

Load an IoT dataset
Generate machine learning predictions
Evaluate the analytics output to gain insight

Step 1: Scenario

In this guide, we're continuing the same MotorCo scenario, where an engine can fail catastrophically in a low-grease condition.

In previous guides, you've gathered and exported engine vibration-data from an Edge MicroServer (EMS).

The goal of this guide is to now import that previously-exported Comma-Separated Values (.csv) data into ThingWorx Analytics, and then create an analytical model for predictive maintenance.

Analytical model creation can be extremely helpful for the automotive segment in particular. For instance, each car that comes off the factory line could have an EMS constantly sending data from which an analytical model could automatically detect engine trouble.

This could enable your company to offer an engine monitoring subscription service to your customers.

This guide will show you how to build an analytic model of your engine to facilitate this monitoring service.

Step 2: Upload Simulated Data

This guide assumes that you are using either the hosted trial (which has both Foundation and Analytics pre-installed) or a combination of the Foundation and Analytics downloadable installers.

To confirm that Foundation is communicating with Analytics, perform the following steps:

On the ThingWorx Foundation left-side navigation column, click Analytics > Analytics Builder > Settings.
At the top-right in the Analytics Server Version field, ensure that you see an appropriate version number.

NOTE: If you use your own dataset, your results in the following steps may differ from those produced by the provided dataset.

If you were unable to generate a 30,000+ entry dataset in the last guide, then you may download testCSVfile.csv attached here instead. You will also need to download and extract vibration_metadata.zip, which describes each column of the dataset; a sketch of this field-configuration format appears at the end of this guide.

On the left, click Analytics Builder > Data.
Under Datasets, click New....
In the Dataset Name field, enter simulated_dataset.
In the File Containing Dataset Data section, search for and select testCSVfile.csv.
In the File Containing Dataset Field Configuration section, search for and select vibration_metadata.json.
Click Submit. Note that the time it takes to import the dataset is determined by its size.

Step 3: Simulated Signals and Profiles

The Signals section of ThingWorx Analytics looks for the single field in the dataset that is most statistically correlated with your selected goal.

This doesn't necessarily indicate that the field is the cause of your goal, whether maximizing or minimizing. It just means that, in this dataset, that single field happens to correlate with the goal you desire.

On the left, click Analytics Builder > Signals.
At the top, click New….
In the Signal Name field, enter simulated_signal.
In the Dataset field, select simulated_dataset.
Click Submit.
Wait ~30 seconds for the Signal State to change to COMPLETED.

Unfortunately, our results aren't very good. Or, more accurately, they're too good.

Our simulated dataset has some noise in it from adding random values to the five frequency bands on each of our two sensors. However, ThingWorx Analytics has instantly seen through that noise and discarded it. Instead, it has detected only that s2_fb5 isn't relevant.

If you look back at the Use the EMS to Create an Engine Simulator guide, you'll see that s2_fb5 has the same base value between both a "good grease" and a "bad grease" condition, i.e. a base of 190.

This already shows that Analytics is working, though. Since s2_fb5 didn't change between good and bad grease conditions, our Signal analysis is indicating that it's not relevant to our model.

Profiles

Now, let's do the same for a Profile.

The Profiles section of ThingWorx Analytics looks for combinations of data which are highly correlated with your desired goal.

On the left, click Analytics Builder > Profiles.
Click New....
In the Profile Name field, enter simulated_profile.
In the Dataset field, select simulated_dataset.
Click Submit.
Wait ~30 seconds for the Profile State to change to COMPLETED.

Just like with Signals, our Profile is too good. In fact, Analytics is indicating that s1_fb2 by itself is the primary indicator of good vs. bad grease conditions.

This is likely due to random chance. The random noise added to s1_fb2 just happened to be slightly less than that of the other frequency bands, so everything else was discarded.

Regardless, ThingWorx Analytics is quickly seeing through our simulated data.

Next, we'll actually create a Model using the simulated dataset.

Click here to view Part 2 of this guide
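TIP: The dataset field configuration selected in Step 2 is a JSON description of each column's name and type. The Groovy sketch below generates a plausible configuration for this dataset; the field names, types, and JSON shape are assumptions for illustration, so check them against the actual vibration_metadata.json you downloaded, since the authoritative format is defined by your Analytics version.

import groovy.json.JsonBuilder

// Assumed column layout: five frequency bands per sensor plus a boolean goal field.
// Verify names and types against the real vibration_metadata.json before relying on this.
def fields = []
(1..5).each { band ->
    fields << [fieldName: "s1_fb${band}", dataType: "DOUBLE", opType: "CONTINUOUS"]
    fields << [fieldName: "s2_fb${band}", dataType: "DOUBLE", opType: "CONTINUOUS"]
}
fields << [fieldName: "low_grease", dataType: "BOOLEAN", opType: "BOOLEAN"]

new File("vibration_metadata.json").text = new JsonBuilder(fields).toPrettyString()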
Adaptive Machine Messaging Protocol

The Adaptive Machine Messaging Protocol (AMMP) is a simple, byte-efficient, lightweight messaging protocol used to facilitate Internet of Things (IoT) communications and to build IoT connectivity into your product. Using a RESTful API, AMMP provides a semantic structure for IoT information exchange and leverages HTTPS as the means for sending and receiving messages between an edge device and the Axeda® Machine Cloud®. AMMP uses JavaScript Object Notation (JSON), allowing any device that is capable of making an HTTP transmission to interact with the Axeda Platform. By utilizing a common network transport that is friendly to local network proxies and firewalls, and at the same time using JSON for a compact, human-readable, language-independent, and easily constructed data representation, AMMP simplifies device communication and reduces the work needed to connect to the Axeda Machine Cloud.

For complete information about the Adaptive Machine Messaging Protocol, refer to the Adaptive Machine Messaging Protocol (AMMP) Technical Reference.

AMMP Toolkits

The AMMP Toolkits are libraries that allow you to connect your devices to the Axeda Platform using AMMP. The AMMP Toolkits support transmission of data, alarms, events, and locations; error handling and reporting; and file exchange with the Axeda Platform.

AMMP Android-Based Toolkit
The AMMP Android-Based Toolkit library conforms to the AMMP Protocol Version 1.1.
AMMP Android Toolkit
AMMP Android Toolkit Developers Reference
AMMP Protocol v1.1 Technical Reference

AMMP Java-Based Toolkit
The AMMP Java-Based Toolkit library conforms to the AMMP Protocol Version 1.1.
AMMP Java Toolkit
AMMP Java Toolkit Developers Reference
AMMP Protocol v1.1 Technical Reference

AMMP C-Based Toolkit
The AMMP C-Based Toolkit library conforms to the AMMP Protocol Version 1.1.
AMMP C Toolkit
AMMP C Toolkit Developers Reference
AMMP Protocol v1.1 Technical Reference

The above resources may be found at the PTC Support Portal.
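Because AMMP rides on plain HTTPS and JSON, a device without one of the toolkits can still participate using nothing more than an HTTP client. The Groovy sketch below illustrates only the general pattern: the endpoint path and message body are placeholders, not the actual AMMP message schema, which is defined in the AMMP Technical Reference.

import groovy.json.JsonBuilder

// Illustrative only: the URL and payload shape are placeholders, not the real AMMP schema.
def endpoint = "https://platform.example.com/ammp/data"   // hypothetical ingestion URL
def payload  = new JsonBuilder([temperature: 72.4, pressure: 29.9]).toString()

def conn = new URL(endpoint).openConnection() as HttpURLConnection
conn.requestMethod = "POST"
conn.setRequestProperty("Content-Type", "application/json")
conn.doOutput = true
conn.outputStream.withWriter { it << payload }
println "Platform responded: ${conn.responseCode}"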
Learn how to connect ThingWorx Kepware Server to Foundation.

Guide Concept

This guide will teach you how to create a backend Data Model in ThingWorx Foundation that works with ThingWorx Kepware Server to collect data from an Allen-Bradley PLC and send it to ThingWorx Foundation.

You'll learn how to

Create a Data Model in ThingWorx Foundation that accepts information from ThingWorx Kepware Server

NOTE: The estimated time to complete this guide is 30 minutes.

Step 1: Learning Path Overview

If you are using this guide as part of the Rockwell Automation Learning Path, you have now completed each of the following installations:

Connected Components Workbench
ThingWorx Kepware Server
ThingWorx Foundation (for Windows)

You've also connected an Allen-Bradley PLC to Connected Components Workbench and then to ThingWorx Kepware Server.

In this guide, we'll propagate that information further, from ThingWorx Kepware Server into ThingWorx Foundation.

Step 2: Create Gateway

To make a connection between ThingWorx Kepware Server and Foundation Server, you must first create a Thing.

WARNING: To avoid a timeout error, create a Thing in ThingWorx Foundation BEFORE attempting to make the connection in ThingWorx Kepware Server.

In ThingWorx Composer, click Browse.
On the left, click MODELING -> Things.
Click + NEW.
In the Name field, enter IndConn_Server, including matching capitalization.
In the Description field, enter an appropriate description, such as Industrial Gateway Thing to connect to ThingWorx Kepware Server.
If Project is not already set, click the + in the Project text box and select the PTCDefaultProject.
In the Base Thing Template field, enter indus, then select the IndustrialGateway Thing Template from the sorted list.
Click Save.

Step 3: Connect to Foundation

Now that you've created an Industrial Gateway Thing and an Application Key, you can configure ThingWorx Kepware Server to connect to ThingWorx Foundation.

Return to the ThingWorx Kepware Server Windows application.
Right-click Project. Select Properties.
In the Property Editor pop-up, click ThingWorx.
In the Enable field, select Yes from the drop-down.
In the Host field, enter the IP address of your ThingWorx Foundation server.
Enter the Port number. If you are using the "hosted" Developer Portal trial, enter 443.
In the Application Key field, copy and paste the Application Key you just created.
In the Trust self-signed certificates field, select Yes from the drop-down.
In the Trust all certificates field, select Yes from the drop-down.
In the Disable encryption field, select No from the drop-down if you are using a secure port. Select Yes if you are using an http port.
Type IndConn_Server in the Thing Name field, including matching capitalization.
If you are connecting with a remote instance of ThingWorx Foundation and you expect any breaks or latency in your connection, enable Store and Forward.
Click Apply in the pop-up. Click Ok.

In the ThingWorx Kepware Server Event window at the bottom, you should see a message indicating Connected to ThingWorx.

NOTE: If you do not see the "Connected" message, repeat the steps above, ensuring that all information is correct. In particular, check the Host, Port, and Thing Name fields for errors.

Step 4: Bind Industrial Tag

Now that you've established a connection, you can use ThingWorx Foundation to inspect all available information in ThingWorx Kepware Server.
This step will create a new Thing in ThingWorx Foundation representing two output coils of the PLC.

Create Thing for PLC coils

In ThingWorx Foundation Composer, on the left, click MODELING -> Industrial Connections, then click IndConn_Server.
At the top, click Discover.
Expand Channel2, then click myPLC.
Select the check-boxes next to Coil2 and Coil3, then click Bind to New Entity.
Scroll to select RemoteThing, then click OK.
Enter the name PLCcoils. If Project is not already set, click the + in the Project text box and select the PTCDefaultProject. Then click Save.

Test ThingWorx Foundation to PLC Communication

Click the Properties and Alerts tab.
Confirm that the isConnected Property has a check in the Value field, indicating a good connection between ThingWorx Kepware Server and the PLC.
Click the pencil icon in the Coil3 line to open the edit panel, click the True radio button, then click the save checkmark button.

You should hear a soft click from the PLC, and the Output 3 indicator will illuminate. ThingWorx Foundation is now controlling the PLC through its connection to ThingWorx Kepware Server.

Step 5: Troubleshooting

If the connection to the PLC stops working and there is a Thumbs Down icon next to your Properties, the ThingWorx Kepware Server trial edition drivers are not connected to your PLC. The trial edition stops running after 2 hours and must be stopped and restarted.

Right-click the ThingWorx Kepware Server icon in the system tray.
Click Stop Runtime service.
Wait a minute for the process to stop, then click Start Runtime service.

If Connected Components Workbench does not connect to the PLC, check the IP address of the PLC using the RSLinx Classic software that was installed as part of Connected Components Workbench. RSLinx Classic is located at Start > All Programs > Rockwell Software > RSLinx > RSLinx Classic. Click AB_ETHIP-1, Ethernet, and the IP addresses of connected PLCs will be discovered.

NOTE: A changed PLC IP Address (typically seen through Connected Components Workbench) will require an IP Address change in the ThingWorx Kepware Server settings.

Step 6: Next Steps

Congratulations! You've successfully completed the Model an Allen-Bradley PLC guide. You've learned how to:

Create a data model that can accept information from ThingWorx Kepware Server
Connect ThingWorx Kepware Server to Foundation

The next guide in the Using an Allen-Bradley PLC with ThingWorx learning path is Visualize an Allen-Bradley PLC.

Learn More

Capability     Resource
Analyze        Monitor an SMT Assembly Line

Additional Resources

For additional information on ThingWorx Kepware Server:

Resource          Link
Documentation     Kepware documentation
Support           Kepware Support site
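TIP: As an extension to the communication test in Step 4, you can flip Coil3 programmatically instead of through Composer's edit panel. The following is a minimal Groovy sketch, assuming a local Foundation server and a valid Application Key (both values below are placeholders); it writes the Coil3 Property of the PLCcoils Thing through the ThingWorx REST API.

// Placeholder values -- substitute your Foundation URL and a real Application Key.
def host   = "https://localhost:8443"
def appKey = "<your-application-key>"

// Properties are written with a PUT to /Thingworx/Things/<Thing>/Properties/<Property>.
def conn = new URL("${host}/Thingworx/Things/PLCcoils/Properties/Coil3")
        .openConnection() as HttpURLConnection
conn.requestMethod = "PUT"
conn.setRequestProperty("appKey", appKey)
conn.setRequestProperty("Content-Type", "application/json")
conn.doOutput = true
conn.outputStream.withWriter { it << '{"Coil3": true}' }

// A 200 response means Foundation pushed the new value through Kepware Server to the PLC;
// the Output 3 indicator should illuminate.
assert conn.responseCode == 200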
Email an attachment using bytes from a FileInfo.

Parameters:
fileId - the identifier of a FileInfo that has been previously uploaded to the FileStore
filename - the name of the attachment
toaddress - the email address to send to
fromaddress - the email address to send from

import com.axeda.drm.util.Emailer
import com.axeda.drm.sdk.contact.Email
import javax.mail.internet.AddressException
import javax.mail.internet.InternetAddress
import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.services.v2.FileInfoCriteria
import org.apache.commons.io.IOUtils
import java.security.MessageDigest

try {
    String fromaddress = parameters.fromaddress
    String toaddress = parameters.toaddress
    def fileId = parameters.fileId
    def filename = parameters.filename
    String subject = "Axeda Test Attachment"
    String body = "<html><head/><body><p style='background:blue;'>This email has an attachment and a blue background.</p></body></html>"

    // Pull the stored file's bytes out of the FileStore and write them to a local file
    // that can be handed to the mailer as an attachment.
    def thefile = new File(filename)
    def inputStream = fileInfoBridge.getFileData(fileId)
    byte[] bytes = IOUtils.toByteArray(inputStream)
    thefile.setBytes(bytes)

    // Build a MIME multipart boundary from an MD5 hash.
    def random_hash = md5('r')
    def contentType = "multipart/mixed; boundary=--\"$random_hash\"\r\n"
    def htmlType = "text/html"

    sendEmail(fromaddress, toaddress, subject, body, contentType, thefile, false, htmlType)
} catch (Exception e) {
    logger.error(e.localizedMessage)
}

return true

def md5(String s) {
    MessageDigest digest = MessageDigest.getInstance("MD5")
    digest.update(s.bytes)
    new BigInteger(1, digest.digest()).toString(16).padLeft(32, '0')
}

public void sendEmail(String fromAddress, String toAddress, String subject, String body, String encoding, File file, boolean compress, String mimeType) {
    try {
        Emailer.getInstance().send([new InternetAddress(toAddress)], new InternetAddress(fromAddress), subject, body, encoding, [file] as File[], compress, mimeType)
    } catch (Exception ae) {
        logger.error(ae.localizedMessage)
    }
}
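For reference, this script reads its inputs from the implicit parameters map when run as a Custom Object. A hypothetical invocation might supply values like the following (all placeholders):

// Hypothetical parameter values for the custom object above -- all placeholders.
def params = [
    fileId     : "12345",              // id of a FileInfo already uploaded to the FileStore
    filename   : "report.pdf",         // name the attachment will be given
    toaddress  : "ops@example.com",
    fromaddress: "noreply@example.com"
]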