
IoT & Connectivity Tips

First we need to understand the terms below.

Quantitative variable: A quantitative variable is naturally measured as a number for which meaningful arithmetic operations make sense. Examples: height, age, crop yield, GPA, salary, temperature, area, air pollution index (measured in parts per million), etc.

Categorical variable: Any variable that is not quantitative is categorical. Categorical variables take a value that is one of several possible categories. As naturally measured, categorical variables have no numerical meaning. Examples: hair color, gender, field of study, college attended, political affiliation, status of disease infection.

Ordinal variables: An ordinal variable is a categorical variable for which the possible values are ordered. Ordinal variables can be considered "in between" categorical and quantitative variables. Example: educational level might be categorized as
    1: Elementary school education
    2: High school graduate
    3: Some college
    4: College graduate
    5: Graduate degree

•	In this example (and for many ordinal variables), the quantitative differences between the categories are uneven, even though the differences between the labels are the same (e.g., the difference between 1 and 2 is four years, whereas the difference between 2 and 3 could be anything from part of a year to several years).
•	Thus it does not make sense to take a mean of the values.
•	Common mistake: treating ordinal variables like quantitative variables without thinking about whether this is appropriate in the particular situation at hand.

Ordinal regression: In statistics, ordinal regression (also called "ordinal classification") is a type of regression analysis used for predicting an ordinal variable. The Ordinal Regression procedure allows you to build models, generate predictions, and evaluate the importance of various predictor variables in cases where the dependent (target) variable is ordinal in nature.

Ordinal dependents and linear regression: When you are trying to predict ordinal responses, the usual linear regression models don't work very well. Those methods work only by assuming that the outcome (dependent) variable is measured on an interval scale. Because this is not true for ordinal outcome variables, the simplifying assumptions on which linear regression relies are not satisfied, and thus the regression model may not accurately reflect the relationships in the data. In particular, linear regression is sensitive to the way you define categories of the target variable. With an ordinal variable, the important thing is the ordering of categories. So, if you collapse two adjacent categories into one larger category, you are making only a small change, and models built using the old and new categorizations should be very similar. Unfortunately, because linear regression is sensitive to the categorization used, a model built before merging categories could be quite different from one built after.

Below are some examples of ordered logistic regression:

Example 1: A marketing research firm wants to investigate what factors influence the size of soda (small, medium, large or extra large) that people order at a fast-food chain. These factors may include what type of sandwich is ordered (burger or chicken), whether or not fries are also ordered, and the age of the consumer. While the outcome variable, size of soda, is obviously ordered, the difference between the various sizes is not consistent. The difference between small and medium is 10 ounces, between medium and large 8, and between large and extra large 12.

Example 2: A researcher is interested in what factors influence medaling in Olympic swimming. Relevant predictors include training hours, diet, age, and popularity of swimming in the athlete's home country. The researcher believes that the distance between gold and silver is larger than the distance between silver and bronze.

Example 3: A study looks at factors that influence the decision of whether to apply to graduate school. College juniors are asked if they are unlikely, somewhat likely, or very likely to apply to graduate school. Hence, our outcome variable has three categories. Data on parental educational status, whether the undergraduate institution is public or private, and current GPA is also collected. The researchers have reason to believe that the "distances" between these three points are not equal. For example, the "distance" between "unlikely" and "somewhat likely" may be shorter than the distance between "somewhat likely" and "very likely".

How to use Ordinal Regression and get results: click the linked PDF.

PDF source: http://www.norusis.com
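As a quick numeric illustration of why averaging ordinal codes is risky (all values below are invented for this sketch):

// Invented data: education-level codes 1-5 and the years of schooling
// each code happens to stand for in this hypothetical sample.
var educationCodes = [1, 2, 2, 3, 5];
var yearsOfSchooling = [8, 12, 12, 14, 20];

function mean(values) {
    var sum = 0;
    for (var i = 0; i < values.length; i++) { sum += values[i]; }
    return sum / values.length;
}

var codeMean = mean(educationCodes);    // 2.6 -- somewhere between "high school" and "some college"
var yearMean = mean(yearsOfSchooling);  // 13.2 years of schooling
// The code mean of 2.6 has no defensible interpretation, because the gap
// between adjacent codes corresponds to very different numbers of years.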
View full tip
Ran into this recently and thought I'd share an approach to getting a table with a multi-column distinct while retaining all the columns of the row. If you use Distinct, you get only the columns you do the Distinct on. This isn't very helpful if you want the 'latest' or the 'first occurrences' of records in your table with a combination of fields being unique. For example, I had Process, Part, Dimension and Point for which I had multiple value and date time entries, but I only wanted the latest entries. Following is how I solved it; if you have a better way please leave a comment! P.S.: for the query I used the awesome query builder available in the snippet section!

---------------------------------------

var q1Result = Things["MyThing"].QueryStreamEntriesWithData({maxItems: 99999, query: query1});

// Below creates a temporary measurement table to store the latest measurement values
var params = {
    infoTableName: "InfoTable",
    dataShapeName: "MyDatashape.DS"
};
// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(MyDataShape.DS)
var tempTable1 = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// Extract only the latest measurements for the PART from the measurement result table 'q1Result'
// The way we are going to reduce this to unique measurements is:
// 1. records are in reverse order of date time
// 2. get distinct by Process, Part, Dim, Point
// 3. step through and match against the distinct set
// 4. first match goes into the final set
// 5. upon match, remove from the distinct set
// 6. if no match, then skip the record
// 7. if no more distinct match records, break the loop
var params = {
    t: q1Result /* INFOTABLE */,
    columns: 'ProcessID,PartID,Dimension,Point' /* STRING */
};
// result: INFOTABLE
var distinctResult = Resources["InfoTableFunctions"].Distinct(params);

for (var x = 0; x < q1Result.rows.length; x++) {
    var query = {
        "filters": {
            "type": "AND",
            "filters": [
                { "fieldName": "ProcessID", "type": "EQ", "value": q1Result.rows[x].ProcessID },
                { "fieldName": "PartID",    "type": "EQ", "value": q1Result.rows[x].PartID },
                { "fieldName": "Dimension", "type": "EQ", "value": q1Result.rows[x].Dimension },
                { "fieldName": "Point",     "type": "EQ", "value": q1Result.rows[x].Point }
            ]
        }
    };

    var params = {
        t: distinctResult /* INFOTABLE */,
        query: query /* QUERY */
    };
    // result: INFOTABLE
    var matchResult = Resources["InfoTableFunctions"].Query(params);

    if (matchResult.rows.length == 1) {
        tempTable1.AddRow(q1Result.rows[x]);

        var params = {
            t: distinctResult /* INFOTABLE */,
            query: query /* QUERY */
        };
        // result: INFOTABLE
        var distinctResult = Resources["InfoTableFunctions"].DeleteQuery(params);
        if (distinctResult.rows.length == 0) {
            break;
        }
    }
}

// I now have a tempTable1 with the full rows and the 4 fields distinct
result = tempTable1;
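In the spirit of the "better way" invitation above, here is an untested single-pass sketch that builds a composite key per row and skips the Distinct/Query/DeleteQuery round trips entirely. It assumes the same q1Result and tempTable1 as above, with rows sorted newest-first:

// Keep only the first (newest) row per ProcessID/PartID/Dimension/Point combination
var seen = {};
for (var x = 0; x < q1Result.rows.length; x++) {
    var row = q1Result.rows[x];
    var key = row.ProcessID + "|" + row.PartID + "|" + row.Dimension + "|" + row.Point;
    if (!seen[key]) {
        seen[key] = true;
        tempTable1.AddRow(row);
    }
}
result = tempTable1;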
View full tip
This is a useful trick for rolling up metrics in ThingWorx across various levels of a hierarchy by using Networks, ThingShapes, and recursive service definitions. Say that you have a hierarchy of Things in your model, such as a Global view, a Region view, and a Store view -- this could be a Mfg Plant, a Building, an Asset, etc.; whatever the core metric-producing Thing is in your model -- where your Store has KPIs that you want to roll up across regions and globally.

First, create a template for each of your hierarchical levels. In my case it is a GlobalTemplate, RegionalTemplate, and StoreTemplate. Add a property to your StoreTemplate that will be the KPI. Now, create a Thing for the Globe, and each of your Regions and Stores, and add them to a hierarchical Network.

Now, we need to create a ThingShape to aggregate our KPIs and apply it to the Global, Regional, and Store templates. We will define a recursive function on our ThingShape called GetKPI and define it with the following:

// define our base case: the thing template we are on is the lowest level of our hierarchy, in this case the StoreTemplate
if (me.thingTemplate == "StoreTemplate") {
    // in our base case, the result is just the property for the metric we want to aggregate
    result = me.someMetric;
} else {
    // otherwise, we are at some other level in the hierarchy and we need to get our child connections from the network
    // this gets all the things below us in the network
    var params = {
        name: me.name /* STRING */
    };
    // result: INFOTABLE dataShape: NetworkConnection
    var network = Networks["Network"].GetChildConnections(params);

    // loop through each of the things below us in the hierarchy and recursively add the result of GetKPI() to our result
    result = 0;
    for each (var row in network.rows) {
        result += Things[row.to].GetKPI();
    }
}

This is a simple case of just summing up a single property, but we can take this further using the Union and Aggregate snippets provided by ThingWorx to do other kinds of summarization. First add a new property called someAvgMetric to our StoreTemplate, and define a new service GetKPIProperties, with an InfoTable result, on the StoreTemplate:

var params = {
    propertyNames: { "items": ["someMetric", "someAvgMetric"] } /* JSON */
};
// result: INFOTABLE dataShape: "undefined"
var result = me.GetNamedProperties(params);

Now, define a new service on our ThingShape to utilize this service as our base case, and aggregate the resulting InfoTable when necessary. We'll call this service GetKPIAggregates:

// define our base case
if (me.thingTemplate == "StoreTemplate") {
    // this function will be on the StoreTemplate, and returns the base infotable
    result = me.GetKPIProperties();
} else {
    // grab our network
    var params = {
        name: me.name /* STRING */
    };
    // result: INFOTABLE dataShape: NetworkConnection
    var network = Networks["Network"].GetChildConnections(params);

    // need to create an empty infotable to union into;
    // I glossed over this, but you'll need a datashape here
    var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({ infoTableName: "InfoTable", dataShapeName: "KPIDataShape" });

    // loop through and union each of our results into our new infotable
    for each (var row in network.rows) {
        var params = {
            t1: result /* INFOTABLE */,
            t2: Things[row.to].GetKPIAggregates() /* INFOTABLE */
        };
        var result = Resources["InfoTableFunctions"].Union(params);
    }

    // aggregate each of our fields
    var params = {
        t: result /* INFOTABLE */,
        columns: "someMetric,someAvgMetric" /* STRING */,
        aggregates: "SUM,AVERAGE" /* STRING */,
        groupByColumns: undefined /* STRING */
    };
    // result: INFOTABLE
    var result = Resources["InfoTableFunctions"].Aggregate(params);

    // need to loop through each of our field names (e.g. SUM_someMetric)
    // and rename them to match our base infotable
    var dataShapeFields = result.dataShape.fields;
    for (var fieldName in dataShapeFields) {
        var stringName = dataShapeFields[fieldName].name;
        var params = {
            t: result /* INFOTABLE */,
            from: stringName /* STRING */,
            to: stringName.split("_")[1] /* STRING */
        };
        // result: INFOTABLE
        var result = Resources["InfoTableFunctions"].RenameField(params);
    }
}

Now, in our mashups, we can use a DynamicThingShape and call our GetKPI service at any level in our network, and our data will be aggregated correctly for whatever level we are at in the hierarchy!
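A usage sketch, assuming instance names like "Globe" and "Region1" for the Things in the network (the post names only the templates, so these Thing names are hypothetical):

// Roll up the KPI from any level of the hierarchy
var globalKPI = Things["Globe"].GetKPI();               // sum across every store
var regionKPI = Things["Region1"].GetKPI();             // sum across one region's stores
var regionStats = Things["Region1"].GetKPIAggregates(); // one-row InfoTable: someMetric (SUM), someAvgMetric (AVERAGE)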
View full tip
In this blog I will be testing the SAPODataConnector using the SAP Gateway - Demo Consumption System.

Overview

The SAPODataConnector enables connection to the SAP Netweaver Gateway through the OData specification. It is a specialized implementation of the ODataConnector. See Integration Connectors for documentation.

It relies on three components:
- Integration Runtime: a microservice that runs outside of ThingWorx and has to be deployed separately; it uses WebSocket to communicate with the ThingWorx platform (similar to EMS).
- Integration Subsystem: available by default since 7.4 (no extension needed).
- Integration Connector: SAPODataConnector, available by default in 8.0 (no extension needed).

ThingWorx can use OAuth to access SAP, but in this blog I will just use basic authentication.

SAP Netweaver Gateway Demo system registration

1. Create an account on the Gateway Demo system (credentials to be used on the connector are sent by email).
2. Verify that the account has access to the basic OData sample service: https://sapes4.sapdevcenter.com/sap/opu/odata/IWBEP/GWSAMPLE_BASIC/

Integration Runtime microservice setup

1. Follow WindchillSwaggerConnector hands-on (7.4) - Integration Runtime microservice setup.
Note: only one Integration Runtime instance is required for all your Integration Connectors (multiple instances are supported for high availability and scale).

SAPODataConnector setup

Use the New Composer UI (some settings, such as API maps, are not available in the ThingWorx legacy composer).

1. Create a DataShape that is used to map the attributes being retrieved from SAP. SAPObjectDS: Id (STRING), Name (STRING), Price (NUMBER).
2. Create a Thing named TestSAPConnector that uses SAPODataConnector as its thing template.
3. Set up the SAP Netweaver Gateway connection under TestSAPConnector > Configuration:
- Generic Connector Connection Settings: Authentication Type = fixed
- HTTP Connector Connection Settings: Username = <SAP Gateway user>, Password = <SAP Gateway pwd>
- Base URL: https://sapes4.sapdevcenter.com/sap
- Relative URL: /opu/odata/IWBEP/GWSAMPLE_BASIC/
- Connection URL: /opu/odata/IWBEP/GWSAMPLE_BASIC/$metadata
4. Create the API maps and service under TestSAPConnector > API Maps (New Composer only):
- Mapping ID: sap
- EndPoint: getProductSet
- Select DataShape: SAPObjectDS (created at step 1) and map the following attributes: Name <- Name, Id <- ProductID, Price <- Price
- Pick "Create a Service from this mapping"

Testing our Connector

Test the TestSAPConnector::getProductSet service (keep all the input parameters blank).
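Once the mapping has generated the service, it can also be invoked from another service's script. A minimal sketch, assuming the generated service takes no required inputs (as in the blank-parameter test above):

// Call the generated OData-backed service and log the mapped fields
var products = Things["TestSAPConnector"].getProductSet();

for (var i = 0; i < products.rows.length; i++) {
    var row = products.rows[i];
    logger.info(row.Id + " | " + row.Name + " | " + row.Price);
}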
View full tip
Persistent properties are stored in the ThingWorx database, while non-persistent properties are stored in memory. This means that persistent values do not get erased or deleted if the Thing restarts or the platform restarts. Persistent properties can also be retrieved in the same way as non-persistent properties. To explain better how we can retrieve persistent data, please consider the below example: I took a device group which has multiple devices and defined 2 persistent properties. One is the serial number and the other is the firmware version number. Now in a mashup, the user gives the serial number and the corresponding firmware version is retrieved. I achieved this by writing services for that Thing.

1. Created a DataShape with the name DeviceData and defined two field definitions: one is the serial number and the other is the firmware version.
2. Created a ThingTemplate with the name DevGroup.
3. Added a property to the ThingTemplate with the name DeviceData, selected the base type InfoTable, and selected the previously created DataShape (DeviceData) in the data shape field. Made this property persistent by selecting Is Persistent, then saved the property.
4. Created two devices with the names Device1 and Device2. Both devices use the above template.
5. Now for each device, set the property values by clicking the Set button in the Properties link. These values will be persistent, meaning they do not change even after a refresh or when you restart ThingWorx. You can even set these properties at run time by just creating a service.
6. Created a GetFirmversion service which retrieves the firmware version for a given serial number.
7. Created a mashup: once you select Device1, enter the serial number in the numeric entry field and click the query button, and the firmware version for the given serial number is displayed along with the data of Device1. If we enter a wrong number, no data is displayed; you can also set an error message instead of displaying empty values. It is a similar case when we select the Device2 data.

This is one way to retrieve persistent data; you can also obtain the same in many other ways.
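For reference, a minimal sketch of what the GetFirmversion service described above might look like. The field names SerialNumber and FirmwareVersion are assumptions (the post does not spell them out), as is the serialNumber (STRING) input:

// Scan the persistent InfoTable property for a matching serial number
var result = "";
var table = me.DeviceData;

for (var i = 0; i < table.rows.length; i++) {
    if (table.rows[i].SerialNumber == serialNumber) {
        result = table.rows[i].FirmwareVersion;
        break;
    }
}
// result stays empty for an unknown serial number; the mashup can show
// an error message for that case, as noted above.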
View full tip
Welcome to the ThingWorx Manufacturing Apps Community!

The ThingWorx Manufacturing Apps are easy-to-deploy, pre-configured, role-based starter apps that are built on PTC's industry-leading IoT platform, ThingWorx. These Apps provide manufacturers with real-time visibility into operational information, improved decision making, accelerated time to value, and unmatched flexibility to drive factory performance.

This Community page is open to all users -- including licensed ThingWorx users, Express ("freemium") users, or anyone interested in trying the Apps. Tech Support community advocates serve users on this site, and are here to answer your questions about downloading, installing, and configuring the ThingWorx Manufacturing Apps.

A. Sign up:
ThingWorx Manufacturing Apps Community: PTC account credentials are needed to participate in the ThingWorx Community. If you have not yet registered a PTC eSupport account, start with the Basic Account Creation page.
Manufacturing Apps Web portal: Register a login for the ThingWorx Manufacturing Apps web portal, where you can download the free trial and navigate to the additional resources discussed below.

B. Download: Choose a download/packaging option to get started.
i. Express/Freemium Installer (best for users who are new to ThingWorx): If you want to quickly install the ThingWorx Manufacturing Apps (including ThingWorx), use the following installer: Download the Express/Freemium Installer
ii. 30-day Developer Kit trial: To experience the capabilities of the ThingWorx Platform with the Manufacturing Apps and create your own Apps: Download the 30-day Developer Kit trial
iii. Import as a ThingWorx Extension (for users with a Manufacturing Apps entitlement -- including ThingWorx commercial customers, PTC employees, and PTC Partners): The ThingWorx Manufacturing Apps can be imported as ThingWorx extensions into an existing ThingWorx Platform install (v8.1.0). To locate the download, open the PTC Software Download Page and expand the following folders: ThingWorx Platform | Release 8.x | ThingWorx Manufacturing Apps Extension | Most Recent Datacode

C. Learn: After downloading the installer or extensions, begin with Installation and Configuration.
Follow the steps laid out in the ThingWorx Manufacturing Apps Setup and Configuration Guide 8.2.
Find helpful getting-started guides and videos available within the 'Get Started' section of the ThingWorx Manufacturing Apps Portal.

D. Customize: Once you have successfully downloaded, installed, and configured the Manufacturing Apps, begin to explore the deeper potential of the Apps and the ThingWorx Platform.
Follow along with the discussion and steps contained in the ThingWorx Manufacturing Apps and Service Apps Customization Guide 8.2.
Also contained within the 'Get Started' page of the ThingWorx Manufacturing Apps Portal, find the "Evolve and Expand" section, featuring:
- Custom Plant Layout application
- Custom Asset Advisor application
- Global Plant View application
- ThingWorx Manufacturing Apps Technical Lab with Sigma Tile (Raspberry Pi application)
- Configuring the Apps with demo data set and simulator
- Additional Advanced Documentation

E. Get help / give feedback / interact: Use the ThingWorx Manufacturing Apps Community page as a resource to find documentation, peruse past forum threads, or post a question to start a discussion! For advanced troubleshooting, licensed users are encouraged to submit support tickets to the PTC My eSupport portal.
View full tip
/* Define a DataShape used in an InfoTable parameter for this service call */
twDataShape* sampleInfoTableAsParameterDs = twDataShape_Create(twDataShapeEntry_Create("ColumnA", NO_DESCRIPTION, TW_STRING));
twDataShape_AddEntry(sampleInfoTableAsParameterDs, twDataShapeEntry_Create("ColumnB", NO_DESCRIPTION, TW_NUMBER));
twDataShape_AddEntry(sampleInfoTableAsParameterDs, twDataShapeEntry_Create("ColumnC", NO_DESCRIPTION, TW_BOOLEAN));
twDataShape_SetName(sampleInfoTableAsParameterDs, "SampleInfoTableAsParameterDataShape");

/* Define an input parameter that is an InfoTable of shape SampleInfoTableAsParameterDataShape */
twDataShapeEntry* infoTableDsEntry = twDataShapeEntry_Create("itParam", NULL, TW_INFOTABLE);
twDataShapeEntry_AddAspect(infoTableDsEntry, "dataShape", twPrimitive_CreateFromString("SampleInfoTableAsParameterDataShape", TRUE));
twDataShape* inputParametersDefinitionDs = twDataShape_Create(infoTableDsEntry);

/* Register the remote function */
twApi_RegisterService(TW_THING, SERVICE_INTEGRATION_THINGNAME, "testMultiRowInfotable", NO_DESCRIPTION,
    inputParametersDefinitionDs, TW_NOTHING, NULL, PlatformCallsServiceWithMultiRowInfoTableServiceImpl, NULL);

/* Note that you will have to manually create the datashape in ThingWorx before attempting to add this remote service to your Thing. */
View full tip
The AddStreamEntries snippet does not offer too much information, except that it needs an InfoTable as input. It is however based on the InfoTable for the AddStreamEntry service.

To use AddStreamEntries, an InfoTable based on sourceType, values, location, source, timestamp and tags must be used.

In this example, I started with a new Thing based on a Stream template and a DataShape with three STRING fields, a, b and c (field a is the primary key, as seen in the code below).

This DataShape must be converted into an InfoTable which is used as the values parameter. It's important that the timestamp parameter has distinct values! Otherwise values matching the same timestamp will be overwritten!

We don't really need the sourceType, as ThingWorx will automatically determine the type by knowing the source and which kind of Entity Type it is.

I created a new MyStreamThing with a new service, filling the InfoTable and the Stream. The result is the following code, which will add 5 rows to the Stream:

// *** SET UP META DATA FOR INFO TABLE ***

// create a new InfoTable based on AddStreamEntries parameters (timestamp, location, source, sourceType, tags, values)
var myInfoTable = { dataShape: { fieldDefinitions: {} }, rows: [] };

myInfoTable.dataShape.fieldDefinitions['timestamp']  = { name: 'timestamp', baseType: 'DATETIME' };
myInfoTable.dataShape.fieldDefinitions['location']   = { name: 'location', baseType: 'LOCATION' };
myInfoTable.dataShape.fieldDefinitions['source']     = { name: 'source', baseType: 'STRING' };
myInfoTable.dataShape.fieldDefinitions['sourceType'] = { name: 'sourceType', baseType: 'STRING' };
myInfoTable.dataShape.fieldDefinitions['tags']       = { name: 'tags', baseType: 'TAGS' };
myInfoTable.dataShape.fieldDefinitions['values']     = { name: 'values', baseType: 'INFOTABLE' };

// *** SET UP ACTUAL VALUES FOR INFO TABLE ***

// create new meta data
var tags = new Array();
var timestamp = new Date();
var location = new Object();
location.latitude = 0;
location.longitude = 0;
location.elevation = 0;
location.units = "WGS84";

// add rows to the InfoTable (5 times)
for (var i = 0; i < 5; i++) {

    // create new values based on the Stream DataShape
    var params = {
        infoTableName: "InfoTable",
        dataShapeName: "Cxx-DS"
    };
    var values = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

    // add something to the values to make them unique
    // create and add a new row based on the Stream DataShape
    // only a single row allowed!
    var newValues = new Object();
    newValues.a = "aaa" + i; // STRING - isPrimaryKey = true
    newValues.b = "bbb" + i; // STRING
    newValues.c = "ccc" + i; // STRING
    values.AddRow(newValues);

    // create a new InfoTable row based on meta data & values
    // add 10 ms to each entry to make its timestamp unique,
    // otherwise entries with the same timestamp will be overwritten
    var newEntry = new Object();
    newEntry.timestamp = new Date(Date.now() + (i * 10));
    newEntry.location = location;
    newEntry.source = me.name;
    newEntry.tags = tags;
    newEntry.values = values;

    // add the new row to the InfoTable
    myInfoTable.rows[i] = newEntry;
}

// *** ADD myInfoTable (HOLDING MULTIPLE STREAM ENTRIES) TO THE STREAM ***

// add stream entries in the InfoTable
var params = {
    values: myInfoTable /* INFOTABLE */
};

// no return
Things["MyStreamThing"].AddStreamEntries(params);

To verify the values have been added correctly, call the GetStreamEntriesWithData service on MyStreamThing.
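A minimal verification sketch (the parameter names follow the standard stream query services; adjust maxItems as needed):

// Read back the newest entries, including the values InfoTable
var params = {
    oldestFirst: false /* BOOLEAN */,
    maxItems: 10 /* NUMBER */
};
var entries = Things["MyStreamThing"].GetStreamEntriesWithData(params);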
View full tip
This video is the 2nd part of a series of 3 videos walking you through how to set up ThingWatcher for Anomaly Detection. In this video you will learn how to use "Discover UI" from the "New Composer" to bind simulated data coming through KEPServer for Anomaly Detection.

Updated Link for access to this video: Anomaly Detection 8.0: Configuring Anomaly Alerts: Part 2 of 3
View full tip
One recurring question that comes up after a customer has been using the scripting capabilities of the Axeda Platform for some time is how to create function libraries that are reusable, in order to reduce the amount of copying and pasting (and testing) that is done to create new functionality. Below I demonstrate a mechanism for how to accomplish this. Some things that are typically included in such a library are:

- Customized ExtendedObject access methods (CRUD) - ExtendedObjects must be created before they can be used, so this can be encapsulated per customer requirements
- DataItem manipulation
- Gathering lists of Assets based on criteria
- Accessing the ExternalCredentialsBridge

For those unfamiliar with Custom Objects I suggest some resources at the end of this document to get started. The first thing we want to do is create a script that is going to be our "Library of Functions":

FunctionLibrary.groovy:

class GroovyChild {
    String hello() {
        return "Hello"
    }
    String world() {
        return "World"
    }
}

return new GroovyChild()

This can then subsequently be called like this:

FunctionCaller.groovy:

import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.services.v2.CustomObjectCriteria
import com.axeda.services.v2.CustomObjectType
import com.axeda.services.v2.CustomObject

CustomObjectCriteria cOC1 = new CustomObjectCriteria()
cOC1.setName 'FunctionLibrary'
def co = customObjectBridge.findOne cOC1
result = customObjectBridge.execute(co.label)

return ['Content-Type': 'application/text', 'Content': result.hello() + ' ' + result.world()]

A developer would be wise to add null checking on some of the function returns and do some error reporting if it cannot find and execute the FunctionLibrary. This is only meant as a means to start users on the path of building their own reusable content libraries.

Regards,

Chris Kaminski
PTC/Axeda Customer Support

References:
Axeda® Platform Web Services Developer's Reference, v2 REST 6.8.3, August 2015
Axeda® v2 API/Services Developer's Reference, Version 6.8.3, August 2015
Axeda® v1 API Developer's Reference Guide, Version 6.8, August 2014
Documentation Map for Axeda® 6.8.2, January 2015
View full tip
In the process of working with a customer, I was curious as to the throughput of a file sent via the Axeda Connected Content feature to one of the Axeda Agent Gateways.

I took a random 50 megabyte blob of data (/dev/urandom) and sent it to one of my test Gateways via a Package deployment:

DEBUG   xgEnterpriseProxy: Enterprise Queue Empty
INFO    xgSM:  ... Download percent done = 11%
INFO    xgSM:  ... Download percent done = 21%
INFO    xgSM:  ... Download percent done = 31%
INFO    xgSM:  ... Download percent done = 41%
INFO    xgSM:  ... Download percent done = 51%
INFO    xgSM:  ... Download percent done = 61%
INFO    xgSM:  ... Download percent done = 71%
INFO    xgSM:  ... Download percent done = 81%
INFO    xgSM:  ... Download percent done = 91%
INFO    xgSM:  ... Download percent done = 100%
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Download time is 4 seconds
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Upgrading.  Backing up files to C:\temp\CFKGW\AxedaBackup
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Extracting downloaded files from DefaultProject\CFKGW\Downloads\141581_143281.tar.gz to directory C:\temp\CFKGW\
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Extraction Finished

About 12MB per second. This was a sandbox in the PTC On-Demand Center. Not bad, but not necessarily representative of a real production system: this sandbox doesn't have 1000 devices trying to get this file at once, so some benchmarking in your configuration and environment certainly needs to be done.

That done, I thought I'd up the ante - 700 megabytes this time!

DEBUG   xgEnterpriseProxy: Enterprise Queue Empty
INFO    xgSM:  ... Download percent done = 10%
INFO    xgSM:  ... Download percent done = 20%
INFO    xgSM:  ... Download percent done = 30%
INFO    xgSM:  ... Download percent done = 40%
INFO    xgSM:  ... Download percent done = 50%
INFO    xgSM:  ... Download percent done = 60%
INFO    xgSM:  ... Download percent done = 70%
INFO    xgSM:  ... Download percent done = 80%
INFO    xgSM:  ... Download percent done = 90%
INFO    xgSM:  ... Download percent done = 100%
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Download time is 66 seconds

So about 10MB per second.

 Directory of C:\temp\cfkgw

05/17/2017  01:32 PM    <DIR>          .
05/17/2017  01:32 PM    <DIR>          ..
05/17/2017  01:03 PM         1,048,576 1mb.dat
05/17/2017  01:04 PM        52,428,800 50meg-randomdata.dat
05/17/2017  01:32 PM       734,003,200 700mb.dat
05/17/2017  01:03 PM    <DIR>          AxedaBackup
               3 File(s)    787,480,576 bytes

Not bad at all!
View full tip
If you ever tested mashup rendering on mobile phones, you probably experienced that the mashup was not sizing to fit your mobile display. This "MobileHeader" extension enables the mashup to auto-adapt to mobile displays.

It adds the following parameters to the HTML header:

<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">

In the composer, just drop the "MobileHeader" extension into a section of the mashup.

This extension was tested up to version 7.4.
View full tip
It often happens that we need to copy a large file to the ThingWorx server periodically, and what's worse, the big file keeps changing (like a log file). This sample gives a simpler way to implement this. The main ideas in the sample are:

1. Lower the management burden on the ThingWorx server by putting all the work on the edge SDK side
2. Save network bandwidth by uploading only the increment of the file and appending it to the older file on the ThingWorx server

Java SDK version in this sample: 6.0.1-255
View full tip
This is part 3 of 3 videos on Getting Started with ThingWorx Analytics. During this video you will learn:

- Executing a "Signals" Job
- Getting the results of the "Signals" Job
- Executing a "Training Model" Job
- Getting the results of the "Training Model" Job

Updated Link for access to this video: Getting Started with ThingWorx Analytics: Part 3 of 3
View full tip
This simple example creates an infotable of hierarchical data from an existing datashape that can be used in a tree or D3 Tree widget. Attachments are provided that give a PDF document as well as the ThingWorx entities for the example. If you were just interested in the service code, it is listed below:

var params = {
    infoTableName: "InfoTable",
    dataShapeName: "TreeDataShape"
};
// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(TreeDataShape)
var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

var NewRow = {};
NewRow.Parent = "No Parent"; NewRow.Child = "Enterprise"; NewRow.ChildDescription = "Enterprise"; result.AddRow(NewRow);
NewRow.Parent = "Enterprise"; NewRow.Child = "Site1"; NewRow.ChildDescription = "Wilmington Plant"; result.AddRow(NewRow);
NewRow.Parent = "Site1"; NewRow.Child = "Site1.Line1"; NewRow.ChildDescription = "Blow Molding"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line1"; NewRow.Child = "Site1.Line1.Asset1"; NewRow.ChildDescription = "Preform Staging"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line1"; NewRow.Child = "Site1.Line1.Asset2"; NewRow.ChildDescription = "Blow Molder"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line1"; NewRow.Child = "Site1.Line1.Asset3"; NewRow.ChildDescription = "Bottle Unscrambler"; result.AddRow(NewRow);
NewRow.Parent = "Site1"; NewRow.Child = "Site1.Line2"; NewRow.ChildDescription = "Filling"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line2"; NewRow.Child = "Site1.Line2.Asset1"; NewRow.ChildDescription = "Rinser"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line2"; NewRow.Child = "Site1.Line2.Asset2"; NewRow.ChildDescription = "Filler"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line2"; NewRow.Child = "Site1.Line2.Asset3"; NewRow.ChildDescription = "Capper"; result.AddRow(NewRow);
NewRow.Parent = "Site1"; NewRow.Child = "Site1.Line3"; NewRow.ChildDescription = "Packaging"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line3"; NewRow.Child = "Site1.Line3.Asset1"; NewRow.ChildDescription = "Printer Labeler"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line3"; NewRow.Child = "Site1.Line3.Asset2"; NewRow.ChildDescription = "Packer"; result.AddRow(NewRow);
NewRow.Parent = "Site1.Line3"; NewRow.Child = "Site1.Line3.Asset3"; NewRow.ChildDescription = "Palletizing"; result.AddRow(NewRow);
NewRow.Parent = "Enterprise"; NewRow.Child = "Site2"; NewRow.ChildDescription = "Mobile Plant"; result.AddRow(NewRow);
NewRow.Parent = "Site2"; NewRow.Child = "Site2.Line1"; NewRow.ChildDescription = "Blow Molding"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line1"; NewRow.Child = "Site2.Line1.Asset1"; NewRow.ChildDescription = "Preform Staging"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line1"; NewRow.Child = "Site2.Line1.Asset2"; NewRow.ChildDescription = "Blow Molder"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line1"; NewRow.Child = "Site2.Line1.Asset3"; NewRow.ChildDescription = "Bottle Unscrambler"; result.AddRow(NewRow);
NewRow.Parent = "Site2"; NewRow.Child = "Site2.Line2"; NewRow.ChildDescription = "Filling"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line2"; NewRow.Child = "Site2.Line2.Asset1"; NewRow.ChildDescription = "Rinser"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line2"; NewRow.Child = "Site2.Line2.Asset2"; NewRow.ChildDescription = "Filler"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line2"; NewRow.Child = "Site2.Line2.Asset3"; NewRow.ChildDescription = "Capper"; result.AddRow(NewRow);
NewRow.Parent = "Site2"; NewRow.Child = "Site2.Line3"; NewRow.ChildDescription = "Packaging"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line3"; NewRow.Child = "Site2.Line3.Asset1"; NewRow.ChildDescription = "Printer Labeler"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line3"; NewRow.Child = "Site2.Line3.Asset2"; NewRow.ChildDescription = "Packer"; result.AddRow(NewRow);
NewRow.Parent = "Site2.Line3"; NewRow.Child = "Site2.Line3.Asset3"; NewRow.ChildDescription = "Palletizing"; result.AddRow(NewRow);
View full tip
Disclaimer: example was provided by Hatcher Chad - chad@onfarmsystems.com

//
// For this example, we'll have a Math service
// which takes two numbers, and an operation.
// The result will be that operation performed on the two inputs.

//
// We either need an Application Key,
// or user credentials to perform the reads and writes.
// App keys are a little safer.
// In this demo, we'll store it on the Entity as a Property.
var appKey = me.appKey;

//
// The service name needs to be unique and not already in use.
var serviceName = "MyMath";

//
// What are the inputs to the service?
// We'll define them nicely here, but manipulate this object later.
var parameters = {
    "op": "STRING",
    "x": "NUMBER",
    "y": "NUMBER"
};

//
// What datatype does the service return?
// If it's an infotable,
// then you'll also have to specify the data shape
// as part of the resultType's aspect,
// but I won't demonstrate that here.
var output = "NUMBER";

//
// What is the actual service script?
// We'll define it here as an array of lines, and then join them together.
var serviceScript = [
    "var result = (function() {",
    "  switch(op) {",
    "    case \"add\": return x + y;",
    "    case \"sub\": return x - y;",
    "    case \"mult\": return x * y;",
    "    case \"div\": return x / y;",
    "    default: return op in Math ? Math[op](x, y) : 0;",
    "  };",
    "})();",
].join("\n");

////////

//
// Let's convert the friendly parameter definition
// into the structure that ThingWorx uses:
var parameterDefinitions = Object.keys(parameters).reduce(function(parameterDefinitions, parameterName, index) {
    var parameterType = parameters[parameterName];
    parameterDefinitions[parameterName] = {
        "name": parameterName,
        "aspects": {},
        "description": "",
        "baseType": parameterType,
        "ordinal": index
    };
    return parameterDefinitions;
}, {});

//
// Now let's set up our service definition and implementation.
var definition = {
    "isAllowOverride": false,
    "isOpen": false,
    "sourceType": "Unknown",
    "parameterDefinitions": parameterDefinitions,
    "name": serviceName,
    "aspects": {
        "isAsync": false
    },
    "isLocalOnly": false,
    "description": "",
    "isPrivate": false,
    "sourceName": "",
    "category": "",
    "resultType": {
        "name": "result",
        "aspects": {},
        "description": "",
        "baseType": output,
        "ordinal": 0
    }
};

var implementation = {
    "name": serviceName,
    "description": "",
    "handlerName": "Script",
    "configurationTables": {
        "Script": {
            "isMultiRow": false,
            "name": "Script",
            "description": "Script",
            "rows": [{
                "code": serviceScript
            }],
            "ordinal": 0,
            "dataShape": {
                "fieldDefinitions": {
                    "code": {
                        "name": "code",
                        "aspects": {},
                        "description": "code",
                        "baseType": "STRING",
                        "ordinal": 0
                    }
                }
            }
        }
    }
};

////////

//
// Here are the URLs we'll need in order to make updates.
// You can change the thing name ('ServiceModifier' here)
// to something else.
// If you use credentials instead of an app key,
// then you can remove the appKey parameter here,
// but you'll have to add the username and password
// to the two ContentLoaderFunctions calls.
var url = {
    export: "http://127.0.0.1:8080/Thingworx/Things/ServiceModifier?Accept=application/json&appKey=" + appKey,
    import: "http://127.0.0.1:8080/Thingworx/Things/ServiceModifier?appKey=" + appKey
};

//
// We can download the entity to modify as a JSON object.
// Older versions of ThingWorx might not support this.
var config = Resources.ContentLoaderFunctions.GetJSON({
    url: url.export,
});

//
// We have to modify both the 'effectiveShape',
// as well as the 'thingShape'.
config.effectiveShape.serviceDefinitions[serviceName] = definition;
config.effectiveShape.serviceImplementations[serviceName] = implementation;

config.thingShape.serviceDefinitions[serviceName] = definition;
config.thingShape.serviceImplementations[serviceName] = implementation;

// Finally, we can push our updates back into ThingWorx.
Resources.ContentLoaderFunctions.PutText({
    url: url.export,
    content: JSON.stringify(config),
    contentType: "application/json",
});

// The end.
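A usage sketch: after the script above has run (and the entity has been reloaded), the generated service behaves like any hand-written one.

// Call the freshly created service on the same Thing
var sum = Things["ServiceModifier"].MyMath({ op: "add", x: 2, y: 3 });  // 5
var pow = Things["ServiceModifier"].MyMath({ op: "pow", x: 2, y: 10 }); // falls through to Math.pow -> 1024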
View full tip
This JavaScript snippet creates a random integer between those limits, inclusive (despite the dbl_ prefix, Math.floor makes the result a whole number):

var dbl_Value = Math.floor(Math.random()*(max-min+1)+min);
View full tip
This example shows how to update objects in Windchill through extensions. It is really hard to find a resource on the Windchill extension's services to take advantage of them, so I wrote a simple example to update objects in Windchill from ThingWorx.

There are three data shapes needed to do this. One is "PTC.PLM.WindchillPartUfids", which has only a "value" field (STRING); another is "PTC.PLM.WindchillPartCheckedOutDS", which has a "ufid" field (STRING). The last one is "PTC.PLM.WindchillPartPropertyDS", which has a "ufid" field (STRING) plus fields for the attributes. For an instance of the last data shape, there might be three fields such as "ufid", "partPrice" and "quantity" to update parts. In this example, this data shape has two fields, which are "ufid" and "almProjectId".

This service needs two input parameters: ufid (STRING) and almProjectId (STRING). If you need to update multiple objects at once, you can use an INFOTABLE type as the "ufid" input parameter instead of STRING.

Note that this is example code and needs exception handling added as required.

var params = {
    infoTableName: "InfoTable",
    dataShapeName: "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var ufids = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// PTC.PLM.WindchillPartUfids entry object
var newValue = new Object();
newValue.value = ufid; // STRING

ufids.AddRow(newValue);

// Check out
var params = {
    ufids: ufids /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// checkedOutObjs: INFOTABLE dataShape: "undefined"
var checkedOutObjsFromService = me.CheckOut(params);

var params = {
    infoTableName: "InfoTable",
    dataShapeName: "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var checkedOutObjs = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

try {
    var tableLength = checkedOutObjsFromService.rows.length;

    for (var x = 0; x < tableLength; x++) {
        var row = checkedOutObjsFromService.rows[x];

        // PTC.PLM.WindchillPartUfids entry object
        var checkedOutObj = new Object();
        checkedOutObj.value = row.ufid.substring(0, row.ufid.lastIndexOf(":")); // STRING

        //logger.warn("UFID : " + checkedOutObj.value);
        checkedOutObjs.AddRow(checkedOutObj);

        /* Update objects in Windchill */
        var params = {
            infoTableName: "InfoTable",
            dataShapeName: "PTC.PLM.WindchillPartPropertyDS"
        };

        // CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartPropertyDS)
        var wcInfoTable = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

        // PTC.PLM.WindchillPartPropertyDS entry object
        var newEntry = new Object();
        newEntry.ufid = checkedOutObj.value; // STRING
        newEntry.almProjectId = almProjectId; // STRING

        wcInfoTable.AddRow(newEntry);

        var params = {
            objects: wcInfoTable /* INFOTABLE */,
            modification: "REPLACE" /* STRING */,
            dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
        };

        // result: INFOTABLE dataShape: "undefined"
        var result = me.Update(params);
    }
} catch (err) {
    logger.warn("ERROR caught");
    var params = {
        ufids: ufids /* INFOTABLE */,
        dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
    };

    // result: INFOTABLE dataShape: "undefined"
    var result = me.CancelCheckOut(params);
}

var params = {
    ufids: checkedOutObjs /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// result: INFOTABLE dataShape: "undefined"
var result = me.CheckIn(params);
View full tip
Sampling Strategy

This blog post covers the 4 sampling strategies that are available in ThingWorx Analytics. It will tell you how each sampling strategy runs behind the scenes, when you may want to use it, and the pros and cons of each.

SAMPLE_WITH_REPLACEMENT

This strategy is not often used by professionals but still may be useful in certain circumstances. When you sample with replacement, each value you randomly select is returned to the sample pool, so there is a chance that the same record appears multiple times in your sample.

Example: Let's say you have a hat that contains 3 cards with different people's names on them: John, Sarah, Tom. You make 2 random selections. On the first selection you pull out the name Tom. When you sample with replacement, you put the Tom card back into the hat and then randomly select a card again. For your second selection, you could get another name, like Sarah, or the same one you already selected, Tom.

Pros: May find improved models in smaller datasets with low row counts.
Cons: The accuracy of the model may be artificially inflated due to duplicates in the sample.

SAMPLE_WITHOUT_REPLACEMENT

This is the default setting in ThingWorx Analytics and the sampling strategy most commonly used by professionals. After a value is randomly selected from the sample pool, it is not returned. This ensures that all the values selected for the sample are unique.

Example: Using the same hat of 3 names, on the first selection you pull out the name Tom. When you sample without replacement, you randomly select again without putting the Tom card back, so for your second selection you could only get the Sarah or John card.

Pros: The most commonly used strategy; it delivers the best results in most cases.
Cons: May not be the best choice if the desired goal is underrepresented in the dataset.

UPSAMPLE_AND_SAMPLE_WITHOUT_REPLACEMENT

This is useful when the desired goal is underrepresented in the dataset. The records that represent the desired outcome of the goal are copied multiple times so that they represent a larger share of the total dataset.

Example: Let's say you are trying to discover if a patient is at risk of developing a rare condition, like chronic kidney failure, that affects around 0.5% of the US population. In this case, the most "accurate" model would simply say that no one will get the condition, and according to the numbers, it would be right 99.5% of the time. But in reality this is not helpful at all, since you want to know if the patient is at risk of developing the condition. To avoid this, copies are made of the records where the patient did develop the condition so that they represent a larger share of the dataset. Doing this gives ThingWorx Analytics more examples to help it generate a more accurate model.

Pros: Patterns from the original dataset remain intact.
Cons: Longer training time.

DOWNSAMPLE_AND_SAMPLE_WITHOUT_REPLACEMENT

This is also useful when the desired goal is underrepresented in the dataset. In downsample and sample without replacement, some records that do not represent the desired goal outcome are removed, which increases the desired records' percentage of the dataset.

Example: Continuing the medical example from above, instead of creating copies of the desired records, undesired records are removed from the dataset. This causes the records where patients did develop the condition to occupy a larger percentage of the dataset.

Pros: Shorter training time.
Cons: Patterns from the original dataset may be lost.
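The first two strategies are easy to see in plain code. A small illustrative sketch (generic JavaScript, not ThingWorx Analytics internals):

// Draw n values, returning each drawn value to the pool (duplicates possible)
function sampleWithReplacement(pool, n) {
    var out = [];
    for (var i = 0; i < n; i++) {
        out.push(pool[Math.floor(Math.random() * pool.length)]);
    }
    return out;
}

// Draw n values, removing each drawn value from the pool (all results unique)
function sampleWithoutReplacement(pool, n) {
    var copy = pool.slice(); // don't disturb the original pool
    var out = [];
    for (var i = 0; i < n && copy.length > 0; i++) {
        var idx = Math.floor(Math.random() * copy.length);
        out.push(copy.splice(idx, 1)[0]);
    }
    return out;
}

var hat = ["John", "Sarah", "Tom"];
sampleWithReplacement(hat, 2);    // e.g. ["Tom", "Tom"] is possible
sampleWithoutReplacement(hat, 2); // always two distinct names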
View full tip
This video is the 2nd part of a series of two videos walking you through the configuration of Analysis Event, which is applied for real-time scoring. This part 2 video will walk you through the configuration of Analysis Event for real-time scoring, and validate that a predictions job has been executed based on new input data.

Updated Link for access to this video: Analytics Manager 7.4: Configure Analysis Event & Real Time Scoring Part 2 of 2
View full tip
The following code snippet will retrieve a month's worth of audit data from the system and return it as a CSV document suitable for import into your spreadsheet or reporting tool of choice.

import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.drm.sdk.Context
import com.axeda.common.sdk.id.Identifier
import com.axeda.services.v2.*
import com.axeda.sdk.v2.exception.*

def ac = new AuditCriteria()
ac.fromDate = Date.parse('yyyy-MM-dd', '2017-03-01')
ac.toDate   = Date.parse('yyyy-MM-dd', '2017-03-31')

def retString = ''
def results
tcount = 0
while ((results = auditBridge.find(ac)) != null && tcount < results.totalCount) {
    results.audits.each { res ->
        retString += "${res?.user?.id},${res?.asset?.serialNumber},${res?.category},${res.message},${res.date}\n"
        tcount++
    }
    ac.pageNumber = ac.pageNumber + 1
}

return retString
View full tip