IoT & Connectivity Tips

Video Author: Asia Garrouj
Original Post Date: March 31, 2017
Applicable Releases: ThingWorx Analytics 7.4 to 8.1

Description: This video walks you through the first steps of setting up Analytics Manager for Real-Time Scoring, and demonstrates how to share your predictive model from Analytics Builder into Analytics Manager as well as how to test the shared model.
View full tip
Video Author: Asia Garrouj
Original Post Date: March 31, 2017
Applicable Releases: ThingWorx Analytics 7.4 to 8.1

Description: This video walks you through the first steps of setting up Analytics Manager for Real-Time Scoring, and demonstrates how to create an Analysis Provider and start the ThingPredictor Agent.

Please Note: The startup command for the Agent shown in this video changed in Release 8.1. Please refer to the PTC Help Center.
View full tip
Video Author: Asia Garrouj
Original Post Date: December 9, 2016
Applicable Releases: ThingWorx Analytics 52.0 to 8.1

Description: This video walks you through how to upload data and shows the configuration settings.

Please Note: The configuration settings page shown in this video differs in ThingWorx Analytics 8.1.
View full tip
Video Author: Mohammed Amine Chehaibi
Original Post Date: December 2, 2016
Applicable Releases: ThingWorx Analytics 52 to 8.0

Description: In this video we will be using Postman to:
- Create a dataset
- Enter the dataset configuration
- Upload the CSV data file to the ThingWorx Analytics Server
View full tip
Video Author: Christophe Morfin
Original Post Date: September 13, 2016
Applicable Releases: ThingWorx Analytics 52.1 to 8.1

Description: In this video we cover the configuration steps required for the ThingWorx Analytics Builder extension.

Please Note: This video uses Classic Composer. The same operations can be done using the New Composer starting with version 8.0, as illustrated in the Help Center. For release 8.1 the Settings menu differs from previous versions; see What's New in ThingWorx Analytics Builder 8.1 (between 00:12 and 00:40) for an up-to-date menu selection.
View full tip
Video Author: Christophe Morfin
Original Post Date: September 13, 2016
Applicable Releases: ThingWorx Analytics 52.1 to 8.1

Description: This video covers:
- A short introduction to ThingWorx Analytics Builder
- The import of the ThingWorx Analytics Builder extension
View full tip
I have put together a small sample of how to get property values from a Windows PowerShell command into ThingWorx through an agent using the Java SDK. In order to use this you need to import entities from ExampleExport.xml and then run SteamSensorClient.java, passing in the parameters shown in run-configuration.txt (URL, port and AppKey must be adapted for your system). ExampleExport.xml is a sample file distributed with the Java SDK, which I've also included in the zip file attached to this post. In ThingWorx Composer, go to Import/Export … Import from File … Entities … Single File … Choose File … Import. Further instructions / details are given in this short video: Video Link : 2181
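As a rough illustration of the PowerShell side of this, here is a minimal Java sketch (not taken from the attached sample) that launches a PowerShell command and reads back a single numeric value. The cmdlet shown is just an example; the actual push to the platform is then done through the Java SDK client, as in SteamSensorClient.java.

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class PowerShellPropertyReader {
    public static void main(String[] args) throws Exception {
        // Example only: read free physical memory (in KB) via PowerShell.
        // Any command that prints a single value works the same way.
        ProcessBuilder pb = new ProcessBuilder(
                "powershell.exe", "-NoProfile", "-Command",
                "(Get-CimInstance Win32_OperatingSystem).FreePhysicalMemory");
        pb.redirectErrorStream(true);
        Process process = pb.start();

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line = reader.readLine();
            process.waitFor();
            if (line == null) {
                System.err.println("No output from PowerShell");
                return;
            }
            double value = Double.parseDouble(line.trim());
            System.out.println("Value read from PowerShell: " + value);
            // In the attached sample this value would be pushed to ThingWorx by the
            // SteamSensor VirtualThing (set the property, then update subscribed
            // properties) - see the Java SDK documentation for the exact calls.
        }
    }
}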
View full tip
Sometimes you need to do something on a schedule. Axeda Platform is primarily focused on processing events as they happen, but when some action needs to take place periodically, there are Rule Timers. A Rule Timer has a schedule to run and a list of rules it is associated with.

Rule Timer Schedules

A schedule is defined by a string using the cron syntax. This syntax is extremely flexible and powerful, but can be hard to understand. The fields are as follows:

- Seconds (0-59)
- Minutes (0-59)
- Hours (0-23)
- Day-of-Month (1-31)
- Month (1-12) OR jan, feb, mar, apr ...
- Day-of-Week (0-7, where Sunday is 0 or 7) OR sun, mon, tue, wed, thu, fri, sat
- Year (optional field)

Some examples:

- "0 0 12 ? * WED" - fire every Wednesday at 12:00 pm
- "0 0/5 * * * ?" - fire every 5 minutes
- "0 0 2 * * ?" - fire at 2 am every day

Note: Rule Timer schedules are in GMT/UTC.

Associated Expression Rules

The Rule Timer has no other purpose but to run Expression Rules, which can be System or Asset rules. An Expression Rule with type SystemTimer runs once per scheduled time; a system timer can run a script to export data every day, or enable some rules at the end of a beta program. An Expression Rule with type AssetTimer runs for all associated assets at the scheduled time.

Say you want to save the max speed every day for a fleet of vehicles. You have an Expression Rule associated with the model Vehicle, and a Rule Timer set to run every day:

Rule SetMax
  Type: Data
  If: Speed > MaxSpeed
  Then: SetDataItem("MaxSpeed", Speed)

Rule DailyMax
  Type: AssetTimer
  If: MaxSpeed > 0
  Then: SetDataItem("DailyMax", MaxSpeed) && SetDataItem("MaxSpeed", 0)
  Associated to: Vehicles

RuleTimer NightlyUpdate
  Schedule: "0 0 9 * * * ?"
  Associated rule: DailyMax

The SetMax rule stores the max speed in a data item for each Vehicle asset. The NightlyUpdate timer runs at 9 am UTC (4 am US Eastern Standard Time, 1 am Pacific Standard Time). It writes the max into a DailyMax data item and clears MaxSpeed to get ready for another day. The effect of the timer is that each Vehicle asset will process an event and run any rules that apply to it.

Note! If you create the MaxSpeed data item through the model wizard, it doesn't have an initial value, so the rule condition "Speed > MaxSpeed" will not evaluate until the first day the timer sets the value to 0.
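The Axeda Platform evaluates these schedule strings itself, but if you want to sanity-check an expression before saving it, the same cron layout can be parsed with the Quartz scheduler's CronExpression class. This is purely an illustration with an outside library, not something the platform requires; note that Quartz numbers day-of-week 1 to 7 with SUN = 1, so prefer the named day forms when cross-checking.

import java.text.ParseException;
import java.util.Date;
import java.util.TimeZone;

import org.quartz.CronExpression;

public class RuleTimerScheduleCheck {
    public static void main(String[] args) throws ParseException {
        // The example schedules from above
        String[] schedules = { "0 0 12 ? * WED", "0 0/5 * * * ?", "0 0 2 * * ?" };

        for (String schedule : schedules) {
            CronExpression cron = new CronExpression(schedule);
            // Rule Timer schedules are evaluated in GMT/UTC
            cron.setTimeZone(TimeZone.getTimeZone("UTC"));
            System.out.println(schedule + " -> next fire time (UTC clock): "
                    + cron.getNextValidTimeAfter(new Date()));
        }
    }
}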
View full tip
This Expert Session consists of a general overview of multitenancy and platform security. It discusses the available security levels and the necessary basic resources, provides information on the system user, and includes several how-to examples. It is assumed that the audience is familiar with Composer and its navigation.

For full-sized viewing, click on the YouTube link in the player controls.

Visit the Online Success Guide to access our Expert Session videos at any time as well as additional information about ThingWorx training and services.
View full tip
This Expert Session provides an in-depth explanation of how Signals are calculated in ThingWorx Analytics, what purpose they serve, and why we use them. Some basic mathematical concepts are discussed so viewers will have a better idea of how ThingWorx Analytics operates behind the scenes.

For full-sized viewing, click on the YouTube link in the player controls.

Visit the Online Success Guide to access our Expert Session videos at any time as well as additional information about ThingWorx training and services.
View full tip
This Expert Session goes over ways to identify and develop a successful use case for ThingWorx Analytics. The example use case presented here is employee retention at a fictional company, with the goal of maximizing retention. This presentation will provide you with all the fundamentals you need to develop your own ThingWorx Analytics use cases from the ground up.

For full-sized viewing, click on the YouTube link in the player controls.

Visit the Online Success Guide to access our Expert Session videos at any time as well as additional information about ThingWorx training and services.
View full tip
Ran into this recently and thought I'd share an approach to getting a table that is distinct on multiple columns yet retains all the columns of the row. If you use Distinct, you get only the columns you do the Distinct on. This isn't very helpful if you want the 'latest' or the 'first occurrences' of records in your table where a combination of fields is unique. For example, I had Process, Part, Dimension and Point for which I had multiple value and date-time entries, but I only wanted the latest entries. Following is how I solved it; if you have a better way please leave a comment! P.S.: for the query I used the awesome query builder available in the snippet section!

---------------------------------------

var q1Result = Things["MyThing"].QueryStreamEntriesWithData({maxItems:99999, query:query1});

// Below creates a temporary measurement table to store the latest measurement values
var params = {
    infoTableName : "InfoTable",
    dataShapeName : "MyDatashape.DS"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(MyDataShape.DS)
var tempTable1 = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// Extract only the latest measurements for the PART from the measurement result table 'q1Result'.
// The way we are going to reduce this to unique measurements is:
// 1. records are in reverse order of date time
// 2. get distinct by Process, Part, Dimension, Point
// 3. step through and match against the distinct set
// 4. first match goes into the final set
// 5. upon match, remove from the distinct set
// 6. if no match, skip the record
// 7. if no more distinct records to match, break the loop

var params = {
    t: q1Result /* INFOTABLE */,
    columns: 'ProcessID,PartID,Dimension,Point' /* STRING */
};

// result: INFOTABLE
var distinctResult = Resources["InfoTableFunctions"].Distinct(params);

for (var x = 0; x < q1Result.rows.length; x++) {
    var query = {
        "filters": {
            "type": "AND",
            "filters": [
                { "fieldName": "ProcessID", "type": "EQ", "value": q1Result.rows[x].ProcessID },
                { "fieldName": "PartID",    "type": "EQ", "value": q1Result.rows[x].PartID },
                { "fieldName": "Dimension", "type": "EQ", "value": q1Result.rows[x].Dimension },
                { "fieldName": "Point",     "type": "EQ", "value": q1Result.rows[x].Point }
            ]
        }
    };

    var params = {
        t: distinctResult /* INFOTABLE */,
        query: query /* QUERY */
    };

    // result: INFOTABLE
    var matchResult = Resources["InfoTableFunctions"].Query(params);

    if (matchResult.rows.length == 1) {
        tempTable1.AddRow(q1Result.rows[x]);

        var params = {
            t: distinctResult /* INFOTABLE */,
            query: query /* QUERY */
        };

        // result: INFOTABLE
        var distinctResult = Resources["InfoTableFunctions"].DeleteQuery(params);

        if (distinctResult.rows.length == 0) {
            break;
        }
    }
}

// I now have tempTable1 with the full rows, distinct on the 4 fields
result = tempTable1;
View full tip
/* Define a DataShape used in an InfoTable parameter for this service call */
twDataShape* sampleInfoTableAsParameterDs = twDataShape_Create(twDataShapeEntry_Create("ColumnA", NO_DESCRIPTION, TW_STRING));
twDataShape_AddEntry(sampleInfoTableAsParameterDs, twDataShapeEntry_Create("ColumnB", NO_DESCRIPTION, TW_NUMBER));
twDataShape_AddEntry(sampleInfoTableAsParameterDs, twDataShapeEntry_Create("ColumnC", NO_DESCRIPTION, TW_BOOLEAN));
twDataShape_SetName(sampleInfoTableAsParameterDs, "SampleInfoTableAsParameterDataShape");

/* Define an input parameter that is an InfoTable of shape SampleInfoTableAsParameterDataShape */
twDataShapeEntry* infoTableDsEntry = twDataShapeEntry_Create("itParam", NULL, TW_INFOTABLE);
twDataShapeEntry_AddAspect(infoTableDsEntry, "dataShape", twPrimitive_CreateFromString("SampleInfoTableAsParameterDataShape", TRUE));

twDataShape* inputParametersDefinitionDs = twDataShape_Create(infoTableDsEntry);

/* Register the remote function */
twApi_RegisterService(TW_THING, SERVICE_INTEGRATION_THINGNAME, "testMultiRowInfotable", NO_DESCRIPTION,
    inputParametersDefinitionDs, TW_NOTHING, NULL, PlatformCallsServiceWithMultiRowInfoTableServiceImpl, NULL);

/* Note that you will have to manually create the DataShape in ThingWorx before attempting to add this remote service to your Thing. */
View full tip
This video is the 2nd part of a series of 3 videos walking you through how to set up ThingWatcher for Anomaly Detection. In this video you will learn how to use "Discover UI" from the "New Composer" to bind simulated data coming through KEPServer for Anomaly Detection.

Updated Link for access to this video: Anomaly Detection 8.0: Configuring Anomaly Alerts: Part 2 of 3
View full tip
One recurring question that comes up after a customer has been using the scripting capabilities of the Axeda Platform for some time is how to create function libraries that are reusable, in order to reduce the amount of copying, pasting (and testing) that is done to create new functionality. Below I demonstrate a mechanism for accomplishing this. Some things that are typically included in such a library are:

- Customized ExtendedObject access methods (CRUD) - ExtendedObjects must be created before they can be used, so this can be encapsulated per customer requirements
- DataItem manipulation
- Gathering lists of Assets based on criteria
- Accessing the ExternalCredentialsBridge

For those unfamiliar with Custom Objects I suggest some resources at the end of this document to get started. The first thing we want to do is create a script that is going to be our "Library of Functions":

FunctionLibrary.groovy:

class GroovyChild {
    String hello() {
        return "Hello"
    }

    String world() {
        return "World"
    }
}

return new GroovyChild()

This can then be called like this:

FunctionCaller.groovy:

import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.services.v2.CustomObjectCriteria
import com.axeda.services.v2.CustomObjectType
import com.axeda.services.v2.CustomObject

CustomObjectCriteria cOC1 = new CustomObjectCriteria()
cOC1.setName 'FunctionLibrary'
def co = customObjectBridge.findOne cOC1
result = customObjectBridge.execute(co.label)

return ['Content-Type': 'application/text', 'Content': result.hello() + ' ' + result.world()]

A developer would be wise to add null checking on some of the function returns and do some error reporting if the FunctionLibrary cannot be found and executed. This is only meant as a means to start users on the path of building their own reusable content libraries.

Regards,
Chris Kaminski
PTC/Axeda Customer Support

References:
Axeda® Platform Web Services Developer's Reference, v2 REST 6.8.3, August 2015
Axeda® v2 API/Services Developer's Reference, Version 6.8.3, August 2015
Axeda® v1 API Developer's Reference Guide, Version 6.8, August 2014
Documentation Map for Axeda® 6.8.2, January 2015
View full tip
In the process of working with a customer, I was curious as to the throughput of a file sent via the Axeda Connected Content feature to one of the Axeda Agent Gateways.

I took a random 50 megabyte blob of data (/dev/urandom) and sent it to one of my test Gateways via a Package deployment:

DEBUG   xgEnterpriseProxy: Enterprise Queue Empty
INFO    xgSM:  ... Download percent done = 11%
INFO    xgSM:  ... Download percent done = 21%
INFO    xgSM:  ... Download percent done = 31%
INFO    xgSM:  ... Download percent done = 41%
INFO    xgSM:  ... Download percent done = 51%
INFO    xgSM:  ... Download percent done = 61%
INFO    xgSM:  ... Download percent done = 71%
INFO    xgSM:  ... Download percent done = 81%
INFO    xgSM:  ... Download percent done = 91%
INFO    xgSM:  ... Download percent done = 100%
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Download time is 4 seconds
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Upgrading.  Backing up files to C:\temp\CFKGW\AxedaBackup
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Extracting downloaded files from DefaultProject\CFKGW\Downloads\141581_143281.tar.gz to directory C:\temp\CFKGW\
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Extraction Finished

About 12 MB per second. This was a sandbox in the PTC On-Demand Center. Not bad, but not necessarily representative of a real production system: this sandbox doesn't have 1000 devices trying to get this file at once, so some benchmarking in your configuration and environment certainly needs to be done.

That done, I thought I'd up the ante - 700 megabytes this time!

DEBUG   xgEnterpriseProxy: Enterprise Queue Empty
INFO    xgSM:  ... Download percent done = 10%
INFO    xgSM:  ... Download percent done = 20%
INFO    xgSM:  ... Download percent done = 30%
INFO    xgSM:  ... Download percent done = 40%
INFO    xgSM:  ... Download percent done = 50%
INFO    xgSM:  ... Download percent done = 60%
INFO    xgSM:  ... Download percent done = 70%
INFO    xgSM:  ... Download percent done = 80%
INFO    xgSM:  ... Download percent done = 90%
INFO    xgSM:  ... Download percent done = 100%
DEBUG   xgSM: >>  INTERNAL DEBUG MESSAGE << :  Download time is 66 seconds

So about 10 MB per second.

Directory of C:\temp\cfkgw

05/17/2017  01:32 PM    <DIR>          .
05/17/2017  01:32 PM    <DIR>          ..
05/17/2017  01:03 PM         1,048,576 1mb.dat
05/17/2017  01:04 PM        52,428,800 50meg-randomdata.dat
05/17/2017  01:32 PM       734,003,200 700mb.dat
05/17/2017  01:03 PM    <DIR>          AxedaBackup
               3 File(s)    787,480,576 bytes

Not bad at all!
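For reference, a quick throwaway calculation of those rates from the byte counts in the directory listing and the download times in the Agent log:

public class ThroughputEstimate {
    public static void main(String[] args) {
        // 50 MB package: 52,428,800 bytes downloaded in 4 seconds
        // 700 MB package: 734,003,200 bytes downloaded in 66 seconds
        printRate("50meg-randomdata.dat", 52428800L, 4);
        printRate("700mb.dat", 734003200L, 66);
    }

    private static void printRate(String file, long bytes, int seconds) {
        double mbPerSecond = (bytes / (1024.0 * 1024.0)) / seconds;
        System.out.printf("%s: %.1f MB/s%n", file, mbPerSecond);  // roughly 12.5 and 10.6 MB/s
    }
}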
View full tip
This is part 3 of 3 videos on Getting Started with ThingWorx Analytics. During this video you will learn:

- Executing a "Signals" Job
- Getting the results of the "Signals" Job
- Executing a "Training Model" Job
- Getting the results of the "Training Model" Job

Updated Link for access to this video: Getting Started with ThingWorx Analytics: Part 3 of 3
View full tip
Hi,

I have just been playing around with the Edge MicroServer on a couple of Raspberry Pis. I obviously want them connected to ThingWorx; however, they stop when the SSH session closes, which isn't ideal. I thought about doing something really quick and dirty using 'nohup', but this could have led to many running processes and still would not have started anything automatically when the Pi booted. So I did it right instead, using init.d daemon configurations.

There are two files; one for the EMS and one for the Lua Script Resource. You need to put these in /etc/init.d, then make sure they are owned by root.

sudo chown root /etc/init.d/thingworx*
sudo chgrp root /etc/init.d/thingworx*

You'll need to modify the paths in the first lines of these files to match where you have your microserver folder. Then you need to update the init.d daemon configs and enable the services.

sudo update-rc.d thingworx-ems defaults
sudo update-rc.d thingworx-ems enable
sudo update-rc.d thingworx-lsr defaults
sudo update-rc.d thingworx-lsr enable

You can then start (stop, etc.) like this:

sudo service thingworx-ems start
sudo service thingworx-lsr start

They should both also start automatically after a reboot.

Regards,

Greg
View full tip
This example shows how to update objects in Windchill through extensions. It is really hard to find a resource on the Windchill extension's services and how to take advantage of them, so I wrote a simple example that updates objects in Windchill from ThingWorx.

There are three Data Shapes needed to do this. One is "PTC.PLM.WindchillPartUfids", which has only a "value" field (String); another is "PTC.PLM.WindchillPartCheckedOutDS", which has a "ufid" field (String). The last one is "PTC.PLM.WindchillPartPropertyDS", which has a "ufid" field (String) plus fields for the attributes. For an instance of the last Data Shape there might be three fields, such as "ufid", "partPrice" and "quantity", to update parts. In this example, this Data Shape has two fields: "ufid" and "almProjectId".

The service needs two input parameters: ufid (String) and almProjectId (String). If you need to update multiple objects at once, you can use an InfoTable type as the "ufid" input parameter instead of String.

Note that this is example code; exceptions need to be handled as required.

// Build the list of UFIDs to check out
var params = {
    infoTableName : "InfoTable",
    dataShapeName : "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var ufids = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

// PTC.PLM.WindchillPartUfids entry object
var newValue = new Object();
newValue.value = ufid; // STRING

ufids.AddRow(newValue);

// Check out
var params = {
    ufids: ufids /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// checkedOutObjs: INFOTABLE dataShape: "undefined"
var checkedOutObjsFromService = me.CheckOut(params);

var params = {
    infoTableName : "InfoTable",
    dataShapeName : "PTC.PLM.WindchillPartUfids"
};

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartUfids)
var checkedOutObjs = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

try {
    var tableLength = checkedOutObjsFromService.rows.length;

    for (var x = 0; x < tableLength; x++) {
        var row = checkedOutObjsFromService.rows[x];

        // PTC.PLM.WindchillPartUfids entry object
        var checkedOutObj = new Object();
        checkedOutObj.value = row.ufid.substring(0, row.ufid.lastIndexOf(":")); // STRING

        //logger.warn("UFID : " + checkedOutObj.value);
        checkedOutObjs.AddRow(checkedOutObj);

        /* Update objects in Windchill */
        var params = {
            infoTableName : "InfoTable",
            dataShapeName : "PTC.PLM.WindchillPartPropertyDS"
        };

        // CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(PTC.PLM.WindchillPartPropertyDS)
        var wcInfoTable = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);

        // PTC.PLM.WindchillPartPropertyDS entry object
        var newEntry = new Object();
        newEntry.ufid = checkedOutObj.value; // STRING
        newEntry.almProjectId = almProjectId; // STRING

        wcInfoTable.AddRow(newEntry);

        var params = {
            objects: wcInfoTable /* INFOTABLE */,
            modification: "REPLACE" /* STRING */,
            dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
        };

        // result: INFOTABLE dataShape: "undefined"
        var result = me.Update(params);
    }

} catch(err) {
    logger.warn("ERROR Catched");
    var params = {
        ufids: ufids /* INFOTABLE */,
        dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
    };

    // result: INFOTABLE dataShape: "undefined"
    var result = me.CancelCheckOut(params);
}

var params = {
    ufids: checkedOutObjs /* INFOTABLE */,
    comment: undefined /* STRING */,
    dataShape: "PTC.PLM.WindchillPartCheckedOutDS" /* DATASHAPENAME */
};

// result: INFOTABLE dataShape: "undefined"
var result = me.CheckIn(params);
View full tip