
IoT Tips

Calling external services from M2M applications is a critical aspect of building end-to-end solutions. Knowing how to apply network timeouts when connecting to external servers can prevent unexpected and problematic network hang-ups. Let's investigate how to create a safe networking flow using HttpClient, HttpBuilder, and Apache's FTPClient class.

Background

Custom Objects called from Expression Rules have a configurable maximum execution time, set by the com.axeda.drm.rules.statistics.rule-time-threshold property. Without this safeguard in place, long-running or misbehaving Custom Objects can cause internal processing queues to fill, and the server will suffer performance degradation.

In Java (and Groovy) all network calls internally use InputStream.read() to establish the socket connection and to read data from the socket. It is possible for faulty external servers (such as an FTP server) to hang and not respond properly, in which case InputStream.read() will wait indefinitely for the server to respond with data. According to the Java spec, InputStream.read() may be uninterruptible while it is waiting for data. This means that even if a Custom Object has exceeded the com.axeda.drm.rules.statistics.rule-time-threshold, the Rule Sniper will still not be able to interrupt the Custom Object's execution while it is blocked in InputStream.read(). Because the Custom Object cannot be stopped, the internal processing queues will eventually fill. Even though InputStream.read() is uninterruptible, it is still possible to set timeouts so that it gives up on a connection. Beyond that, we want to make sure that the connection is completely disconnected.

Types of Timeouts

There are typically two types of timeouts that should be set when making calls over the web: the Connection Timeout and the Socket Timeout. The Connection Timeout is the maximum amount of time allowed for establishing the bi-directional socket connection between the client and the server. Behind the scenes, establishing the socket connection involves resolving the domain name of the server to an IP address and then the server opening a port to connect with the client's port. The Socket Timeout limits the amount of time each individual socket operation is allowed to take; in particular, it limits how long InputStream.read() will listen for a server's response. If a server is faulty or overloaded it may take a long time (or forever) to respond to a request, and this timeout limits how long the client will wait for that response.

When making any calls from a Custom Object to an external server (whether WebService calls or FTP transfers), you should always set both the Connection Timeout and the Socket Timeout, and keep them as small as reasonably possible. Failure to do so could unexpectedly impact your Axeda server. Consider a Custom Object that takes an average of 10 seconds to run and is called once a minute to make an external WebService call. This will not cause any issues and the system will be stable. If the external server suddenly suffers a performance degradation and the external WebService call now takes over a minute to run, the execution queue will eventually fill, causing performance degradation to the Axeda system. To protect against this scenario, set the timeouts to limit the call to one minute, and log whenever the time limit is exceeded.
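To illustrate the two timeout types independently of any particular client library, here is a minimal sketch using the standard java.net.HttpURLConnection class (the URL and timeout values are illustrative only; the library-specific examples below show the recommended approach from Custom Objects):

import java.net.HttpURLConnection

int TENSECONDS  = 10 * 1000
int THIRTYSECONDS = 30 * 1000

HttpURLConnection conn = (HttpURLConnection) new URL("http://www.axeda.com").openConnection()
//Connection Timeout: maximum time allowed to establish the socket connection
conn.setConnectTimeout(TENSECONDS)
//Socket (read) Timeout: maximum time each read on the socket may block
conn.setReadTimeout(THIRTYSECONDS)
try {
    conn.inputStream.withReader { reader ->
        reader.eachLine { line -> logger.info line }
    }
} finally {
    //Always release the underlying connection
    conn.disconnect()
}
return true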
Examples

Provided below are examples of properly set timeouts and thorough connection management using HttpClient, HttpBuilder, and FTPClient. All of these examples assume they are being executed from Custom Objects.

By default, set the Connection Timeout to 10 seconds. In normal circumstances, connections should not take more than 10 seconds; if they exceed this time, there is a good chance of networking issues between the client and server. The Socket Timeout can vary per use case. The examples provided set the Socket Timeout to 30 seconds, which should be sufficient for typical WebService calls and small to medium sized FTP file transfers. Depending on exactly what is being done, the timeout may have to be increased. If you expect the call to take over 5 minutes, please contact Axeda Support to investigate increasing the com.axeda.drm.rules.statistics.rule-time-threshold property (which defaults to 5 minutes).

HttpClient

//HttpClient
import org.apache.http.client.HttpClient
import org.apache.http.impl.client.DefaultHttpClient
import org.apache.http.client.methods.HttpGet
import org.apache.http.HttpResponse
import org.apache.http.params.BasicHttpParams
import org.apache.http.params.HttpParams
import org.apache.http.params.HttpConnectionParams

int TENSECONDS  = 10*1000
int THIRTYSECONDS = 30*1000

final HttpParams httpParams = new BasicHttpParams()
//Establishing the connection should take <10 seconds in most circumstances
HttpConnectionParams.setConnectionTimeout(httpParams, TENSECONDS)
//The data transfer/call should take <30 seconds.  Adjust as necessary if receiving large data sets.
HttpConnectionParams.setSoTimeout(httpParams, THIRTYSECONDS)

HttpClient hc = new DefaultHttpClient(httpParams)
try {
  //Simply get the contents of http://www.axeda.com and log it to the Custom Object Log
  HttpGet get = new HttpGet("http://www.axeda.com")
  HttpResponse response = hc.execute(get)
  BufferedReader br = new BufferedReader( new InputStreamReader( response.getEntity().getContent()))
  br.readLines().each {
    logger.info it
  }
} finally {
  //Make sure to shutdown the connectionManager
  hc.getConnectionManager().shutdown()
}
return true

https://gist.github.com/axeda/5189092/raw/2f7b93c5f96ed8f445df4364b885486bc6fa1feb/HttpClientTimeouts.groovy

HttpBuilder

import groovyx.net.http.HTTPBuilder
import static groovyx.net.http.ContentType.*
import static groovyx.net.http.Method.*

int TENSECONDS  = 10*1000;
int THIRTYSECONDS = 30*1000;

HTTPBuilder builder = new HTTPBuilder('http://www.axeda.com')
//HTTPBuilder has no direct methods to add timeouts.  We have to add them to the HttpParams of the underlying HttpClient
builder.getClient().getParams().setParameter("http.connection.timeout", new Integer(TENSECONDS))
builder.getClient().getParams().setParameter("http.socket.timeout", new Integer(THIRTYSECONDS))

try {
  //Simply get the contents of http://www.axeda.com and log it to the Custom Object Log
  builder.request(GET, TEXT){
    response.success = { resp, res ->
      res.readLines().each {
        logger.info it
      }
    }
  }
} finally {
  //Make sure to always shut down the HTTPBuilder when you’re done with it
  builder.shutdown()
}
return true

https://gist.github.com/axeda/5189102/raw/66bb3a4f4f096681847de1d2d38971e6293c4c6b/HttpBuilderTimeouts.groovy

FtpClient

Apache’s FTPClient has a third type of timeout, the Default Timeout. The Default Timeout further ensures that socket timeouts are always used.
Note: Default Timeout does not set a timeout for the .connect() method.

import org.apache.commons.net.ftp.*
import java.io.InputStream
import java.io.ByteArrayInputStream

String ftphost = "127.0.0.1"
String ftpuser = "test"
String ftppwd = "test"
int ftpport = 21
String ftpDir = "tmp/FTP"

int TENSECONDS  = 10*1000
int THIRTYSECONDS = 30*1000

//Declare FTP client
FTPClient ftp = new FTPClient()

try {
  ftp.setConnectTimeout(TENSECONDS)
  ftp.setDefaultTimeout(TENSECONDS)
  ftp.connect(ftphost, ftpport)

  //30 seconds to log on.  Also 30 seconds to change to working directory.
  ftp.setSoTimeout(THIRTYSECONDS)

  def reply = ftp.getReplyCode()
  if (!FTPReply.isPositiveCompletion(reply))
  {
    throw new Exception("Unable to connect to FTP server")
  }

  if (!ftp.login(ftpuser, ftppwd))
  {
    throw new Exception("Unable to login to FTP server")
  }

  if (!ftp.changeWorkingDirectory(ftpDir))
  {
    throw new Exception("Unable to change working directory on FTP server")
  }

  //Change the timeout here for a large file transfer that will take over 30 seconds
  //ftp.setSoTimeout(THIRTYSECONDS);

  ftp.setFileType(FTPClient.ASCII_FILE_TYPE)
  ftp.enterLocalPassiveMode()

  String filetxt = "Some String file content"
  InputStream is = new ByteArrayInputStream(filetxt.getBytes('US-ASCII'))
  try
  {
    if (!ftp.storeFile("myFile.txt", is))
    {
      throw new Exception("Unable to write file to FTP server")
    }
  } finally
  {
    //Make sure to always close the inputStream
    is.close()
  }
} catch(Exception e) {
  //handle exceptions here by logging or auditing
} finally {
  //if the IO is timed out or force disconnected, exceptions may be thrown when trying to logout/disconnect
  try
  {
    //10 seconds to log off.  Also 10 seconds to disconnect.
    ftp.setSoTimeout(TENSECONDS);
    ftp.logout();
    //depending on the state of the server the .logout() may throw an exception,
    //we want to ensure complete disconnect.
  }
  catch(Exception innerException)
  {
    //You potentially just want to log that there was a logout exception.
  }
  finally
  {
    //Make sure to always disconnect.  If not, there is a chance you will leave hanging sockets
    ftp.disconnect();
  }
}
return true

https://gist.github.com/axeda/5189120/raw/83545305a38d03b6a73a80fbf4999be3d6b3e74e/FtpClientConnectionTimeouts.groovy
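The same timeout pattern applies when downloading a file instead of uploading one. As a minimal sketch (assuming the same connect/login steps and timeouts as above, and a hypothetical remote file name), FTPClient's retrieveFile() can be wrapped the same way:

//Assumes ftp is an FTPClient that has already connected and logged in with the timeouts shown above.
//"myFile.txt" and the local handling below are illustrative only.
ByteArrayOutputStream os = new ByteArrayOutputStream()
try {
  //Increase the socket timeout first if the file is large enough to take more than 30 seconds
  if (!ftp.retrieveFile("myFile.txt", os)) {
    throw new Exception("Unable to read file from FTP server")
  }
  logger.info os.toString('US-ASCII')
} finally {
  //Make sure to always close the outputStream
  os.close()
}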
View full tip
This Groovy script returns the list of DeviceGroups that contain a given asset, looked up by serial number.

import com.axeda.drm.sdk.device.*;
import com.axeda.drm.sdk.Context;

Context sysContext = Context.create();
DeviceGroupFinder dgf = new DeviceGroupFinder(sysContext);
DeviceFinder devFinder = new DeviceFinder(sysContext);

devFinder.setSerialNumber("[your serialnumber here]");
Device myDevice = devFinder.find();

dgf.setDeviceId(myDevice.getId());
List<DeviceGroup> allGroups = dgf.findAll();

allGroups.each { group ->
  logger.debug(group.getName());
}
View full tip
The following script is a component of the Axeda Connected Configuration (CMDB) feature.  It is used to provide configuration data for controlling package deployments via Connected Content (SCM). ​ ConfigItem_CRU.groovy *Takes a POST request, not parameters import static com.axeda.sdk.v2.dsl.Bridges.* import com.axeda.drm.sdk.scripto.Request import com.axeda.services.v2.ConfigurationItem import com.axeda.services.v2.ConfigurationItemCriteria import com.axeda.services.v2.AssetConfiguration import com.axeda.services.v2.Asset import com.axeda.services.v2.ExecutionResult import groovy.json.JsonSlurper import net.sf.json.JSONObject import groovy.xml.MarkupBuilder /** * ConfigItem_CRU.groovy * ----------------------- * * Reads in json from an http post request and reads, adds, deletes or updates Configuration Items. * * * @note this parses a post and does not take any additional parameters. * * @author sara streeter <sstreeter@axeda.com> */ def contentType = "application/json" final def serviceName = "ConfigItem_CRU" def response = [:] def writer = new StringWriter() def xml = new MarkupBuilder(writer) try {     // BUSINESS LOGIC BEGIN     def assetId     def validationOnly     def validationResponse = ""     List<ConfigurationItem> configItemList     if (Request?.body != null && Request?.body !="") {         def slurper = new JsonSlurper()         def request = slurper.parseText(Request?.body)         assetId = request.result.assetId         validationOnly = request.result.validationOnly?.toBoolean()         if (request.result.items != null && request.result.items.size() > 0){             configItemList = request.result.items.inject([]) { target, item ->               if (item && item.path != "" && item.key != "" && item.path != null && item.key != null){                     ConfigurationItem configItem = new ConfigurationItem()                     configItem.path = item.path + item.key                     configItem.value = item.value                     target << configItem                 }                 target             }         }     }       if (assetId != null) {               def asset = assetBridge.find([assetId])[0]             AssetConfiguration config = assetConfigurationBridge.getAssetConfiguration(assetId, "")               def itemToDelete                        if (config == null) {                     createConfigXML(xml)                     AssetConfiguration configToCreate = assetConfigurationBridge.fromXml(writer.toString(), asset.id)                     ExecutionResult result = assetConfigurationBridge.create(configToCreate)                     AssetConfiguration config2 = assetConfigurationBridge.getAssetConfiguration(asset.id, "")                     config = config2                     itemToDelete = "/Item"                 }                 if (configItemList != null && configItemList?.size() > 0){                 List<ConfigurationItem> compareList = config.items                 def intersectingCompareItems = compareList.inject(["save": [], "delete": []]) { map, item ->                     // find whether to delete                     def foundItem = configItemList.findAll{ compare -> item?.path == compare?.path && item?.value == compare?.value  }                     map[foundItem.size() > 0 ? 
"save" : "delete"] << item                     map                 }               intersectingCompareItems.delete = intersectingCompareItems.delete.collect{it.path}               if (itemToDelete){                 intersectingCompareItems.delete.add(itemToDelete)               }                 def intersectingConfigItems = configItemList.inject(["old": [], "new": []]) { map, item ->                     // find whether it's old                     def foundItem = compareList.findAll{ compare -> item?.path == compare?.path && item?.value == compare?.value }                     map[foundItem.size() > 0 ? "old" : "new"] << item                     map                 }                 assetConfigurationBridge.deleteConfigurationItems(config, intersectingCompareItems.delete)                 assetConfigurationBridge.appendConfigurationItems(config, intersectingConfigItems.new)               def exResult = assetConfigurationBridge.validate(config)               if (exResult.successful){                     validationResponse = "success"                     if (!validationOnly){                         assetConfigurationBridge.update(config)                     }               }                 else {                     validationResponse = exResult.failures[0]?.details                 }             }             response = [                 assetId: assetId,                 items: config?.items?.collect { item ->                 def origpath = item.path                 def lastSlash = origpath.lastIndexOf("/")                 def key = origpath.substring(lastSlash + 1, origpath.length())                        def path = origpath.replace("/" + key, "")                 path += "/"                     [                         path: path,                         key: key,                         value: item.value                     ]                 },                 validationResponse: validationResponse             ]       }         else {             throw new Exception("Error: Asset Id must be provided.")         } } catch (Exception ex) {       logger.error ex   response = [           error:  [                   type: "Backend Application Error"                   , msg: ex.getLocalizedMessage()           ]   ] } return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(response).toString(2)] /** * Create the Success response. * * @param xml : The xml response.<br> * @param info : If this is set to "1" the info element will be included in the response.<br> * @param infos : Collection of information to include within the info element of the response.<br> */ private void createConfigXML(xml) {     xml.Item() }  
View full tip
In ThingWorx Analytics, you have the possibility to use an external model for scoring. In this written tutorial, I would like to provide an overview of how you can use a model developed in Python, using the scikit-learn library in ThingWorx Analytics. The provided attachment contains an archive with the following files: iris_data.csv: A dataset for pattern recognition that has a categorical goal. You can click here to read more about this dataset TestRFToPmml.ipynb: A Jupyter notebook file with the source code for the Python model as well as the steps to export it to PMML RF_Iris.pmml: The PMML file with the model that you can directly upload in Analytics without going through the steps of training the model in Python The tutorial assumes you already have some knowledge of ThingWorx and ThingWorx Analytics. Also, if you plan to run the Python code and train the model yourself, you need to have Jupyter notebook installed (I used the one from the Anaconda distribution). For demonstration purposes, I have created a very simple random forest model in Python. To convert the model to PMML, I have used the sklearn2pmml library. Because ThingWorx Analytics supports PMML format 4.3, you need to install sklearn2pmml version 0.56.2 (the highest version that supports PMML 4.3). To read more about this library, please click here Furthermore, to use your model with the older version of the sklearn2pmml, I have installed scikit-learn version 0.23.2.  You will find the commands to install the two libraries in the first two cells of the notebook.   Code Walkthrough The first step is to import the required libraries (please note that pandas library is also required to transform the .csv to a Dataframe object):   import pandas from sklearn.ensemble import RandomForestClassifier from sklearn2pmml import sklearn2pmml from sklearn.model_selection import GridSearchCV from sklearn2pmml.pipeline import PMMLPipeline   After importing the required libraries, we convert the iris_data.csv to a pandas dataframe and then create the features (X) as well as the goal (Y) vectors:   iris_df = pandas.read_csv("iris_data.csv") iris_X = iris_df[iris_df.columns.difference(["class"])] iris_y = iris_df["class"]   To best tune the random forest, we will use the GridSearchCSV and cross-validation. We want to test what parameters have the best validation metrics and for this, we will use a utility function that will print the results:   def print_results(results): print('BEST PARAMS: {}\n'.format(results.best_params_)) means = results.cv_results_['mean_test_score'] stds = results.cv_results_['std_test_score'] for mean, std, params in zip(means, stds, results.cv_results_['params']): print('{} (+/-{}) for {}'.format(round(mean, 3), round(std * 2, 3), params))   We create the random forest model and train it with different numbers of estimators and maximum depth. We will then call the previous function to compare the results for the different parameters:   rf = RandomForestClassifier() parameters = { 'n_estimators': [5, 50, 250], 'max_depth': [2, 4, 8, 16, 32, None] } cv = GridSearchCV(rf, parameters, cv=5) cv.fit(iris_X, iris_y) print_results(cv)   To convert the model to a PMML file, we need to create a PMMLPipeline object, in which we pass the RandomForestClassifier with the tuning parameters we identified in the previous step (please note that in your case, the parameters can be different than in my example). 
You can check the sklearn2pmml  documentation  to see other examples for creating this PMMLPipeline object :   pipeline = PMMLPipeline([ ("classifier", RandomForestClassifier(max_depth=4,n_estimators=5)) ]) pipeline.fit(iris_X, iris_y)   Then we perform the export:   sklearn2pmml(pipeline, "RF_Iris.pmml", with_repr = True)   The model has now been exported as a PMML file in the same folder as the Jupyter Notebook file and we can upload it to ThingWorx Analytics.   Uploading and Exploring the PMML in Analytics To upload and use the model for scoring, there are two steps that you need to do: First, the PMML file needs to be uploaded to a ThingWorx File Repository Then, go to your Analytics Results thing (the name should be YourAnalyticsGateway_ResultsThing) and execute the service UploadModelFromRepository. Here you will need to specify the repository name and path for your PMML file, as well as a name for your model (and optionally a description)   If everything goes well, the result of the service will be an id. You can save this id to a separate file because you will use it later on. You can verify the status of this model and if it’s ready to use by executing the service GetDetails:   Assuming you want to use the PMML for scoring, but you were not the one to develop the model, maybe you don’t know what the expected inputs and the output of the model are. There are two services that can help you with this: QueryInputFields – to verify the fields expected as input parameters for a scoring job   QueryOutputFields – to verify the expected output of the model The resultType input parameter can be either MODELS or CLUSTERS, depending on the type of model,    Using the PMML for Scoring With all this information at hand, we are now ready to use this PMML for real-time scoring. In a Thing of your choice, define a service to test out the scoring for the PMML we have just uploaded. Create a new service with an infotable as the output (don’t add a datashape). The input data for scoring will be hardcoded in the service, but you can also add it as service input parameters and pass them via a Mashup or from another source. The script will be as follows:   // Values: INFOTABLE dataShape: "" let datasetRef = DataShapes["AnalyticsDatasetRef"].CreateValues(); // Values: INFOTABLE dataShape: "" let data = DataShapes["IrisData"].CreateValues(); data.AddRow({ sepal_length: 2.7, sepal_width: 3.1, petal_length: 2.1, petal_width: 0.4 }); datasetRef.AddRow({ data: data}); // predictiveScores: INFOTABLE dataShape: "" let result = Things["AnalyticsServer_PredictionThing"].RealtimeScore({ modelUri: "results:/models/" + "97471e07-137a-41bb-9f29-f43f107bf9ca", //replace with your own id datasetRef: datasetRef /* INFOTABLE */, });   Once you execute the service, the output should look like this (as we would have expected, according to the output fields in the PMML model):   As you have seen, it is easy to use a model built in Python in ThingWorx Analytics. Please note that you may use it only for scoring, and the model will not appear in Analytics Builder since you have created it on a different platform. If you have any questions about this brief written tutorial, let me know.
View full tip
Data Model Implementation Guide Part 3   Step 7: Unique Components Thing Templates   All of the shared component groups have been created. The next stage is creating the unique component group of ThingTemplates. Each of the below sections will cover one ThingTemplate, how the final property configuration should look, and any other aspects that should be added. The breakdown for the unique component group ThingTemplates is as follows:   Robotic Arm Properties   The properties for the RoboticArm ThingTemplate are as follows: Name Base Type Aspects Data Change Type TimeSincePickup NUMBER, Min Value: 0 Persistent and Logged ALWAYS Axis1 String Persistent and Logged VALUE Axis2 String Persistent and Logged VALUE Axis3 String Persistent and Logged VALUE ClampPressure NUMBER, Min Value: 0 Persistent and Logged ALWAYS ClampStatus String Persistent and Logged ALWAYS   Your properties should match the below configurations.   Pneumatic Gate Properties   The properties for the PneumaticGate ThingTemplate are as follows: Name Base Type Aspects Data Change Type GateStatus String Persistent and Logged ALWAYS   Your properties should match the below configurations.   Conveyor Belt Properties   The properties for the ConveyorBelt ThingTemplate are as follows: Name Base Type Aspects Data Change Type BeltSpeed INTEGER, Min Value: 0 Persistent and Logged ALWAYS BeltTemp INTEGER, Min Value: 0 Persistent and Logged ALWAYS BeltRPM INTEGER, Min Value: 0 Persistent and Logged ALWAYS   Your properties should match the below configurations.   Quality Control Camera   Properties   The properties for the QualityControlCamera ThingTemplate are as follows: Name Base Type Aspects Data Change Type QualityReading INTEGER, Min Value: 0 Persistent and Logged ALWAYS QualitySettings String Persistent and Logged ALWAYS CurrentImage IMAGE Persistent and Logged ALWAYS   Your properties should match the below configurations.   Event   Create a new Event named BadQuality. Select AlertStatus as the Data Shape. Your Event should match the below configurations:     Step 8: Data Tables and Data Shapes   For the Data Model we created, an individual DataTable would be best utilized for each products, production orders, and maintenance requests. Utilizing DataTables will allow us to store and track all of these items within our application. In order to have DataTables, we will need DataShapes to create the schema that each DataTable will follow. This database creation aspect can be considered a part of the Data Model design or a part of the Database Design. Nevertheless, the question of whether to create DataTables is based on the question of needed real time information or needed static information. Products, production orders, and maintenance requests can be considered static data. Tracking the location of a moving truck can be considered a need for real time data. This scenario calls for using DataTables, but a real time application will often have places where Streams and ValueStreams are utilized (DataShapes will also be needed for Streams and ValueStreams). NOTE: The DataShapes (schemas) shown below are for a simplified example. There are different ways you can create your database setup based on your own needs and wants. 
DataTable Name DataShape Purpose MaintenanceRequestDataTable MaintenanceRequest Store information about all maintenanced requests created ProductDataTable ProductDataShape Store information about the product line ProductionOrderDataTable ProductionOrderDataShape Store all information about production orders that have been placed   Maintenance Requests DataShape   The maintenance requests DataShape needs to be trackable (unique) and contain helpful information to complete the request. The DataShape fields are as follows: Name Base Type Additional Info Primary Key ID String NONE YES Title String NONE NO Request String NONE NO CompletionDate DATETIME NONE NO   Unless you’ve decided to change things around, your DataShape fields should match the below configurations.   Products DataShape   The product DataShape needs to be trackable (unique) and contain information about the product. The DataShape fields are as follows: Name Base Type Additional Info Primary Key ProductId String NONE YES Product String NONE NO Description String NONE NO Cost NUMBER Minimum: 0 NO   Unless you’ve decided to change things around, your DataShape fields should match the below configurations.   Production Order DataShape   The production order DataShape needs to be trackable (unique), contain information that would allow the operator and manager to know where it is in production, and information to help make decisions. The DataShape fields are as follows: Name Base Type Additional Info Primary Key OrderId String NONE YES Product InfoTable: DataShape: ProductDataShape NONE NO ProductionStage String NONE NO OrderDate DATETIME NONE NO DueDate DATETIME NONE NO   Unless you’ve decided to change things around, your DataShape fields should match the below configurations.     Step 9: SystemConnections Implementation   We have created the ThingTemplates and ThingShapes that will be utilized within our Data Model for creating instances (Things). Before we finish the build out of our Data Model, let's create the Services that will be necessary for the MaintenanceSystem and ProductionOrderSystem Things.    This guide will not cover the JavaScript and business logic aspect of creating an application. When complete with the below sections, see the Summary page for how to create that level of definition.       Maintenance System   This is the system managed by the maintenance user and geared towards their needs.   Properties   The properties for the MaintenanceSystem Thing are as follows:     Name Base Type Aspects Data Change Type  MaintEngineerCredentials  PASSWORD  Persistent  VALUE    Your properties should match the below configurations.         Services    The Services for the MaintenanceSystem Thing are as follows:    Service Name  Parameters  Output Base Type Purpose   GetAllMaintenanceRequests  NONE  InfoTable: MaintenanceRequest  Get all of the maintenance requests filed for the maintenance user.  GetFilteredMaintenanceRequests  String: TitleFilter  InfoTable: MaintenanceRequest  Get a filtered version of all maintenance requests filed for the maintenance user.  UpdateMaintenanceRequests  String: RequestTitle  NOTHING  Update a maintenance request already in the system.    Use the same method for creating Services that were provided in earlier sections. Your Services list should match the below configurations.     Production Order System   This is the system utilized by the operator and product manager users and geared towards their needs.   
Services   The Services for the ProductionOrderSystem Thing are as follows:      Service Name  Parameters Output Base Type   AssignProductionOrders String: Operator, String: ProductOrder  NOTHING   CreateProductionOrders  String: OrderNumber, String: Product, DATETIME: DueDate  NOTHING  DeleteProductionOrders  String: ProductOrder  NOTHING  GetFilteredProductionOrders  String: ProductOrder  InfoTable: ProductionOrder  GetProductionLineList  NONE  InfoTable: ProductDataShape  GetUnfilteredProductionOrders  NONE  InfoTable: ProductionOrder  MarkSelfOperator  NONE  BOOLEAN  UpdateProductionOrdersOP  String: ProductOrder, String: UpdatedInformation  NOTHING  UpdateProductionOrdersPM  String: ProductOrder, String: UpdatedInformation  NOTHING   Use the same method for creating Services that were provided in earlier sections. Your Services list should match the below configurations.       Challenge Yourself     Complete the implementation of the Data Model shown below by creating the Thing instances of the ThingTemplates we have created. When finish, add more to the Data Model. Some ideas are below.         Ideas for what can be added to this Data Model: #  Idea  1 Add users and permissions   2  Add Mashups to view maintenance requests, products, and production orders  3  Add business logic to the Data Model   Step 10: Next Steps     Congratulations! You've successfully completed the Data Model Implementation Guide. This guide has given you the basic tools to: Create Things, Thing Templates, and Thing Shapes Add Events and Subscriptions   The next guide in the Design and Implement Data Models to Enable Predictive Analytics learning path is Create Custom Business Logic.  
View full tip
Data Model Implementation Guide Part 2   Step 4: SystemConnector Thing Template   After grouping our second set of common functionality and information, we came up with the list below for the second Thing Template to create, SystemConnector with 3 Properties. The breakdown for the SystemConnector Thing Template is as follows:   Follow the below instruction to create this Entity and get the implementation phase of your development cycle going.   System Connector Properties   Let's jump right in. In the ThingWorx Composer, click the + New at the top of the screen.        2. Select Thing Template in the dropdown. 3. In the name field, enter SystemConnector and select a Project (ie, PTCDefaultProject). 4. For the Base Thing Template field, select GenericThing. 5. Click Save. 6. Switch to the Properties and Alerts tab. 7. Click the plus button to add a new Property.   The Properties for the SystemConnector Thing Template are as follows: Name Base Type Aspects Data Change Type EndPointConfig String Persistent and Logged VALUE OperatorCredentials PASSWORD Persistent VALUE ProdManagerCredentials PASSWORD Persistent VALUE Follow the next steps for all the Properties shown in our template property table. Click Add. Enter the name of the property (ie, EndPointConfig). Select the Base Type of the proprty from the dropdown. Check the checkboxes for the property Aspects. Select the Data Change Type from the dropdown.   Click Done when finished creating the property. Your Properties should match the below configurations.            Step 5: HazardousAsset Thing Template     After another round of prioritizing and grouping common functionality and information, we came up with the third Thing Template to create, HazardousAsset. It is a child of the LineAsset Thing Template with one added Service. The breakdown for the HazardousAsset Thing Template is as follows:   Hazardous Asset Service   In the ThingWorx Composer, click the + New at the top of the screen. 2. Select Thing Template in the dropdown. 3. For the Base Thing Template field, select LineAsset and select a Project (PTCDefaultProject). 4. In the name field, enter HazardousAsset. 5.  Click Save then edit to store all changes now. 6.  Switch to the Services tab. 7.  Click Add. 8.  Enter EmergencyShutdown as the name of the service. 9. Switch to the Me/Entities tab. 10. Expand Properties. 11. Click the arrow next to the State property. 12. Modify the generated code to match the following:       me.State = "Danger!! Emergency Shutdown";       Your first Service is complete! 13. Click Done. 14. Click Save to save your changes. Your Service should match the below configurations.     Step 6: InventoryManager Thing Shape   This time around, we will create our first ThingShape, InventoryManager with 1 Property. The breakdown for the InventoryManager Thing Shape is as follows:   Follow the below instruction to create this Entity and get the implementation phase of your development cycle going. System Connector Properties The properties for the InventoryManager Thing Shape are as follows: Name Base Type Aspects Data Change Type ProductCount INTEGER Min Value:0 Persistent and Logged ALWAYS In the ThingWorx Composer, click the + New at the top of the screen. Select Thing Shape in the dropdown. In the name field, enter InventoryManager and select a Project (ie, PTCDefaultProject).       4. Click Save then Edit to store all changes now.         5. Switch to the Properties tab.        6. Click Add.       7. Enter ProductCount as the name of the property.       8. 
Select the Base Type of the proprty from the dropdown (ie, INTEGER).       9. Check the checkboxes for the property Aspects.      10. Select the Data Change Type from the dropdown.            11. Click Done when finished creating the property. Your Properties should match the below configurations.   Add Thing Shape to Template   We can see that there is some overlap in the components of our HazardousAsset and LineAsset ThingTemplates. In particular, both want information about the product count. Because HazardousAsset inherits from LineAsset, would only need to change LineAsset. Follow the steps below to perform this change: Open the LineAsset Thing Template. In the Implemented Shapes field, enter and select InventoryManager. Save changes.         Click here to view Part 3 of this guide.   
View full tip
This script dumps all the alarms for a model or asset to JSON. Parameters (one or the other must be provided): modelName - (OPTIONAL) String - name of the model assetId - (OPTIONAL) String - id of the asset import com.axeda.common.sdk.id.Identifier import com.axeda.drm.sdk.Context import com.axeda.drm.sdk.audit.AuditCategory import com.axeda.drm.sdk.audit.AuditMessage import com.axeda.drm.sdk.scripto.Request import groovy.json.* import net.sf.json.JSONObject import java.net.URLDecoder import static com.axeda.sdk.v2.dsl.Bridges.* import com.axeda.services.v2.CustomObjectCriteria import com.axeda.services.v2.CustomObjectType import com.axeda.services.v2.CustomObject import com.axeda.services.v2.ExecutionResult import com.axeda.services.v2.ExtendedMap import com.axeda.drm.sdk.device.Model import com.axeda.drm.sdk.device.ModelFinder import com.axeda.drm.sdk.device.DeviceFinder import com.axeda.drm.sdk.device.Device import com.axeda.services.v2.ModelCriteria import com.axeda.services.v2.ModelType import com.axeda.services.v2.FindModelResult import com.axeda.services.v2.AssetCriteria import com.axeda.services.v2.FindAssetResult import com.axeda.services.v2.AlarmCriteria import com.axeda.sdk.v2.bridge.MobileLocationBridge import com.axeda.drm.sdk.mobilelocation.MobileLocationFinder import com.axeda.drm.sdk.mobilelocation.CurrentMobileLocationFinder import com.axeda.drm.sdk.mobilelocation.MobileLocation import com.axeda.drm.sdk.data.AlarmState import com.axeda.drm.sdk.data.AlarmFinder import com.axeda.drm.sdk.data.Alarm import com.axeda.platform.sdk.v1.services.ServiceFactory import com.axeda.drm.sdk.data.CurrentDataFinder import com.axeda.drm.sdk.device.DataItem import com.axeda.drm.sdk.data.HistoricalDataFinder import com.axeda.drm.sdk.data.DataValue import com.axeda.drm.sdk.data.DataValueList import com.axeda.platform.sdk.v1.services.extobject.ExtendedObjectSearchCriteria import com.axeda.common.date.DateRange import com.axeda.common.date.ExplicitDateRange /** * GetModel_Or_Asset_Alarms.groovy * ----------------------- * * Returns assets with organizations, alarms, and current mobile location. 
* * @params * modelName (OPTIONAL) Str - the name of the model to retrieve assets * assetId (OPTIONAL) Long - the id of the asset - one of the two is REQUIRED * * * @author sara streeter <sstreeter@axeda.com> * */ /** * initialize our global variables * json = the contents of our response * infoString = a stringBuilder used to collect debug information during the script * contentType = the content type we will return * scriptname = The name of this Script, used in multiple places */ def json = new groovy.json.JsonBuilder() def infoString = new StringBuilder() def contentType = "application/json" def scriptName = "GetModel_Or_Asset_Alarms.groovy" def root = [:] def timings = [:] timings.dataItemList = 0 timings.currentdata = 0 timings.histdata = 0 timings.wholescript = 0 timings.alarms = 0 timings.loop = 0 timings.filter = 0 timings.devices = 0 timings.geocode = 0 wholestart = System.currentTimeMillis() final def Context CONTEXT = Context.getSDKContext() def deviceList List<Device> devices try {     /* BUSINESS LOGIC GOES HERE */       def modelName = Request.parameters.modelName     def assetId     def alarms     AlarmFinder alarmFinder = new AlarmFinder(CONTEXT)       if (Request.parameters.assetId != null && Request.parameters.assetId != ""){         assetId = Request.parameters.assetId         DeviceFinder deviceFinder = new DeviceFinder(CONTEXT, new Identifier(assetId as Long));         def device = deviceFinder.find()         if (device){             alarmFinder.setDevice(device)             modelName = device.model.name         }     }     else if (modelName){               try{         modelName = new URLDecoder().decode(modelName)         }         catch(e){ logger.info(e.localizedMessage) }         if (modelName != null && modelName !=""){             ModelFinder modelFinder = new ModelFinder(CONTEXT)             modelFinder.setName(modelName)             Model model = modelFinder.find()                      if (model){                 modelName = model?.name                 alarmFinder.setModel(model)             }         }      }       alarms = alarmFinder.findAll()     // build the json from the models          root = [              "result": [              "model": modelName,              "assetId": assetId,              "alarms":alarms?.inject([]){ aList, alarm ->                    aList << [                         "deviceId": alarm.device?.id?.value,                        "deviceName": alarm.device.name,                        "deviceSerial": alarm.device.serialNumber,                         "name": alarm.name,                         "id": alarm.id.value,                         "state": alarm.state.name,                         "description": alarm.description,                         "severity": alarm.severity,                         "timestamp": alarm.date.time                    ]                                      aList               }             ]          ]     /* BUSINESS LOGIC ENDS HERE */ } catch (Exception e) {     def errorCode = "123456"     processException(scriptName,json,e,errorCode) } finally {     timings.wholescript = System.currentTimeMillis() - wholestart     root += [params: Request.parameters]     root += [timings: timings] } return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(root).toString(2)] /* * * ACTIVE CODE ENDS HERE * */ //---------------------------------------------------------------// /* * * HELPER METHODS START BELOW * */ /** * Wrap-up the response in our standard return map * @param contentType The global 
contentType variable * @param response The contents of the response (String ONLY) */ private def createReturnMap(String contentType, String response) {     return ["Content-Type": contentType,"Content":response] } /*     Processes the contents of an Exception and add it to the Errors collection     @param json The markup builder */ private def processException(String scriptName, JsonBuilder json, Exception e, String code) {     // catch the exception output     def logStringWriter = new StringWriter()     e.printStackTrace(new PrintWriter(logStringWriter))     logger.error("Exception occurred in ${scriptName}: ${logStringWriter.toString()}")     /*         Construct the error response         - errorCode Will be an element from an agreed upon enum         - errorMessage The text of the exception      */     json.errors  {         error {             errorCode   "${code}"             message     "[${scriptName}]: " + e.getMessage()             timestamp   "${System.currentTimeMillis()}"         }     }     return json } /*     Log a message. This will log a message and add it to info String     @param logger The injected logger     @param scriptName The name of the script being executed     @param info The infoString to append to     @param message The actual message to log */ private def logMessage(def logger, String scriptName, StringBuilder info, String message) {     logger.info(message)     info.append(message+"\n") } /*     Audit a message. This will store a message in the Audit log, based on the supplied category.     @param category The category for this audit message. One of: "scripting", "network", "device" or "data". Anything not recognized will be treated as "data".     @param message The actual message to audit     @param assetId If supplied, will associate the audit message with the asset at this ID */ private def auditMessage(String category, String message, String assetId) {     AuditCategory auditCategory = null     switch (category) {         case "scripting":             auditCategory = AuditCategory.SCRIPTING;             break;         case "network":             auditCategory = AuditCategory.NETWORK;             break;         case "device":             auditCategory = AuditCategory.DEVICE_COMMUNICATION;             break;         default:             auditCategory = AuditCategory.DATA_MANAGEMENT;             break;     }     if (assetId == null) {         new AuditMessage(Context.create(),"com.axeda.drm.rules.functions.AuditLogAction",auditCategory,[message]).store()     } else {         new AuditMessage(Context.create(),"com.axeda.drm.rules.functions.AuditLogAction",auditCategory,[message],new Identifier(Long.valueOf(assetId))).store()     } } def findOrCreateExtendedMap(String name){        // should take a name of Extended Map and output an object of type Extended Map, if it outputs null we throw an Exception        def outcome = [:]        outcome.extendedMap        ExtendedMap extendedMap = extendedMapBridge.find(name)        if (!extendedMap){             extendedMap = new ExtendedMap(name: name)            extendedMapBridge.create(extendedMap)        }        if (extendedMap) {         ExecutionResult result = new ExecutionResult()         result.setSuccessful(true)         result.setTotalCount(1)         outcome.result = result         outcome.extendedMap = extendedMap        }        else {            ExecutionResult result = new ExecutionResult()            result.setSuccessful(false)            result.setTotalCount(1)            outcome.result = result      
  }         return outcome    }    def retrieveModels(){       // retrieves the list populated by a separate script        def outcome = [:]        outcome.modelList        ModelCriteria modelCriteria = new ModelCriteria()        modelCriteria.type = ModelType.STANDALONE        FindModelResult modelResult = modelBridge.find(modelCriteria)        if (modelResult.models.size() > 0){         ExecutionResult result = new ExecutionResult()         result.setSuccessful(true)         result.setTotalCount(1)         outcome.result = result         outcome.modelList = modelResult.models        }        else {            ExecutionResult result = new ExecutionResult()            result.setSuccessful(false)            result.setTotalCount(1)            outcome.result = result        }         return outcome    }    def returnModelsWithAssets(List<com.axeda.services.v2.Model> modelList){        def outcome = [:]        outcome.modelList        outcome.message        if (!modelList || modelList?.size() ==0){            ExecutionResult result = new ExecutionResult()           result.setSuccessful(false)           result.setTotalCount(1)           outcome.result = result           outcome.message = "returnModelsWithAssets: Model list was not supplied or was of size zero."           return outcome        }        DeviceFinder deviceFinder = new DeviceFinder(CONTEXT)        ModelFinder modelFinder = new ModelFinder(CONTEXT)        List<com.axeda.drm.sdk.device.Model> sortedList = modelList.inject([]){ target, amodel ->             modelFinder.setName(amodel.modelNumber)            com.axeda.drm.sdk.device.Model bmodel = modelFinder.find()            deviceFinder.setModel(bmodel)            def numAssets = deviceFinder.findAll().size()            if (numAssets > 0 ){                   target << bmodel             }             target        }.sort{ amodel, bmodel ->  amodel.name <=> bmodel.name}        if (sortedList.size() > 0){         ExecutionResult result = new ExecutionResult()         result.setSuccessful(true)         result.setTotalCount(1)         outcome.result = result         outcome.modelList = sortedList        }        else {           ExecutionResult result = new ExecutionResult()           result.setSuccessful(false)           result.setTotalCount(1)           outcome.result = result       }         return outcome    }     def addMapEntry(String mapName, String key, String value){        def outcome = [:]         outcome.key         outcome.value         ExecutionResult result = extendedMapBridge.append(mapName, key, value)         outcome.result = result         if (result.successful){             outcome.key = key             outcome.value = value         }         return outcome    }
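Because this script reads its inputs from Request.parameters, it is typically invoked over HTTP through Scripto. As a hedged sketch (the host, credentials, and asset id are placeholders, and the path assumes the standard /services/v1/rest/Scripto/execute/<scriptName> execution endpoint), it could be called from another Groovy script like this, reusing the timeout recommendations from earlier in this board:

import groovyx.net.http.HTTPBuilder
import static groovyx.net.http.ContentType.*
import static groovyx.net.http.Method.*

def builder = new HTTPBuilder('https://yourplatform.example.com')
//Apply the same Connection and Socket Timeouts recommended in the timeouts tip above
builder.getClient().getParams().setParameter("http.connection.timeout", new Integer(10 * 1000))
builder.getClient().getParams().setParameter("http.socket.timeout", new Integer(30 * 1000))
try {
    builder.request(GET, TEXT) {
        uri.path  = '/services/v1/rest/Scripto/execute/GetModel_Or_Asset_Alarms'
        uri.query = [username: 'yourUser', password: 'yourPassword', assetId: '12345']
        response.success = { resp, reader ->
            logger.info reader.text
        }
    }
} finally {
    builder.shutdown()
}
return true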
View full tip
This script creates a csv file from the audit log filtered by the User Access category, so dates of when users logged in or logged out. *** see update below *** Note:  The csv file has the same name as the Groovy script and does NOT have the .csv extension . To get the .csv extension, the Groovy script has to be renamed to AuditEntryToCSV.csv.groovy .  Suggestions on how to improve this are welcome. *** Update ***: The download works without the renamed groovy script by returning text instead of an input stream.  The script has been modified to illustrate this. Parameters: days - the number of days past to fetch audit logs model_name - the model name of the asset serial_number - the serial number of the asset import com.axeda.drm.sdk.device.ModelFinder import com.axeda.drm.sdk.Context import com.axeda.common.sdk.id.Identifier import com.axeda.drm.sdk.device.Model import com.axeda.drm.sdk.device.DeviceFinder import com.axeda.drm.sdk.device.Device import com.axeda.drm.sdk.audit.AuditCategoryList import com.axeda.drm.sdk.audit.AuditCategory import com.axeda.drm.sdk.audit.AuditEntryFinder import com.axeda.drm.sdk.audit.SortType import com.axeda.drm.sdk.audit.AuditEntry import groovy.xml.MarkupBuilder import com.axeda.platform.sdk.v1.services.ServiceFactory /* * AuditEntryToCSV.groovy * * Creates a csv file from the audit log filtered by the User Access category, so dates of when users logged in or logged out. * * @param days        -   (REQ):Str number of days to search. * @param model_name        -   (REQ):Str name of the model. * @param serial_number        -   (REQ):Str serial number of the device. * * @note - the csv file has the same name as the Groovy script and does NOT have the .csv extension . To get * the .csv extension, the Groovy script has to be renamed to AuditEntryToCSV.csv.groovy . * * @author Sara Streeter <sstreeter@axeda.com> */ def writer = new StringWriter() def xml = new MarkupBuilder(writer) try {    def ctx = Context.getUserContext()    ModelFinder modelFinder = new ModelFinder(ctx)    modelFinder.setName(parameters.model_name)    Model model = modelFinder.find()    DeviceFinder deviceFinder = new DeviceFinder(ctx)    deviceFinder.setSerialNumber(parameters.serial_number)    Device device = deviceFinder.find()    AuditCategoryList acl = new AuditCategoryList()    acl.add(AuditCategory.USER_ACCESS)    long now = System.currentTimeMillis()    Date today = new Date(now)    def paramdays = parameters.days ? 
parameters.days: 5    long days = 1000 * 60 * 60 * 24 * Integer.valueOf(paramdays)    AuditEntryFinder aef = new AuditEntryFinder(ctx)    aef.setCategories(acl)    aef.setToDate(today)    aef.setFromDate(new Date(now - (days)))    aef.setSortType(SortType.DATE)    aef.sortDescending()    List<AuditEntry> audits = aef.findAll() // use a Data Accumulator to store the information def dataStoreIdentifier = "FILE-CSV-audit_log" def daSvc = new ServiceFactory().dataAccumulatorService if (daSvc.doesAccumulationExist(dataStoreIdentifier, device.id.value)) {     daSvc.deleteAccumulation(dataStoreIdentifier, device.id.value) } // assemble the response    audits.each { AuditEntry audit ->            def row = [                audit?.id.value,                audit?.user?.username,                audit?.date,                audit?.category?.bundleKey,                audit?.message            ]         row = row.join(',')         row += '\n'         daSvc.writeChunk(dataStoreIdentifier, device.id.value, row);        } // stream the data accumulator to create the file    InputStream is = daSvc.streamAccumulation(dataStoreIdentifier, device.id.value) return ['Content-Type': 'text/csv', 'Content-Disposition':'attachment; filename=AuditEntryCSVFile.csv', 'Content': is.text] } catch (def ex) {    xml.Response() {        Fault {            Code('Groovy Exception')            Message(ex.getMessage())            StringWriter sw = new StringWriter();            PrintWriter pw = new PrintWriter(sw);            ex.printStackTrace(pw);            Detail(sw.toString())        }    } logger.info(writer.toString()) }
View full tip
When an Expression Rule of type MobileLocation calls a Groovy script, the script is provided with the implicit object mobileLocation. This example shows how the mobileLocation object can be used.

This Expression Rule calls the Groovy script 'getAddress' to retrieve the location and translate it into a street address:

Type:   MobileLocation
IF:     some condition, e.g. true
THEN:   SetDataItem("location", str(ExecuteCustomObject("getAddress")))

The 'getAddress' script uses the mobileLocation object to retrieve the asset's reported location, and then calls a REST service to translate the given latitude and longitude to a street address. The street address is returned.

import groovyx.net.http.RESTClient

String rmdHostname = "http://ws.geonames.org";

if (mobileLocation != null) {
  rmd = new RESTClient(rmdHostname);
  try {
    def resp = rmd.get( path: 'findNearestAddress',
                        query: [lat: mobileLocation.lat, lng: mobileLocation.lng] )
    streetnum = resp.data.address.streetNumber.text()
    street = resp.data.address.street.text()
    town = resp.data.address.adminName2.text()
    state = resp.data.address.adminCode1.text()
    postalcode = resp.data.address.postalcode.text()
    return streetnum + " " + street + " " + town + " " + state + " " + postalcode
  } catch (groovyx.net.http.HttpResponseException e) {
    e.printStackTrace();
  }
}
View full tip
This Groovy script returns a list of users from a given UserGroup, with filtering by username.

import com.axeda.drm.sdk.Context
import groovyx.net.http.HTTPBuilder
import static groovyx.net.http.ContentType.*
import static groovyx.net.http.Method.*
import net.sf.json.JSONObject
import groovy.json.*
import com.axeda.drm.sdk.data.*
import com.axeda.drm.sdk.device.*
import com.axeda.drm.sdk.user.UserFinder
import com.axeda.drm.sdk.user.User
import com.axeda.drm.sdk.user.UserGroupFinder

//--------------------------------------------------------------------------------------------------------------------
// Example of getting Users from a User Group and filtering by username
//--------------------------------------------------------------------------------------------------------------------

def response = [:]
def result = []

try {
    final def CONTEXT = Context.create(parameters.username)

    UserFinder uFinder = new UserFinder(CONTEXT)
    UserGroupFinder ugFinder = new UserGroupFinder(CONTEXT)

    List userGroups = getUserGroupsList(ugFinder, "*Demo*")

    List SmithsInDemoGroup = userGroups.collect{ usergroup ->
        usergroup.getUsers().findResults{ user ->
            if (user.username =~ /Smith/){
                user
            }
        }
    }.flatten()

    SmithsInDemoGroup.each{ u ->
        result << u.fullName
    }

    response = [
        result: [
            items: result
        ]
    ]
} catch (Exception e) {
    def m = ""
    e.message.each { ex -> m += ex }
    response = [
        faultcode: 'Groovy Exception',
        faultstring: m
    ];
}

return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(response).toString(2)];

def getUserGroupsList(UserGroupFinder ugFinder, String name){
    ugFinder.setName(StringQuery.like(name))
    def userGroup = ugFinder.findOne()
    List userGroups = new ArrayList();
    userGroups.add(userGroup);
    return userGroups
}
View full tip
When an Expression Rule of type Data calls a Groovy script, the script is provided with the implicit object dataItems. This example shows how the dataItems object can be used to get data item information (value, name, type and update time).

import com.axeda.drm.sdk.data.*
import com.axeda.drm.sdk.device.DataItem

try {
    def deviceName = context.device.name
    // implicit object dataItems passes a list of dataItem objects
    def dataItemsList = dataItems
    for(dio in dataItemsList) {
        logger.info("Checking " + dio.name + " Value: " + dio.value)
        if(dio.name == "updateTime") {
            // perceptType = analog, digital or string
            logger.info("Found: " + dio.name + " Value: " + dio.value + " Type: " + dio.perceptType + " Last Updated: " + new Date(dio.timeInMillis))
        }
    }
} catch (Exception e) {
    logger.error e.message
}
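For context, a script like this is wired up through an Expression Rule of type Data, following the same pattern shown in the MobileLocation example above ('checkDataItems' is a hypothetical name for the Custom Object containing this script):

Type:   Data
IF:     true
THEN:   ExecuteCustomObject("checkDataItems")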
View full tip
Applicable Releases: ThingWorx Platform 7.0 to 8.5

Description: Main concepts and best practices for a DevOps methodology, such as:

Naming conventions
Setup and management of environments for development and testing
Import/Export process and application deployment
Use of Tags and Projects to control your development
Coding standards
Validation best practices

For project packaging and deployment, also make sure to check the content about Solution Central, which was created after this session was released.
View full tip
import com.axeda.drm.sdk.Context
import com.axeda.drm.sdk.device.ModelFinder
import com.axeda.drm.sdk.device.Model
import com.axeda.drm.sdk.device.DeviceFinder
import com.axeda.drm.sdk.data.CurrentDataFinder
import com.axeda.drm.sdk.device.Device
import com.axeda.drm.sdk.data.HistoricalDataFinder
import net.sf.json.JSONObject

/*
 * DataItemEachDevice.groovy
 *
 * Find the current data item and historical data items for all assets in a given model.
 *
 * @param model_name        -   (REQ):Str name of the model.
 * @param data_item_name    -   (REQ):Str name of the data item to query on.
 * @param from_time         -   (REQ):Long millisecond timestamp to begin query from.
 * @param to_time           -   (REQ):Long millisecond timestamp to end query at.
 *
 * @note from_time and to_time should be provided because they limit the query size.
 *
 * @author Sara Streeter <sstreeter@axeda.com>
 */

def response = [:]
// measure the script run time
def timeProfiles = [:]
def scriptStartTime = new Date()

try {
    // getUserContext is supported as of release 6.1.5 and higher
    final def CONTEXT = Context.getUserContext()

    // confirm that required parameters have been provided
    validateParameters(actual: parameters, expected: ["model_name", "data_item_name", "from_time", "to_time"])

    // find the model
    def modelFinder = new ModelFinder(CONTEXT)
    modelFinder.setName(parameters.model_name)
    Model model = modelFinder.findOne()

    // throw exception if no model found
    if (!model) {
        throw new Exception("No model found for ${parameters.model_name}.")
    }

    // find all assets of that model
    def assetFinder = new DeviceFinder(CONTEXT)
    assetFinder.setModel(model)
    def assets = assetFinder.findAll()

    // find the current and historical data values for each asset
    // note: a device will be set on the data finders below, so a placeholder device is passed at instantiation; it is never stored
    def currentDataFinder = new CurrentDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))
    def historicalDataFinder = new HistoricalDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))
    historicalDataFinder.startDate = new Date(parameters.from_time as Long)
    historicalDataFinder.endDate = new Date(parameters.to_time as Long)

    // assemble the response
    assets = assets.collect { Device asset ->
        currentDataFinder.device = asset
        def currentValue = currentDataFinder.find(parameters.data_item_name)
        historicalDataFinder.device = asset
        def valueList = historicalDataFinder.find(currentValue?.dataItem)
        [
                id: asset.id.value,
                name: asset.name,
                serialNumber: asset.serialNumber,
                model: [id: asset.model.id.value, name: asset.model.name],
                current_data: currentValue?.asString(),
                historical_data: valueList.collect { [timestamp: it.getTimestamp().format("yyyyMMdd HH:mm"), value: it.asString()] }
        ]
    }
    response = [result: [items: assets]]
} catch (def ex) {
    logger.error ex
    response += [
            error: [
                    type: "Backend Application Error", msg: ex.getLocalizedMessage()
            ]
    ]
} finally {
    // create and output the running time profile
    timeProfiles << createTimeProfile("DataItemEachDevice", scriptStartTime, new Date())
    response += [params: parameters, meta: [:], timeProfiles: timeProfiles]
}

return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(response).toString(2)]

private Map createTimeProfile(String label, Date startTime, Date endTime) {
    [
            (label): [
                    startTime: [timestamp: startTime.time, readable: startTime.toString()],
                    endTime: [timestamp: endTime.time, readable: endTime.toString()],
                    profile: [
                            elapsed_millis: endTime.time - startTime.time,
                            elapsed_secs: (endTime.time - startTime.time) / 1000
                    ]
            ]
    ]
}

private validateParameters(Map args) {
    if (!args.containsKey("actual")) {
        throw new Exception("validateParameters(args) requires 'actual' key.")
    }
    if (!args.containsKey("expected")) {
        throw new Exception("validateParameters(args) requires 'expected' key.")
    }
    def config = [
            require_username: false
    ]
    Map actualParameters = args.actual.clone() as Map
    List expectedParameters = args.expected
    config.each { key, value ->
        if (args.options?.containsKey(key)) {
            config[key] = args.options[key]
        }
    }
    if (!config.require_username) { actualParameters.remove("username") }
    expectedParameters.each { paramName ->
        if (!actualParameters.containsKey(paramName) || !actualParameters[paramName]) {
            throw new IllegalArgumentException(
                    "Parameter '${paramName}' was not found in the query; '${paramName}' is a required parameter.")
        }
    }
}

Sample Output:

{
  "result": {
    "items": [{
      "id": 4240,
      "name": "ASVM_9",
      "serialNumber": "ASVM_9",
      "model": {
        "id": 1535,
        "name": "SimVM4"
      },
      "current_data": "142.0",
      "historical_data": [{
        "timestamp": "20120331 17:00", "value": "142.0"
      }, {
        "timestamp": "20120331 16:59", "value": "143.0"
      }, {
        "timestamp": "20120331 16:59", "value": "144.0"
      }, {
        "timestamp": "20120331 16:58", "value": "145.0"
      }, {
        "timestamp": "20120331 16:58", "value": "146.0"
      }, {
        "timestamp": "20120331 16:57", "value": "147.0"
      }, {
        "timestamp": "20120331 16:57", "value": "148.0"
      }, {
        "timestamp": "20120330 19:30", "value": "0.0"
      }]
    }, {
      "id": 4246,
      "name": "ASVM_12",
      "serialNumber": "ASVM_12",
      "model": {
        "id": 1535,
        "name": "SimVM4"
      },
      "current_data": "138.0",
      "historical_data": [{
        "timestamp": "20120331 17:00", "value": "138.0"
      }, {
        "timestamp": "20120331 17:00", "value": "139.0"
      }, {
        "timestamp": "20120331 16:59", "value": "140.0"
      }, {
        "timestamp": "20120331 16:59", "value": "141.0"
      }, {
        "timestamp": "20120331 16:59", "value": "142.0"
      }, {
        "timestamp": "20120331 16:59", "value": "143.0"
      }, {
        "timestamp": "20120330 19:32", "value": "0.0"
      }]
      //
      // MORE ASSETS HERE
      //
    }]
  },
  "params": {
    "username": "sstreeter",
    "from_time": "1332272219000",
    "data_item_name": "CurrentStock",
    "sessionid": "JOQ5I7ofRXYA-RnA37Vk93bRUH718yoFF5 9p0JbCnfyoHolFprf",
    "model_name": "SimVM4",
    "to_time": "1335469008000"
  },
  "meta": {},
  "timeProfiles": {
    "DataItemEachDevice": {
      "startTime": {
        "timestamp": 1335469168725,
        "readable": "Thu Apr 26 19:39:28 GMT 2012"
      },
      "endTime": {
        "timestamp": 1335469180569,
        "readable": "Thu Apr 26 19:39:40 GMT 2012"
      },
      "profile": {
        "elapsed_millis": 11844,
        "elapsed_secs": 11.844
      }
    }
  }
}
Email an attachment using bytes from a FileInfo.

Parameters:
fileId - the identifier of a FileInfo that has been previously uploaded to the FileStore
filename - the name of the attachment
toaddress - the email address to send to
fromaddress - the email address to send from

import com.axeda.drm.util.Emailer
import com.axeda.drm.sdk.contact.Email
import javax.mail.internet.AddressException
import javax.mail.internet.InternetAddress
import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.services.v2.FileInfoCriteria
import org.apache.commons.io.IOUtils
import java.security.MessageDigest

try {
    String fromaddress = parameters.fromaddress
    String toaddress = parameters.toaddress
    def fileId = parameters.fileId
    def filename = parameters.filename
    String subject = "Axeda Test Attachment"
    String body = "<html><head/><body><p style='background:blue;'>This email has an attachment and a blue background.</p></body></html>"

    // pull the bytes out of the FileStore and write them to a local file so it can be attached
    def thefile = new File(filename)
    def inputStream = fileInfoBridge.getFileData(fileId)
    byte[] bytes = IOUtils.toByteArray(inputStream)
    thefile.setBytes(bytes)

    // build a multipart content type with a generated boundary
    def random_hash = md5('r')
    def contentType = "multipart/mixed; boundary=--\"$random_hash\"\r\n"
    def htmlType = "text/html"

    sendEmail(fromaddress, toaddress, subject, body, contentType, thefile, false, htmlType)
} catch (Exception e) {
    logger.error(e.localizedMessage)
}

return true

def md5(String s) {
    MessageDigest digest = MessageDigest.getInstance("MD5")
    digest.update(s.bytes)
    new BigInteger(1, digest.digest()).toString(16).padLeft(32, '0')
}

public void sendEmail(String fromAddress, String toAddress, String subject, String body, String encoding, File file, boolean compress, String mimeType) {
    try {
        Emailer.getInstance().send([new InternetAddress(toAddress)], new InternetAddress(fromAddress), subject, body, encoding, [file] as File[], compress, mimeType)
    } catch (Exception ae) {
        logger.error(ae.localizedMessage)
    }
}
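If only a file name is known, the fileId could likely be resolved first. The sketch below is an assumption-heavy illustration, not verified API: it assumes FileInfoCriteria exposes a filename property and that fileInfoBridge.find(...) returns its matches in a fileInfos list, by analogy with the asset and expression-rule bridges used elsewhere on this page. Check the v2 SDK documentation for the exact criteria fields and result shape.

// Hypothetical lookup: resolve a fileId from a file name before calling the email script above.
// FileInfoCriteria.filename and the find result's fileInfos list are assumptions.
import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.services.v2.FileInfoCriteria

def criteria = new FileInfoCriteria()
criteria.filename = parameters.filename
def found = fileInfoBridge.find(criteria)
def fileId = found?.fileInfos ? found.fileInfos[0].id : null
if (!fileId) {
    logger.error("No FileInfo found for ${parameters.filename}")
}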
This Groovy script gets the weather forecast for a given latitude/longitude by calling an external web service. Use it in an Expression Rule like this:

If: something
Then: SetDataItem("precipitation", round(ExecuteCustomObject("GetPrecipitation", location)))

This sets the data item "precipitation" to the value returned by this script.

Parameter:
location - location (lat, lon)

import org.apache.commons.httpclient.methods.*
import org.apache.commons.httpclient.*
import java.text.SimpleDateFormat

def location = parameters.location.toString()
def locparts = location.split(',')
def lat = locparts[0]
def lon = locparts[1]

def hostname = "www.weather.gov"
def url = "/forecasts/xml/sample_products/browser_interface/ndfdXMLclient.php"
String ndfdElement = "pop12" // see http://www.weather.gov/forecasts/xml/docs/elementInputNames.php

// query a 12-hour window starting now
def perceptTimeFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss")
def cal = Calendar.getInstance()
Date startDate = cal.getTime()
cal.add(Calendar.HOUR, 12)
Date endDate = cal.getTime()

def client = new HttpClient()
// bound the external call with connection and socket timeouts so a slow or hung server
// cannot stall this Custom Object (values are illustrative)
client.httpConnectionManager.params.connectionTimeout = 10000
client.httpConnectionManager.params.soTimeout = 10000
HostConfiguration host = client.getHostConfiguration()
host.setHost(hostname, 80, "http")

GetMethod get = new GetMethod(url)
NameValuePair[] params = new NameValuePair[6]
params[0] = new NameValuePair("lat", lat)
params[1] = new NameValuePair("lon", lon)
params[2] = new NameValuePair("product", 'time-series')
params[3] = new NameValuePair("begin", perceptTimeFormat.format(startDate))
params[4] = new NameValuePair("end", perceptTimeFormat.format(endDate))
params[5] = new NameValuePair(ndfdElement, ndfdElement)
get.setQueryString(params)

client.executeMethod(host, get)
message = "Status:" + get.getStatusText()
content = get.getResponseBodyAsString()
get.releaseConnection()

// parse the result XML and collect the probability-of-precipitation readings
def dwml = new XmlSlurper().parseText(content)
readings = dwml.data.parameters."probability-of-precipitation".value.collect { Integer.parseInt(it.toString()) }
average = readings.sum() / readings.size()
// logger.info "Expected precipitation for $location is $readings"

// return the first reading; `average` is available if a mean over the window is preferred
return readings[0]
This code snippet shows how to add an existing Device to an existing DeviceGroup using a custom Groovy script executed through the Scripto web service. To call the script, create a URL of the following form:

http://<HOST>/services/v1/rest/Scripto/execute/addDeviceToDeviceGroup?us...

NOTE: Text in angled brackets (< >) indicates a variable.

Alternatively, the script can be called from an Expression Rule:

If: Registration.first
Then: ExecuteCustomObject("addDeviceToDeviceGroup", "<ASSET_ID>", "<GROUP_NAME>")

Note that when creating the Groovy script, its parameters must be defined in the same order as the arguments passed in the parameter list.

import net.sf.json.JSONObject
import com.axeda.drm.sdk.device.DeviceGroupFinder
import com.axeda.drm.sdk.device.DeviceGroup
import com.axeda.drm.sdk.Context
import com.axeda.common.sdk.id.Identifier
import com.axeda.drm.sdk.device.DeviceFinder

def response = [:], status

try {
    if (parameters.assetId == null) { throw new IllegalArgumentException("parameter 'assetId' was not provided.") }
    if (parameters.groupName == null) { throw new IllegalArgumentException("parameter 'groupName' was not provided.") }

    final def CONTEXT = Context.create(parameters.username)

    // find the device group by name
    def dgf = new DeviceGroupFinder(CONTEXT)
    dgf.setName(parameters.groupName)
    def group = dgf.find()
    if (group == null) {
        logger.error "could not retrieve group with name of '${parameters.groupName}'"
        throw new Exception("could not retrieve group with name of '${parameters.groupName}'")
    }

    // find the asset by id
    def df = new DeviceFinder(CONTEXT)
    df.setId(new Identifier(parameters.assetId))
    def device = df.find()
    if (device == null) {
        logger.error "could not retrieve asset with id of '${parameters.assetId}'"
        throw new Exception("could not retrieve asset with id of '${parameters.assetId}'")
    }

    group.addDevice(device)
    group.store()

    // re-fetch the group to confirm the device is now associated with it
    group = dgf.find()
    def devices = group.getDevices()
    status = devices.contains(device) ? "success" : "failure"

    // prepare the response
    response = [parameters: parameters, status: status]
} catch (def e) {
    logger.error e.getMessage()
    response = [faultcode: e.getCause(), faultstring: e.getMessage()]
}

return ['Content-Type': 'application/json', 'Content': JSONObject.fromObject(response).toString(2)]
This script finds an existing Expression Rule and applies it to an asset (via the rule's asset includes).

Parameters:
model - model name
serial - serial number
exprRuleName - name of the Expression Rule

import static com.axeda.sdk.v2.dsl.Bridges.*
import net.sf.json.JSONObject
import com.axeda.drm.sdk.scripto.Request
import com.axeda.services.v2.Asset
import com.axeda.services.v2.AssetReference
import com.axeda.services.v2.AssetCollection
import com.axeda.services.v2.AssetCriteria
import com.axeda.services.v2.ExpressionRule
import com.axeda.services.v2.ExpressionRuleCriteria

/*
 * ApplyExpRuleToAsset.groovy
 *
 * Finds an existing Expression Rule and includes an asset in it.
 *
 * @param model             -   (REQ):Str model of the asset.
 * @param serial            -   (REQ):Str serial number of the asset.
 * @param exprRuleName      -   (REQ):Str name of the Expression Rule.
 *
 * @author Sara Streeter <sstreeter@axeda.com>
 */

def response = [:]

try {
    // find the asset by model and serial number
    AssetCriteria assetCriteria = new AssetCriteria()
    assetCriteria.modelNumber = Request.parameters.model
    assetCriteria.serialNumber = Request.parameters.serial
    def findAssetResult = assetBridge.find(assetCriteria)
    def asset = findAssetResult.assets[0]

    // find the Expression Rule by name
    ExpressionRuleCriteria expressionRuleCriteria = new ExpressionRuleCriteria()
    expressionRuleCriteria.name = Request.parameters.exprRuleName
    def expressionRuleFindResult = expressionRuleBridge.find(expressionRuleCriteria)
    def expressionRule = expressionRuleFindResult.expressionRules[0]

    // include the asset in the rule and persist the change
    expressionRule.includedAssets.add(asset)
    expressionRuleBridge.update(expressionRule)

    response = [
            "expressionRule": expressionRule.name,
            "includedAsset": asset.serialNumber
    ]
} catch (Exception e) {
    response = [
            faultcode: 'Groovy Exception',
            faultstring: e.message
    ]
}

return ["Content-Type": "application/json", "Content": JSONObject.fromObject(response).toString(2)]
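The reverse operation can be sketched with the same bridge calls. This is an illustration only, assuming includedAssets behaves as a mutable list of Asset objects (as the add() call above implies); matching on serial number avoids relying on Asset object equality.

// Hypothetical counterpart: take an asset back out of an Expression Rule's include list.
// Assumes the same `serial` and `exprRuleName` parameters as the script above.
import static com.axeda.sdk.v2.dsl.Bridges.*
import com.axeda.drm.sdk.scripto.Request
import com.axeda.services.v2.ExpressionRuleCriteria

ExpressionRuleCriteria criteria = new ExpressionRuleCriteria()
criteria.name = Request.parameters.exprRuleName
def rule = expressionRuleBridge.find(criteria).expressionRules[0]

// drop any included asset whose serial number matches, then persist the change
rule.includedAssets.removeAll { it.serialNumber == Request.parameters.serial }
expressionRuleBridge.update(rule)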
This code snippet creates and then deletes a data item to illustrate basic CRUD technique.

Parameter: model_name

import com.axeda.drm.sdk.Context
import com.axeda.drm.sdk.device.ModelFinder
import com.axeda.drm.sdk.device.Model
import com.axeda.drm.sdk.device.DeviceFinder
import com.axeda.drm.sdk.data.CurrentDataFinder
import com.axeda.drm.sdk.device.Device
import com.axeda.drm.sdk.data.HistoricalDataFinder
import groovy.xml.MarkupBuilder
import com.axeda.drm.sdk.device.DataItem
import com.axeda.drm.services.device.DataItemType

/*
 * DeleteDataItem.groovy
 *
 * Create and then delete a data item.
 *
 * @param model_name        -   (REQ):Str name of the model.
 *
 * @author Sara Streeter <sstreeter@axeda.com>
 */

def response = [:]
def writer = new StringWriter()
def xml = new MarkupBuilder(writer)

try {
    // getUserContext is supported as of release 6.1.5 and higher
    final def CONTEXT = Context.getUserContext()

    // find the model
    def modelFinder = new ModelFinder(CONTEXT)
    modelFinder.setName(parameters.model_name)
    Model model = modelFinder.findOne()

    // throw exception if no model found
    if (!model) {
        throw new Exception("No model found for ${parameters.model_name}.")
    }

    // add a dummy data item (create)
    DataItem dataitem = new DataItem(CONTEXT, model, DataItemType.STRING, "MyDataItem")
    dataitem.store()

    // find the data items on the model and delete the one just created
    model.dataItems.each {
        logger.info(it.name)
        if (it.name == "MyDataItem") {
            it.delete()
        }
    }
} catch (def ex) {
    xml.Response() {
        Fault {
            Code('Groovy Exception')
            Message(ex.getMessage())
            StringWriter sw = new StringWriter()
            PrintWriter pw = new PrintWriter(sw)
            ex.printStackTrace(pw)
            Detail(sw.toString())
        }
    }
}

return ['Content-Type': 'text/xml', 'Content': writer.toString()]
For a recent project, I needed to find all of the children of a particular template type in a Network Hierarchy, so I put together a little script that I thought I'd share. Maybe it will be useful to others as well.

In my situation, this script lived on the Location template, which let me find all the Sensor Things under any particular node, no matter how deep they are.

For example, given a network like this:

Location 1
    Sensor 1
    Location 1A
        Sensor 2
        Sensor 3
        Location 1AA
            Sensor 4
    Location 1B
        Sensor 5

If you run this service on Location 1, you'll get an InfoTable with these Things: Sensor 1, Sensor 2, Sensor 3, Sensor 4, Sensor 5
From Location 1A: Sensor 2, Sensor 3, Sensor 4
From Location 1AA: Sensor 4
From Location 1B: Sensor 5

For this service, these are the inputs/outputs:
Inputs: none
Output: InfoTable of type NetworkConnection

// CreateInfoTableFromDataShape(infoTableName:STRING("InfoTable"), dataShapeName:STRING):INFOTABLE(AlertSummary)
let result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName : "InfoTable",
    dataShapeName : "NetworkConnection"
});

// since the hierarchy could contain locations or sensors, recursively walk down to collect all the sensors
function findChildrenSensors(thingName) {
    let childrenThings = Networks["Hierarchy_NW"].GetChildConnections({
        name: thingName /* STRING */
    });
    for each (var row in childrenThings.rows) {
        // row.to has the name of the child Thing
        if (Things[row.to].IsDerivedFromTemplate({thingTemplateName: "Location_TT"})) {
            findChildrenSensors(row.to);
        } else if (Things[row.to].IsDerivedFromTemplate({thingTemplateName: "Sensor_TT"})) {
            result.AddRow(row);
        }
    }
}

findChildrenSensors(me.name);
This script finds all the data items, both current and historical, on all the assets of a model and outputs them as XML.

Parameters:
model_name
from_time
to_time

import com.axeda.drm.sdk.Context
import com.axeda.drm.sdk.device.ModelFinder
import com.axeda.drm.sdk.device.Model
import com.axeda.drm.sdk.device.DeviceFinder
import com.axeda.drm.sdk.data.CurrentDataFinder
import com.axeda.drm.sdk.device.Device
import com.axeda.drm.sdk.data.HistoricalDataFinder
import groovy.xml.MarkupBuilder

/*
 * AllDataItems2XML.groovy
 *
 * Find all the historical and current data items for all assets in a given model.
 *
 * @param model_name        -   (REQ):Str name of the model.
 * @param from_time         -   (REQ):Long millisecond timestamp to begin query from.
 * @param to_time           -   (REQ):Long millisecond timestamp to end query at.
 *
 * @note from_time and to_time should be provided because they limit the query size.
 *
 * @author Sara Streeter <sstreeter@axeda.com>
 */

def response = [:]
def writer = new StringWriter()
def xml = new MarkupBuilder(writer)

// measure the script run time
def timeProfiles = [:]
def scriptStartTime = new Date()

try {
    // getUserContext is supported as of release 6.1.5 and higher
    final def CONTEXT = Context.getUserContext()

    // confirm that required parameters have been provided
    validateParameters(actual: parameters, expected: ["model_name", "from_time", "to_time"])

    // find the model
    def modelFinder = new ModelFinder(CONTEXT)
    modelFinder.setName(parameters.model_name)
    Model model = modelFinder.findOne()

    // throw exception if no model found
    if (!model) {
        throw new Exception("No model found for ${parameters.model_name}.")
    }

    // find all assets of that model
    def assetFinder = new DeviceFinder(CONTEXT)
    assetFinder.setModel(model)
    def assets = assetFinder.findAll()

    // find the current and historical data values for each asset
    // note: a device will be set on the data finders below, so a placeholder device is passed at instantiation; it is never stored
    def currentDataFinder = new CurrentDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))
    def historicalDataFinder = new HistoricalDataFinder(CONTEXT, new Device(CONTEXT, "placeholder", model))
    historicalDataFinder.startDate = new Date(parameters.from_time as Long)
    historicalDataFinder.endDate = new Date(parameters.to_time as Long)

    // assemble the response
    xml.Response() {
        assets.each { Device asset ->
            currentDataFinder.device = asset
            def currentValueList = currentDataFinder.find()
            historicalDataFinder.device = asset
            def valueList = historicalDataFinder.find()
            Asset() {
                id(asset.id.value)
                name(asset.name)
                serial_number(asset.serialNumber)
                model_id(asset.model.id.value)
                model_name(asset.model.name)
                current_data() {
                    currentValueList.each { data ->
                        timestamp(data?.getTimestamp()?.format("yyyyMMdd HH:mm"))
                        name(data?.dataItem?.name)
                        value(data?.asString())
                    }
                }
                historical_data() {
                    valueList.each { data ->
                        timestamp(data?.getTimestamp()?.format("yyyyMMdd HH:mm"))
                        name(data?.dataItem?.name)
                        value(data?.asString())
                    }
                }
            }
        }
    }
} catch (def ex) {
    xml.Response() {
        Fault {
            Code('Groovy Exception')
            Message(ex.getMessage())
            StringWriter sw = new StringWriter()
            PrintWriter pw = new PrintWriter(sw)
            ex.printStackTrace(pw)
            Detail(sw.toString())
        }
    }
}

return ['Content-Type': 'text/xml', 'Content': writer.toString()]

private Map createTimeProfile(String label, Date startTime, Date endTime) {
    [
            (label): [
                    startTime: [timestamp: startTime.time, readable: startTime.toString()],
                    endTime: [timestamp: endTime.time, readable: endTime.toString()],
                    profile: [
                            elapsed_millis: endTime.time - startTime.time,
                            elapsed_secs: (endTime.time - startTime.time) / 1000
                    ]
            ]
    ]
}

private validateParameters(Map args) {
    if (!args.containsKey("actual")) {
        throw new Exception("validateParameters(args) requires 'actual' key.")
    }
    if (!args.containsKey("expected")) {
        throw new Exception("validateParameters(args) requires 'expected' key.")
    }
    def config = [
            require_username: false
    ]
    Map actualParameters = args.actual.clone() as Map
    List expectedParameters = args.expected
    config.each { key, value ->
        if (args.options?.containsKey(key)) {
            config[key] = args.options[key]
        }
    }
    if (!config.require_username) { actualParameters.remove("username") }
    expectedParameters.each { paramName ->
        if (!actualParameters.containsKey(paramName) || !actualParameters[paramName]) {
            throw new IllegalArgumentException(
                    "Parameter '${paramName}' was not found in the query; '${paramName}' is a required parameter.")
        }
    }
}

Sample Output:

<Response>
  <Asset>
    <id>2864</id>
    <name>keg24</name>
    <serial_number>keg24</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp>20111103 14:44</timestamp>
      <name>currKegPercentage</name>
      <value>34.0</value>
      <timestamp>20111103 14:38</timestamp>
      <name>currTempF</name>
      <value>43.0</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2861</id>
    <name>keg28</name>
    <serial_number>keg28</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp />
      <name>currKegPercentage</name>
      <value>?</value>
      <timestamp>20111103 14:21</timestamp>
      <name>currTempF</name>
      <value>43.0</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2863</id>
    <name>keg21</name>
    <serial_number>keg21</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp />
      <name>currKegPercentage</name>
      <value>?</value>
      <timestamp>20111103 14:39</timestamp>
      <name>currTempF</name>
      <value>42.0</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2862</id>
    <name>keg25</name>
    <serial_number>keg25</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp>20111103 14:36</timestamp>
      <name>currKegPercentage</name>
      <value>34.0</value>
      <timestamp />
      <name>currTempF</name>
      <value>?</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2867</id>
    <name>keg29</name>
    <serial_number>keg29</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp>20111103 14:48</timestamp>
      <name>currKegPercentage</name>
      <value>35.0</value>
      <timestamp />
      <name>currTempF</name>
      <value>?</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2865</id>
    <name>keg27</name>
    <serial_number>keg27</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp>20111103 14:39</timestamp>
      <name>currKegPercentage</name>
      <value>34.0</value>
      <timestamp>20111103 14:44</timestamp>
      <name>currTempF</name>
      <value>42.0</value>
    </current_data>
    <historical_data />
  </Asset>
  <Asset>
    <id>2866</id>
    <name>keg23</name>
    <serial_number>keg23</serial_number>
    <model_id>1081</model_id>
    <model_name>Kegerator</model_name>
    <current_data>
      <timestamp>20111103 14:46</timestamp>
      <name>currKegPercentage</name>
      <value>34.0</value>
      <timestamp />
      <name>currTempF</name>
      <value>?</value>
    </current_data>
    <historical_data />
  </Asset>
</Response>