IoT Tips

The natively exposed ThingWorx Platform performance metrics can be extremely valuable for understanding overall platform performance and some of the core subsystem operations; however, as a development platform this gives no visibility into what your built solution is or is not doing.

Here is an amazing little trick that you can use to embed custom performance metrics into your application so that they show up automatically in your Prometheus monitoring system. What you do with these metrics is up to your creativity (with some constraints, of course). Imagine a request counter for specific services which may be incredibly important or costly to run, or an exception metric that is incremented each time you catch an exception, or a query-result-size metric that tells you how much data is being queried from the database.

Refer to Resources > MetricServices:

GetCounterMetric
GetGaugeMetric
IncrementCounterMetric
DecrementCounterMetric
SetGaugeMetric

You'll need to give your metric a name, identified by key. This is meant to be dotted notation, which will be converted to underscores when the metric is exposed on the OpenMetrics endpoint. Use sections/domains in the dotted notation to structure your metrics in line with your application design.

COUNTER type metrics are the most commonly used and relate to things happening through time. They are an index which gets timestamped as it is collected by Prometheus, so that you can look back in time and analyse what happened when, and what the scale or impact was. After-the-fact functions and queries will need to be applied to make these metrics most useful (delta over time, increase, rate per second). Common examples of counter type metrics are: requests, executions, bytes transferred, rows queried, seconds elapsed, execution time.

Resources["MetricServices"].IncrementCounterMetric({
    basetype: "LONG",
    value: 1,
    key: "__PTC_Reported.integration.mes.requests",
    aggregate: false
});

GAUGE type metrics are the point-in-time status of some thing being measured. Common gauge type metrics are: CPU load/utilization, memory utilization, free disk space, used disk space, busy/active threads.

Resources["MetricServices"].SetGaugeMetric({
    basetype: "NUMBER",
    value: 12,
    key: "__PTC_Reported.Users.ConnectedOperatorCount",
    aggregate: true
});

Be aware of the aggregate flag, as it makes the custom metric cluster-level, which can have some unintended consequences. Normally you want performance metrics for the specific node, so you can see what work is happening where and confirm that it is being properly distributed within the cluster. There are some situations where you might want the cluster aggregation, however, like the concurrently connected operators above.

Happy Monitoring!
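Building on the counter example above, here is a minimal sketch of the exception-metric idea: wrap a costly call in try/catch and count the failures. The Thing and service being called are illustrative assumptions; the metric service call itself is the one documented above:

// Hypothetical wrapper around a costly integration call
try {
    var data = Things["MES.Gateway"].FetchWorkOrders(); // illustrative Thing/service
} catch (err) {
    // Count every caught exception so failure rates are visible in Prometheus
    Resources["MetricServices"].IncrementCounterMetric({
        basetype: "LONG",
        value: 1,
        key: "__PTC_Reported.integration.mes.exceptions",
        aggregate: false
    });
    throw err; // rethrow so callers still see the failure
}

In Prometheus, a query such as rate() over this counter then gives the per-second failure rate mentioned above.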
View full tip
Install Connected Components Workbench to program an Allen-Bradley PLC

Guide Concept

In this guide, you'll install Rockwell Automation's Connected Components Workbench, which facilitates programming of an Allen-Bradley PLC.

You'll learn how to:

Create a Rockwell Automation account
Download software from Rockwell Automation's website
Install Connected Components Workbench

NOTE: The estimated time to complete this guide is 60 minutes.

Step 1: Learning Path Overview

Welcome to the Rockwell Automation Learning Path!

This first guide explains the steps to get up and running with Rockwell's Connected Components Workbench, a software program that facilitates connecting to and configuring Programmable Logic Controllers, i.e. PLCs. PLCs are commonly used in factories and other automation scenarios to control minor aspects, such as turning particular devices on or off based on a particular situation.

Note that you don't necessarily have to run through this guide as part of the Learning Path. If you simply want to learn how to install Connected Components Workbench, this guide can still be useful to you. But assuming you are using this guide as part of the Rockwell Automation Learning Path, the first elements will be to install your software, i.e.:

Connected Components Workbench
ThingWorx Kepware Server
ThingWorx Foundation (for Windows)

You'll then connect an Allen-Bradley PLC to Connected Components Workbench and then to ThingWorx Kepware Server. Next, we'll propagate that information further from ThingWorx Kepware Server into Foundation. Finally, we'll use Foundation's Mashup Builder to construct a GUI which can both receive and send information to the PLC.

We hope you enjoy this Learning Path.

Step 2: Signup

To access Rockwell Automation software, you first have to create an account on the Rockwell Automation website.

Go to the Rockwell Automation site.
In the top-right, click the "person symbol" for Account.
On the right, click Create an Account.
Enter a valid email address which you control and click Continue.
Enter your first and last names, your country, and click Continue.
Enter your job information and click Continue.
Enter and re-enter a password, check the EULA agreement box, and click Create Account.
Rockwell will send a verification email to the address you previously entered. Locate the email and click Verify My Email Address.
A new browser tab will open with your verification; click Proceed to Sign In.
Sign in with your verified email address.

Step 3: Download

Now that you have a verified Rockwell account, you will download Connected Components Workbench. This download is a little different in that it actually has two parts: you have to download both, and then run a built-in combiner to get access to the actual installation files.

Go to the Connected Components Workbench download site.
Click Select Files.
On the pop-up, check Connected Components Workbench and click Downloads.
On the new pop-up, click DOWNLOAD NOW.
On the Software End-User License Agreement page, click Accept and Download.
Move the download into a separate folder.
Right-click on the download and select Run as administrator.
Agree to let the program make changes to your computer.
The download manager will then begin the download of the actual software. Click Close when the download has completed.
Notice that there is now a new C:\RA\CCW folder.
Right-click on part1.exe and select Run as administrator.
On the WinRAR self-extracting archive window, click Extract. Notice that there is now a new "DVD" folder. Navigate into the "DVD" folder and notice the Setup.exe file, which will be used to install Connected Components Workbench.

Step 4: Install

Now that you have Connected Components Workbench properly downloaded and extracted, you can begin the installation.

Right-click on Setup.exe and select Run as administrator.
Click Yes to allow the program to make modifications to your computer.
Click Install now.
On the EULA screen, click Accept all.
Allow the installer to run to completion. This may take ~30 minutes.
Click Restart now to reboot and complete the installation.
After restarting, you may be asked to set your Country/Territory. Click OK to confirm.
In the Windows Start menu, navigate to and expand the Rockwell Automation folder.
Click Connected Components Workbench.

Step 5: Next Steps

Congratulations! You've successfully completed the Connected Components Workbench installation guide. In this guide, you learned how to:

Create a Rockwell Automation account
Download software from Rockwell's website
Install Connected Components Workbench

The next guide in the Using an Allen-Bradley PLC with ThingWorx learning path is Install ThingWorx Kepware Server.

Learn More

Capability | Resource
Manage | Install ThingWorx Kepware Server

Additional Resources

For additional information on Rockwell:

Resource | Link
Documentation | Rockwell Literature Library
Documentation | Connected Components Workbench
View full tip
Business logic and actions in a ThingWorx application are driven by events. Events are interesting or critical property states that a Thing publishes to subscribers. Events are defined at the Thing, Thing Template, or Thing Shape level, and can range from something as simple as a new data value from a device to complex events built from many data points. An event created at the Thing Shape or Thing Template level is inherited down into the entities that implement the shape or template that defines the event.

Types of Events

Built-in
Custom

Built-in Events

Every entity in ThingWorx can use several built-in events. These events are automatically triggered when a prerequisite condition is met. In some cases, you must provide information that determines the specifics of that prerequisite. Common built-in events include:

Data Change
Alert
Timer

In ThingWorx, there are standard events and related data packets (defined by Data Shapes). The most common type of event is a Data Change related to a Thing property. The Data Change event fires when a property value changes. The specifics of this event can be configured from the Data Change Info section of the property creation page; when you define a property, there are many configuration aspects.

Example: using DataChangeEvent, you have a few options for setting an event to fire when there is new data for a property:

only if the data has changed
only if the data evaluates true or false
only if the new value changed beyond a defined threshold

The Alert event is also attached to a property and is set up from the property definition page. An Alert event fires when the value of the property reaches a certain threshold or value. Depending on the base type, you may specify the condition of this event from the Manage Alerts button.

Timer events can be used to run jobs or fire events on a regular basis. Things themselves can fire specific events such as Thing Start, and "special" things, like a Timer-type Thing, contain special events. See: How to Set up and Configure Timers.

Creating a Custom Event

To create a new custom event:

In ThingWorx Composer, open for edit the Thing, Thing Template, or Thing Shape where the new event will be added
From the sidebar menu under Entity Information, select Events
Click Add My Event
Provide the event with a name and a Data Shape that describes the data being sent to a subscriber
Click Done and Save

Once created, the event appears on the Subscriptions page for subscribing. Since the event is custom built, you must specify when it triggers. This is often accomplished from within a service (see the sketch at the end of this tip).

Best Practices

There should be a subscriber to the event in the model. The subscriber is sent a data packet and the subscription is initiated. If no one is subscribed to the event (no one is listening), nothing happens.
DataChange events hit the Event Processing Subsystem (see below), which has fewer threads allocated to it than some other subsystems:

If queries within subscriptions to data change events take too long, this will cause a back-up in the event processing subsystem
Too much activity in this subsystem may result in poor system performance, and even server crashes
Be cautious about what types of calculations are performed in data change event subscriptions

Refer to the following knowledge base article regarding tricky data change event use cases, the implementation of rules for alerts, and conditional operations: Complicated Event Calculations in ThingWorx.

Load test all applications in a sandbox before committing changes to production, to ensure mashups can handle all design choices at large scale.

Event Processing Subsystem

The Event Processing Subsystem manages event processing for external subscriptions (Things subscribing to other Things) throughout ThingWorx.

Event Queue Processing Settings:

Setting | Base Type | Default | Notes
Min Threads Allocated to Event Processing Pool | NUMBER | 16 |
Max Threads Allocated to Event Processing Pool | NUMBER | 500 |
Max Queue Entries Before Adding New Working Thread | NUMBER | 200000 | Maximum number of entries to queue before adding a thread to the pool

Additional Information

PTC ThingWorx Help Center - Thing Events
Event Types
Creating and Triggering Custom Events in ThingWorx
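To round out the custom-event steps above, here is a minimal sketch of triggering a custom event from a service. The event name, Data Shape, and field are illustrative assumptions, not platform built-ins:

// Build the event's data packet from the Data Shape the event was defined with
var eventData = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "InfoTable",
    dataShapeName: "OrderEventDataShape" // hypothetical Data Shape
});
eventData.AddRow({ orderId: "ORD-1001" }); // hypothetical field

// Firing a custom event is done by calling it like a service on the Thing;
// all subscribers to NewOrderReceived receive eventData
me.NewOrderReceived(eventData);

If nothing subscribes to the event, the call simply goes unheard, matching the best-practice note above.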
View full tip
Predictive models:

A predictive model is one of the core techniques of predictive analytics: a model trained on historical data that makes predictions on new data. These models are built to analyse current data records in combination with historical data.

See: Use of Predictive Analytics in ThingWorx Analytics and How to Access Predictive Analysis Functionality via ThingWorx Analytics.

Bias and variance are the two components of imprecision in predictive models. Bias is a measure of model rigidity and inflexibility; it means that your model is not capturing all the signal it could from the data. Bias is also known as under-fitting. Variance, on the other hand, is a measure of model inconsistency: high-variance models tend to perform very well on some data points and very badly on others. This is also known as over-fitting and means that your model is too flexible for the amount of training data you have, and ends up picking up noise in addition to the signal.

If your model performs really well on the training set but much worse on the hold-out set, it is suffering from high variance. On the other hand, if your model performs poorly on both training and test data sets, it is suffering from high bias.

Techniques to improve:

Add more data: Having more data is always a good idea. It allows the "data to tell for itself," instead of relying on assumptions and weak correlations, and results in better, more accurate models. When should we ask for more data? We cannot quantify it exactly; it depends on the problem you are working on and the algorithm you are implementing. For example, when working with time series data you should look for at least one year of data, and when dealing with neural network algorithms you are advised to get more training data, otherwise the model won't generalize.

Feature engineering: Adding new features decreases bias at the expense of variance. New features can help algorithms explain the variance of the model more effectively. When we do hypothesis generation, enough time should be spent on the features required for the model; then we should create those features from the existing data sets.

Feature selection: This is one of the most important aspects of predictive modelling. It is always advisable to choose the important features and rebuild the model with only the important and significant ones. For example, say we have 100 variables. There will be variables that drive most of the variance of the model. If we select features on a p-value basis alone, we may still have more than 50 variables; in that case, look at other measures, such as the contribution of each individual variable to the model. If 90% of the model's variance is explained by only 15 variables, choose only those 15 variables in the final model.

Multiple algorithms: Hitting on the right machine learning algorithm is the ideal approach to achieve higher accuracy, but some algorithms are better suited to particular types of data sets than others. Hence, we should apply all relevant models and check their performance.

Algorithm tuning: Machine learning algorithms are driven by parameters, and these parameters strongly influence the outcome of the learning process. The objective of parameter tuning is to find the optimum value for each parameter to improve the accuracy of the model.
To tune these parameters, you must have a good understanding of their meaning and their individual impact on the model. You can repeat this process with a number of well-performing models. For example, in a random forest we have parameters like max_features, number_trees, random_state, oob_score and others; intuitive optimization of these parameter values will result in better, more accurate models.

Cross validation: Cross validation is one of the most important concepts in data modeling: leave out a sample on which you do not train the model, and test the model on this sample before finalizing it. This method helps us achieve more generalized relationships.

Ensemble methods: This is the approach most often found in the winning solutions of data science competitions. It simply combines the results of multiple weak models to produce better results, and can be achieved in many ways:

Bagging uses several versions of the same model trained on slightly different samples of the training data to reduce variance without any noticeable effect on bias. Bagging can be computationally intensive, especially in terms of memory.

Boosting is a slightly more complicated concept and relies on training several models successively, each trying to learn from the errors of the models preceding it. Boosting decreases bias and hardly affects variance.
View full tip
Greetings, Community Members! PTC launched ThingWorx 9.6.0 in June, and it's now ready for your upgrade! Let's explore the key enhancements in this latest version of ThingWorx.

What are the top three areas of updates in ThingWorx 9.6?

Performance, Scalability, Reliability, and Security:

There's a significant boost in file transfer performance between connected devices and the ThingWorx platform.
A notable enhancement in server startup performance, with some instances showing an 84% reduction in startup time.
Numerous logging improvements, including limitations on log verbosity, log filtration, and configurable log storage capabilities, contribute to the stabilization of the ThingWorx system. Additionally, starting with ThingWorx 9.6, log extraction to third-party software like Sumo Logic, Datadog, Splunk, Grafana, etc. is supported via the industry-standard OpenTelemetry framework.
Content Security Policy has been implemented, fortifying ThingWorx against script and data injection attacks, man-in-the-middle (MITM) attacks, and clickjacking.
Several tech stack updates in 9.6; support is now available for Azure B2C and TLS 1.3 (limited).

Developer Productivity:

Introducing a new Collection widget with improved performance, enabling a transition away from the legacy Collection and Repeater widgets.
Several other new widgets, such as the KPI Dial widget, Tree Selector widget, and Progress Tracker widget, plus other critical enhancements for ThingWorx Web Components, are now available.
Support for viewing mashup configurations such as layouts, bindings, and widget properties in a read-only mode, improving the user experience by allowing multiple users to review mashup designs simultaneously without making edits.

Solutions updates:

Streamlined continuous improvement with the ability to view and create actions in one click from performance analysis screens in the Digital Performance Management (DPM) solution.
Enhanced performance for the Asset Monitoring & Utilization (AMU) solution, with alarm-event creation now handled asynchronously.
Several fixes and improvements for Connected Work Cell (CWC), Real-Time Production Performance Monitoring (RTPPM), and DPM, such as limits evaluation, messages not displayed, incorrectly calculated KPIs, and issues with Running Time on the operator display, helping customers achieve continuous improvements in their manufacturing operations.

View the release notes here and be sure to upgrade to 9.6!

Dilanur Bayraktar
ThingWorx Product Management
View full tip
Persistent vs. Logged Properties
By Mike Jasperson, VP of IoT EDC

Executive Summary

ThingWorx provides several different "aspects" (or storage options) for how property values are saved. These options each have different implications for performance and scalability, and understanding those implications is important for designing a scalable IoT solution.

Persistent properties are best used for non-telemetry data that changes infrequently (for example, only a few times a day) and where historical values are not required. When overused, persistent properties can put significant pressure on the database layer of your ThingWorx implementation, leading to poor performance of your IoT application. As the number of Things in your IoT application scales up, the quantity and frequency of persistent properties per Thing need to be carefully considered.

Logged properties are best used for telemetry data where historical values need to be retained, but also for any other value that is expected to change frequently. Logged properties create some additional requirements: a process for handling null/default values after restarts, more disk space, and a data retention policy. There are benefits as well, though, like more flexibility and scalability for the ingestion of larger volumes of data.

Persistent + logged properties perform the database operations of both aspects. Combined use should be very limited: only properties that update infrequently (a few times a day) and that must be in memory in the event of a ThingWorx restart.

In-memory-only properties are neither persistent nor logged; they are not stored to the database. These properties can greatly improve scale for values that need to be available for the application to drive UIs or to compute other derived values that will be stored. However, high-frequency updates of in-memory properties can create scale challenges in HA (high availability) ThingWorx configurations, where memory state needs to be constantly shared between multiple ThingWorx nodes.

Find a complete summary as well as example cases in the document attached.
View full tip
Connectors allow clients to establish a connection to Tomcat via the HTTP/HTTPS protocol. Tomcat allows configuring multiple connectors so that users or devices can connect via either HTTP or HTTPS.

Usually users like you and me access websites by just typing the URL in the browser's address bar, e.g. "www.google.com". By default, browsers assume that the connection should be established with the HTTP protocol. For HTTPS connections, the protocol has to be specified explicitly, e.g. "https://www.google.com". However, Google automatically forwards HTTP connections as HTTPS connections, so that all connections use certificates and a secure channel, and you end up on "https://www.google.com" anyway.

To configure ThingWorx to only allow secure connections, there are two options.

1) Remove HTTP access

If HTTP access is removed, users can no longer connect to port 80 or 8080. ThingWorx will only be accessible on port 443 (or 8443). Clients connecting to port 8080 will not be redirected: the redirectPort in the Connector only forwards requests internally in Tomcat; it does not switch protocols and ports and does not require a certificate when being used. The redirect port does not reflect in the client's connection but only manages internal port-forwarding in Tomcat. By removing the HTTP ports, any connection on port 80 (or 8080) will end in an error message that the client cannot connect on this port.

To remove the HTTP ports, edit <Tomcat>\conf\server.xml and comment out sections like:

<!-- commented out to disallow connections on port 80
<Connector port="80" protocol="org.apache.coyote.http11.Http11NioProtocol"
           connectionTimeout="20000"
           redirectPort="443" />
-->

Save and restart Tomcat. If opening Tomcat (and ThingWorx) in a browser via http://myServer/ the connection will fail with a "This site can't be reached", "ERR_CONNECTION_REFUSED" error.

2) Forcing insecure connections through secure ports

Alternatively, ports 80 and 8080 can be kept open to still allow users and devices to connect. But instead of only internally forwarding the port, Tomcat can be set up to also forward the client to the new secure port. With this, users and devices can still use e.g. old bookmarks and do not have to explicitly set the HTTPS protocol in the address.

To configure this, edit <Tomcat>\conf\web.xml and add the following section just before the closing </web-app> tag at the end of the file:

<security-constraint>
    <web-resource-collection>
        <web-resource-name>HTTPSOnly</web-resource-name>
        <url-pattern>/*</url-pattern>
    </web-resource-collection>
    <user-data-constraint>
        <transport-guarantee>CONFIDENTIAL</transport-guarantee>
    </user-data-constraint>
</security-constraint>

In <Tomcat>\conf\server.xml, ensure that all HTTP Connectors (port 80 and 8080) have their redirect port set to the secure HTTPS Connector (usually port 443 or 8443).

Save and restart Tomcat. If opening Tomcat (and ThingWorx) in a browser via http://myServer/ the connection will now be forwarded to the secure port. The browser will show the connection as https://myServer/ instead, and connections are secure and use certificates.

What next?

Configuring Tomcat to force insecure connections to actually secure HTTPS connections just requires a simple configuration change.
If you want to read more about certificates, encryption, and how to set up ThingWorx for HTTPS in the first place, be sure to also have a look at:

Trust & Encryption - Theory
Trust & Encryption - Hands On
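For reference, the secure connector that the HTTP ports redirect to is also defined in <Tomcat>\conf\server.xml. Below is a minimal sketch; the keystore file and password are illustrative assumptions and must match your own certificate setup:

<!-- sketch of an HTTPS connector; keystoreFile/keystorePass are placeholders -->
<Connector port="443" protocol="org.apache.coyote.http11.Http11NioProtocol"
           SSLEnabled="true" scheme="https" secure="true"
           keystoreFile="C:\certificates\keystore.jks" keystorePass="changeit"
           sslProtocol="TLS" />

This is the JSSE style of configuration used by Tomcat 8.x; if this connector is missing or misconfigured, the redirects described above will have nothing to land on.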
View full tip
The KEPServerEX installer download is available from the My Kepware web portal: https://my.kepware.com/mykepware/

It will be necessary to create a new My Kepware account if you have not already done so. When you log in to your My Kepware account, you will be at the My Kepware home page. In the left column, you will see a link to download the latest release. You can access all of the features of KEPServerEX with this installation, including the ThingWorx Native Interface.

For more information on installation of KEPServerEX, see this guide: https://www.kepware.com/en-us/products/kepserverex/documents/installation-guide.pdf
View full tip
I created this recently for another group - it might be useful: Video Link to Create a Database Connection to Your Postgres ThingWorx Database
View full tip
NOTE: Even though I have only tried the ODataConnector and SwaggerConnector, the steps below should work for all the ThingWorx Integration Connectors, viz. GenericConnector, HTTPConnector, ODataConnector, SAPODataConnector, SwaggerConnector, WindchillSwaggerConnector.

This document guides you through adding a custom header to any ThingWorx Integration connector.

Step 1: Create a Data Shape, say "CustomHeadersDataShape", and add a STRING field whose name is the same as the header name you want to add. In this case, I want to add a header called "Prefer".

Step 2: Go to the Integration Connector to which you want to add this custom header. Navigate to "Services". Under "Inherited Services", edit/override the "GetCustomHeaderParameters" service by clicking on the edit (pencil) icon.

Step 3: In the JavaScript code section, add the snippet below:

var params = {
    infoTableName: "InfoTable",
    dataShapeName: "CustomHeadersDataShape"
};
// Build an infotable shaped like the custom headers Data Shape
var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape(params);
var preferValue = "odata.maxpagesize=50";
var newRow = { "Prefer": preferValue };
result.AddRow(newRow);

Step 4: Save the service and execute "GetCustomHeaderParameters" to verify that the Prefer header appears in the output.

Now your custom header "Prefer: odata.maxpagesize=50" is set; further executions of your connector services will include this header until it is reset.
View full tip
There are four types of analytics. Prescriptive analytics: what should I do about it?

Prescriptive analytics is about using data and analytics to improve decisions and therefore the effectiveness of actions. It is related to both descriptive and predictive analytics: while descriptive analytics aims to provide insight into what has happened and predictive analytics helps model and forecast what might happen, prescriptive analytics seeks to determine the best solution or outcome among various choices, given the known parameters. It can be defined as "any combination of analytics, math, experiments, simulation, and/or artificial intelligence used to improve the effectiveness of decisions made by humans or by decision logic embedded in applications."

These analytics go beyond descriptive and predictive analytics by recommending one or more possible courses of action. Essentially they predict multiple futures and allow companies to assess a number of possible outcomes based upon their actions. Prescriptive analytics uses a combination of techniques and tools such as business rules, algorithms, machine learning, and computational modelling procedures. Prescriptive analytics can also suggest decision options for how to take advantage of a future opportunity or mitigate a future risk, and illustrate the implications of each decision option. In practice, prescriptive analytics can continually and automatically process new data to improve the accuracy of predictions and provide better decision options.

Prescriptive analytics can be used in two ways:

Inform decision logic with analytics: Decision logic needs data as an input to make the decision. The veracity and timeliness of the data ensure that the decision logic operates as expected. It doesn't matter whether the decision logic is that of a person or embedded in an application; in both cases, prescriptive analytics provides the input to the process. Prescriptive analytics can be as simple as aggregate analytics about how much a customer spent on products last month, or as sophisticated as a predictive model that predicts the next best offer to a customer. The decision logic may even include an optimization model to determine how much, if any, discount to offer to the customer.

Evolve decision logic: Decision logic must evolve to improve or maintain its effectiveness. In some cases, decision logic itself may be flawed or degrade over time. Measuring and analyzing the effectiveness or ineffectiveness of enterprise decisions allows developers to refine or redo decision logic to make it even better. It can be as simple as marketing managers reviewing email conversion rates and adjusting the decision logic to target an additional audience. Alternatively, it can be as sophisticated as embedding a machine learning model in the decision logic for an email marketing campaign to automatically adjust what content is sent to target audiences.

Different technologies of prescriptive analytics to create action:

Search and knowledge discovery: Information leads to insights, and insights lead to knowledge. That knowledge enables employees to become smarter about the decisions they make for the benefit of the enterprise. Developers can embed search technology in decision logic to find knowledge used to make decisions in large pools of unstructured big data.

Simulation: Simulation imitates a real-world process or system over time using a computer model.
Because digital simulation relies on a model of the real world, the usefulness and accuracy of simulation for improving decisions depends a lot on the fidelity of the model. Simulation has long been used in multiple industries to test new ideas, or how modifications will affect an existing process or system.

Mathematical optimization: Mathematical optimization is the process of finding the optimal solution to a problem that has numerically expressed constraints.

Machine learning: "Learning" means that the algorithms analyze sets of data to look for patterns and/or correlations that result in insights. Those insights can become deeper and more accurate as the algorithms analyze new data sets. The models created and continuously updated by machine learning can be used as input to decision logic, or to improve the decision logic automatically.

Pragmatic AI: Enterprises can use AI to program machines to continuously learn from new information, build knowledge, and then use that knowledge to make decisions and interact with people and/or other machines.

Use of prescriptive analytics in ThingWorx Analytics:

Thing Optimizer: Thing Optimizer functionality provides the prescriptive scoring and optimization capabilities of ThingWorx Analytics. While predictive scoring allows you to make predictions about future outcomes, prescriptive scoring allows you to see how certain changes might affect future outcomes. After you have generated a prediction model (also called training a model), you can modify the prescriptive attributes in your data (those attributes marked as levers) to alter the predictions. The prescriptive scoring process evaluates each lever attribute and returns an optimal value for that feature, depending on whether you want to minimize or maximize the goal variable. Prescriptive scoring results include both an original score (the score before any lever attributes are changed) and an optimized score (the score after optimal values are applied to the lever attributes). In addition, for each attribute identified in your data as a lever, original and optimal values are included in the prescriptive scoring results.

How to access Thing Optimizer functionality: ThingWorx Analytics prescriptive scoring can only be accessed via the REST API service. Using a REST client, you can access the Scoring service, which includes a series of API endpoints to submit scoring requests, retrieve results, list jobs, and more. This requires installation of the ThingWorx Analytics Server.

How to avoid mistakes: below are some common mistakes made when doing prescriptive analytics:

Starting digital analytics without a clear goal
Ignoring core metrics
Choosing overkill analytics tools
Creating beautiful reports with little business value
Failing to detect tracking errors

Image source: Wikipedia. Content: go.forrester.com (partially)
View full tip
I imagine a lot of people that face this problem might be using session parameters, but there is a secret lost ninja art that allows you to do it with mashup parameters, which is much more contextual and direct. The key is to have mashup parameters with the same name.

End result

Starting out on my main mashup, you can see the tree data in the grid below. Clicking on the next node now shows the new mashup and the TO field inside; that TO value was passed in using a mashup parameter. Clicking the next node, you can see it is actually a different mashup, but I am still passing the TO value.

How is it done?

Here is my mashup with the tree and Contained Mashup; you can see the bindings are already in place, but how did I do it, since the Contained Mashup is empty?

First, create the new mashups with a mashup parameter named the same, in this case EntityName. Here is Mashup2, and you can see the mashup parameter with the same name, EntityName, bound to one of the Value Displays.

Now how do I bind from my main mashup? What you need to do is temporarily assign one of the mashups to the Contained Mashup; here I am showing Mashup1 assigned. This will now allow you to bind not just the mashup name, but also a value to the mashup parameter in that mashup. Just drag your selected row values onto the contained mashup. Here you can see the parameter showing as a property; I just dropped my value on the contained mashup, and I can bind to Name (the name of the mashup to show) and EntityName (the value I want to pass to the mashup parameter).

Now just remove the assigned mashup from the Contained Mashup, and you'll note that the bindings stay intact. That's it!
View full tip
Welcome to the ThingWorx Manufacturing Apps Community!

The ThingWorx Manufacturing Apps are easy-to-deploy, pre-configured, role-based starter apps that are built on PTC's industry-leading IoT platform, ThingWorx. These apps provide manufacturers with real-time visibility into operational information, improved decision making, accelerated time to value, and unmatched flexibility to drive factory performance.

This Community page is open to all users, including licensed ThingWorx users, Express ("freemium") users, or anyone interested in trying the apps. Tech Support community advocates serve users on this site and are here to answer your questions about downloading, installing, and configuring the ThingWorx Manufacturing Apps.

A. Sign up

ThingWorx Manufacturing Apps Community: PTC account credentials are needed to participate in the ThingWorx Community. If you have not yet registered a PTC eSupport account, start with the Basic Account Creation page.

Manufacturing Apps web portal: Register a login for the ThingWorx Manufacturing Apps web portal, where you can download the free trial and navigate to the additional resources discussed below.

B. Download

Choose a download/packaging option to get started.

i. Express/Freemium installer (best for users who are new to ThingWorx): If you want to quickly install the ThingWorx Manufacturing Apps (including ThingWorx), use the following installer: Download the Express/Freemium Installer.

ii. 30-day Developer Kit trial: To experience the capabilities of the ThingWorx Platform with the Manufacturing Apps and create your own apps: Download the 30-day Developer Kit trial.

iii. Import as a ThingWorx extension (for users with a Manufacturing Apps entitlement, including ThingWorx commercial customers, PTC employees, and PTC Partners): The ThingWorx Manufacturing Apps can be imported as ThingWorx extensions into an existing ThingWorx Platform install (v8.1.0). To locate the download, open the PTC Software Download Page and expand the following folders:

ThingWorx Platform | Release 8.x | ThingWorx Manufacturing Apps Extension | Most Recent Datacode

C. Learn

After downloading the installer or extensions, begin with installation and configuration. Follow the steps laid out in the ThingWorx Manufacturing Apps Setup and Configuration Guide 8.2. Find helpful getting-started guides and videos available within the 'Get Started' section of the ThingWorx Manufacturing Apps Portal.

D. Customize

Once you have successfully downloaded, installed, and configured the Manufacturing Apps, begin to explore the deeper potential of the apps and the ThingWorx Platform. Follow along with the discussion and steps contained in the ThingWorx Manufacturing Apps and Service Apps Customization Guide 8.2. Also contained within the 'Get Started' page of the ThingWorx Manufacturing Apps Portal, find the "Evolve and Expand" section, featuring:

Custom Plant Layout application
Custom Asset Advisor application
Global Plant View application
ThingWorx Manufacturing Apps Technical Lab with Sigma Tile (Raspberry Pi application)
Configuring the Apps with demo data set and simulator
Additional Advanced Documentation

E. Get help / give feedback / interact

Use the ThingWorx Manufacturing Apps Community page as a resource to find documentation, peruse past forum threads, or post a question to start a discussion! For advanced troubleshooting, licensed users are encouraged to submit support tickets to the PTC My eSupport portal.
View full tip
The system user can become a vital point for properly yet conveniently securing your application. From the ThingWorx Help Center:

"The system user is a system object in ThingWorx. With the System User, if a service is called from within a service or subscription (a wrapped service call), and the System User is permitted to execute the service, the service will be permitted to execute regardless of the User who initially triggered the sequence of events, scripts, or services."
http://support.ptc.com/cs/help/thingworx_hc/thingworx_7.0_hc/index.jspx?id=SystemUser&action=show

A few important notes to remember:

It is not possible to log in as a system user
Adding a system user to the Administrators group will not grant it the administrator permissions
Adding a system user to the Everyone organization will not grant it the same visibility

As an option, one of the posts on our community provides a script to assign all of the permissions to the system user in a one-time setup: https://community.thingworx.com/community/developers/blog/2016/10/28/assigning-the-system-user-through-script

Example:

1. Create a new template T1 and several things: Thing1, Thing2, Thing3
2. Create a new thing NewThing and a new user BlankUser
3. Create a service within NewThing that uses ThingTemplates["T1"].GetImplementingThings() and give all the permissions on it to the new non-admin user, BlankUser

Now the service on the template T1 can be accessed through NewThing without explicit permissions for the BlankUser, but rather through the system user.

When manipulating data (involving read/write and access to the persistence provider), the BlankUser would require more than just visibility permissions. For example, for a Stream, the following permissions would need to be set up:

1. Visibility on the Stream template, StreamProcessingSubsystem, and PersistenceProvider
2. Read/write permission on the Stream thing in the use case, created with the Stream template

Similarly, for other sources of data, the things, templates, and resources involved need visibility and, depending on the scenario, read/write permissions on the specific template.
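For step 3 of the example, the wrapped service body can be as small as the sketch below; the template name T1 and the BlankUser setup are the ones from the example above:

// Service defined on NewThing; BlankUser only needs permission to execute it.
// The inner GetImplementingThings() call is a wrapped call, so it runs with
// the System User's permissions and needs no explicit grant on T1 for BlankUser.
var result = ThingTemplates["T1"].GetImplementingThings();

Assigning the infotable to result returns it as the service output, so BlankUser receives the list of implementing Things despite having no direct permissions on T1.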
View full tip
Sometimes it's necessary to delete an existing PostgreSQL database, especially if a different major version was installed at first by mistake (for example, 9.6 in place of the supported 9.4). It is then absolutely necessary to ensure the database is fully deleted and there are no database "ghosts". One sign of an existing ghost database is randomly seeing a second PostgreSQL server in pgAdmin III, or experiencing "error"-less problems when running the ThingWorx installation scripts.

The proper way to uninstall is to go to the PostgreSQL server installation directory and find the uninstall-postgresql file. Double-click the uninstall-postgresql file to run the uninstaller; it will uninstall PostgreSQL.

In case the uninstall wasn't performed correctly, below are the manual cleanup steps. In this example we will use 9.6 as the version to delete; please replace with your own where needed:

1. Remove the PostgreSQL server installation directory, assuming the default location: rd /s /q "C:\Program Files\PostgreSQL\9.6"
2. Delete the user 'postgres': net user postgres /delete
3. Remove the registry entries: HKEY_LOCAL_MACHINE\SOFTWARE\PostgreSQL\Installations\postgresql-9.6 and HKEY_LOCAL_MACHINE\SOFTWARE\PostgreSQL\Services\postgresql-9.6
4. Remove the postgresql-9.6 service: sc delete postgresql-9.6
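The manual steps above can be collected into one small batch sketch, run from an elevated command prompt (paths and version assume the 9.6 example; adjust for your install):

@echo off
rem Remove the PostgreSQL 9.6 installation directory (default location assumed)
rd /s /q "C:\Program Files\PostgreSQL\9.6"
rem Delete the postgres Windows user
net user postgres /delete
rem Remove the installer's registry entries (/f suppresses the confirmation prompt)
reg delete "HKLM\SOFTWARE\PostgreSQL\Installations\postgresql-9.6" /f
reg delete "HKLM\SOFTWARE\PostgreSQL\Services\postgresql-9.6" /f
rem Remove the Windows service registration
sc delete postgresql-9.6

Each command simply reports an error and moves on if its target is already gone, so the script is safe to re-run.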
View full tip
It's been challenging, in the absence of an OOTB Software Content Management (SCM) system within ThingWorx, to track and maintain code changes in different entities, or to roll back in case any entity is wrongly edited or removed. There is some possibility to compare differences when importing the entities back from the source control repository in ThingworxStorage, but this approach has its own limitations.

Note: This is not an official document, but rather a trick to help work around this challenge.

I'll be using the following components in this blog to help set up a workable environment:

Git
VSCode
ThingWorx Scheduler Thing

The first two components, Git and VSCode, are my preference due to their ease of use and my familiarity with them. However, you are free to choose your own versioning platform and IDE to review branches, commits, diffs, etc., or simply use Git Bash to review the branches and all related code changes.

This blog is divided into the following sections:

1. Installing and setting up the code versioning software
2. Installing the IDE
3. Creating a Scheduler Thing in ThingWorx
4. Reviewing the changes

1. Installing and setting up the code versioning software

As mentioned, you can use any code versioning platform of your choice; in this blog I'll be using Git. To download and install Git, refer to Git - Installing Git.

Setting up the Git repository

Navigate to \\ThingworxStorage, as this folder contains the \repository folder, which in turn holds the SystemRepository folder.

Note: Of course, more custom repositories could be created if you prefer to separate yours from the SystemRepository (available OOTB).

Once Git is installed, let's initialize the repository. If you are initializing the repository in ThingworxStorage and would like only the repository folder to be tracked, be sure to create a .gitignore file and add the following to it:

# folders to ignore
database
esapi
exports
extensions
logs
reports
certificates
# keystore to ignore
keystore.jks

Note: Simply remove a folder/file from the .gitignore file if you'd like that file/folder to be tracked in the Git repository.

The following commands are issued in Git Bash, which can be started from Windows Start > Git Bash. Then, from Git Bash, navigate to the ThingworxStorage/repository folder.

Initialize the repository:

$ git init

Check the status of tracked/un-tracked files/folders:

$ git status

This may or may not return lists of files/folders that are not tracked. When issuing this command for the first time, it'll show that the repository and its content, i.e. the SystemRepository folder, are not tracked (file/folder names will likely be highlighted in red).

Configure the user, then add the required files/folders for tracking:

$ git config --global user.name ""
$ git config --global user.email ""
$ git add .

This adds all the folders/files that are not ignored in the .gitignore file created above.

$ git commit -m ""

This performs the first commit to the master branch, which is the default branch after the initial setup of the Git repository.

$ git branch -a

This lists all available branches within the repository.

Create a new branch, e.g.:

$ git branch newfeatureA

This creates a new branch with that name; however, to switch to that branch, the checkout command needs to be used:

$ git checkout newfeatureA

Note this will be reflected in the command prompt, e.g. it'll switch from MINGW64 /c/ThingworxStorage/repository (master) to MINGW64 /c/ThingworxStorage/repository (newfeatureA).

If there's a warning with files/folders highlighted in red, it may mean that the files/folders are not yet staged (not yet ready to be committed under the new branch and not yet tracked under the new branch). To stage them:

$ git add .

To commit them for tracking under this branch:

$ git commit -m "Initial commit under branch newfeatureA"

The above command commits with the message defined by the "-m" parameter.

2. Installing the IDE

Now that Git is installed and configured for use, there are several options here: using an IDE like VSCode or Atom or any other IDE for that matter, using Git Bash to review branches and commit code via commands, or using the Git GUI. I'll be using VSCode (because, apart from tracking Git repos, I can also debug HTML/CSS and XML with minimal setup time, and these are likely the languages most used when working with ThingWorx entities) and will install certain extensions to simplify access to and review of the branches containing code changes, new entities being created or committed by different users, etc.

To install VSCode, refer to Setting up Visual Studio Code; this covers all the platforms you might be working on.

Here are the extensions that I have installed within VSCode to work with the Git repository.

Required extensions:
Git Extension Pack

Optional extensions:
Markdown All in One
XML Tools
HTML CSS Support

Once all the above-mentioned extensions are installed, simply navigate to VSCode's File > Open Folder > \\ThingworxStorage\repository. This opens the SystemRepository folder, which also populates the GitLens section, highlighting the list of branches, which one is the active branch (marked with a check icon), and what uncommitted changes remain in which branch.

To view the history of all the branches, we can use the extension installed with the Git Extension Pack by pressing F1 and then searching for Git: View History (git log) > Select all branches; this provides an overview of commits across all branches.

3. Creating a Scheduler Thing in ThingWorx

Now that we have the playground set up, we can either:

Export ThingWorx entities manually by navigating to ThingWorx Composer > Import/Export > Export > Source Control Entities, or
Invoke the ExportSourceControlledEntities service automatically (based on a Scheduler's CRON job), available under Resources > SourceControlFunctions

To invoke this service automatically, a subscription can be created to the Scheduler Thing's event which invokes the ExportSourceControlledEntities service periodically. I'm using this service with the following input parameters:

var params = {
    path: "/" /* STRING */,
    endDate: undefined /* DATETIME */,
    includeDependents: undefined /* BOOLEAN */,
    collection: undefined /* STRING */,
    repositoryName: "SystemRepository" /* THINGNAME */,
    projectName: undefined /* PROJECTNAME */,
    startDate: undefined /* DATETIME */,
    tags: undefined /* TAGS */
};

// no return
Resources["SourceControlFunctions"].ExportSourceControlledEntities(params);

The service only takes path and repositoryName as input. Of course, this can be updated based on the type of entities, datetime, collection type, etc. that you want to track in your code versioning system.

4. Reviewing the changes

With the help of the Git toolkit installed in VSCode (covered in section 2, Installing the IDE), we can now track changed or newly created entities immediately (as soon as they are exported to the repository) in the Source Control section (also accessible via the key combination Ctrl + Shift + G).

Based on the scheduled job, the entities are exported periodically to the specified repository, which in turn shows up in the branch under which that repository is being tracked. Since I'm using VSCode with the Git extension, I can track these changes under the GitLens tab. Additionally, for quick access to all the new entities created or existing ones modified, the Source Control section can be checked. Changes marked with "U" are new entities added to the repository that are still un-tracked, and changes marked with "M" are entities modified compared to their last committed state in the specific branch. To quickly view the differences/modifications in an entity, simply click on the modified file on the left; the differences will then be highlighted on the right side.
View full tip
The document attached to this blog entry actually came out of my first exposure to using the C SDK on a Raspberry Pi. I took notes on what I had to do to get my own simple edge application working, and I think it is a good introduction to using the C SDK to report real, sampled data. It also demonstrates how you can use the C SDK without HTTPS, showing how to turn off HTTPS support. I would appreciate any feedback on this document and any additions that might be useful to anyone else who tries this on their own.
View full tip
Introduction

The Oracle 12c release introduced the concept of multi-tenant architecture for housing several databases running as services under a single database. I'll try to address the connectivity and required configuration to connect to one of the pluggable databases running in the multi-tenant architecture.

Multi-tenant database architecture in the scope of a ThingWorx external data source

What is multi-tenant database architecture? Running multiple databases under a single database installation. Oracle 12c allows the user to create one database, called the Container Database (CDB), and then spawn several databases, called Pluggable Databases (PDB), running as services under it.

Why use multi-tenant architecture? Such a setup allows users to spawn a new PDB as and when needed with limited resource requirements; easily administer several PDBs just by administering the container database (since all the PDBs are contained within a single database's tablespace structure); and start and stop individual PDBs, leading to lower cost of maintaining different databases, as the resource management is limited to one CDB.

When to use multi-tenant architecture? In scenarios like creating PoCs, different test environments requiring external data storage, or maintaining different versions of a dataset, running these in the multi-tenant architecture can help save time, money, and effort.

Create a Container Database (CDB)

Creation of a Container Database (CDB) is not very different from creating a non-container database; use the attached guide Installing Oracle Database Software and Creating a Database.pdf (also accessible online).

Create a Pluggable Database (PDB)

Use the attached Multitenant: Create and Configure a Pluggable Database (PDB) in Oracle Database 12c PDF guide to create and plug a Pluggable Database into the Container Database created in the previous step (also accessible online). Using the above guide I have a bunch of pluggable databases; I'll be using TW724 for connecting to the ThingWorx server as an external data source in the following example.

Connect to a Pluggable Database (PDB) as an external data source for ThingWorx

1. Download and unzip the Relational Databases Connectors Extension from the ThingWorx Marketplace and extract Oracle12Connector_Extension
2. Import Oracle12Connector_Extension into ThingWorx using Extension -> Import
3. Create a Thing using the OracleDBServer12 Thing Template, e.g. TW724_PDB_Thing
4. Navigate to the Configuration for TW724_PDB_Thing to update the default configuration:

JDBC Driver Class Name: oracle.jdbc.OracleDriver
JDBC Connection String: jdbc:oracle:thin:@//oravm.ptcnet.ptc.com:1521/tw724.ptcnet.ptc.com
Database Username: <UserName>
Database Password: <password>

5. Once done, save the entity.

Note: A PDB in a container database can be reached only as a service, not using the CDB's SID. In the above configuration, TW724 is a PDB which can be connected to via its service name, i.e. TW724.PTCNET.PTC.COM.

Let's head to the Services tab for TW724_PDB_Thing to query and access the PDB data.

Creating services to access the PDB as an external database source for ThingWorx

Once the configuration is done, TW724_PDB_Thing is ready for use. The queries remain the same as any other SQL query needed to access data from Oracle.
Service for creating a table

On the Services tab for TW724_PDB_Thing, click Add My Service and select the service handler SQL Command, using the following script to create testTable1 in the PDB:

create table testTable1 (
    id NUMBER GENERATED ALWAYS AS IDENTITY primary key,
    col1 varchar2(100),
    col2 number
)

Note: The GENERATED ALWAYS AS IDENTITY option is Oracle 12c specific; I included it here because with Oracle 12c the ability to auto-generate is now built in with that option, simplifying sequence generation compared with older Oracle versions such as Oracle 11g. The user creating the table will need rights to create tables and sequences; check out the Oracle documentation on Identity for more on this.

Service for getting all the data from the table

Add another service with the script Select * from testTable1 to get all the data from the table.

Service for inserting data into the table

Add another service with the script insert into testTable1 (col1, col2) values ('TextValue', 123) to insert data into the table created above.

Service for getting all tables from the PDB, i.e. TW724

Using Select * from tab lists all the available tables in the TW724 PDB.

Summary

Just a quick wrap-up on how this looks visually: refer to the image in the original post. Since this is a scalable setup, given a platform with enough resources it's possible to create up to 252 PDBs under a CDB; therefore 252 PDBs could be created and configured to as many Things extending the OracleDBServer12 Thing Template.

Edit: Common connection troubleshooting

If you observe an error something like this:

Unable to Invoke Service GetAllPDBTables on TW724_PDB_Thing : ORA-01033: ORACLE initialization or shutdown in progress

Ensure that the pluggable database, in this error TW724 (since this is what I created and used above in my services), is opened and accessible. If it's not opened, log in as sys/system (with admin rights) on the CDB, which is ORCL here, via SQL*Plus or SQL Developer or any SQL utility of your choice capable of connecting to Oracle DB, and open the pluggable database using the command:

alter pluggable database tw724 open;
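As a refinement of the insert service above, a SQL Command service can take inputs instead of hard-coded values: define service inputs (here assumed to be col1Value as STRING and col2Value as NUMBER) and reference them in the SQL using ThingWorx's double-square-bracket parameter substitution. A sketch, assuming those parameter names:

-- SQL Command service with inputs col1Value (STRING) and col2Value (NUMBER)
insert into testTable1 (col1, col2)
values ([[col1Value]], [[col2Value]])

This keeps the service reusable rather than fixed to 'TextValue' and 123.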
View full tip
I put together a quick example mashup in support of a couple of analytics projects to demonstrate use of the new ThingWorx Analytics 8.1 APIs within the ThingWorx Mashup Builder. The intention here is for use in PoCs, providing a quick way to demonstrate customer-facing analytics outputs along with the more detailed view available in Analytics Builder.

Required prerequisites are:

ThingWorx 8.1 + Analytics Extensions
ThingWorx Analytics Server 8.1
Carousel ThingWorx UI widget (attached) imported into ThingWorx
Data set(s) loaded with signals/profiles generated

The demo can be installed by importing the attached entities file into ThingWorx Composer and then launching the mashup 'EMEA.Analytics.CustomerInsightMashUp'.

A quick run-through of the functionality: on launching the mashup, data sets and models are displayed for selection on the left-hand side. On selecting a dataset and model, signals are presented in two tabs, the first an overview of all signals. The list on the left can be expanded by changing the value for 'Top <n> Contributing Features'. On selecting a signal from the list, the 'Selected Signal Details' tab displays additional charting for value ranges, average goal, etc. The number of 'bins' to display can be edited. Similarly, profiles can be viewed from the 'Profiles' tab; each profile can be selected by dragging the upper carousel.

This is all done using the Analytics 8.1 "Things" in ThingWorx along with an additional custom Thing with some scripted services (EMEA.Analytics.Helper).

Thanks to Arian Van Huelsen and Tanveer Saifee at PTC for their support; all comments/feedback welcome.
View full tip
In the evolving landscape of software development, ensuring support for the latest, most secure versions of programming languages is essential. At PTC, we continuously evaluate our technology stack, and Java is no exception. As part of our ongoing commitment to providing secure and high-performing products, we're announcing some important updates to the Java support plans for ThingWorx.

Current Java support in ThingWorx (v9.1.x through v9.6.x)

As of ThingWorx 9.6, Java 11 is the only supported version. This version has been a mainstay of our IoT platform, ensuring stability and performance across various use cases. However, Java 11 entered Extended Support in September 2023, meaning its standard support phase has ended. While this version will continue to receive security updates for a while, its lifecycle is winding down.

Introducing Java 21 support in ThingWorx 9.7 (planned release: December 2024)

With ThingWorx 9.7, releasing in December 2024, we will introduce support for Java 21, the next Long-Term Support (LTS) version of Java. This upgrade brings key benefits, including improved performance, enhanced garbage collection, and increased security, ensuring that ThingWorx remains optimized for enterprise-scale IoT deployments. (More details: The Arrival Of Java 21.)

Given the diversity of our customer base, we know that some are still using Java 11, while others are ready to move to Java 21. ThingWorx 9.7 will support both versions, allowing customers the flexibility to upgrade to the latest ThingWorx version while preparing their environments for Java 21.

The road to Java 21 only: what to expect in ThingWorx 10.0 (planned release: June 2025)

As we assess the adoption of Java 21 following the ThingWorx 9.7 release, our goal is to phase out support for Java 11 with ThingWorx 10.0, scheduled for release in June 2025. Starting with ThingWorx 10, Java 21 will be the only supported version, marking the end of Java 11 support for the core platform. This is driven by the need to stay aligned with modern standards and best practices, including support for third-party technologies such as Tomcat 10 and Spring Framework 6, which require the latest Java versions. These updates will ensure that ThingWorx continues to benefit from the latest advancements in the Java ecosystem.

Next steps for ThingWorx users

As we approach the release of ThingWorx 9.7, we encourage customers to begin planning the move to Java 21. While ThingWorx 9.7 will support both Java 11 and Java 21, we recommend upgrading to Java 21 to take full advantage of the enhancements it offers. For more detailed information on overall third-party support, check the Release Advisor.

Vineet Khokhar
Principal Product Manager, IoT Security

Stay tuned for more updates as we approach the release of ThingWorx 9.7, and as always, in case of issues, feel free to reach out to support.ptc.com.

This post on ThingWorx status & roadmap is a preliminary version and not subject to your license agreement or any other agreement with ThingWorx. This post contains intended strategies, developments, and functionalities of the ThingWorx product. The information is furnished for information use only and is not intended to be binding upon ThingWorx to any particular course of business, product strategy, and/or development. Please note that this document is subject to change and may be changed by ThingWorx at any time without notice; accordingly, you should not rely on this data for production or purchasing decisions. ThingWorx assumes no responsibility for errors or omissions in this document.
View full tip
Original Post Date: June 6, 2016

Description: This is a video tutorial on creating a DataTable with a DataShape, and adding and retrieving an entry.
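For readers who prefer text to video, here is a minimal sketch of the same flow in a ThingWorx service, assuming a DataTable Thing named "MyDataTable" whose DataShape has fields id (the primary key) and name; the Thing, field, and value names are illustrative:

// Build a values infotable matching the DataTable's DataShape
var values = Things["MyDataTable"].CreateValues();
values.AddRow({ id: "001", name: "Pump A" });

// Add the entry to the DataTable
Things["MyDataTable"].AddDataTableEntry({
    sourceType: "Thing",
    values: values,
    source: me.name,
    tags: undefined,
    location: undefined
});

// Retrieve entries whose fields match the search values
var search = Things["MyDataTable"].CreateValues();
search.AddRow({ id: "001" });
var result = Things["MyDataTable"].FindDataTableEntries({
    values: search,
    maxItems: 10
});

CreateValues, AddDataTableEntry, and FindDataTableEntries are the stock DataTable services this pattern relies on; exact parameter lists may vary slightly by ThingWorx version, so treat this as a starting point rather than a copy-paste implementation.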
View full tip