Data is NOT free. It is easy to overlook the cost of data collection, but all data incurs some cost when it is collected. Data collection in and of itself does not bring business value. If you don’t know why you’re collecting the data, then you probably won’t use it once you have it.
For a wireless product, the cost is felt in the bytes transferred, which makes for an expensive solution but happy telcos. Even for wired installations, data transfer isn't free. Imagine a supermarket with 20 checkout lanes sharing a single 56K DSL line with the credit card terminals; it is important to upload only the necessary data during business hours.
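To make the constraint concrete, here is a rough back-of-envelope calculation for the supermarket scenario. The business-hours figure is an illustrative assumption, not a number from the scenario:

```python
# Back-of-envelope bandwidth budget for a shared 56K line.
# BUSINESS_HOURS is an assumption for illustration.

LINE_KBPS = 56          # shared 56K DSL line
LANES = 20              # checkout lanes
BUSINESS_HOURS = 12     # assumed hours/day the line must stay responsive

line_bytes_per_sec = LINE_KBPS * 1000 / 8            # 7,000 bytes/s total
per_lane_bytes_per_sec = line_bytes_per_sec / LANES  # 350 bytes/s per lane,
                                                     # before the card terminals take their share

daily_budget_mb = line_bytes_per_sec * 3600 * BUSINESS_HOURS / 1_000_000
print(f"Per-lane budget: {per_lane_bytes_per_sec:.0f} bytes/s")
print(f"Daily ceiling: {daily_budget_mb:.0f} MB shared by everything on the line")
```

At roughly 350 bytes per second per lane, an unfiltered "send everything" strategy saturates the line almost immediately.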
For the end user, too much data leads to information clutter. Too much information increases the time necessary to locate and access critical data.
All enterprise applications carry some associated "Infrastructure Tax", and the Axeda Platform is no exception. This is the cost of maintaining the existing infrastructure, as well as of increasing capacity by adding new systems infrastructure. It includes:
The cost of the physical hardware
The additional software licenses
The cost of the network bandwidth
The cost of IT staff to maintain the servers
The cost of attached storage
Optimizing your data profile will maximize the performance of your existing infrastructure. Scaling decisions should be based on actual load: 50,000 well-defined Assets can yield less data than 2,000 extremely "chatty" Assets.
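The load comparison is easy to sketch with assumed reporting rates (the rates below are hypothetical, chosen only to illustrate the point):

```python
# Compare total daily message volume of two hypothetical fleets.
# Reporting rates are assumptions for illustration.

def daily_messages(assets: int, messages_per_hour: float) -> int:
    """Total messages per day for a fleet reporting at a fixed rate."""
    return int(assets * messages_per_hour * 24)

quiet_fleet = daily_messages(50_000, 1)     # 50,000 assets, one well-chosen message/hour
chatty_fleet = daily_messages(2_000, 360)   # 2,000 assets, one message every 10 seconds

print(quiet_fleet)    # 1,200,000 messages/day
print(chatty_fleet)   # 17,280,000 messages/day
```

Under these assumptions the smaller, chattier fleet generates over fourteen times the load of the larger, well-profiled one.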
Types of Data
To develop your data profile, first identify the types of data you’re collecting.
"Actionable Data": This is used to drive business logic. This is your most crucial data, and tends to be "real-time"
"Informational Data": This changes at a very low rate, and represents properties of your assets as opposed to status
"Historical Data": Sometimes you need to step back to appreciate a work of art. Historical data is best viewed with a wide lens to identify trends
"Payload Data": Data which is being packaged and shipped to an external system
Actionable Data
Actionable Data controls the flow of business logic and has three common attributes:
It tends to represent the status of the Asset
It is typically the highest-priority data you will receive
It usually has more frequent occurrences than other data
Informational Data
Informational Data is typically system or software data describing the properties of an asset rather than its status.
Historical Data
Historical Data represents the results of long-term operations and is typically used for operational review of trends. It:
May be sourced from Data Items, File uploads, or Web Services operations
May feed the Axeda integrated business intelligence solution, or internal customer BI systems
Payload Data
Payload Data travels through the Cloud to your system of record. In this case, the Axeda Platform is a key actor in your system, but its presence is not directly visible to the end user.
Data Types Key Points
Understanding the nature of your data helps to inform your data collection strategy. Each of the four primary types (Actionable, Informational, Historical, and Payload) calls for its own handling.
Knowing what to store, what to process, and what to pass through for storage is the first key to optimizing your data profile. The "everything first" approach is an easy choice, but a tough one from which to realize value. A "bottom up", use-case-driven approach adds data incrementally and reveals the subset of data you actually need to be collecting.

Knowing your target audience for the data is the next step. A best practice for understanding who is trying to innovate, and how they are looking to do it, begins with questions such as the following:
Is marketing looking for trends to highlight?
Is R&D looking for areas to improve the product?
Is the Service team looking to pro-actively troubleshoot assets in the field?
Is Sales looking to sell more consumables?
Is Finance trying to resolve a billing dispute?
Answers to these questions will help determine which data contributes to solving real business problems. Most Service technicians access only a handful of pieces of information about an Asset while troubleshooting, regardless of how many they have access to. It is important to close the information loop by finding out which data is actually being used.

In addition to understanding the target audience and their goals, milestone events are opportunities to revisit your strategy, such as:
New Model rollouts
Migration to the Cloud
New program launch
Once your data profile has been established, the next phase of optimization is to plan the way the data will be received.
Data Item vs. File Upload
A decision should be made as to the best way to transfer data to the Axeda Platform, whether that is data items, events, alarms, or file transfers. Here is a fairly universal best-practice approach:
Choose a Data Item if: (a) you are sending Actionable Data, or (b) you are sending discrete Informational Data
Choose a File Upload if: (a) you are sending bulk data that does not need to trigger an immediate response, or (b) you intend to forward the data to an external system
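The decision rule above can be captured in a few lines. This is a sketch of the logic only; `choose_transport` and its flags are hypothetical names, not Axeda API calls:

```python
def choose_transport(actionable: bool,
                     bulk: bool,
                     forward_to_external: bool) -> str:
    """Apply the best-practice rule: data items for actionable or discrete
    informational data; file uploads for bulk or pass-through payloads."""
    if actionable:
        return "data_item"       # needs an immediate, rule-driven response
    if bulk or forward_to_external:
        return "file_upload"     # no immediate trigger needed, or destined elsewhere
    return "data_item"           # discrete informational values default to data items

print(choose_transport(actionable=True, bulk=False, forward_to_external=False))   # data_item
print(choose_transport(actionable=False, bulk=True, forward_to_external=False))   # file_upload
```

Note that "actionable" wins: even bulk-sized data that must trigger business logic belongs in the real-time path.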
Agent-Side Business Logic
Keep in mind that the Axeda Platform allows business logic to be implemented before any data is transmitted. The Agent can be configured to determine when data needs to be sent via several mechanisms:
Scripts provide the ability to trigger on-demand uploads of data, either via a human UI interaction or an automated process
The "Black Box" configuration allows for a rolling sample window, and will only upload the data in the window based on a configured condition
Agent Rules
Agent Rules allow the Agent to monitor internal data values and decide when to send data to the Cloud. Data can be continuously sampled and compared against configured thresholds to determine when a value worth transmitting is encountered. This provides a powerful mechanism for filtering outbound data.
The example below shows a graphical representation of how an Agent might monitor a data flow and transmit only when it reaches an Absolute-high value of 1200:
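As a sketch of that rule, a per-sample filter that transmits only on an absolute-high of 1200 might look like the following (the function name is illustrative, not an Agent API):

```python
ABSOLUTE_HIGH = 1200  # threshold from the example above

def filter_outbound(samples):
    """Yield only the samples worth transmitting to the Cloud."""
    for value in samples:
        if value >= ABSOLUTE_HIGH:
            yield value

stream = [900, 1150, 1200, 1250, 800, 1500]
print(list(filter_outbound(stream)))  # [1200, 1250, 1500]
```

Of the six samples, only three cross the threshold and leave the Agent; the rest never consume bandwidth.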
Axeda provides a versatile platform for managing the flow of data through your Asset ecosystem. It pays to cultivate an awareness not only of what the data set is, but of what it represents and to whom it has value. While data may seem cheap, the hidden costs of data transmission make it worthwhile to do your "data profiling homework", or risk paying a high price over the long term.