Everyone is talking about generative AI, and Large Language Models (LLMs) in particular. They are being used to provide customer service assistance in chatbots, automate code writing in software development, and summarize long articles, documents, and data stores, among many other use cases. LLMs can ingest huge amounts of data, process queries related to that data, find patterns, and answer natural language questions asked of the model.
Could LLMs be used to provide greater insights into factory floor operations? The answer is most definitely yes, and ThingWorx can be used to facilitate this capability. ThingWorx enables connectivity to sensors, systems, and databases on the factory floor to monitor and analyze operations. Linking ThingWorx to an LLM establishes an AI Assistant that gives floor personnel valuable insights, helping to reduce downtime and increase overall productivity.
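To make the linkage concrete, the core pattern is to gather recent sensor readings from ThingWorx and combine them with the user's question into a single prompt for the LLM. The sketch below is a minimal, hypothetical illustration of that prompt-assembly step; the sensor names, reading format, and `build_prompt` helper are assumptions for illustration, not part of the ThingWorx API, and the actual LLM call would be made through whatever endpoint your environment uses.

```python
def build_prompt(question, readings):
    """Combine a user question with recent sensor readings into one LLM prompt.

    `readings` is a list of dicts with 'sensor', 'value', and 'unit' keys --
    a stand-in for data a ThingWorx service might return.
    """
    context = "\n".join(
        f"- {r['sensor']}: {r['value']} {r['unit']}" for r in readings
    )
    return (
        "You are an assistant for factory floor personnel.\n"
        "Recent sensor readings:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical readings, as ThingWorx might surface them from connected assets.
readings = [
    {"sensor": "Press-01 motor temperature", "value": 87.5, "unit": "C"},
    {"sensor": "Line-03 throughput", "value": 412, "unit": "units/hr"},
]

prompt = build_prompt("Why is Line-03 running slower than usual?", readings)
print(prompt)
```

The resulting prompt string would then be posted to the LLM of your choice; the Best Practice document covers the actual integration steps.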
A new Best Practice has been developed that lays out the steps necessary to connect ThingWorx to an LLM. This document can be used as a recipe for building a ThingWorx application that acts as an AI Assistant for an organization’s factory floor personnel, letting users ask questions about issues on the floor and get direct answers.
In addition to the Best Practice document, we have made available a ThingWorx project file that serves as an Accelerator to build out this application much faster. Use this project as a starting point and make any modifications necessary to implement this process in your own environment.
You can find both the Best Practice document and the ThingWorx project file here: Using an LLM with ThingWorx.
Any additional feedback or questions you have can be entered below.