Using an LLM with ThingWorx - Latest Version (V2) Update!
New Best Practice: Using LLMs with ThingWorx - Version 2.0 Now Available!
Everyone is talking about Generative AI, and Large Language Models (LLMs) continue to transform how organizations analyze and interact with data. Beyond chatbots and text generation, these models are increasingly being used to make sense of complex industrial data, from factory sensors to production analytics, helping teams make faster, smarter decisions.
ThingWorx, PTC's Industrial IoT platform, enables secure connectivity to machines, systems, and databases across the enterprise. By linking ThingWorx with an LLM, you can create an AI Assistant for industrial operations that provides insights, answers natural-language questions, and supports data-driven problem solving, all while keeping your operational data secure.
🧩 What's New in Version 2.0
This new release builds on the original ThingWorx + LLM accelerator and introduces major new capabilities designed to make LLM integrations more powerful, flexible, and enterprise-ready:
Integration with ThingWorx Analytics
The accelerator now connects directly with ThingWorx Analytics Server to interpret results from Signals, Profiles, Datasets, and Predictive Models. You can ask questions like:
"What factors most influence grinder failures?"
"How accurate is the predictive model trained on my dataset?"
The LLM analyzes the context and provides clear, actionable insights in natural language.
Router-Based Architecture
Version 2 introduces a "router" service that allows the LLM to intelligently select which ThingWorx services to execute based on the user's question. This makes the system more efficient, cost-effective, and scalable: only the necessary data and context are sent to the LLM.
Dual-Tab Mashup Design
The new Insights Mashup includes two tabs:
Data Insights (V1) for simple natural-language data queries
Analytics Insights (V2) for interpreting ThingWorx Analytics jobs
The design follows PTC's corporate template for a modern, consistent experience.
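The router pattern described above can be sketched as a simple mapping from question keywords to the ThingWorx service that should supply context. This is a minimal illustration only; the pattern list and service names below are invented for the example and are not the accelerator's actual services:

```javascript
// Minimal sketch of a router: pick which ThingWorx service to execute
// based on the user's question, so only relevant context goes to the LLM.
// Service names and patterns here are illustrative assumptions.
const routes = [
  { pattern: /signal|influence|factor/i, service: "GetAnalyticsSignals" },
  { pattern: /profile/i,                 service: "GetAnalyticsProfiles" },
  { pattern: /model|accuracy|predict/i,  service: "GetPredictiveModelResults" },
];

function routeQuestion(question) {
  // Return the first matching service, with a general fallback.
  const match = routes.find((r) => r.pattern.test(question));
  return match ? match.service : "GetGeneralDataContext";
}

console.log(routeQuestion("What factors most influence grinder failures?"));
// → GetAnalyticsSignals
```

In the accelerator itself this decision can also be delegated to the LLM (asking it which service to call), but a deterministic first pass like this keeps token usage and cost down for common question types.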
Expanded Service Library & Security Enhancements
The Manager Thing now contains 17 modular services, improving transparency and maintainability.
This version also incorporates Azure OpenAI best practices for secure key management, data minimization, and compliance.
⚙️ What You Get
The Accelerator includes:
A ThingWorx project file (PTCDTS_TWXLLM_V2.xml) that can be imported directly into ThingWorx 9.5 or above
An updated Best Practice document (V2) detailing setup, configuration, and validation steps
Sample data files (DEMO_BEANPRO_DATA.csv and metadata JSON) used to demonstrate Analytics Insights
Together, these resources show how to connect ThingWorx and an LLM using secure API calls (no external libraries required) and how to extend the architecture to your own factory data and analytics models.
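To make the "secure API calls, no external libraries" point concrete, here is a sketch of how a service might assemble an Azure OpenAI chat-completions request. The endpoint, deployment name, API version, and key below are placeholders you would supply from secure configuration, never hard-coded; verify the exact URL shape and api-version against your own Azure OpenAI resource:

```javascript
// Sketch: build an Azure OpenAI chat-completions request from placeholder
// settings. Azure OpenAI authenticates with an "api-key" header and routes
// by deployment name; all concrete values here are assumptions for the demo.
function buildChatRequest(endpoint, deployment, apiVersion, apiKey, question, context) {
  return {
    url: `${endpoint}/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`,
    headers: {
      "api-key": apiKey,            // keep the key in secure configuration, not in code
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      messages: [
        { role: "system", content: "You are an assistant for industrial IoT data." },
        // Data minimization: send only the context the router selected.
        { role: "user", content: `${question}\n\nContext:\n${context}` },
      ],
    }),
  };
}

const req = buildChatRequest(
  "https://example.openai.azure.com", // placeholder endpoint
  "gpt-4o",                           // placeholder deployment name
  "2024-06-01",                       // placeholder api-version
  "YOUR-API-KEY",
  "Why did line 2 stop?",
  "Selected sensor readings..."
);
console.log(req.url);
```

Inside ThingWorx, the assembled request would be sent with the platform's built-in HTTP content-loader services rather than an external HTTP library, which is what keeps the integration dependency-free.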
🎥 Watch the New 21-Minute Walkthrough
To accompany this release, we've published a 21-minute video overview and live demo that walks through:
How ThingWorx and LLMs communicate
The new router-based architecture in Version 2
How Analytics Insights can interpret predictive model results
A full end-to-end demo showing natural-language queries in both tabs of the accelerator
Whether you're a developer, system integrator, or operations engineer, this walkthrough shows how easily you can bring Generative AI into your ThingWorx applications.
Get Started
You can download everything (the Best Practice Document, Accelerator Project, and supporting demo data) from the official GitHub repository:
ThingWorx LLM Accelerator on GitHub
Import it into your ThingWorx environment, enter your Azure OpenAI credentials, and you'll be up and running in minutes.
By combining ThingWorx's industrial connectivity with the reasoning and language capabilities of LLMs, this accelerator helps organizations turn raw IoT and analytics data into real, explainable insights, safely, securely, and efficiently.

