Since its launch with ThingWorx 10.0 in June 2025, the IoT Streams feature has been a hit! Seeing the creative ways customers are using it to add value to the ThingWorx ecosystem has exceeded my expectations.
IoT Streams uses Apache Kafka or Azure Event Hubs as a distributed streaming platform, serving as a robust message broker and data pipeline. It empowers industrial data management by giving analytics, reporting, and generative AI workloads access to ThingWorx contextualized data. It streams ThingWorx data to data lakes and cold archival storage while keeping hot data available for real-time insights, and it improves the platform's scalability and reliability by reliably processing high event volumes through a durable message broker.
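To give a flavor of what sits on the consuming side of that pipeline, here is a minimal sketch of a Kafka consumer reading ThingWorx contextualized events from an IoT Streams topic. The topic name, broker address, consumer group, and payload handling are illustrative assumptions; substitute the values from your own IoT Streams configuration.

```python
# Minimal sketch: consuming ThingWorx IoT Streams events from a Kafka topic.
# Topic name, broker address, and consumer group below are assumptions for
# illustration -- use the values from your own IoT Streams configuration.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "thingworx-iot-stream",              # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker address
    group_id="analytics-pipeline",       # illustrative consumer group
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Hand the contextualized event off to your analytics, reporting,
    # or data-lake ingestion layer here.
    print(event)
```

The same stream can feed Azure Event Hubs consumers instead, since Event Hubs exposes a Kafka-compatible endpoint; only the connection settings change.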
Working with early adopters, our services team has authored a best practices guide, along with sample code, to help you get started with IoT Streams. The GitHub repository at https://github.com/PTCInc/iot_stream provides practical guidance and samples for leveraging ThingWorx IoT Streams.
In addition, with the recent ThingWorx 10.0.1 maintenance release, users can send custom JSON payloads to external messaging systems using the new WriteJSONToQueue service.
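As a hedged illustration (not taken verbatim from the product documentation), the snippet below shows one way to invoke WriteJSONToQueue through the standard ThingWorx REST service-invocation endpoint. The server URL, application key, Thing name, and input parameter name are all placeholders; check the service definition in Composer for the actual Thing and required inputs.

```python
# Sketch: sending a custom JSON payload via the WriteJSONToQueue service
# over the ThingWorx REST API. "MyIoTStream" and the "payload" parameter
# are hypothetical names used only for illustration.
import requests

THINGWORX_URL = "https://thingworx.example.com/Thingworx"  # assumed server URL
APP_KEY = "your-application-key"                           # assumed app key

response = requests.post(
    f"{THINGWORX_URL}/Things/MyIoTStream/Services/WriteJSONToQueue",
    headers={
        "appKey": APP_KEY,
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    # Illustrative custom JSON payload to publish to the external broker.
    json={"payload": {"assetId": "Pump-101", "status": "RUNNING"}},
    timeout=10,
)
response.raise_for_status()
```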
Lastly, I'm enclosing an excerpt from our ThingWorx 10.0 webinar, where you can see a demo of IoT Streams and hear directly from the ThingWorx and Microsoft teams on how it enables integration between ThingWorx and Microsoft Fabric:
For the full video, check out the replay of the ThingWorx 10.0 webinar here: https://www.ptc.com/en/resources/industrials/webcast/thingworx-10-launch-replay
So check it out, try IoT Streams in your projects, and share how you're using it in the comments. For questions or support, please don't hesitate to log a technical support ticket.
Cheers,
Ayush Tiwari
Director, Product Management
