1-Visitor
February 4, 2026
Question

Windchill PDMLink and MCP

  • February 4, 2026
  • 3 replies
  • 308 views
I am using Windchill PDMLink Release 12.1, datecode CPS 12.1.0.1.

Is Windchill PDMLink compatible with, or does it offer, any MCP (Model Context Protocol) interface that we can use to feed and interact with our internal AI servers?

    3 replies

    4-Participant
    February 5, 2026

    The Codebeamer and Windchill teams are currently evaluating their roadmap plans for MCP integration. For the most up-to-date guidance, please contact your account team.

    13-Aquamarine
    February 6, 2026

    @EE_13299967 We implemented an MCP integration of Claude Desktop and Windchill, and it worked fine.

    14-Alexandrite
    February 11, 2026

    @MTH can you share some details please?

    16-Pearl
    February 19, 2026

    I have played around with developing a few different MCP servers for Windchill and tested everything from a STDIO MCP server with Cursor running locally to an HTTP/SSE MCP server with ChatGPT. I also developed utilities to read the Windchill configuration from the WRS OData domain documentation and the $metadata so the MCP has a cached definition of any custom types/attributes, etc.
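    The caching idea above can be sketched as follows. This is a minimal, illustrative Node.js sketch, not the actual implementation: the sample `$metadata` fragment, the `PTC.ProdMgmt.Source` type, and the hint text are all assumptions about what a WRS schema might look like, and a real tool would use a proper XML parser rather than regexes.

    ```javascript
    // Sketch: build a cached, human-annotated view of a WRS $metadata document
    // so the MCP server can hand the model both the schema and a plain-language
    // description of what each attribute means. Hypothetical schema fragment:
    const metadataXml = `
    <EntityType Name="Part">
      <Property Name="Number" Type="Edm.String"/>
      <Property Name="Source" Type="PTC.ProdMgmt.Source"/>
    </EntityType>`;

    // Hand-written natural-language hints, keyed by "Type.Property".
    const hints = {
      "Part.Source":
        "Where the part comes from: 'make' = manufactured in-house, 'buy' = purchased."
    };

    function buildTypeCache(xml, hints) {
      const cache = {};
      const typeRe = /<EntityType Name="([^"]+)">([\s\S]*?)<\/EntityType>/g;
      const propRe = /<Property Name="([^"]+)" Type="([^"]+)"\/>/g;
      for (const [, typeName, body] of xml.matchAll(typeRe)) {
        cache[typeName] = {};
        for (const [, propName, propType] of body.matchAll(propRe)) {
          cache[typeName][propName] = {
            type: propType,
            description: hints[`${typeName}.${propName}`] ?? null
          };
        }
      }
      return cache;
    }

    const cache = buildTypeCache(metadataXml, hints);
    console.log(cache.Part.Source.description);
    ```

    The cache is refreshed from `$metadata` at startup, so custom types and attributes show up without hard-coding.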

     

    One problem I kept running into was how to rein in the chatbots. They regularly offered to do things they were in no way capable of doing: “If you like I can x, y, z - just say the word…”. When I asked them for some very obviously wrong things, they just went along with it and even congratulated me on how insightful my request was. If I asked them for something there’s no way they could find out, they couldn’t just say, “No, I don’t know.” They were also tripped up by language; for example, my prompt was “create a list of all the manufactured parts in the Golf Cart,” but it just couldn’t figure out that it needed to query ‘$filter=Source/Value eq make’ and, after dozens of queries, claimed there were none. So I wrote utilities to preload and cache the $metadata and included natural language descriptions of what those data represent to humans.
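    One way to pin down that kind of vocabulary problem is to map known phrases onto ready-made filter fragments, so the model doesn't have to guess the schema semantics. A rough sketch, where the phrase list and attribute path are assumptions based on a typical ProdMgmt schema (note that OData string literals are single-quoted):

    ```javascript
    // Sketch: map natural-language vocabulary onto OData $filter fragments.
    const vocabulary = [
      { phrases: ["manufactured", "make part", "in-house"], filter: "Source/Value eq 'make'" },
      { phrases: ["purchased", "buy part", "bought"],       filter: "Source/Value eq 'buy'" }
    ];

    function filterFor(prompt) {
      const p = prompt.toLowerCase();
      const hit = vocabulary.find(v => v.phrases.some(ph => p.includes(ph)));
      return hit ? `$filter=${hit.filter}` : null;
    }

    console.log(filterFor("create a list of all the manufactured parts in the Golf Cart"));
    // → $filter=Source/Value eq 'make'
    ```

    Returning `null` when nothing matches gives the model a defensible way to say "I don't know" instead of inventing a query.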

     

    For the runtime I chose Node.js with npm after experimenting with Python for a while. I had the most success when I split the MCP servers into a core package plus individual MCPs for the different endpoints, like ProdMgmt and DocMgmt, or for a specific role I wanted. This was due more to my impatience to get it working than anything else.
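    The core-plus-endpoints split might look something like this. The class and function names are illustrative only; a real server would hand these tools to the MCP SDK rather than call them directly, and the base URL is a placeholder:

    ```javascript
    // Sketch: a shared core owns the tool registry; each endpoint package
    // (ProdMgmt, DocMgmt, ...) registers only its own tools against it.
    class McpCore {
      constructor(baseUrl) {
        this.baseUrl = baseUrl;
        this.tools = new Map();
      }
      registerTool(name, description, handler) {
        this.tools.set(name, { description, handler });
      }
      call(name, args) {
        const tool = this.tools.get(name);
        if (!tool) throw new Error(`unknown tool: ${name}`);
        return tool.handler(args);
      }
    }

    // Hypothetical ProdMgmt endpoint package.
    function registerProdMgmt(core) {
      core.registerTool(
        "prodmgmt.search_parts",
        "Search parts via the ProdMgmt OData domain",
        ({ filter }) =>
          `${core.baseUrl}/ProdMgmt/Parts?$filter=${encodeURIComponent(filter)}`
      );
    }

    const core = new McpCore("https://windchill.example.com/Windchill/servlet/odata");
    registerProdMgmt(core);
    ```

    Keeping the OData plumbing in the core means each endpoint package stays small enough to iterate on quickly.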

     

    One concern, I guess, is how to manage the risks of exposing Windchill to a chatbot. In the development environment I used, Windchill is not exposed to the internet. The MCP servers run privately inside an LXC container and are not exposed to the internet either. Instead, a Cloudflare Tunnel makes an outbound, encrypted connection from the LXC container to Cloudflare, which then maps public DNS names to that tunnel. When someone accesses the MCP URL, the request hits Cloudflare DNS, then travels securely down the existing tunnel to my private network to the reverse proxy (and directly to the MCP service), without having to open any inbound ports. I didn’t get around to implementing OAuth yet; I’m just using basic auth. Obviously it wouldn’t be workable this way for an enterprise; this is just how I worked during development.
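    For reference, the tunnel setup above amounts to something like the following cloudflared commands and ingress config. The tunnel name, hostname, and port are placeholders, not my actual values:

    ```shell
    # Create a named tunnel and point a public hostname at it (outbound-only;
    # no inbound firewall ports are opened).
    cloudflared tunnel create windchill-mcp
    cloudflared tunnel route dns windchill-mcp mcp.example.com

    # config.yml then maps the public hostname to the private MCP service:
    #   ingress:
    #     - hostname: mcp.example.com
    #       service: http://localhost:3000
    #     - service: http_status:404
    cloudflared tunnel run windchill-mcp
    ```

    The final catch-all `http_status:404` ingress rule is required by cloudflared so unmatched hostnames are rejected rather than forwarded.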

     

    AI is moving fast, and there are already new protocols in the works to address security concerns properly. Even OpenAI’s app connector still only allows MCPs in development mode for this reason.

    avillanueva
    23-Emerald I
    February 19, 2026

    How does this square with this statement from the license agreement?

    Without limiting the foregoing, without express written permission from PTC, the Customer is expressly prohibited from using (directly or
    via an application created by Customer or a third party) the application program interface of the Licensed Product to extract data from the
    Licensed Product for the purpose of training, fine‐tuning, or creating an artificial intelligence (AI) model or building a data source such as a
    Retrieval Augment Generation (RAG), whether for internal use or external distribution. In the event that Customer receives such permission,
    all users of any application(s) that leverage such AI model or data source must have a Registered User license to the Licensed Product,
    regardless of whether such users in fact access the Licensed Product directly (and in the event Customer acts in contravention of the
    foregoing restriction, the requirement to assign a Registered User license to such users shall not be PTC’s sole remedy). Also, all users of any
    application(s) that leverage such AI model or data source must only use PTC‐supported APIs of the respective Licensed Product. The parties
    acknowledge that the way the Licensed Product structures data and respective databases are proprietary and any permission from PTC to
    access the Licensed Product with application(s) that leverage such AI model or data source is not meant to derogate from the proprietary
    nature of such data structures and databases, and Customer may not recreate such data structures and/or databases that are part of the
    Licensed Products.
    16-Pearl
    February 19, 2026

    Well, I wasn’t aware of that clause so thanks for pointing it out.

     

    This relates specifically to extracting data from Windchill for the purpose of training, fine-tuning, or building AI models or RAG-style data sources.

     

    In my case, the MCP servers aren’t extracting or persisting Windchill data for AI training or for building a knowledge base. A grey area might be the OData WRS `$metadata` that is persisted in a cache and refreshed from Windchill at runtime; however, that lives in the MCP server’s cache, not in an AI RAG store or model context, and it isn’t used for training, tuning, or building AI models.