Attached is the data-area growth graph for my Windchill production server.
The curve is climbing steadily into the terabytes and is creating serious storage costs.
How do you manage space here?
I keep all versions in the system — do you keep them too, or do you have another recommendation? Thanks.
Versions might not be your issue, but iterations certainly are. You need to purge. Purge rules can be set up to run automatically. I have some queries that help identify the largest files and the contexts with the most purgeable items (see my presentation from 2022 PTCUser). After you purge, you should also run the Remove Unreferenced Files job on your vaults to delete content that is no longer referenced in the database. When you delete items in Windchill, their content files are not deleted immediately; they just become unreferenced. This job removes those files from the vaults for good.
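To give a flavor of the kind of query I mean, here is a minimal sketch that ranks vaulted content by total size. The table and column names (ApplicationData, FILENAME, FILESIZE) are what I recall from a typical Windchill schema, and the paging syntax is Oracle 12c+ — verify all of it against your own install before running anything.

```sql
-- Hedged sketch: largest vaulted files by accumulated size across iterations.
-- ApplicationData holds one row per stored content file in a typical
-- Windchill schema; FILESIZE is in bytes. Adjust names for your install.
SELECT   FILENAME,
         COUNT(*)                  AS stored_copies,
         SUM(FILESIZE)/1024/1024   AS total_mb
FROM     ApplicationData
GROUP BY FILENAME
ORDER BY total_mb DESC
FETCH FIRST 50 ROWS ONLY;
```

Run something like this read-only first; it only tells you where the bytes are, so you can target your purge rules at the worst offenders instead of purging blindly.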
If I remember right, we were previously on an unsustainable data path due to publishing. This was several years ago, but I think we were publishing a STEP file on every check-in. We turned that off, and now users publish on demand, only when needed.
You can configure STEP and PDF publishing to run only upon release. That cuts the volume down considerably, which is what we do.
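For anyone looking for where that switch lives: in our install the automatic-publish behavior was controlled from wvs.properties. The property names below are from memory and may differ by Windchill version — confirm them in your own wvs.properties (or through the WVS administration UI) before changing anything.

```properties
# Hedged sketch: wvs.properties settings controlling automatic publishing.
# Property names are from memory of a typical install -- verify before use.

# Stop publishing a viewable/STEP on every check-in:
publish.publishoncheckin=false

# Event-driven publishing (e.g. only on release) is usually driven by a
# publish rules XML file referenced from here; the path is illustrative:
publish.rulesfile=codebase/com/ptc/wvs/server/publish/publishRules.xml
```

With check-in publishing off and rules keyed to the release event, publishing still happens where it matters but stops multiplying storage on every iteration.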