We have been using Creo Elements Direct Modeling and Model Manager since the late 80's, when it was Hewlett-Packard's ME10, ME30, and WorkManager. Our Model Manager database currently has over 200,000 models and drawings, and the file storage area is about 200 GB in size. Our largest assembly contains close to 100,000 parts and takes 45 to 60 minutes to load. I am working on specifying new hardware (client computers and servers) to upgrade to. One major goal is to get these load times down as much as is reasonably possible. Our current hardware is listed below...
Clients:
Dell Precision T7400
Windows XP Pro 64-bit
Intel Xeon X5482 3.20GHz Quad Core Processors
16.0 GB of RAM
7.2K RPM Hard Disks
NVIDIA Quadro FX 5600 Graphics Cards
Server:
Dell PowerEdge 2950
Windows Server 2003 R2 64-bit
Intel Xeon 5160 3.00GHz Dual Core Processor
8.0 GB of RAM
15K RPM Hard Disks set up in a RAID 5
We have 15 users and a Gigabit Network.
My question is: where is our biggest bottleneck? I am looking at using solid state disks in our new client machines and increasing the RAM to 32 GB. Will a larger number of processor cores help with performance? I am thinking no, since I do not believe that Creo is written to use multiple CPUs. How about NVIDIA Tesla technology (offloading processes to the graphics card)? Can Creo take advantage of this? Are there any networking tricks that can help with load speeds? Multiple network cards on the server?
I am a mechanical engineer and our "computer guy" although I do not have a background in anything computer related so any help is much appreciated. Thanks!
-Dan
My guess would be that the RAM in your client machines may be your bottleneck; at least, that's where I would start looking.
Start up a new session and load your large assembly. After it has loaded, see how much memory is being used. You can use Task Manager to do this, but I prefer to use this command in Modeling:
(uic-show-memory-usage)
It will tell you how much memory you are using. For example:
Memory requested from system: 440.3 MB
If this value is more than 16,000 MB, then you have exceeded the amount of RAM in your computer and it will start using your hard disk, which is significantly slower.
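If you prefer to check from outside Modeling, Task Manager's numbers can also be read from a command prompt with the Windows tasklist command. A minimal sketch, assuming the Modeling process is named SolidDesigner.exe (that name is an assumption; check Task Manager for the actual executable name on your install):
rem List the Modeling process and its memory usage (process name is an assumption)
tasklist /FI "IMAGENAME eq SolidDesigner.exe" /FO LIST
The "Mem Usage" value it reports is the working set, which should be in the same ballpark as the figure from (uic-show-memory-usage).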
If that doesn't explain it, you might try using the Partial and Lightweight load options and see how much they help. And when you are doing that, pay attention to the steps that ModelManager is taking to load and see which ones seem to take the longest (for example, if you are loading "Highest Version", does it take a while for ModelManager to find the highest versions?).
Thanks for the reply, Peter. We will definitely be increasing the RAM in both our clients and server when we upgrade. We currently use the Partial or Lightweight load options when we can, and they do help a bit. PTC support suggested teaming NICs on the server and "tuning" the database. I'm going to have to do a bit more research on that.
Hello Daniel,
Can you please explain a bit about the need to load a 100,000-part assembly in one go? I'm sure you have a good reason for it, but I am curious! We design printers that can contain up to 25,000 parts, but we never load the entire machine. Instead, we have divided the printer into relatively small sub-assemblies. Individual engineers load, say, 3 or 4 sub-assemblies of their own plus a few in the proximity of their design area belonging to other engineers (possibly lightweight), and store this group as one personal design. Load times are 5-10 minutes. So we have solved it through working methods. We have been working like this for over 14 years, and hardware was much more limited then. Nowadays we have more powerful hardware, but yours seems to be faster than ours. Of course, it's never fast enough.
I work in a research facility that contains more than 20 instruments located around a source. Each instrument contains an average of about 3,000 unique parts. On occasion we have to load the top-level assembly of our entire facility, which contains the instruments, the building with utilities, and other miscellaneous items, in order to create layout views, plan for new instruments, create installation configs, or perform other similar tasks. The last time I checked, this top-level assembly contained about 85,000 unique parts.
Our new client computers are now up and running, and the load time of this top-level assembly has been cut in half! The new computers are running Windows 7 64-bit with 3.30 GHz quad core processors, SSDs, 32 GB of RAM, and Quadro 5000 video cards. Dynamic video performance (panning, rotating, zooming, etc.) is now the same whether I have a single part loaded or the top-level assembly loaded.
I still hope to upgrade our server but have not done so yet.
Daniel and Ernest,
When you say 100,000 parts and 25,000 parts respectively, are these unique parts or the total number of parts in your model?
The number of unique parts in our top-level assembly is about 85,000.
Daniel,
You say: "Our largest assembly contains close to 100,000 parts and takes 45 to 60 minutes to load."
Is that from ModelManager or from the hard disk?
From Model Manager. Sorry for not specifying.
Daniel,
Do you load your 100,000-unique-part assembly from ModelManager in one shot?
Yes, all in one shot from the top level assembly in Model Manager.
How can you load 100,000 unique parts in one shot from ModelManager?
We have an assembly of 60,000 unique parts which we cannot load in one shot. We can see that the ModelManager.exe process reaches the limit of its 32-bit memory capacity and stalls before it can transfer all the data to Modeling.
Do you have any tuning that allows working around the 32-bit memory limit of ModelManager.exe?
Yes, I actually do. In order to load large assemblies from Model Manager I increase the Java heap space. This is done by editing the run.bat file in the Model Manager install directory (C:\Program Files (x86)\PTC\Creo Elements\Direct Model and Drawing Manager 18.1\run.bat). I increase the maximum Java heap size to 1 GB by changing the Xmx value on line 25 to 1024m. When done, line 25 should read "set JVM_ARGS=%JVM_ARGS% -Xms50m -Xmx1024m" (see the excerpt below). Hopefully this helps.
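For reference, this is roughly what that portion of run.bat looks like before and after the edit. The install path, exact line number, and the default value can differ between Model Manager versions, so treat this as a sketch and locate the JVM_ARGS line in your own file:
rem Excerpt from run.bat (location and defaults vary by version)
rem Before - default maximum Java heap (reported later in this thread as 256m):
rem   set JVM_ARGS=%JVM_ARGS% -Xms50m -Xmx256m
rem After - maximum Java heap raised to 1 GB:
set JVM_ARGS=%JVM_ARGS% -Xms50m -Xmx1024m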
Increasing the Java maximum memory to 1024 MB also has drawbacks.
The overall memory for the WM C-kernel plus Java is limited (approx. 2 GB for the 32-bit MM),
so you may run out of memory in the C-kernel if you increase the Java memory too much.
Both types of out-of-memory (either C-kernel or Java) lead to an immediate termination of MM.
Try whether -Xmx768m is sufficient (the default is 256m, which is not sufficient for large assemblies).
Alternative methods to load huge assemblies (>100K parts):
- Load in several steps
- Use the "partial load" and reload functionality
- Introduce Containers in the structure and utilize the "partial load for Containers" preference
To track down where the bottleneck might be, divide the problem into two parts.
Export your big assembly as an SDEXP file.
1) Measure the time until the SDEXP file is created.
This is the time needed to collect all the data in the database and transfer it to a temporary directory.
2) Once the SDEXP file exists, measure the time it takes to load that file into Modeling.
With this test, you are measuring whether the computer running Modeling has enough RAM and a fast disk.
HINT: Watch out for virus scanners; they can significantly slow down performance when they "watch" the
temporary directories used to transfer files between MM and the database.
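If you want to record these timings without a stopwatch, one crude approach is to log timestamps from a command prompt between the manual steps. A minimal sketch, assuming a hypothetical export target of C:\temp\bigassy.sdexp and a log file at C:\temp\timing.log:
rem Manual timing sketch - paths are hypothetical examples
echo Export started: %DATE% %TIME% >> C:\temp\timing.log
rem ... export the assembly from Model Manager to C:\temp\bigassy.sdexp ...
echo SDEXP created: %DATE% %TIME% >> C:\temp\timing.log
rem ... load C:\temp\bigassy.sdexp into Modeling ...
echo Load finished: %DATE% %TIME% >> C:\temp\timing.log
The gap between the first two timestamps approximates step 1 (database side), and the last gap approximates step 2 (client side).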
Thanks for the responses, Max. I did a similar test on our old hardware by loading a large assembly from Model Manager, saving it as a package file to a local drive, and then loading the package file. The load from Model Manager took 2 hours and 12 minutes, while the load from the package file took just 14 minutes. Our new computers with SSDs and 32 GB of RAM are able to load this same assembly from Model Manager into Modeling in 20 minutes! We have not upgraded our server hardware, but I still hope to do so in the near future.
Please note that PKG files and SDEXP files are completely different.
PKG is basically the "pure model data"
SDEXP contains additional metadata (e.g., MASTERDATA information).
So to measure the time spent on the client machine, you should load the SDEXP data.
This is far closer to the real portion of "loading a model from the database".
Additional note: SDEXP contains what you have configured in Model Manager, e.g.
- load as stored
- OR load highest version
- Load with/without MASTERDATA
- etc.
So keep an eye on what you have configured in MM.
The time to load the SDEXP into Modeling may differ significantly.
(Everything is worth measuring; you just need to be aware of *what* you are measuring.)
Thank you. That is very helpful to know. I will keep that in mind when testing new server hardware.
Hi
The customer has given me a .sdexp file. How do I open it? Would installing Creo Direct Work Manager on my laptop allow me to open it?
Or, after opening the .sdexp file on the customer's machine, can I ask him to save it in another format that today's Creo Parametric or Creo Direct can read?
Thanks for your replies
Ravi