
More than 32 Gbyte memory

ehaenen
4-Participant


Can PTC please report on plans to break the 32 GB memory barrier? It's been predicted for a while now, but I would like to know when to order the extra memory.

I run fairly big models that could do with a bit more memory (see the picture: 32 GB in use, 425 MB/s disk I/O on my SSD).

lotsofmemory.png

Thanks.


Chris3
20-Turquoise
(To:ehaenen)

FYI - RAM is not like hard drive space. Windows and other programs will use it all no matter how much you have. What happens is that instead of replacing data in RAM with new data, the data just stays in RAM. For instance, if you opened and closed Paint and then opened 30 other programs, the data for the Paint program would likely be replaced in RAM. However, if you had unlimited RAM, it would not be replaced, and the next time you went to open it, it would likely open quicker.

Having more will help, but it may not be the big performance jump you are hoping for. In your case, having another SSD (or multiple SSDs) in RAID 0 will likely give a bigger performance gain, as I/O is usually the bottleneck.

Chris

ehaenen
4-Participant
(To:Chris3)

Hi Chris

Thanks for your information, but for large FE models this works differently:

1: When the model gets big, not all the matrices will fit in memory. The stiffness matrix gets partitioned and parts of it are placed on the hard drive (scratch files); this seriously reduces performance compared to running the full model "in core".

The high disk I/O rate shown means the software is swapping data blocks from memory to disk and back. A normal hard drive would not get higher than approx. 80 MB/s; this already reaches 500 MB/s, but it is still a limit for the CPU, which can't run at 100% for lack of data (a toy sketch of in-core versus scratch-file solving follows at the end of this post).

2: For some types of analysis, the memory limit restricts the model size: if you require more than 32 GB, the analysis will simply not run (message: the analysis requires an extra xxx GB of memory to complete).

This is a shame, because the memory limit is no longer caused by the 64-bit Windows OS. It is a limitation in the Simulate software which can and should be removed.
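
For anyone wondering what "in core" versus scratch-file solving means in practice, here is a toy sketch (purely illustrative, nothing to do with PTC's actual solver; it assumes Python with numpy is available) that times the same matrix-vector work on an in-RAM array and on a disk-backed scratch file:

# Toy illustration only: the same work done on an in-RAM matrix and on a
# disk-backed scratch file (numpy.memmap). For a matrix this small the OS
# page cache hides most of the cost; the gap becomes dramatic once the
# matrix no longer fits in RAM - which is the out-of-core case above.
import os
import tempfile
import time
import numpy as np

n = 4000                                  # small stand-in for a stiffness matrix
rhs = np.random.rand(n)

# In core: the whole matrix lives in RAM.
k_ram = np.random.rand(n, n)
t0 = time.perf_counter()
for _ in range(10):
    _ = k_ram @ rhs                       # matrix-vector products from RAM
t_ram = time.perf_counter() - t0

# Out of core: the same data backed by a scratch file on disk.
scratch = os.path.join(tempfile.gettempdir(), "k_scratch.dat")
k_disk = np.memmap(scratch, dtype="float64", mode="w+", shape=(n, n))
k_disk[:] = k_ram                         # write the matrix to the scratch file
k_disk.flush()
t0 = time.perf_counter()
for _ in range(10):
    _ = k_disk @ rhs                      # same products, now via the memmap
t_disk = time.perf_counter() - t0

print(f"in core: {t_ram:.3f} s   disk-backed: {t_disk:.3f} s")
del k_disk                                # close the mapping before deleting the file
os.remove(scratch)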

Erik

Chris,

That's true for 'normal' computer usage, but for very large FE analyses it's possible to use all the RAM - and more - just with Mechanica (or whichever solver you're using - it applies to Nastran and Abaqus too).

If the analysis requires 40 GB to run and you only have 30 GB available on your system (allowing 2 GB for the OS, Creo itself and other running processes), then the other 10 GB goes into temporary files on disk, which typically have to be written and read many times. The solver's own temporary files are generally more efficient than Windows virtual memory, so it's best to set your options to avoid using the latter; using an SSD, as Erik is, is clearly faster than a traditional HDD (particularly as the reads and writes are generally non-sequential), but it's not as fast as keeping the whole analysis in RAM.
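
To make that arithmetic concrete, here is a minimal sketch of the check (nothing Creo-specific; it assumes Python with the third-party psutil package, and the 40 GB figure is just the example from this post):

# Minimal sketch: will an analysis with a known (estimated) memory need stay
# in core, or will part of it have to spill to temporary files on disk?
import psutil  # third-party: pip install psutil

required_gb = 40.0                        # estimated analysis requirement (example figure)
available_gb = psutil.virtual_memory().available / 1024**3

if required_gb <= available_gb:
    print(f"{required_gb:.0f} GB fits in the {available_gb:.1f} GB currently free: stays in core.")
else:
    spill_gb = required_gb - available_gb
    print(f"About {spill_gb:.1f} GB will have to live in scratch files on disk: expect heavy I/O.")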

Since it's now perfectly possible, and even affordable, to spec a workstation with 64 GB of RAM, it would make sense to allow Mechanica to use it all. Mind you, it's only a few years since the 8 GB restriction was raised...

gkoch
1-Newbie
(To:ehaenen)

There is a Product Idea, and a Product Manager is already involved in the discussion.

Yet according to recent notes, the limit is at 16 GB and some tests showed performance problems when increasing the limit.

See Product Idea: Creo Simulate 3.0 RAM Allocation for 64-bit Windows 7 Ultimate OS

dschenken
21-Topaz I
(To:gkoch)

First off, it would be odd to know there can be "a small performance degradation" in testing without there being a version of the software that can run that way. So it's possible to compile a large-memory version.

Is there some reason that version is not available to those users who need to exceed the 16 GB limit? For the models that need it, it's a 100% degradation of performance.

Surely removing limits and compiling without them is no big issue - a lot of people in our development could do this.

But if our internal tests reveal problems that cannot be addressed, such a version will not be released.

Obviously this was the case here, so the limit persists for the moment, until the performance problems can be solved.

For clarification: the performance degradation with a larger memory setting, compared to smaller settings, was observed for models that exceed the current memory limit.

dschenken
21-Topaz I
(To:gkoch)

The description made it sound as if smaller models would run slower with the large-memory switch compiled in, not that there was some general or fatal problem. It did not seem from the description that large amounts of code rewriting were required to expand the memory capability, such as the code using 24-bit indexing and needing the related functions changed to handle 32- or 64-bit references.

Even that can be a valid reason to not release it.

The question is whether you wish to accept a performance penalty for 99% of the users in order to accommodate the 1% of users who are now running into the limitations of the software.

Although one can wonder about possible compromises; a config option to enable more than 32 GB springs to mind.

Release it to whom? There's no reason not to release both.

ehaenen
4-Participant
(To:ehaenen)

Did my analysis just use 37 Gbyte?

I've moved to Creo 3.0 M060, and it looks like somebody moved the 32 GB limit while we weren't watching, or am I mistaken:

Look at my summary file: that's 37 GB in the line with Maximum Memory Usage.

Machine Type: Windows 7 64 Service Pack 1

  RAM Allocation for Solver (megabytes): 512.0

  Total Elapsed Time (seconds): 19670.67

  Total CPU Time    (seconds): 20739.46

  Maximum Memory Usage (kilobytes): 37260651 

  Working Directory Disk Usage (kilobytes): 1003520
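
(For scale: if those kilobytes are binary units, 37,260,651 kB ÷ 1024² ≈ 35.5 GiB; read as decimal units it is about 37.3 GB. Either way the figure is above the old 32 GB ceiling.)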

If I see this correctly: thanks, PTC.

Erik

From previous discussions on here, I believe the 'limit' refers to the SOLRAM value (512 MB in your summary).  This is only part of the total 'memory take', which is SOLRAM plus another amount of memory, which seems to be fixed for a particular model (or analysis, or mesh, or similar).

In the more general case, a run with, say, 8 GB SOLRAM may well use 10 GB in total - although I've noticed that the "Maximum Memory Usage" rarely agrees with the memory usage reported for msengine.exe by Task Manager. If your workstation is RAM-limited and your model is large (as yours appears to be!), it's sometimes necessary to reduce SOLRAM to keep the total memory usage within the available RAM.
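
If that model-dependent amount really is roughly fixed, a back-of-the-envelope way to pick SOLRAM might look like the sketch below (an assumption based on the "total = SOLRAM + fixed overhead" reading above, not a documented PTC formula; the 30 GB and 22 GB figures are made-up examples):

# Rough sketch: choose the largest SOLRAM (in MB) that keeps
# SOLRAM + model-dependent overhead inside the RAM you allow the solver.
def suggest_solram_mb(ram_budget_gb, fixed_model_gb, floor_mb=128):
    headroom_mb = (ram_budget_gb - fixed_model_gb) * 1024
    return max(floor_mb, int(headroom_mb))

# Example: 30 GB allowed for the solver, 22 GB of model-dependent usage
# observed in an earlier summary file -> leave roughly 8 GB for SOLRAM.
print(suggest_solram_mb(ram_budget_gb=30, fixed_model_gb=22))   # prints 8192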

Hi Jonathan

What I intended to post is different: whatever you did with SOLRAM, 32 GB used to be the absolute maximum amount of memory that Simulate could address, be it physical memory or swap space.

The discussion here has been to ask PTC to raise the 32 GB ceiling to a higher value.

As the reproduced part of the summary file shows, it seems that PTC has done it: my analysis used 37 GB, which was hitherto impossible; the analysis would have stopped with the fatal message that "the system was requesting an additional memory of ... GB", which was not possible.

Now with Creo 3.0 M060, it just did it.

Good news, I think.

Erik
