We know that for Creo Simulate there was a limit on the memory that could be addressed. This was independent of the physical memory in the computer, swap space, or solram.
If an analysis needed more than 32 Gb, it would stop with an error message.
We had a discussion about this on this forum in May this year (with Jonathan, Patrick, Steven, Gunter, David).
I am pleased to be able to tell you that the limit has been moved.
After switching to Creo 3.0 M060, I ran an analysis which used 40 Gb (which simply would not have completed in Creo 2.0).
45961 solid elements, 10 contacts, elasto-plastic analysis with 46 output steps:
Analysis "Hefboom_finesmesh_bigmemoryjob" Completed (08:44:04)
------------------------------------------------------------
Memory and Disk Usage:
Machine Type: Windows 7 64 Service Pack 1
RAM Allocation for Solver (megabytes): 512.0
Total Elapsed Time (seconds): 54376.72
Total CPU Time (seconds): 71162.71
Maximum Memory Usage (kilobytes): 40571058 (that's 40 Gb)
Working Directory Disk Usage (kilobytes): 4847616
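As a quick sanity check on the units in the log above (assuming the log's "kilobytes" are decimal kB, 1 kB = 1000 bytes, which is an assumption on my part):

```python
# Convert the solver log's peak memory figure to gigabytes.
# Assumption: the log reports decimal kilobytes (1 kB = 1000 bytes).

def kb_to_gb(kb: int) -> float:
    """Decimal kilobytes -> decimal gigabytes."""
    return kb * 1000 / 1e9

def kb_to_gib(kb: int) -> float:
    """Decimal kilobytes -> binary gibibytes."""
    return kb * 1000 / 2**30

peak_kb = 40571058  # "Maximum Memory Usage (kilobytes)" from the log
print(f"{kb_to_gb(peak_kb):.1f} GB")    # ~40.6 GB (decimal)
print(f"{kb_to_gib(peak_kb):.1f} GiB")  # ~37.8 GiB (binary)
```

Either way it rounds to roughly 40 Gb, comfortably past the old 32 Gb ceiling.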
I do not know where the new limit is.
I hope PTC will tell us.
regards
Erik
------------------------------------------------------------
Creo Simulate Structure Version P-10-27:spg
Summary for Design Study "Analysis1"
Thu Mar 05, 2015 07:41:10
------------------------------------------------------------
Springs: 156
Masses: 0
Beams: 76
Shells: 22302
Solids: 245498
Elements: 268032
Description:
REF7 - first 100 modes (and hope we don't run out of disk)
Modal Analysis "Analysis1":
Machine Type: Windows 7 64 Service Pack 1
RAM Allocation for Solver (megabytes): 16384.0
Total Elapsed Time (seconds): 62761.70
Total CPU Time (seconds): 28772.04
Maximum Memory Usage (kilobytes): 69400130
Working Directory Disk Usage (kilobytes): 29021178
Creo 2.0 M100
It was one of those studies that simply had to be big.
Wow! 69 Gb
So it was already there in Creo 2.0 M100. I still run M010 for Creo 2.0, which would definitely not have used 69 Gb.
Has anyone reached even more?
Where's the new limit?
Hi Steven
In this case it doesn't seem to matter how much you allocate (I left it at default). It just took it.
Steven
I see your point: it is not possible to set solram to a value higher than 16384.
But that is not the issue here.
Until recently, when you had a really big model, the minimum required amount of RAM for the analysis could be more than 32 Gb.
In that case the analysis would not finish: the job would run until 32 Gb of RAM was in use and then stop with an error message. This was not influenced by the solram setting; you simply got no results.
Now (Creo 2.0 M100 and Creo 3.0 M060) we can see that Simulate goes beyond that point. My latest analysis allocated, acquired and used 60 Gb of RAM; the one Charles Simpson ran even used 66 Gb. RAM, that is.
The solram setting does not change this.
We've lost a limiting factor. We can now run bigger models. Period.
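For what it's worth, 32 Gb sits exactly on a power-of-two boundary. Purely my speculation (nothing from PTC confirms this): if the old solver addressed its workspace of 8-byte double-precision values through a 32-bit index, the addressable memory would top out at exactly 32 GiB:

```python
# Speculative arithmetic only: a 32-bit index over 8-byte doubles
# caps the addressable workspace at exactly 32 GiB.
max_doubles = 2**32        # largest count a 32-bit index can address
bytes_per_double = 8       # IEEE 754 double precision
cap_bytes = max_doubles * bytes_per_double
print(cap_bytes / 2**30)   # 32.0 (GiB)
```

That would explain why the ceiling was independent of physical RAM, swap space, and the solram setting, and why widening the index removes it.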
I would like to hear what PTC can tell us about that. Gunther, are you reading in?
Erik
It would be nice to finally lay the question of exactly how memory is used firmly to rest.