What do you do when your model reaches Simulate's 16 GB limit

rbrooks
2-Explorer

Anybody know what can be done when the limits of Creo's capabilities are reached?

I have a large assembly which I need to do dynamic analysis on. Unfortunately the modal analysis fails because it requires more memory than Simulate can handle, and it stops the analysis with a "fatal error".

Unfortunately I have been given the job of analysing how bolting these assemblies together increases their strength, so any improvements I make by simplifying the assemblies will be quickly wiped out.


DenisJaunin
15-Moonstone
(To:rbrooks)

Hello, Richard,

You can simplify your parts by removing the shelves, chamfers and other unnecessary details.

Can you use the symmetry function?

Cordially.

Denis.

Good advice from Denis. Also if you are using mesh controls, shoot for larger elements where appropriate. Max element size controls on surfaces and edges are much thriftier than on components and volumes. Also consider using shell and beam elements where appropriate.

Almost any complex model can be simplified for the purpose of obtaining meaningful FEA results. Some details matter, most don't. Figuring out which is which takes either experience and/or a number of smaller trial models.

EDIT: Also try AutoGEM > Geometry Tolerance > Increase the 0.0001 defaults to something larger. That will auto-smooth fine details.

Message was edited by: David Malicky

What do you mean by "Simulate's" 16 GB limit? I've run models that have gone upwards of 40 GB. What is your SOLRAM set to? How much memory does your machine have? How much virtual memory do you have set aside for your machine?

Shaun,

I have 128 GB available on my 64-bit machine and two 2.7 GHz processors. I have tried to increase my solver settings memory allocation box to anything above 16 GB and get the error "The maximum memory allocation (solram) is 16384 MB.". I went into my config.pro settings and found the "sim_solver_memory_allocation" parameter and set the value to 65536, and I still get the error. I can't find a "solram" parameter anywhere. Any ideas?
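
In case it helps anyone checking my setup, the line I added to config.pro was simply:

sim_solver_memory_allocation 65536

Since the value appears to be read as MB, that is a request for 64 GB, well above the 16384 MB cap quoted in the error message; from the message it looks like anything up to 16384 should be accepted.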

Thanks

Jesse

Jesse,

You are correct, there is a 16 GB SOLRAM limit in Creo Simulate. Prior to Creo Simulate, if I remember correctly, it was 8 GB in Wildfire 4 and earlier.

I know PTC is looking into expanding the limit for a future release of Creo Simulate.

Not sure if you have tried taking part of your physical RAM and turning it into a RAM drive to handle the temp files Simulate creates, to try to speed things up?

Hope this Helps,

Don Anderson

rbrooks
2-Explorer
(To:danderson)

Not sure if you have tried taking part of your physical RAM and turning it into a RAM drive to handle the temp files Simulate creates, to try to speed things up?

Yes, this makes a big difference to speed, but speed is pretty much irrelevant when the model won't run and there isn't a satisfactory explanation of what is wrong.


I am told by our technical support that there is a limit on model size of 16 GB, and as my model is larger than this it won't work in Simulate. I haven't found anything to disprove this yet.

I am running a 4-core machine with 50 GB of RAM and SOLRAM set to 16 GB.

Static analysis works fine, but modal analysis fails with a "disk read error", which I am told is a generic error message that comes up with various problems. I am also told that the problem is the size of the model.

One problem I have is that it's going to be difficult to simplify the model, as I need to compare the effects of dynamic shock on one unit (which is over 16 GB) to three units bolted together.

As far as I can see there are two options:

1. Creo Simulate is not fit for purpose because it won't work with bigger models.

2. Our technical support is not fit for purpose, as they have no idea what the problem is, and Creo Simulate is less than ideal due to the complete absence of any helpful information about why it is crashing.

I have undoubtedly run much larger models. Look for the SOLRAM/memory discussion in one of these forums and a post from Tad Doxsee. If I remember correctly, he did a pretty good job of explaining what is going on with memory usage.

Don't max out SOLRAM. Use a smaller setting or even the default. SOLRAM increases can improve the speed of the run, but jacking up the setting doesn't help you run larger models. Having the memory available in physical or virtual RAM is what you need. Your model sizes are basically limited by those two and by how much time you are willing to surrender.
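
To put rough numbers on that (an illustration only, not figures from your run): on a machine with 50 GB of physical RAM, a SOLRAM of, say, 4 to 8 GB leaves 40+ GB free for the operating system to cache the solver's temporary files, which is what actually lets a model with a 20+ GB global matrix profile get through; pushing SOLRAM up to 16 GB just shrinks that cache without raising any model-size ceiling.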

I agree with Brad. I have a 24 GB, quad-core machine and have solved dynamic random, time and frequency analyses with one of my models' global matrix profiles in excess of 60 GB (took forever). I am not sure you are correctly interpreting whether your model can "fit" in SOLRAM. It does not have to, and with the excess RAM you have available for disk caching I think your model should run without error, assuming you have it set up correctly. When you mention an assembly bolted together, I worry that is where your problem may lie, depending on the idealizations you have used. I will say that Simulate 2.0 has been able to solve everything we have tried once our simulation was correctly set up.

Read this thread, which has a reply from Tad Doxsee that does a good job of explaining some of what goes on under the hood of Simulate.

http://communities.ptc.com/thread/33467

Hope that helps and good luck,

I'm not sure I follow the logic of "it's going to be difficult to simplify the model as I need to compare the effects of dynamic shock on one unit (which is over 16 GB) to three units bolted together." That implies it is a complex problem, yes, but not that it can't be simplified. A CAD model is not an FEA model. FEA analysts have been working with far fewer GB for decades and have solved very complex problems, with appropriate simplifications.

For speed, writing the temp files to an SSD makes a big difference, especially if setting SolRAM to a low number (see my graphs in the link Jarret posted).

Hi David

The logic may be flawed, but if, as I am being told, the roughly 16.5 GB model I am using at the moment is too big to run, then it may not be a big job to simplify it to get it under 16 GB; it's going to be a lot harder to simplify a 49.5 GB model made up of three assemblies to under 16 GB.

The question of "appropriate simplifications" is an interesting one. From past experience, every simplification we introduce to a model brings an objection from our customer and a request for a report justifying the variation from the design.

Hi Richard,

What value are you calling the model's "size"? Global matrix profile? Element file? Something else?

Hi Richard,

Hmm, sounds like you have a customer that doesn't understand FEA. That may indeed be hard to change. Possibly some education on their part would help:

http://creo.ptc.com/2011/05/19/creo-1-0-and-%E2%80%98cae-and-fea-workflow%E2%80%99/

http://www.adina.com/newsgH108.shtml

http://books.google.com/books?id=eG-y546wpSgC&pg=PA179

Or, perhaps they do understand FEA and have seen the effect of bad assumptions.

Defeaturing/cleanup from 50 GB to 10 GB or even 1 GB may be very reasonable, depending on the purposes of the analysis, the model geometry, and the reasons why 50 GB is being generated.

As a general rule:

- If the goal of the analysis is to find overall stiffness or modal shapes/frequencies, small features like bolt heads, threads, fillets, chamfers, small holes, etc., do not matter.

- If the goal is to find stresses in predictable locations or on broad surfaces, small features away from those locations do not matter.

- If the goal is to find stresses in inside corners, fillets, etc., then those small features are very, very important and need extra elements. Do not defeature those, e.g.: http://books.google.com/books?id=ewHIbtBQ2mQC&pg=SA3-PA3 But chamfers and radii on *outside* corners generally do not matter, unless they are 'large'.

Another main question is why the single model generates 16 GB. Most of the elements usually come from the smallest and least important features (and often small modelling mismatches), unless there are excessively tight mesh controls.

Often CAD modelers or engineers are given FEA problems to solve without also receiving much FEA training, or guidance from an experienced analyst; sounds like you may be in that boat? There is a lot that can go wrong in FEA if the model isn't set up, meshed, and analyzed appropriately. Even after one semester of an FEA class, students often make mistakes that make the results meaningless. I ask because it sounds like you're fairly new to FEA and your company has dropped this in your lap without supporting you?

Another main question is why the single model generates 16 GB. Most of the elements usually come from the smallest and least important features (and often small modelling mismatches), unless there are excessively tight mesh controls.

The model is an assembly of 132 parts; all parts have been simplified by removing unused holes, irrelevant details, fillets and chamfers.

The maximum memory usage from the .rpt file is 22265439 KB, which the company who provide our Creo technical support have told me "is so big that it needs to use more memory than Mechanica allows and hence it crashes." Looking at the responses from Brad and Jarret here, this isn't correct, which answers my original question, and I can take it back to our support company (thanks Brad and Jarret, and thanks to everyone who has taken time to respond here).

Another issue I have with this model is that I don't have the time/budget to do any work on it. The model was produced by an ex-colleague who isn't here anymore, and my original reason for looking into it was that I believed the approach taken of only analysing one unit and looking at static results rather than dynamic was wrong. I proved this in half an hour using analogous simplified models of multiple units and conventional calculations, and the project is continuing without the need for further FEA at this stage.

Given a week to look at every part in the assembly, decide whether or not it should be included, how it could be simplified and how it interacts with other components, I believe I could create a model which would work. Unfortunately I don't have this time/budget, but I could get a short time to change the existing model to complete the existing analysis if I could work around the problem.

The failure from the .rpt file is as follows:

The design study terminated abnormally.

An engine disk read error has occurred, probably due to insufficient disk space or directory/file permissions.

which I'm told is a "generic" error message. As we know there is sufficient space and the permissions are OK, I was also hoping someone would have met this problem before with large models and would know what the problem was.

Richard,

I'm going to go with Brad's answer above ("don't max out SOLRAM").

You say you've got SOLRAM set to 16 GB, and the 'max memory usage' shown is 22 265 439 KB (my spacing for clarity). Essentially this tells us that Mechanica is taking [SOLRAM + 6 GB], so if you drop SOLRAM to 8 GB it should now only use 14 GB in total (although it may continue to grow if it gets past this particular crash point).
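
Spelling that arithmetic out: 22 265 439 KB / 1 048 576 is roughly 21.2 GB of total engine memory, and 21.2 GB minus the 16 GB of SOLRAM leaves roughly 5 to 6 GB of non-SOLRAM usage, which is where the [SOLRAM + 6 GB] figure comes from. On the same basis, 8 GB of SOLRAM plus that overhead gives the ~14 GB total mentioned, comfortably inside 50 GB of physical RAM.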

Also, you don't have a digit '0' anywhere in the analysis folder path, do you?

http://communities.ptc.com/message/199209#199209

SylvainA.
12-Amethyst
(To:rbrooks)

Richard, did you check your HDD free space availability when the error happens?

rbrooks
2-Explorer
(To:SylvainA.)

Re: did you check your HDD free space availability when the error happens?

Yes, it's around 643 GB.

Richard,

Have you verified you have adequate swap space allocated? I was having similar crashing issues with a large dynamic analysis which were purely related to disk swap. When the model's global matrix profile is larger than your SOLRAM setting, the engine begins to use swap space (which is orders of magnitude slower than RAM). The reason you should not max out SOLRAM is to leave room for disk caching, which is essential if you are running into swap scenarios; however, it seems your system has enough RAM that you could be able to max out SOLRAM, although I do not recommend it. The default swap setting in Windows is to let the system manage the page file size. My issues went away by forcing a larger swap space allocation (100 GB on my machine) to allow the solver to run with essentially no limit.
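
(For anyone looking for that setting: on Windows 7-era systems it is normally under Control Panel > System > Advanced system settings > Performance Settings > Advanced tab > Virtual memory > Change, where you can untick "Automatically manage paging file size for all drives" and enter a custom size. The exact path may differ on other Windows versions.)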

Lastly, what type of analysis and definitions are you using? MPA or SPA? And are you using Simulate 1.0 or 2.0? There is a HUGE difference in dynamic solution times between 1.0 and 2.0, as well as between MPA and SPA.
