Recently we started converting various MC15 sheets to Prime 7.
Although resolving the errors after conversion takes a lot of time, the results look promising.
However, for the large sheets the system starts to drown under very high CPU, GPU, and RAM usage.
Even simply scrolling through the Prime 7 document seems impossible.
One of our major company sheets, used for calculating the structural capacity of cranes, needs 6 GB of RAM just to load after conversion to Prime; the original MC15 version loads with only 260 MB.
Loading the Prime 7 sheet takes approximately 7 minutes; the MC15 version took 15 seconds.
We are quite desperate: a fully functional MC15 sheet has been converted into a Prime 7 sheet that seems useless.
Can anybody advise?
Thanks in advance
André
structural engineer
Hi,
Going from Mathcad 15 to Prime 7 is going from 32-bit to 64-bit processing, which at least doubles data sizes. Programming in the intervening years has also gone from char to wide-char string types, meaning another doubling. An increase in size is typical.
A rise from approximately 0.095 GB to 0.560 GB is typical for a Mathcad file converted to Prime; the size ratio is about 5.9 (0.560 / 0.095 ≈ 5.9).
A way forward is to break the large crane file into chapters. There are several approaches to transferring input and output data between chapters, using Prime include files, Excel, or plain text files; a sketch of the text-file handoff is below.
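For illustration only, here is a minimal Python sketch of the text-file handoff pattern. In Prime itself you would use the built-in WRITEPRN and READPRN functions; the file name and data below are made-up placeholders:

import numpy as np

# "Chapter 1" worksheet ends by exporting its results to a plain text file.
results = np.array([[1.0, 2.0], [3.0, 4.0]])  # stand-in for computed crane data
np.savetxt("chapter1_results.txt", results)   # in Prime: WRITEPRN

# "Chapter 2" worksheet starts by importing the same file.
loaded = np.loadtxt("chapter1_results.txt")   # in Prime: READPRN
print(loaded)

Each chapter then only has to load and parse its own, much smaller file.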
Cheers
Terry
Hi,
Enclosed below are three files of the same thing; their sizes are 7, 76, and 134 kB.
The bloat comes from moving away from a proprietary storage format, which holds the file in 7 kB,
to a compressed XML format that stores the same thing at roughly eleven times the size, 76 kB.
The uncompressed XML is 134 kB.
It takes time and memory to decompress and parse the compressed XML file.
If you have a very large file, the time to decompress and parse the XML becomes noticeable, as in your case.
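You can see the overhead for yourself. Assuming the Prime worksheet is stored as a ZIP container of XML parts (the file name below is a placeholder), a short Python sketch compares compressed and uncompressed sizes:

import zipfile

# Hedged sketch: list each part of the archive with its stored
# (compressed) size and its size after decompression, i.e. the
# amount of XML that actually has to be parsed on load.
with zipfile.ZipFile("worksheet.mcdx") as z:
    for info in z.infolist():
        print(f"{info.filename}: {info.compress_size} B compressed "
              f"-> {info.file_size} B uncompressed")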
The solution is to break it into a number of smaller files.
Cheers
Terry
Hello,
The way I handle large files (measurement data from strain gauges) is to condense the data set when that is possible. For that I wrote a small routine that extracts every, say, 10th datum from the data set and uses this from then on; a sketch of the idea is below.
Not a very elegant solution for software that claims to be the fastest.
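A minimal sketch of such a decimation routine (the step size and data shape here are only illustrative):

import numpy as np

# Keep every step-th row of a large measurement array, preserving order.
def condense(data, step=10):
    return data[::step]

samples = np.arange(100000.0).reshape(-1, 2)  # stand-in for strain-gauge data
small = condense(samples)                     # ten times fewer rows to handle
print(samples.shape, "->", small.shape)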
Cheers
Raiko