Summary Report from ModelCheck

joe_morton
We're starting to look at using ModelCheck. We have complex CAD assemblies where different engineers own different assemblies and parts. The use case that we're expecting is that an engineer will run ModelCheck on a top level assembly, then share out the results with all the engineers who worked on that design. That way, each engineer could review the list for their parts and address any errors. 

 

I'm trying to find the easiest way for that engineer to run ModelCheck, then pull the results into a spreadsheet. Is there an easy way to do this?

 

The only path I've found so far is the HTML summary, which can be copy/pasted to a spreadsheet, but then there are steps needed to clean up formatting and filter down to just the parts that have errors. It's not too bad for a one-time event, but it would be difficult to implement across a team doing it frequently.
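The copy/paste cleanup described above can at least be scripted. Here is a minimal sketch that filters a summary, pasted or saved as tab-separated text, down to just the rows with errors. The column name "Errors" is an assumption, not the actual ModelCheck header; substitute whatever your HTML summary uses.

```python
# Hedged sketch: filter a pasted ModelCheck summary (tab-separated text)
# down to the models that have a nonzero error count.
# The "Errors" column name is an assumption -- adjust to your summary.
import csv
import io

def rows_with_errors(tsv_text, error_col="Errors"):
    """Return only the rows whose error count is present and nonzero."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return [row for row in reader
            if row.get(error_col, "0").strip() not in ("", "0")]
```

Pasting the cleaned rows back into a spreadsheet (or writing them out with `csv.DictWriter`) would give each engineer a short list of just the failing models.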

ACCEPTED SOLUTION

For anyone else interested in the ModelCheck Metrics function:

You have to enable metrics, as described here: http://support.ptc.com/help//creo/creo_pma/r7.0/usascii/index.html#page/model_analysis/modelcheck/To...

Once enabled, you must edit your check file (extension .mch) and set Metrics to Y for each check you want reported to the file. Not all checks appear to support reporting, however.

The metrics file is essentially a log of ModelCheck runs. It writes a line when ModelCheck starts processing a model, then records the result of each supported check that has Metrics set to Y.

 

But back to my original intent: there seems to be no direct way to get a spreadsheet report of which models had errors. We may look into developing a script to parse the report files.
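A script to parse the report files could start from something like the sketch below. It assumes the layout observed in this thread: a header line per model (the one containing mc_initialize, with the filename in the fourth field), followed by one line per enabled check with its result. The check-line format is an assumption based on the discussion here, not documented behavior.

```python
# Hedged sketch: parse a ModelCheck metrics text file into
# (model, check, result) rows suitable for a spreadsheet.
# Assumed layout (from what we observed, not from documentation):
#   Username, Date, Time, Filename, File extension, mc_initialize, 1, 0, MC_regen
# followed by one "CHECK_NAME, result" line per check with Metrics = Y.
def parse_metrics(text):
    """Return (filename, check, result) tuples from metrics-file text."""
    rows = []
    model = None
    for line in text.splitlines():
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 6 and fields[5] == "mc_initialize":
            model = fields[3]  # Filename column of the per-model header line
        elif model and len(fields) >= 2 and fields[0]:
            rows.append((model, fields[0], fields[1]))
    return rows
```

Writing the returned tuples out with the csv module would give a file that opens directly in Excel, which is roughly the workflow asked about in the original question.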


6 REPLIES

Hi @joe_morton 

 

I don't think there is any direct option to get the results in the form you expect. However, you can import the metrics results, which ModelCHECK exports to a text file, into Excel. Within the ModelCHECK settings there is a "Metrics" mode; enable it for the required checks. On each analysis it stores the results, and on exiting Creo the application creates a text file alongside the HTML files. You can import these files into Excel.

The location of these files can be controlled by the dir_metrics setting in config_init.mc.
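As a sketch, the relevant line in config_init.mc might look like the following (the path is just an illustration; use whatever shared location suits your team):

```
dir_metrics C:\modelcheck\metrics
```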

 

I hope this will help. 

 

Thanks 

Hi @Mahesh_Sharma,

Thanks for your reply. The metrics file does seem like it might have the functionality I'm looking for. I'm doing some testing, and right now it outputs a line like this for each model run:

Username, Date, Time, Filename, File extension, mc_initialize, 1, 0, MC_regen

 

What I really want to know is whether the model had errors or not. It seems to be reporting "1, 0" for every model. Any idea what this corresponds to, and is there a way to configure the metrics to get a count of errors?

Hi @joe_morton 

 

The line "Username, Date, Time, Filename, File extension, mc_initialize, 1, 0, MC_regen" will appear in every file. The ModelCheck results follow in the lines after it, something like:

 

[screenshot: example metrics file output]

 

 

Which version of Creo are you working with?

 

Thanks. 

 

Testing more: it looks like only some checks can output metrics. I was able to get reporting for ACCURACY_INFO, PARAMCHECK, REGEN_XSEC, and FAILED_COMPONENTS.

 

I'm not able to get anything for REGEN_ERRS or REGEN_WRNS.

I think that after setting the appropriate values in the Metrics column of the check file for each check, you will get the required results.

