It is CE-Tol, formerly TI-Tol ... back to the University of Utah.
Sigmetrix claims responsibility for it now.
The analysis becomes part of the model, so as the model changes the analysis will update.
Just finished training on version 8.3 and I don't think you can get the same quality and/or quantity of information about your model and the different sensitivities from a hand calculation in the same amount of time.
I am guessing GM did some hand calculations and regrets it now.
I spent almost an entire year setting up tolerance analyses with the competing VSA software for hundreds of parts and dozens of large, high component-count assemblies.
What I discovered is that tolerance analysis is the easy part. It is tolerance allocation that is difficult. At least it is difficult to do a meaningful job.
It's a bit like being a good writer or a good programmer: before a person can do either one well, they need to read a lot and get a lot of feedback to know whether they are doing a good job of it, that is, whether they are getting the results they want. In tolerance allocation, most people act as if the most important thing is to put down numbers that manufacturing agrees with, but without any feedback on the overall result.
The difficulty in doing that is that most real limits on acceptable variation are stress and deflection limits, which are often driven by life limits for reliability. All of that needs to be determined before the results of a tolerance analysis are usable.
VSA used a Monte-Carlo approach to vary a representation of the geometry within limits. It generated intermediate source code so the user could see exactly how the simulation would work. For more sophisticated users, this gave the option to include external routines that either used or modified the representations. If the analysis was of a light-bulb filament, one could include equations that solved for luminous output as a function of filament diameter or length, or equations for filament evaporation rate, and benefit from the variation in the basic geometry and from the statistics being generated while only having to contribute static equations.
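That Monte-Carlo idea is easy to sketch. Here is a minimal, hypothetical example (the dimensions and tolerances are invented, and the tolerance band is assumed to cover +/-3 sigma of a normal distribution), in the spirit of varying the geometry and then measuring the result; it is not VSA's generated code:

```python
import random
import statistics

# Assumed convention: the tolerance band covers +/-3 sigma of a
# normal distribution, a common (but not universal) choice.
def sample_dim(nominal, tol):
    return random.gauss(nominal, tol / 3.0)

def run_trials(n=10_000):
    gaps = []
    for _ in range(n):
        # Hypothetical three-part stack: housing length minus two spacers.
        housing  = sample_dim(50.0, 0.10)
        spacer_a = sample_dim(24.0, 0.05)
        spacer_b = sample_dim(24.0, 0.05)
        # The "measurement" on the varied geometry; an external
        # equation (like the filament-output example) could consume
        # these same sampled values.
        gaps.append(housing - spacer_a - spacer_b)
    return statistics.mean(gaps), statistics.stdev(gaps)

mean_gap, sigma_gap = run_trials()
print(f"gap mean = {mean_gap:.4f}, sigma = {sigma_gap:.4f}")
```

The point of the intermediate-code idea is visible even here: because the trial loop is ordinary code, any extra equation can be dropped into it and it automatically inherits the geometric variation and the statistics.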
CE-Tol used a vector-closure method and mathematically superimposed the 3-sigma or 6-sigma distribution on the tolerance range. It also seemed to depend on being able to regenerate the model geometry to get geometry sensitivities. This meant that parallel faces in models remained parallel in the analysis, so size variations did not produce angle variations. While it is true that many size-variant parts do retain parallel faces, it is also true that Y14.5, for example, doesn't guarantee it.
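For contrast, a direct calculation on a stack can be sketched as worst-case and root-sum-square sums over sensitivity-weighted contributors. The sensitivities and tolerances below are made up for illustration; this shows the flavor of a direct method, not CE-Tol's actual vector-closure math:

```python
import math

# Hypothetical contributors to a one-dimensional stack:
# (sensitivity, tolerance), with each tolerance taken as 3 sigma.
contributors = [
    (+1.0, 0.10),   # housing length
    (-1.0, 0.05),   # spacer A
    (-1.0, 0.05),   # spacer B
]

# Worst-case: every contributor at its limit simultaneously.
worst_case = sum(abs(s) * t for s, t in contributors)

# RSS: statistical combination, assuming independent normal variation.
rss = math.sqrt(sum((s * t) ** 2 for s, t in contributors))

print(f"worst case    = +/-{worst_case:.3f}")
print(f"RSS (3 sigma) = +/-{rss:.3f}")
```

Run against the Monte-Carlo sketch above, the RSS figure and the simulated sigma converge on the same answer for a linear stack, which is why the "direct vs. randomized" marketing distinction mattered less than claimed.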
The marketing-claimed advantage for CE-Tol was that it was a direct calculation, whereas VSA ran randomized simulations. I didn't see the problem with that, given that the factory will be producing randomized actual variations, so slight differences in the mean and deviation were going to happen anyway.
If I were manufacturing a large (50,000+) number of critical assemblies, I would probably include VSA in the analysis path. It's part of whoever owns UGII, but they used to have a stand-alone version. I would probably not use CE-Tol because the type of analysis it seems to perform should be covered in the up-front work done before even starting an analysis. Both of them require a lot of up-front stress work, manufacturing cost/fixturing analysis work, assembly fixturing work, production capability capture, and so forth to have the result be meaningful.
Possibly, but I think the more likely failure was this: as they went to keys with transmitters in them to operate power door locks and trunks (until then these had been a box separate from the key), they made the key-ring slot longer so that small key rings would still allow the now-larger key body to turn and fold up with the rest of the keys, or perhaps just for appearance.
Where they failed was in not noticing that this gave the key ring greater leverage on the lock, in such a way as to tend to turn the car off, taking the ignition and airbag system with it, leaving the power steering inoperative, and with only a short brake boost left for the power brakes.
On older cars the slot is smaller, so there isn't as much leverage and this would not have occurred as easily. Of course, older cars didn't have airbags either.
To do a tolerance analysis you'd have to predict key-ring weight variation, the vehicle-acceleration contribution to turn-off torque, wear to the detent that holds the key in the run position, and friction variation in the switch parts. None of these are part of any tolerance analysis package I've seen, but they could have been added to a VSA model; though as I wrote before, once you know you need to deal with these, you don't really need an analysis.
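Folding those non-geometric sources into a simulation is straightforward in principle. The sketch below uses entirely hypothetical numbers and distributions (key-ring load, lever arm from the slot, braking-acceleration multiplier, detent torque, and wear) just to show the shape such an extended model could take, in the same way external equations could be added to a VSA model:

```python
import random

# All numbers below are invented for illustration only.
def turnoff_margin():
    ring_weight_n = max(0.0, random.gauss(1.0, 0.4))  # N, hanging key-ring load
    lever_arm_m   = random.gauss(0.03, 0.002)         # m, set by slot length
    accel_factor  = random.uniform(1.0, 2.5)          # g-multiple under braking
    detent_torque = random.gauss(0.25, 0.05)          # N*m, new-switch detent
    wear_fraction = random.uniform(0.5, 1.0)          # detent remaining after wear

    applied   = ring_weight_n * lever_arm_m * accel_factor
    resisting = detent_torque * wear_fraction
    return resisting - applied  # negative means the switch can rotate off

trials = 100_000
failures = sum(turnoff_margin() < 0 for _ in range(trials))
print(f"predicted turn-off rate: {failures / trials:.2%}")
```

The interesting output here isn't the rate itself (the inputs are made up) but the sensitivities: rerunning with a shorter lever arm or a torque-neutral switch orientation shows immediately which design change buys the most margin.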
In the case of the fatalities, I'd like to know how often the car operators were trying for maximum braking prior to the collision, and whether that acceleration tended to increase the turn-off torque. If that is the case, then changing the switch operating orientation so that the torque was neutral during heavy braking events would be worth considering.
The real horror is if the collision itself led to (if) torque on the (if) switch that moved it (if) to the off position, shutting off the airbag system when it was needed most.
Ignore the (if)s. I just don't want someone to copy/paste this and start an unfounded rumor, because "if".
In Reply to Kevin Audibert:
Just finished training on version 8.3 and I don't think you can get the same quality and/or quantity of information about your model and the different sensitivities from a hand calculation in the same amount of time.
I am guessing GM did some hand calculations and regrets it now.
It's funny that the CE-TOL reps say no one does loss of parallelism and such, as that is exactly what VSA handles well enough.
The VSA analysis deforms the underlying geometry in accordance with limits allowed by the applied tolerances, then assembles parts and makes measurements based on those deformed and reoriented parts, essentially simulating the actual making and assembly of parts.
It's too bad VSA is buried over at http://www.plm.automation.siemens.com/en_us/products/tecnomatix/quality_mgmt/variation_analyst/ I believe Siemens rolled VSA in under Tecnomatix; I think VSA went from being a standalone company to being owned by UGS, which was then bought by Siemens. Tecnomatix did the human-factors analysis puppets called Jack, along with other factory design and analysis software.
3DCS is another Variation Analysis (VA, not VSA) supplier: http://www.3dcs.com/index.html They claim an integration with Pro/E: http://www.3dcs.com/pro-e.html