Counting relationships: is it better to use a dynamic calculation or a relationship field rule?
I need to count the relationships for some item types, for example the number of "Validated By" traces for a Requirement.
1) One way to do that is via a dynamic calculation: for example, "Trace Count - Validated By" is defined as isEmpty(RelCount("Validated By"), 0).
2) The other way is to keep the field as a normal input integer (no calculation), but define a rule on the relationship field "Validated By" that runs a trigger script like "count.js".
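The thread doesn't show what count.js contains. As a hypothetical sketch (a real PTC Integrity trigger script would obtain the item through the trigger bean API, which is not shown here; the item is modeled as a plain object so only the counting logic is illustrated), the script boils down to:

```javascript
// Hypothetical sketch of the logic inside a trigger script like count.js.
// Field names match the thread; everything else is illustrative only.
function updateValidatedByCount(item) {
  // "Validated By" is assumed to hold an array of related item IDs.
  var related = item["Validated By"] || [];
  // Write the count into an ordinary (non-computed) integer field, so
  // other rules and computations can read it without recounting.
  item["Trace Count - Validated By"] = related.length;
  return item;
}
```

The key point is that the count is stored as a normal field value, so it costs nothing to read later; the price is paid once, when the relationship changes.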
Which solution is better from a performance standpoint?
- On one hand, the dynamic calculation is performed every time the item is accessed. Changing it to a static computation is not a solution, since I need the count as soon as it changes.
- On the other hand, I've noticed that rule triggers run every time the item is edited, no matter which fields are edited. I even suspect that all the rules run for all relationship fields, but I'm not sure of that (and have no time to investigate further). For items in a document that is saved as a whole, that could mean tens of trigger calls per edit.
I've implemented both and haven't perceived any performance impact. However, I'd like to get some feedback from the community and PTC before going further.
Re: Counting relationships: is it better to use a dynamic calculation or a relationship field rule?
I use a mix of both techniques, depending on the need. If the computation needs to navigate a relationship tree, I use triggers to count the number of related items at each level of the tree (I usually hide these fields from users), and then a dynamic computation at the top of the tree that simply adds the hidden count fields.
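The layered pattern described above can be sketched as follows. This is a hypothetical illustration in plain JavaScript, not Integrity API code; the field names "Children" and "Hidden Child Count" are made up:

```javascript
// Trigger-maintained step: each mid-level item stores its own child
// count in a hidden field, kept current by a relationship trigger.
function refreshHiddenCount(item) {
  item["Hidden Child Count"] = (item["Children"] || []).length;
}

// Dynamic-computation step at the top of the tree: a cheap sum over
// the already-counted children, instead of re-walking the whole tree.
function totalCount(topItem, lookup) {
  return (topItem["Children"] || [])
    .map(function (id) { return lookup[id]["Hidden Child Count"] || 0; })
    .reduce(function (a, b) { return a + b; }, 0);
}
```

The expensive walk happens only when a relationship actually changes; the read-time computation stays trivial.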
As for a concrete answer on which technique is better, that's tough. It really depends on your volume of traffic, the complexity of the relationship fields, the complexity of the computation, the capability of your database server, the number of other dynamically computed fields on the item, and so on. All of these can play a role in performance.
Other considerations to keep in mind: if you set the count via a trigger, you can then trigger off changes to that field. For example, if you needed to fire three other triggers when the count field reaches a value of 20, you can accomplish that without recomputing the field for each trigger definition. Other computed fields can also use the count value in their computations without re-counting the related items. With a trigger approach, you can also make the trigger very specific based on values/fields in the related items, and any logic required to analyze the related items to derive the count may be easier to implement in a trigger.
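The "trigger off the stored count" idea can be sketched like this, again as a hypothetical illustration rather than Integrity API code (the threshold of 20 comes from the example above; the downstream actions are represented as plain callbacks):

```javascript
// Because the count lives in a real field, downstream rules can react
// to its value directly, without recomputing the relationships.
function onCountChanged(item, actions) {
  if ((item["Trace Count - Validated By"] || 0) >= 20) {
    // Fire each dependent action once; none of them needs to recount.
    actions.forEach(function (fire) { fire(item); });
  }
}
```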
The only time I resort to statically computed dynamic computation fields is when the computation is based on some sort of date or time constraint. So if the value only needs to be accurate to within a day ("Daily Count"), I'll use a dynamic computation field and then run a static computation against it at night. Otherwise, if you have too many statically recomputed dynamic fields, you may find that you can't compute them all in a single day.
Another practice for computed fields is to make the computation as specific as possible. For example, if you have items in a final state where they can't be edited anymore, there's no sense in computing the field for them either. So include logic within the computation itself to avoid computing a field that couldn't possibly have changed anyway.
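That guard can be sketched as follows. This is a hypothetical illustration only: the state names and field names are made up, and a real implementation would express this in the computation's own syntax rather than JavaScript:

```javascript
// Skip the expensive work for items that can no longer change.
var FINAL_STATES = ["Completed", "Cancelled"];

function computeTraceCount(item) {
  // Items in a final state cannot gain or lose traces, so return the
  // value already stored instead of recounting the relationships.
  if (FINAL_STATES.indexOf(item["State"]) !== -1) {
    return item["Trace Count - Validated By"] || 0;
  }
  return (item["Validated By"] || []).length;
}
```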