Hi everyone,
I would appreciate it if someone would share their experiences with the numerical integration function. I am aware of what the help says, namely that "Sharply peaked integrands, or functions whose shape is not readily characterized by a single length scale, do not evaluate accurately. This is the nature of numerical integration. Accurate results may be obtained by breaking an integral into pieces and separately integrating around the peak and away from the peak."
I am experiencing some difficulties integrating artificially generated earthquake acceleration time histories (discretized, of course, for example with a sampling rate of 100 or 200 Hz) to velocities and then to displacements. Admittedly they are peaked (see attached example), but I am surprised how often the standard integral function of MathCAD 15 fails to converge (with AutoSelect, or with Romberg or Adaptive). The convergence problems occur if I use linterp on the time history, and tend to diminish if I interpolate the points with lspline, pspline or cspline. TOL is 1E-3, as per default.
I have been forced to develop my own function (an iterated Simpson rule that doubles the number of intervals each time until a relative precision of TOL is reached).
So, as I said, I would like to know whether other users have encountered similar issues.
Best regards & thanks in advance
Claudio
If you have them as sampled data, why not just sum the data?
I should guess the numerical integrator of Mathcad is designed to work with functions. You don't have a function, you have sampled data. And using linterp or spline algorithms is trying to make a function of the data where, in your case, there isn't one... The best improvement you can get is by increasing the sampling rate.
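Luc's suggestion amounts to a plain weighted sum of the samples. A minimal Python sketch (function names are mine; the thread's data are in Mathcad vectors, not shown here):

```python
def trapz_samples(y, dt):
    """Trapezoidal integral of uniformly sampled data: no
    interpolation, no iterative solver, just a weighted sum."""
    return dt * (sum(y) - 0.5 * (y[0] + y[-1]))

def cumtrapz_samples(y, dt):
    """Running trapezoidal integral, e.g. acceleration samples in,
    velocity samples out (integration constant taken as zero)."""
    out = [0.0]
    for k in range(1, len(y)):
        out.append(out[-1] + 0.5 * dt * (y[k - 1] + y[k]))
    return out
```

Applied twice (acceleration to velocity, velocity to displacement), this does the whole job without ever invoking a quadrature routine.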
Success!
Luc
Hi Luc, Hi Fred,
thanks for your insights! Yes, it is true, sometimes one does not see the forest for the trees!
Basically you are right: I could simply sum the data points, as Luc says and Fred demonstrates.
Defining a function and using MathCAD's integral seemed to me the more elegant way; for example, I can use such expressions as: and similar, also for evaluations, which makes my worksheet easier to understand. And you are right, too, that I am trying to create a function where there is none when I interpolate with a spline, but ... it makes sense, "nature doesn't jump"; look at the following extreme zoom:
I will put it this way: given how easy it is to write some code that reaches a reasonably accurate result, I am surprised that the MathCAD built-in function "gives up" and fails. My test dataset has 4400 points, but the first failures already appear with 1000 points.
I just programmed the Simpson integration rule and iterated it, doubling the number of intervals at each iteration (until reaching a relative precision of TOL or a maximum number of iterations). It gives results comparable to MathCAD's integral, but it "always" converges.
Best regards & thanks again
Claudio
"Nature doesn't jump" forgets the fact that there are jumping phenomena in nature.
Look at quantum theory...
And I also argue against "it makes sense". What you are showing is a curve fitted to your sampled points. Without any knowledge of the accuracy of those points (how much of it is noise?), there's no telling whether the curve should be above the leftmost peak bar, fully below it, or somewhere in the middle.
If you know there's a band of (hopefully high-frequency) noise disturbing your signal, it would make sense to process it with a digital (low-pass) filter before you sum. But other than that, there's little you can do to improve the accuracy. Defining your own Simpson algorithm is fine, but simple summation also always (? well... at least usually) converges with sampled data.
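As a crude illustration of the low-pass idea, here is a first-order (RC-style) digital filter in Python. This is my own minimal sketch, not anything from the thread; a real seismic pipeline would more likely use a higher-order zero-phase filter (e.g. a Butterworth applied forward and backward):

```python
import math

def lowpass_first_order(y, dt, fc):
    """First-order low-pass filter of uniformly sampled data.
    fc is the cutoff frequency in Hz. Minimal sketch only:
    it has phase lag and a gentle 6 dB/octave roll-off."""
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = dt / (rc + dt)  # smoothing factor in (0, 1)
    out = [y[0]]
    for v in y[1:]:
        out.append(out[-1] + alpha * (v - out[-1]))
    return out
```

After filtering, the summation step proceeds exactly as before, just on the smoothed samples.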
Luc
Ok Luc, agreed. I was aware that I could get some rebuke for my quote!
The time series are generated by me or by other earthquake-related people, with relatively well-documented methods for so-called "spectrum-compatible time history generation" in seismic analysis. In theory I could have any time resolution I wish, at least when I am the creator of the series. But you are right: in the middle there could simply be anything; what I am doing is "inventing" data. So probably the best I can do is forget about the beautiful functions and work with my data, without any interpolation. But I did test whether the interpolation makes the integral function work better, and this I can confirm: it converges "more often", but somewhere past 1000 points it "quits", no matter how smooth the interpolation is.
Regards
Claudio
PS: some refs
http://www.iitk.ac.in/nicee/wcee/article/13_2096.pdf
http://www.dynamassist.com/ (free downloadable MathCAD-Worksheets, one of them contains a simplified version of time history generation, but it does not fulfill all required criteria from the norms)
I suspect that the reason Mathcad's integration schemes have trouble is the discontinuous nature of your interpolation functions.
This same interpolation is a source of error in your Simpson's rule integration: when you double the number of intervals, you're manufacturing data that may not be valid. Only the acceleration data you start with are valid, so those data fix your integration accuracy.
There might be some way (I haven't found it yet) to take an FFT of your data and integrate that series term by term to get displacement . . .
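One way the FFT idea above could work (my sketch, not Fred's; in the frequency domain, integration is division of each Fourier coefficient by i·2πf):

```python
import numpy as np

def fft_integrate(y, dt):
    """Integrate a zero-mean, effectively periodic sampled signal by
    dividing its Fourier coefficients by i*2*pi*f. Sketch only: real
    earthquake records would need detrending/tapering first, and the
    DC term (the integration constant) is undetermined, set to zero."""
    n = len(y)
    Y = np.fft.rfft(y)
    f = np.fft.rfftfreq(n, d=dt)          # frequencies in Hz
    Y[1:] = Y[1:] / (2j * np.pi * f[1:])  # term-by-term integration
    Y[0] = 0.0                            # drop the undetermined mean
    return np.fft.irfft(Y, n)
```

Applied twice, this would go from acceleration to displacement, though leakage and baseline drift make the time-domain summation the safer default for non-periodic records.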
Nothing to add, except that I agree with Luc and Fred. Since you have discrete data, work with the discrete data.