
Smoothing Data with a Variable Window

ClaudioPedrazzi

Hi everybody,

I am confronted with the following need (extracted from NUREG-0800, SRP, 3.7.1):

"At any frequency f, the average PSD is computed over a frequency bandwidth of ±20 percent, centered on the frequency f (e.g., a 4 Hz to 6 Hz bandwidth for f = 5 Hz)."

Now I have a vector of equispaced frequency points resulting from a PSD (say f = 0.1, 0.2, 0.3, ..., 50 Hz) and the corresponding power spectral density values.

If I use the medsmooth function, I have to specify the window width, and it would be a constant window. What I need is a window that, for example, around 1 Hz is (1.2 - 0.8)/0.1 = 4 points wide and around 20 Hz should be (24 - 16)/0.1 = 80 points wide.
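The actual routines in this thread are Mathcad sheets attached to the posts. As a language-neutral sketch of the requirement, here is a hypothetical Python routine (the names `varsmooth`, `f`, and `psd` are mine, not from the thread) that averages each PSD value over a ±20 percent band centered on its frequency:

```python
import numpy as np

def varsmooth(f, psd, p=0.20):
    """Average each PSD value over the band f[i]*(1-p) .. f[i]*(1+p).
    Sketch only; the thread's Mathcad routines are in the attached sheets."""
    out = np.empty_like(psd, dtype=float)
    for i, fi in enumerate(f):
        # window grows with frequency: ~0.8..1.2 Hz around 1 Hz,
        # ~16..24 Hz around 20 Hz, as required by NUREG-0800
        mask = (f >= fi * (1 - p)) & (f <= fi * (1 + p))
        out[i] = psd[mask].mean()
    return out
```

Smoothing a constant signal leaves it unchanged, which is a quick sanity check for any windowed average.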

Is there a Mathcad function for smoothing a data vector where I can specify the window width as a percentage of the data value itself? Or do I have to program it myself? I have already looked at the various "smoothing" functions and QuickSheets, but I did not find anything appropriate.

Thanks a lot in advance for any hints.

Best regards


I don't think we have a routine built into Mathcad which does what you want, but it shouldn't be difficult to write one.

Is there a Mathcad function for smoothing a data vector where I can specify the window width as a percentage of the data value itself?

What you mean is that you are smoothing vector vy, but the values in another vector vx determine the window size, right? So the routine would have three parameters: vx, vy, and the percentage p.
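A three-parameter routine of that shape could be sketched as follows (hypothetical Python, not the attached Mathcad sheet; it mimics medsmooth by taking a windowed median, and assumes vx is equispaced and ascending):

```python
import numpy as np

def smooth_pct(vx, vy, p):
    """Smooth vy with a median window whose half-width around element i
    is p * vx[i], converted to a point count via the constant spacing."""
    dx = vx[1] - vx[0]                       # assumes equispaced vx
    out = np.empty_like(vy, dtype=float)
    for i, x in enumerate(vx):
        half = int(round(p * x / dx))        # half-width in points
        lo, hi = max(0, i - half), min(len(vy), i + half + 1)
        out[i] = np.median(vy[lo:hi])        # windowed median, like medsmooth
    return out
```

Near the edges the window is simply truncated, which is one of several possible boundary conventions.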

Can you provide a sheet with sample data?

Hallo Werner,

yes, that is what I mean. Here is my solution (the data are random; it is a sheet I developed for myself to test the function before I go and use it).

I do not like the repeated use of medsmooth inside the loop, though...

Here is what I came up with.

BTW, your output vector is one element shorter than the input!

Wow... and you even corrected the German title of my plot.

Thank you very much. Your solution is much more elegant and makes no use of medsmooth inside the loop.

I like it and I will adopt it.

Best regards and have a nice weekend!

Glad you like it. We Austrians should know our umlauts 😉

But there is a difference between our routines: mine assumes the values of vx are equidistant, while yours works with those randomly jiggled x-values as well. The outcomes of both routines are very similar because the difference from an equidistant series is not that large in your example.

If working with a non-equidistant vector vx is required, my routine has to be changed a bit.
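One possible way to handle a non-equidistant vx (a Python sketch, not the attached Mathcad routine) is to locate the window bounds by value rather than by index arithmetic, e.g. with a binary search on the sorted vx:

```python
import numpy as np

def smooth_pct_nonuniform(vx, vy, p):
    """Variant for non-equidistant vx (still sorted ascending):
    the window around vx[i] is found by value, so uneven spacing is fine."""
    out = np.empty_like(vy, dtype=float)
    for i, x in enumerate(vx):
        lo = np.searchsorted(vx, x * (1 - p), side="left")
        hi = np.searchsorted(vx, x * (1 + p), side="right")
        out[i] = vy[lo:hi].mean()           # band average over the slice
    return out
```

Because x itself always lies inside its own band, the slice is never empty.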

Yes Werner, I did notice that. Actually my "real" vector has constant spacing, but I had fun creating my random data example and actually wanted something more general. I did not point that out to you, because I thought I would be able to program it myself. Now you were faster, thanks a lot 🙂

And... many greetings (with umlaut) from Switzerland, from an Italian-mother-tongue speaker.

Best regards


So maybe you will come up with a faster routine in the end.

Best regards from Vienna (even though the sharp ß is no longer used in Switzerland)!

For what it may be worth, here are two more routines which should cope with non-equispaced vectors vx. We are still assuming, though, that vx is sorted in ascending order.

Version 2 needs double the time of Version 1, and Version 3, which I like because of its elegance, needs about seven times the time of the first. I guess that constructing the vectorized expression, which I have stolen from your solution, slows the calculation down significantly.
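The attached Mathcad versions are not reproduced here, but the trade-off described above can be illustrated with a hypothetical fully vectorized sketch in Python: no explicit loop, but it builds an n-by-n membership mask, so elegance is bought with O(n²) work and memory, which can make the "elegant" version the slower one for large vectors:

```python
import numpy as np

def smooth_pct_vectorized(vx, vy, p):
    """Loop-free band average: inside[i, j] is True when vx[j] lies in the
    +/-p band around vx[i]; a matrix product then sums each row's members."""
    inside = (vx[None, :] >= vx[:, None] * (1 - p)) & \
             (vx[None, :] <= vx[:, None] * (1 + p))
    return (inside.astype(float) @ vy) / inside.sum(axis=1)
```

Whether this beats the explicit loop depends on the environment; in an interpreted worksheet the cost of building the big intermediate expression can dominate, matching the timing observation above.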
