
Entropy (of image)

Jbryant61
4-Participant


Hi, I have been looking at the idea of using entropy as a measure of the uniformity of an image. I have created a worksheet with the calculation, but it doesn't agree with the built-in function in MATLAB. Can anyone point out an error I have made?

Thanks

Jason
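[The Mathcad worksheet itself is not reproduced in this thread. For reference, the histogram-based entropy estimate under discussion looks roughly like the following Python/NumPy sketch; the name hist_entropy and the default bin count are illustrative assumptions, not Jason's actual worksheet:]

    import numpy as np

    def hist_entropy(Z, bins=256):
        # Shannon entropy in bits, estimated from a histogram of Z:
        # H = -sum_k p_k * log2(p_k), where p_k is the fraction of values in bin k.
        counts, _ = np.histogram(Z, bins=bins)
        p = counts[counts > 0] / counts.sum()   # drop empty bins: 0*log2(0) -> 0
        return -np.sum(p * np.log2(p))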

1 ACCEPTED SOLUTION

LouP
11-Garnet
(To:Jbryant61)

I don't see any problem with your calculation. The entropy depends on both the distribution and the number of bins used, i.e., the quantization of the input array Z. As an experiment, I replaced the array Z with an array of uniformly distributed random numbers. With #bins = 256, your entropy calc gives 8 bits (as expected); with #bins = 64, the Mathcad calc gives entropy = 6 bits (be careful with units also). #bins = 200 gives entropy = 7.64 bits with uniformly distributed Z, and 6.6 bits with the original Z. This all makes sense.
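[A quick numeric check of this experiment, as a Python/NumPy sketch; the array size is an arbitrary stand-in for Z, and hist_entropy is the same estimate sketched under the original question:]

    import numpy as np

    def hist_entropy(Z, bins):
        counts, _ = np.histogram(Z, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    Z = np.random.rand(512, 512)            # uniformly distributed stand-in for Z
    for bins in (256, 64, 200):
        print(bins, hist_entropy(Z, bins))  # ~log2(bins): 8, 6, ~7.64 bits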

The quantization (#bins) is the key. If you make #bins = 2 in your calc (an asymmetric binary distribution), the resulting entropy is 0.476 bits. I suspect the MATLAB calc is making some assumptions that are different from those in your Mathcad calc.

One other comment: the calculated results will be an upper bound, since both (Mathcad and MATLAB) assume that all elements of Z are independent. If that were true, the image would appear as a uniformly noisy field (which it is not) with a white spectrum. While this may be approximately true here, it may not hold in general for the images of interest; it is exactly these correlations that compression algorithms exploit.

Lou
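[Lou's upper-bound caveat can be seen directly: the histogram estimate is blind to spatial correlation. A minimal Python/NumPy sketch; the gradient image is an arbitrary illustrative choice:]

    import numpy as np

    def hist_entropy(Z, bins=256):
        counts, _ = np.histogram(Z, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    # A smooth, highly correlated image and a randomly shuffled copy of it have
    # identical histograms, hence identical histogram entropy, even though the
    # smooth image is far more compressible.
    Z = np.tile(np.linspace(0, 1, 256), (256, 1))       # smooth gradient
    shuffled = np.random.permutation(Z.ravel()).reshape(Z.shape)
    assert np.isclose(hist_entropy(Z), hist_entropy(shuffled))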


6 REPLIES

After spending more time than I wanted researching MATLAB functions: no, I can't see an obvious error. You are, however, doing a fair amount of normalizing, and I suspect the difference lies there. I suggest you compare histograms, bin sizes and counts, etc. MATLAB appears to zero out log2(0) elements; I'm not sure whether that's part of the problem.
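[If it helps to pin down MATLAB's conventions: its documented entropy() computes -sum(p.*log2(p)) over the 256-bin imhist counts, treating the 0*log2(0) terms as zero. A rough Python/NumPy equivalent, as a sketch that assumes the image is already 8-bit; matlab_like_entropy is a hypothetical name:]

    import numpy as np

    def matlab_like_entropy(img_uint8):
        # Approximates MATLAB's entropy() for an image already in uint8:
        # a 256-bin histogram, with empty bins dropped so 0*log2(0) -> 0.
        counts = np.bincount(img_uint8.ravel(), minlength=256)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))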


Jbryant61
4-Participant
(To:LouP)

Hi Lou, thanks for your reply. I'm assuming then that if MATLAB is only using 2 bins for the histogram, it is not very accurate, and it would be better to use more bins via Mathcad.

Jason

RichardJ
19-Tanzanite
(To:Jbryant61)

I think you need to sum over all the pixels, not all the histogram bins. See here:

http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

Your random variable is the pixel intensity, so in p(x_i)*log(p(x_i)), p(x_i) refers to the probability of the intensity of pixel i.

LouP
11-Garnet
(To:RichardJ)

The histogram bins account for all of the pixels, grouped by outcome or value (the bin number). The entropy (or information) of the image depends only on the number of distinct values present and their relative frequencies (the probability of each value), which is precisely what the histogram provides. The point I was making in my earlier post is that the entropy depends on the level of quantization (i.e., #bins) used, so one needs to define this first, whether 200 levels or 200K levels. The nature of the measurement and its underlying noise will help determine what quantization level makes sense.

Lou
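[The two forms do give exactly the same number: grouping the per-pixel sum by value yields the per-bin sum. A quick Python/NumPy check, as a sketch; the array and bin count are arbitrary stand-ins:]

    import numpy as np

    Z = np.random.rand(512, 512)                     # stand-in image
    counts, edges = np.histogram(Z, bins=64)
    p = counts / counts.sum()

    # Per-bin form: -sum_k p_k * log2(p_k)
    per_bin = -np.sum(p[p > 0] * np.log2(p[p > 0]))

    # Per-pixel form: -(1/N) * sum_i log2(p(bin containing pixel i)).
    # Every pixel lands in a nonempty bin, so p[idx] is never zero.
    idx = np.digitize(Z.ravel(), edges[1:-1])        # bin index of each pixel
    per_pixel = -np.mean(np.log2(p[idx]))

    assert np.isclose(per_bin, per_pixel)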

RichardJ
19-Tanzanite
(To:LouP)

Yes, OK. I see.
