There is no strict term "intensity" in optics, photometry, or image processing. (People working in this area who use this term: don't rush to object!) It is fuzzy jargon used when no quantitative measure is important; roughly, "the lighter, the more". A gray-scale image pixel value is an unsigned integer, usually 8-bit or 16-bit; an alpha (transparency) channel can be added.
If you explain what measure you need and for what purpose, I can probably advise what to use.
[EDIT]
Answering the question after clarification. "Histogram equalization" is explained here:
http://en.wikipedia.org/wiki/Histogram_equalization.
Here is the necessary background on gray-scale images:
http://en.wikipedia.org/wiki/Grayscale.
Unlike the considerably more complex color case, histogram equalization in gray scale is a fairly simple transformation, because the image is, well… gray-scale. In the article on histogram equalization referenced above, "intensity" is used in a non-strict sense at the very beginning, but the actual calculations are done using the "pixel value" and the "number of gray levels". Most gray-scale images have 8-bit or 16-bit pixel values, so the number of gray levels is 0xFF + 1 = 0x100 = 256 or 0xFFFF + 1 = 0x10000 = 65536, with pixel values ranging from 0 to 0xFF or from 0 to 0xFFFF, respectively: 0 for black and the maximum value (0xFF or 0xFFFF) for white. Histogram equalization transforms the image to a distribution in which the darkest pixel gets the value exactly 0 and the lightest pixel gets exactly the maximum value, 0xFF or 0xFFFF.
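To make this concrete, here is a minimal sketch of 8-bit gray-scale histogram equalization following the standard formula from the Wikipedia article linked above. The `equalize` function and the toy "image" (a flat Python list of pixel values) are my own illustration, not code from any particular library:

```python
def equalize(pixels, levels=256):
    """Remap 8-bit gray levels so their cumulative distribution is (nearly) uniform."""
    n = len(pixels)
    # Histogram: count of pixels at each gray level 0..levels-1.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function (CDF) of the histogram.
    cdf = [0] * levels
    running = 0
    for v in range(levels):
        running += hist[v]
        cdf[v] = running
    # Smallest non-zero CDF value; subtracting it maps the darkest pixel to exactly 0.
    cdf_min = next(c for c in cdf if c > 0)
    # Standard formula: round((cdf(v) - cdf_min) / (n - cdf_min) * (levels - 1)).
    # The guard handles a constant image, where n == cdf_min.
    lut = [round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1)) if n > cdf_min else 0
           for v in range(levels)]
    return [lut[p] for p in pixels]

# Toy example: a dark, low-contrast "image" stretched to the full 0..255 range.
image = [52, 55, 61, 59, 79, 61, 76, 61, 60, 70]
print(equalize(image))
```

After equalization the darkest input value (52) maps to 0 and the lightest (79) maps to 255, exactly the behavior described above; for a 16-bit image you would call it with `levels=65536`.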
This is the MathWorks article with illustrations:
http://www.mathworks.com/help/toolbox/images/ref/histeq.html,
which explains it, as far as I can see, "in simple words". :-)
Thank you,
—SA