Taking and processing photomicrographs — part 5: Adjusting white balance

Now that you have a nice flat image after dividing by the brightfield image, you may see that the white balance is a little off.  The result of that division, particularly if you’ve done the division in Fiji/ImageJ, may come out a little green or with some other color cast.  So, it’s time to make that background a bit closer to white.  This is called fixing the “white balance.”  There are a number of algorithms and packages that do this.  They all work by basically the same mechanism:

  1. Find a place in the image that should be white.
  2. Figure out what change is needed to make it white.
  3. Apply that change to the entire image.

There are, as you might expect, a lot of variants on this.  Some algorithms are happy just to make the white spot come out with r=g=b.  Others try to model what actually looks white to the visual system.  And on and on.  But basically, just as correcting for uneven (anisotropic) illumination is a division, correcting white balance is (usually) a multiplication.  The other thing these packages differ on is how they find the spot(s) to turn “really” white.  Some take one spot and correct the whole image.  Some take a bunch of spots and correct the image region by region.
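To make that division-versus-multiplication point concrete, here is a tiny NumPy sketch.  This is my own illustration with made-up numbers, not code from any of these packages:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up stand-ins for the real images: an 8-bit RGB specimen image and a
# matching brightfield (blank-slide) image, both as floats.
img = rng.integers(80, 255, size=(480, 640, 3)).astype(np.float64)
brightfield = rng.integers(200, 256, size=(480, 640, 3)).astype(np.float64)

# Correcting uneven illumination is a pixel-by-pixel division...
flat = img / brightfield * 255.0

# ...while correcting white balance is a multiplication: one scale factor per
# channel, chosen so a spot that should be white lands at 255,255,255.
white_ref = np.array([200.0, 220.0, 210.0])   # made-up average RGB of that spot
balanced = np.clip(flat * (255.0 / white_ref), 0, 255).astype(np.uint8)
```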

Some packages try to find that spot automatically.  Some ask the user to choose a background spot.  I wrote a little macro in ImageJ that works pretty well.  The algorithm is simple:

  1. Make a tiny window to place on the image.
  2. Move the tiny window across the image, keeping track of the average RGB values at each position.
  3. Choose the window position that had the brightest average RGB values.
  4. Multiply each channel of every pixel by whatever it takes to make that spot 255,255,255 in integer RGB values.

So, let’s say I move the tiny window around the image and the brightest window average I find is 180,210,200.  I then multiply the R values of all pixels by 255/180 ≈ 1.42, the G values by 255/210 ≈ 1.21, and the B values by 255/200 ≈ 1.28.
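Here is what that scan looks like in Python with NumPy.  This is not the actual ImageJ macro, just a sketch of the same algorithm; the window size and step are arbitrary choices:

```python
import numpy as np

def auto_white_balance(img, win=32, step=16):
    """Scan a small window over an RGB image, find the brightest window
    average, and scale each channel so that spot becomes 255,255,255."""
    img = img.astype(np.float64)
    h, w, _ = img.shape
    best_mean, best_rgb = -1.0, None
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            rgb = img[y:y + win, x:x + win].mean(axis=(0, 1))   # average R, G, B
            if rgb.mean() > best_mean:                          # brightest window so far
                best_mean, best_rgb = rgb.mean(), rgb
    scale = 255.0 / best_rgb        # e.g. 255/180, 255/210, 255/200
    return np.clip(img * scale, 0, 255).astype(np.uint8)

# Example with a synthetic image standing in for a photomicrograph.
demo = np.random.default_rng(1).integers(60, 180, size=(256, 256, 3))
balanced = auto_white_balance(demo)
```

Scanning every position with step=1 would be more thorough but slower; a coarse step is usually fine, since all you need is one clean patch of background.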

There is a nice plugin macro for this by Patrice Mascalchi called ImageJ_Auto-white-balance-correction ( https://github.com/pmascalchi/ImageJ_Auto-white-balance-correction ).  It pops up a dialog asking you to draw a little square to serve as the white sample for the correction.

So, here’s an uncorrected image:

Now I’ll run the ImageJ/Fiji macro ImageJ_Auto-white-balance-correction.  I get this dialog box asking me to draw my square, which I do (in yellow):

Here’s the result:

Not too bad.  You’ll notice that the background isn’t flat: the lower middle part is more red and the far right is more blue.  That’s because I didn’t do the brightfield image division.  This time, I’ll do that first.
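The division step itself is only a couple of lines if you use the pyvips Python bindings.  This is a hedged sketch with placeholder filenames, not necessarily the exact routine from the earlier post:

```python
import pyvips

# Load the specimen image and the blank-slide (brightfield) image.
# The filenames here are placeholders.
specimen = pyvips.Image.new_from_file("specimen.tif")
brightfield = pyvips.Image.new_from_file("brightfield.tif")

# Divide pixel by pixel, rescale back into the 0-255 range, and save as 8-bit.
flat = (specimen / brightfield * 255).cast("uchar")
flat.write_to_file("specimen_flat.tif")
```

Here’s the result of doing the division using that vips library routine: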

This is already pretty good, and it has a flatter background, though it seems a little yellow.  I’ll apply the ImageJ macro to this to see how it works (hint: we are already at the point of diminishing returns):


This increased the contrast a little, but more importantly, it’s starting to emphasize a rainbow effect in the background.

We are running into two problems.  The first is that the brightfield image is not perfect, so there will be small residual differences in the background.  The second is that the division produces very small numbers with even smaller differences between pixels; when you then multiply by a large factor to restore the dynamic range, those differences get magnified right along with everything else.  It gets to the point that you can only polish a turd so much.  This is where making sure you have the best possible original image becomes important.
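To put rough numbers on that second problem (these are made-up values, just for illustration): in a dim corner the brightfield value is small, so dividing and then rescaling to 0-255 stretches any residual pixel-to-pixel difference more than it does near the bright center.

```python
# A 1-count difference between neighboring background pixels, after flat-field
# division and rescaling to 0-255.  Made-up brightfield values: 150 in a dim
# corner, 250 near the center.
corner = (121 / 150 - 120 / 150) * 255   # ~1.7 counts -- the difference grew
center = (201 / 250 - 200 / 250) * 255   # ~1.0 count  -- essentially unchanged
print(round(corner, 2), round(center, 2))
```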

It also emphasizes the point of “good enough.”

If you really are bothered by it, you can play one more game.

  1. Go into an image editing program (again, I use GIMP) and load the image.
  2. Use an automatic select tool to select the white background area.  Don’t worry about being perfect, but make sure you don’t include any of the tissue.
  3. Blur the bejeezus out of the white area.  I use a Gaussian blur, but it doesn’t really matter.  Be careful that you are not bleeding color in from the tissue.
  4. Choose a background color by sampling a “white” area inside the tissue part of the specimen (the region you did not select in step 2).
  5. Do a feathered fill of the background.
  6. Adjust the dynamic range to taste (I’ll talk about that in a later post).
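If you would rather script that cleanup than click through GIMP, here is a rough equivalent of the same idea in Python.  This is only a sketch: the threshold, blur radius, and filenames are arbitrary placeholders, and it is not what I actually did in GIMP.

```python
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter

# Placeholder filename for the flattened, white-balanced image.
img = np.asarray(Image.open("specimen_flat.tif")).astype(np.float64)

# Steps 1-2: "select" the background -- pixels where every channel is bright.
# The 215 threshold is an arbitrary choice.
mask = (img > 215).all(axis=2).astype(np.float64)

# Steps 3 and 5: feather the selection by blurring the mask itself.
feather = gaussian_filter(mask, sigma=10)

# Step 4: pick a fill color.  The steps above sample a "white" spot inside the
# tissue; here I just average the selected background pixels.
fill = img[mask.astype(bool)].mean(axis=0)

# Blend: the background drifts to the fill color, the tissue is left alone.
out = img * (1 - feather[..., None]) + fill * feather[..., None]
Image.fromarray(np.clip(out, 0, 255).astype(np.uint8)).save("cleaned.tif")
```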

Here’s the result.  Note that this is for aesthetics.  We have left any idea of measuring color or anything like that behind.  I’ll talk about using this stuff for publication in a later post.


GIMP has tools to adjust the white balance either visually or automatically.  I don’t like either one much.

So, here’s the final result compared to the original image:
