Image Analysis Tools

Multispectral Image Calibration and Analysis Toolbox

Digital cameras can be powerful tools for measuring colours and patterns across a huge range of disciplines. However, in normal ‘uncalibrated’ digital photographs the pixel values do not scale linearly with the amount of light measured by the sensor. This means that pixel values cannot be reliably compared between different photos, or even between regions within the same photo, unless the images are calibrated to be linear and any lighting changes are controlled for. Some scientists are aware of these issues but lack convenient, user-friendly software for working with calibrated images, while many others continue to measure uncalibrated images. We have developed a toolbox that can calibrate images from many common consumer digital cameras, and for some cameras the images can be converted to “animal vision”, to measure how the scene might look to non-humans. Many animals, including most insects, birds, reptiles, amphibians, some fish and some mammals, can see down into the ultraviolet (UV) spectrum, so it is important to measure UV when working with these animals. Our toolbox can combine photographs taken through multiple colour filters, for example allowing you to combine normal photographs with UV photographs and convert to animal vision across the animal’s whole range of sensitivities.
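To illustrate the linearity problem described above: display-ready images are encoded with a non-linear tone curve, so pixel values must be linearised before they are proportional to light. The sketch below uses the generic sRGB transfer function as an approximation; it is not the toolbox's own method, which works from RAW files and measured camera curves.

```python
def srgb_to_linear(v):
    """Invert the standard sRGB tone curve.

    v: pixel value scaled to the range 0-1. Returns a value roughly
    proportional to the light that reached the sensor (approximate -
    real cameras apply their own, individually measured curves).
    """
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# A pixel value of 0.5 corresponds to only about 21% of the light of a
# pixel value of 1.0, not 50% - which is why uncalibrated pixel values
# cannot be compared directly:
print(srgb_to_linear(0.5))
```

This is why the toolbox works from RAW files: they preserve the sensor's linear response, avoiding the need to invert an unknown tone curve.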


Download & Install

This toolbox requires a working installation of ImageJ. An R installation is also required if you want to generate new cone-catch mapping models (if you know your camera’s spectral sensitivities). Download the version of the toolbox for your operating system, unzip the files and place them in your imagej/plugins folder. See the user guide for more specific details.

NOTE: If you downloaded version 1.12 and it throws an error when you try to make or import an mspec image, just re-download and copy the files across – there was a file missing!

NOTE: The previous script for generating new cone catch models stopped working with the update to ImageJ v1.50, giving an “IllegalArgumentException: ‘Label’ column not found” error. The toolbox has now been updated, so update to the latest version to solve this issue.

 

Tutorial Video

FAQs

  • Can this software be rolled out for Android, iOS, Chromebook, or cloud processing? This could be done, but it would mean mapping the linearisation curves and spectral sensitivities of every different mobile phone camera, so in practice it wouldn’t be worth doing. Mobile phone cameras are also unable to photograph in UV, which means only a limited number of animal visual systems could be mapped to. Phones normally only output compressed images, which adds extra problems (e.g. compression artefacts and a reduced colour gamut compared to RAW images).
  • Can I open MSPEC images in Photoshop/anything else? No. MSPEC images are just text files that tell the toolbox how to open the RAW file(s) correctly (performing calibration and alignment). MSPEC images are opened as 32-bits-per-channel images, and many other software packages won’t display images saved at this bit depth correctly. Use the “Make Presentation Image” tool to produce a simple RGB colour image for saving as a standard 8-bits-per-channel colour image.
  • Why are MSPEC images shown in black & white by default? Each channel of output from the camera is displayed in a separate (achromatic) slice. Scroll between the slices to see them and measure them separately (though you can use the “plugins>measure>measure all slices” tool to measure all slices more easily). This is because the toolbox can deal with more than three channels (which would be impossible to display in colour). You can make a colour or false-colour image for presentation with the tools included (see above post).
  • Where can I buy the Iwasaki eyeColor bulb? In the UK you can get them here. You also need to run this lamp from a ballast (ask your bulb supplier about this), and it needs wiring together. The wiring is straightforward, but ask an electrician if you’re not comfortable. There are many other potential light sources that might work for full-spectrum UV photography, e.g. the Exo Terra SunRay Metal Halide lamp, which is ready to use for UV straight out of the box. However, care should be taken given that this is a focussed bulb with very high UVB output (it is therefore more dangerous to work with than the eyeColor bulb). The focussed beam also makes controlled illumination more difficult, given that the standard must receive the same light as your sample. See the user guide for further lighting discussion.
  • The MSPEC images are really dark – it’s difficult to see the thing I’m selecting. Linear MSPEC images make dark things look very dark. When you make your MSPEC images, select the “Visual 32-bit” option if you’re just working with normal visible images, or “Pseudo-UV 32-bit” if you’re working with UV & visible. These options show non-linear colour images that look good on your monitor. The pseudo-colour UV images will show any misalignment as the blue channel not matching up with the others. Remember not to measure the colours in these images – they’re just there to help you see everything more easily – use the batch measurement tool to measure the images afterwards.
  • When generating a new Cone-catch model an error crops up complaining about compilation failing. Make sure the name of the camera you’ve added doesn’t start with numbers (e.g. “7D Canon” might not work, try “Canon 7D” instead). This is a known bug, but quite easy to remedy. It results in a “.java” file being created, but no accompanying “.class” file.
  • Why do I have negative reflectance values when using two or more standards? Negative reflectance is obviously a physical impossibility, but you would actually expect it from the image processing under certain circumstances. Camera sensors have inherent noise (which increases with gain, i.e. the ISO setting), so if the camera is photographing something with very low reflectance you would expect some pixels to dip below zero due to this noise (it’s slightly more complicated than this, as there’s also a bounding effect around zero values). However, if mean values taken over lots of pixels are coming out negative, something has gone wrong with your standards: e.g. your dark standard’s actual reflectance is higher than the number you’ve entered, or there is some lighting difference or lens flare over the standard that isn’t over the thing you’re measuring. One way around this is to use the ‘estimate black point’ option instead of (or as well as) a dark standard; alternatively, if lens flare isn’t likely to be an issue, use only one standard and don’t use ‘estimate black point’. If you’re getting any noticeable lens flare over the standard, the photo should not be used.
  • When opening mspec images there is a DCRAW error. Make sure there are no spaces in the file names of the raw images as this can sometimes cause issues.
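As background to the negative-reflectance FAQ above: a two-standard calibration generally amounts to fitting a straight line through the two known reflectances. The sketch below is a simplified illustration, not the toolbox's own code, and the standard reflectances and pixel values are made up; it shows how a dark standard whose entered reflectance differs from reality pushes genuinely dark objects below zero.

```python
def calibrate(pixel, white_px, dark_px, white_refl=0.99, dark_refl=0.02):
    """Map a linear pixel value to estimated reflectance using a white
    and a dark standard photographed under the same illumination.

    white_px/dark_px: mean linear pixel values measured on the standards.
    white_refl/dark_refl: the reflectances entered for those standards.
    """
    slope = (white_refl - dark_refl) / (white_px - dark_px)
    return dark_refl + slope * (pixel - dark_px)

# The white standard itself calibrates back to its entered reflectance:
print(calibrate(2000, white_px=2000, dark_px=60))

# But an object darker than the dark standard comes out negative -
# exactly the symptom described in the FAQ:
print(calibrate(10, white_px=2000, dark_px=60))  # below zero
```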

 

Recent Changes:

  • 15/03/2016 – Bug fixed when generating cone catch models in ImageJ version 1.50 or greater, which caused an “IllegalArgumentException: ‘Label’ column not found” error. Loading multispectral images now offers the option of converting to colour if you don’t want to measure the image. An option has also been added to the batch analysis tool to allow scaling all images equally irrespective of scale bars (useful for scaling down noisy images).
  • 14/12/2015 – Bug in the luminance JND calculation fixed, and a separate tool for calculating luminance JNDs following Siddiqi et al 2004 created. Camera sensitivity files were also renamed to remove any starting with numbers (this stopped compilation working correctly).
  • 6/10/2015 – Photo screening portrait image fix.
  • 5/9/2015 – DCRAW for Mac problem fixed (see user guide) – many thanks to Matt Henry.
  • 7/8/2015 – Addition of photo screening tools, providing photo preview and exposure testing, and easy creation of MSPEC images. Bug fix to JND measurement tool.
  • 29/7/2015 – Bug fix – Generate Cone Catch Model wasn’t working on Windows (tested on Windows 7).

60 Comments

  1. August 7
    Reply

    FWIW, Google Mail blocks d/l of your MicaToolBox zip files … as well as .exe files.
    Just so ya know. No response expected.
    Wish I could’ve used your research; I raise bees & have a LOT of wildlife on my property.
    Best wishes

    • Matt
      August 31
      Reply

      What does Google Mail (Gmail) have to do with downloading the software provided from the links above? Are you trying to email the software to yourself?

  2. Kait
    August 7
    Reply

    This is amazing! How long did it take to make this and how did you get the vision of an animal?

    • jolyon
      August 7
      Reply

      Thanks! The toolbox has accumulated over the past couple of years as I’ve been working on camouflage in our lab. Knowing how the vision of other animals works is based on decades of research by other researchers though. Determining an animal’s sensitivity to different wavelengths is very difficult, and requires microspectrophotometry of the cone cells in the retina, or flickering specific wavelengths of light at cone cells and seeing how they respond.

  3. jim
    August 8
    Reply

    Is a Chromebook cloud version in the works?

    • jolyon
      August 8
      Reply

      I’ve not got any plans to expand to other operating systems as the vast majority of users will use windows, mac or linux. Though you could install linux on your chromebook.

      • Jorge
        August 15
        Reply

        Hey sir, you’ve got a pretty good thing going here, but you need to consider expanding.. I think this type of software would make a pretty good Android app, I’m just saying.
        Possibly create a test (free) version..
        And if the people like it they could download a full (paid) version.
        I know I would..
        Besides, there are a lot more people that work on camouflage, and I can see them really using this type of software to do some on-the-go image analysing..
        I can see it being very successful.

        • jolyon
          August 18
          Reply

          I want to make these tools free for all scientists, so am keen to leave it as a free, open-source project.

  4. Ardy Bee
    August 8
    Reply

    So you’re not going to make this available for Android? What a shame……

    • jolyon
      August 11
      Reply

      Sorry – I don’t have the ability to maintain and calibrate all the phone cameras out there right now! It would be possible for working with non-UV images though.

  5. We are interested in collaborations to assist with mapping gradient color sensitivities to compare differences among various hymenoptera.
    Honey Bee Research Institute and Nature Center Inc., 24 Gendreau Road, Saint David, Maine 04756

    • jolyon
      August 11
      Reply

      Do send me an email (jt at jolyon dot co dot uk).

  6. Elena
    August 9
    Reply

    Waiting for an android version 🙂

  7. Randall badilla
    August 9
    Reply

    just amazing… using this software and the knowledge behind it, we can push further not only how animals see the world, but also how people with eye diseases experience it.

  8. Randall Lee Reetz
    August 10
    Reply

    Is there a place online where one could upload a digital image and download a conversion of that image to insect vision?

    • jolyon
      August 11
      Reply

      For now you’ll have to use the software. It’s pretty fast at processing though, and should work on most platforms.

  9. Randall Lee Reetz
    August 10
    Reply

    I’ve been seriously injured on bicycle rides as a result of insect bites. Wasps and bees seem particularly interested in helmets I ware. I was wondering if there would be a way to submit a series of images of various helmets (in verious colors and textures) through your insect vision filters such that I might see which would be less visible to the sorts of insects that can cause havoc to a bicyclist? The problem of course is that helmets are best when they are very visible to other riders and to motorists and pedestrians (and large wildlife (deer, coyote, dogs, mountain lions, squirrels, and birds. Is there a color or range of colors that might be invisible or less visible to insects and more visible and even alarming to humans and other animals that may be on roads and trails?

    Thanks, Randall Lee Reetz

    • jolyon
      August 11
      Reply

      That’s a really interesting question! I imagine the plastics used in most cycle helmets absorbs UV (is black in UV), but if there are UV reflective parts that could make them attract pollinators. I guess most bees and wasps that get stuck in helmets are accidental as you shoot past them though. Wasps aren’t pollinators, so shouldn’t be attracted by colour quite as much as bees.

      • Callen Peter
        August 27
        Reply

        Yes, wasps are pollinators! Very important ones at that. So making clothing and helmets which are not attractive to bees and wasps would be a great application of this technology.

  10. Randall Lee Reetz
    August 10
    Reply

    In addition, bees, flies, moths, and wasps can fly into the open mouths of riders. It is impractical of course to cover the mouth with anything that may obstruct a cyclist’s ability to breathe freely. What might be done with colors and patterns to keep insects away from a rider’s mouth?

  11. August 10
    Reply

    I want to download this software because I am interested in this scientific application.

  12. Ian prins
    August 10
    Reply

    I read you were not planning on expanding to other platforms, but this looks like an extremely good concept for a mobile app –
    giving a true animal view of the world with live rendering.
    I don’t know how much processing power it takes to make the adjustments, but I’ll throw it out there.

    • jolyon
      August 11
      Reply

      The trouble with using mobile phone cameras is that they won’t produce raw (linear) output, so every phone would need to be carefully calibrated for linearity and its sensitivity curves. So while it would be possible it would be a lot of work to maintain.

  13. bambi
    August 10
    Reply

    Hi Jolyon: I was directed to your website via Gizmag. I am a honey bee keeper and I would love to make the fantastic images you have of what objects may look like through a bee’s eyes. I have watched your detailed tutorial of over an hour, but being a lazy Australian, I was wondering if there was a less complex way of doing it? *smiles*

    • jolyon
      August 11
      Reply

      I’m afraid bees have UV vision, so you’d need a UV camera (which makes things far more complicated!)

  14. LEONID
    August 12
    Reply

    Unreal thanks! With love from Russia <3<3<3

  15. Stephanie
    August 14
    Reply

    Can you make this into a free app for android phones? Or is it not possible

  16. Subcomputer
    August 14
    Reply

    Very nice! Is there enough data so far on Trachemys scripta elegans or Chrysemys picta (both often miscategorized back/forth and with Pseudemys) in order to add one of them at some time? They seem to have some heritage in the field, and I’d like to see how their perception of self and food items works out.

    • jolyon
      August 19
      Reply

      I’m not sure, you might have to go with some other related species likely to have similar vision (e.g. turtles that have had their spectral sensitivities measured). They’re likely to be tetrachromats, but who knows what filtering their oil droplets might be doing…

  17. marios
    August 15
    Reply

    This could be a great app for smartphones.

  18. Specta
    August 15
    Reply

    When will it be possible to use it with smartphones… Android, iOS and Windows phones?

  19. jolyon
    August 18
    Reply

    Hi, currently I don’t have the time to create a standalone program, and ImageJ has loads of built in tools that make it a great option for dealing with this image processing. If you watch the video you’ll see it’s pretty easy to install and use.

  20. Wilhelm
    August 21
    Reply

    No good – impossible to set up on Windows XP 🙁

    • jolyon
      August 24
      Reply

      Is it DCRAW that doesn’t work on XP? It should be possible to compile a version for XP, though I don’t have an XP system available, I’m afraid.

  21. Heather
    August 29
    Reply

    Great idea… And keep it open source and free, for sure. What colors are cats attracted to? Do they see in UV?

    • jolyon
      August 31
      Reply

      Cats probably just see in blue and yellow, though their eyes are adapted for night/low light vision with a higher ratio of rods to cones than many other mammals. So their colour vision is probably not all that great.

  22. Vinicius
    October 11
    Reply

    Hi Jolyon

    Where can I find the white and gray standards you used in the tutorial video?

    Thanks a lot

    • jolyon
      October 12
      Reply

      Hi Vinicius, I made that standard. I should start selling them really, though I don’t have the time to make them. The big labsphere standards are too bulky to use with most macro photography, so these little twin square ones are great.

  23. Alfonso Aceves
    November 22
    Reply

    Hi, congratulations – this is a great tool with many applications. I belong to a study group with Luis Robledo and we have been experimenting with the Toolbox, but last time, when trying to generate the cone mapping model, the log window gave this on Mac computers:
    Waiting for R to process data…
    Operating system: Mac
    Command: Rscript /Users/AlfonsoAceves/rScript.r
    Cannot run program “Rscript”: error=2, No such file or directory

    Have you run into this issue before? Any recommendations?
    I’ll appreciate your response. Have a good day,

    A

    • jolyon
      November 25
      Reply

      Hi Alfonso, Thanks! I’m not sure what’s causing that error. I don’t have access to a mac, so can’t easily debug these things. What version are you running? In the meantime you could generate the mapping files with a windows or linux machine and then copy them across to the mac. Cheers, Jolyon

  24. chen
    December 10
    Reply

    amazing work, Jolyon,
    here is a simple question: where can I get some other camera or lens sensitivity data?
    thank you

    • jolyon
      December 10
      Reply

      Thanks Chen! Calculating a camera’s spectral sensitivity isn’t particularly easy. The easiest way is probably with a monochromator and spectroradiometer. Shine monochromated light onto a diffuse white standard. Take a photo of the standard at a fixed/known shutter speed, then measure the radiance of that sample, then repeat at e.g. 5nm intervals across the whole spectral range of the camera (e.g. 400-700nm for normal cameras). You then compare the pixel values (corrected for exposure time if different exposures were used) to the known radiance at each wavelength to build up the spectral sensitivity. You need to ensure you’re working with linear pixel values when doing this though (very important)!
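The procedure in this reply can be sketched numerically: divide each exposure-corrected pixel value by the measured radiance at that wavelength, then normalise so the peak is 1. All of the numbers below are hypothetical, and working in "pixel value per second" is just one convention for the exposure correction; it is not the toolbox's own code.

```python
# Hypothetical measurements at a few wavelengths (nm): linear pixel
# values, shutter speeds (seconds), and spectroradiometer radiances.
wavelengths = [500, 550, 600]
pixel_values = [1200.0, 2600.0, 900.0]
exposures = [1 / 100, 1 / 200, 1 / 100]
radiances = [0.8, 0.9, 0.7]

# Relative sensitivity at each wavelength:
# (pixel value per second) / radiance, normalised to a peak of 1.
raw = [(p / t) / r for p, t, r in zip(pixel_values, exposures, radiances)]
peak = max(raw)
sensitivity = [s / peak for s in raw]
print(sensitivity)
```

A real measurement would repeat this at fine (e.g. 5nm) steps across the camera's whole spectral range, and, crucially, only works on linear pixel values.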

  25. Amanda
    December 10
    Reply

    Hi there,
    Excellent program! Such a great idea and the instructional video is so good.
    I’m having an issue with creating a cone catch model. Each time I try to generate a new model (using the data which come with the package) I get two error messages pop up. One says “Plugin or class not found: “Cone Model Name” (java.lang.ClassNotFoundException: Cone Model Name)” And then a second pops up saying “Macro Error, No window with title “Exception” found”.
    I don’t know much C++, but I added in print commands and it seems to run line 751 “run(“Compile and Run…”, compileString);” but then gets stuck. There is no .class file created in the Cone Models folder, but a .java file is created.
    I’ve tried reinstalling ImageJ not in Program Files (incase it was an access issue) but no luck. I’m using my own laptop running Windows 10, R 3.2.2, Java 1.8.0. .
    Does anyone have any suggestions? I’ve run out of ideas!
    Thanks so much!
    Amanda.

    • jolyon
      December 10
      Reply

      Hi Amanda, I think someone else possibly ran into this issue (is the code creating a “.java” file in plugins/Cone Models/, but no “.class” file?). In the end it turned out that the compiling commands don’t like the files to start with numbers or have certain characters in the name. Try changing the name of your camera in plugins/Cone Mapping/Cameras to take out any numbers at the start of the file name. Do let me know if this solves your issues!

  26. Kristen
    February 5
    Reply

    Hi! I am a PhD student in California, studying UV floral patterns in annual wildflowers. Whenever I try to generate a multispectral image, I get to the prompt “Select Visible Photo containing standard” and select a raw vis photo with a standard, and it says ‘Lookup Thread terminated with code 1. Cannot decode file __(the file name).rw2’. Any suggestions on where the problem is? Thank you so much for making this tutorial and software freely available!!! It’s been a huge help already. -Kris

    • jolyon
      February 8
      Reply

      Hi Kristen, sorry you’ve been having issues. I think I might have encountered this issue sporadically, and have never been able to replicate it. It seems to be due to DCRAW playing up, and I always found restarting ImageJ sorted it. I recommend trying to open ImageJ with a different version of Java. Do let me know if this helps, and feel free to email if it doesn’t.

  27. Cedric
    February 29
    Reply

    Hi Jolyon. I have the same issue as Amanda (“Macro Error, No window with title ‘Exception’ found”). That’s the only prompt that appears when I try to create a model using your nikkon 300D with 60mm lens (400-700), D65 (400-700), Gobiusculus flavescens, natural spectra 400-700, n=3, no stepwise, no square transforms, and including diagnostic plots. The program creates both files (.java and .class) and the model becomes available to use when refreshing the menus. When I change the camera name in the plugin folder (no capital letters, replacing spaces with underscores) I get both error messages Amanda describes. I am using a Win10 64-bit system, ImageJ (64-bit) v.1.49, installed in Program Files (x86), Java 1.8.0_73.

    • jolyon
      March 1
      Reply

      Hi Cedric, thanks for the info. I’ll try to fix this issue. At least the model is created ok, but it’s annoying to have error windows. Cheers, Jolyon

  28. John Cavagnaro
    March 8
    Reply

    1. What nikkor 105 lens are you using for the spectral sensitivity curves in your paper (just called nikkor 105mm)? the UV nikkor f4.5, or the f2.8 vr micro-nikkor? Is spectral sensitivity not mapped for the UV-nikkor on the d7000?

    2. In the full spectrum converted d7000s, does the quartz filter have an AR coating of any kind, and would this affect the spectral sensitivity curves?

    • John Cavagnaro
      March 8
      Reply

      I’ve also read that the coastal optics 105 is modelled on the UV nikkor, would they have the same spectral transmittance?

      also AR refers to anti-reflective coating, which generally increases transmittance for a lens/filter.

      • jolyon
        March 8
        Reply

        Hi, in our paper we refer to the standard 105mm F/2.8 micro-nikkor lens. The UV-Nikkor has only just gone into production again hasn’t it? We don’t have one in our lab, instead using the Ocean Optics lenses. I’m not sure whether they have any AR coatings, though I suspect they don’t, as (according to my limited knowledge of these coatings) they tend to only work in a limited spectral range (i.e. not across both UV and visible). I would guess that the UV-Nikkor would result in very similar spectral sensitivities to the Ocean Optics lenses. The main thing affecting the spectral sensitivities with different lenses is the UV transmittance cut-off, but given these lenses both transmit well down below 300nm (beyond the sensor sensitivity) I think they will be very similar. If you have access to a spectrometer you could see whether the D7000 with UV-Nikkor (but modelled with the Ocean Optics lens) produces the correct cone catch quanta from a sample of colours (as we did in our paper).

  29. Andy
    June 22
    Reply

    Hi Jolyon! I adapted a Samsung NX1000 according to your manual and it works just fine. The only thing is that the pure PTFE that I tried to use as a standard is always overexposed in the redUV channel. The very far right bar in the histogram is always indicated, no matter how underexposed the rest of the photo is (I use the same Baader UV Venus filter). Is it a problem with the camera, or is the PTFE supposed to be that reflective? Is there any way to reduce its reflectance a bit, but keep it flat? Unfortunately, I have no spectrophotometer at the moment to check the reflectance. Thanks.

    • jolyon
      June 22
      Reply

      Hi Andy, Glad you’ve got your camera working! The PTFE is nearly white, while very few other things in nature are that white. As a result the camera will generally be trying to expose for the majority of the image rather than the brightest thing in it. So you could simply switch to manual exposure control (“M” mode) if going right down to -3 stops in “A” mode doesn’t work. Buying a grey (e.g. 10-50%) reflectance standard would be one solution if you want to stick with automatic exposure control (but generally very expensive). Another solution is to have more of the white PTFE in the background, this will help the camera’s automatic exposure control.

  30. Pedro
    July 11
    Reply

    Hi Jolyon! Such an amazing approach to measuring and interpreting animal coloration. I have been exploring the methods as suggested in both the paper and the toolbox manual. I have generated visual models successfully (with help from Luis Robledo) and the software seems to be working properly so far. Recently I have been trying to compare objective coloration without modelling any visual system, since I just want to obtain reflectance values from different species of damselflies and compare them. The problem I am having is that for the uvR channel, and in very rare cases the uvB channel, I get negative values. Is there a chance that this has to do with the measured material (coloured translucent wings) in relation to the standard? For example, these negative values are more frequent when measuring really dark objects. My photos don’t seem to be overexposed or underexposed, by the way. How should I interpret this data? Or should I transform it first before proceeding to analyse? Thanks in advance.

    • jolyon
      July 13
      Reply

      Hi Pedro,

      See the FAQs above on negative numbers. These happen when the dark point isn’t correct, either because the reflectance of your dark standard is actually higher than the value you entered, or because the automatic dark point estimation is going wrong. One simple solution is to only use one standard and no dark point correction (only do this in conditions where lens flare/loss of contrast isn’t an issue). Alternatively, you could find your minimum value and add this to all of your measurements (across all channels). e.g. if your lowest measurement is -100 in vR, then add 100 to all of your channels. This is obviously a last resort though as it will subtly alter the colour ratios. Another likely cause is that the dark standard is receiving slightly more light than the thing you’re measuring.
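The last-resort offset workaround described in this reply can be sketched as follows. This is a simplified illustration with made-up per-channel values, not the toolbox's own code; as noted above, it subtly alters colour ratios, so it should only be used when the proper fixes aren't available.

```python
def offset_correct(measurements):
    """Shift every measurement in every channel up by the magnitude of
    the most negative value, so nothing remains below zero."""
    low = min(min(vals) for vals in measurements.values())
    if low >= 0:
        return measurements  # nothing negative, leave data untouched
    return {ch: [v - low for v in vals]
            for ch, vals in measurements.items()}

# Hypothetical per-channel means with a negative value in vR; the same
# offset (here +100) must be applied across all channels:
data = {"vR": [-100.0, 250.0], "vG": [300.0, 400.0]}
print(offset_correct(data))  # {'vR': [0.0, 350.0], 'vG': [400.0, 500.0]}
```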

  31. Stuart Pointon
    July 18
    Reply

    Hi Jolyon. I am after a method to simulate the effect of different light sources’ spectral outputs on an image. For instance, if I have a reference image captured under an illuminant A (3200K) halogen light source, I then want to simulate the effect of an LED light source on that image. Getting the spectral output of the light sources is easy. I was wondering if your toolkit could work in this way? Or do you know of an ImageJ or other analysis plugin or app? Kind regards

    • jolyon
      July 20
      Reply

      Hi Stuart,

      Yes, you could do this with the toolbox. The caveat is that you’ll need to know your camera’s spectral sensitivities, and take the sample photos under a nice broad-band illuminant (like natural sunlight, or a good arc lamp that doesn’t have a very spiky output). Then you can create models for each illuminant you want to simulate. The output of the toolbox has gone through the Von Kries equation, so the images assume that the animal’s visual system has adapted to that illuminant.
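The Von Kries correction mentioned in this reply normalises each receptor channel by that channel's response to the illuminant (or to a white surface under it), modelling a viewer adapted to the light. A minimal sketch with made-up cone-catch values:

```python
def von_kries(cone_catches, illuminant_catches):
    """Scale each cone channel by the eye's response to the illuminant,
    so a white surface gives equal channel responses under any light."""
    return [q / i for q, i in zip(cone_catches, illuminant_catches)]

# A grey surface under a reddish light: the long-wave channel is
# inflated by the light itself, and the correction removes that bias,
# leaving equal adapted responses.
surface = [0.9, 0.5, 0.4]   # L, M, S quantum catches (hypothetical)
light = [1.8, 1.0, 0.8]     # the same channels for the illuminant
print(von_kries(surface, light))  # [0.5, 0.5, 0.5]
```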

  32. Graeme Awcock
    July 26
    Reply

    In my humble opinion this toolbox is an excellent starting point for practical imaging science investigations using DSLR’s in a wide variety of applications; – thank you very much for making this openly available!
    However, for my own part, I am interested in investigating colour shift due to heat treatment of non-organic materials, and because I lack a very uniform white light source, I would like to add a processing step to the workflow to perform a sort of ‘flat-field correction’ based on a second image file that I would have captured of the ‘white’ reference card filling the field of view. I recognize that this needs to be integrated fully into the workflow if I want to be able to take advantage of your excellent batch processing routines.
    So, am I right in thinking that your plugins are scripted entirely in ImageJ script, located as .txt files in the relevant folder? e.g. Is it correct that the full functionality of the “Generate Multispectral Image” plugin is controlled by the text within the “_Generate_Multispectral_Image.txt” file? So, if I wanted to add my “flat-field correction” step before normalisation, I *could* do so by editing that text file in the proper way?
    If that is the case, then I feel I could make a decent attempt at writing that, not least because your script files seem to me to be a very clear example of well-disciplined coding in ImageJ script.
    Can you see any reason why I should not be able to achieve my desired outcome?

    • jolyon
      August 7
      Reply

      Thanks so much for your kind comments. Incorporating flat-field correction should be easy enough; I’ve sent you an email with more info.

      Cheers,
      Jolyon
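For readers following this thread: flat-field correction of the kind discussed divides each image by a normalised image of a uniform white reference, so that uneven illumination cancels out. Below is a rough plain-Python sketch with hypothetical values; the toolbox itself would implement such a step in ImageJ script.

```python
def flat_field_correct(image, white_ref):
    """Divide a linear image by a white-reference image of the same
    field of view, normalised to the reference's mean, so that uneven
    illumination cancels out."""
    mean_ref = sum(sum(row) for row in white_ref) / (
        len(white_ref) * len(white_ref[0]))
    return [[px * mean_ref / ref for px, ref in zip(irow, wrow)]
            for irow, wrow in zip(image, white_ref)]

# A uniform grey target photographed under light that falls off to the
# right comes back uniform after correction:
image = [[100.0, 80.0], [100.0, 80.0]]
white = [[200.0, 160.0], [200.0, 160.0]]
print(flat_field_correct(image, white))  # all pixels equal (90.0)
```

Note this step only makes sense on linearised pixel values, and the reference must be captured under the same lighting as the sample images.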
