Dissimilar: an experimental Image Quality Assurance tool

An important part of image file format migration is quality assurance.  Various tools exist, such as ImageMagick or Matchbox, but they each provide only one metric or target different use-cases.  I wanted to investigate implementing image comparison algorithms myself, so I began experimenting.

I created a prototype tool/library for image quality analysis, called Dissimilar.  I had previously prototyped a tool that used the OpenCV libraries from Java to perform image comparisons.  Those experiments showed that, while possible, the approach was not ideal: a large native-code shared object had to be packaged with the tool and some manual memory management was required.

Algorithms

For Dissimilar I subsequently implemented PSNR and SSIM algorithms from scratch in Java, making use of Apache-Commons Imaging and Math3 libraries.  The result is about 600 lines of commented, pure-Java code for performing image quality analysis.
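
As a rough illustration of the PSNR half of this (a sketch in the same spirit, not the actual Dissimilar source; the class and method names are my own), a pure-Java calculation over two greyscale images loaded with Commons Imaging might look like this:

```java
import java.awt.image.BufferedImage;
import java.io.File;

import org.apache.commons.imaging.Imaging;

public class PsnrExample {

    /** Mean squared error over the grey values of two same-sized images. */
    static double mse(BufferedImage a, BufferedImage b) {
        double sum = 0;
        for (int y = 0; y < a.getHeight(); y++) {
            for (int x = 0; x < a.getWidth(); x++) {
                // take the low byte of the packed RGB value as the grey level
                int ga = a.getRGB(x, y) & 0xff;
                int gb = b.getRGB(x, y) & 0xff;
                double d = ga - gb;
                sum += d * d;
            }
        }
        return sum / (a.getWidth() * a.getHeight());
    }

    /** PSNR in decibels for 8-bit samples: 10 * log10(255^2 / MSE). */
    static double psnr(BufferedImage a, BufferedImage b) {
        return 10 * Math.log10((255.0 * 255.0) / mse(a, b));
    }

    public static void main(String[] args) throws Exception {
        BufferedImage original = Imaging.getBufferedImage(new File(args[0]));
        BufferedImage migrated = Imaging.getBufferedImage(new File(args[1]));
        System.out.println("PSNR: " + psnr(original, migrated) + " dB");
    }
}
```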

The SSIM is calculated for an image by splitting it into 8-pixel by 8-pixel “windows” and then calculating the mean of the results for each window.  In addition to the (mean) SSIM value, Dissimilar reports the minimum SSIM value alongside the variance of the SSIM values.  It may be useful to use some combination of the mean, minimum and variance to set a better threshold for image format migration.  For example, setting a minimum value would ensure that the quality of all 8×8 windows stayed above a certain threshold.  Using the variance would enable identification of images with large differences between the individual SSIM windows, even where those values still produce a mean that is assessed as acceptable.
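
To make the aggregation concrete, here is a small sketch (my own illustration, not Dissimilar's code) of the standard per-window SSIM formula and the mean/minimum/variance aggregation, using Commons Math3's DescriptiveStatistics:

```java
import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;

public class SsimAggregationExample {

    /** Standard SSIM for a single window, given that window's statistics. */
    static double windowSsim(double meanA, double meanB,
                             double varA, double varB, double covAB) {
        // stabilising constants for 8-bit data: k1 = 0.01, k2 = 0.03, L = 255
        final double c1 = Math.pow(0.01 * 255, 2);
        final double c2 = Math.pow(0.03 * 255, 2);
        return ((2 * meanA * meanB + c1) * (2 * covAB + c2))
                / ((meanA * meanA + meanB * meanB + c1) * (varA + varB + c2));
    }

    /** Aggregate per-window SSIM values into the mean, minimum and variance. */
    static double[] aggregate(double[] perWindowSsim) {
        DescriptiveStatistics stats = new DescriptiveStatistics(perWindowSsim);
        return new double[] { stats.getMean(), stats.getMin(), stats.getVariance() };
    }
}
```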

Testing

Testing was performed on our Hadoop cluster to enable comparison of the results from ImageMagick (PSNR) and Dissimilar (PSNR/SSIM).  A TIFF was migrated to a lossy JP2 and then back to TIFF.  The original TIFF and the round-tripped TIFF were then compared using each tool, so each tool had identical inputs.

It is worth noting that there is no built-in support for JPEG2000 files in Apache-Commons Imaging, so a known decoder should be used to decompress the JP2 to TIFF for comparison.  For more about that see our iPres paper in September.
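
For example, a minimal sketch of driving an external decoder from Java before comparison might look like the following (it assumes Kakadu's kdu_expand is installed and on the PATH, and is not part of Dissimilar itself):

```java
import java.io.File;
import java.io.IOException;

public class Jp2ToTiff {

    /** Decompress a JP2 to a temporary TIFF with an external decoder. */
    static File decompress(File jp2) throws IOException, InterruptedException {
        File tiff = File.createTempFile("decoded-", ".tif");
        Process p = new ProcessBuilder("kdu_expand",
                "-i", jp2.getAbsolutePath(),
                "-o", tiff.getAbsolutePath())
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("kdu_expand failed for " + jp2);
        }
        return tiff;
    }
}
```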

Results

Results on a homogeneous dataset of 1000 greyscale image files showed that ImageMagick took about half the execution time of Dissimilar.  This is a reasonable result given that the code is currently unoptimised.  The execution time of Dissimilar also includes starting a new JRE, calculating SSIM and saving an SSIM “heatmap” image that highlights the low-scoring windows, so some execution-speed savings are expected.  It is also possible to call the code as a library from within a Java workflow, removing the overhead of starting a new JRE.  Some information on the difference between using a Java library and executing a new JRE has been blogged about before.
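
The difference between the two invocation styles is roughly as in the sketch below; the jar name and Dissimilar.compare() are placeholders for illustration, not the tool's real packaging or API:

```java
import java.io.File;

public class InvocationStyles {

    public static void main(String[] args) throws Exception {
        File original = new File(args[0]);
        File migrated = new File(args[1]);

        // Per-file invocation: every comparison pays for a fresh JRE start-up.
        new ProcessBuilder("java", "-jar", "dissimilar.jar",
                original.getPath(), migrated.getPath())
                .inheritIO().start().waitFor();

        // In-process call from a Java workflow: no extra JRE per comparison.
        // double ssim = Dissimilar.compare(original, migrated); // placeholder API
    }
}
```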

The PSNR results were identical to those of ImageMagick.  The SSIM results did not match Matchbox’s, but I think it and Dissimilar calculate SSIM in different ways.  I couldn’t find another readily available and tested tool to calculate SSIM with which to verify the results – suggestions are welcome!

Future

Next steps include testing more files, producing more unit tests, optimisation, and identifying suitable threshold values for the SSIM mean, minimum and variance.  I am also going to investigate adding further image quality assessment metrics.

4 Comments

  1. Bedrich
    September 5, 2013 @ 1:37 pm CEST

    Hi there,

    We have been developing a web-based application called The Image Data Validator – DIFFER, along with a command-line version that could be integrated (via a REST API) into an operational digital preservation framework. DIFFER stands for Determinator of Image File Format propERties.

    This quality control application is designed for still image file formats (TIFF, JPEG, JPEG 2000, DjVu, PNG, PDF, FITS). It is capable of performing identification, characterization, validation and visual/mathematical comparison.

    The online application DIFFER utilises existing tools (JHOVE, FITS, DAITSS, FFIdent, ImageMagick, PRONOM, ExifTool, KDU_expand, DJVUDUMP, Jpylyzer, etc.), which are mainly used separately across a whole spectrum of existing projects.

    This open-source application comes with a well-structured and uniform GUI, which helps the user to understand the relationships between various file format properties and to detect visual and non-visual errors, and which simplifies decision-making. An additional compliance-check feature is designed to help check JPEG2000 files against the required specification.

    It would be great to cooperate with others and develop the application into a usable and durable tool that would be used by a bigger community than just us :). If you have any questions, please contact me anytime: bedrich at gmail dot com.

    Homepage: http://differ.nkp.cz (the project is in progress; the stable version should be done at the end of the year)

    Documentation: http://differ.readthedocs.org/en/latest/

    Source code: https://github.com/Differ-GSOC/differ

    Poster: https://docs.google.com/file/d/0B9Ah7Og9gY_ORi1kandLZVJ2NEU/edit?usp=sharing

    YouTube presentation: http://www.youtube.com/watch?v=2u0MxhOZ5h8

    We are proudly part of GSoC 2012 and GSoC 2013.

  2. andy jackson
    July 17, 2013 @ 1:42 pm CEST

    One possibility for the difference between your implementation and the Matchbox one is the way they handle precision errors during the long summations involved in these algorithms. Even for PSNR, I found that moderately large images with three colour channels could not be compared using the simple double-precision MSE accumulator you use in your code.

    The problem is that you are adding lots of small numbers together, and as the overall total gets very large, the available precision can fail you: you add another small difference to the large number, and the large number simply does not change. You can mitigate this using Kahan summation, but can only avoid it completely by using a BigInteger (arbitrary-precision) approach. You can find some very similar code I wrote during the Planets Project for PSNR calculation here, which uses both approaches.
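
    For readers who have not seen it, a minimal sketch of Kahan (compensated) summation in Java (not taken from either codebase) looks like this:

    ```java
    public class KahanSum {

        /** Compensated summation: carries the low-order bits each plain add loses. */
        static double sum(double[] values) {
            double total = 0.0;
            double compensation = 0.0; // error carried over from previous additions
            for (double v : values) {
                double y = v - compensation;    // re-inject the previously lost bits
                double t = total + y;           // big + small: low bits of y may vanish
                compensation = (t - total) - y; // recover what was lost this step
                total = t;
            }
            return total;
        }
    }
    ```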

  3. acdha
    July 24, 2013 @ 1:11 pm CEST

    As an aside, I've used approaches like Nailgun (http://www.martiansoftware.com/nailgun/) with Jython/JRuby scripts to avoid starting a new JVM each time a tool is invoked. If nothing else, it should be easy to measure the performance impact of a persistent JVM and of giving the JIT more optimization information and time, without making significant changes to Dissimilar.

  4. willp-bl
    July 17, 2013 @ 2:03 pm CEST

    I had experimented with BigDecimal but my PSNR code without it gets the same result as ImageMagick, so far at least, even on ~28-megapixel greyscale images.  I'll keep an eye out for rounding errors though – I imagine three-channel image files might cause issues.  It seems that the biggest difference is the image loading library used; I first used https://github.com/stain/jai-imageio-core to load TIFF files and the PSNR values did not always match ImageMagick's.  Changing the image I/O library to Commons Imaging gave the same results as ImageMagick – it must be something to do with the TIFF compression being used.  Just need a release of the library now.

    The differences in SSIM between Dissimilar and Matchbox look like more than rounding errors, though – I need to do more testing.

    Thanks for the pointers, especially to the Planets code.
