Category Archives: Colour & Imaging Performance

Handbook of Digital Imaging: Image Quality Chapter

Burns, P. D., Image Quality Concepts, in Handbook of Digital Imaging, ed. M. Kriss, John Wiley and Sons, 327-371, 2014.

John Wiley and Sons, in collaboration with IS&T, has published the Handbook of Digital Imaging in three volumes:
1. Image Capture & Storage
2. Image Display and Reproduction
3. Imaging System Applications
All are available in print and online.

Michael Kriss edited Volume 1 and contributed as an author. I also wrote a chapter, Image Quality Concepts, which turned out to be more work than I expected. The challenge, of course, is to choose important concepts and explain them simply and clearly – just enough maths and acronyms. A quotation by Einstein comes to mind. The chapter is offered free online.

Handbook of Digital Imaging Volume I: Image Capture and Storage

Table of Contents:

  1. Digital Versus Analog Imaging, Michael Kriss
  2. Optics for Digital Imaging, Peter B. Catrysse
  3. Solid-State Image Sensors, Boyd Fowler
  4. Digital Imaging: An Introduction to Image Processing, Michael Kriss
  5. Color Reproduction for Digital Cameras, Michael Kriss
  6. Image Compression and File Formats, Michael Kriss
  7. Image Quality Concepts (free), Peter D. Burns
  8. Image Systems Simulation, Joyce E. Farrell and Brian A. Wandell
  9. Multispectral Imaging, Yoichi Miyake and Vladimir A. Bochko
  10. Understanding Glare and How it Limits Scene Reproduction, Alessandro Rizzi and John J. McCann

Peter Burns

Rapid Capture and IS&T’s Archiving Conference

(Photo by Brendan McCabe)

Several articles at Smithsonian.com about rapid (image) capture reminded me of the connections to our IS&T Archiving Conference community. Briefly, Rapid Capture describes high-speed image capture and storage for large digitization projects. The Rapid Capture Team* was formed with members from several cultural institutions, including Ken Rahaim, Smithsonian Institution, who has been a member of the conference Program Committee. Recent examples of Rapid Capture Pilot Projects include:

  • Thomas Sears Collection at the Archives of American Gardens (photographic plates)
  • Bumblebees, from the National Museum of Natural History (45,000 bees)
  • Historical Currency (printing) Proofs, from the National Numismatic Collection housed at the Smithsonian’s National Museum of American History (250,000 items)
  • Freer Study Collection, Freer Sackler Museum
  • National Air and Space Museum

Large-scale digital collection acquisition is made practical by using automated methods for monitoring imaging performance, and custom-built hardware systems for object handling.

Test Targets: Here Richard Kurin, Smithsonian Institution, checks out the imaging for the Historical Currency collection (Photo by Günter Waibel). The object-level test target, shown expanded, is used to monitor camera focus, lighting, color capture, etc. Such targets, and the associated software, are used in quality-assurance programs, now part of national and international digitization guidelines for repositories and museums, and are frequently presented at Archiving Conferences. Speaking of targets, here is the same one being used as part of the Bumblebees project – 6,000 bees were captured in the first two weeks (Photo from the Smithsonian Digitization Facebook page). According to Tim Zaman, this may be the Most Digitized Test Target in the World. A bold statement, until you consider that it is designed to be captured alongside each object, and has been used in four of the five projects mentioned above, and in many others in the US and Europe for several years. (More on this in another blog posting …)

Hardware: For very large projects, fully automated conveyor-belt systems are employed. Here is a frame from a video showing the operation of a system delivered and set up by Picturae BV. Olaf Slijkhuis is shown in this belt's-eye view. Olaf also presents frequently at the Archiving Conference series, most recently in Berlin in 2014.

Metadata: At the Archiving Conference, imaging is just one part of the program, which also includes sessions on digital preservation, forensics and curation, and metadata verification. So here is a nod to metadata. Below is an example image from the Bumblebees collection. We see the various note cards that were captured along with the insect. Presumably they all mean more or less Bombus Affinis Cresson ♂.

Et Alii: The IS&T Archiving Conference is in its eleventh year, and many participants have contributed to the development of the methods and tools that support rapid image capture and verification for the cultural heritage community. Although my list will be incomplete, for those interested I suggest submitting the following terms to your favorite search engine: FADGI, Metamorfoze, Image Engineering, Imatest, digital workflow, archiving conference … or ask via comments on this page.

For more information on this post:

1. The object-level test target was developed by Don Williams of Image Science Associates. (I make no commission on Don’s target sales, but do have fun writing software to analyze performance and improve images based on test target images).
2. Mission Not Impossible: Photographing 45,000 Bumblebees in 40 Days Smithsonian Magazine
3. Museums Are Now Able to Digitize Thousands of Artifacts in Just Hours Smithsonian Magazine
4. Picturae BV
5. Presentations: Don and I will be presenting at IS&T’s Archiving Conference next month at the Getty Museum in Los Angeles. More information at Upcoming Events

_________________
* I believe that Captain Capture is also a member of the team. He usually attends meetings remotely, but flies in occasionally.

– Peter Burns

Skin Colour Demonstration

Here is a small demo I put together, based on a database of 250 skin colours, in the form of two videos. I used the CIELAB values and rendered each as an sRGB image tile. When I first looked at the set, the samples seemed to be arranged with small colour differences between each. When I randomized the order, things looked different. Same data (although I did not ensure that each sample was selected only once for the random selection).
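Rendering a CIELAB value as an sRGB tile takes only a few lines of code. A minimal sketch with NumPy, assuming a D65 white point and the standard sRGB encoding (the demo's exact rendering settings were not recorded here):

```python
import numpy as np

# D65 reference white
WHITE = np.array([0.95047, 1.0, 1.08883])

# XYZ -> linear sRGB matrix (IEC 61966-2-1)
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

def lab_to_srgb(L, a, b):
    """Convert one CIELAB colour to 8-bit sRGB values."""
    fy = (L + 16.0) / 116.0
    f = np.array([fy + a / 500.0, fy, fy - b / 200.0])
    # Inverse of the CIELAB compression function
    f3 = np.where(f > 6.0 / 29.0,
                  f ** 3,
                  3 * (6.0 / 29.0) ** 2 * (f - 4.0 / 29.0))
    xyz = WHITE * f3
    rgb = np.clip(M @ xyz, 0.0, 1.0)    # linear sRGB, clipped to gamut
    # sRGB gamma encoding
    rgb = np.where(rgb <= 0.0031308,
                   12.92 * rgb,
                   1.055 * rgb ** (1.0 / 2.4) - 0.055)
    return np.round(255 * rgb).astype(int)

def tile(L, a, b, size=64):
    """Return a size x size x 3 image tile of a single colour."""
    return np.tile(lab_to_srgb(L, a, b), (size, size, 1)).astype(np.uint8)
```

Each tile can then be written out or stacked into video frames; note that out-of-gamut CIELAB values are simply clipped here.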

Harmony: Demo 1 in which order from the data set is preserved here

Diversity: Demo 2 in which order is randomised. Different impression? here

The data is available from Spectromatch (NO LONGER POSTED, contact Peter Burns for information)

JPEGmini – your photos on a diet?

Recently I was made aware of JPEGmini, http://www.jpegmini.com/, a product that is described as optimized JPEG compression for your photos. Compression of up to 5x is touted, which is not very impressive to people developing image compression. However, the claim of virtually no loss of image quality got me interested, so I devised an experiment to verify the claims and evaluate JPEGmini’s performance.

Background: JPEGmini’s claims are based on the compression of JPEG files. These are already compressed by the processing of digital cameras, by about 3x. This is possible because of the natural statistics of image data, and is routinely done with little or no apparent loss or distortion. JPEGmini compression of up to 5x therefore implies an overall compression of about 15x. Sounds impressive, but is it? After downloading several demo image sets (before and after JPEGmini processing), I observed little or no loss of detail or introduction of distortion, quantization, etc. So far, good news …

But how good is it? To assess the value of JPEGmini’s optimized JPEG compression, I used ordinary JPEG to compress several of their example image files. When saving a JPEG file, most software allows the selection of a quality level, usually on a 0-100 scale. By adjusting the quality level for each image, I was able to generate files of the same size as the corresponding JPEGmini files. To compress the original camera images I used PhotoFiltre, http://www.photofiltre-studio.com/, a great free photo-editing programme by Antonio Da Cruz. The quality-level settings needed to match the file sizes ranged from 72 to 85.

Comparison: After close visual inspection the results from JPEGmini were virtually identical to those from PhotoFiltre’s standard JPEG. So while the results were good, at least for the images tested, they were not unique – optimized or not.


Dog’s eye, cropped from full image.

Refined Measurement of Digital Image Texture Loss / Noise-power spectrum

I have posted my presentation from the recent Electronic Imaging Symposium. It may be of interest to those developing imaging performance methods via (noise- or signal-) spectra. I describe a simple step that improves the power-spectrum measurement by making it more robust in the presence of lens shading, etc.

http://www.slideshare.net/Pdburns/refined-measurement-of-digital-image-texture-loss-16516950
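The details are in the slides, but the basic idea of keeping slowly varying signal out of a noise-power spectrum (NPS) estimate can be sketched as follows. This is my own illustration, not necessarily the exact method in the talk: fit and subtract a low-order (planar) trend from the uniform patch before the FFT, so lens shading does not leak into the low-frequency bins.

```python
import numpy as np

def nps(patch, detrend=True):
    """2-D noise-power spectrum estimate for a nominally uniform patch.
    If detrend, a least-squares plane is removed first, so slowly
    varying shading does not inflate the low-frequency bins."""
    patch = np.asarray(patch, dtype=float)
    ny, nx = patch.shape
    if detrend:
        y, x = np.mgrid[0:ny, 0:nx]
        A = np.column_stack([x.ravel(), y.ravel(), np.ones(nx * ny)])
        coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
        patch = patch - (A @ coef).reshape(ny, nx)  # remove planar trend
    else:
        patch = patch - patch.mean()                # remove mean only
    spec = np.fft.fftshift(np.fft.fft2(patch))
    return np.abs(spec) ** 2 / (nx * ny)
```

With a simulated patch of white noise plus a linear shading ramp, the detrended estimate keeps the low-frequency bins near the noise floor, while the mean-only version shows a large spurious peak next to DC.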