A few more photos from a visit to Oslo to attend IS&T’s Archiving Conference last month.

Below is a group photo of the attendees, taken by Mogens Bech. Can you find him?
Attendees of IS&T’s Archiving Conference in Oslo
On Wednesday, April 26, 2023, Majid Rabbani and Prasanna Reddy Pulakurthi presented as part of the Society for Imaging Science and Technology (IS&T) Rochester NY chapter seminar series.
Abstract: Super-resolution refers to obtaining an image at a resolution higher than that of the camera sensor. It can be applied to a single low-resolution image or to a frame within a captured low-resolution sequence. The approaches can be either computation-based (using physical modeling of the capture process, subpixel motion estimation and image registration, and regularized spatially variant deblurring) or example-based, using machine learning and deep convolutional networks. This presentation provides a brief overview of the field’s history and evolution while addressing its challenges and future directions.
Video file of the presentation (YouTube), https://youtu.be/czlEG-QkKRI
Presentation File (PDF), /burnsdigitalimaging/ISandT/ISandT%20Super-Resolution%20Rabbani%20and%20%20Pulakurthi%20.pdf
Some time ago L’Oréal* developed a colour chart showing various shades of skin colour, based on a sampling of the spectral reflectance characteristics of women’s (healthy) skin around the world.
For a current project, we are addressing the requirements for, and control of, image capture for dermatologists. By understanding important optical (colour) characteristics for this imaging application, we can tailor system design and evaluation methods for improved performance. The chart and similar databases can provide information on system tolerances, and sources of variability.
Report: A short report on the region of (colour) signal-space occupied by the chart is available here, Coordinates of L’Oréal Skin Color Chart (updated) (PDF)
Color Chart: L’Oréal webpage, A new geography of skin color
_____
*L’Oréal Paris, the cosmetics company
On 15 Feb. Bob Fiete presented a talk as part of the Society for Imaging Science and Technology (IS&T) Rochester NY chapter seminar series.
When we see the images taken from satellites on the evening news, from the Russian invasion of Ukraine to the distant galaxies captured by the James Webb Space Telescope, most people don’t realize the significant role played by a group of talented people in Rochester, NY to make these images a reality. This talk will look at the history of imaging technologies and the pioneering accomplishments in Rochester, NY that give us these amazing images from space.
Dr. Bob Fiete is Chief Technologist and Senior Fellow at L3Harris with over 40 years of experience in imaging science. He was Director of R&D for ITT Space Systems Division and has chaired conferences and seminars on imaging and optics. He has briefed the White House and US House of Representatives, and worked with law enforcement on cases involving image exploitation.
Here is the video,
As the Society for Imaging Science and Technology (IS&T) Archiving Conference opens tomorrow, a reminder of the scope of imaging projects for collections.
From Don Williams, part of the Smithsonian Institution’s celebration of their National Herbarium Collection Digitization Project (4 million items scanned). Cake (right), and the item (left) I found based on the cake’s ‘metadata’. Cake photo courtesy of Ken Rahaim.
Don Williams presented our talk at the IS&T Archiving 2021 Conference last week.
Here is the presentation file
Previously, I presented how frequency-domain analysis can be applied to audio recordings, and showed results based on Perseverance Rover audio files released by NASA. Since NASA posted the original (with-Rover) and filtered (without-Rover) versions, we can use these to compute (estimate) the sounds of the Rover itself.
Based on the previous examination, I found consistent spectral spikes (tones) at 5-7 kHz and near 13 kHz, 0.22 sec. of silence at the beginning of the clip, and a small delay (phase-shift) introduced by the filtering.
The Estimate: The above diagram shows how the Rover sounds were estimated. A bit more detail is shown here.
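As a rough illustration only (this is not NASA’s actual processing, the file names are placeholders, and it ignores the small filtering delay noted in Part 1), the Rover-only estimate can be sketched as a sample-by-sample subtraction of the filtered track from the original track:

```python
import numpy as np
from scipy.io import wavfile

# Placeholder file names for the two posted clips
fs_orig, original = wavfile.read("mars_original.wav")   # with-Rover recording
fs_filt, filtered = wavfile.read("mars_filtered.wav")   # without-Rover recording
assert fs_orig == fs_filt                                # both sampled at 48 kHz

# Trim to a common length and subtract in floating point
n = min(len(original), len(filtered))
rover_estimate = original[:n].astype(float) - filtered[:n].astype(float)

# Normalize and save the estimated Rover-only sound as 16-bit audio
rover_estimate /= np.max(np.abs(rover_estimate))
wavfile.write("rover_estimate.wav", fs_orig, (rover_estimate * 32767).astype(np.int16))
```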
The results are shown below, along with the original and filtered versions.
We again note that the components of the apparent Martian atmospheric sounds (at 6-8 and 10-11 sec.) are evident in all records. We can interpret this by saying that the computed signal does not completely isolate the Rover sounds, but this is to be expected.
As in Part 1, we can perform the frequency analysis on the computed signal. This is based on a moving window: a length-2000 FFT with a one-third overlap. The results for an example segment are shown.
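For readers who want to try this kind of moving-window analysis themselves, here is a minimal sketch using SciPy. It assumes the estimated Rover signal from the sketch above is held in rover_estimate; the exact windowing details of my analysis may differ.

```python
from scipy import signal

fs = 48000                    # sampling rate, Hz
nfft = 2000                   # moving-window (FFT) length
noverlap = nfft // 3          # roughly one-third overlap between successive windows

# rover_estimate: the estimated Rover-only signal as a float array
f, t, Sxx = signal.spectrogram(rover_estimate, fs=fs,
                               nperseg=nfft, noverlap=noverlap)

# Sxx[i, j] is the power at frequency f[i] (Hz) and time t[j] (s);
# ridges that persist across time indicate steady tones
print(f.shape, t.shape, Sxx.shape)
```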
The video (2.5 min.) includes the spectral analysis, here. (You might want to try it with headphones or earbuds)
Here is a version without me talking, Link
As before, consistent high spikes in the spectra indicate the presence of periodic components (tones) that may be due to, e.g., vibrations, motor rotation, etc. These are seen at 5-7 kHz and near 13 kHz.
– Peter Burns
[For my technical friends, this may be seen as straightforward. For others, I hope you find this a useful example of the type of analysis that is behind the design and maintenance of many modern products and services.]
First sounds from Mars: Recently NASA posted the first audio recordings from the Perseverance rover. In order for us to hear the ambient Martian sound, two versions of the audio file were posted: the original recording (with the Rover’s own sounds) and a filtered version (with the Rover sounds suppressed).
Commenting on these sounds, Matt Heck suggested that, in general, attaching a microphone to engines is a useful part of monitoring and diagnosis of problems, wear, etc. He also suggested identifying conditions by frequency analysis to monitor resonant vibrations (frequencies). While this is often done in design, it is rare in normal product usage.
Our friend Fourier: A common analysis tool for signals of many kinds is Fourier analysis. Simply put, expressing signal/image content in terms of frequency components is another way to describe them. Whenever you hear someone say, ‘I don’t have the bandwidth to take on that task’, they are (usually inadvertently) giving a nod to the French mathematician, Joseph Fourier* … but I digress.
The Sound of the Machine: Since NASA posted the original (with-Rover) and filtered (without-Rover*) versions, let’s take a look at these signals. The audio signals are plotted here. The clip runs 18 sec., sampled at 48 kHz.
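If you want to inspect the clips yourself, a minimal sketch (with placeholder file names) for loading and plotting the two posted versions might look like this:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile

# Placeholder file names for the two clips posted by NASA
fs, with_rover = wavfile.read("mars_original.wav")      # original, with Rover sounds
_, without_rover = wavfile.read("mars_filtered.wav")    # filtered, Rover sounds suppressed

# Time axes in seconds (the clip runs about 18 s at 48 kHz)
t1 = np.arange(len(with_rover)) / fs
t2 = np.arange(len(without_rover)) / fs

fig, axes = plt.subplots(2, 1, sharex=True)
axes[0].plot(t1, with_rover)
axes[0].set_title("Original (with Rover)")
axes[1].plot(t2, without_rover)
axes[1].set_title("Filtered (without Rover)")
axes[1].set_xlabel("Time (s)")
plt.tight_layout()
plt.show()
```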
We note that the components of the apparent Martian atmospheric sounds (at 6-8 and 10-11 sec.) are evident in both records. Filtering signals such as these, under field conditions, is naturally challenging and it is not surprising that the atmospheric and machine sounds would not be completely isolated.
Fourier (Frequency-domain) Analysis: Consistent with Matt Heck’s suggestion, I computed the signal (frequency) spectrum for the estimated Rover sound. [Since the statistics are not stationary, a spectrum based on a moving window is commonly used. In this case, a length-2000 FFT was used with a one-third overlap.]
Shown below is the power spectral density (magnitude-squared) for a single half-second record, for both original and filtered versions. I have found the highest three components, as candidates for identification. For simplicity, the y-axes are auto-scaled, so the actual amplitude values for the two plots are different.
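A rough sketch of this step, assuming the with_rover array from the sketch above and an arbitrarily chosen half-second segment, is:

```python
import numpy as np

fs = 48000
start = int(5.0 * fs)                        # an arbitrarily chosen starting point
mono = with_rover if with_rover.ndim == 1 else with_rover.mean(axis=1)
segment = mono[start:start + fs // 2].astype(float)          # half-second record

# Power spectral density: magnitude-squared of the (windowed) FFT
spectrum = np.fft.rfft(segment * np.hanning(len(segment)))
psd = np.abs(spectrum) ** 2
freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)

# Frequencies of the three largest components, as candidates for identification
top3 = freqs[np.argsort(psd)[-3:][::-1]]
print("Top three components (Hz):", np.round(top3, 1))
```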
Variation: The results of the above analysis are subject to normal signal variation and background noise. So, to get a feel for the consistency (or underlying signature) of, e.g., the characteristic sounds that Matt Heck was referring to, I offer this short video.
The video includes the spectral analysis, here. (You might want to try it with headphones or earbuds)
Here is a version without me talking, Link
Consistent high spikes in the spectra indicate the presence of periodic components (tones) that can be tied to particular mechanisms (e.g., vibrations, motor rotation, etc.). For the original recording, we observe these at 5-7 kHz and near 13 kHz.
Other observations: 0.22 sec. of silence at the beginning, and a small delay (phase-shift) introduced by the filtering.
– Peter Burns
——
* Full name: Jean-Baptiste-Joseph Fourier
A lesson in imaging training, from TV. Amazing what you can pick up from reruns. In this case, Hawaii Five-0, 2013. Episode: Imi Loko Ka ‘Uhane (Seek Within One’s Soul)
Story: The Savannah Walker Show: Savannah takes a thrilling ride-along with Hawaii’s elite crime-fighting task force, Five-O. Police officer Kono Kalakaua is played by Grace Park. Savannah Walker is played by Aisha Tyler, with dialogue double, Wendy Pearson.
…
Scene: Hawaii Five-O Tech. Police Lab.
Kono: … so what we did was we took the video you shot and enhanced it in order to get a better look at our suspect, Wo Fat.
Savannah: When you say “enhanced,” what do you mean exactly?
Kono: Uh, well, Five-O is, um, equipped with the latest in digital forensic hardware, which allows us to improve image quality by manipulating zoom, frame rate, um, angle, uh, at the same time increasing resolution.
Savannah: Wow. Who on the Five-O team is well-versed in all this high-tech wizardry?
Kono: Uh, that would be me. (chuckles)
Savannah: How on earth did you get so tech-savvy?
Kono: Xbox.
–
Me: (thinks) um, who knew!
(script courtesy of https://transcripts.foreverdreaming.org)