Tundra Graphics
photography of the Quabbin Region & beyond


December 11, 2015

UConn Digital Production and Conservation Labs: 2015 Year in Review

My colleague, UConn Libraries' Conservator Carole Dyal, and I recently decided to put together a visual survey of some of the collaborative work we've been doing over the past year. The idea was prompted by the Libraries' upper administration, who wanted to fold such content into a larger year-end summary for the entire organization. In turn, Carole and I aimed for a quick, elevator-speech style of layout using words and inline images, which we eventually forwarded along in an email. No deep dive.

Since I'm occasionally asked what exactly it is that I do at work, I thought I would take the general idea of what Carole and I had put together and elaborate on it a bit with this post and its companion gallery. So, beginning with my Digital Production Lab, these were some of the highlights of 2015...

Two pieces of artwork from the UConn Archives' Charles Olson collection were captured for a publication of the Institute of Contemporary Art in Boston. Here's one of the objects (Cy Twombly at Black Mountain, by Fielding Dawson, 1951) as it was being photographed in the lab. (Molesworth, H., Erickson, R., Institute of Contemporary Art, Yale University Press, Hammer Museum, Wexner Center for the Arts, & Black Mountain College. (2015). Leap Before You Look: Black Mountain College, 1933-1957. pp. 280, 319.)

Additionally, work on the remaining Jerauld A. Manter photograph collection of medium format flexible negatives, which began in September 2014, was finished this past September. More than 7,000 digital images were created, documenting many aspects of the early history of the University of Connecticut, Storrs, up until 1965. Here's a view of one of the copy stand and light table setups that we employ to capture such medium format film:

Rare books, particularly ones with difficult-to-photograph foldouts, also fall within our scope. We welcome the challenging materials that often come our way, since in many instances their physical complexities would otherwise leave them un-digitized by other holders, or digitized only in a folded-up state. Here's an example (Versuch einer Naturgeschichte der Eingeweidewürmer thierischer Körper by Johann August Ephraim Goeze, Leipzig, 1787) that we received this year for reformatting. In this volume, the foldouts were made up of separate lightweight leaves that were attached backwards onto the bound pages, then folded back into the book.

Here, student photographer Josh O'Brien builds a custom support that allows one of the foldout pages to be flattened for accurate photography without making a hard, possibly damaging crease where the foldout attaches to the bound page...

Ready for shooting... and eventual deposit into our Fedora repository.

Here's an example of a large format map foldout from United States Geological Survey Monograph 52 (Plate 8, Mesabi District, Minnesota, 1911) that we were asked to shoot by the USGS. Mercifully, the map was not bound inside the monograph but was instead, along with a number of other folded maps, tucked into a pocket affixed to the inside back board.

All told, more than 130,000 images have been created so far this year from a wide variety of original formats and collections using the lab's various imaging equipment and staff expertise. This is an aerial view of half of the lab's floor (the opposite side of the room is a near mirror image of similar gear and workstations). It's truly rewarding both to work with such unique original materials and to teach advanced digital imaging techniques and asset management skills to a bright and focused staff:

In Carole's shop, the archives and special collections digitization workflow has been completely integrated into Conservation. So far this year, her team has reviewed more than 2,700 folders and treated approximately 500 individual documents.

Together, Carole and I strategize almost daily on format feasibility assessments and on object treatments and preparations in order to optimize the overall digital production workflow. It's a fruitful, ongoing collaboration, one that is essential to conducting reformatting efforts both accurately and at scale.

December 10, 2015

Hitting The Trail With Timber

Timber turns one year old this coming weekend. As he matures, we've been taking him out on longer outings and hikes. This past weekend, he bagged nearby Mt. Watatic with no problem.

Howling Dog Alaska's Distance Harness, Skijoring Line and Trekking Belt combo package has been a new addition to our collective kit. Using this has been great for masters and canine alike.

The combo is particularly good at distributing the often sudden pulling force that Huskies can exert when they really get into the flow. Handling them with a regular collar and leash on the trail can wear out your back and shoulders after a while; with this package, we don't feel so beaten up after a long hike. And the harness is easier on the dog as well.

Overall, it's been a blast to further meld into the environment with our boy...

Then again, kicking back in front of the wood stove isn't too bad either.

September 22, 2015

The End of Things

Here's a prepress version of the photo essay, The End of Things, that was recently published across the pond in The 88 Journal. A special shout-out to crack editorial director Anna-Marie Crowhurst for helping to whip this all into final visual shape...

Print editions of the entire journal are now available here in the States through Barnes and Noble.

September 2, 2015

Excellent Adventures in Email Archiving, Part 2: MBOX to Thunderbird

Welcome to Thunderbird!

In Part 1 of this entry, I recounted a tale of personal email archiving that took us from Microsoft's PST format to the open MBOX. Here, we'll take a look at how MBOX can play nice with open-source, cross-platform freeware such as Mozilla Thunderbird.

Thunderbird can be downloaded and run on Linux, Windows, and OS X systems. In my particular test, I simply wanted to see whether I could take variously-sourced PST email archives, convert them to the open MBOX, and then have those archives work on both a Mac and a Windows system.

Once Thunderbird is installed, a default email account needs to be set up. Thunderbird supports both POP and IMAP connections to active email servers and is relatively easy to configure. In fact, the program can be used much like Outlook was employed in Part 1 to harvest data straight from email servers for archiving purposes. For one of my live personal accounts, I have actually configured Thunderbird (using IMAP) to work dynamically with my service provider in this way. In the illustration that follows, however, it is assumed that we are mainly trying to archive static email data from closed accounts, data that has already been harvested by Outlook into PST. In a business and enterprise world still dominated by Outlook, it is not uncommon to only have the ability to create a PST archive of one's old work emails when switching jobs. That is, if you even have access to this data at all.

Thunderbird, much like its close Mozilla cousin, Firefox, supports downloadable Add-ons that broaden the program's functionality. In order to both import and export MBOX format in Thunderbird, the quite handy ImportExportTools Add-on should be downloaded and installed.

Next, the stage needs to be set for MBOX import. This is accomplished, account by account, within Thunderbird's local folders. In this example, I'm working with an old Yahoo Mail archive that went through a similar PST > MBOX conversion process as the UConn archive described in Part 1...

From here, converted MBOX files may be imported into the newly-created local folder archive with a right-click entry into the ImportExportTools' context menu and selections made from the following dialog boxes...

And finally, there we have an imported MBOX archive living inside a Thunderbird local folder...

One of the nice things about the way Thunderbird handles MBOX is that, unlike imported PST files in Outlook for Mac, imported MBOX files can subsequently be exported back out of the program.

As a result, Thunderbird doesn't act as a black box. Since the program itself can be installed on multiple platforms, it can function more like an email archive and/or data conduit tool where MBOX files can flow into and out of the program from one OS to another as needed.
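And because MBOX is essentially plain text, with messages concatenated one after another, you aren't even limited to email clients for access. Here's a minimal sketch using Python's standard-library mailbox module; the file name is a stand-in for wherever your own converted archive landed:

```python
import mailbox

# Hypothetical path -- point this at any MBOX file exported from Thunderbird
archive = mailbox.mbox("yahoo-archive.mbox")

print(f"{len(archive)} messages in archive")
for message in archive:
    # Each item behaves like an email.message.Message object
    print(message["Date"], "|", message["From"], "|", message["Subject"])
```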

A few general notes on MBOX and email archiving in closing. While we have seen how MBOX can store entire email folders, it also stores email attachments in their original MIME format. With regard to such attachments, Chris Prom notes in his excellent 2011 DPC Technology Watch Report, Preserving Email, that "action will likely need to be taken to migrate them, if they are to remain accessible in the future." As a result, MBOX is viewed by some within digital archiving circles as a bit of a half measure toward truly robust preservation.

Possible solutions to this issue include the use of XML. One such example is the implementation of the Email Account Schema by such organizations as the Smithsonian Institution Archives. As Prom notes on this score, "Attachments can either be encoded in the xml file itself or written in their original binary formats to externally referenced locations. The latter feature is particularly useful because the preservation of the attachments may require additional effort, including monitoring for format obsolescence and the development of future migration actions."
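For what it's worth, the externally-referenced-attachments idea that Prom describes can be approximated directly against an MBOX file. Below is a hedged sketch, again using only Python's standard library and made-up file names, that walks each message's MIME parts and writes anything carrying a filename out as a separate binary file that can then be monitored for format obsolescence:

```python
import mailbox
import os

SRC = "yahoo-archive.mbox"   # hypothetical source archive
DEST = "attachments"         # extracted files land here for format monitoring

os.makedirs(DEST, exist_ok=True)

for i, message in enumerate(mailbox.mbox(SRC)):
    for part in message.walk():
        filename = part.get_filename()
        if not filename:
            continue  # parts without filenames are usually message bodies
        payload = part.get_payload(decode=True)  # decodes base64/quoted-printable
        if payload:
            # Prefix with the message index to avoid name collisions
            safe_name = f"{i:05d}_{os.path.basename(filename)}"
            with open(os.path.join(DEST, safe_name), "wb") as f:
                f.write(payload)
```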

Meanwhile, MBOX-related preservation tools continue to be developed and funded. Of recent note, Stanford University Libraries' ePADD software package was awarded a $685,000 National Leadership Grant by the Institute of Museum and Library Services (IMLS) this past summer.

August 26, 2015

Excellent Adventures in Email Archiving, Part 1: PST to MBOX

You've Got (No) Mail

At UConn, we've recently migrated our email from a local Outlook server to Microsoft's Outlook 365. For some time now, I have been meaning to organize my various work and personal email archives into a uniform, portable archival standard that could be interoperable between Windows and Mac OS clients (and Unix too?). Additionally, I wanted to avoid a webmail, cloud-stored solution for this data. On principle, and simply to keep my overall data management chops sharp, I like to closely control the storage of my written and photographic output and not rely on the cloud for much. For this particular email archiving project, the timing seemed right to finally get on with it.

Beyond its use as a daily email client, MS Outlook is a solid tool that can also be easily employed to harvest email account data from a variety of host servers through either the IMAP or POP3 protocol. In fact, it was MS Outlook running on an old PC at home that I initially used to gather my current UConn, previous employer, and two additional personal email accounts under one roof. The resulting data was stored, by separate email account, in discrete .pst data files in MS Outlook on my PC. So far, so good, or so I thought.
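Incidentally, this kind of IMAP harvesting doesn't strictly require Outlook at all. As a hedged sketch, with placeholder host and credentials, Python's standard imaplib and mailbox modules can pull a folder from a live account straight into MBOX, skipping PST entirely:

```python
import imaplib
import mailbox

# Placeholder server details -- substitute your own provider's values
HOST, USER, PASSWORD = "imap.example.com", "user@example.com", "app-password"

imap = imaplib.IMAP4_SSL(HOST)
imap.login(USER, PASSWORD)
imap.select("INBOX", readonly=True)  # readonly so nothing gets marked as read

archive = mailbox.mbox("inbox-archive.mbox")
_, data = imap.search(None, "ALL")
for num in data[0].split():
    _, msg_data = imap.fetch(num, "(RFC822)")
    archive.add(msg_data[0][1])  # raw RFC 822 bytes append straight to the mbox
archive.flush()
imap.logout()
```

That said, for accounts you can only reach through an employer's Exchange setup, Outlook's PST harvest may still be the only door in, which is the scenario this post assumes.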

On my four-year-old MacBook Pro, I had a copy of Outlook for Mac. I already knew that I could export all of my .pst archive files from the MS Outlook client, either by manually copying them from their stored locations to a large thumb drive or by using Outlook's Export wizard:

The question remained: could Outlook for Mac import and render .pst files from a Windows system? After sneaker-netting a single test .pst to the Mac's desktop with the thumb drive, I tried to import it through Outlook for Mac. Sure enough, things started well...

...continuing on through subsequent Outlook for Mac import wizard steps. And eventually there it was, a Windows-sourced .pst appearing as a real email archive in Outlook for Mac.

This was heady stuff. However, things quickly turned sour. The .pst format wasn't something that one could ping-pong back and forth between OS's as I had hoped. Counter-intuitively, Outlook for Mac doesn't support reciprocal .pst export. PST files are a one-way street from the Windows to the Mac version of the program.

This left me searching for another email format and another software strategy that could possibly be more generic and open-source. The MBOX file format was something that I vaguely remembered reading about on a listserv. So, I decided to take a look at the U.S. National Archives and Records Administration's Format Guidance for the Transfer of Permanent Electronic Records List for Email. And there was MBOX, right beneath PST in NARA's list of Preferred Formats for Aggregations of Email:

From http://www.archives.gov/records-mgmt/policy/transfer-guidance-tables.html#email

This was good. A valid alternative to try. I then thought for a while: what email client software would be able to import, read, and export MBOX while running on both Mac OS and Windows? Thunderbird, a Mozilla.org project, was something that I had heard good things about in the past. According to the literature, it could run on both operating systems and support MBOX import and export (through an add-on). A problem remained, though.

Unfortunately, MS Outlook doesn't directly export to MBOX. However, Outlook for Mac can! It's just not an inherently automated process. Then again, my folder structures weren't hideously nested or complex; I just had a lot of them. So, I went back to my Windows .pst > Mac Outlook sandbox. Luckily, I found that the MBOX conversion process isn't difficult. It is just repetitive, and it can't be done in groupings higher than the individual email folder level. Still, all one has to do is drag an existing Outlook for Mac email folder into a coherently-named destination folder on the Mac, and the mail folder gets converted to MBOX on the fly. Nice! Here's a split-screen of that process and its result...
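One nice property of the resulting files: in classic mbox, every stored message begins with a "From " separator line, so each converted folder can be sanity-checked with a rough message count. A tiny sketch in Python (rough, because some converters escape body lines that happen to start with "From " and others don't; the file name is a stand-in):

```python
def rough_message_count(path):
    # Count "From " separator lines -- an approximate message total
    with open(path, "rb") as f:
        return sum(1 for line in f if line.startswith(b"From "))

print(rough_message_count("Work Inbox.mbox"))  # file name is a stand-in
```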

Next I downloaded and installed a copy of Thunderbird on both the Mac and Windows machines.  If you are still with me, you can read the next chapter of this tale in Part 2...

July 29, 2015

Rapid Capture, Reflecting on Interpretations

Recently, I had the great fortune to be invited to the annual Cultural Heritage Imaging Professionals Conference held at Stanford University. The assembled group is small by design, made up of digital imaging studio managers from libraries, archives, and museums who are provided with a focused forum during the three-day event "to share ideas, best practices, techniques and stories."

Cultural Heritage Imaging Professionals Conference

Part of my invitation included a request to give a short talk on any subject I wished. The only stipulation was that the discourse should provoke thought and stimulate later discussions at the event.

I had originally considered doing a condensed version of a recent presentation that I made on 3D modeling at The Getty Center. But to keep things fresh I decided to focus instead on some recent thoughts of mine with regard to where we are as a field on the notion of "rapid capture."

While there is some debate about when exactly the phrase first appeared in the lexicon, we can best trace rapid capture's most recent adoption within cultural heritage institutions to Ricky Erway's and Jennifer Schaffner's 2007 paper, Shifting Gears: Gearing Up to Get Into the Flow. In it, the authors enthusiastically outline a progressive vision of digital reformatting through a proposed set of competing dichotomies, including Access vs. Preservation – Access Wins! and Quality vs. Quantity – Quantity Wins! The means to these ends include the notion that lowering image resolution at capture (i.e., lesser quality) directly allows for faster throughput (i.e., higher quantity). Among the stated goals of such accelerated throughput are making digital capture an embedded part of initial archival processing and having special collections digitization emulate Google's and the Internet Archive's large-scale reformatting initiatives. According to the authors, "All these measures will help us to begin to keep pace with mass digitization of books." From the time of its publication through to today, the paper has had a strong influence on the higher-level thinking of libraries, archives, and museums.

Yet mass-digitized books and special collections objects can be two very different things, a truth that Erway accurately acknowledges in her 2011 follow-up, Rapid Capture: Faster Throughput in Digitization of Special Collections. There she mentions that "collections vary, and it is important to recognize that comparing throughput rates… [we] must take other factors into account. Different materials require different approaches..." In fact, one of the generally agreed-upon themes from this year's Stanford gathering was that the handling of fragile and mostly heterogeneous archival objects is the major bottleneck in subsequent digital capture.

I bring up these two papers in particular for a few reasons. Mainly, they serve as a healthy example of the gradual refinement, through implementation and fair-minded assessment, of one's sometimes raw "out with the past, in with the new" ideas. In this case, 2007's Shifting Gears... presents the radical, yet untested, initial outline of a proposed future, while 2011's Rapid Capture... presents a series of trials and a subsequent analysis of the results. Erway describes the methodology of the latter: "So in an extremely casual survey, we asked some of our colleagues in libraries, archives, and museums to identify initiatives where non-book digitization was being done 'at scale.' We didn't define 'at scale,' because we thought we'd know it when we saw it. It wasn't always so easy." The paper goes on to more closely examine a number of case studies and some of the unexpected hurdles that were met. In her conclusions, Erway sounds a more sober note on just how far the possibilities of monolithic, rapid capture assumptions can logistically reach when applied to the often heterogeneous formats and fragile nature of archival objects.

Though by 2011 Erway's own initial vision had obviously been recalibrated by the natural iterations of reasoned experimentation and an evidence-based feedback loop, it is interesting to think about what "rapid capture" still means today. In many ways, the early (at that point untested) assumptions of the 2007 paper still hold a powerful influence. For instance, the relationship between speed (rapidity) and quality is still viewed by many as an inflexible inverse relationship. Yet this assumption is in many ways a carryover from an earlier time, when most digital capture was done with the tri-linear sensor technology of still-camera scan back systems or generic flatbed scanners. In order to get such technology to run faster, you needed to lower its sampling rate setting. Hence the idea that high speed = low resolution, or intimations of "low quality," came into being.

Epson Expression 10000 XL Flatbed Scanner
Tri-linear Sensor Array (from The Focal Encyclopedia of Photography. 4th ed. 2007)

However, the technological reality of digital imaging drastically changed in 2008 when Canon released its affordable, 21.1-megapixel EOS 5D Mark II camera body with its area array sensor.

21.1 MP Canon 5D Mark II

Area Array Sensor with Bayer Filter (from The Focal Encyclopedia of Photography. 4th ed. 2007)

It was at that moment that flat objects up to 11"x17" could be captured at a true 300 ppi with the instantaneous release of a shutter, rather than the slow broom sweep of a scanner's tri-linear sensor array. Despite such advances in speed and quality, rapid capture can still carry with it vague connotations of quality being necessarily sacrificed for the sake of speed. It is as if the assumptions of the 2007 paper were taken as truth, and the realities of the 2011 follow-up were never read beyond "Rapid Capture" appearing in the title, with the term henceforth becoming part of the greater lexicon. Yet as imaging technologies continue to advance and become even more accessible, and as technical imaging expertise grows more focused, high quality is routinely being attained at scale through skilled workflow engineering and management.
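The arithmetic behind that 11"x17" claim is worth spelling out. At 300 ppi, such an original needs 3,300 x 5,100 pixels, or roughly 16.8 megapixels, which fits comfortably within the 5D Mark II's 5,616 x 3,744 (21.0 MP effective) area array:

```python
# Pixels required to capture an 11" x 17" original at a true 300 ppi
required = (11 * 300) * (17 * 300)   # 3,300 x 5,100 = 16,830,000 px

# Canon 5D Mark II area array: 5,616 x 3,744 effective pixels
sensor = 5616 * 3744                 # 21,026,304 px

print(f"required: {required / 1e6:.1f} MP, sensor: {sensor / 1e6:.1f} MP")
print(f"delivered ppi: {5616 / 17:.0f} (long side), {3744 / 11:.0f} (short side)")
```

In other words, a single instantaneous exposure delivers roughly 330 ppi across such an original, with headroom to spare, and at no cost in speed.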

Today, the lingering old assumptions can be problematic for a few reasons. Along with advances in digital capture technology have come 4K, 8K, and "Retina" displays, and in turn the heightened viewing expectations of digital image users. It is becoming more and more apparent that on such super- and ultra-high-definition devices, low quality images look progressively worse than they do in today's standard definition. Would it be a stretch, then, to envision a time in the near future when 4K, for example, will be the new "standard"? Can we not imagine technology driving us toward an almost regular re-evaluation of the importance of image "quality," whether we want to engage in ongoing self-evaluation that deep or not? Our users' interest, and hence our own relevance, may inevitably hinge on constantly upping our game.

It is also interesting to note that beyond simple appearance, images, and particularly large image aggregations, are being used more and more in big data research. This is a topic that I have previously touched upon in citing Peter Leonard's work at the University of Chicago. High quality images created to a consistent standard can be leveraged as a rich and useful data resource for tomorrow's researchers to plumb for new data points and to approach new lines of inquiry that may go well beyond such traditional activities as text mining.

Meanwhile, the question remains... will yesterday's more coarsely-standardized archival images frustratingly limit expected use over time?  A world of big noise, rather than big data?

In the final analysis, this is not a plea to return to the time when boutique imaging methods were indiscriminately applied to all material formats. Indeed, aspects of rapid capture's original vision remain a healthy aspiration for getting our collective efforts up to an operational level of scale. It is just that the black-or-white nature of those original dichotomies feels a bit out of focus in hindsight, particularly in light of the latest technologies and their applications.

At UConn, our fiscal year recently closed, and with it another 12-month tracking cycle for our lab. All told, my photographers shot 100G new image captures that I can confidently describe as attaining FADGI 3-4 star quality (4 being the highest level of the scale). Is this an example of rapid capture? I don't know. All I know is that it's not simply the raw totals that interest me; it's the ongoing challenge of maintaining an accompanying high level of image quality that makes it fun to come to work every day.

June 28, 2015

New Member of the Pack

Over the weekend, Tara and I picked up our newest pack member, Timber. He's a gray and white Siberian Husky from the folks at Northern Lights Kennel, Northfield, MA.

So far, he's been fitting right into our home. Here he is going in super fast for a face lick of the photographer who is trying to get some low angle shots...

Otherwise, all natural dog bones from the Hardwick Farmers CO-OP have been a hit...

But after a while, they are so exhausting!

...as are my masters when they start to pull out their smart devices and try to look busy.

June 18, 2015

Unexpected Guest Tests Adobe Lightroom's New Dehaze Feature

Adobe just released its latest Camera Raw, v9.1. As Camera Raw and Lightroom share the same raw processing engine, the new feature set is common to both applications. One of the latest tools is a Dehaze feature that can be found in LR's Develop module under "Effects."

Recently, I experienced a unique test case of how the tool may be able to somewhat rescue images shot under less-than-ideal conditions. This morning, I awoke to what sounded like a burglar at the back of our house. Here's what I quickly shot through our deck's double-paned glass and screen door slider:

It was thrilling to be standing there with just a couple of layers of glass and a Canon 5DIII between man and young bear. American Black Bear, that is. Here you can see him extending his tongue to get at the black oil sunflower seeds in a bird feeder that we forgot to bring in last night (we're normally fastidious about this).

The following before and afters of an earlier shot in the series give you a hint of what the Dehaze tool, in combination with some basic tonal adjustments, can accomplish. I didn't spend too much time on this, yet I was still able to get somewhere with the original raw image data:

Not bad for a seriously optically compromised image.

Oh, and here's some of the small wreckage that remained after the bear straddled the deck's railing, climbed 10 feet back down to the ground, and disappeared into the woods again. Note the drool (bear's, not mine) on the inside of the feeder:

I can still hear the crunching noise that his teeth made while putting them through that plastic. I'm glad it wasn't my arm.

© Michael J. Bennett