2010-06-15

My current photo processing workflow (as of June 2010)

In Thom Hogan's June 14th update (now here), he wrote:

Tonight's homework: document your workflow. Really. Write it down. Include everything that happens from pressing the shutter release to looking at the final image (wherever it may have ended up, e.g. on a wall, on Facebook, etc.).

I have some spare time today for once, since I already wrote today's techblog entry, so I feel like tackling this one just because.

To start with, a note. My workflow is strongly influenced by two things. First, I'm a Linux user, which means limited choices for software and tools (and a bunch of scripting, because I'm comfortable with that). Second, it's strongly oriented around my Project 365 work, with an inevitable time-based focus on how I organize and approach things.

So:

  • the camera puts the picture in the default Nikon directory structure on my 4GB SD card. I have my camera set to the defaults, where it just numbers images sequentially and only resets the numbering when it rolls over every 10,000 images.

    A bit of negative workflow: I've learned the hard way that I can't tell a good picture from a bad one from just looking at the camera LCD (and it goes both ways; good pictures have looked bad on the LCD, and bad pictures have looked great). So I almost never delete pictures in the camera and generally it has to be completely and obviously a bad picture before I will.

    (The common causes are accidentally taken pictures or pictures where it is clear that the exposure is nowhere near where I want it.)

  • when I want to pull things off the camera, I use a script to rsync the entire card to my current master directory. This happens every day, generally only once. (A rough sketch of this script appears after this list.)

    (Note that when I say 'the entire card', I really mean it; the master directory is an exact image of the card's directory layout.)

    I don't reformat the card when I do this. Instead, pictures stay on the card until the nominal remaining capacity drops to somewhere in the 50 to 90 image range (I typically take around 50 pictures a day, so this gives me at least a day's margin of card space). At that point I move the current master directory to my archival area, start a new one, and immediately reformat the card. Master directories are numbered sequentially as d90-pool-NN; I'm going to have to go to three digits soon.

    (This is why I have to use rsync; I need something that will not re-copy already copied images.)

    My iron rule on card reformatting is that I must have run the card sync script a second time immediately before I reformat, and it must have reported nothing synced. This is designed to avoid accidentally reformatting a card with un-transferred photos.

    Yes, this does mean that I have every photograph I've ever taken (and not immediately deleted in the camera). Disk space is cheap at the low-ish rate that I photograph.

    (I am not claiming that this is a useful archive of every photo I've taken, because it's not. But if I really want to find something, at least it hasn't been deleted, so finding it is possible.)

  • at the end of each day I use an exiftool-based script to copy all of the day's photographs to a temporary staging directory (also sketched after this list). Usually this happens at the same time as I'm pulling them off the camera.

    (This is also the point where I pop the camera battery out and drop it in the battery charger. Also, I clear the staging directory of the previous day's pictures before running the script. This is not scripted, because I don't script things that delete pictures.)

  • I use Bibble 5 on the staging directory in a multi-pass approach to decide on my selects and completely process them (all in the staging directory). At the end of this I have Bibble 5 write the 'developed' JPEGs to a subdirectory and I go through them with xli to make sure that I'm happy with them; if not, I process them some more until I am.

    The actual details of how I work in Bibble 5 are far too long (and variable) to cover in this entry, which is already long enough.

  • I use a script to copy all of the bits of the final selects from the staging area to my Flickr archive area, which lives inside my general photo archive area (this script is likewise sketched below). This is broadly organized by day (and by month and year once each is finished and I archive it). By 'all of the bits' I mean the original raw file from the camera, the Bibble data file about the edits I did, and the final generated JPEG.

    (The script picks out what to copy based on what pictures have generated JPEGs.)

    If I've had to use chromatic aberration correction, I use the GIMP to trim off the last few pixels on the sides of the picture if they need it, because the current version of Bibble 5 corrupts the very edge of the picture in this case. (If I have cropped an edge in, it doesn't need this, so I don't just hit every CA-processed image with an ImageMagick script or the like.)

    (In theory I could crop the image by those few pixels in Bibble 5. In practice, Bibble 5 on Linux is currently unusably slow when cropping in magnified view. So I get to use the GIMP.)

  • I upload the JPEGs to my Flickr account using Flickr's basic uploader page in Firefox. After this finishes, I blank out the default filename-based titles that Flickr has given the pictures and add tags for appropriate things if they're missing.

    (Then I agonize over what to choose as my Project 365 photo, except on days when the choice is completely obvious or I only had one thing that was worth uploading to start with.)
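
To give a more concrete flavour of the scripted steps above, here are a few minimal sketches. None of them is my actual script; the paths and names in them are made up, and they exist only to show the shape of each step. First, the card sync and its 'nothing synced, so the card is safe to reformat' check:

    #!/usr/bin/python
    # Sketch of the card sync: rsync the whole card into the current master
    # directory and report whether anything new was copied. All paths here
    # are stand-ins, not my real ones.
    import subprocess

    CARD = "/media/sdcard/"           # where the SD card mounts (assumption)
    MASTER = "/archive/d90-pool-42/"  # current master directory (assumption)

    def sync_card():
        # -a preserves the card's directory layout; -i itemizes changes so we
        # can see what, if anything, was transferred. rsync skips files that
        # are already present in the master directory.
        res = subprocess.run(["rsync", "-a", "-i", CARD, MASTER],
                             capture_output=True, text=True, check=True)
        # Lines starting with '>f' are regular files that were copied over.
        return [l for l in res.stdout.splitlines() if l.startswith(">f")]

    if __name__ == "__main__":
        new = sync_card()
        if new:
            print("synced %d new files" % len(new))
        else:
            # Only a run that reports nothing synced makes the card a
            # candidate for reformatting.
            print("nothing synced; card looks safe to reformat")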
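
The end-of-day copy to the staging directory works off each picture's EXIF date. Again this is only a sketch with invented paths, and it is noticeably less efficient than a real script would be (it runs exiftool once per file):

    #!/usr/bin/python
    # Sketch of the end-of-day 'copy today's photographs to staging' step,
    # assuming exiftool is installed. Paths are stand-ins.
    import datetime
    import os
    import shutil
    import subprocess

    MASTER = "/archive/d90-pool-42"  # current master directory (assumption)
    STAGING = "/work/staging"        # temporary staging directory (assumption)

    def shot_date(path):
        # Ask exiftool for just the DateTimeOriginal date, as YYYY-MM-DD.
        res = subprocess.run(
            ["exiftool", "-T", "-d", "%Y-%m-%d", "-DateTimeOriginal", path],
            capture_output=True, text=True, check=True)
        return res.stdout.strip()

    def copy_todays_photos():
        today = datetime.date.today().strftime("%Y-%m-%d")
        for dirpath, dirnames, filenames in os.walk(MASTER):
            for name in filenames:
                if not name.lower().endswith((".nef", ".jpg")):
                    continue
                src = os.path.join(dirpath, name)
                if shot_date(src) == today:
                    shutil.copy2(src, os.path.join(STAGING, name))

    if __name__ == "__main__":
        copy_todays_photos()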
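
And the 'copy all of the bits of the selects' step keys off which images have a developed JPEG. In this sketch the directory layout and especially the '.bib' sidecar naming are guesses standing in for whatever Bibble 5 actually writes:

    #!/usr/bin/python
    # Sketch of copying the selects: a developed JPEG marks an image as a
    # select, and for each one we copy the raw file, the Bibble sidecar, and
    # the JPEG itself. Paths and sidecar naming are stand-ins.
    import os
    import shutil

    STAGING = "/work/staging"                   # staging directory (assumption)
    DEVELOPED = os.path.join(STAGING, "jpegs")  # developed JPEGs (assumption)
    DEST = "/archive/flickr/2010/06/15"         # per-day archive area (assumption)

    def copy_selects():
        for jpeg in os.listdir(DEVELOPED):
            base, ext = os.path.splitext(jpeg)
            if ext.lower() not in (".jpg", ".jpeg"):
                continue
            # Copy whichever raw file and sidecar variants actually exist.
            for name in (base + ".NEF", base + ".NEF.bib", base + ".bib"):
                src = os.path.join(STAGING, name)
                if os.path.exists(src):
                    shutil.copy2(src, DEST)
            shutil.copy2(os.path.join(DEVELOPED, jpeg), DEST)

    if __name__ == "__main__":
        copy_selects()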

On some days I'm selecting images for more than just my Flickr uploads; the most common case is that I am also selecting for TBN's website. In these cases I generally repeat the last three stages for each separate reason, sometimes entirely independently and sometimes interleaved (where, as I look at each image, I decide both whether it's good for Flickr and whether it's good for TBN).

(Note that I have two completely separate photo archive areas: one for the master directories, and one general photo archive area for all of the pictures that I've selected for various things. The second area has subdirectories for each purpose, like flickr and tbn and family, and is then generally date-based within each one. If I had a higher volume of pictures, I would probably want to be more organized and consistent about my directory structures.)

As a Linux user, my strong impression is that Bibble 5 is about my only good choice for processing anything more than a few photographs in raw format. There are some free programs that will process individual raw files, though generally neither very well nor very fast, but I haven't found one that does a decent and acceptably fast job of browsing through them so that I can make my selects.

(At this point I am nowhere near willing to either give up Linux or to get a second computer just to do photo processing.)

I could simplify a bunch of this workflow if I could bring myself to trust Bibble 5's 'catalog' asset management features. I would probably use multiple catalogs, with one for my master archive and then one for each reason I pick out photographs for (a Flickr one, a TBN one, etc.), and I would switch to formatting the card every time I copied the pictures off it (even though this makes me nervous; leaving the pictures on the card is vaguely reassuring, just in case something disastrous happens on the computer). However, this would mean giving up the principle that nothing except my own scripts gets to go anywhere near the master archives.

(I'm a sysadmin. No, I don't trust your program.)

With a more complicated copying scheme I could change my master archives over to a date-based directory structure while still not reformatting cards immediately. I would have to rsync to a staging area, then hard-link the files into their final destinations (chosen based on their EXIF dates); anything that was already hardlinked wouldn't need to be looked at a second time, which would make it reasonably efficient.
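
A minimal sketch of that scheme, once again with made-up paths and with exiftool extracting the date one file at a time, might look like this:

    #!/usr/bin/python
    # Sketch of 'rsync to a staging area, then hard-link into date-based
    # directories chosen from EXIF dates'. Files that already have a link in
    # their date directory are skipped, which keeps re-runs cheap. The paths
    # are stand-ins, and both areas must be on the same filesystem.
    import os
    import subprocess

    INCOMING = "/archive/incoming"  # rsync target for the card (assumption)
    BYDATE = "/archive/by-date"     # date-based master area (assumption)

    def exif_date(path):
        res = subprocess.run(
            ["exiftool", "-T", "-d", "%Y/%m/%d", "-DateTimeOriginal", path],
            capture_output=True, text=True, check=True)
        return res.stdout.strip()

    def link_by_date():
        for dirpath, dirnames, filenames in os.walk(INCOMING):
            for name in filenames:
                src = os.path.join(dirpath, name)
                destdir = os.path.join(BYDATE, exif_date(src))
                dest = os.path.join(destdir, name)
                if os.path.exists(dest):
                    # Already hard-linked on an earlier run; skip it.
                    continue
                os.makedirs(destdir, exist_ok=True)
                os.link(src, dest)

    if __name__ == "__main__":
        link_by_date()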

Sidebar: looking back at the history of this

A lot of the dance around my master directories is because when I started out, I was planning to burn each master directory to DVD when it was 'done' as a backup archive; this is also why I got a 4GB SD card, because it fit well with wanting roughly DVD-sized chunks of work. I never actually implemented this plan; my backups are instead just rsync'd to an external USB drive every so often.

(Don't panic, my machine has mirrored drives to start with.)

It's interesting and a bit depressing to see how pervasively this never-implemented backup plan has shaped the rest of my workflow.

Back when I was using Bibble 4, my theoretical workflow was to use the staging area only to make my selects, then have Bibble 4 copy the selects to the per-day P365 archive area, re-point Bibble 4 to it, and process them there. This never entirely worked; every so often I would have to do most of the processing before deciding whether something was a select or not, and every so often I would get pulled into processing an image before pausing to copy it.

When the Bibble 5 beta came out, it forced my hand by not supporting directory-based file copying (it could only copy files around inside its asset-management catalogs). If I was copying the files outside of Bibble 5 anyway, it was much easier to do all of the processing in one directory instead of theoretically splitting it across two separate ones.

photography/PhotoWorkflow written at 01:02:59
