Roving Thoughts archives

2010-10-03

My reaction to Panty and Stocking with Garterbelt #1

It is not often that my main reaction to the first episode of an anime series is to be left boggled.

The best way I can think of to describe P&S is that it feels like Gainax decided that they wanted to make a modern American cartoon and then turn the age limit up (without turning up the maturity level, which appears to be set firmly at the Ren & Stimpy or Beavis and Butt-Head level). The result is extremely unlike almost all of the other anime I've seen, including Gainax's other work.

(In a very bad way, Gurren Lagann is the closest other anime that I can think of, in that it started out as a deliberately over the top take on Japanese giant robot anime. You could perhaps say that P&S is an even more over the top take on modern American cartoons, given a Japanese twist and a much increased age limit.)

Aroduc's summary of the first episode will give you a more detailed rundown if you want it. Do not worry about reading spoilers; P&S is not the sort of experience that can really be spoiled.

I don't know if I like it, because the question feels inapplicable. This is not a show that you like, this is a show that you watch in horrified fascination because you can't turn away. (Or that you flee from with all due speed. It is possible that this is the most sensible reaction to P&S.)

I fall into the 'boggled and unable to look away' camp, so I will be watching the next episode. Of course I have no idea what it will be like; P&S is already so bizarre that it would not surprise me in the least if Gainax changed things majorly every episode (or every few episodes or whatever), depending on how thoroughly they want to run the current joke into the ground.

anime/PantyNStocking written at 16:20:10

2010-07-02

My quick reaction to Ookami-san to Shichinin no Nakamatachi episode 1

Rather than just email my comments to Author in response to his entry, I'll be Author-like and post something here for once.

The first fansub of Ookami+7's first episode has come out, and I've watched it now (partly because of Author's entry). Unlike some, I actually like the narration; without its injection of snarky commentary and the show's gleefully casual exploitation of Western fairy tales for our amusement, this would be a pretty ordinary romantic comedy anime with a vaguely unusual hook. As it is the show's atmosphere is deliberately over the top, making me interested enough to watch at least an episode or two more.

(How over the top? Well, the main part of the first episode is an extended Cinderella parody, complete with a bicycle-drawn pumpkin carriage. Said carriage is nowhere near the most amusing and absurd part of the parody.)

If you want your romantic comedies without sly asides to the audience, this is not your show and Aroduc's criticisms are completely on target. Otherwise, and assuming that you don't want something too deep, the start is promising; but as always, first episodes can be terribly misleading, and we don't know if the show can keep this up for more than a few episodes without the formula getting stale. Parodying a fairy tale an episode this way could get repetitive really fast.

anime/Ookami7Reaction written at 00:49:24

2010-06-15

My current photo processing workflow (as of June 2010)

In Thom Hogan's June 14th update (now here), he wrote:

Tonight's homework: document your workflow. Really. Write it down. Include everything that happens from pressing the shutter release to looking at the final image (wherever it may have ended up, e.g. on a wall, on Facebook, etc.).

I have some spare time today for once, since I already wrote today's techblog entry, so I feel like tackling this one just because.

To start with, a note: my workflow is strongly influenced by two things. First, I'm a Linux user, which means limited choices for software and tools (and a bunch of scripting, because I'm comfortable with that). Second, it's strongly oriented around my Project 365 work, with an inevitable time-based focus on how I organize and approach things.

So:

  • the camera puts the picture in the default Nikon directory structure on my 4GB SD card. I have my camera set to the defaults, where it just numbers images sequentially and only resets the numbering when it rolls over every 10,000 images.

    A bit of negative workflow: I've learned the hard way that I can't tell a good picture from a bad one just by looking at the camera LCD (and it goes both ways; good pictures have looked bad on the LCD, and bad pictures have looked great). So I almost never delete pictures in the camera; generally a picture has to be completely and obviously bad before I will.

    (The common causes are accidentally taken pictures or pictures where it is clear that the exposure is nowhere near where I want it.)

  • when I want to pull things off the camera, I use a script to rsync the entire card to my current master directory. This happens every day, generally only once.

    (Note that when I say 'the entire card', I really mean it; the master directory is an exact image of the card's directory layout.)

    I don't reformat the card when I do this. Instead, pictures stay on the card until the nominal remaining capacity drops below somewhere in the 90 to 50 images range (I typically take around 50 pictures a day, so this gives me at least a day's margin on card space). At that point I move the current master directory to my archival area, start a new one, and immediately reformat the card. Master directories are numbered sequentially as d90-pool-NN; I'm going to have to go three digits soon.

    (This is why I have to use rsync; I need something that will not re-copy already copied images.)

    My iron rule on card reformatting is that I must have run the card sync script a second time immediately before I reformat, and it must have reported nothing synced. This is designed to avoid accidentally reformatting a card with un-transferred photos.

    Yes, this does mean that I have every photograph I've ever taken (and not immediately deleted in the camera). Disk space is cheap at the low-ish rate that I photograph.

    (I am not claiming that I have a useful archive of every photo I've taken, because it's not one. But if I really want to find something, at least it hasn't been deleted, so it's possible.)

  • at the end of each day I use an exiftool-based script to copy all of the day's photographs to a temporary staging directory. Usually this happens at the same time as I'm pulling them off the camera.

    (This is also the point where I pop the camera battery out and drop it in the battery charger. Also, I clear the staging directory of the previous day's pictures before running the script. This is not scripted, because I don't script things that delete pictures.)

  • I use Bibble 5 on the staging directory in a multi-pass approach to decide on my selects and completely process them (all in the staging directory). At the end of this I have Bibble 5 write the 'developed' JPEGs to a subdirectory and I go through them with xli to make sure that I'm happy with them; if not, I process them some more until I am.

    The actual details of how I work in Bibble 5 are far too long (and variable) to go into in this entry, which is already long enough.

  • I use a script to copy all of the bits of the final selects from the staging area to my Flickr archive area (which lives inside my general photo archive area). This is broadly organized by day (and by month and year once each is finished and I archive it). By 'all of the bits' I mean the original raw file from the camera, the Bibble data file about the edits I did, and the final generated JPEG.

    (The script picks out what to copy based on what pictures have generated JPEGs.)

    If I had to use chromatic aberration correction, I use the GIMP to trim off the last few pixels on the sides of the picture if they need it, because the current version of Bibble 5 corrupts the very edge of the picture in this case. (If I have cropped an edge in, it doesn't need this, so I don't just hit every CA-processed image with an ImageMagick script or the like.)

    (In theory I could crop the image by those few pixels in Bibble 5. In practice, Bibble 5 on Linux is currently unusably slow when cropping in magnified view. So I get to use the GIMP.)

  • I upload the JPEGs to my Flickr using Flickr's basic uploader page in Firefox. After this finishes, I blank out the default filename-based titles that Flickr has given the pictures and add tags for appropriate things if they're missing.

    (Then I agonize over what to choose as my Project 365 photo, except on days when the choice is completely obvious or I only had one thing that was worth uploading to start with.)

On some days, I'm selecting images for more than just my Flickr uploads; the most common case is that I am also selecting for TBN's website. In these cases I generally repeat the last three stages for each separate reason, sometimes entirely independently and sometimes interleaved (where as I look at each image, I decide both if it's good for Flickr and if it's good for TBN).

(Note that I have two completely separate photo archive areas, one for the master directories, and one general photo archive area for all of the pictures that I've selected for various things. The second area has subdirectories for the thing, like flickr and tbn and family, and then generally date-based within each reason. If I had a higher volume of pictures, I would probably want to be more organized and consistent about my directory structures.)

As a Linux user, my strong impression is that Bibble 5 is about my only good choice for processing more than a few photographs in raw format. There are some free programs that will process individual raw format pictures, generally neither very well nor very fast, but I haven't found one that does a decent and acceptably fast job of browsing through them so I can make my selects.

(At this point I am nowhere near willing to either give up Linux or to get a second computer just to do photo processing.)

I could simplify a bunch of this workflow if I could bring myself to trust Bibble 5's 'catalog' asset management features. I would probably use multiple catalogs, with one for my master archive and then one for each reason I pick out photographs (a Flickr one, a TBN one, etc), and switch to formatting the card every time I copied the pictures off it (even though this makes me nervous; leaving the pictures on the card is vaguely reassuring just in case something disastrous happens on the computer). However, this would mean giving up the principle that nothing except my own scripts gets to go anywhere near the master archives.

(I'm a sysadmin. No, I don't trust your program.)

With a more complicated copying scheme I could change my master archives over to a date-based directory structure while still not reformatting cards immediately. I would have to rsync to a staging area, then hard-link the files into their final destinations (chosen based on their EXIF dates); anything that was already hardlinked wouldn't need to be looked at a second time, which would make it reasonably efficient.
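That hard-link filing pass could look something like the sketch below. The file_by_date helper is invented for illustration; in practice the date would come from each file's EXIF data (via something like exiftool), but here it is passed in so the sketch stays self-contained. The key efficiency point is that a file whose hard-link count is already above one has been filed and can be skipped on later passes.

```shell
#!/bin/sh
# Sketch of hard-linking staged files into date-based archive directories,
# skipping files that are already linked in. Names are assumptions.
set -eu

staging=$(mktemp -d) archive=$(mktemp -d)
echo raw-data > "$staging/DSC_0001.NEF"

file_by_date() {
    f=$1 date=$2   # in real life, the date would be read from EXIF data
    # Hard-link count above 1 means this staging file already has a link
    # in the archive, so it doesn't need to be looked at again.
    # (stat -c is GNU; stat -f %l is the BSD fallback.)
    links=$(stat -c %h "$f" 2>/dev/null || stat -f %l "$f")
    if [ "$links" -gt 1 ]; then
        echo "skip $f (already filed)"
        return 0
    fi
    mkdir -p "$archive/$date"
    ln "$f" "$archive/$date/$(basename "$f")"
    echo "filed $f under $date"
}

file_by_date "$staging/DSC_0001.NEF" 2010-06-15   # first pass: links it
file_by_date "$staging/DSC_0001.NEF" 2010-06-15   # second pass: skips it
```

Hard links only work within one filesystem, which fits here since the staging area and the archive would live on the same disk; the link-count test is what makes repeated passes cheap.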

Sidebar: looking back at the history of this

A lot of the dance around my master directories is because when I started out, I was planning to burn each master directory to DVD when it was 'done' as a backup archive; this is also why I got a 4GB SD card, because it went well with wanting roughly DVD-sized chunks of work. I never actually implemented this plan; my backups are instead just rsync'd to an external USB drive every so often.

(Don't panic, my machine has mirrored drives to start with.)

It's interesting and a bit depressing to see how pervasively this never implemented backup plan has shaped the rest of my workflow.

Back when I was using Bibble 4, my theoretical workflow was to use the staging area only to make my selects, then have Bibble 4 copy the selects to the per-day P365 archive area, re-point Bibble 4 to it, and process them there. This never entirely worked; every so often I would have to do most of the processing before deciding whether something was a select or not, and every so often I would get pulled into processing an image before pausing to copy it.

When the Bibble 5 beta came out, it forced my hand by not supporting directory based file copying (it could only copy files around inside its asset-management catalogs). If I was copying the files outside of Bibble 5 anyways, it was much easier to do all of the processing in one directory instead of theoretically splitting it across two separate ones.

photography/PhotoWorkflow written at 01:02:59

