Thursday, March 08, 2012

Windows 8 and other tech stuff

A previous Cannonfire post expressed scorn, or at least befuddlement, at the new Windows 8 "Metro" interface. Metro seems like a good idea if you like portable touchscreen machines, but that ain't me.

Nevertheless, now that Windows 8 is officially here -- and downloadable for free, at least for most of the rest of the year -- I'm thinking of making the switch.

Why? Well, according to the reviewers, the new OS does have some noteworthy improvements: faster loading, a much-improved Task Manager, easier backups, and a built-in PDF reader.

What really intrigues me, however, is a third-party app that returns the Start Menu to its proper place, allowing you to work the way you've always worked. The app, called Start8, is made by Stardock, and it's available for free. Here's a review of the app, and here is the thing-in-itself. If you're a true traditionalist, you can even give your Start icon an XP look.

(Before proceeding, let me repeat my usual warning: You know how annoying the knee-jerk "get a Mac" guys are? The "get linux" guys are almost as bad. Word to the wise.)

Now let's talk about the new iPad.

I've never owned an iPad and probably never will. But I've played with one. Not without its charms, it was.

For the life of me, I can't understand why the new version sports a super-HD 2048 x 1536 display. Was pixelization so very apparent on the old iPad? I didn't notice any problems in that regard.

If you are sitting in front of a desktop monitor right now, or even a larger laptop, you're probably looking at a 1920 x 1080 resolution image. Do you see the individual pixels?

In the world of video (as you probably know), there are two sizes which have been labeled "HD" or high definition. The smaller is 1280 x 720 and the larger is 1920 x 1080. On smaller screens, many people can't tell the difference. (There's a huge and very technical debate on this point, but this blog isn't the place for it.)

In the world of modern theatrical film-making, post production (editing, color correction, special effects and so forth) is done at what is called the "2K" resolution -- which is very close to the 1920 x 1080 image produced by consumer-level video cameras these days. That image gets projected on a screen 25, 30, 40 feet high. Looks fine.

So...is there really a need for a 2048 x 1536 pixel display on your portable tablet 'puter?
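If you want to put actual numbers on it, here's a quick back-of-the-envelope comparison -- a throwaway Python sketch of my own, nothing authoritative, using just the resolutions mentioned above:

# A throwaway comparison of the pixel counts discussed above.
resolutions = {
    "720p HD video": (1280, 720),
    "1080p HD video": (1920, 1080),
    "new iPad display": (2048, 1536),
}

baseline = 1920 * 1080  # a typical desktop/laptop monitor
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / baseline:.2f}x 1080p)")

# 720p HD video: 1280x720 = 921,600 pixels (0.44x 1080p)
# 1080p HD video: 1920x1080 = 2,073,600 pixels (1.00x 1080p)
# new iPad display: 2048x1536 = 3,145,728 pixels (1.52x 1080p)

In other words, the new tablet packs about half again as many pixels as the full-HD monitor most of us already can't see the pixels on.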

The world has gone HD crazy. I'm glad, but also a bit confused by this tech fad.

The last major motion picture filmed in 70mm was the Kenneth Branagh version of Hamlet. At the time it came out -- 1996 -- I told friends that moviemakers should film everything in 70mm, because the larger film stock allows for greater resolution and finer detail. People thought I was being ridiculous: To their eyes, Hamlet looked like any other movie.

So much has changed in sixteen years. Now, everyone understands that heaven is in the details. I'm glad they see it that way -- finally.

But the new iPad is overkill.

4 comments:

Sextus Propertius said...

Reference resolution for 2K is 2048x1536. That's 52% more pixels than 1920x1080, which is not "very close" to my mind. I think most digital "films" these days are shot in "full aperture" 4K, which is 4096x3112 (4x as many pixels as 2K).

Moving images don't require the same resolution as still images to "look" sharp, because your visual system filters the random noise in successive frames. Viewing distance plays into it, too.

To someone who is used to competently-photographed medium format or large format film images, 4K stills look like crap.

Anyone whose vision is so impaired that he/she can't tell the difference between projected 35mm and 70mm films needs to apply for disability and a seeing-eye dog ASAP.

As for the iPad, I believe one of the motivations for higher resolution is to render better text.

Oh, and the words "Windows" and "tech" do not belong in the same sentence. ;-)

Jotman said...

Basically, Apple is compensating for a self-inflicted defect with the drive towards higher resolution monitors.

Mac fonts cause me eyestrain. I say this as a longtime Windows user who recently switched over to Mac and has suffered markedly worse eyesight as a result.

Whereas Gates wanted fonts to be "readable," Jobs wanted fonts to look pretty. In technical terms, Microsoft respects the pixel grid, Apple doesn't. I like this explanation:

Apple generally believes that the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness. Microsoft generally believes that the shape of each letter should be hammered into pixel boundaries to prevent blur and improve readability, even at the cost of not being true to the typeface.

There's a helpful visual demonstration of the distinction here: http://www.codinghorror.com/blog/2007/06/font-rendering-respecting-the-pixel-grid.html

Higher pixel densities should make the blurrier Apple characters easier to read. Whereas higher pixel density probably wouldn't make a lot of difference to Windows users, it should improve the quality of life for users of Apple products.
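To make the pixel-grid point concrete, here is a toy sketch -- my own simplification, not how either company's actual rasterizer works. Imagine a one-pixel-wide vertical letter stem that the typeface places at a fractional horizontal position. Keeping the true position smears the ink across two pixel columns; snapping it to the grid gives one solid column, but the stem moves.

# Toy model of antialiasing a 1-pixel-wide vertical stem (not real font code).
def coverage(stem_left, width=1.0, pixels=4):
    """Fraction of each pixel column covered by the stem."""
    cols = []
    for x in range(pixels):
        overlap = min(x + 1, stem_left + width) - max(x, stem_left)
        cols.append(round(max(0.0, overlap), 2))
    return cols

# Apple-ish: honor the typeface's fractional position -> ink in two columns (blur).
print(coverage(1.4))         # [0.0, 0.6, 0.4, 0.0]

# Microsoft-ish: snap the stem to the pixel grid -> one crisp column (but it moved).
print(coverage(round(1.4)))  # [0.0, 1.0, 0.0, 0.0]

A higher-density display shrinks those partial-coverage pixels until the smear stops being visible, which is why a retina-class screen helps the Apple approach more than the Microsoft one.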

Joseph Cannon said...

"Reference resolution for 2K is 2048x1536. That's 52% more pixels than 1920x1080, which is not "very close" to my mind."

I was going by the famous Wikipedia chart, which has 2K at 2048 x 1080.

http://en.wikipedia.org/wiki/Display_resolution

But that, as it turns out, is not really correct.

Even so, I love the idea of an inexpensive camera that can take an image capable of being shown theatrically. This opens up a lot of room for creativity. The Canon HF10 is about the size of a soda can, and it can be snatched up on eBay for less than $200. Yet it has been used to shoot feature films.

Well, maybe not as the main camera. But...why not?

I sometimes wonder how much detail we really need. And I speak as someone who used to travel 30 miles or more by bus -- in the rain -- just to see a movie in 70mm.

When Blu-Ray came out, a friend told me about the classic films he was now going to have to re-purchase (having already bought them on laserdisc and DVD). "Why?" I asked him. "Do you want to see the grain more clearly?"

As I wrote those words to him, I happened to be watching a (standard) DVD copy of one of my favorite films, "Night of the Hunter." Even in clear daylight shots, you can see the grain clearly. And that's the way archival prints of the film looked when projected at LACMA and UCLA.

Do today's young up-n-coming filmmakers really NEED a camera that can deliver better images than the ones Stanley Cortez gave us in NOTH?

Take another look at "Barry Lyndon" and "Days of Heaven," the best-photographed films of the 1970s. Both are VERY grainy by modern standards -- even in the outdoor scenes taken in full daylight.

Yet nowadays, young filmmakers wail and whine if they can't get noise-free shots in ultra-low-light circumstances. What babies they are! Why, in MY day...

"Anyone whose vision is so impaired that he/she can't tell the difference between projected 35mm and 70mm films needs to apply for disability and a seeing-eye dog ASAP."

Audiences don't notice. Not consciously. Once, while waiting for a movie to start, I sneaked into a theater showing "Titanic." The movie was being projected NON-anamorphically. You can imagine how it looked! Yet no one noticed or complained, and there must have been 40 or 50 people in the house.

Anonymous said...

Anyone who's ever had to endure having their work test-screened knows that what an audience notices and what it can articulate having noticed are two very different things.

These days there is a push to move to higher frame rates. At 48 frames per second, movement -- especially camera movement -- would show much less motion blur than in current films. After viewing such a film, it is highly unlikely that average viewers would be able to finger the frame-rate change. They would more likely say something like, "those action scenes were really realistic."
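The blur reduction itself is easy to ballpark. With the conventional 180-degree shutter, each frame is exposed for half the frame interval, so doubling the frame rate halves the smear. A rough sketch of the arithmetic (the pan speed is a made-up figure, purely for illustration):

# Rough arithmetic only -- assumes the conventional 180-degree shutter.
def blur_pixels(pan_speed_px_per_sec, fps, shutter_angle=180):
    exposure = (shutter_angle / 360.0) / fps  # seconds the shutter stays open per frame
    return pan_speed_px_per_sec * exposure    # distance the image travels during exposure

pan_speed = 2000  # hypothetical fast pan: the image crosses ~2000 pixels per second
print(blur_pixels(pan_speed, 24))  # ~41.7 pixels of smear per frame
print(blur_pixels(pan_speed, 48))  # ~20.8 pixels -- half the blur at 48 fps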

But in terms of achieving audience immersion, the best tool remains good storytelling.

SB