Some friends and I are currently knee-deep in pre-production for an upcoming web series we are producing called “Welcome to the World,” and in preparation we’ve done some camera tests so we know what to expect from our equipment. More importantly, though, we were interested in how Vimeo would handle the compression of our footage. Since we’re relying on web delivery, we needed to know how that compression would affect our image. I shot a project last summer that, when I uploaded it to Facebook, turned out significantly darker than I intended. Since it was a dark film anyway, it was disappointing to see how much detail I lost in the shadows. This time we’re trying to avoid that problem by evaluating what results we’ll get at the end of the workflow, so we can light and shoot appropriately at the beginning of it. Here is the full camera test, embedded from Vimeo:

<p><a href="http://vimeo.com/19601622">Vimeo Compression Test: RED One vs. Panasonic AF100</a> from <a href="http://vimeo.com/helenabowen">Helena Bowen</a> on <a href="http://vimeo.com">Vimeo</a>.</p>

We lit our test scene with a wide variety of exposures, ranging from an f/0.7-1 split to an f/8-11 split. We included skin tones, blacks, a bit of color, and tried to get a gradient on a projector screen in the background. Here’s a screenshot of our frame with the f-stop readings:

We shot two cameras side by side: the RED One and the new Panasonic AF100. Again, the primary purpose of this test was to investigate what Vimeo would do during compression, but we also got an interesting look at how this new $5,000 camera compares to a relatively well-established $30,000 camera. Not surprisingly, the RED came out with significantly better footage than the AF100. But I’ll talk about those results a little more later. For now, more on how we shot the test.

We set up the RED with a Cooke S4 35mm prime lens and the AF100 with a Zeiss SuperSpeed 25mm. Thanks to the AF100’s smaller sensor size and resulting crop factor, the two fields of view were roughly equivalent. In retrospect, we really should have put another SuperSpeed on the RED, but this was an exposure test, not a sharpness test.
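(For a rough sense of the math, with sensor figures that are approximations on my part: the AF100’s Micro Four Thirds sensor carries roughly a 1.3-1.4x crop relative to the RED’s approximately Super 35-sized sensor, so 25mm x 1.4 works out to about 35mm, which is why the framing matched.)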

I captured all the footage in Final Cut Pro using the Log and Transfer tool, not RedCine-X or any other specialty application. It was edited into a ProRes 422 1080p24 timeline. At the end, I exported using QuickTime Conversion inside Final Cut to a best-quality H.264 720p file for Vimeo upload. I also tried the MPEG-4 codec, but it caused more artifacting after Vimeo compression and darkened our image significantly. H.264 was more faithful to the original footage, as you can see in the two screenshots below. The first image is the ProRes-transcoded camera original and the second is the exported H.264 720p version before upload to Vimeo.
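If you’d rather script that last export step instead of going through QuickTime Conversion, here’s a rough sketch of an equivalent ffmpeg transcode, driven from Python. The file names, CRF value, and audio settings are my own assumptions for illustration, not the exact parameters Final Cut uses under the hood:

    # Hypothetical alternative to Final Cut's QuickTime Conversion export:
    # transcode the ProRes 422 master to an H.264 720p file for Vimeo upload.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "camera_test_prores_1080p24.mov",  # ProRes 422 timeline export (hypothetical name)
        "-vf", "scale=-2:720",                   # downscale to 720p, preserving aspect ratio
        "-c:v", "libx264",                       # H.264 video
        "-preset", "slow",                       # favor quality over encode speed
        "-crf", "18",                            # near-transparent quality before Vimeo re-compresses
        "-pix_fmt", "yuv420p",                   # 4:2:0 chroma, what web players expect
        "-c:a", "aac",
        "-b:a", "320k",
        "camera_test_h264_720p.mp4",
    ], check=True)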

The RED footage was RAW and ungraded, with no LUT applied; it retains its native 5600K white balance and is rated at ISO 320. The AF100 was shot at ISO 320 with a 3200K white balance. We tested various gamma settings but primarily used HD Norm. There are also clips shot with Cine-Like D and Cine-Like V, but we didn’t like those as much. HD Norm gave us the most usable footage straight out of the camera and the best compromise between exposure and contrast. Cine-Like D offered a wider dynamic range, but it looked muddy and exacerbated some of the noise we found in the underexposed parts of the frame; it also necessitates a certain amount of post-processing, something we’re looking to minimize on our web series. Cine-Like V, on the other hand, crushed just about everything below 2 stops underexposed. That yielded the “cleanest”-looking footage, since all the noisy areas were crushed to black, but without lighting the scene specifically for that gamma setting, those shots became almost unusable. The RED, by comparison, gave us very clean, noise-free footage that held up quite well even at 4 stops underexposed. We were somewhat disappointed with the overexposed areas on both cameras, with each starting to clip at about 3-4 stops over. Keep in mind that the RED shoots RAW, so some amount of highlight detail ought to be salvageable in RedCine-X (keyword: ought).

We shot the test at several different stops on the lens to get an idea of how each camera would perform at different levels of over- and underexposure, and to see how Vimeo reacted to those results. In general, we started seeing artifacts and pixelation/blockiness in areas underexposed by 2.5 stops or more. You’ll notice it most if you look closely at the lower right-hand corner of the frame: the dark wood holds up relatively well, but the off-white walls show serious levels of noise and artifacting. The projector screen on the left side of the frame is also susceptible to a certain amount of banding from the gradient-like lighting, which Vimeo can’t seem to handle, though I admit it was better than I expected.

The last couple of clips in the video are quick attempts to correct some of the footage shot at different exposures. Please note that I am NOT a professional colorist, and I personally prefer the color-correction tools in Avid to the 3-Way Color Corrector in Final Cut, so I apologize for my rough attempt at grading these images. As always, though, exposure is key. When we pushed the underexposed RED footage, it held up quite well and stayed relatively noise-free throughout. Even pushing the AF100 one stop, however, introduced unacceptable levels of noise that show how the AVCHD codec can really fall apart.

Overall, this was a really interesting learning experience. We found that a $30,000 camera is, as expected, better than a $5,000 camera, though I admit I was pretty disappointed by the AF100 after all the hype it has been receiving. I’d be interested in repeating these tests not only with the new Sony PMW-F3 when it comes out, but also with the Sony EX3, an industry-standard camera that is bested by the AF100 in sensor size but comes in at a slightly higher cost. We also hope to repeat these tests sometime soon with the Alexa and the Canon 7D, and I will certainly post those results with a full analysis and comparison to what we found here. With all this new information in mind, we now know we need to watch out for areas of underexposure, especially on white or off-white walls. We also know that H.264 encoding holds up reasonably well, especially compared to MPEG-4, and that we can expect pretty consistent exposure from camera original to web-delivered final product.

If you have any questions about our process here or have any suggestions for future tests, please comment below! Thanks to Trevor and Helena, my partners in crime on the web series, who helped with this test!