
Little Frog in High Def

Musings of an NLE ronin…


Archive for May, 2012

It started with this thread on the Adobe forums.  And then Mike Nichols (@TheEditDoctor) did this video showing the issue he was having with his Blackmagic card and CS6…and how it wouldn’t act like it does in FCP.

Basically the issue is this: in FCP, if we were using a 1080p 23.98 or 720p 23.98 sequence, and we wanted to view it on a monitor that might not be able to display 23.98…all we had to do was set our VIDEO PLAYBACK to 1080i 29.97 (for 1080) or 720p 59.94 (for 720p), and FCP would add the proper pulldown and send out a 29.97 or 59.94 signal, playing back our 23.98 sequences at 29.97.  The issue that cropped up is that with CS6 you couldn’t do that. At least not with a Blackmagic Design card.
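
For the curious, the pulldown FCP adds in that case is the standard 2:3 cadence: every four 23.98 frames get spread across five 29.97 interlaced frames (ten fields). Here’s a tiny Python sketch of the cadence…purely an illustration of the pattern, not anything FCP or the card actually runs:

```python
def pulldown_2_3(frames):
    """Map progressive frames onto interlaced field pairs per the 2:3 cadence."""
    fields = []
    for i, f in enumerate(frames):
        # alternate: 2 fields from one frame, then 3 from the next
        fields.extend([f] * (2 if i % 2 == 0 else 3))
    # pair up consecutive fields into interlaced video frames
    return [tuple(fields[i:i + 2]) for i in range(0, len(fields) - 1, 2)]

# Four film frames become five video frames:
print(pulldown_2_3(["A", "B", "C", "D"]))
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
```

That’s the classic AA BB BC CD DD field pattern: two of the five video frames mix fields from two different source frames, which is why pulldown material looks slightly odd when paused.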

I wanted to verify this for myself, because I find this to be an important feature…and I just couldn’t believe that this was the case. There had to be some way to do it.  I have the AJA Kona 3 card, so I have different drivers…and it turns out that it works for me.

First I looked at the AJA Control Panel. And just like with FCP, the sequence setting and video output control what the FRAME BUFFER is. This was set to 1080sf 23.98. When I changed the secondary to 1080i 29.97 and switched the output to match…my monitor freaked out. OK, so that wouldn’t work. So then I looked into the settings in PPro CS6.

I went into the PREFERENCES and looked at PLAYBACK. Note that my video device is the AJA Kona 3. I noticed a SETUP option. I clicked on that…and got this:

Hmmm…VIDEO FORMAT. This looks promising. So I clicked on that and VOILA! A list of options appeared.

I chose 1080i29.97, clicked OK and my monitor flickered…then registered that it was getting a 1080i 29.97 signal. I looked at the AJA Control Panel and the FRAME BUFFER did indeed register a 1080i 29.97 signal.

I mentioned this on Twitter, and Mike said that he didn’t see that with his BMD card. Is this an AJA only thing at the moment? Can no other IO devices from other makers do this?

I know what you all are asking yourselves: “What did Shane edit that NAB wrap-up video with?” Plenty of you asked on Twitter, and I know you WOULD have asked on my blog in the comments…but I offered up the information before you could.  Because I knew you’d ask.

So yes, I used Adobe Premiere CS5.5 to start the edit…mainly the inputting and organization part…and pulling the selects I wanted to use. Because that’s what I had on my laptop, and that’s what was on the work machine I used while I had some downtime.  But then I did the bulk of the editing using Adobe CS6.something-that-is-in-beta.  So that I could do my part in testing and bug hunting, and so that I could dip my toes in the app and see how it works.

I will say that this project was PERFECT for Adobe Premiere.  I shot the NAB video with my Canon T2i (550D for all you Europeans) and a GoPro Hero 2, and I edited the footage from both cameras natively…without transcoding. This was…OK on my laptop and the work computer, but only OK. Because neither had a graphics card to enable CUDA and let the Freddie Mercury Engine loose on my footage. So scanning the footage was stuttery, and so was playback.  But once I opened the project on my eight-core 3.0 GHz Mac Pro with 12GB of RAM and an NVIDIA GTX 285 card…it was butter. And it was really cool to have external monitoring AND the Mercury Playback Engine working at the same time.

The opening sequence and the shot of me drinking the fantastic strawberry shake from the Mad Greek in Baker, CA were done with the GoPro.  The bulk of the interviews and all of the b-roll were done with the T2i. And I used a condenser mic connected to the camera via a simple adapter. I did attempt to use the GoPro as an off-angle b-camera, but because I lacked the LCD attachment, I only guessed at the shot, and all but one (the MOTU interview…although that was bad too) were very poorly composed.  I blame the camera operator for not having his shit together. (Oh…that was me.)

The really fast stuff was shot in timelapse mode with the camera taking one picture every 60 seconds.  Then when I took the exit to Baker, took a sip from the shake, and got back onto the freeway…I shot normal speed and sped up.  Then back to timelapse for the rest of the way to Vegas.
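
If you’re wondering how fast that timelapse plays back, the math is simple: one photo every 60 seconds, shown at roughly 24 fps, and the speed-up factor is just the product of the two. A quick illustrative calculation (assuming ~24 fps playback):

```python
def timelapse_speedup(interval_s, playback_fps):
    """How many times faster than real life the timelapse plays back."""
    return interval_s * playback_fps

# One photo every 60 seconds, played back at ~24 fps:
print(timelapse_speedup(60, 24))  # 1440x real time

# So a 3-hour drive collapses to:
real_seconds = 3 * 60 * 60
print(real_seconds / timelapse_speedup(60, 24))  # 7.5 seconds of screen time
```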

Now…when I shoot timelapse with the GoPro, and want to use the footage in FCP…I need to use Quicktime Pro to import the image sequence and produce a playable QT movie. The only issue is that the frame dimensions are not standard TV dimensions, so I cannot export to ProRes at full size and have it play in real time in FCP.  I either have to squeeze it and have it be squooshed.  Squashed? Squished? Whatever… Or render it out in Animation or something big, drop it into FCP, repo it where I want it to go, and then render.  Oh, didn’t like that positioning…repo again, render.  Not so with Premiere Pro.  Even CS5.5 allowed me to import the image sequence, have it appear as a clip in its native dimensions, and repo it how I wanted…and play it back. But not smoothly.  No new media was created, so the machine staggered a little. But once I rendered it, it played smoothly.  At least that eliminated one step…the QT image sequence render part.
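
To put numbers on the “non-standard dimensions” problem: say the GoPro stills come in at 3840x2880 (a hypothetical size…the Hero 2’s actual photo dimensions may differ). Fitting that into a 1920x1080 frame leaves a lot of extra image, which is exactly why all that repositioning is possible:

```python
def fill_scale(src_w, src_h, dst_w, dst_h):
    """Smallest uniform scale factor that still covers the destination frame."""
    return max(dst_w / src_w, dst_h / src_h)

# Hypothetical GoPro still size vs. a 1080p frame:
src_w, src_h = 3840, 2880
scale = fill_scale(src_w, src_h, 1920, 1080)
print(scale)                          # 0.5 -> the still shrinks to half size
print(src_w * scale, src_h * scale)   # 1920.0 1440.0 -> 360 lines of slack to repo
```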

With the image sequence brought in, it was time to bring in the other footage.  I copied the camera masters from the backup drive to my media drive. The full card structure.  Made a folder based on the project…and in that I made a folder for the project file, one for the footage, one for the audio (the music and SFX I would be using), and then others for outputs and whatever I needed. 
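
That folder setup is simple enough to script. A minimal sketch, with a hypothetical project name standing in for the real one:

```python
from pathlib import Path

# Hypothetical project root; the subfolder names mirror the ones described above.
root = Path("NAB_Wrapup")
for sub in ["Project Files", "Footage", "Audio", "Outputs"]:
    (root / sub).mkdir(parents=True, exist_ok=True)

print(sorted(p.name for p in root.iterdir()))
```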

See, I start organizing myself right from the start.  The first thing I do is set up folders to organize the material.  That is the key to a quick edit…being organized.  So just like I did with FCP, I did the bulk of organizing on the finder level, then brought the footage into PPro.

I used the Media Browser to browse the camera masters and drag in the footage. I did them en masse, and when I did that, I noticed that PPro was “conforming” the media (I saw this in the lower right of the timeline). I wondered what it was doing, as I thought PPro dealt with the footage natively, without converting it.  So I posed the question on the Adobe forum at the Creative Cow…what’s happening here? (OK, fine, I asked that BEFORE I started on this project…) It turns out that PPro conforms the audio.  “The audio needs to be all in the same bit and sample rate in order to be mixed together. Whereas FCP would wait until you wanted to play a timeline and then render the audio needed (remember the BEEP BEEP BEEP?), Adobe Premiere (like ProTools) just conforms all audio to the same 48kHz, 32-bit file type upon import so you don’t have to wait when you want to play it back in the timeline.” At least that’s what Ryan Patch told me. I believe him.  And it did this while I was able to do other things (so I guess it is a background process…even though it causes the app to slow down).
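
The sample-rate half of that conform is easy to picture: every clip gets resampled to the project’s 48kHz so the mixer is always comparing like with like. A toy illustration of the bookkeeping (not Premiere’s actual resampler, which also interpolates the sample values):

```python
def conform_length(n_samples, src_rate, dst_rate=48000):
    """Sample count after resampling a clip to the project rate."""
    return round(n_samples * dst_rate / src_rate)

# One second of CD-rate (44.1 kHz) audio, conformed to a 48 kHz project:
print(conform_length(44100, 44100))  # 48000

# A clip already at 48 kHz passes through with the same count:
print(conform_length(48000, 48000))  # 48000
```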

After I brought in the footage, did I start editing right away? HECK NO! I organized things. 

I watched everything I had, labelled the clips (keeping the original file names, but adding a description in another field). That’s a big part of the editor’s job…watching everything and knowing what you have. I didn’t sit and watch it all play in real time.  I did scrub through the b-roll. But I took the time to label my footage, make bins by category (Blackmagic Design, MOTU, Autodesk, AJA, Avid…etc.) and organize the footage into those bins. Only after the footage was organized did I start editing. Yes, I could access the footage natively, but that doesn’t mean I started editing immediately. I wonder about all the marketing people who tout this “you can immediately start editing!” Who does this?  Well, I can see it in certain areas like news, or if you promised a wedding video to play back at the reception. But for most of the stuff…you need to watch what you have, and organize it.

The editing progressed much like I did things in FCP. Throwing clips onto the timeline in rough order…then rearranging things as the edit progressed.  Although it was really fun to be able to scrub through the thumbnails and mark an in and an out.  That was a fun and great speed advancement. (Yes…I know that FCX does this too…moving on.) 

Note…the IN and OUT points STAY PUT when I come back to the clip.  Something FCX’s version does not do.  It works well for roughly getting the points you want. For more fine-tuned editing, I resorted to the VIEWER/Program method.

I did have to drop the audio way down.  Without the ability to use the audio mixer to do this (it only works on a track level, not the clip level, so adjusting the levels down in the mixer dips the WHOLE track’s audio), I resorted to adjusting the audio on the timeline via my mouse.  The audio levels went down in big steps: 2.6dB, then 4.46dB, then 6.83dB, 10.10dB, 15.5dB, 22.8dB…then infinity. 
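
Those coarse dB steps make more sense as linear amplitude multipliers…decibels are logarithmic, so each step is a ratio, not a fixed amount. A quick conversion sketch:

```python
def db_to_gain(db):
    """Convert a decibel change into a linear amplitude multiplier."""
    return 10 ** (db / 20)

# The timeline's big steps, expressed as amplitude multipliers:
for step in [-2.6, -4.46, -6.83, -10.10, -15.5, -22.8]:
    print(f"{step:+.2f} dB -> x{db_to_gain(step):.3f}")

# Handy rule of thumb: -6 dB is about half amplitude.
print(round(db_to_gain(-6), 2))  # 0.5
```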

So what I had to do was ballpark it, and then hold down the Option key to get finer tuning of the levels.  And I couldn’t adjust more than one track at a time (if I am wrong about this, please comment and enlighten me)…so audio mixing took a bit longer than I am used to.  I did use a lower third preset that was built into the app…because motion graphics are not my forte. I’m thankful to have those.

One thing that I noted while editing this is that, well, I didn’t get all that I wanted, and that most of my stand-ups where I am alone and describe the product were…well…dull.  Flat.  A few jokes didn’t work, and on two occasions my mic wasn’t connected fully, so the audio was either not there, or dropped in and out. That’s fine, something always ends up on the cutting room floor.  (I don’t know if half of you reading this will get that reference, having only edited on computers). So I had to write some voice over to cover things that didn’t work out, and I ended up running with it. I’m glad it turned out well, because I felt, while shooting it and when I watched the dailies, that it was going to be a huge failure. Thank goodness I’m a great editor.  (Modest, too.)

When I was done, I sent it to Media Encoder and went from the shooting format right to the delivery format…no in-between codec or step.  Encoder has lots of great presets for people like me who know jack diddly about encoding…and it was really fast. Chalk that up to 64-bit, I wager.

All said and done, editing was pretty much second nature, as I come from a Final Cut Pro mentality. It also helps that Premiere has a FCP keyboard layout option.  But I was also able to do some trimming that I could only do on the Avid…so it is a bit of a hybrid.  I really enjoyed editing with it.