Little Frog in High Def

Musings of an NLE ronin…


Building a Mac with PC parts…

Before I get into this build saga, I want to offer a word of warning. Unless you are a very technical person who knows how to navigate a basic PC BIOS (the motherboard's firmware), edit some configuration files, and piece together a computer from scratch, I suggest getting help. Even though I am a pretty technical person, and I have built a Hackintosh before, I still needed help, both times. A lot of it was advice on parts, but the vast majority was in installing the modified OS, and making sure that everything in the BIOS was set for the computer to work. Yes, there are step-by-step instructions online, but they assume you have some basic assembly skills, BIOS know-how, and a bit of troubleshooting chops. I had all that, and still needed help. So if you intend to do this, I suggest you find someone who has done it before, or someone technical enough to help decipher what some of the code and tech lingo means. I could not have done this without the gracious help, and patience, of Stefan Avalos and Patrick Sheffield. The reasons will become clear as the saga unfolds.

PART 1 – THE NEED FOR A NEW COMPUTER

My old tower is a 2008 MacPro with 16GB of RAM and a decent graphics card (great for Apple Color). But it’s 2016, and the machine is 8 years old, ancient in computer terms. Still, it has served me well. Lately I’ve been relying on my 2012 MacBook Pro when cutting in my home office, and I have fully decked it out with 16GB of RAM and two internal SSD drives. It does well too, but really only with Avid and simple Resolve and Adobe projects. Since most of the jobs I do in my home office are online and color grading, and I’ve been getting more and more into DaVinci Resolve, even this laptop is getting long in the tooth. It’s only four years old, and it still has some life left in it, but forget trying to do 4K with it. And I have several 4K projects on the horizon.

So I need a new computer.

One big stumbling block in this quest for a new computer is that I am a die-hard Mac user. I have been since I bought my first computer in 1991. I’ve only owned one PC, and it was solely for gaming. But Apple is fairly behind in their professional line of desktops. The latest MacPro came out in 2013…it is THREE years old, one year younger than my laptop. Sure, the latest iMac came out in October 2015, but I have issues with the expandability of the model. BOTH models, to tell the truth. Neither one lets you swap out the GPU for a better one…one more suited for Resolve and Adobe Premiere Pro. Neither one allows for more internal drives to be added. Neither one lets me install any sort of card to add ports like eSATA or Fibre or SAS, or an internal capture card, without using an external box. Apple is dropping the ball on the professional computer needs of video post, this much is clear.

So what about the other option? A PC running Windows. And yes, there are great computers out there for this…the HP line is especially popular, and a solid choice. You can get a decent machine for $2900 and put in one or two very high-end graphics cards, as well as additional hard drives. And they can go upwards of $5000 to $7000 (a good place to look for these is Videoguys.com). These are decent prices for companies or individuals that use them all the time, and have rental income on them year round. I don’t; I do projects here and there, and have small budgets.

But I have another issue…most of my clients require ProRes exports. They are either coming from FCP X (or even FCP 7) and need to be round-tripped, or they need ProRes QuickTime exports to meet network delivery requirements. So if I went with a PC, I’d have to keep an old Mac around to convert the exports (my MacBook Pro will do that). It’s a BIG issue on Windows, as Apple recently announced that it would stop supporting QuickTime on Windows altogether, so just having it installed is a problem. But this would also mean double encoding (DNxHD and then ProRes) and risking the dreaded gamma shift.

This had me looking at Apple options. I spoke to many people, and they said that, interestingly enough, the Retina iMacs were more stable with Resolve and Premiere Pro than the late-model MacPro. And there were reports of graphics card issues on the MacPro…an issue so bad that Apple offered a repair program for it. Many people recommended that I get the best iMac there is, maxed out with everything…better processor, the most RAM it could take, the best graphics card offered. I priced that out and it came to $3600: 4GHz processor (turbo to 4.2), 32GB RAM, 500GB SSD, and an AMD Radeon M395X graphics card with 4GB of RAM. Pretty decent…but spendy. And any expansion, as I said, is all external, which means a more cluttered workstation. And no Nvidia options, which Adobe and Resolve tend to like more.

That got me to thinking…what about a Hackintosh? (That’s a computer built with off-the-shelf computer parts that runs the Mac OS…one that’s been modified a little to work with these components.) I’ve built one before. And I have a couple of friends who have also done this, quite a few times. They have built multiple systems for themselves and others…Hackintosh models that they have used professionally, for years. So I got to looking into that option.

NOTE…in order to build a Hackintosh, you need to get very specific computer components that have been tested to work properly with the modified Mac OS. I set about researching this, starting with the go-to site for this, tonymacx86.com. There you can see builds for MacPros, MacMinis, and MacMini Deluxe. And you can see posts of people’s success stories and what components they’ve used…as well as failure stories where they explain why certain OS versions or certain hardware components don’t work. So it’s a good place to see what to do, and what not to do.

In looking for a basic motherboard, I found one that many people used…it has two Thunderbolt 2 ports and lots of USB 3 ports, and it is one of the MacPro build options. The issue was that it was a couple of years old, and I couldn’t find it anywhere. When I looked at more current motherboards, I found some with Thunderbolt 3, but people said that when they built their hacks with them, the TB3 wouldn’t work. The Mac OS doesn’t have drivers for that yet…one place where Apple is falling behind in tech that IT introduced. I found the other components I wanted, but without that motherboard with Thunderbolt 2…I was stalled.

Why so adamant about Thunderbolt 2? Well, none of the boards have FireWire (I’m flush with FireWire drives, and clients provide me with FireWire drives), but I have a Thunderbolt bridge from OWC that has FireWire on it. I also have a BlackMagic Extreme 4K capture box that has Thunderbolt…so I pretty much need a board with TB2 in order to connect to most of my hard drives, and the video I/O box that I use with all my editing apps.

So, as I said, I was stalled, and set about saving up money for that iMac, or one of the PC options. Just when I was about to do that, I started chatting about this with Stefan (one of the hack makers), and he tried to convince me to build the hack (one thing he said was “friends don’t let friends use Windows!”). And he was able to look up a couple of builds and track down the board I needed…right on the manufacturer’s website. (At the time of writing, it seems to be back in stock in several stores.)

Here’s the list of components that I chose for this build:

MOTHERBOARD:  $140 

GA-Z97X-UD7 TH LGA 1150 Z97 Dual Thunderbolt 2 ATX

RAM:   $166 ($183 w/tax and shipping)

Crucial 32GB Ballistix Sport DDR3 SDRAM Memory Module – 32 GB (4 x 8 GB) – DDR3 SDRAM – 1600 MHz DDR3-1600/PC3-12800 – 1.50 V – Non-ECC – Unbuffered – 240-pin – DIMM – BLS4KIT8G3D1609DS1S0

PROCESSOR:  $340

Intel Core i7-4790K Devil’s Canyon Quad-Core 4.0 GHz LGA 1150 BX80646I74790K Desktop Processor Intel HD Graphics 4600

PROCESSOR COOLING: $40

COOLER MASTER RR-212X-20PM-R1 120mm 4th Generation Bearing CPU Cooler

GRAPHICS CARD:  $389

EVGA GeForce GTX 980 4GB SC GAMING ACX 2.0, 26% Cooler and 36% Quieter Cooling Graphics Card 04G-P4-2983-KR

HARD DRIVE (SSD): $158

SAMSUNG 850 EVO 2.5″ 500GB SATA III 3-D Vertical Internal Solid State Drive (SSD) MZ-75E500B/AM

CASE:  $120

be quiet! SILENT BASE 600 WINDOW ATX Mid Tower Computer Case – Black

POWER SUPPLY:  $110

CORSAIR RMx Series RM750X 750W 80 PLUS GOLD Haswell Ready Full Modular ATX12V & EPS12V SLI and Crossfire Ready Power Supply

Tax and shipping for all of these components came out to $130.65, so the total cost of all the parts was $1593.65. Less than half the price of the iMac, and a quarter the price of the MacPro (which can be equipped with slightly faster processors, and has dual GPUs…down the line I intend to add a second GPU to my build). I already owned the two 23″ displays, keyboard, mouse, and Thunderbolt 2 bridge. And this configuration doesn’t include a wifi card or capability. Patrick (the other helpful guy) had a wireless ethernet receiver sitting unused, so he gave that to me. Otherwise, you’d either need to get one, or have the ability to connect your internet router directly to the computer.
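If you want to double-check my math (or swap in current prices for your own build), the totals above are just simple addition. Here's a quick sanity check, using only the prices from the parts list:

```python
# Prices from the parts list above, in US dollars.
parts = {
    "motherboard": 140,
    "ram": 166,
    "processor": 340,
    "cpu_cooler": 40,
    "graphics_card": 389,
    "ssd": 158,
    "case": 120,
    "power_supply": 110,
}

subtotal = sum(parts.values())        # sum of all components: 1463
tax_and_shipping = 130.65
total = subtotal + tax_and_shipping   # grand total: 1593.65

print(f"subtotal: ${subtotal}, total: ${total:.2f}")
```

Swap in your own component prices and the same two lines of arithmetic give you your build cost.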

The processor in this build is more on par with the high-end iMac. If you configure it, you can equip the iMac with a 4.0GHz i7, which is the same general model I purchased. But mine can turbo boost to 4.4GHz, where the iMac’s caps off at 4.2GHz. And while it is called a QUAD CORE processor, it has 8 threads thanks to hyperthreading, so the OS sees eight logical cores. As for the MacPro, those processors are called 6-core, 8-core and 12-core, but from my research it appears Apple disables hyperthreading in them. Don’t get me wrong, they will still out-perform the processor I have…but I’m also not shelling out what it costs for those machines, and their iffy GPUs.

OK…so the saga begins. I ordered the parts and soon they arrived. In part two of this saga, I’ll tell the tale of getting this machine up and running. It’s a wild tale filled with ups and downs, a week’s worth of late nights, swearing…and a couple of lean-forward moments.

A mini review for a mini color panel

This is an unsolicited review…Tangent didn’t ask me to do it, and I wasn’t supplied with a review unit. I rushed out and pre-ordered it from Flanders Scientific (where I get my broadcast monitors)…and then anxiously awaited its arrival for a couple of months.

It arrived late June in a classy little box.

When I unpacked it, I noted that it was nice and light. Not too light, but not too heavy, with a Wacom-tablet lightness to it. Some might think that it feels cheap and plasticky, but it really isn’t. The plastic is solid and the balls are pretty hefty…cheap is NOT how I’d describe the build. The construction is part of what keeps the cost down: yes, it’s plastic, but it doesn’t feel like toy plastic. It’s very solidly built.

One thing I noted is that it didn’t ship with any software. No big thing; the enclosed booklet gives directions to download and install the small plugin it needs. The unit plugs in via USB, and that’s the only cord on the device, providing both power and connectivity.

For the test run, using Resolve, I brought in some test footage from my BMPCC that I had shot in order to test out lenses. In the preferences I chose the TANGENT WAVE setting, as the instructions stated. And that’s all I needed to do. I was ready to go.

The unit is very simple, and has only the most basic controls. Which is fine by me, I’m a basic colorist. I am more of an online editor, and my focus is documentary work, leaning towards historic docs with old footage. A few interviews thrown in, or decent b-roll, but my main goal in color grading is to simply make the footage look good. I don’t do commercials, or scripted TV or feature films, where one does a lot of work on each shot, perhaps using 5-10 nodes and all sorts of power windows. For that work I would tend to recommend the larger color panels. But for what I do, the controls this unit offered were perfect.

Here’s what you have…the knobs adjust the brightness levels of the blacks, mids and highlights, and the balls let you adjust the color. The buttons are mappable with the TANGENT MAPPER app. By default, the A button is empty, the B button bypasses the grade (so you can, with one click, compare the footage graded and ungraded), and the buttons next to the balls are reset buttons. But the mapper app will let you map what you want to those buttons.

Using those controls in my little test was a great experience. The responsiveness is just what I want, not too sensitive. I had issues with another panel I used; I couldn’t seem to get it dialed in just right. This one was right out of the box. I took to it right away.

As I said, it’s a simple control surface, but I’m a simple colorist, without the need of complex controls. To that end, the Ripple is just what I needed for all my professional, and family video, needs. In short, I really really like it.

(For a more in-depth review, check out what Scott Simmons has to say over at ProVideo Coalition)

ClipWrap is dead…long live CLIPWRAP!

On July 28, 2016, ClipWrap from DivergentMedia.com will no longer be available for purchase. It is being end-of-lifed. It is a sad thing to see it go…it has been one of the most useful software tools in my kit for many, many years, if not the most useful.

This application (Mac only, sorry Windows), in case you didn’t know, converted AVCHD .MTS camera masters into QuickTime files. You could either re-wrap them as MOV and still retain the H.264 codec, just in another wrapper…or convert them to various flavors of ProRes, DNxHD, DVCPRO HD, Apple Intermediate Codec, HDV, XDCAM or DV…editing codecs that many editing systems preferred, because H.264 can be difficult to work with.
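For the curious: that "re-wrap" mode is conceptually a remux, copying the compressed streams into a new container without re-encoding. ClipWrap is a GUI app, but the same kind of operation can be sketched with ffmpeg's stream copy (ffmpeg is my illustration here, not something the post or Divergent Media mention, and the filenames are hypothetical):

```python
import subprocess

def rewrap_cmd(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command that re-wraps (remuxes) a clip into a
    new container without re-encoding the H.264 video or the audio."""
    return [
        "ffmpeg",
        "-i", src,     # input AVCHD .mts camera master
        "-c", "copy",  # stream copy: no transcode, just a new wrapper
        dst,           # output, e.g. a QuickTime .mov
    ]

cmd = rewrap_cmd("clip.mts", "clip.mov")
print(" ".join(cmd))
# To actually run it (requires ffmpeg installed):
# subprocess.run(cmd, check=True)
```

A remux like this is nearly instant since no pixels are touched, which is why re-wrapping was so much faster than transcoding to ProRes or DNxHD.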

Yes, many editing applications had the ability to import AVCHD and convert it…but the issue with the AVCHD standard is that, well, there was no standard. Every single camera maker had their own variant of AVCHD. So when a new camera was released that shot AVCHD, it utilized a different AVCHD variant than the editing software was used to, and the software wouldn’t be able to import footage from that camera until the application was updated or, more likely, a new plugin was released by the camera maker to let the editing software see the footage.

What was great about ClipWrap is that Mike, the main guy over at Divergent Media, seemed to always be on top of the new formats, and would release updates to ClipWrap much faster than the camera makers would release plugins. Which was very helpful for those times when your producer buys the latest camera, shoots a sizzle reel, and then needs you to edit it right away. If I had to rely on the camera makers and editing software companies…I’d be waiting a long time before things worked.

For example…quick story. A producer of mine bought the brand new Sony NX camera and shot a pilot with it. Of course, FCP 7 had issues bringing the footage in. If a clip was shorter than 5 minutes, all was good; anything over would take HOURS to import. Odd bug. And Avid? Couldn’t even see that camera. (Premiere Pro wasn’t on my radar at the time…sorry, Adobe.) So I tried to tackle it with ClipWrap, which had recently come out with an update. I tried it, and it worked! But there was a small glitch in the first 2 seconds of every clip. So I emailed Mike with the issue, he asked for a small sample file, I sent one his way, and within a day an update was released that made the glitch go away. How’s THAT for customer service!

So yeah…ClipWrap, that amazing app, is going away. BUT DON’T PANIC!! Divergent Media isn’t going away. No…they have a better option going forward. Everything that ClipWrap was and is, is available in EditReady…and has been for quite some time. And where ClipWrap only worked with AVCHD, EditReady works with a lot more: not only AVCHD (MTS), but also M2T (HDV), MP4 and MXF camera masters. It too will re-wrap, or convert to DNxHD and ProRes. AND…it will do it a lot faster than ClipWrap does, up to three times faster in many cases. And you can make adjustments to the image like flipping, rotating, retiming and applying LUTs. Everything ClipWrap is, plus a whole lot more.

So stop recommending ClipWrap to people on help forums. Move on to the improved EditReady. Exact same price, but tons more features. And yes, a trial is available.

Notice: Yes, I was approached by Mike to do a write-up on this. But no, I’m not getting compensation in return. I didn’t want any. ClipWrap has saved my bacon more than once. And Mike has always been nice, personable, and quick to address any issues that came up.

The fifty-eighth episode of THE EDIT BAY is now available for download.

This one is about a show that was cancelled midway through production.

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

This is the FIFTH episode of THE EDIT BAY, which I did many years ago…but when I went looking for it to link to in a forum post answer, I realized that I never posted it on my blog! It’s one of my favorite stories…so it might be new to some of you, even though it’s pretty old.

This one is about bidding on a job where we doubled our budget….but still lost to a competitor. Because the client thought “there must be something wrong with your bid…it’s much too low.” Yup…dealing with ad agency people.

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

The fifty-seventh episode of THE EDIT BAY is now available for download.

This one is about a New Media company that bit off more than it could chew.

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

The fifty-sixth episode of THE EDIT BAY is now available for download.

This one is about the time I was THIS CLOSE to meeting one of my favorite actors…

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

The fifty-fifth episode of THE EDIT BAY is now available for download.

In this episode, I get a little help from camera operators…

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

The fifty-fourth episode of THE EDIT BAY is now available for download.

A producer I work with loves telling old Hollywood History stories. I share a couple.

To play in your browser or download direct, click here.

To subscribe to this podcast in iTunes, CLICK HERE.

Or, “I WENT TO GO SEE DEADPOOL THIS WEEKEND, AND ALL I GOT WAS THIS MUG.”

Adobe invited me to see DEADPOOL on the 20th Century Fox lot in the Zanuck Theater (Atmos sound system) on Saturday, February 13. Who am I to say no to a free movie on a studio lot? Oh, and by the way…DEADPOOL WAS EDITED WITH ADOBE PREMIERE CC!!

In case you didn’t know, that is. Hard not to, it was all over the Twitter-sphere and post production groups and sites. And yes, many of you will say things like “Man, I could totally tell it was edited on Premiere Pro!” in a sarcastic way, just like we do when Avid goes “HEY! All the Academy Award nominees for Best Picture and Best Editing were cut on Media Composer!” We editors…and I’m guilty of this…will say “It’s all about the story. It doesn’t matter what tool was used to tell the story, it’s all about the skill and storytelling ability of the editor. You can’t tell if a movie was cut with Avid, Premiere, FCX or Lightworks.” And no, you can’t tell. Just like you can’t tell what tools were used to build a house. BUT…having a tool that helps the editing process go smoother, that helps the editor tell the story the way they want to…that is important.

This is something that I’m realizing more and more as I look at the features that each editing application brings to the table. They all have their strengths and weaknesses…all have areas where they do things better than the competition. This is why you choose one editing app over another. You need to look at the workflow you want to tackle, the features that you need to accomplish what you want, and then use the editing application that best addresses those needs. In the case of DEADPOOL, the post team did just this, and felt that Adobe Premiere Pro would be the best choice.

The main reason for this is that it was a very effects-heavy show…with a lot of speed ramps. The director, first-timer Tim Miller, was already familiar with Adobe products, as he is a visual effects artist with many features under his belt, and he knew he would be involved in the post process, doing many of the speed ramp effects himself. And the ability to send from the editing app to After Effects to do this work and send it right back was a huge bonus. Now, I’m not sure if the editor, Julian Clarke, has also done a lot of temp VFX work on the films he’s cut; that wasn’t mentioned in the post-screening talk (and he wasn’t there). But he has cut a lot of VFX films…so, maybe. To look into how to best accommodate his needs, the team decided to look at Adobe Premiere Pro, and hired post consultant Vashi Nedomansky, an editor himself and an expert with Adobe apps. Being an editor and consultant, he knew the real-world workflows that would work best for the production. He came in very early on to discuss what equipment they would need, and how to best deal with the multiple cameras and formats they would be shooting.

OK…so I watched DEADPOOL in the Zanuck Theater, equipped with an Atmos sound system, and it was GREAT! Yeah, I mean the sound, but I also mean the movie. Very violent, with sex and nudity added…it really deserves that “R” rating. But it is gloriously self-aware, breaking the 4th wall all the time, and in very creative ways. Amazing movie, with some amazing effects…tons of fighting, lots of speed ramps from normal speed to slow motion, CGI…the whole bit. Over 1200 VFX shots. This is what I learned in the post-screening talk, hosted by Michael Kanfer of Adobe (who has an impressive list of IMDB credits). On the panel was the post consultant I mentioned before, Vashi, as well as the first assistant editor, Matt Carson, and post supervisor Joan Bierman.

OK, so the post super, editor and other editorial staff were convinced that Premiere Pro would be the best option for this movie, and with a little effort they were able to convince the studio as well. Yes, you do need to convince the studio about all the tools used on films. They are very budget conscious, and want to make sure things go smoothly. And Avid has a proven track record and a well-established workflow, so they feel comfortable with it. Deviate from that choice and you need to explain why. They convinced the studio that this would be the best option given the VFX-heavy nature of the feature.

But convincing wasn’t the last step; the post staff also needed training. Again, they relied on Vashi. Not only did he provide them with the post workflow, but he also provided training to the editor, director and post staff. Vashi said that there are about 9 basic keyboard commands that any editor needs to edit, and he taught those to the crew. He also asked how they liked their keyboards mapped, and spent time mapping each station to the needs of the person at the controls…making sure their muscle memory could go unchallenged. By lunchtime, all of the edit staff were able to dive in and edit away without much issue.

Now for some technical details. The film was shot on a variety of cameras, from the main camera, an Alexa shooting 3.5K, to a 6K Red and a 4K Phantom. They converted everything to a common container…ProRes LT, at 2048×1152, a center extraction. If they needed to reposition a shot, they could easily do that at any time by going back to the masters. The assistant editors would receive the dailies from Encore in Vancouver, sync them with the audio, and then turn them over to the editor. Normally, when cutting on Avid, they were used to getting ALE files containing metadata they relied on, such as color information that could be passed on to the VFX teams. As a workaround, they received CDL files that were integrated into the footage. When it was onlined, E-Film conformed the cut with the RAW masters.
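A "center extraction," by the way, is just a crop of equal margins from each side of the larger camera frame. The math is simple; here's a sketch (the 3424×2202 source size below is a hypothetical example I picked for illustration, not a figure from the talk):

```python
def center_extraction(src_w: int, src_h: int,
                      out_w: int = 2048, out_h: int = 1152):
    """Compute the top-left crop offsets for a centered out_w x out_h
    extraction from a src_w x src_h camera frame."""
    if src_w < out_w or src_h < out_h:
        raise ValueError("source frame is smaller than the extraction")
    x = (src_w - out_w) // 2  # equal margin left and right
    y = (src_h - out_h) // 2  # equal margin top and bottom
    return x, y

# Hypothetical 3424x2202 source frame:
print(center_extraction(3424, 2202))  # (688, 525)
```

Because the extraction sits inside a larger frame, sliding those offsets around is exactly what let them reposition shots later by going back to the masters.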

They had five edit bays (including the VFX editor), all Mac-based, using MacPro tubes. All of the footage was stored on an Open Drive Velocity…a 180TB RAID server consisting of several SSD drives. This system was used on GONE GIRL the year before, and proved to serve that production well. It was able to meet the demands of playing back multiple streams of ProRes on five edit stations without any delays or skipping during playback. It also allowed the projects to open quickly. Each reel of the film was a separate project, and the larger they got, the slower they were to open…and the more VFX shots in a reel, the larger it got. At one point the final reel, the most VFX-heavy, took 10 minutes to open. But with help from the Open Drive tech people and some tweaking, they got it down to 2 minutes.

Now, I could go on and on about the post audio workflow issues and solutions, the change list challenges, the DI conform…but I don’t want to do that. Adobe will be posting the video of the event online soon, so you can watch that to see the issues and solutions. The main thing I want to talk about is the primary reason they wanted to edit with Premiere Pro. What made them choose this over Avid or FCX? Because that’s something I talk about all the time. There’s always some flame war online about which editing app is THE BEST out there. What is the BEST one to use for feature film cutting? For broadcast TV? What are the professionals using? And there will be shouting matches about this. “Avid all the way, baby!” “Come on, grandpa, look to the future! Tracks are for losers, FCX for the win!” Blah blah blah. The truth is they are ALL professional apps; it’s all about what’s best for the given situation. For this movie, the clear choice was Premiere Pro, because of its integration with After Effects. And I’ll explain why.

During the editing process, the director would send clips to After Effects for treatment…the most common being speed ramping, as I said before. The tight integration of Premiere Pro, its ability to SEND TO After Effects and have the clip link to the AE comp, made this process very smooth. Vashi explained that one lesson they learned on GONE GIRL was that simply linking to the AE comp caused the systems to really bog down, especially when there were many instances of this. So Adobe came up with a way for After Effects and Premiere Pro to access the same render cache…that is, the same “render file.” When the AE comp was finished, the editor could use a new feature called RENDER AND REPLACE. This renders the AE comp out to a QuickTime file, and Premiere Pro links to that file rather than the comp, while keeping a dynamic link to the AE project. So if the AE artist (or in this case, the director) made a change to the comp, back in Premiere all you’d have to do is RENDER AND REPLACE again, and the clip would render a new QuickTime file from the comp and link to that. This is a lot smoother, and simpler, than rendering out of After Effects and importing into Premiere Pro…and keeping track of the latest export while the export folder gets clogged with exports.

(Quick tech tidbit…the edit stations consisted of Mac Pro tubes…and during the production, they burned out TEN OF THEM! This was due to a hardware bug related to the D700 graphics cards that Apple eventually figured out. Several of the stations had external fans aimed down the tubes for additional cooling.)

So the director and editor could send a clip to After Effects, adjust the speed, go back to Premiere Pro, hit RENDER AND REPLACE, and there it was. If it didn’t quite look right, they’d do it again until it was right, and then move on and continue cutting. And once they locked that VFX shot, they could take the After Effects project, relink it to the RAW camera file, and redo the composition in 4K, at full resolution. Then they’d link that up in Premiere Pro and reframe as needed.

If they used Avid, or FCX, they’d have to do it the old-fashioned way, exporting files and importing them into the NLE, and that would slow them down. And this movie was FULL of speed changes; every fight scene had multiple speed ramps. So this really sped up the edit, and kept the post schedule short…and a shorter post schedule is good for the budget, which makes the studio happy. And that was a good thing, as they didn’t actually lock picture until a few days before the premiere.

One favorite feature of Matt, the first assistant editor, was “stacked timelines.” In FCP you could have multiple timelines open at the same time, as tabs (and in Avid you can only have one timeline, period…unless you open one in the Source monitor). In Premiere Pro, you can have two timeline windows open at once…and stack them like pancakes, one on top of the other. They used stacked timelines in a couple of ways. One way was to have all the selects in one timeline, and build the cut in another…dragging footage from the top timeline to the bottom. This helped them track how far along the selects they were, and how much time they had remaining. The other benefit was in comparing new cuts and old cuts. One of Matt’s duties is to work on the temp mix: when the editor finishes a scene, he takes it and adds music and sound effects. Then, when the editor goes back and makes changes, Matt can stack the two timelines and compare them…see where the changes occurred, or where new material was added, and address those areas. Often he’d be doing the temp mix while the scene was still being cut, so the editor wasn’t working with his temp mix, still just doing scene work. When the scene work was complete, Matt could compare the scenes, drag over the work he had done, and then continue working on the new areas. Coming from FCP, I’m a big fan of having access to multiple sequences at the same time.

So there you have it…one reason why Premiere Pro was the best option for editing DEADPOOL. It didn’t take long to train the editorial staff in using it…the editor wasn’t hindered by unfamiliarity, and was still able to focus on story without worrying about the technical side as much as he might with another NLE. A few hours of training and he was off! Could you look at the movie and tell it was cut on Premiere Pro? No…and that’s the beauty. We aren’t supposed to be able to tell. Editing is best when it’s invisible, so that we, the audience, can concentrate on enjoying the movie. Premiere Pro was the tool that enabled the editor and director to tell the story the way they wanted to.

Now, one tidbit that I wanted to mention, a story-related tidbit. The fact that Deadpool wears a full face mask enabled the filmmakers to retool dialog where needed…make the jokes better, because he really could be saying anything under that mask. And they did that quite a bit. Ryan Reynolds would send them temp ADR all the time…they’d cut it in, and replace it with final ADR when they locked the scene dialog. This makes me hope that they release the alternate jokes as a special feature on the Blu-ray.

OK, enough chat. Go see the movie. The laughs start right away with the opening credits, and continue after the credit roll ends. Typical of Marvel movies, this one has a bonus scene at the end of the credit roll.

Excelsior!

(For more on the workflow for this film, head on over to the ProVideo Coalition, where Steve Hullfish interviews Vashi Nedomansky. And watch this video from a panel at this year’s Sundance festival, where they talk workflow for HAIL, CAESAR! and DEADPOOL.)

OK, if you are an editor, and are on Twitter, you probably know about the hashtag #timelinetuesday. It’s where us editors post the timelines from our current shows…or perhaps past shows…as a way to go “Look at how complex my edit is!” Because, well, we can’t show you the show yet, but we can show you the skeleton of it, how it was constructed. It also gives us a way to show others “look how I lay out things on my timeline.” That’s what I do (OK, fine, I also do it to brag about the complexity)…show people how I like to organize my timeline, and lay out my tracks in a logical manner.

See, I’m an organizational nut! No, wait, that sounds wrong. I’m a nut about organization…ok, that’s better. Organization is the center of how I do things, so if I can impart some of my organizational knowledge to others, I feel good. Especially because I’ve worked with some people who can’t organize their way out of a box! Wait, can anyone do that?

ANYWAY…normally I just post a single timeline on Twitter, or now also on Facebook in the AVID EDITORS or ASK AN EDITOR section. Be it in progress or a finished thing. But this week I wanted to do something different. I wanted to show timelines from an Act of a show I worked on, starting with what it looked like at the end of day one, and ending on what it looked like at the end of day 7, with a bit of explanation about what I accomplished on each day. So….here we go. This is the timeline for Act 1 of a reality show.

DAY 1:

Link to larger image

This is a rough string out of only the reality. Normally this is something that the story producers would slap together, but this was the last episode, and since our show has an AFTERSHOW (like Talking Dead), we editors needed to do a more polished reality pass so that they could air this on the show. So, this is what I accomplished a few weeks before I actually returned to the act. It’s only the reality moments, no VO, and audio barely addressed (I didn’t isolate mics).

DAY 2:

Larger image.

I’ve now dropped in the “shell” of Act 1…meaning the spot where the show tease goes at the head, and then the open title sequence, and at the end, the shell for the tease out. I’ve also started dropping in the VO, putting the reality tracks into the proper configuration, and isolating mics. A couple of parts that you see at the end with tracks that dip into the PURPLE and SALMON range…those were additional reality moments added by the story producer. Here you can better see how I color code my tracks: BLUE is interview, GREEN is reality, SALMON is sound effects, and PURPLE is music. And I make different shades of each so I can see at a glance where the stereo pair tracks are. By that, I mean that all the tracks for this are MONO, but all the ODD tracks are panned LEFT, and all the EVEN tracks are panned RIGHT, so I need to make sure I add my stereo clips on the proper tracks, odd first, then even. If I do it even first and then odd, the clips have the stereo pair reversed. NOT good when you head to the MIX.

Another thing you’ll notice is that I label my tracks by what type of audio goes on them. Helpful for me at a glance, and other editors who might end up fine cutting this, or dealing with notes. AND…this information gets transferred to the ProTools session (track names). Helpful for them, too.

DAY 3:

Larger image

Finished adding VO and adjusting the reality tracks and isolating mics (meaning only having the audio up for the people talking at that given moment, to cut down on audio distractions). And I’ve started cutting the scenes, adding reactions and b-roll.

DAY 4:

Larger image

More filling out the reality moments and adding b-roll. The small grouping of clips around the 3:50:00 mark is a flashback package.

DAY 5:

Larger image

More filling out…another flashback package

DAY 6:

Larger image

ALMOST there. Added the tease at the beginning, cut by another editor.

DAY 7:

Larger image

Finished filling out. Added a tease for the upcoming at the end, lower thirds, and addressed producer notes given just after lunch. This Act 1 is ready to go to network as a rough cut…joined with the other acts other editors worked on.

Along with straight up creative cutting, I also online edit and color correct. This started years back when I started using FCP. The show I was on had a preliminary color pass to show the network…to PROVE to the network that we could mix the full-sized, tape-based Panasonic Varicam and the newly introduced HVX-200. That grade was done on a full scale, tape-to-tape DaVinci system. I looked at what was done, and said, “I can do that.”

Now, I’m no stranger to online and color correction, not at that point. I was an assistant online editor for many shows, and I learned from talented people. This was the first time I decided to take it on myself. At that time, I used the simple 3-way color corrector and a little product by Red Giant software…Magic Bullet Editors.

From onlining and grading a special here and there, I landed a full year job grading two series and a handful of specials. At that time I used Apple Color, and I kept using it on every FCP job that landed on my desk. But I also started digging into Avid Symphony…as more and more jobs coming my way were Avid based.

But now I have a job coming my way that’s shot on 4K, but needs a 1080p finish…with the ability to revisit it later at 4K.  The network stated that Resolve would be the better solution for the task, so now I’m learning DaVinci Resolve.

And it’s about time!  I’ve had it in my tool belt for a couple years now….Resolve 9. I won a copy of it a couple summers back doing the first #Postchat scavenger hunt. And I’ve sat on it, never needing to use it. I always kept returning to Color or Symphony. I never needed to use Resolve to convert footage, or do a final grade. Sure, I COULD have…but I’ll admit, I was lazy…other tools did the job just fine. But now I needed to use Resolve…I just needed to figure out how to use it.

For that I turned to the Ripple Training tutorials of my friend Alexis Van Hurkman.  In a few hours, spread over a couple days, I got up to speed. It’s about time.

OH…and while I had DaVinci Resolve 9…an app a couple of years old…I was able to upgrade to Resolve 11…FOR FREE.  And when Resolve 12 comes out…another free upgrade.  And Resolve Lite, which tops out at 1080 support, is still free.

I’ve been editing from home lately, and using my 2012 MacBook Pro as my main editing computer. I had to abandon my 2008 MacPro tower as it’s really showing its age. It’s getting finicky with random crashes, the AJA card in it is finally giving up the ghost (8 solid years with that baby!) and it’s just plain slow compared to my 2012 MacBook Pro.

The thing is, my MBP doesn’t have that many external ports on it. Sure, it has a LOT more than a MacBook Air, but when it comes to all the things I need connected to it when editing…it falls short. For the record, it has:

(1) Ethernet port

(1) Firewire 800 port

(1) Thunderbolt port

(2) USB 3 ports

(1) SD CARD slot

(1) Audio In

(1) Audio Out

The Ethernet port I use on occasion to network with the tower, to transfer files. Or to connect to my control surface for color correction. Firewire 800…obviously for a FW800 drive, of which I have a dozen or so. Thunderbolt…that’s the busiest one. I need that to connect to an IO device, the AJA IoXT, which is connected to an FSI professional color monitor, and also loops to a computer display. And then because my laptop monitor is too small to hold all the bins I need open, I use one of the USB ports for a USB to DVI adapter. And because editing/rendering/compressing causes a lot of heat on my laptop, the other USB is taken by a cooling pad. And then the audio out goes to the mixer and speakers.

Now I’m out of ports, but I need more. I need more USB ports: for thumb drives, for backing up projects or bringing over files from other people (fonts, pictures, etc.); for the keyboard and mouse, as I don’t use the laptop for those…it’s off to the side; and for other USB drives I have, like the RAMPANT drive I grab elements from now and then. Occasionally I attach other firewire drives, and yes, you can loop through…daisy chain…to some other drives, but it’s nice having dedicated connections.

So I need a hub. But I want to future-proof myself, so I want a major hub. Not just USB ports…because in the future I will get a new computer, as my 2012 laptop might not last long for editing either, and none of the newer models have Firewire 800. I might also want eSATA ports: my tower has those, and I have many drives with that fast connection, but no new computers have them. So I could either get Firewire to Thunderbolt adapters, eSATA to Thunderbolt adapters, Ethernet to Thunderbolt (for connecting to network RAIDs), and USB hubs…or one unit that solves all my needs.

So I have been looking at Thunderbolt docks. These connect to the computer via Thunderbolt, and with that one connection, offer many connections on the other end. Multiple USB3, Firewire 800, eSATA, Ethernet, and audio ports…with Thunderbolt loop through. The ones I tested are from Otherworld Computing, CalDigit and AkiTio…all offer different options.

Let’s do this in alphabetical order…

AKITIO THUNDERDOCK 2

The Akitio Thunderdock 2 is a nice small box. It’s about the size of a small USB drive, so it has a very small footprint.

And this box sports a lot of connections…two Thunderbolt ports for loop through (very important)…two bus powered USB 3 ports (backwards compatible with USB 2 and USB 1), two eSATA ports (up to 6Gbps), and one FW800 port.

There’s no Ethernet port, but I know many people won’t need this…if you do, other options sport it. But this is the only device of all of them that has both FW800 and eSATA…so that alone makes it useful. The bus powered USB ports get their power from the box, not the computer. So even when your computer isn’t connected to the unit, the USB ports supply power…great for things like charging your cell phone, or keeping your astronaut light lit.

This unit requires power, therefore it needs to be plugged in…just like every model I tested. But this is fine with me…this is how it can offer bus powered USB ports.

How fast are the connections? Glad you asked…first, a baseline. The drive attached is a CalDigit VR, its two drives striped as RAID 0, for speed. Here are the speeds of firewire directly connected to the computer: around 75MBps read and between 70MBps and 80MBps write.

Now the FW800 port on the AkiTio offers 66 MBps write/80 MBps read…so, comparable.

Now, my laptop doesn’t have eSATA, but my MacPro does…so I’m going to use it as a baseline. It has a CalDigit eSATA card in it. The speeds I get between it and the CalDigit VR are about 111MBps write and 125MBps read:

The eSATA on the AkiTio? Would you believe it’s FASTER? Well, it is. Between 155-162 MBps write and 164MBps read. Impressive.
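To put those throughput numbers in perspective, here’s a rough back-of-the-envelope sketch (a hypothetical Python snippet, not anything I actually ran during testing; the 158MBps figure is just an assumed midpoint of the 155–162MBps eSATA write range I measured) of how long copying a 100GB project would take over each connection:

```python
# Rough copy-time estimates from the measured write speeds above.
# 158 MBps is an assumed midpoint of the 155-162 MBps eSATA range.

def copy_minutes(project_gb, speed_mbps):
    """Minutes to move project_gb gigabytes at speed_mbps megabytes/sec."""
    return project_gb * 1024 / speed_mbps / 60

speeds = {
    "FW800, direct to computer": 75,
    "FW800, AkiTio": 66,
    "eSATA, MacPro card": 111,
    "eSATA, AkiTio": 158,
}

for name, mbps in speeds.items():
    print(f"{name}: {copy_minutes(100, mbps):.0f} min")
```

Roughly 23 minutes over FW800 versus about 11 minutes over the AkiTio’s eSATA…when you’re shuttling dailies or backups every day, that difference adds up.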

In short, the AkiTio is small, sports many connections, and has connections that are as fast as, if not faster than, direct-to-computer connections. The only issue I found was that the box ran a little hot. No, you can’t fry an egg on it, but I wouldn’t rest my hand on it for too long. Not hot…but more than warm. But after two weeks of use, 10 hours a day, it didn’t seem to be an issue. Great little box. It retails for $245.99 and ships with a Thunderbolt cable.

CALDIGIT Thunderbolt Station 2

A friend of mine has this unit, and swears by it. He has a MacMini that also is running short of connections, and this has served him well. And in my 2 week trial with it, it worked great for me too.

This unit also offers a small footprint, and sits nicely behind or next to my computer.

And this one offers a different set of connections: two Thunderbolt ports (allowing loop through), a 4K HDMI port, three USB 3 ports (two in the back and one in the front that is bus powered), two eSATA ports, an Ethernet port, and audio IN and OUT ports.

The HDMI is 4K at 30Hz, so it can send out an image to a 4K computer display or 4K TV. So you can send a signal out to a computer monitor via HDMI, or Thunderbolt (some monitors still needing a TB to DVI adapter). Now, one thing that this unit CAN offer over any other is dual display monitoring from a single Thunderbolt connection. Meaning the one Thunderbolt out from your computer can then be split to the HDMI out and Thunderbolt out. But ONLY if the monitor connected via Thunderbolt has a native Thunderbolt connection…like Apple’s monitors or LG’s TBT display. I was unable to test this feature, as I didn’t have one of those monitors.

The eSATA connection speed is comparable to the AkiTio…156MBps write, 165MBps read. Again, faster than my MacPro offered.

Very useful to have that, as I have more than a couple drives with eSATA, and with high data rate formats, and the need to edit native formats, speed is good.

Another great box with many useful connections. It gets a little warm, but not bad. The case dissipates the heat well. It also has an AC adapter and is required to be plugged in to work, but again, that’s how you get power out to USB 3 ports. And the dual monitors via one Thunderbolt connection is a nice feature. But again, they need to be specific monitors. It retails for $199, and doesn’t ship with a Thunderbolt cable.

OtherWorld Computing (MacSales) Thunderbolt 2 Dock:

This unit is the biggest of the bunch, but it also sports more connections. And it still fits behind my computer nicely:

OK…this unit has two Thunderbolt ports (again, loopthrough), FIVE bus powered USB 3 connections, one Firewire 800 port, one HDMI 4K port, one Ethernet port, and Audio In and Out.

Where the AkiTio has both FW800 and eSATA…and the CalDigit has eSATA but no FW800…this unit has FW800, but no eSATA. Which is fine for many people, who might not have eSATA but do need the FW800 connections, as all new Mac computers lack this connection. And we all have lots of FW800 drives that still function, and we still need to connect them to the computer.

The speeds of the FW800 connection are pretty much identical to what I get with the AkiTio box. 67MBps write, 80MBps read.

And like the CalDigit unit, this one also allows for display splitting, with the same restrictions. The monitor connected to the Thunderbolt port must be a Thunderbolt monitor.

This is my favorite unit in the bunch. Mainly because of all the USB ports (five of them) and the FW800 connections. In fact, the two ports on the side are “super-charged,” meaning they have extra power fed through them for fast charging of your tablet or mobile phone. I have a lot of things I need to connect via USB…a USB to DVI adapter (on the computer), a fan (on the computer) and then keyboard, Rampant drive, thumb drive, dongle for Resolve, Time Machine drive, or other transfer drive…all on the OWC unit. And when I do eventually upgrade, I’ll need the FW800 and Ethernet connections as I have lots of FW800 drives, and a color control surface.

And it runs pretty cool…about the same temp as the CalDigit unit. And like the rest, it also requires AC power. It retails for $229, and ships with a Thunderbolt cable.

These are not the only Thunderbolt docks on the market…these are just the ones I tested. There are also ones by Belkin, Elgato, and one by Sonnet that also has a BluRay drive for authoring BluRay disks.

The CalDigit and AkiTio review units were returned. I did retain the OWC unit as my expansion unit of choice.

Every day after school, my youngest daughter (age 11) comes into my office to watch me cut A HAUNTING. She does this in the guise of “I’m going to do homework out here while watching you edit.”  Although she ends up paying more attention to what I’m doing…and asking about it…than doing homework. But I tend to forgive the lack of homework because she’s very interested in the craft of editing. I do make sure she does it later…

Normally this show might be a bit intense and scary for a child of her age…and especially the age she was when she started watching me edit the show…age 8. But because she sees how the show is cut…she sees the scenes pre-visual effects, pre-scary sound effects and music cues…the show has been de-mystified for her. So she’s not scared. But she still really enjoys it, and she loves watching it come together.

She will be in my office one day as I watch down all the dailies for a scene…and then start to assemble the scene. I’ll explain to her why I cut to a different angle when I do…the motivation of the cut.  Or why I won’t cut at all, just leave the one long take.  She will be in the office the next day as I cut the music and SFX for that scene.  I’ll explain why I use a certain sound effect, or why I chose the certain music cue. And then explain the process of back-timing music to make it fit the scene, and then why I might extend shots a few frames or so to accommodate the music hits.  Many times I will play a scene and ask her opinion as to what type of sound effect she thinks I should put, and when. So I can see what sort of editing instincts she has.  Most of the time they are spot on, as she will say “hmmm, I think I want to hear some sort of sound here when the person sees black ooze on their face in the mirror, and they jump. Some sort of boom…or something.”

Pretty good.  But that doesn’t top what happened last week.

I’m cutting this one scene, and it consists of single long takes.  The first angle is a medium shot that becomes a close up of a person walking down the hall after hearing a sound…then the scare happens.  The next shot is of another person entering the hall to come to her aid.  This is also a one-shot take…no reverse angles.  Five single-angle takes.  In the first two takes, the camera is panned away from the first person, focused on the doorway as the second person emerges…then it follows him down the hall as he comforts person 1.  Those two takes started in the doorway, then panned down the hall, but they both had issues later on that made them not great. The far better takes are the next two…but the problem is that they didn’t start on the doorway…they started on a wide of the hall, angled on the first person.

This was an issue because not only would this be a jump cut, but the position she was in for the CU of the scream didn’t match the wider shot at all. Sure, I tried it, because often this difference would be minor and not noticeable…but it was just too different. So I wasn’t sure what I was going to do.  I explained the situation to my daughter as she sat doing her “homework” on the client sofa.  She put down the book, stood next to me and studied the situation.

“Hmmm…” she said.  “How about cover that cut with an interview bite?”

I was about to say why that wouldn’t work when I realized that it would work…and rather well too. See, you need to know (if you don’t watch A HAUNTING) that the show is a mixture of interviews and recreations. Mainly recreations, but with interviews of the people the incidents really happened to, to give the scenes more weight.  I will cut the scenes, but then need to adjust the cuts to accommodate the interview bites. And in this scene I had the interview bites happen much later in the scene, after the second person rushes to aid the first.  But I could just move the first one up sooner. Have her scream…then say “I was utterly scared, and fell to the floor”…then cut back to person 2 coming to comfort her…a few lines back and forth, and then the rest of the interview.

BRILLIANT!  This suggestion took her less than 5 seconds to come up with.

Now, I’m sure I would have figured that out eventually (maybe, after ranting about it for a bit)…but her instinct on this was so spot on, so quickly…I was humbled for a moment.  This kid has a future in editing, that’s for sure.

There’s a new trend that has come about because of digital cinematography. Not only longer takes, but more footage overall.

Back in the days of shooting film, you’d hear the words: “Roll sound.  Roll camera. ACTION!”  And then, after the scene was over, the word “cut!”  You might also hear the words “check the gate,” meaning look for hairs or dust in the film gate, but that’s beside the point.  OK, we still hear those words, especially when rolling second system audio (audio recorded separate from picture)…but a new trend is happening, something not encountered when shooting film.  Longer and longer takes.

When shooting film, one tended to be economical. Meaning, you shot only the scene, and when it was done, you stopped.  Because film is expensive, as is processing that film. So you set up your shots, rolled film, got the take, stopped, and reset for take two.  You’d use the time between takes to give the actors and crew directions, reset props and touch up makeup, etc. And when you were done with the main shots, you’d move in for inserts and pickups.  Shooting the actors saying crucial lines or giving needed reactions.

This wasn’t limited to film. This was also done when shooting to tape. Because you didn’t want to use up all your tape…you shot practically.

But now, with the advent of tapeless media, things have changed. First off, the amount of footage we get has increased by a major factor. On narrative projects, when we shot on film we’d get an hour or two of dailies per day. With tapeless, that has increased to between 4 and 6 hours of dailies per day.  And us editors still need to watch every bit of that footage.  That doesn’t leave much time to actually cut.  And the production schedule…the edit schedule…hasn’t changed. So days get longer and longer. Deadlines might get pushed, but that’s rare. So this means the hours the editor works in a day…a week…increase.

What’s going on with this increase of footage?  Well, many things.  One thing that happens is that one “take” actually consists of multiple takes. The director doesn’t call “cut,” he simply says “OK, reset to one quickly” and while the camera is rolling, they do another take, or multiple takes.  Recently I had one “slated take” that consisted of multiple resets. So one slated take contained five “takes.”  That scene had four slated takes, and those four takes consisted of twelve actual takes.

Also, a lot of things can happen between the takes…all while no one calls cut. For example, a friend of mine had a scene where one slated take was eight minutes long. In that eight minutes there was one minute and thirty seconds of the director giving directions before action started. He called for cameras and sound to roll, and then went to give directions.  Finally he called “action,” and the shot took one minute from start to finish.  Then while the camera was rolling the scene was reset, notes were given, makeup re-applied…six minutes of general hubbub…and then another one-minute take.

This is something that would never happen when shooting film.  I recall being called to the set because they were doing a pickup of a scene and they needed me, the editor, on set so that they could make sure the continuity was right. The director asked if I was ready, then called “action.”  When it got to the part I needed input on, I paused, trying to remember. “COME ON!” the director prodded, “we’re rolling!  Quickly quickly!”  I gave my direction, they did the scene and called “cut.”  Film is expensive. (I mentioned that)

Another thing that can happen is smaller resets within the one slated take.  That same friend of mine worked on a show where one slated take contains the full scene…but the director will also stop the actors mid-scene and have them repeat lines. Not once, but five or six times, maybe more. And not in one part of the scene…he’ll stop them several times.  The director will also stop several times to prod the actors to give better, or different, reactions.  Redo moves.  Basically the one slated take will be minutes long and contain lots of bits and pieces to complete the scene.  The scene itself, and all the pickups.  Not only does this make cutting more challenging, but now the editor needs to scrutinize every second of the dailies, and dig through all of this for not only the best scene, but all the good reactions and lines, and then cut that all together, cohesively.

And yet, as I said before, post schedules don’t change, so this leads to longer than 10 hour days, or working on the weekends. Often without overtime pay.

Now, it can be argued that doing this makes for a better show. And that is true, because some good performances can be had, and great reactions…stuff missed when you only have 3-4 full takes. This might just produce that one golden moment or reaction that makes the scene shine. Another argument is that stopping down the scene and needing to go in and reset and slate adds more time to the production schedule. So doing multiple takes in one slated take can save precious minutes.

I can understand that.  Still, some happy medium needs to be struck. Saving time on one end adds time to another. And post is the cheaper side of that argument. But a little extra time should be considered for the post side, in order to deal with the added amount of dailies.

I didn’t mention reality shows at all, because not much has changed there. They roll and roll and roll, and always have, because they are documenting what is happening. STACKS of tapes arrived at the office daily from DEADLIEST CATCH.  Five ships, three to four cameras per ship…filming nearly 24/7.  We have always gotten lots of footage from reality shoots. And that’s one reason they take so long to edit.  The editors have to sift through all of that footage to look for the gold. Even when we have story producers who pull selected scenes for us to work with, us editors need to watch all the footage surrounding that moment…if we are given the time to do so.