OK, if you are an editor, and are on Twitter, you probably know about the hashtag #timelinetuesday. It’s where we editors post the timelines from our current shows…or perhaps past shows…as a way to go “Look at how complex my edit is!” Because, well, we can’t show you the show yet, but we can show you the skeleton of it, how it was constructed. It also gives us a way to show others “look how I lay out things on my timeline.” That’s what I do (OK, fine, I also do it to brag about the complexity)…show people how I like to organize my timeline, and lay out my tracks in a logical manner.
See, I’m an organizational nut! No, wait, that sounds wrong. I’m a nut about organization…OK, that’s better. Organization is the center of how I do things, so if I can impart some of my organizational knowledge to others, I feel good. Especially because I’ve worked with some people who can’t organize their way out of a box! Wait, can anyone do that?
ANYWAY…normally I just post a single timeline on Twitter, or now also on Facebook in the AVID EDITORS or ASK AN EDITOR groups. Be it in progress or a finished thing. But this week I wanted to do something different. I wanted to show timelines from an act of a show I worked on, starting with what it looked like at the end of day one, and ending with what it looked like at the end of day 7, with a bit of explanation about what I accomplished each day. So…here we go. This is the timeline for Act 1 of a reality show.
This is a rough stringout of only the reality. Normally this is something that the story producers would slap together, but this was the last episode, and since our show has an AFTERSHOW (like Talking Dead), we editors needed to do a more polished reality pass so that they could air this on the show. So, this is what I accomplished a few weeks before I actually returned to the act. It’s only the reality moments…no VO, and the audio barely addressed (I didn’t isolate mics).
I’ve now dropped in the “shell” of Act 1…meaning the spot where the show tease goes at the head, and then the open title sequence, and at the end, the shell for the tease out. I’ve also started dropping in the VO, putting the reality tracks into the proper configuration, and isolating mics. A couple parts that you see at the end, with tracks that dip into the PURPLE and SALMON range…those were additional reality moments added by the story producer. Here you can better see how I color code my tracks: BLUE is interview, GREEN is reality, SALMON is sound effects, and PURPLE is music. And I make different shades of each so I can see at a glance where the stereo pair tracks are. By that, I mean that all the tracks for this are MONO, but all the ODD tracks are panned LEFT, and all the EVEN tracks are panned RIGHT, so I need to make sure I add my stereo clips on the proper tracks, odd first, then even. If I do it even first and then odd, the clips have the stereo pair reversed. NOT good when you head to the MIX.
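If it helps to see that odd/even rule written out, here’s a minimal sketch in Python…the track numbers and layout are just my hypothetical example for illustration, not anything Avid exposes:

```python
# Hypothetical sketch of the odd/even pan rule described above.
# Track numbers and names are illustrative, not an Avid API.

def pan_for(track_number: int) -> str:
    # All tracks are MONO: odd tracks pan LEFT, even tracks pan RIGHT.
    return "LEFT" if track_number % 2 else "RIGHT"

def stereo_pair_ok(left_ch_track: int, right_ch_track: int) -> bool:
    # A stereo clip must land odd-then-even, or the pair is reversed.
    return left_ch_track % 2 == 1 and right_ch_track == left_ch_track + 1

for t in range(1, 9):
    print(f"A{t}: panned {pan_for(t)}")

print(stereo_pair_ok(7, 8))  # True: L on odd 7, R on even 8
print(stereo_pair_ok(8, 9))  # False: even first = reversed pair at the mix
```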
Another thing you’ll notice is that I label my tracks by what type of audio goes on them. Helpful for me at a glance, and other editors who might end up fine cutting this, or dealing with notes. AND…this information gets transferred to the ProTools session (track names). Helpful for them, too.
Finished adding VO and adjusting the reality tracks and isolating mics (meaning only having the audio up for the people talking at that given moment, to cut down on audio distractions). And I’ve started cutting the scenes, adding reactions and b-roll.
Finished filling out the act. Added a tease for the upcoming act at the end, lower thirds, and addressed producer notes given just after lunch. This Act 1 is ready to go to network as a rough cut…joined with the other acts other editors worked on.
Along with straight-up creative cutting, I also online edit and color correct. This started years back when I started using FCP. The show I was on had a preliminary color pass to show the network…PROVE to the network that we could mix the full-sized, tape-based Panasonic Varicam and the newly introduced HVX-200. That grade was done on a full-scale, tape-to-tape DaVinci system. I looked at what was done, and said, “I can do that.”
Now, I was no stranger to online work and color correction, even at that point. I was an assistant online editor for many shows, and I learned from talented people. But this was the first time I decided to take it on myself. At that time, I used the simple 3-way color corrector and a little product by Red Giant software…Magic Bullet Editors.
From onlining and grading a special here and there, I landed a full-year job grading two series and a handful of specials. At that time I used Apple Color, and I continued to use it on every FCP job that landed on my desk. But I also started digging into Avid Symphony…as more and more jobs coming my way were Avid based.
Now I have a job coming my way that’s shot in 4K but needs a 1080p finish…with the ability to revisit it later in 4K. The network stated that Resolve would be the better solution for the task, so now I’m learning DaVinci Resolve.
And it’s about time! I’ve had it in my tool belt for a couple years now…Resolve 9. I won a copy of it a couple summers back doing the first #Postchat scavenger hunt. And I’ve sat on it, never needing to use it. I kept reaching for Color or Symphony instead. I never needed to use Resolve to convert footage, or do a final grade. Sure, I COULD have…but I’ll admit, I was lazy…other tools did the job just fine. But now I needed to use Resolve…I just had to figure out how to use it.
For that I turned to the Ripple Training tutorials of my friend Alexis Van Hurkman. In a few hours, spread over a couple days, I got up to speed. It’s about time.
OH…and while I had DaVinci Resolve 9…an app a couple years old…I was able to upgrade to Resolve 11…FOR FREE. And when Resolve 12 comes out…another free upgrade. And Resolve Lite, which tops out at 1080 support, is still free.
I’ve been editing from home lately, and using my 2012 MacBook Pro as my main editing computer. I had to abandon my 2008 MacPro tower as it’s really showing its age. It’s getting finicky with random crashes, the AJA card in it is finally giving up the ghost (8 solid years with that baby!), and it’s just plain slow compared to my 2012 MacBook Pro.
The thing is, my MBP doesn’t have that many external ports on it. Sure, it has a LOT more than a MacBook Air, but when it comes to all the things I need connected to it when editing…it falls short. For the record, it has:
(1) Ethernet port
(1) Firewire 800 port
(1) Thunderbolt port
(2) USB 3 ports
(1) SD CARD slot
(1) Audio In
(1) Audio Out
The Ethernet port I use on occasion to network with the tower, to transfer files. Or to connect to my control surface for color correction. Firewire 800…obviously for a FW800 drive, of which I have a dozen or so. Thunderbolt…that’s the busiest one, as I need it to connect an IO device, the AJA IoXT, which is connected to an FSI professional color monitor and also loops through to a computer display. And then, because my laptop monitor is too small to hold all the bins I need open, I use one of the USB ports for a USB to DVI adapter. And because editing/rendering/compressing causes a lot of heat on my laptop, the other USB is taken by a cooling pad. And then the audio out goes to the mixer and speakers.
Now I’m out of ports, but I need more. I need more USB ports for thumb drives, for backing up projects or bringing over files from other people (fonts, pictures, etc). I need one for the keyboard and mouse, as I don’t use the laptop for those…it’s off to the side. I need one for other USB drives I have, like the RAMPANT drive I grab elements from now and then. Occasionally I attach other firewire drives, and yes, you can loop through…daisy chain…to some other drives, but it’s nice having other connections.
So I need a hub. But I want to future-proof myself, so I want a major hub…not just USB ports. In the future I will get a new computer, as my 2012 laptop might not last long for editing either, and none of the newer models have Firewire 800. I might also want eSATA ports, as my tower has those, and I have many drives with that fast connection, but no new computers have them. So I could either get Firewire to Thunderbolt adapters, eSATA to Thunderbolt adapters, Ethernet to Thunderbolt (for connecting to network RAIDs), and USB hubs…or one unit that solves all my needs.
So I have been looking at Thunderbolt docks. These connect to the computer via Thunderbolt, and with that one connection, offer many connections on the other end. Multiple USB3, Firewire 800, eSATA, Ethernet, and audio ports…with Thunderbolt loop through. The ones I tested are from Otherworld Computing, CalDigit and AkiTio…all offer different options.
Let’s do this in alphabetical order…
AKITIO THUNDERDOCK 2
The Akitio Thunderdock 2 is a nice small box. It’s about the size of a small USB drive, so it has a very small footprint.
And this box sports a lot of connections…two Thunderbolt ports for loop through (very important)…two bus powered USB 3 ports (backwards compatible with USB 2 and USB 1), two eSATA ports (up to 6Gbps), and one FW800 port.
There’s no Ethernet port, but I know many people won’t need this…if you do, other options sport it. But this is the only device of the bunch that has both FW800 and eSATA…so that alone makes it useful. The bus powered USB ports get their power from the box, not the computer. So even when your computer isn’t connected to the unit, the USB ports supply power…great for things like charging your cell phone, or keeping your astronaut light lit.
This unit requires power, therefore it needs to be plugged in…just like every model I tested. But this is fine with me…this is how it can offer bus powered USB ports.
How fast are the connections? Glad you asked…first, a baseline. The drive attached is a CalDigit VR, its two drives striped as RAID 0, for speed. Here are the speeds of Firewire directly connected to the computer: around 75MBps read, and between 70MBps and 80MBps write.
Now the FW800 port on the AkiTio offers 66 MBps write/80 MBps read…so, comparable.
Now, my laptop doesn’t have eSATA, but my MacPro does…so I’m going to use it as a baseline. It has a CalDigit eSATA card in it. The speeds I get between it and the CalDigit VR are about 111MBps write and 125MBps read:
The eSATA on the AkiTio? Would you believe it’s FASTER? Well, it is. Between 155-162 MBps write and 164MBps read. Impressive.
In short, the AkiTio is small, sports many connections, and has connections that are as fast as, if not faster than, direct-to-computer connections. The only issue I found was that the box ran a little hot. No, you can’t fry an egg on it, but I wouldn’t rest my hand on it for too long. Not hot…but more than warm. But after two weeks of use, 10 hours a day, it didn’t seem to be an issue. Great little box. It retails for $245.99 and ships with a Thunderbolt cable.
CALDIGIT Thunderbolt Station 2
A friend of mine has this unit, and swears by it. He has a MacMini that also is running short of connections, and this has served him well. And in my 2 week trial with it, it worked great for me too.
This unit also offers a small footprint, and sits nicely behind or next to my computer.
And this one offers a different set of connections: two Thunderbolt ports (allowing loop through), a 4K HDMI port, three USB 3 ports (two in the back and one in the front that is bus powered), two eSATA ports, an Ethernet port, and audio IN and OUT ports.
The HDMI is 4K at 30Hz, so it can send out an image to a 4K computer display or 4K TV. So you can send a signal out to a computer monitor via HDMI, or via Thunderbolt (some monitors still needing a TB to DVI adapter). Now, one thing that this unit offers over the others is dual display monitoring from a single Thunderbolt connection. Meaning the one Thunderbolt out from your computer can then be split to the HDMI out and the Thunderbolt out. But ONLY if the monitor connected via Thunderbolt is Thunderbolt native…like Apple’s monitors or LG’s Thunderbolt display. I was unable to test this feature, as I didn’t have one of those monitors.
The eSATA connection speed is comparable to the AkiTio…156MBps Write 165MBps Read. Again, faster than my MacPro offered.
Very useful to have that, as I have more than a couple drives with eSATA, and with high data rate formats, and the need to edit native formats, speed is good.
Another great box with many useful connections. It gets a little warm, but not bad…the case dissipates the heat well. It also requires AC power to work, but again, that’s how you get power out to the USB 3 ports. And dual monitors via one Thunderbolt connection is a nice feature, though again, they need to be specific monitors. It retails for $199, and doesn’t ship with a Thunderbolt cable.
OWC THUNDERBOLT 2 DOCK

This unit is the biggest of the bunch, but it also sports more connections. And it still fits behind my computer nicely:
OK…this unit has two Thunderbolt ports (again, loopthrough), FIVE bus powered USB 3 connections, one Firewire 800 port, one HDMI 4K port, one Ethernet port, and Audio In and Out.
Where the AkiTio has both FW800 and eSATA…and the CalDigit has eSATA but no FW800…this unit has FW800, but no eSATA. Which is fine for many people, who might not have eSATA but do need the FW800 connections, as all new Mac computers lack them. And we all have lots of FW800 drives that still function, and we still need to connect them to the computer.
The speeds of the FW800 connection are pretty much identical to what I get with the AkiTio box. 67MBps write, 80MBps read.
And like the CalDigit unit, this one also allows for display splitting, with the same restrictions. The monitor connected to the Thunderbolt port must be a Thunderbolt monitor.
This is my favorite unit in the bunch. Mainly because of all the USB ports (five of them) and the FW800 connections. In fact, the two ports on the side are “super-charged,” meaning they have extra power fed through them for fast charging of your tablet or mobile phone. I have a lot of things I need to connect via USB…a USB to DVI adapter (on the computer), a fan (on the computer) and then keyboard, Rampant drive, thumb drive, dongle for Resolve, Time Machine drive, or other transfer drive…all on the OWC unit. And when I do eventually upgrade, I’ll need the FW800 and Ethernet connections as I have lots of FW800 drives, and a color control surface.
And it runs pretty cool…about the same temp as the CalDigit unit. And like the rest, it also requires AC power. It retails for $229, and ships with a Thunderbolt cable.
These are not the only Thunderbolt docks on the market…these are just the ones I tested. There are also ones by Belkin and Elgato, and one by Sonnet that also has a Blu-ray drive for authoring Blu-ray discs.
The CalDigit and AkiTio review units were returned. I did retain the OWC unit as my expansion unit of choice.
Every day after school, my youngest daughter (age 11) comes into my office to watch me cut A HAUNTING. She does this in the guise of “I’m going to do homework out here while watching you edit.” Although she ends up paying more attention to what I’m doing, and asking questions about it, than doing homework. But I tend to forgive the lack of homework because she’s very interested in the craft of editing. And I do make sure she does it later…
Normally this show might be a bit intense and scary for a child of her age…and certainly for the age she was when she started watching me edit the show…age 8. But because she sees how the show is cut…sees the scenes pre-visual effects, pre-scary sound effects and music cues…the show has been de-mystified for her. So she’s not scared. But she still really enjoys it, and she loves watching it come together.
She will be in my office one day as I watch down all the dailies for a scene…and then start to assemble the scene. I’ll explain to her why I cut to a different angle when I do…the motivation of the cut. Or why I won’t cut at all, and just leave the one long take. She will be in the office the next day as I cut the music and SFX for that scene. I’ll explain why I use a certain sound effect, or why I chose a certain music cue. And then explain the process of back-timing music to make it fit the scene, and why I might extend shots a few frames to accommodate the music hits. Many times I will play a scene and ask her opinion as to what type of sound effect she thinks I should use, and when…so I can see what sort of editing instincts she has. Most of the time they are spot on, as she will say “hmmm, I think I want to hear some sort of sound here, when the person sees black ooze on their face in the mirror, and they jump. Some sort of boom…or something.”
Pretty good. But that doesn’t top what happened last week.
I’m cutting this one scene, and it consists of single long takes. The first angle is a medium shot that becomes a close-up of a person walking down the hall after hearing a sound…then the scare happens. The next shot is of another person entering the hall to come to her aid…a one-shot take, no reverse angles…five takes of that single angle. In the first two takes, the camera starts focused on the doorway as the second person emerges, then follows him down the hall as he comforts person 1. But both of those takes had issues later on that made them not great. The far better takes are the next two…but the problem is that they didn’t start on the doorway…they started on a wide of the hall, angled on the first person.
This was an issue because not only would this be a jump cut, but the position she was in for the CU of the scream didn’t match the wider shot at all. Sure, I tried it, because often this difference would be minor and not noticeable…but it was just too different. So I wasn’t sure what I was going to do. I explained the situation to my daughter as she sat doing her “homework” on the client sofa. She put down the book, stood next to me and studied the situation.
“Hmmm…” she said. “How about cover that cut with an interview bite?”
I was about to say why that wouldn’t work when I realized that it would work…and rather well, too. See, you need to know (if you don’t watch A HAUNTING), the show is a mixture of interviews and recreations. Mainly recreations, but with interviews of the people the incidents really happened to, to give the scenes more weight. I will cut the scenes, but then need to adjust the cuts to accommodate the interview bites. And in this scene I had the interview bites happen much later, after the second person rushes to aid the first. But I could just move the first one up sooner. Have her scream…then say “I was utterly scared, and fell to the floor,” then cut back to person 2 coming to comfort her…a few lines back and forth, and then the rest of the interview.
BRILLIANT! This suggestion took her less than 5 seconds to come up with.
Now, I’m sure I would have figured that out eventually (maybe, after ranting about it for a bit)…but her instinct on this was so spot on, so quickly…I was humbled for a moment. This kid has a future in editing, that’s for sure.
There’s a new trend that has come about because of digital cinematography. Not only longer takes, but more footage overall.
Back in the days of shooting film, you’d hear the words: “Roll sound. Roll camera. ACTION!” And then, after the scene was over, the word “cut!” You might also hear the words “check the gate,” meaning look for hairs or dust in the film gate, but that’s beside the point. OK, we still hear those words, especially when rolling second system audio (audio recorded separately from picture)…but a new trend is happening, something not encountered when shooting film: longer and longer takes.
When shooting film, one tended to be economical. Meaning, you shot only the scene, and when it was done, you stopped. Because film is expensive, as is processing that film. So you set up your shots, rolled film, got the take, stopped, and reset for take two. You’d use the time between takes to give the actors and crew directions, reset props and touch up makeup, etc. And when you were done with the main shots, you’d move in for inserts and pickups. Shooting the actors saying crucial lines or giving needed reactions.
This wasn’t limited to film. This was also done when shooting to tape. Because you didn’t want to use up all your tape…you shot economically.
But now, with the advent of tapeless media, things have changed. First off, the amount of footage we get has increased by a major factor. On narrative projects, when we shot on film we’d get an hour or two of dailies per day. With tapeless, that has increased to between 4 and 6 hours of dailies per day. And we editors still need to watch every bit of that footage. That doesn’t leave much time to actually cut. And the production schedule…the edit schedule…hasn’t changed. So days get longer and longer. Deadlines might get pushed, but that’s rare. So the hours the editor works in a day…a week…increase.
What’s going on with this increase of footage? Well, many things. One thing that happens is that one “take” actually consists of multiple takes. The director doesn’t call “cut,” he simply says “OK, reset to one quickly” and while the camera is rolling, they do another take, or multiple takes. Recently I had one “slated take” that consisted of multiple resets. So one slated take contained five “takes.” That scene had four slated takes, and those four takes consisted of twelve actual takes.
Also, a lot of things can happen between the takes…all while no one calls cut. For example, a friend of mine had a scene where one slated take was eight minutes long. In that eight minutes there was one minute and thirty seconds of the director giving directions before action started. He called for cameras and sound to roll, and then went to give directions. Finally he called “action,” and the shot took one minute from start to finish. Then, while the camera was rolling, the scene was reset, notes were given, makeup re-applied…six minutes of general hubbub…and then another one-minute take.
This is something that would never happen when shooting film. I recall being called to the set because they were doing a pickup of a scene and they needed me, the editor, on set so that they could make sure the continuity was right. The director asked if I was ready, then called “action.” When it got to the part I needed input on, I paused, trying to remember. “COME ON!” the director prodded, “we’re rolling! Quickly quickly!” I gave my direction, they did the scene and called “cut.” Film is expensive. (I mentioned that)
Another thing that can happen is smaller resets within the one slated take. That same friend of mine worked on a show where one slated take contained the full scene…but the director would stop the actors mid-scene and have them repeat lines. Not once, but five to six times, maybe more. And not in one part of the scene…he would stop them several times. The director would also stop several times to prod the actors to give better, or different, reactions. Redo moves. Basically the one slated take would be minutes long and contain lots of bits and pieces to complete the scene…the scene itself, and all the pickups. Not only does this make cutting more challenging, but now the editor needs to scrutinize every second of the dailies, and dig through all of this for not only the best scene, but all the good reactions and lines, and then cut that all together, cohesively.
And yet, as I said before, post schedules don’t change, so this leads to longer than 10 hour days, or working on the weekends. Often without overtime pay.
Now, it can be argued that doing this makes for a better show. And that is true, because some good performances can be had, and great reactions…stuff missed when you only have 3-4 full takes. This might just produce that one golden moment or reaction that makes the scene shine. Another argument is that stopping down the scene and needing to go in and reset and slate adds more time to the production schedule. So doing multiple takes in one slated take can save precious minutes.
I can understand that. Still, some happy medium needs to be struck. Saving time on one end adds time to another. And post is the cheaper side of that argument. But a little extra time should be considered for the post side, in order to deal with the added amount of dailies.
I didn’t mention reality shows at all, because not much has changed there. They roll and roll and roll, and always have, because they are documenting what is happening. STACKS of tapes arrived at the office daily from DEADLIEST CATCH. Five ships, three to four cameras per ship…filming nearly 24/7. We have always gotten lots of footage from reality shoots. And that’s one reason they take so long to edit. The editors have to sift through all of that footage to look for the gold. Even when we have story producers who pull selected scenes for us to work with, we editors need to watch all the footage surrounding that moment…if we are given the time to do so.
Finally, another regular blog post, after 2 years of sparse activity…barring a few reviews (another coming). This was due to the fact that I was under NDA at the company I worked for. All I could say was that I was working on a certain show. Beyond that…I couldn’t say anything. On that note, for the past two years I’ve worked on two seasons of ANCIENT ALIENS, two seasons of AMERICA’S BOOK OF SECRETS, a short series called BIBLE SECRETS REVEALED and two seasons of THE CURSE OF OAK ISLAND. With a handful of specials and a few shows that I’ve onlined and color corrected. One of those series that I color corrected was for Werner Herzog…which is really cool! A cinematic hero of mine. But other than saying “I made Werner Herzog shows look pretty,” I didn’t have much to say about the shows…technically speaking.
OK, there was one great thing. He framed one interview to include a window, because there was a view of the freeway outside of it. It was blown out, and he wanted me to do a power window and fix it as best I could, because he wanted to see the cars. He has loved cars ever since he was a young boy growing up in Bavaria, where he was so poor that neither he nor his friends could afford one. When one came down the road, he and his friends would all run to see it drive by. OK…there, got that out of the way.
Now, back to A HAUNTING. I did cut three episodes of season 5 of this fun show 3 years ago…and I blogged about it starting with this post. And in the past year I have done the fine cut notes on one episode, and recut another episode with another editor. Those fixes lured me back. That and being able to work from home…that’s a big bonus. No commute, so those hours can be…well…used to work a little longer on the show.
Another bonus is being able to have my kids watch me work, which they like to do. And they like the show, too. And they really like seeing how the show is put together. My youngest (age 11) will come into my office and watch me watch dailies, and then assemble a scene. I’ll even be a bit sly and educate her on editing, by asking her advice about which take she thinks is the best. Most often it’s the one I thought as well. Other times, she’ll point out some part of a take that I might not notice. Some reaction that she likes…and it’s nice to have that perspective. I’ll ask her when she thinks I should cut…and why. And what she thinks I should cut to. It’s pretty fun.
Today she sat behind me as I was beginning to score a scene and add SFX. She really likes to see a scene come together with sound effects and music. She marvels at how a small effect will impact things…and how layering many of them really fills out a scene. “It makes the scene…you know…more rich. Less hollow.” Good way to describe it.
So, in this scene the character begins to vacuum and hears a sound in the kitchen…but he’s alone in the house. He stops, looks around, calls for his wife…then shrugs and goes back to vacuuming. A sound happens again, and again he stops to investigate. While he’s looking about he hears another sound, turns around, and then the ghost appears. Well, in the filming of the scene, when the actor starts the vacuum there is no vacuum sound…that is something I need to add. And in both cases where the actor needs to react to the sounds, the director can be heard saying “SOUND” in the background. “How are you going to get rid of the director talking?” my daughter asked. “Why did the director even talk? That makes this difficult.”
“Well, let’s see. If I remove the sound of the director…what does that do?” And I did…and as expected, having the audio cut out to complete silence was jarring. “Well, you can’t do that,” she replied. “Now it sounds empty…and it’s too abrupt.” I was able to explain to her the use of “room tone” in situations just like this. But this was tricky, as the actor was performing an action…moving a vacuum on the floor. The room tone worked for the third time the actor hears the sound, for the vacuum is off and the actor is still, looking about the house. But for the other two times, the first thing I did was add the sound of the vacuum…which helped a bit, but you could still hear the audio cut out, as the sound of the vacuum moving on the floor can be heard, and then suddenly not heard.
“What do we do about that?” she asked.
“Well,” I replied, “we need a similar sound…of the vacuum moving on the floor, right? So I just need to grab some audio from a time he’s moving the vacuum and match the motion…so we cut out when he’s pushing it forward. Let’s find another place where he’s pushing it forward, and use that.” I did, and it worked great. So after adding the vacuum, and patching the hole where the director spoke, I then looked for a sound to occur. I chose a loud floor creak, as that’s something that one might think a person could make…but could also be a sound a ghost makes. And then I sweetened that with a low boom sound effect when the actor reacts to the sound. And I looked for a place where the music could go from a low tension drone to a musical sting. And then I back-timed that and mixed the track so the edit was smooth.
After a couple hours we had the scene edited and layered with audio. And with that my daughter excused herself to go inside and have a snack, and then go for a bike ride. But during the time I had her with me, she was engaged and really enjoyed herself. I’m thinking that I might have one of my kids following in my footsteps.
More and more I’m coming to rely on my MacBook Pro as my main editing machine, as my 2008 MacPro is really showing its age. OK, to be honest, I’ve been relying on it as my main edit machine since I bought it. But my laptop isn’t all that new either. It’s a mid-2012 non-retina MacBook Pro. Now, this model is also getting a bit long in the tooth, but I feel it still has quite a bit of useful life in it yet. And since I’m not one to chuck away a computer when it’s just a little old (I relied on my 2008 for four years), I’m hoping I can get more life out of my laptop. I thought I’d look at options to extend its life as long as I can.
When I first bought it, I buffed it up a bit by installing 16GB of RAM. A year later, I buffed it up even more by replacing the main drive with an SSD.
Most of the time I’ve used it, it’s been on side projects, as I’ve been working in an office on their Avid machines. These smaller side projects always end up needing me to attach a hard drive to my computer. This is quite normal…but there are some inconveniences. First off, the power drain is pretty quick, requiring me to constantly be plugged in. Second, and this one is the biggest problem…those hard drives occasionally unplug, throwing all the media offline and causing the app to crash.
Then I came up with a solution. I have this optical drive that I rarely use. I happen to have an external one for the two Mac Minis and two MacBook Airs in the household, so if I need one, I have one. So why not take that thing out and put in a second SSD? A large, fast one to store media on? I’ve been seeing ads online, and hearing from friends about the Mercury Extreme Pro 6G from MacSales (AKA OtherWorldComputing), and how fast they are. So I sent for one (the 480GB model), and for an optical drive replacement kit (known as a Data Doubler).
They arrived last week. I won’t go into all of the boring details of how to install it, as they are in the nice manual they send…which at first I thought would be confusing, but as it turned out, was very straightforward and easy:
So the first thing I had to do was get the SSD into the Data Doubler…that was easy enough…just a couple mounting screws:
Now to get it into the computer. I’ll just say that while I had to take out a lot of screws (and keep track of what went where) and disconnect and reconnect a couple small and delicate cables…I was able to do it without much effort, and pretty quickly (20 min total).
This computer surgery took place on the dining room table, with my kids looking on…wondering if I’d end up breaking or losing anything. Nothing of the sort happened, and I couldn’t tell if they were happy or saddened by that. Sometimes I think they want me to break something by accident just once, as I often take apart my computer or other electronic things.
But I did have a couple leftover screws. Which is always a bit worrisome…but in this case, the manual said this is perfectly normal, as they provided two screws to mount the Data Doubler, making these two no longer needed:
So the new drive is in…and happily on the desktop underneath my system drive:
And it is FAST. See for yourself…here’s the AJA Speed Test:
And the Blackmagic Design Speed Test.
Plenty fast for 1080p ProRes or DNxHD…or 2K footage. Not that I get 2K footage, or 4K. I mainly deal with 1080p. But it is plenty fast for my needs. Faster than the SAS RAID I have in my office, and TONS faster than eSATA. Approaching Thunderbolt RAID speeds. F-A-S-T.
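To put “plenty fast” in rough numbers, here’s a back-of-the-envelope sketch. The codec data rates are the published nominal figures (ProRes 422 at 1080p29.97 is about 147 Mbps; DNxHD 220 is 220 Mbps), and the drive speed is a placeholder…plug in your own speed test result:

```python
# Rough stream-count math: streams = drive throughput / codec data rate.
# MEASURED_READ is a placeholder; use your own AJA/BMD speed test number.

MEASURED_READ = 450  # MB/s, hypothetical SSD read speed

codecs_mbit = {
    "ProRes 422 (1080p29.97)": 147,  # nominal megabits per second
    "DNxHD 220": 220,
}

for name, mbit in codecs_mbit.items():
    mb_per_sec = mbit / 8  # megabits/s -> megabytes/s
    streams = int(MEASURED_READ // mb_per_sec)
    print(f"{name}: ~{mb_per_sec:.0f} MB/s per stream, ~{streams} streams")
```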
I’ve been using it for the last few weeks for a sizzle reel…a show pitch for a network. The only issue I’ve had is a slightly higher power drain. My battery used to last about 5 hours, now it lasts 4 hours. That’s longer than when I have an external drive attached…my battery drops to between 2-3 hours with one of those attached. But I tell you, when I installed my first SSD, replacing the spinning OS drive…THAT was a big change. That alone will make your machine like new.
So I’ve extended the life of my computer even more. I’m hoping to get two more good years out of it. And with all I’ve put into it…I think I will.
If you happen to have a laptop with an optical drive, and don’t need to author DVDs…I highly recommend replacing it with an SSD. Or at least replacing your main OS drive with one. Having an SSD as the main boot drive is amazing: 12 seconds to boot the computer, just a few seconds to launch applications. Night and day compared to the older spinning drives.
Worried about TRIM? How some drives can slow down over time…and how certain OS versions, such as Yosemite, don’t support TRIM on third-party drives? Well, no need to worry. OWC SSDs don’t need TRIM.
(The Mercury Extreme Pro 6G and Data Doubler were provided by MacSales at no cost to the reviewer. I’m pretty sure this did not sway my review in any way. It would have been just as fast and just as handy had I bought it, which I was planning on doing.)
OK, this is a tip for the more advanced user. One who knows their way around the Avid Media Composer software enough to dig themselves out of any hole they might dig themselves into. Or those who know a bit about how Avid MC operates under the hood. Although some of this is easy enough for the beginner. Just…be careful.
If you edit with Avid Media Composer (MC), you may notice that Avid has a very specific way that it organizes media, and there’s not much you can do to change it. Avid media is ALWAYS stored in the following path: Hard Drive > Avid MediaFiles > MXF > 1. Or other numbered folders, as there is a limit of about 5000 files per folder. Get near that and Avid makes a number “2” folder, and so on. Avid MC will ONLY see media if it is in that very specific file structure. Any media outside of that won’t be seen. Even if you have a “1” folder just on the root level of the hard drive, Avid won’t see it.
And you can’t make specific folders and direct Avid MC to render/import/capture to them on the project level. Meaning you can’t have the “1” folder be for Project X, and the number “2” folder be for Project Y. Avid will always default to putting media into the “1” folder until it is full, then shift to folder “2.” This makes it difficult to separate the media from one project into separate folders…like we did with Final Cut Pro Legacy (1-7). For those of us who are hyper organized, and like to do this at the Finder level…it’s frustrating.
See, Avid MC does all the organization inside the application. That’s where you access the media, copy the media…and sort the media. Want to delete the media for a specific project? Then you need to open the Media Tool, search for media for that specific project, and delete it. Or if you want to move it, you use the same thing…Media Tool and then CONSOLIDATE. It can be a hassle.
Now, if you have media that isn’t for one project…but is to be used in multiple projects…you want to keep that media separate from the project specific media. Say the same music cues, sound effects, basic b-roll or stills. Well, I have a solution. My original solution was to just have different numbered folders, because Avid will see the folders if they are numbers in the MXF folder. So I numbered my music as “10” and my SFX as “20.” This way I could easily find them. I would love to have had a folder called “SFX” and one called “MUSIC,” but if I named them that, Avid wouldn’t see them. They needed to be numbers or Avid MC wouldn’t see the media.
But then I saw something that raised my eyebrow.
The folders started with numbers, but then had names with letters after the number. And yes, Avid still saw the media. I took out the number, and the media went offline…added it back, and it returned. Did I think of this solution? No, I can’t take credit for it. I had a project arrive with things done this way, and I was a bit amazed to see that it actually worked. You see, I thought that the folders had to be ONLY numbers. But as long as the folder name starts with a number, Avid will see it. So you can still add a name. Sweet!
BUT…Avid MC still defaults to putting media into the “1” folder when you render or import new media. So it might be best to make your custom numbers higher…starting at 10 or 20. And if you import new media that needs to go into one of those folders…after you import, do a REVEAL FILE to locate the media, and move it to where you want it. Just know that any new footage goes right into the “1” folder…so if you need it moved, you need to do that manually.
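To make the rule concrete, here’s a minimal Python sketch of the behavior described above…the path and folder names are examples, and the “starts with a number” test is just my reading of what Avid appears to do, not documented API behavior:

```python
# Sketch of the observed rule: Avid only scans folders inside
# Avid MediaFiles/MXF whose names START with a number.
from pathlib import Path

mxf = Path("/Volumes/Media/Avid MediaFiles/MXF")  # example path

for folder in sorted(mxf.iterdir()):
    if folder.is_dir():
        seen = folder.name[:1].isdigit()  # digit first = Avid sees it
        print(f"{folder.name:<12} {'seen by Avid' if seen else 'IGNORED'}")

# "1", "10 MUSIC" and "20 SFX" would print as seen; "SFX" would not.
```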
This is where the danger comes into play. If you don’t know Avid MC well, you can dig yourself into a hole…make media go missing. So you need to know how to fix things if you mess up. If you have media from your project that you need, and you THOUGHT you moved it to a new folder, but you didn’t…oops. So BE CAREFUL.
More and more footage is coming from tapeless formats. From formats where the camera medium used to capture that footage is used over and over again. So one needs to back up that footage onto another medium. Yes, there are LTO and DLT solutions…tape. And that is very reliable. And while still expensive, the price is dropping. But not enough to let a small independent editor like myself afford them JUST yet. Soon.
In the meantime, I archive the tapeless footage, as well as finished shows, graphics and other elements, to hard drives. TWO hard drives, to be precise. As a post supervisor I know puts it: “the footage doesn’t exist unless it is in two locations.” Two copies…at least. And I have clients whose media I need to keep on ice for a few months, just in case changes come along.
Now, I could use expensive drives in enclosures, or keep this all on a big RAID tower, but I use my RAID for active media. And enclosures…who needs those for simple archiving? They are more expensive than bare SATA drives, which is what I use.
To access those drives, I use a drive dock. These have been around for a while…they allow you to take a bare 3.5” or 2.5” SATA drive and connect it to your computer. Slide the drive in, turn on the power…bam, there it is.
I did rely on a USB2 dock for quite some time…but that connection is proving to be very VERY slow. OtherWorldComputing.com (OWC) approached me about reviewing a few of their products…and the NewerTech Voyager drive dock was first on their list of options.
What I really liked about this is that it wasn’t only USB 2 or 3. Yes, USB 3 is plenty fast and my computer has USB 3 so I should just be happy with USB 3…but I do have other older computers. And many places I work also have older computers…none that have USB 3. But they do have eSATA or FW800…and if your computer is so old you only have FW400…well, this unit has ALL of those. USB 3, eSATA, FW800, FW400 (Firewire is daisy chainable…so you can add this to your chain). So you aren’t limited to USB on your older machines…which can take a long time. You can benefit from the speed of FW800 or eSATA.
I first used this unit to clone my computer’s hard drive, as I was going to do an OS update. I ALWAYS clone my working system before I update…in case something goes wrong, I can always revert back to the old working system. So I pulled out the 2.5” drive that I use for this purpose (it’s the one that originally shipped with the computer, which I had since replaced with a 500GB SSD from OWC). I connected via FW800 and cloned my system…216GB copied over in 30 min.
Then I did the update. And wouldn’t you know it…things didn’t work out as I had hoped (the OS version and my main editing app didn’t like each other)…so I had to revert back. NOW…how did I revert back? I booted to my cloned drive. I’ve always had issues with trying to boot from an external drive connected via USB…even USB 3. It’s hit or miss…sometimes it’ll boot, sometimes not. But with Firewire…400 or 800…it’ll always boot. Because this unit has FW800, booting to the clone was a breeze, and I had my working OS back up in under an hour.
VERY handy to have a dock with this many connections. And OWC is kind enough to provide you with all the cables:
Oh, and if you want transfer speed numbers for the various connections, here they are for USB3, eSATA and FW800…the three connectors I have access to on my laptop. The eSATA connection is through the Thunderbolt bridge…so speeds might vary if you are using a card in an older tower.
I transferred a 1.2GB file using various connections…USB 3, FW800 and eSATA. I didn’t use FW400, as my laptop doesn’t have that connection. Sure, my tower does, but I was too lazy to turn it on and try it out. It’s a VERY legacy connection…barely anyone uses it anymore. Anyway, here are the various timings:
USB 3 – 21 seconds (a 100GB file would take about 29 min)
FW800 – 22 seconds (a 100GB file would take about 31 min)
eSATA – 18 seconds (a 100GB file would take about 25 min)
FW400 – not tested, but bound to take longer than all of the above
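Those 100GB estimates are just straight-line extrapolation from the 1.2GB test file, assuming the throughput stays constant for the bigger copy (in reality it can dip a bit):

```python
# Straight-line extrapolation: time_100GB = measured_time * (100 / 1.2)

TEST_SIZE_GB = 1.2
timings_sec = {"USB 3": 21, "FW800": 22, "eSATA": 18}  # measured above

for port, secs in timings_sec.items():
    minutes = secs * (100 / TEST_SIZE_GB) / 60
    print(f"{port}: ~{minutes:.0f} min for a 100GB copy")
# USB 3: ~29 min, FW800: ~31 min, eSATA: ~25 min
```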
And I ran the BlackMagic Speed Test with all the connections. I won’t post all three as they all were almost exactly the same.
The drive used was a simple Western Digital Caviar Green 1TB drive…5400 RPM. And again, this wasn’t a review of how this can be used for editing…as you can see, it MIGHT be able to do one stream of ProRes SD. Rather, it’s an archive solution. The speed you want is in the transfer. So for the fastest transfer…use eSATA. Or USB 3.
OWC also provided a NewerTech Universal Drive Adapter, a simple cable kit that allows you to connect not only to bare SATA drives, but also to even older drives, such as PATA drives that don’t have the typical SATA connection, but rather two long rows of pins. This is very handy, as I do have a few PATA drives that I used for archiving. This allowed me to access those drives, and copy the information to more current drives…ones much younger that will last longer. So, ‘yaay’ to that. This kit only has a USB 3 connection, but most if not all current computers have those, and it’s plenty fast. A nice and simple kit…works with 3.5″ and 2.5″ drives.
Why this rather lengthy post for a couple of drive dock products? Well, I didn’t just want to go “Here’s this thing you can put a drive in and here’s how fast it works….blah blah boring tech talk blah blah.” I like to give real world examples of how you might use the products…how I use the products. So the main takeaway from this is BACK UP YOUR STUFF! In two locations. And clone your OS before you update…just in case. And here’s a darn good tool to do all of that stuff with.
As for the review products…OWC is letting me keep them. Because they say the cost of returning and restocking them won’t match the resale value. And no, this didn’t sway my review. I happen to love the new drive dock…much better than the USB 2 one I already owned.
GoPro Hero cameras are everywhere lately. It seems like there isn’t a production I am working on that doesn’t utilize this camera in some way. They are mounted in cars to either see the driver and passengers, or aimed at the road. They are mounted on backhoes as they dig, mounted on drills as they burrow into the ground. They are mounted on people as they do crazy things. They get angles that you normally cannot get.
First, let me mention the three models currently available from GoPro:
Hero 3 White Edition can shoot video at 1080p30, 960p30 and 720p60, and 5MP photos at up to 3 frames per second. It can shoot timelapse from half second to 60 second intervals. It has built in WiFi, and can work with the GoPro WiFi remote or a free smartphone app.
Hero 3+ Silver Edition does all that, but shoots up to 1080p60 and 720p120, and shoots still photos at 10MP up to 10 frames per second.
Hero 3+ Black Edition does all that the Silver Edition does, but adds 1440p at 48fps and 960p at 100fps, as well as 720p100 and 720p120. It also shoots in ultra-high resolutions, going to 2.7K at 30fps and even 4K at 15fps. And it has an option called SUPERVIEW, which enables ultra-wide angle perspectives. It can shoot 12MP stills at 30 frames per second. All cameras have built-in WiFi and work with the remote, or smart phone app, and all perform much better in low light situations than their predecessors.
For this post, I was provided with a Hero 3+ Black Edition camera, and a slew of accessories. What is really handy about the Hero 3+ is that it can shoot in a wide variety of ways that might suit various aspects of production. For example, the ultra-high frame rates it shoots make it great for smooth, conformed slow motion shots. The ultra-HD frame sizes it shoots allow for repositioning shots in post, to focus on the areas of interest. They all can be controlled wirelessly from an iPhone or Android device with a free app…and you can change the settings in those apps far easier than with the in-camera menus.
OK, so the GoPro Hero 3 line of cameras prove to be very useful cameras, enabling you to get all sorts of useful footage. But the point of this post is to showcase workflows for ingesting the footage into various edit applications so that you can take advantage of these advanced shooting modes.
AVID MEDIA COMPOSER
Let me start with Avid Media Composer, only because that is what I have been using the most lately. If you set up the camera to shoot in normal shooting modes, like 1080p30 (29.97), 1080p24 23.98 or 720p60, then importing is easy. Simply access the footage via AMA, and then transcode to DNxHD…either full resolutions like 145, 175 or 220…or an offline codec like DNxHD36, DV25 or 15:1 so you can cut in low resolution, and then relink to the original footage and transcode to a higher resolution when you go to online.
First, go FILE>AMA LINK and you’ll get the following interface. Select the clips you want to link to:
Once you have all your clips in a bin, go to the CLIP menu and choose CONSOLIDATE/TRANSCODE:
If you shot 720p60, so that you can use the footage either at normal speed or as smooth slow motion in a 29.97 or 23.98 project, then you need to first import the footage into a project that matches the shooting settings…720p60. Then copy the bin over to your main project and cut the footage into the sequence. You will note that the footage will appear with a green dot in the middle of it, indicating it is of a different frame rate than the project:
The footage will play at the frame rate of the project, or you can adjust it to smooth slow…take all of the frames shot and play them back at a different frame rate. First, open the SPEED CHANGE interface, and then click on the PROMOTE button:
That enables more controls, including the graph. When you open the graph, you’ll note that the playback speed is different. If you shot 60fps and are in a 29.97 project, then the percentage will be 150%. Change that number to 100% and now the clip will play back in smooth slow motion.
If you shot at a higher frame rate and want it to be slow motion…say 720p 120fps, then you’ll have to use the GoPro Studio app to convert that footage. The cool thing about that application is that it’ll conform the frame rate, and convert the frame size to suit your needs. I’ll get to that later.
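The arithmetic behind all of these conforms is the same, whatever the NLE calls it in its UI (Avid’s Timewarp percentage is its own representation of this). If every source frame is played back at the project frame rate, the perceived speed is just timeline fps divided by source fps. A quick sketch:

```python
# Generic conform math: play every source frame at the timeline rate.
# perceived speed (%) = timeline_fps / source_fps * 100

def conform_speed(source_fps: float, timeline_fps: float) -> float:
    return timeline_fps / source_fps * 100

print(conform_speed(59.94, 29.97))    # 50.0  -> 720p60 in a 29.97 project
print(conform_speed(59.94, 23.976))   # 40.0  -> 720p60 in a 23.98 project
print(conform_speed(119.88, 23.976))  # 20.0  -> 720p120, 5x slow motion
```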
NOTE: You can edit the footage native via AMA. When you bring it into the main project and drop it into the timeline, it’ll be 60fps, or 120fps (note the image above of the timeline and green dots…those are AMA clips, thus why one shows 119.8fps). So when you promote to Timewarp and adjust the percentage, it will play in slow motion. But know that editing MP4 native in Avid MC is anything but snappy. It will cause your system to be sluggish, because there are some formats that Avid MC doesn’t play as smoothly as it does its own native media.
One trick you can do is to AMA the GoPro footage, cut it into the sequence, promote to Timewarp and adjust the playback speed…and then do a Video Mixdown of that. Then you’ll have a new clip of only the portion you want, slowed down. The main issue with this trick is that any and all reference to the master footage is gone. If you are doing an offline/online workflow this might not be the best idea. It’s a simple trick/workaround.
Now let’s say you shot a higher frame size, such as 2.7K or 4K, and you want to reframe inside Media Composer. First thing you do is use AMA to access the footage. But DO NOT TRANSCODE IT. Once you transcode, the footage will revert to the project frame size…1920×1080 or 1280×720. Avid MC does not have settings for 2.7K or 4K. I’ll get to the workaround for that in a second.
Once you add the clip to the timeline, you’ll notice it has a BLUE DOT in the middle of the clip. Similar to the GREEN dot, except where green indicates a frame rate difference, blue indicates frame size difference. If you then open the EFFECT MODE on that clip, FRAME FLEX will come into play.
You can then use the Frame Flex interface to reposition and resize the shot to suit your needs. If you shot a nice wide shot to make sure you captured the action, Frame Flex will allow you to zoom into that action without quality loss. Unlike zooming into footage using the RESIZE or 3D WARP effects on regular 1080 footage.
One drawback is that you cannot rotate the area of interest. The other is that you cannot convert the footage to an Avid native format at the larger frame size…something I mentioned earlier. So you can either work with the 4K MP4 footage native…which might prove difficult, as Media Composer doesn’t handle native MP4 footage smoothly, much less at 4K…or do your reposition and then a VIDEO MIXDOWN. The mixdown will “bake in” the effect, but at least the footage will now be Avid media:
ADOBE PREMIERE PRO CC
The workflow for Premiere Pro CC is by far the easiest, because Premiere Pro will work with the footage natively. There’s no converting when you bring the footage in. Simply use the MEDIA BROWSER to navigate to your footage and then drag it into the project.
(the above picture has my card on the Desktop. This is only an example picture. I do not recommend working from media stored on your main computer hard drive.)
But I highly recommend not working with the camera masters. Copy the card structure, or even just the MP4 files themselves, to your media drive. Leave the camera masters on a separate drive or other backup medium.
So all you need to do is browse to the folder containing the media and drag it into the project, or drag the individual files into your project. Bam, done.
CHANGE IN FRAME SIZE
Ok, let’s say you shot 720p60…but you want to use your footage in a 1080p project. When you add the clip to the timeline, you’ll see that it is smaller:
That’s an easy fix. Simply right-click on the clip, and in the menu that appears select SCALE TO FRAME SIZE:
But what if you want this 720p 120fps footage you shot to play in slow motion? Well, that’s very easy too. Right-click on the clip in the Project, and in the menu select MODIFY>INTERPRET FOOTAGE:
Then in the interface that appears, type in the frame rate you want it to play back at. In this example, I chose 23.98.
Done…now the clip will play back slow…even if you already have it in the timeline.
FINAL CUT PRO X
Importing is really easy; File > Import > Media. You can either work natively, or choose the OPTIMIZE MEDIA option. Optimize media will transcode the footage to ProRes 422.
You get a nice box to import with an image viewer.
Now, as I said before, you can work with the footage natively, but I’ve found that GoPro footage, because it’s H.264, likes to be optimized. I haven’t worked with GoPro native extensively in FCPX, so I cannot attest to how well it works compared to how it does in Premiere Pro CC. Premiere has the advantage of the Mercury Engine and CUDA acceleration with the right graphics cards.
OK, so to transcode all you need to do is right click and choose TRANSCODE MEDIA:
Get these options:
You can create ProRes master media and proxy media at the same time if you wish. Or just full-res optimized media (ProRes 422), or just proxy media (ProRes Proxy) that you can relink back to the masters when you are done editing, or transcode to full-res optimized media when you have locked picture. When you create the optimized media, or proxy, the frame rate of the footage is retained.
When it comes to speed changes, unlike FCP 7 and earlier that required you to use CINEMA TOOLS, you conform the GoPro footage internally in FCPX. As long as you set the timeline to the desired editing frame rate, 23.98 for example, then you can conform any off frame rate clip to it by selecting it and choosing Automatic Speed from the retime menu.
OK, let’s say you shot 4K, but want to use it in a 1080 or 720 project. FCPX has what is called Spatial Conform. When it’s set to NONE, clips go into a timeline at their native resolution. For example, a 4K clip will be at 100% scale, but will be zoomed in. All you need to do is scale back…to something like 35%…to see the entire 4K image.
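The scale number is just the ratio of the timeline width to the source width. A small sketch (the frame sizes are the usual published ones…GoPro’s 4K is 3840×2160 and its 2.7K is 2704 wide, if I have those right):

```python
# Scale-to-fit math when Spatial Conform is NONE:
# scale (%) = timeline_width / source_width * 100

def fit_scale(source_w: int, timeline_w: int) -> float:
    return timeline_w / source_w * 100

print(f"{fit_scale(3840, 1920):.0f}%")  # 4K into a 1080 timeline -> 50%
print(f"{fit_scale(3840, 1280):.0f}%")  # 4K into a 720 timeline  -> 33%
print(f"{fit_scale(2704, 1920):.0f}%")  # 2.7K into 1080          -> 71%
```

So “like 35%” lines up with a 720 timeline; in a 1080 timeline you’d be closer to 50%.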
GoPro STUDIO
All right, let’s take a look at the tool that GoPro provides free of charge…GOPRO STUDIO. I use this application quite a bit, not only to pull selects (only portions of clips), but also to convert the footage into an easier-to-edit codec. H.264 works OK in Premiere…better if you have CUDA acceleration. But my laptop doesn’t enable that, so I choose to use the CINEFORM codec that GoPro Studio transcodes to. I also use it to convert higher frame rates for use in Avid Media Composer…like I mentioned earlier. If I have a 120fps clip, I cannot bring that directly into Avid and transcode it at that same frame rate. So I will convert it here first, to match the frame rate of the project…then AMA link and transcode.
Importing is easy. In the main window, on the left side, simply click on the “+” button to import the clips. Grab as many clips as you want. When you click on a clip to select it, it opens in the center interface, which allows you to mark IN and OUT points…if you only want portions of the clip:
To adjust the speed of the clip, click on the ADVANCED SETTINGS button. You’ll be presented with the following interface:
In here is where you change the speed to what you want. Simply click on the frame rate drop down menu and choose the one you want:
You can also remove the fish eye distortion from the footage if you want.
If the speed change is all you need, then click on ADD TO CONVERSION LIST and be done with it. But since the 120fps frame rate is only available at 720p, and most of my projects are 1080, you can also upconvert the frame size to 1080 in GoPro Studio. And the conversion is pretty good. For that, go into the Advanced Settings again, and in the Frame Size drop-down menu, choose the frame size you want:
If you want to convert 720p 120fps to 1080p 23.98, then the settings would look like this…I also removed FishEye:
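For reference, here’s what that conversion works out to, numerically…a sketch only (GoPro Studio does the real work, and I’m again assuming the NTSC-friendly rates behind the labels):

```python
# 720p/120fps -> 1080p/23.98: how much upscale, and how much slow motion.
src_w, src_fps = 1280, 119.88   # 720p at GoPro's "120fps" (assumed rate)
dst_w, dst_fps = 1920, 23.976   # 1080p at 23.98

print(f"Upscale factor: {dst_w / src_w:.2f}x")      # -> 1.50x
print(f"Slow motion:    {src_fps / dst_fps:.1f}x")  # -> 5.0x
print(f"1s of capture -> {src_fps / dst_fps:.1f}s of slow-mo on the timeline")
```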
So there you have it. Some of these workflows are just the basics, others go into more detail. But I’m sure there are lots more tips and tricks out there that some of the more “power users” of the edit systems employ. My hope is that these tips will enable you to use your GoPro Hero cameras to their fullest.
(Thanks to Scott Simmons (@editblog on Twitter) of the EditBlog on PVC, for helping me with the FCPX workflow)
A GoPro Hero 3+ Black edition was provided to enable me to test various aspects of the workflows. GoPro was kind enough to let me keep the unit, enabling me to shoot family activities in super slow motion, or in “Ultra HD” resolutions. It was used to shoot a sledding outing, including a couple of crashes…and a couple of cat videos. Those weren’t interesting enough to post…my cats are boring.
It’s time. It’s time I took my 2008 MacPro out of regular use and started using my newer computer, a 2012 non-retina MacBook Pro. Why now? Well, I was onlining a series for MSNBC using Avid Symphony (in 64-bit mode), and working with some effects took longer on the tower than on the laptop. Rendering the show took longer on the tower than on the laptop. Some tasks lagged on the tower, but not on the laptop. By the end of the series, I was finishing everything on the laptop.
Now, I do happen to have some things on the laptop that helped. The MacPro has a Kona 3 card, and I happen to have an AJA IoXT (and a BMD Ultrastudio Mini Monitor) that connect to the laptop via Thunderbolt. So that part is covered. The IoXT has Thunderbolt loop-through, so I am able to connect a Thunderbolt-to-DVI adapter and have a second monitor. That leaves me with a Firewire 800 port and a USB 3 port for connecting hard drives (I use the other USB port for a keyboard and mouse). I do happen to have an eSATA-to-USB drive adapter, so I can still connect drives via eSATA, but the speeds aren’t quite the same. Things have been going smoothly thus far with USB 3 and Firewire 800 drives. The onlines I just wrapped up were all on small 1TB USB 3 bus-powered drives.
But if I want to edit larger projects with lots of media, I’m going to have to be able to connect larger arrays to my system…my 4-bay eSATA enclosure, my CalDigit HDOne 8TB tower (next to my MacPro in the above pic). And if I want RESOLVE to really crank out the renders, I’ll need a compatible graphics card. Unlike a MacPro tower, which has extra slots to install such cards…eSATA, MiniSAS (for the HDOne), or a secondary graphics card…my laptop can’t do that. Nor can the Apple iMac, Mac Mini…or the new MacPro. Apple is betting everything on Thunderbolt…as you can see with the six Thunderbolt connectors on that new MacPro. So what are we to do now?
PCIe Thunderbolt bridges.
Because this transition has happened slowly, several companies have come out with these bridges. There’s the Magma ExpressBox 3T, the mLogic mLink R, a couple of options from Sonnet Tech, and more. (Sonnet just announced the first Thunderbolt 2 expansion chassis, with three slots…perfect for the new MacPro, which is shipping with Thunderbolt 2 connectors.) It turns out that a friend of mine had a chassis he was trying to offload. It was to be used in a DIT station he was going to build for a client, but that job never happened. So he sold it to me for half price.
The one I bought is the Sonnet Echo Express Pro II…which is currently discontinued. But Sonnet makes an updated version of the Echo Express, and a few others (full list of supported PCIe cards found here). What’s great about this, and the other options, is that they all have two Thunderbolt connectors. This means you can connect your computer to the bridge, then daisy-chain to a Thunderbolt RAID, an IO device, a second computer monitor (if it’s Thunderbolt compatible), and so on. I have one IO device that only has one Thunderbolt connector, so I can’t use it and an external monitor…unless I add a graphics card to the bridge. But the IoXT has loop-through, the Sonnet has loop-through, and then I connect to the DVI adapter…done.
I unpacked the Echo Express and read the instructions. Pretty simple…take off the cover, add the cards, put the cover back, add power, connect via Thunderbolt to my laptop. No drivers needed…at least not for the bridge. You still need to install the drivers for the cards you install…just like you would need to if you installed them into an older MacPro tower.
Now, when I installed the cards, I had an issue arise. The CalDigit FASTA4 card I installed…it’s a 4 port eSATA card…worked fine. My drive connected to it via eSATA showed up on my desktop. But the HDOne did not. It lit up like it should, but nothing appeared on the desktop, nor in the Disk Utility software.
I emailed both CalDigit and Sonnet tech support, explaining the situation. CalDigit was the first to respond, asking for more details and providing a different driver for me to install. This driver, called THUNDER EXPRESS, was written for their HD PRO 2 tower…and it makes their SAS cards compatible with Thunderbolt bridges. Sonnet emailed next, stating that the PCIe card makers need to write drivers that make the cards “Thunderbolt compliant.” Makes sense…and is exactly what CalDigit did with their driver. I installed it, and sure enough, the HDOne mounted fine. How did the eSATA card work right away? Well, I had downloaded the latest driver from their site, and apparently that one already made the card Thunderbolt compliant.
This might be true for all the bridge options. I do know the mLogic one was designed with the Red Rocket specifically in mind, so the driver might not be needed for that. But since I don’t have either a Red Rocket or the mLogic…I can’t say for sure. All I can say is that if you plan on upgrading to a newer computer with Thunderbolt ports, and you want to bring over some of your devices that connected via PCIe, make sure the manufacturer provides drivers that make them Thunderbolt compliant. Don’t get stuck like I thought I was…when I bought the box and the HDOne didn’t mount, I was pretty frustrated. I hadn’t done my homework properly. Thank goodness CalDigit was on top of things.
(yes yes, I still have that second monitor behind the laptop. I still need it on the tower, and it’s the best place to keep it for now)
(For the sake of this post, I’m going to speak in terms of editing documentary or reality or corporate presentation type projects, not scripted. The approach to music in scripted projects is a little different)
More often than not lately, editors in my end of the production spectrum have been tasked with using library music in our shows. Meaning there isn’t a composer. Well, there MIGHT be a composer, but they simply provide us with stock music, or music used in previously scored shows. Sometimes they might be brought in to score part of a show, so we have some original music. But lately, more often than not, the music us editors add to the cut is THE music that ends up in the final project.
And this brings me to the realization that there are many editors that simply don’t know how to edit music.
This is an issue that pops up all the time, specifically when working on a show with multiple editors. Very often I’ll be watching a cut, and something odd happens midway through a scene or near the end…the music will “jump” suddenly, or shift to a different tempo midstream. When I solo the tracks, I’ll find that either the music simply cuts from one cue to the next, rather clumsily, at the point where the editor wanted to change the mood of the piece, or there’s a simple dissolve joining the parts of the music cue together. I understand how people can have trouble with this, as music can be very hard to edit, especially to a specifically timed scene. The music needs to change when you want it to change. But you cannot accomplish this simply by adding a dissolve.
This will not do.
You need to find a natural cut point in the music. Those typically happen on the beats. And not just any beat…you can’t cut from a downbeat into an upbeat…the beats on either side of the cut need to match. This is VERY HARD to explain in a blog, especially as I lack the music vocabulary to describe it properly. My daughter will be shaking her head in shame right now. But if you listen to music, you’ll hear it have upward movements and downward ones…and beats. You can’t suddenly change direction at your cut, or it will sound odd. You need to find a similar beat to cut to. This will mean that you need to adjust the timing of your cut…for sure. You might need to space your narration further apart, or the dialog, but if you do it right, if you can get it to land on the right beat, the music can actually accentuate the statement someone makes.
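The closest I can get in writing is the math of a beat grid. Here’s a minimal sketch, assuming a perfectly steady tempo (real cues drift, so treat these as starting guesses, then trust your ears)…beat_frames() is a hypothetical helper, not a feature of any editing app:

```python
# Timeline frame numbers where each beat of a cue lands, given a
# steady tempo. Cut points that sound natural tend to fall on these.
def beat_frames(bpm, fps=23.976, bars=8, beats_per_bar=4):
    seconds_per_beat = 60.0 / bpm
    return [round(i * seconds_per_beat * fps)
            for i in range(bars * beats_per_bar)]

# A 120 BPM cue puts a beat every half second, i.e. every ~12 frames:
frames = beat_frames(bpm=120)
print(frames[:8])  # [0, 12, 24, 36, 48, 60, 72, 84]

# Cutting from downbeat to downbeat means jumping in whole bars:
# frames[0] -> frames[4] -> frames[8], and so on.
```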
On documentary projects (and some types of reality), one trick I’ve come to employ is to cut the music to the narration and interviews and recreations right away…in the “radio cut” phase. In essence, really make it a RADIO cut. Make it sound like a piece you might hear on THIS AMERICAN LIFE (don’t know this show? I can’t recommend it enough!). Make it work as a radio show that you later add images to. So first I’ll string together my narration and interviews, then I’ll hunt for the music cue I think fits best and cut it in. I’ll listen to the rough with the music, and if I’m lucky, there’ll be hits or rises that are perfect for when someone says something major. If it takes a while for that hit to arrive, then I’ll adjust the music so the impact happens right after the statement. I’ll need to edit the music. So I’ll see if there is a repeating movement that I can simply cut out, or blend properly. If I want it to have more impact, I might add a sound effect to punctuate it. Or…sometimes another musical element, like a rising cymbal, to slowly signal a coming change in the music.
It’s also tricky when you want to end a section on the impact at the end of a music cue, but it doesn’t “back time” properly to meet up with the first part of the cue. Then you need to get tricky and creative, really hunt for beats that match, and adjust the timing so they match well. And then there are times when you want a cue with one tempo to start…say, a nice slow moment…then boom, cut to a fast-paced exciting moment, and you need the music to blend. And not just with a long dissolve; they might need to have a common beat. At times like this it’s like I’m a club DJ who needs to transition from one song to the next. But while they can adjust the speed of the songs slightly to compensate, I really can’t.
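Back-timing itself is simple arithmetic, even if making the edit work isn’t. A minimal sketch, with hypothetical names, times in seconds:

```python
# To land a cue's big hit on a specific moment, start the cue earlier
# by the hit's offset into the cue.
def cue_start(hit_time_in_scene, hit_offset_in_cue):
    return hit_time_in_scene - hit_offset_in_cue

# Say the hit lives 42s into the cue, and the moment it must
# punctuate lands 55s into the act:
print(cue_start(55.0, 42.0))  # -> 13.0: start the cue 13s into the act

# If this comes out negative, the front of the cue is too long...time to
# hunt for matching beats and cut a repeating movement out of the middle.
```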
This REALLY is hard to blog about. This needs to be heard to be understood.
I guess all I can really convey is: try to blend the music better, and see if you can get a good radio edit of your footage. Having the right cadence, pauses in the right places, and even sound effects that make a point land harder, will really help you figure out the visuals. This is why finding the right cues matters, and why finding music that works just right can take time…a lot of time. Many times I find myself spending more time finding the right music cue for a scene than actually cutting the scene itself. And once I find the right cue, I’ll need to adjust the scene to accommodate it.
Finding the right music cue is VERY important. It is often the difference between a scene working and not working at all. One day your producer might watch your cut and hate it, and the next day love it, and all you did was change the music cue. In a screening not long ago, a new editor had come aboard an existing show and was…new to the show’s style. When his act was screened, we hit one section where the music cue really hit the producer wrong. While it was a moment of celebration, the cue used was…well, he said, “Oh my god, I can’t take it. This sounds like a graduation cue! No…stop it, I can’t watch this scene…I can’t…stop it.” The wrong cue can make a good scene unwatchable.
OK, one final note…make sure your music “buttons.” That means…make sure it ends, not just fades out. It needs to resolve…end on a “bahm BAHM bahm!” or some other musical figure that brings it to a close. It might fade after that…meaning it won’t just cut to silence, but have that last note slowly die off. Unlike many songs you hear on the radio, where it’s just the chorus fading to silence as the song ends (I hate that), the cue needs to have an ending. It needs to button. Watch commercials and documentary shows to see what I mean.
If this isn’t difficult enough, revisions throw a wrench into the works. If we are told to cut a line here, or swap things around…that messes with the music timing. Or we are asked to move chunks of story from one act to another, or swap scenes. Now we need to do real damage control. Blend the music with the new scene, maybe find entirely new cues so they match better, because what you had before no longer works. Fixing that can take a while. Good producers know this, and allow for that time.
That’s it for this blagh post. If I get enough of you commenting, asking “what the hell do you mean? Can you show us what you mean?” then I might be persuaded to make a podcast about this. If I can find a scene that I can show to people. I’ll try to dig something up.
PS – I know one production company that specifically asks if you play a musical instrument, because the job requires a lot of music editing and an understanding of how music fits together. It took a lot of convincing to get that job, as I don’t play an instrument.
Let me start with a preface…I’ve been working with the same company since last December. In that time I’ve worked on four series and one pilot. So I have footage from all those shows floating in my head. Three of those shows are on the same ISIS system where I’m working, so I have access to all that footage.
Which can be dangerous. I’ve been on feature docs that were full of “temp images” ripped from YouTube or some other online resource, and I’ve needed to find replacements…and the sticker price on those shocks the producers. “But that other shot is perfect, and it’s there already, can’t we just use it?” No. Or we can use it, but the quality of the footage is 360×240, and this is an HD show. “Can you bump up the quality to match?” No…I can clean it up, but only so much. And 240 to 1080…that’s quite a leap! There are many reasons you don’t do this.
Today I started doing stuff that would drive me insane if I were the online editor or assistant editor on the show. I’m on a series that just started, so we don’t have a lot of stills and stock footage to draw from just yet. The fact that we started a week early, because we have a very short time before this starts airing, doesn’t help. So I’ve been assigned an act to cut, but have darn little footage to add to it. Normally what I need to do in cases like this is add a slate stating FOOTAGE TO COME and what that footage should be…say, “FIRE” or “SHOVEL DIGGING IN DIRT, CIRCA 1530.” And then I prepare a list of footage needs and give it to my producer and researchers.
But see…slates drive me nuts. I want footage there, even if it’s temp. And…well, I have this ISIS full of footage from other shows, and since I worked on those other shows I know that, for instance, in one series we have a bin full of fire stock footage, and on another show, I know that we have recreation footage of someone digging in the dirt that I might be able to make look like it’s from the 1530’s, even though it’s supposed to take place in the late 1780’s. So I KNOW we have this…but I also know that I can’t use it. Because the producers and researchers can’t track it properly, and some of it was shot specifically for another show. I KNOW I CAN’T USE IT…
…but I do, because I want to see something. I did slap TEMP on it…with the intention of saying “I want something like this, but not this.” But this stuff has a way of accidentally sneaking its way through the show’s edit and ending up in an online, where suddenly we find that we can’t use it and need to replace it (this has happened before).
I emailed my researcher asking, “OK…what will be the consequence of grabbing b-roll from, say, SHOW X for this show? Or a shot of a shovel digging take from SERIES Y? I know I shouldn’t, but here I am, needing to put SOMETHING on the screen, and know ‘Hey, I saw that in X,’ or ‘I know we have this shot in Y and it’ll be almost perfect.’ Shall I throw TEMP on it? Or just not EVEN go there and just slate it?”
I work in documentary TV and film, therefore I see and use stock footage. The latest two TV series I am cutting are pretty much ONLY stock footage. Very little original footage is shot for them, other than interviews and some b-roll of locations or objects. Everything else is sourced (a term meaning “obtained from”) from stock footage libraries, or from past TV shows the network has produced.
So I’m familiar with using stock footage, and with the issues pertaining to it, such as the “rights” to that footage…meaning how it is licensed. Some you can license for one-time use, some for festivals only, some for domestic TV, some for a set number of years…but mostly the networks I work for want it for domestic and international use, in perpetuity (basically forever). And the images you use must be clearable…meaning that you have the rights to show the footage, and the people or things in that footage…everything in the shot.
This is where a big issue arises. Let me give you a few examples:
1) A feature documentary I onlined had a section that talked about the subject’s childhood. What era they were raised in, what part of the country, that sort of thing. Well, at the time the family didn’t have a movie camera (Super 8 was the camera of choice when they were growing up), so they didn’t have footage of their life. Thus we needed to rely on stock footage. So they searched a few companies for what they needed, found some great Super 8 from the area and era they grew up in, and downloaded it. All was grand, until we had a shot pan across a 1960s-era living room, and there, on the TV, was THE FLINTSTONES. This presented a big problem. Sure, you licensed the rights to the film from the person who shot it, but what’s playing on the TV…they don’t have the rights to that. For that, we’d need to contact ABC (the network THE FLINTSTONES aired on) and pay a separate fee.
You know how sometimes in the credits at the end of movies and TV shows, it says “TV SHOW X FOOTAGE COURTESY OF” and then lists the network? No? I guess I’m one of the few who notices that. Anyway, that’s because they got permission, and paid for that permission, and then needed to credit the source. So if we wanted to use THE FLINTSTONES, we’d need to pay ABC, and I’m sure it’s no small fee, and we couldn’t afford that…so…I blurred the TV. Simple solution.
2) I’m working on a TV doc about presidential assassins, and of course the assassination of JFK is featured. Now, the most famous bit of film from that incident is called the Zapruder Film. That’s the iconic 8mm color film shot by Abraham Zapruder that we’ve all seen, and that was featured in Oliver Stone’s JFK. Now, I worked on a Kennedy assassination doc before this, and I know that particular film is very expensive to license. So much so that on the Kennedy doc, we used every single angle of the assassination BUT the Zapruder film.
So here I am on this TV doc, working on a section about Kennedy, when what should I see, from the same stock footage company as in example 1…but the Zapruder film. Now, this company is known for selling the stock footage they have for cheap…cheaper than the competition. So here was this iconic footage, not full frame, but in the center of the picture, about 50% the size of the full frame, surrounded by all sorts of sprocket holes and clutter to stylize the image. Well, no matter how much lipstick you put on it, it’s still the Zapruder film. You still need to pay the Zapruder family that huge fee in order to use this footage on a TV show. Sure, you could BUY that footage from the stock house for cheap, but LICENSING it…that was the problem.
3) Example three comes from the same stock footage company as examples 1 and 2. I’m beginning to see why they are cheap…they must not be staffed with enough people to catch these issues. Today I needed to use footage showing how crystals are used in current, leading-edge technologies. So I used a shot of someone using an iPad. Simple enough, right? Nope…in that shot, the first thing they access is SAFARI, and then the main GOOGLE splash page shows up. Sorry, but if you want to show the GOOGLE page, you gotta pay a license fee. So I look later in the clip, and what do they look up? iPAD! So the next shot is the Apple page for the iPad. Another image we’d need to license.
Dude, what’s up with that? Sell a stock shot that you cannot clear? Someone’s not paying attention.
We did find a better example…someone using the iPad to look at schematics and then a spreadsheet (not Excel), so generic that it worked. That shot was sourced from a different company.
The other issue I have with this same stock footage company is different enough that I can’t call it #4…it’s not about licensing. No, this is about COMPLETENESS…if that is a word. If not, I hereby coin it. Nope, it isn’t underlined in red, so it must be a real word. The issue is that a LOT of the footage this one company has…say, of a crowd cheering, or a car racing down the street, or a forest scenic shot…does NOT have any audio on it. So this crowd is cheering, looking very loud, but is in fact very quiet. The trees in the scenic move in the breeze, but there is no audio of that breeze, of the wind whipping through the trees. There is no traffic noise as the car drives down through Hollywood. That’s bad. That means I now need to pay for sound effects, or look in the grab bag of sound effects I already own, to see if I can build the audio to fit the picture.
This could easily have been avoided if they had just included the audio with the image. You KNOW they recorded it. And audio is very important. If you see an image of people screaming and cheering for a football team, but don’t hear it…even if it is happening under narration or an interview…if you don’t hear it somewhat, it’ll throw you. It’ll take you out of the moment where you are engrossed in the story, and have you wondering why that shot is odd. Why are you distracted by it? Your brain might figure out that it’s the audio. Or it might not, and just send a “something is wrong with this” signal. Audio is important.
Want to know another issue with stock footage? This one has nothing to do with the company in the examples above. NO! This is a website known the world over, and it’s an issue that plagues independent docs, and some TV ones.
YouTube.
I cannot say how many times I’ve worked on a doc, or a show pitch, and been asked to source YouTube videos. People seem to think they are free…public domain. People put this footage up for all to see, therefore we can grab it to use in a doc. Well, no, you can’t. You still need to license the footage from the owner. Even if it is for a single event with a small audience.
This brings up a great example of FREE footage. Footage that you can ask for, and use…for free! And it all comes from our government. NASA footage…all free to use. Now, they might only have low-res versions on the web, but if you call and ask, they will provide you full-quality clips. Why? It’s YOURS! You pay taxes, your taxes pay for NASA…therefore it is yours to use.
Same goes for the Library of Congress. Any images or documents contained within that they own (they store some items/footage that they don’t own, but are safeguarding for the owner, because they are important) are also free. We…the citizens of the United States…own them (remember those taxes!), so it’s free for us to use those images on TV.
OK, back to editing, and searching through bins and bins of stock footage.