
Little Frog in High Def

Adventures in Editing

Archive for January, 2013

If you happen to follow me on Twitter, you were no doubt privy to the barrage of tweets I sent while at the LACPUG meeting on Jan 23. Dan Lebental was showing off his cool editing app for the iPad, TouchEdit, and I live-tweeted interesting points he made, along with pictures I took.  I’d like to go a bit more in depth here.  More than 140 characters for sure.

The reason this app came about is because Dan bought an iPad and when he moved the screen from one page to another…he went “hmmm, there’s something to this.” And then he would browse through his photos, moving them about with gestures of his hand like he would if he were holding them, and he said, “hmmm, there’s something to this.” Eventually he figured out that this could mimic the tactile nature of editing film: being able to grab your film strips, move them about, and use a grease pencil to mark your IN and OUT points. So he went out and found a few people to help him develop this. No, he didn’t do it on his own, he went through about 14 coders (if I’m remembering right) to eventually come up with version 1.0 of his software.

Who is this for? Well, he honestly said “This is designed for me. For what I want…my needs.” And I like that attitude. Because if you like something, chances are you’ll find someone else that likes that something. And that is a great way to develop a product. To fulfill a need/want/desire that you might have.

Anyway, moving on.

He showed off the basic interface:

The film strip above is the source, and the film strip below is the target…your final film. Now, the pictures of the frames don’t represent actual frames. You don’t need to advance to the next picture to be on the next frame…that’s just a visual reference to the film. Slight movement advances the film frame by frame…and there’s a timecode window on the upper left (sorry for the fuzzy picture) and the clip name on the upper right. So you can see what clip you have, and what the timecode is. You’ll scroll through the footage, or play it, until you find the section you want, and then mark your IN and OUT points. To mark an IN point, you swipe your finger UP on the frame you want, which leaves a grease-pencil-like mark. Now, the pencil mark won’t be on the frame you selected, it will be on the frame BEFORE the one you selected. Because you don’t want grease pencil on your actual frame. A swipe down marks the OUT point, and then you drag the selection down into the target where you want to put it.

There are a couple big “V” letters to the left of the footage on the timeline. The big “V” means you are bringing over audio and video. Tap it to get the small “v” and you will bring over only picture.

When you do this, you’ll note that your cut point, where your footage was dropped into the timeline, is marked with a graphic depicting splicing tape:

One thing to note too is that the GUI (graphic user interface) of the film appears to run backwards when you play or scroll it. That’s because it mimics the way actual film moves through a KEM or STEENBECK editor. Really meant for the film people. But if it’s too distracting, Dan said he would take all comments on the matter and could make playing the opposite way an option.

OK, Dan flipped the iPad vertically and the interface changed:

Now we see just the source strip, and 8 tracks of audio. This is where you’d be doing your temp audio mix to picture. And with the tap of a button…

And you have a mixer, to allow you to adjust your levels.

I did mention that I felt that 8 channels wasn’t quite enough for the temp mixes I was required to do. He replied that he could perhaps add a second bank of tracks so that you could then have 16…or 24…or 32. This is a possibility on later versions.

BINS.

Dan didn’t call them bins…he said the more accurate term was “collections,” as they are the place that holds the collection of clips you have to work with. That area looks like this:

There is also the main project window. That, interestingly enough, does look like a bin type thing, with film strips hanging down, representing your projects. The strips are a graphic only…the actual projects are listed below them in the window:

IMPORTING

Here is the import interface:

There’s even a help menu for importing:

Importing footage can be done via iTunes Sharing, iPad Video (which is called Photos on the iPad) or Dropbox. For maintaining metadata you use iTunes Sharing or Dropbox, as iPad Video tends to drop some metadata. The footage can be low resolution proxies, like 640×360 MP4 or H.264…or full resolution…but in a format that the iPad can work with…thus MP4 or H.264. So you can use the app as an offline editing machine, or for editing your project at high resolution for exporting to the web straight from the device.

STORING YOUR FOOTAGE

The question I had for Dan was…how do you store the footage? Well, it’s all stored on the iPad itself. There currently are no external storage options for the iPad. So you are limited in the amount of footage you can store at one time. How much depends on how compressed the footage is. A lot at low res, not much at high res. Yes, I know, VERY specific, right? Specifics weren’t mentioned.

I did ask “what if you are editing, say, THE HOBBIT, and have tons of shots and takes…a boatload of footage. What would you do then?” His answer was “Well, you can have the footage loaded in sections…for certain scenes only. Or have multiple iPads.” I pictured a stack of iPads in a bay….one with scenes 1-10, another with 11-20, and so on. Not altogether practical, but the loading of sections seemed OK. And Dan did have three iPads present, including a Mini…so he might just be headed that second way. (joke)

Dan mentioned that he loaded an entire indie movie on a 64 gig iPad at 640×360 with room to spare.
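To put that in perspective, here’s some back-of-the-envelope math on how much proxy footage fits on an iPad. The bitrate here is my own assumption (a 640×360 H.264 proxy at roughly 2 Mbit/s), not a number from Dan or TouchEdit:

```python
# Rough storage math for proxy editing on an iPad.
# ASSUMPTION: a 640x360 H.264 proxy runs about 2 Mbit/s total
# (video + audio). TouchEdit didn't publish specifics.

def hours_of_footage(capacity_gb: float, bitrate_mbps: float) -> float:
    """Hours of footage that fit in capacity_gb at bitrate_mbps (megabits/sec)."""
    capacity_megabits = capacity_gb * 8 * 1000  # 1 GB ~ 1000 MB, 8 bits per byte
    seconds = capacity_megabits / bitrate_mbps
    return seconds / 3600

# A 64 GB iPad, keeping ~16 GB free for the OS, apps and headroom:
print(round(hours_of_footage(48, 2.0), 1))  # → 53.3
```

At those assumed numbers you’d get roughly 50 hours of dailies on a 64 gig iPad, which squares with Dan fitting a whole indie feature with room to spare.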

EXPORT

It eventually gets to a point where you are done editing…now what? Hit the export button and you have a few options: Export final MOV to iTunes sharing, export the final MOV to Dropbox so you can share it with others, export it to your PHOTOS folder, or export the FCPXML to iTunes Sharing or Dropbox.

FCPXML, you ask? Yes, that is the current way to get the “edit decision list” out of the app and have you reconnect to the master footage. It exports FCPXML, meaning it interfaces with FCP-X…but that is only in version 1.0. The TouchEdit folks did mention that a future update, version 1.1, will feature FCPXML input/output and AAF input/output (AAF support is for Avid). Good, because I was wondering how you’d edit a feature film on your iPad and then deal with it outside of FCP-X. That limitation is just temporary…other options are in the works. But Dan did say that the application is based on AV Foundation, and not QuickTime…so that points to working tightly with FCP-X…and working well with future Apple OSes.
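For those who haven’t peeked inside one, FCPXML is just XML, so the relink information is easy to inspect. Here’s a sketch that pulls clip names and source paths out of a heavily trimmed, hypothetical snippet (the real FCPXML schema has far more elements and attributes than shown, and the asset names and paths here are made up):

```python
# Sketch: reading clip references out of an FCPXML-style "edit decision list".
# The sample below is a simplified, hypothetical fragment, NOT a full FCPXML file.
import xml.etree.ElementTree as ET

sample = """<fcpxml version="1.1">
  <resources>
    <asset id="r1" name="scene01_take03" src="file:///Volumes/Media/scene01_take03.mov"/>
    <asset id="r2" name="scene01_take07" src="file:///Volumes/Media/scene01_take07.mov"/>
  </resources>
</fcpxml>"""

root = ET.fromstring(sample)
# Relinking to master footage starts with the asset list: clip name -> source path.
assets = {a.get("name"): a.get("src") for a in root.iter("asset")}
for name, src in assets.items():
    print(name, "->", src)
```

This is the kind of lookup an NLE performs when you reconnect the iPad edit to your full-resolution masters.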

In addition to all of this, TouchEdit has partnered with Wildfire Studios in Los Angeles. Wildfire is providing a large sound effects library to TouchEdit free of charge in version 1.0. You heard it…free SFX. In version 1.1 or 1.2, TouchEdit will add an SFX store where you can buy effects rather cheaply.

TUTORIALS

Yes, there are already YouTube tutorials on the TouchEdit YouTube Channel, to get you up and running. Nice guys…thinking ahead!

COMPATIBILITY & PRICING

TouchEdit works on any model iPad 2 or higher…including the iPad Mini. And it will be available in early February for a price of $50.

Let’s start off 2013 with a review of a really cool product…the AJA T-TAP.

Until recently when you wanted to send a signal to an external monitor from your edit system, you needed to get an “I/O Device.” I/O meaning “In and Out,” and device being either a card that you installed internally on a tower computer, or an external box…or combination of the two. These devices allowed one to capture incoming video signals (from tape or directly from cameras or switchers), and output video signals (to client and color correction monitors). In the age of tape this was the way to get footage into your system.

But in the current age of tapeless capture, the “I” part of the “I/O” is no longer needed. All we want/need/desire is output to a client monitor…or broadcast color correction monitor. So instead of shelling out $500 to $8000 for an I/O device…you can get the AJA T-TAP for a mere $299.

The device is remarkably simple. It connects to your computer via Thunderbolt (so unfortunately it won’t work on Mac Pro towers or PC towers as they lack this connection type) and then outputs full 10-bit video via SDI or HDMI with 8 channels of embedded audio. And it’s so small, it can fit into a small compartment in your backpack, or in your pocket, and allow your edit system to be very lightweight and mobile. The T-TAP is also very versatile. It is compatible with the three major editing systems: Avid Media Composer 6 and 6.5 (and Symphony), Adobe Premiere Pro CS6 and Final Cut Pro (X and 7). Unlike other options that AJA has, the audio out of this device is only available via HDMI or SDI, so you will have to monitor audio from the client monitor, or patch audio from that monitor to your mixer…depending on the edit software you use. FCP 7 and Adobe Premiere Pro allow you to route audio through the computer speakers, while Avid Media Composer locks the audio output to the device.

The T-TAP supports resolutions from SD (525i NTSC and 625i PAL) all the way up to 2K, and frame rates of 23.98, 25, 29.97, 50 and 59.94.

I ran three real world tests with the T-TAP, and had great success with all three tests.

First…the out-of-date, end-of-line Final Cut Pro 7. After I installed the driver I got a call from a client to do changes to a sizzle reel that I had cut in FCP. So I opened it and worked on it for two days. With FCP 7, I was able to play audio out of my computer headphone jack directly into my mixer. The video offset was similar to what I got with the AJA Kona 3 and AJA IoXT. The video output was very clean…similar to what I get from other I/O devices. And I got all the flexibility of output I have come to expect from this…now discontinued software. It worked well.

Next I tested it with Adobe Premiere CS6. For this I used it with a family video project. Now, prior to this I hadn’t used an I/O device with CS6. I had tried to use CS5.5 with the AJA Kona 3, and it was less than solid. You had to use custom AJA settings, and I could see the Canvas (program monitor) output, but not the Viewer (preview). I had used CS6 to edit, but not to monitor externally. So when I launched it with the T-TAP attached, I was very pleasantly surprised to find that it worked, and worked VERY well. No longer did I need custom AJA settings; the base T-Tap driver and Adobe plugin were all that I needed, and I got a solid signal from CS6. Viewer, Canvas…zero latency and no audio drift. No slowdown in performance. It simply worked, and worked well. And like with FCP 7, I could either monitor audio via the T-Tap, or route it through the direct out (headphone jack). It was perfect.

The final test was with Avid Symphony 6.5. And this was a full-on, frying-pan-to-fire test. I was hired to do a remote edit…travel to the location to edit footage being shot on location, and turn around the edit in one day. The shoot was tapeless, shot with XDCAM EX cameras. The footage came in, I used AMA to get it into the system, and then edited on my 2012 MacBook Pro, and I monitored externally via the T-Tap and the hotel’s HDTV. For the first part of the edit I didn’t use the device, I did everything with the laptop. That’s because Avid locks the audio output to the AJA T-Tap…meaning that audio followed video, and I’d have to monitor audio via the HDTV. A tad difficult as it was bolted to the dresser. Unlike FCP 7 and Adobe Premiere CS6, I couldn’t choose an alternate output for the audio. So I did the initial edit without the T-Tap, but when it came time to show the client my cut, I connected it to the TV and was able to play back (with zero latency and no frame offset) for the client at full quality. All while I was confined to the really small hotel table. My computer, hard drive and T-Tap barely fit…but nothing was really crammed in, there was elbow room. And the edit went smoothly.

Unfortunately I did not test this with FCP-X, as I do not have that on my system. However, I do know that it works with FCP-X, and the latest update of FCP-X and the T-TAP drivers make external viewing very solid.

Bottom line is…the AJA T-Tap is amazingly simple, and simply works. It’s great no-fuss, no-muss video output for the major editing systems. The simplicity, the price point, the small footprint and the flexibility of this little box make it a must-have in my book. It works with any Thunderbolt-equipped Mac and is perfect for low cost, high quality video output monitoring. AJA has a reputation, and track record, for compatibility and stability…and that tradition is carried on with the AJA T-TAP.

(NOTE: The T-Tap review unit was returned to AJA after a 4 week test period).