I was lucky enough to be invited back and got to talk with Mark Spencer, Steve Martin, Alex Lindsay, Mike Matzdorff and Bill Davis, all of whom are smarter than me.
Anyway, a big thank you especially to Alex Lindsay and Pixelcorps who are doing some really amazing things with Live Streaming on that show. They know their stuff, and I think the format for it is pretty fantastic.
As far as I’m concerned, it’s kind of the model for how to do a user group right… it was a total pleasure to be a part of it. The show was also profiled over on FCP.CO.
Anyway, hope you guys like it as much as I liked being there… and let me know in the comments if you need any clarity on anything we discussed.
I’m noticing RED proxies may no longer be needed in FCPX. So… way back when 10.1 got released, there was a little feature in the release notes that’s actually a big deal, but no one really talks about it… and I’m not even sure anyone really noticed it:
If you have transcoded RED RAW files to ProRes through a third-party application, you can relink to the original RED files within Final Cut Pro.
For me, the Proxy workflow with RED stuff always worked fine… but last week, I did a little test. Basically, I brought some RED files into FCPX, did a quick batch rename, some prep, etc. Then, I went and transcoded out a 1:1 5K ProRes LT file from the Epic footage in REDcine-X. I went back into FCPX to relink from the R3D to the RCX ProRes file… it relinked with no problem.
It would seem that you don’t need proxies anymore to be offline/online with RED footage… you can import your RED files right into FCPX, get prepped, etc… in the meantime, you can be transcoding that same footage through RCX to whatever codec you want (I’d typically recommend ProRes LT for offline)… and then when you’re done transcoding, just relink to your transcodes, edit away, and when you’re done, relink back to your RED files and finish. There should be no downtime, and your relink should be almost as fast as flipping from proxy, except that you won’t be stuck with the ProRes Proxy codec for your offline, and you can work with other non-RED formats in the same timeline in optimized/original mode largely without issue. Kind of awesome.
One small caveat – when you’re transcoding your RED footage, make sure your timecode setting matches the timecode displayed in FCPX. I did a test that had the timecode set to Edgecode for some reason, and it caused some relink issues until I noticed that the timecodes for my ProRes transcodes weren’t matching the timecodes for the RED files in FCPX. Once I was on the right timecode setting, I was able to relink without issue.
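If you want to sanity-check your transcodes before relinking, here’s a rough little sketch (my own illustration, not part of any app) that assumes you have ffprobe from ffmpeg installed. It just prints the embedded timecode of each ProRes file in a folder so you can eyeball it against what FCPX is showing for the matching R3Ds:

```python
#!/usr/bin/env python3
# Rough sketch (not an official tool): print the embedded timecode of every .mov in a
# folder so you can compare it against the timecode FCPX shows for the matching R3Ds.
# Assumes ffprobe (part of ffmpeg) is installed; depending on how the transcode was
# written, the timecode tag may live on the container or on a stream (e.g. a tmcd track).
import json
import subprocess
import sys
from pathlib import Path

def embedded_timecode(movie: Path) -> str:
    cmd = [
        "ffprobe", "-v", "error",
        "-show_entries", "format_tags=timecode:stream_tags=timecode",
        "-of", "json", str(movie),
    ]
    info = json.loads(subprocess.check_output(cmd))
    # Check the container tags first, then fall back to any stream that carries one.
    tc = info.get("format", {}).get("tags", {}).get("timecode")
    if not tc:
        for stream in info.get("streams", []):
            tc = stream.get("tags", {}).get("timecode")
            if tc:
                break
    return tc or "no timecode found"

if __name__ == "__main__":
    folder = Path(sys.argv[1] if len(sys.argv) > 1 else ".")
    for movie in sorted(folder.glob("*.mov")):
        print(f"{movie.name}: {embedded_timecode(movie)}")
```

If a transcode prints a timecode that doesn’t match what FCPX shows for the R3D, that’s your Edgecode-style mismatch right there.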
Anyway, for smaller RED-only projects, I’m still going to use the Proxy workflow, only because it’s so easy and I can transcode in the background right in FCPX and flip modes as necessary… however, for longer-form work where I know I’ll be working offline for an extended period of time, the flexibility of being able to easily relink to RCX transcodes is great.
Now… here’s the million dollar question that I haven’t tried… will this relinking business work with non-RED formats? If anyone has a chance to check, let me know in the comments…
While I’d love it if it supported full 4K DCI (4096×2160), this is still a big step in the right direction for them.
What’s really nice about this is that it’s going to shoot out ProRes, and not some ridiculous, impractical RAW format that’s going to be hard to work with… although it seems from the diagram like they’re not quite supporting ProRes XQ, which is a bit of a drag. That’s the capture format I’d love to see for feature/high-end work.
Regardless, a huge step in the right direction.
Whether you all like the way the ARRI Amira is set up or not… you need to start thinking about how you’re going to shoot and master in 4K. It’s going to become the new standard.
So… I know we’re all excited about the Apple Keynote today and of course IBC…. but in other news…. I just wanted to let you know I’m going to be hanging out with a bunch of really smart people on Wednesday September 10th, 2014 and we’re going to be talking FCPX workflow:
On the show are going to be the usual suspects Steve Martin, Mark Spencer, and Alex Lindsay as well as Mike Matzdorff and Bill Davis, who are both doing some awesome things with FCPX.
Show’s going to be starting at 1pm PDT, and if you tune in live, you can ask us all some questions that hopefully we’ll be able to answer.
The show has a really innovative format where all of the guests are sitting around a roundtable that has 6 screens all connected up automagically to the same footage and computer. I have no idea how they set it up this way, but it makes for a really collaborative, free flowing show that’s a lot different and I think more interesting than what you might see from a typical user group/workflow presentation.
If you missed the first two, they were kind of awesome and you should check them out here:
So… just in case you hate being locked into a subscription/rental model, here’s a list of alternatives to Creative Cloud apps you can use to move away if you find that you want to.
For the record, from the list of the Apps in the chart below, the ones I use in my day to day are: FCPX ($299.99), Motion ($49.99), Compressor ($49.99), DaVinci Resolve (free for Lite/$1,000 for paid), Pixelmator ($29.99), and Logic Pro X ($199.99). All are available on the Mac App Store.
A late addition to the list but still something we love for quick and awesome looking design is Canva.
For the high end jobs I do, when I send off an edit for sound design or VFX, the pros I’m collaborating with are typically using Nuke or Pro Tools… However, I’m not going to count those in the following price comparison, as I don’t use them in my day-to-day and have never needed to buy them.
Anyway, here’s the total cost to buy the Apps I use in my day to day as an editor/colorist (using the Lite Version of Resolve):
About $630
Number of computers I can install these Apps onto from the same Apple ID:
Unlimited
Total cost to rent the Cloud for 3 years, which is what I would consider the typical paid upgrade cycle for software to be:
About $1800
Number of Computers I can install the Creative Cloud on before I have to start deactivating machines:
2
Not only that, but even if I count upgrading to the paid version of Resolve ($1,000), things would still come out cheaper than renting the Cloud over that 3-year cycle.
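Just to show the math, here’s a quick back-of-the-envelope sketch. Prices are the ones listed above; the $49.99/month figure for the full Cloud suite is my assumption behind the roughly $1800 three-year total:

```python
# Quick back-of-the-envelope check of the numbers above. App prices are the ones listed
# in this post; the $49.99/month full-suite Cloud price is my assumption behind the
# ~$1800 three-year figure.
app_store = {
    "FCPX": 299.99,
    "Motion": 49.99,
    "Compressor": 49.99,
    "DaVinci Resolve Lite": 0.00,
    "Pixelmator": 29.99,
    "Logic Pro X": 199.99,
}

buy_once = sum(app_store.values())      # one-time purchases
cloud_3_years = 49.99 * 36              # full Creative Cloud suite for 36 months
with_paid_resolve = buy_once + 1000.00  # swap the free Lite for the paid Resolve

print(f"Buy-once apps:           ${buy_once:,.2f}")          # roughly $630
print(f"Creative Cloud, 3 years: ${cloud_3_years:,.2f}")      # roughly $1,800
print(f"Buy-once + paid Resolve: ${with_paid_resolve:,.2f}")  # still under the Cloud total
```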
In my humble opinion, one system feels like it’s providing a lot more value and flexibility than the other. With the Apps I’m using (all of which are on the App Store), I’ve never had to pay for an upgrade since I bought them, and I’ve never had a problem with a download or had my access to an App I’ve already installed affected by a cloud service outage or for not making a payment (as many Creative Cloud users have experienced).
Also, I’m only paying for Apps I use, whereas with the Cloud model, I’m either locked into single App rental pricing (which at $9.99/month for Photoshop or $19.99/month for other Apps is still going to be more expensive over three years than the most expensive App I’ve listed), or I’m going to have to get the whole suite of Apps, most of which aren’t going to be my first choice for the work that I do.
"For an editor/colorist like me, especially one who is not a After Effects/motion GFX centric user, I simply just don’t have a need for the cloud at all."Sam Mestman
In fact, out of the whole suite of what I consider to be the “creative apps”, I’d only really rate After Effects, Illustrator, and Photoshop/Lightroom as the industry leaders in their respective categories… and for most editors, Motion and Pixelmator are more than suitable replacements for the type of things they’re typically asked to do by clients. For an editor/colorist like me, especially one who is not an After Effects/motion GFX-centric user, I simply don’t have a need for the Cloud at all.
I’m not writing this to get anyone upset or to attack the Adobe suite of products. I actually happen to like a lot of what they’re doing and would be a Premiere user if I wasn’t cutting with FCPX… but I’m not a fan of the Cloud model, and I don’t think it offers a lot of value for users, in general.
So… without further ado, please take a look at the chart below if you’ve been looking for alternatives to Creative Cloud Apps, and let me know in the comments if you think I’m missing anything or there are other apps you’d recommend over the ones I’ve listed:
Most of the vector graphics on this site, including the FCPWORKS logo, were created with iDraw. It’s a legit, less complicated alternative to Illustrator.
I love Pixelmator. By nearly all accounts, it would seem that with the latest update, it’s now a pretty suitable replacement for Photoshop for what 95% of editors do.
If you’re doing High end visual effects work, Nuke is generally perceived as the best there is. It’s not cheap, though. I think After Effects would still win based on price vs. performance.
I’m a Motion user and I love it. For most motion graphics tasks that FCPX editors need, Motion is fantastic. For more specialized tasks, After Effects is the way to go… until you graduate to something like Nuke. But if you just need to make some nice looking Motion Graphics stuff quickly, Motion is the fastest, most intuitive thing out there because of how closely it integrates with FCPX.
From everyone I’ve spoken to who has used it, Smoke is extremely powerful and is a full-fledged editor/finishing station. It’s great for the graphics/effects-centric editor, and lots of people love it. There’s a pretty steep learning curve with it, though.
Not a huge fan, mostly because it doesn’t play well with any other Apps outside the Avid ecosystem (something Adobe Apps do really well). However, for high-end studio/union editors, Avid is without a doubt the industry standard, even though the code/interface/workflow/business model is archaic and outdated.
Resolve is now pretty much the industry standard… that’s really all there is to say about it. Pretty soon, it could also be a legitimate NLE competitor to the big three (Apple, Adobe, Avid)… but right now, it just happens to be the best color correction software on the planet in terms of price/performance.
Pro Tools is the industry standard for Sound editing for movies. I really wish it had more competition because I think it would be good for the industry, but there really isn’t much. When it comes to doing heavy sound editing for picture, Pro Tools is currently the best there is and it’s not much of a debate.
Personally, I don’t think Logic is great for film/video sound design/editing, but it is fantastic for scoring/mixing/music creation/podcasts, which is typically how I’m using Logic… although I’ve got a long way to go before I really become competent with it.
So… while that comparison article I listed is a deep dive… the bottom line is that both programs work just fine for what you’ll likely need to do with them.
FCPX’s metadata workflow, once you combine it with tools like Shot Notes X and Lumberjack, is light years better than anything you can do with Prelude. You can do metadata entry very easily in Resolve as well.
Maybe I’m missing something, but outside of RED RAW workflows where you can actually import that RAW footage natively into your NLE and work with it easily (and get real time playback), I literally don’t see any advantage to using a camera codec that isn’t ProRes in this day and age.
It made sense when the onboard processors and memory capacities were low enough to require codecs optimized for compression speed and image quality but not so much for playback and editorial. Nowadays, that’s just not the case. If something is a RAW format, I don’t want to mess with it if it takes up endless amounts of disk space and it won’t import natively/play back in my NLE.
The image flexibility that RAW provides on the finishing end becomes counterproductive if I’ve got to go through an elaborate conform/maintain a gigantic archive. It’s just not practical. Digging into the RAW is mostly for sending to VFX and to correct mistakes. It shouldn’t be necessary on a fundamental level. I’d rather have properly exposed, correctly lit ProRes XQ masters. Those are more than enough for color correction, VFX, or keying.
XDCAM, AVCHD, and all the other formats are certainly not better for editing than ProRes is, and it gets really annoying that you need a special plugin or application to unwrap video you’ve shot so you can watch it. At the very least, if you’re going to design a codec, don’t make playing it back a difficult thing for the average person. You should be able to just hit the space bar from the finder and watch your clip.
AJA Cion Camera with ProRes
I realize I’m howling at the moon… but the truth is that when it comes to editing, there isn’t a better, more versatile codec than ProRes. It would be nice if I could just start with those files. ARRI, AJA, and BMD have the right idea with their cameras (all can record natively to ProRes)… would be nice to see the other camera manufacturers follow suit… or at least come up with a coherent explanation as to why they insist on their proprietary codecs that don’t in any way help the end user.
Our man in Wisconsin, Dustin Hoye (FCPX Editor of The Next Bite), is back again making our lives a little easier with this video about the little understood, hidden away in a submenu, but really useful “reimport from camera archive” option:
One small thing to add to this tip – basically, if you ever find media not linking back up properly for clips imported from cameras that require a plugin to import (usually Canon or Sony), hitting “reimport from camera archive,” pointing it at the original media (make sure it’s still in the original card structure), and letting the import process finish completely will usually solve the problem.
In general, best practice while things are importing (especially if FCPX is making wrappers of clips in the background) is to not do anything that might make your system angry while this process is happening. I realize this is vague… but if for even one second you question whether something you’re about to do while things are importing/waveforms are generating might make FCPX angry… don’t do it. Wait for your import to finish and your progress bar to get back to 100%, and then do the thing you were thinking about.
And if you do make FCPX angry and you have to force quit for some reason and then find your footage now needs to be relinked… you now know what the “reimport from camera archive” button does, and your blood pressure can go back to its normal level.
Was fortunate enough to spend some time with Phillip Hodgetts from Intelligent Assistance last week, and we had a longer discussion about Lumberjack, his amazing tool for real-time logging and pre-editing in FCPX.
I had watched him demo it, and had even taken it out for a test drive with our How We Make Movies podcast a couple weeks ago… and I had some misconceptions/issues on the first go-round with it.
Anyway, there were a few things I noticed as I talked with him…
First, for some reason it had taken me forever to finally work up the nerve to try it out on a real project… it was like I was afraid of it somehow. The concept had seemed easy enough to me… but there was something about it I couldn’t wrap my head around… and I realized that while talking with him I was actually just over complicating everything… I was making the app and the process harder than it needed to be.
Second, things that are obvious to Phillip Hodgetts are not obvious to the rest of us. After him answering two of my questions, literally everything else about the app made sense.
Third, I had some misconceptions about the app (for instance, I thought that wireless was required to do logging) that simply aren’t true.
The Content Creation Date Sync thing is ridiculously easy – Forget timecode and slates, etc. if you’re using Lumberjack. Just tell your cam ops to set the clock on their camera to the same time as the Lumberjack app and you’re good to go. If you want a really good insurance policy, you should have all your camera ops make their first shot of the day an image of the Lumberjack logger screen (where the exact time is listed). You should have nothing else to worry about after that (there’s also a little clock spot-check sketch after these notes).
Sync your multicam clips in FCPX first and then send to Lumberjack – I asked him whether you could have the metadata applied to multicam clips… the answer is yes… what you should do is bring your footage into FCPX first, make the multicam clip, and send the XML containing the multicam clip to the Lumberyard app, and it will apply your logging info to the multicam clip.
The Lumberjack Logger (web) is different from Lumberyard (the OS X app) – While you’ll be doing all of your logging through the web/iOS app, you’re still going to need the OS X app to do the XML interchange.
You can still log if something happens to your wireless – Basically, if you’re experiencing problems with slow wireless (like I did on my first shoot), as long as you create your event ahead of time, you can still log through the iOS app, get done what you need to get done, and hook up to the Lumberjack server when your wireless situation improves. You are not a prisoner of your wireless network while using Lumberjack. Just use the iOS app. This was a big misconception for me.
You can use Lumberjack after you’re done shooting or with old footage – The IA guys just put into beta their new Backlogger app, where you can log footage you’ve already shot (or catalog video masters you’ve finished for things like promo departments).
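And since the whole system leans on camera clocks matching the logger, here’s the little spot-check sketch I mentioned above. It’s purely illustrative (not part of Lumberjack), assumes ffprobe is installed and that your camera writes a creation_time tag it can read, and just reports how far each clip’s embedded clock is from a reference time, say the time shown in your Lumberjack slate shot:

```python
#!/usr/bin/env python3
# Rough sketch (not part of Lumberjack): report how far each clip's embedded
# "content created" clock is from a reference time, e.g. the time shown on the
# Lumberjack logger screen in your slate shot. Assumes ffprobe is installed and
# that the camera writes a creation_time tag it can read.
import subprocess
import sys
from datetime import datetime
from pathlib import Path

def creation_time(clip: Path) -> datetime:
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-show_entries", "format_tags=creation_time",
        "-of", "default=noprint_wrappers=1:nokey=1", str(clip),
    ], text=True).strip()
    # creation_time is usually ISO 8601, e.g. 2014-09-10T13:00:00.000000Z
    return datetime.fromisoformat(out.replace("Z", "+00:00"))

if __name__ == "__main__":
    # Usage: python clock_check.py /path/to/cards "2014-09-10T13:00:00+00:00"
    folder, reference = Path(sys.argv[1]), datetime.fromisoformat(sys.argv[2])
    for clip in sorted(folder.rglob("*.mov")):
        drift = (creation_time(clip) - reference).total_seconds()
        print(f"{clip.name}: {drift:+.0f} seconds from reference")
```

If a camera shows up more than a few seconds off, you know which card to double-check before you start matching logs to footage.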
Anyway, for all the info you need about getting going with Lumberjack, check out the newly published Quick Start Guide… and if you have any problems, Phillip and Greg are awesome with support.
If you haven’t checked out Lumberjack yet, and you’re doing a lot of Doc/reality/non-scripted work… it’s probably going to become your new best friend.
There’s no reason to be afraid of it. It’s easier to learn than you think.
Imagine you’re a company that has two products in the same product category. One is the beginner product designed as an introductory tool that is also powerful enough to be used by most hobbyists and is perfectly fine for the average person. It’s easy to learn, simple to use, free, and widely used by millions.
Your other product is designed for professionals. It has a completely different methodology from your beginner tool and requires people graduating from your introductory tool to completely relearn everything they had learned from that tool in order to use this new, “professional” tool.
Most people would read the above and conclude this was a stupid business strategy. There’s no good reason that your two products in the same product category should be so different. Not only that, but in terms of the bigger picture, a smart company would design their long-term plans around the product that has the most users, and build from the product that is accessible to the largest number of people.
In case you haven’t figured it out… I’m talking about what Apple did with iMovie and Final Cut Pro. iMovie has WAY more users than Final Cut Pro 7 ever did, and the potential for far more long-term growth. It was the obvious platform to build on top of.
Essentially, the “pro” editors wanted Apple to have their students spend grades K-12 learning everything in English, and then when those students went to “college”, all of their courses would be taught in French.
Most outsiders would think that was a really dumb idea. However, the professional editing community loves to throw around the iMovie Pro insult like it’s actually, you know, insulting. If you want to see how contentious it can get, check out some of the comments here and here.
Your introductory tool should be something that paves the way for users to graduate to the more advanced tool. They should not need to relearn everything they had already learned in order to become “professional”.
The truth is that some of these “Pro” editors simply do not see the bigger picture. When they call FCPX iMovie Pro, what they don’t realize is that they’re actually complimenting Apple for having a competent business strategy and common sense.
My guess is that most of these “Pro” editors will understand what Apple had in mind when all of the iMovie kids start showing up at post houses and start asking the “professionals” why they can’t do all the things on their “professional” software that they’re doing on their home computers.
For many people who have been around for a while, it will be eerily similar to what happened when the original Final Cut Pro became popular and an entire industry was caught off guard.
What’s really ironic is that the people who are complaining the loudest about FCPX are the Final Cut Pro 7 editors. I kind of feel like they should know better. They seem to not like the taste of their own medicine.
I find it all a bit hypocritical. Things change, and tools change… but in terms of the changing of the guard… the more things change, the more they stay the same.
Over this Labor Day weekend break, Videomaker (via Creative Cow) serves up a true blast from the past: a preview of the then-revolutionary new non-linear editing systems. These days, with all the fervor over Final Cut Pro X and whether it’s professional enough compared to other NLEs, it’s good to look back and recall what the world of non-linear editing was like just 20 years ago. Some choice quotes:
After making your edit decisions, the computer transfers each scene from your original master tape to the record VCR. Hence the final product is never actually digitized. This system offers the speed and flexibility of non-linear editing and the uncompromised image quality of analog tape.
At this point, tape is the archival format you might pull some stock footage from, or something an older camera might still be shooting to. But really, have you actually seen a tape deck in years? DV or Analog?
When the register finally rings, the average price for a basic non-linear package comes in between $5,000 and $10,000. For stand-alone systems that approach broadcast quality, expect to spend twice that much or more.
Final Cut Pro X is $299 from the Mac App Store, and you can get a pretty decked-out Mac Pro starting at around $4-5K that’s capable of cutting 4K footage at feature-film quality. “Broadcast quality” as a term is rarely even thrown around anymore because it’s just an assumption made with most NLEs. You can still work ‘offline,’ say, if you’re cutting proxies for a feature film to be finished on 35mm film negative. But these days the workflow easily exists to cut 4K or 5K or 6K original media throughout your edit and print out to DCP.
What if your work will never end up on videotape, but instead on a computer hard drive or CD-ROM? Non-linear is certainly the best option. Find a system that records video and audio files supported by your distribution format. Many non-linear software packages can export either Microsoft’s Video for Windows or Apple’s QuickTime formats.
Wow, some terms in there we still use, but just barely. Video for Windows was folded into several other APIs and initiatives for Windows over the years (ActiveMovie, anyone?), while the .AVI container somehow refuses to completely die off.
To read more about the history of the future of non-linear editing, please visit Videomaker’s original piece here.
And if you want to go even further back in the past check out this 1990 video from the Computer Chronicles show on the Video Toaster: