High Definition Video for Independent Filmmakers
A How To Guide for Digital Filmmakers
Welcome all! This is my blog to share my latest research,
thoughts, etc. on utilizing HD for independent filmmaking.
YES, I am available for consulting
Contact me at email@example.com
All content copyright 2004-2007 Mike Curtis.
Saturday, April 29, 2006
I got home from NAB at about 4:30am on Friday morning, and slept the sleep of the dead...no, the sleep of zombies run over with a steamroller. Got up around 2pm and didn't do much but cook dinner and watch War of the Worlds on DVD (doesn't hold up to the Big Screen experience at all).
So what's next?
-Rita Sanders was kind enough to come in and help organize bins from The Texas Shootout while I was gone
-in a very Rube Goldberg-esque manner, the XDCAM HD cartridge that went home in a camera was handed to a person who took it to NAB who gave it to another person whom I'll have lunch with this week to get it back. Then I just have to get my computer, an XDCAM HD deck, and that cartridge in the same place at the same time.
-in the "The Universe Taunts Me" category, I was at NAB and saw 24p footage from the Sony F350 XDCAM HD, Canon XL H1 and JVC GY-HD100U HDV camcorders being cut natively in FCP 5.1....but I don't have it. I'm debating whether to wait until that ships, or just DEAL WITH IT in the meantime. Probably the latter.
....all of which leads me up to this: it is time to start going through all that footage and doing some actual analysis of the over 500 clips that I have (6 cameras, 24p, 24pA, 50i, 60i, live, from "tape", etc.) rather than just flipping through them and saying "that's interesting." The goal is to go through all the footage, document some hard numbers, do some analysis, render some opinions, render out a lot of 6-up comparisons, and make that footage available on SD and HD DVDs for purchase.
-other major tasks - I need to migrate the server for HD For Indies from the freebie/buddy server I've had to a "real" server, that'll be able to handle some of the new tasks I want to throw at it
-I want to do some serious testing of the stack of PCI-X, PCIe, AJA and BMD HD-SDI cards I have and see where the differences might lie.
-oh yeah - and go through the few hundred photos and 168 audio notes I have from NAB and post all that stuff up, so that's probably my first priority since it is a matter of timeliness about all that info.
-just as a "I just thought of this" point, I'm kind of glad there aren't major new OS, NLE, or camera options shipping imminently that would obsolete all this research I want to do - which I'd bet will take over a month.
-also along those lines, since the "technology demonstrations" that Apple showed in the booth were of minor improvements (support for new cameras), and there was ZERO mention of a Final Cut Studio v5.5 or v6.0, I'd think that a significant new version is some distance away - if they were going to ship this spring/early summer, they would have previewed it I'd bet.
So I'm probably taking tomorrow off to do some Catch Up On Life stuff, but Monday, I'll split my time between server migration and starting to go through all the pics and stuff.
But feel free to check the website Sunday, I might post some stuff. Check often and compulsively. Click reload lots. And click on some ads on the right if you see something interesting if you feel like that too (or not).
...is an interview James Masters did with Ted Schilowitz of Red Digital Cinema Camera Company on Day Two of NAB. Ted talks about deliverables, Jim Jannard's role and involvement in the company, etc.
Scroll down and look for:
and click on the words "Quicktime Movie"
Friday, April 28, 2006
I'm sitting bleary eyed in McCarran Airport in Las Vegas, plane is set to board in half an hour.
Today was the blitzkrieg assault on NAB for what I hadn't seen yet. I missed a lot of stuff, but saw a lot of stuff too. Flipping through my pictures, I see that I visited the following. More to follow, just touching on what I saw, it'll take me some time to write it all up. So consider ALL of this Late Night Beer Talk until I can go back through my audio notes and confirm stuff. If I get stuff wrong, tell me but don't bitch at me - this is all off the top of my head! And yes, I should be posting links to all this, but it is now (as I revise and edit) 1am and I'm on an airplane at 37,000 feet travelling hundreds of miles an hour. Even if the WiFi signal got up here, I doubt I'd be able to surf for free on an unlocked network for very long!
Stuff I saw on NAB 2006 Day Four:
Silicon Color - makers of Final Touch HD that I am intimately familiar with but need to finally write a review of - recently released Final Render, their distributed rendering product. It is shipping and everything.
-ADTX makes very high speed fiber channel RAIDs - I think I recall seeing 585 MB/sec - wow! Uses ATTO 4 Gbit cards for PCIe Macs (other choices too, but that works).
-S.two has a new packaging of their field recorder, they call it S.two Take2, it costs something around $50K. There is also a docking station and ADIC tape storage, either SDLT 600 or LTO3, I can't recall which. It does two simultaneous backups, the idea is one drive mag to two identical backup tapes. Slick, solid workflow, kinda pricey, is that the only way to do it?
-LaCie has some new itty bitty storage options that are bus powered, 160-320 GB, but my image is corrupt and I can't see much. I'll have to refer to my audio notes. And my &*^($&^$ card in my camera is corrupting images, so I'm losing a bunch of my pictures and don't know it until I try to review them. Time for a new chip! They got some award too - I have it in my notes somewhere - for the Little Big Disk (that new little bus powered guy - and it's cute!)
-EditShare is a hardware/software NAS solution that runs on GigE for shared workgroup stuff for Mac/PC. They specifically are addressing the workgroup edit needs of sustained throughput, etc. Definitely need to follow up with this, could be lowest cost workgroup editing solution I've heard of (except for that 10GigE Small Tree solution I saw at MWSF but haven't followed up on).
-Panasonic was showing a 103" 1080p plasma HDTV. Wow. Damn. Gimme!
-also a model "only" 65 inches across (about 5 1/2 feet!)
-the FireStore DTE FS-100 is shipping. This is the gadget that plugs into the FireWire port on your Panasonic AG-HVX200 and mostly replaces the functionality of a P2 card, but the price/GB is vastly better. $2200, holds 100 GB, and ALL recording modes go onto it at 100 mbit, so it is inefficient in its storage. Got an imperfectly clear answer as to whether you'd have to manually remove 3:2 pulldown or not when shooting 1080p24, etc. Guy thought so, not sure.
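Quick napkin math on those FS-100 numbers (a sketch using only the figures quoted above - 100 GB, a constant 100 mbit write rate, $2200):

```python
# FireStore FS-100 back-of-envelope (figures from the text above).
CAPACITY_GB = 100       # stated capacity
DATA_RATE_MBIT = 100    # all modes recorded at 100 Mbit/s
PRICE_USD = 2200        # stated price

seconds = CAPACITY_GB * 1000 * 8 / DATA_RATE_MBIT  # GB -> megabits, / Mbit/s
minutes = seconds / 60
price_per_gb = PRICE_USD / CAPACITY_GB

print(f"~{minutes:.0f} minutes of recording")  # ~133 minutes
print(f"${price_per_gb:.0f}/GB")               # $22/GB
```

So roughly two and a quarter hours per fill, at $22/GB.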
-Panasonic's AJ-HDX900 is very much like the SDX900, except high def. I think it writes 1080p23.98, but with 3:2 pulldown. Jeff Somebody, who explained Varicam pulldown to me so well two years ago, was back. Tape, not P2.
-Saw a P+S Technik Mini35 adaptor on an HVX200...but it had tape around the lens joint. For real, or a mockup?
-Ikegami Editcam - no 1080p23.98 supported at this time, so no indie filmmaking, but records to DNxHD codec, so great for Avid users, suxors for everyone else. Records to a hard drive.
-Grass Valley Infinity - I think I missed it...Doh!
-JVC GY-HD200 and GY-HD250 - they BOTH record 720p60 in HDV....but still using 19 mbits for 60fps, not 30fps - so a lower data rate per frame, which scares me in terms of image quality. They are downplaying that, saying they'll probably go with a 10 or 12 frame GOP (so obviously that is still up in the air). Due September, $8Kish for the 200, more for the 250. 250 also available in studio trim. 250 has an OPTIONAL S16mm lens adaptor, ability to flip image at the chip (for when using the adaptor), and has HD-SDI and timecode out BNC ports. Oh! And RUMOR (totally unconfirmed) has it that it is essentially the same codec - so the "sticktion" problems of pixels sticking in patterns between GOPs in subtle motion may still be there, and that is a darned shame, because that is one of my quibbles (not lethal, a quibble) with that camera.
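The per-frame budget concern is easy to quantify (a sketch assuming, as described above, that the same 19 mbit HDV stream has to carry 60 frames instead of 30):

```python
# Same 19 Mbit/s stream, twice the frames: each frame gets half the bits.
STREAM_MBIT = 19.0  # HDV 720p data rate from the text above

def kbit_per_frame(fps):
    """Average bit budget per frame at a given frame rate."""
    return STREAM_MBIT * 1000 / fps

print(f"720p30: {kbit_per_frame(30):.0f} kbit/frame")  # ~633
print(f"720p60: {kbit_per_frame(60):.0f} kbit/frame")  # ~317
```

Half the bits per frame is exactly why the GOP structure they pick matters so much.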
-talked to guy from red rock micro about his 35mm lens adaptor he was showing on the JVC cameras. Adaptor, rods, and follow focus for around $1600. Pretty cool, I'll have to catch up with him on this product. Nice guy.
-JVC working on a 24" 1920x1080 production monitor, the DT-V24L1D. Boy, that just rolls off the tongue...
-Doremi still makes DDRs, and they are still expensive. Looks like very competent, professional gear, but in the back of my mind I know I'm still holding a grudge against them for their snarky attitude towards me in the booth last year. Not fair to them, I know, but it makes it hard for me to like their product/be totally fair.
Along those lines, DAMMIT, I realize now that I missed out on getting over to RaveHD to talk to Ramona, who is a total princess of sweetness, and their attitude is very open about client feedback and changing the product to suit the client's needs. They have a new version that I have press releases on but I really wanted to talk face to face, she was very cool last year. If I had to pick a company to work with between those two, RaveHD would win in a heartbeat based on price and attitude.
-Thomson IS making a modified Viper body, but it's no big deal - mostly to make it more robust to mount more heavy gear on it, maybe some ergonomic stuff. But nothing radical.
-oh yeah - the Sony F900R is lighter and has HD-SDI on it, but was largely built to be a green product that they can sell and be legal according to the stricter European regs. A Sony rep, who was manning the F900R station, said that if you put it side by side with an F900/3, it'd look the same.
-saw it the other day - the Sony 1080i camera that runs at 180 fields/sec (yes, 180 fields/sec) is $270,000. Yeeeeeehaaaaaw! Sign me up for five of 'em. Yeah, right. Oh, and they can do 720p180 as well (and 1080i150 and 720p150). But for the networks to shoot high speed on the football field, these should do great. Output goes down a big fatty cable (I'd guess) to a big honkin' box of hard drives, roughly double microwave sized. Arnold Schwarzenegger in his prime couldn't have shot handheld with this on his back. Not a portable solution AT ALL.
-the Thomson/Grass Valley Venom flash memory (no drives) recorder for the Viper is out, is about $50K, don't recall the capacity but hope I got it in my notes.
-the ARRI D-20 is getting rented out these days, around $3000/day, 3 day weeks. It does have an optical viewfinder/mechanical shutter arrangement, and BOY that makes a difference! Even with the 1080i viewfinder on the Panasonic HDX900 (which is B&W anyway), the clarity is a tremendous difference, and this is a BIG boon for the operator. ARRI is doodling with recording RAW at full 2800ish by 1400ish (I forget the 4:3 image sensor res, I'm doing this all off the top of my head) RAW feed to a modified Quantel eQ or iQ like device in a road case for recording, realtime playback of RAW and grading on set. No price yet, just experimenting and showing off the idea. This is a good thing, though. No plans to ever sell the D-20 at this point in time. They slap a rebadged Venom on it and give it an ARRI-esque name. Forgot to ask max frame rate, but take everything I say about this as potentially wrong until I confirm with their guy that he said it right and I remember it right. Consider this beer talk that might be wrong, in other words, as I write this now. Lots of cool modes on their modified viewfinder. If the ARRI guy is reading this - I said I'd check with him before publishing, sorry not doing that, but all are duly warned this is NOT verified, OK? Later stuff will be verified.
-ReflecMedia is cool stuff - grey cloth with reflective microbeads (like the shiny stuff on your running shoes that reflects headlights directly back). That combines with a ring of LEDs around the lens (in three different sizes to work with different lenses) that are either green or blue. Since the blue or green light reflects directly back, it works amazingly well (or at least it looks that way). When I was looking off-axis at it, it looked flat grey. But looking at the image from the camera, which had the ring of lights, it looked pretty darned blue. More on it later.
-Vision Research's Phantom is a very interesting high speed camera line - a big one with a 65mm sized imager that'll do 125 fps, and a smaller one (35mm sized I think?) that'll do up to 1000 fps at HD res. Much more on this later, very very interesting. But $100-$125K for the HD model, $200-$225K for the bigger one. Records the raw output from a single sensor CMOS. They showed some super slomo (like 1000s of fps) from their other industrial cameras in the booth that had clearly been color corrected, and it looked pretty damned good, so this bodes very well for the future of this product. I wish them all the best - Connie from the company was earnest and open about the fact that they have mostly done industrial cameras in the past (and for over 10 years I think), and they are just showing these cameras for the first time here.
-Most Wicked Tech Award goes to the mk-v.com (that URL may be wrong, could be pron for all I know as I write this offline) for their incredibly bespoke bit o'gear - it is a steadicam rig that lets you rotate the arm on a pole and keeps the camera (mounted on a horizontal beam about 6 feet long) up as you rotate it around. Hard to describe, amazing in action. About $80K, but obviously an incredibly specific bit of gear.
-Miranda has the DVI-Ramp2 that does DVI to HD-SDI conversion.
-JL Cooper has a Final Cut Pro control surface with jog/shuttle and a bunch of automated fader switches. Very slick in action. Also a smaller jog/shuttle and a bunch of buttons. Looks like serious gear, more robust and client presentable than the Contour Designs' ShuttlePro v2 (which I own and use).
-Cobalt Digital makes an HD analog component to HD-SDI converter for about $1200 I think it was. No DVI to HD-SDI.
-Teranex has dropped the price on their big stuff to around $50K; it does realtime dust removal and user guided scratch removal. Supposed to do a very high quality upres, but I didn't have time to watch (got the 3 minute demo at 3:41 when the show closed at 4 and I had lots left to see).
-Quantum makes SDLT 600A, which is like their SDLT 600 tape drive, but with a GigE port and an FTP server on it (for one root user at a time - tape seek times are up to 6 minutes, not 6 milliseconds like a drive!). About $7K, and uses MXF wrappers (doesn't transcode/alter/degrade video content). Can even grab just sections of a clip based on timecode. Very interesting idea. Stuart English from Red booth talked to me for a few minutes about it and asked what I thought, said an interesting concept but I needed to mull on it - would it be stable, reliable, around for a while? What was the break even point as compared to just backing up projects to drives? 300GB per tape. Sounds like some possible workflow advantages, but my paranoid self wants to investigate the alternatives.
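One rough way to frame that break-even question (the $7K drive and 300 GB/tape are from above; the tape and hard drive prices are placeholder assumptions of mine, not quotes):

```python
DRIVE_COST = 7000   # SDLT 600A (from above)
TAPE_GB = 300       # capacity per tape (from above)
TAPE_COST = 80      # ASSUMED price per SDLT tape
HDD_COST = 150      # ASSUMED price of a comparable ~300 GB hard drive

# Each 300 GB backup saves (HDD_COST - TAPE_COST) on tape, but the tape
# path has to amortize the cost of the drive itself first.
break_even = DRIVE_COST / (HDD_COST - TAPE_COST)
print(f"Break-even after ~{break_even:.0f} backups "
      f"(~{break_even * TAPE_GB / 1000:.0f} TB archived)")
```

With those made-up prices it takes on the order of a hundred backups before tape pulls ahead on cost alone - so the workflow advantages had better carry their weight.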
-stopped by Ciprico (now owns the Huge brand; Medea is now owned by Avid, but I forgot to ask about'em during my interview the other day), they are making some KILLER fast storage, like 300 MB/sec plus from a single 10 drive 4Gbit fibre channel array (I think that is right, not sure), and they striped several of some sort of RAID together and got over 600 MB/sec. Better yet, they have what seemed a prototype (no pricing set yet), with 40 (yes, 40) 2.5" SATA 7200 rpm 100GB Hitachi drives, 4 groups of 10 drives, throughput (don't know what RAID level) in the 1.3 to 1.4 GB/sec range. Yeah - gigaBYTES, not gigaBITS, per second. Wow. That'd be sufficient to do the full res output of the Red camera RAW I think at max size, bit depth, and frame rate based on my napkin math. I recall something about an Infiniband to fiber bridge for speed. Infiniband is FAST - 20 gigabit/channel - so that's what, 2.5 gigabytes per second per Infiniband port? Fast stuff, bodes well for the future.
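For the curious, here's the napkin math on whether ~1.3 GB/sec would cover raw 4K - the sensor resolution, bit depth, and frame rate below are illustrative assumptions of mine, not published Red specs:

```python
# Uncompressed single-sensor (Bayer) RAW data rate - assumed numbers.
WIDTH, HEIGHT = 4096, 2304  # ASSUMED ~4K sensor
BITS = 12                   # ASSUMED bits per photosite
FPS = 60                    # ASSUMED max frame rate

bytes_per_frame = WIDTH * HEIGHT * BITS / 8
gb_per_sec = bytes_per_frame * FPS / 1e9
print(f"~{gb_per_sec:.2f} GB/sec")  # comfortably under 1.3-1.4 GB/sec
```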
In the last few minutes of the show, I was practically sprinting around the show with my map folded to show the last few booths I wanted to hit. I did a hit and run at Flip4Mac, just enough to snap a pic of their booth graphic and steal 90 seconds of time from a very tired but polite rep who answered my last few questions. It looks like in about a month there will be an import module to pull Ikegami Editcam footage in (and transcode I guess? Gotta look it up on the website) and also one for XDCAM HD (but Sony will have their own import module - or are they merely licensing the one from these guys?). Both modules will be about $500 apiece, due in about a month.
Rumor around the campfire from VERY unofficial sources (other showgoers talking to third party vendors) is that the next version of FCP might come out this summer. I could buy that for the support of new formats (the technology demonstrations of support for 24F mode on the Canon XL H1, 24p mode on the JVC GY-HD100U, and Sony's app to import XDCAM HD and edit it natively), but that doesn't feel at all like a major release. Maybe the next release is a 5.2 or a 5.1.1 or somesuch - that's the next RELEASE, not the next VERSION. The vibe _I_ got from talking to the oh-so-please-don't-quote-me-on-that Apple folks I talked to was that a new version would be out before too long, my own PERSONAL interpretation of that was that I might see it in the May/June timeframe, but that's just my personal guess based on them saying that technology demonstrations preceded shipping versions by a not very great amount of time. We'll see, we'll see.
Actually, after the show ended I was walking back to the RED booth to collect up my stuff and on the way passed by Matrox and realized I hadn't checked out this MXO thingy that I'd heard some buzz about. The folks there were nice enough to give me the 60 second scoop, and I asked for an eval once it shipped, so maybe that'll be coming my way. So here's what it does - it's a Mac (PC later) hardware device that plugs into the DVI output of your computer and converts the output to an SDI or HD-SDI signal. 720p, 1080i, 1080p maybe in future releases (all configurable hardware). BUT, unlike some other products that I've seen in the past or heard of, it ALSO has a software component - a QuickTime module that takes the QuickTime output and routes it to the DVI output to go to their hardware gadget for SDI/HD-SDI output. They say it does interlace correctly, and if it is doing what I think it COULD and MIGHT be doing, this sounds like a really, REALLY nice idea that could make it possible to do good quality output from even a laptop. So if your ingest is FireWire based, this could let you capture then monitor and then output to high quality stuff. I forgot to ask/look to see if it also has analog SD/HD component output, which is what would make this product killer. Oh, and it's $1000, waaaaaaay below other similar products. I think I recall something about gamma and white point control, BUT that might be me mixing it up in my head with another product. It is nearly 1am on the plane as I write this. Late night beer rumors, remember? Don't take any of this as gospel.
Somebody passed on a second hand story about how at a FCP user group my name came up (the good part) and a speaker dismissed me as a BlackMagic rep or on the BlackMagic side/team/something, with the possible implication that I was not to be trusted or was biased or something (the not so good part). To set the record straight - I worked for BlackMagic in their booth at last year's NAB, no secret there, and for doing so I got some "store credit" that I used to discount the price I paid for the Multibridge Extreme I bought - which was the same deal the other folks got working the booth who weren't BMD employees (mostly resellers). I wrote a review that went in some magazine I think, they have me on their site as a user story, and I had previously purchased some other BMD cards (at the time because they cost less than AJA products, and I am tight when buying gear for myself), so I have a tendency to write more about BMD than AJA - because I use'em regularly, and because I check for new versions of drivers and stuff. I have some AJA gear that I'm going to be running through its paces, and I plan on a big ol' head to head review/comparo thingy in the not too distant future of all these bits o'gear, and I intend to be as non-biased as I can. But I like the AJA gear just fine, used it daily without major problems at the color correction business that I'm no longer involved with for about 6 months, and it's totally solid gear (that was a Kona2 for those keeping score).
Their gear is solid, I love their UI to control stuff, and I get along fine with those folks - I just worked with Ted for a week with Red (he no longer works with AJA), but we've been totally friendly and cool over the last coupla years I've known him, and I hope to develop a rapport with Jim Thorne, who is his replacement at AJA and whom I got to briefly say howdy to at the FCP User Group meeting Wednesday night (where Red totally crashed the party, coming on stage and giving away a Red "r" - the paying sponsors must have been miffed when we took the stage and got wild, jubilant cheers from the crowd - I'll post pics and vids, it was OH SO MOST EXCELLENT to be the Rock Stars).
Interesting conversation with Grant Petty of Blackmagic Designs - he sees their market more as the indie content creators, and small shop creative professionals in general. Talking to Ted Schilowitz (formerly of AJA, now of Red), and hearing AJA messaging over the last year, they tend to focus their message more towards the broadcast and business type of clientele. This is the first time I'd considered these two companies weren't aiming at the exact same type of users (although there is certainly overlap in their product lines).
Also, ran into Lisa from hdrvfx.com in the Red booth - they sell HDR panoramas, reflection maps, stuff like that for 3D rendering. All their stuff is real world shot, not 3d rendered, and is purchasable onesy-twosey off their website, not on a big honkin' CD or DVD with zillions of other things you don't want (great thing about the web - make purchasing a hands off operation, no human intervention required, makes selling things in smaller quantities totally doable). Pricing starts low, I recall something was $4, something else was $13 to start. Available in a variety of resolutions, sounds good for those needing these niche products. Keep in mind these are HDR - high dynamic range - so it takes about 30 photos to make a stitched panorama with multiple exposures taken at each viewing angle. Haven't seen any of their stuff yet, but she gives good vibe as a reasonable human being (not everyone at the show did, BTW), so worth taking a look.
I ever so conveniently, late in the day, stumbled across eCinemaSys' booth, and was that a stroke of luck - I got there right as they were starting their last presentation of the show, and I had to sign an NDA (having to do with European patent stuff, so I got a "private viewing" not a "public demonstration"), and blazed into the darkened tent they'd set up in their booth and plowed right into Martin Eurejian (sp?) himself - The Man That Is The Company. He has a new, 40" LCD based critical color monitor that was doing some nice very low level blacks at a true 24p (dunno 23.98 vs 24.0, press releases coming next week), he said it was either 10 or 12 bits at the pixel level of real world response (I can't remember which, will have to check my notes), but that even after gamma and calibration settings should get at LEAST 1000 gradations of real world performance (so therefore must be 12 bit to be cutting DOWN to 1000 levels). Wow. Shipping sometime later this year, can't remember when. Martin's a very smart guy and has industry credibility, LCDs have been hard to make Do Right for color critical work, he's doing some of the most bleeding edge work out there.
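The bit depth arithmetic behind that guess is simple enough:

```python
# A 10-bit panel tops out at 1024 code values; if gamma and calibration
# settings eat headroom and you still want 1000+ real gradations, a
# deeper pipeline (12 bits = 4096 values) is the safer bet.
for bits in (10, 12):
    print(f"{bits}-bit panel: {2 ** bits} code values")
```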
CineTal has a 24" 1920x1200 pixel LCD monitor for color work, I talked for about 5-10 minutes with a couple of their guys; their product range covers from about $8K up to about $30K for 24" LCDs with various features (some of which are pretty hard core, like frame stores, built in wave/vectorscopes and such). They come with presets to calibrate for both rec 709 (HD) and for the DCI stuff.
Back at the Red booth, I ran an errand and got to meet Les, The Dude who owns Cooke lenses, and got to see a P+S Technik 35mm adaptor for the first time, face to face (seen picky-churrs, never held one). It's a heavy and solid thing, heavier than the redrock product (not that means one is better than the other, just observing here).
And DRAT! I also missed out on all holographic storage, and I really wanted to check into that. Sigh - that's the tradeoff of working 2 or 4 days at a humongously huge show.
They did have a MyNAB website that I was "too busy" to go look at, but I saw a sign at the show that it helps you find what you want, even helps plot a map or something - it would have been worth the half day it would have taken me to go through all that, because just like time on set is invaluable and never replaceable once it is gone, time during NAB is the same way - you need either sleep, or partying (excuse me, "networking"), or to be hittin' the show full on, and it's VERY tough to find time to get organized, or the potentially bad decision to blog during show hours, or even find time to keep in touch with the rest of your world back home during times like these. The right way to hit NAB? You make a list of what you KNOW you want to hit, then you use the MyNAB to find other things of interest, then you map out a course to efficiently work through the halls (it took 20+ minutes to go fetch something in Central from South hall), and you mark up your map with destinations circled, names written, and even then it'd be tough to see everything you want over 4 days, so you triage and prioritize - see the most important Day One, and work your way down to Day Four, which is follow-up for new questions you thought of staring into space in the shower, and seeing those things that would "be nice to catch" rather than "I'll be hacked if I miss it" stuff. Leave a little time for catching the neat new stuff that there's buzz about, like for me the crazy Revolution steadicam thing and the Phantom camera I hadn't heard squat about until Jarred Land from dvxuser.com clued me into it (and it was Jendra that told me about the Revolution, for that matter).
OK, I think that is enough for now - I'll probably post this around 5am Austin time from home, right before I crash out.
I'll be going through notes and posting bazillions of pictures related to all this (unless my damn memory card decides to eat them all) ASAP, but I gotta roll back into my life once I get home - I've been gone for over a week now.
(edit - it's 4:35, got home 10 minutes ago, time for posting and BED!)
Thursday, April 27, 2006
Went to Apple booth - it's better to not have a press badge showing, just slows things down for what I want to ask.
Went to Avid booth - had a nice long interview with Xpress Pro product guy, I hope my audio recorded interview is more than just murk.
Same for the nice long chat I had with Matt Dowlong from BlackMagic about their product line, and specifically their new under $1000 PCIe card that does EVERYTHING except 4:4:4 (but does do analog HD component in, among other things). It sounds very very similar to the AJA Kona LHe.
Went to Grass Valley and found the Infinity camera, spent 10 minutes until he raised his hands when I asked about frame rate stuff and he said "Stop - it's not a film-like camera." (meaning no 24p). So I said thank you very much and that was it for that camera.
Went to Sony booth, saw lots, too much to say here. Vegas is 8 bit pipeline, still.
Went to Adobe booth, saw no new product except for some Adobe Reader stuff that was new to me.
In their booth, I had a long talk w/lots of pictures and video about the Silicon Imaging camera, which also won an award. It's a 2/3" CMOS sensor image block that sends the RAW output down a GigE connection to a WinXP Embedded computer, is the gist of the thing. Some cool UI stuff, more later.
What else? Autodesk has high end toyz.
Went to the Final Cut Pro user group, ran into Greg Bernstein, haven't seen him in at least 7 years! We worked together in San Francisco 10 years ago. Hi Greg! I also met Shane Ross, the Little Frog in High Def guy. Along those lines, I've run into tons of folks I've only emailed or heard about. It was also incredibly gratifying all through the show to be referred to as, and I quote exactly: "Hey!...you're the HD for Indies guy!" Everyone has been super nice, capped off by the incredibly kind gentleman from Venezuela that brought me a bottle of rum all the way from Venezuela just as a thank you for doing the blog.
"Ah, shucks folks (looks down and kicks dirt with toe of boot in the dust of the corral)" is about all I can say - everyone's been great and nice and extended a very warm welcome, and I really, really appreciate it.
One eye closing itself now, gotta crash.
I tried to blog this one from the show floor, but I couldn't get online. But this morning, a package was dropped off at the Red booth, and opening it up, Ted discovers that Red won a major award at the show - the AIM Award for Innovation In Media for Content Creation (or similar verbiage to that effect).
Score one for Red! Ted told a hilarious story at dinner about how he stole the RED t-shirt off Mark's back and zipped over late to receive the award at some big industry luncheon - so it was a row of grey suits, then Ted and JoAnne Yu (who is of course, whip sharp as well as gorgeous) in bright red RED t-shirts. So the photographers were all madly snapping away at the most blatantly obvious metaphor they were going to be presented with all week - a row of bland, grey, same as always Guys In Suits, who just lost out to a guy in a bright red t-shirt with the hot babe standing next to him.
It'll be interesting to see the NAB daily newspaper tomorrow, I'll bet a shot just like that is in there somewhere.
Update - no pic unfortunately, but official coverage here.
Wednesday, April 26, 2006
And more importantly, for the ones with no booth indicated, anybody know where I can find them?
Autodesk - Maya & Toxic - sl3719, S110-MR, S112-MR
Colorspaceinc.com - Central Hall Booth # C11331
Vision Research, Phantom camera - C10225
JVC - C3217
Canon - SU131
Sony - deck? (HC5/7/FX2) - Vegas? - SU107
Ambarella - H.264 based camera NO BOOTH #
Panasonic - cameras and 26" monitor C2518
Grass Valley Infinity SU-2906
Arri - C6926
Panavision - ???
Thomson - Viper stuff - SU2906
Adobe - Silicon Imaging, too - by Apple in SU
Apple - JPEG2000 codec? in SU
Avid - SL701, SL2387, S229
Silicon Color - SL1816
Cinetal - C7729
Codex Digital - DDRs - ???
Christie - SL1519
Vydeo - eSATA for laptops -??? NO BOOTH
Sonnet? - no booth #
Firmtek/Seritek - SL1238
Huge Systems - SL4987
Medea - IN AVID BOOTH?
Gefen - SL541
G-Tech (in B&H behind Sony)
Kinetta - NO SHOW -
Cineform - NO BOOTH #
Iridas - Framecycler, Speedgrade HD - SL4987
Pixel Corps - and the other plugin Pavilion stuff behind Adobe-ish
-Bella Catapult (and other keyboard stuff) - SL1773
Tangent Devices? - no booth #
Other hardware controller surfaces? a little help, folks?
Sonic - HD DVD authoring - sl3750
EditShare - SL1410
Edirol - sl3781
FINALLY getting around to organizing some pictures:
Pictures from Day Zero and Day One with Red - exclusive, behind the scenes pics of setting up the booth, getting ready, and then the first official day of Red in public
Crowd Pictures from Day One pictures from 9am to 9:40 or so showing how fast the crowd grew on the first day - pretty amazing for a stealth launch with minimal publicity and zero advertising. Times beneath each photo.
Pictures from Day Two at the Red booth, including Snake Girl - the booth across the way from us was getting similar crowds, but they make some boring stuff and had to resort to fire eaters and a belly dancer with a snake - no kidding! So of course I had to get a picture with her.
The proposed specs on the Red camera are all very exciting, and it should be very interesting to see how it all plays out.
But one thing I want to come back to is WHY this camera exists, and what it is trying to do.
One of the first conversations I had with someone about this project involved discussion of film vs. video. Disregarding all the quality differences, it was a discussion of the two completely different approaches with which the two types of products were produced.
Film cameras were really about buying a body, and putting different lenses on the front and different media (film stocks) on the back.
But with video cameras, it is very, very different. If you buy a nice enough video camera, you can swap out lenses on the front end. And that is a very good thing. But what you can't do is swap out media on the back end. The closest thing you can do is plug different decks into the back of it, but that is an incredibly limiting choice - you're tethered to that huge blob to capture its data, which is itself tethered more often than not to an AC power outlet.
If you want to record to a different media and still be mobile, you have to go out and buy an ENTIRELY NEW CAMERA. I'm not aware of film stocks that come out that you have to buy a whole new camera for as long as it is still the same basic format - 16mm vs 35mm, Super or not.
One of the goals of Red, in my opinion, is to be closer to the film camera kind of model - you buy a camera body with a lot of capabilities, but you can put different lenses on the front and put different recording media on the back. You can buy new lenses or recording media to suit the needs of your project.
Need 720p or 1080p or 1080i or 2K? You can do that now with the built in recording to RED-DRIVE.
Don't want to subject it to heavy g-loadings for fear of crashing the heads? Use REDFLASH.
Want uncompressed RAW? Use REDRAID. Need RAW and to be untethered? Use REDRAM.
"Well gee, what if something better than REDRAID comes out?" - there's a high speed port on the back of the camera that will connect to the REDRAID. At this time, they are talking about using something like Infiniband or multi-link fiber channel on an adaptor to go to the REDRAID. But since it is a port, it's possible that if some GooglePlug comes out in a couple of years, you could use that to connect to some new kind of storage technology.
Want a new image sensor? It is upgradeable.
Want to record to something else or new that comes out? The camera has built in HD-SDI single and dual link, 4:2:2, 4:4:4, even 4:4:4 RGB log output. Plug into these industry standard taps and away you go.
Studio Daily | First Look at RED!
Steve Gibby has this nice detailed interview with Jim Jannard, founder of Red, and details on the camera. Should have linked to this Monday, just lost it in the fray.
I had told the Red folks I'd only be able to work for a day and a half in the booth and that I'd need to boogie on out mid-day Tuesday to start hitting the show floor.
But I didn't - the experience has just been too good for too many reasons to leave.
Tuesday morning the crowd at the booth was a little slower getting started than Monday, but I think that was just because everyone was hung over from partying.
But it quickly built up to the kind of crowds it drew on Monday, as in massive. Frederic Haubrich and Ted Schilowitz were taking turns doing the presentations out in front of the booth, and every time, the crowd filled up the aisle and spread wide. We were joking that some boring optical connection and routing booth around the corner was getting about the same kinds of crowds we were, but they had to resort to fire eaters and a belly dancer with a snake, whereas we had a closed tent and just talked about our product out front - and people were lined up waiting, running around back to sneak a glimpse through the gap in the corner of the tent! (I'm not making this up, those guys really DID have a fire eater and snake girl, I'll link to pics in a bit)
So I ended up working the booth for the rest of the day. I personally saw Disney's technology folks come through, Technicolor staff, Sony, Canon, Zeiss, Cooke, all kinds of industry folks, on top of the crowds of regular show goers.
People continue to believe in the vision - at the end of the day, when the rest of the phone calls get returned, I expect around 185-200 cameras will have been reserved before the show opened this morning. Passing on my shot at #12 yesterday...hmmm. Bad idea?
I literally didn't see a single other thing at the show - Red was nice enough to bring lunch in for us instead of us having to scatter to the winds to forage for sustenance - so today is my first day getting out there. Thanks muchly to all who posted comments about what I should go see; that is largely my guide for the day.
Please continue to post comments with booth #s and I'll see if I can check mail through the day.
I'll see if I can get some pictures posted too before I head out.
Oh, and a bit more on Red - the sensor is upgradeable, Jim has committed the company to that.
Did I say you can send a Viper FilmStream-like 10 bit RGB 4:4:4 log signal out the dual link HD-SDIs? You can.
For 2K, you'll also be able to record to 10 bit RGB 4:4:4 log REDCODE codec, too.
So clearly the codec will have 4:2:2, 4:4:4, and 4:4:4 RGB log modes.
For anyone who says that Red can't do it, that this thing isn't for real (not a concern or a doubt, but a definitive statement of CAN'T), I overheard someone say "grow a pair of balls and go say that to Jim's face."
If you have any questions about Red, email me or post them in the comments - I can say as much about this camera as any Red employee.
Tuesday, April 25, 2006
...so I've had zero time to look over press releases and figure out what I should go see at NAB 2006. So I'm turning to you folks - what should I go see? If you know of a company or technology that fits into the stuff I usually cover or that you think would be of interest to the readership, please click on the Comments link below and let me know. If you know the booth #, all the better - please include it.
It's been an incredible day. It is after midnight and I'm finally getting back to my room after getting up at 5am.
For those that don't know, I've been working in the RED booth today and will again tomorrow. Why? Because it is Sliced Bread 2.0 as far as cameras and workflow go. Many web friends and acquaintances walked up to me and quietly asked essentially the same question - "Dude, is this thing for real?" - and I unhesitatingly said yes. With that, they'd walk their credit card over and order one. It's a no brainer - if you can afford to let RED hold your $1000, you get a place in line (a line growing by about 10 cameras per hour when the booth is open), and can get a refund any time you feel uncertain about it. You'll be able to see test footage and hear people's comments about it, and if you order now, there will be over 100 units already shipped that you'll be able to hear user feedback on before you're committed to pay the remaining $16,500 for the camera body. This isn't some shakily financed deal - this is the personal baby project of Jim Jannard, the founder of Oakley. If folks want their money back, they'll get it. So what's to lose?
When the doors opened at 9am, there were already a few people standing around the booth. By 9:10 there started to be a group standing in the aisle outside the closed tent of the booth, and by about 9:20 it was a LARGE crowd. Don't believe me? Check out the pictures for yourself - the time each pic was taken is below it.
I went back and forth between manning the booth and manning the workflow demonstration area - we had a Kona3 demonstrating its newly revealed (via driver update) 2K capabilities. That's right - 2K playback in Final Cut Pro. It's awesome - you can play back either traditional 2K (2048x1556) or the newer DCI spec 2K res (2048x1080). Even with a 2048x1556 timeline (using PhotoJPEG to compress since we didn't have a huge fast array in the secondary demo room), I could still play back to a 1920x1080 device via a combination of scaling vertically (1556 to 1080) and cropping slightly horizontally (2048 to 1920). It worked great, and looked AMAZING on the $14Kish CineTal 24" LCD panel monitor (that's as configured - they can run anywhere from $8500 up to $25K depending on bells and whistles). I'll have to go by their booth and have a long chat Wed/Thurs.
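For the curious, the scale-and-crop monitoring trick described above is easy to sketch (this is my arithmetic from the description, not AJA's actual pipeline - I'm assuming a vertical-only squeeze to 1080 lines and a centered crop of the width):

```python
# The 2K-to-HD monitoring math described above, as I understand it:
# squeeze the 1556 lines down to 1080, then crop the 2048-pixel width
# to the monitor's 1920.
SRC_W, SRC_H = 2048, 1556    # traditional full-aperture 2K
DST_W, DST_H = 1920, 1080    # HD monitor raster

v_scale = DST_H / SRC_H      # vertical scale factor applied to every column
h_crop = SRC_W - DST_W       # total pixels trimmed from the width

print(f"vertical scale {v_scale:.3f}, crop {h_crop} px ({h_crop // 2} per side if centered)")
```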
The Kona3 also can now use the dual link HD-SDI as an HSDL (High Speed Data Link) to communicate with the kinds of devices that use that - datacine, scanners, etc. The advantage is that it is wicked fast (15 fps 2048x1556 10 bit with alpha), and can go LOOOOOOONG distances (using two BNC cables). This'll be a good thing in a bunch of ways that aren't immediately obvious.
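A rough bandwidth check on that HSDL figure (my arithmetic, assuming four 10-bit channels counting the alpha mentioned):

```python
# Rough bandwidth check on the HSDL numbers above - 2048x1556, 10 bits per
# channel, 15 fps, four channels assumed (RGB plus alpha):
w, h, bits, channels, fps = 2048, 1556, 10, 4, 15

gbits = w * h * bits * channels * fps / 1e9
print(f"~{gbits:.2f} Gbit/s")   # fits within dual-link HD-SDI (2 x ~1.485 Gbit/s)
```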
To be fair, I should mention that BlackMagic shipped drivers earlier this year that support 2K playback on my Multibridge Extreme (but not HSDL transfer, as far as I know). I just haven't had a chance to plug it into a 30" LCD to test it, so this was the first time I did 2K FCP playback.
Some more RED tidbits I picked up during the day - there will be a battery included. The camera will support dual link log RGB 4:4:4 out the HD-SDI taps like a Viper does. All taps (outputs) will be hot, simultaneously, all the time. With two BNCs to support dual link HD-SDI, there's no reason why they can't be configured to do twin mirrored single link connections as well. There is a number to call on Red's website for those that want to call in to reserve one, but you'll be trying to get through to Chris Petrillo, who is also busy taking orders on the show floor. How busy is he? By lunchtime, around 60ish orders had been placed and he had 75 or so cell phone messages. By the time we practically had to shove people out of the booth, at 6:30, half an hour AFTER the show closed, the count was around 100 people that had put $1000 on a credit card (Visa/MC, no Amex at the moment) to reserve their own. Some asked whether they should reserve a lens, and the response was "Not yet." And when you reserve a camera, you get this gorgeous little hewn ingot of metal in the shape of the logo with your own serial # engraved on the bottom. Remember that "r" shape from the old website for Red? It's a cast metal one of those. It's nice, just another class touch from Jim and his guys. Smart folks are asking Jim to sign them as a keepsake. I spent a long, hard time thinking about whether I wanted to get camera # 12, and in the end I passed - I'm a post production consultant, for Pete's sake - I don't even own my own DV camera, why would I want to own a $20Kish high end camera of my own? (Because it is sooooooo cool!). I figured if I keep on good terms with these guys I could always borrow one if/when I needed it.
Numerous Apple staff came by, including a high level FCP person I know (name withheld, not sure if I can go public on who) who was there for the first preso at 9:20 or so. His eyes lit up more and more the more stuff we told him about what it would do. Later in the day, I was told by another, even higher level person in the Pro Apps Group that I could say that "Apple has seen this, and is very interested" (or did he say excited?) about what RED is up to. I think/hope that will bode well for trying to get native codec support in FCP, which was one of the most common questions I was asked while working the booth and demo room (where we're showing 2K workflows, since RED can do 4K/2K/1080p/1080i/720p resolutions).
By the end of the day, we had to practically shove some folks out of the tent at 6:30, half an hour after it was announced the show was CLOSED for the day. Around that time, I checked in with Chris, who said they'd booked about 100 reservations that day in 10 hours - a busy day of filling out forms for him. He figured that by the time we open for bidness tomorrow at 9, he'll have made enough callbacks from his huge long phone message list that the count will be in the 120-125 range.
So these things are moving FAST. Come by if you want to reserve one (and you can get your money back at any time, no questions, no hassles), or call 949.206.7900 if you're ready to reserve one now.
Later in the day I briefly got a chance to escape and check out the Apple booth. Other than the new 17" MacBook Pro, which has FW800 and a gorgeous bigger screen than the 15", it's pretty much like the 15" as far as I could tell in my high speed skim of the specs.
Xsan got bumped to v1.3, but sounds like it is just bug and stability fixes and the ability to create LUNs bigger than 2TB. Nice, but just a fix of an obvious hassle.
Final Cut Pro, however, is NOT a new version as I inferred from my swingby of the booth before they opened on Sunday. What it is is a technology demonstration, for a version of FCP they would not name a version # for, with no announced ship date. So at SOME point in the future we'll see a new version with 24F support for the Canon XL H1 and 24p support for the JVC GY-HD100U. Now that's great, and I wish I had it now, but it's not a reason to bump to 5.5 or 6.0 if that's all we're talking about. There was NO mention whatsoever of an FCP 5.5 or 6.0. I guess the switch to Intel kept 'em busy. Rumor mongers - you were WRONG! Maybe at IBC we'll hear mention of the next major version of FCP. But 24F/24p support? To me, that is a 5.1.1 or 5.2 version jump at best. I'd love the features, I'm all for it, but OK, if that's it, get it on the market and let's GO!
I also finally got a chance to check out G-Tech's G-Speed (no info on the website that I could find), their new 4 gigabit fiber channel array, at the spiffy party they threw in the Wynn. 3.0 TB raw capacity, and it does RAID 0, 1, 3, 5, or 6 (which is double parity, and costs one more drive's worth of space than RAID 3/5). The 3.0 TB model lists for about $6K if I recall correctly, and in RAID 3 config (Roger, the head guy, said RAID 5 ought to be about the same) it clocked in around 240 MB/sec using the AJA disk throughput doohickey - PLENTY fast enough for uncompressed HD. Expected to ship in June; the hardware is done, they are just messing with the cabinet form factor/aesthetic stuff. Oh, it's also hotswap, which should rock! In RAID 3 or 5 trim, that 3.0 TB (6 times 500 GB Hitachi SATA drives) will yield 2.5TB, which with the usual funky hard drive math means 2.31TB of usable, formatted space in RAID 3 or 5 configuration. Roger said it should work with Xsan, too. With Medea recently purchased by Avid and its future unknown to me, I think this positions G-Tech very, very nicely to compete with the existing players of Apple, Huge, and Medea. I hear Huge has some good stuff at the show I haven't been able to check out yet, Medea's in Funky Land as far as I know because of this Avid thing until I can talk to them and get the skinny, and the speed and bang/buck that G-Tech offers in this first product is quite, quite competitive with the Apple XServe RAID.
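The "funky hard drive math" is mostly the gap between decimal units (vendors count 1 TB as 10^12 bytes) and the binary units your OS reports. Here's a sketch for the G-Speed numbers above - the exact formatted figure also depends on filesystem overhead, so treat this as ballpark only:

```python
# Decimal-vs-binary storage math for the G-Speed example: 6 x 500 GB drives
# in RAID 5, which gives one drive's worth of space to parity. Vendors
# count 1 TB = 10**12 bytes; the OS reports tebibytes (2**40 bytes).
drives, drive_gb = 6, 500

usable_decimal_tb = (drives - 1) * drive_gb / 1000            # "marketing" TB
usable_binary_tib = (drives - 1) * drive_gb * 10**9 / 2**40   # what the OS shows

print(f"{usable_decimal_tb:.2f} TB decimal -> {usable_binary_tib:.2f} TiB binary")
```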
Afterwards, we all gathered back in the hotel lobby, did a fascinating recap of the day, strategized for the future (I'd like to think that I came up with some KILLER marketing ideas for the next year, which I can't discuss), and managed to FINALLY get some dinner around 11:30. Got back to the room 12:30ish, did some work and then typed this, and it is - oh damn! 3:10am now. OK, that's it, I gotta go to bed, up in 4 hours! I'm wide awake and cruising on the juice of a Day Well Done - RED couldn't have imagined a better launch day.
Ah, well, shortly Once More Into The Breach (but willingly)...
Monday, April 24, 2006
So here's my take on it:
More on RED
The camera was designed to be modular - start light and small so it'll go places big cameras won't, but be able to bulk it up for large production needs using the optional cages that give hard mount points for rods, lens support, viewfinders, remotes, etc. It is a helluva lot easier to make a small camera big than a big camera small.
It has more in common with the working style of a film camera than it does a traditional video camera - the ability to shoot anywhere from 1-120 frames per second, at frame sizes from 720p all the way up to the full 4520x2540 sensor.
A bit more on recording modes: there are four variables here:
1.) What pixel size you shoot
2.) What pixel size you record
3.) What video/data format you record
4.) What physical recording device it goes on
One at a time:
1.) What pixel size you shoot -
The Mysterium Sensor is 4520x2540 pixels, and is a Super35mm sized single CMOS sensor. All modes start from there.
2.) So when shooting in the following modes, you're recording:
2540p - the full 4520x2540 output of the sensor, up to 60 fps
4K - it crops off around the edges to get down to 4096x2160, up to 60 fps
2K - in this mode, up to 60 fps (more on other modes in a minute). Start with the 4K above and scale it down - so it is supersampled (or oversampled, if you prefer that lingo), which gives smoother results with less noise, a better signal/noise ratio, and less aliasing, and this in turn helps the dynamic range. So when recording 2K or less, you have an oversampled, cleaner/smoother image. Remember, this applies to the rest of the formats below.
1080p - take the 2K and crop it a bit more - so for 1080p, you have an image that is oversampled about 4x - it is 2x in both the horizontal and vertical dimensions. This is GOOD.
1080i - take the 1080p and use each 1080p60 frame to generate a 1080i field. This is, yet again, oversampling, and provides a VERY nice, clean, sharp, smooth 1080i. This is actually an ideal way to generate a 1080i image and is in no way a compromised image
720p - through a combination of scaling and cropping, the full sensor image is scaled and cropped to 720p for a highly oversampled result (I think about 3x in each dimension, I'll need to ask Graeme Nattress to confirm), so it should be extremely clean. This description differs slightly from the handout on the show floor, but after I grilled him on it, Graeme assures me 720p is optimally derived, not just scaled down from the 1080p as may be indicated elsewhere.
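To put rough numbers on that oversampling chain (my arithmetic from the description above, not an official RED spec):

```python
# Rough numbers for the oversampling chain described above: 4K is a crop of
# the full sensor (no scaling), 2K is the 4K scaled down 2:1, and 1080p is a
# slight crop of that 2K - so each 1080p pixel is fed by roughly a 2x2 block
# of photosites.
four_k = (4096, 2160)
two_k = (2048, 1080)

linear = four_k[0] / two_k[0]                            # per-axis scale factor
area = (four_k[0] * four_k[1]) / (two_k[0] * two_k[1])   # pixel-count ratio

print(f"per-axis oversampling: {linear:.0f}x, pixel-count: {area:.0f}x")
```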
In the viewfinder, whatever mode you're shooting will show with reticles around the SurroundViewTM viewfinder - you'll be able to see what's in frame and what's about to come in frame - more like an optical viewfinder on a film camera.
OK, back to 2K - what if you want to shoot some serious high speed, like 120 fps? And what's this about the ability to use Super 16mm lenses?
It works this way -
The sensor is Super35mm sized. The camera can accept Super 16mm lenses. When you mount a S16 lens on it, the image hits the sensor at the size of S16 film, not the size of S35 film from an S35 lens. You're now using a small center piece of the imager - this is called windowing. By windowing the sensor, you can use S16 lenses and get a 2K res image out of the camera, you're just not oversampling anymore. But by using a smaller piece of the sensor, it is possible to crank up the cycle rate on it and get a higher frame rate. Therefore you can go up to 120 fps at 2K res (or 1080p or 720p, and 720p WOULD be oversampled in this case) when using S16 lenses. And we're talking about DCI 2K here - 2048x1080, not the anamorphic film scan 2048x1556 pixels.
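Some back-of-envelope windowing math, using the sensor figures quoted in the specs elsewhere in this post (24.4 mm across 4520 photosites). A centered DCI-2K window should land comfortably inside a Super 16 image area (roughly 12.52 x 7.41 mm), which is why the S16 lens trick works:

```python
# Back-of-envelope windowing math from the quoted sensor figures:
# 24.4 mm of active width spread across 4520 photosites.
SENSOR_MM, SENSOR_PX = 24.4, 4520

pitch_um = SENSOR_MM / SENSOR_PX * 1000   # photosite pitch in microns (~5.4)
win_w_mm = 2048 * pitch_um / 1000         # width of a 2048-photosite window
win_h_mm = 1080 * pitch_um / 1000         # height of a 1080-photosite window

print(f"pitch {pitch_um:.2f} um; 2K window {win_w_mm:.2f} x {win_h_mm:.2f} mm")
```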
The decision was made to keep the depth of field the same for 1080p and 720p, FYI, so it doesn't change when you switch modes. Convenient! You're NOT windowing when switching between those modes.
A word on anamorphic - you don't need to use anamorphic lenses with this camera (unless you really want to). I had a lengthy chat with both Graeme Nattress and Stuart English about this, and they said what we have here will work just fine for widescreen implementations. I'll probably follow up in depth and write more about it, but in general, you're in good shape - it is a native 16:9 sensor to start with, and you can always crop the image to get 2.35:1.
3.) What video/data format are you recording in? It depends:
When shooting 2540p (the full sensor) - you're assumed to be shooting RAW at this point and want maximum quality. RAW is the unprocessed direct output of the Bayer pattern from the CMOS sensor. It isn't color corrected, and isn't even directly viewable - it's still grayscale data at this point that has to be processed to figure out which pixels are R/G/B. In any case, full-sensor 2540p is always RAW.
When shooting 4K, same thing.
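To illustrate why RAW isn't directly viewable, here's a toy sketch of a Bayer mosaic layout. An RGGB pattern is assumed purely for illustration - this is generic demosaic bookkeeping, nothing to do with RED's actual pattern or processing:

```python
# Toy sketch of a Bayer mosaic: the sensor delivers one grayscale value per
# photosite, each sitting under a single color filter. RGGB layout assumed
# here purely for illustration.
def bayer_channel(row, col):
    """Which color filter covers photosite (row, col) in an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Print a 4x4 corner of the mosaic; demosaicing has to borrow each pixel's
# two missing channels from its neighbors before the image is viewable.
for r in range(4):
    print(" ".join(bayer_channel(r, c) for c in range(4)))
```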
When shooting 2K, you can now start using the REDcode codec. For 2K, it is assumed you want log data (like a film scan or Viper FilmStream mode), which does a better job of recording highlights and is akin to 12 bit linear. If that doesn't make any sense, that's OK, just know that in 2K you have a choice - either RAW, or REDcode 10 bit RGB 4:4:4 log variable bitrate wavelet. It might be possible to go out the dual link HD-SDI with 2K, I'll have to check. In theory it is possible, especially with the new AJA Kona 3 card's capabilities.
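On "akin to 12 bit linear": a log curve spends its code values evenly per stop instead of lavishing half of them on the brightest stop. Here's a toy curve (my own generic example, NOT RED's or anyone's actual transfer function) showing the difference in code allocation:

```python
import math

# A toy log curve: spread 10-bit code values evenly across an assumed 12
# stops of scene range, then compare code allocation in the brightest stop
# against a 12-bit linear encoding, which spends half its codes there.
STOPS = 12     # assumed scene dynamic range, in stops

def log_encode(linear_light, stops=STOPS, codes=1024):
    """Map linear light in (2**-stops .. 1.0] onto codes, evenly per stop."""
    return round((math.log2(linear_light) + stops) / stops * (codes - 1))

log_top = log_encode(1.0) - log_encode(0.5)   # 10-bit log codes in top stop
lin_top = 4096 // 2                           # 12-bit linear codes in top stop
print(f"top stop: {log_top} log codes vs {lin_top} linear codes")
```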
When shooting 1080p, you have yet more choices - you can record uncompressed out the HD-SDI (single or dual link), or you can use the REDcode 10 bit 4:2:2 codec. In 1080p, I THINK you can do RGB 4:4:4, but I'll need to double check - that isn't explicitly stated on the data sheet. But 10 bit 4:2:2 REDcode wavelet is an option, as well as uncompressed to the REDRAID.
There seems to be a gap here that I need to follow up on - is there a 10 bit log RGB 4:4:4 for 1080p? Is there a 10 bit linear RGB 4:4:4 for 1080p? Or is it assumed you'll start with a 2K and crop down in post somewhere/somehow? I'll have to ask about that. Actually, I just talked to Graeme about it - yeah, it'll be fine. He said something about the codecs still being under development, and that Red won't do anything to artificially limit users. My own thought - yeah, that would be so completely against the spirit of what they are trying to do here.
For 720p: 10 bit 4:2:2 to the REDcode VBR wavelet codec, uncompressed out the HD-SDI single link, and possibly uncompressed to REDRAID - again, I'll need to check the matrix on that one.
And lastly, where do you record it to?
RED-RAID - high speed RAID for RAW recording, possibly uncompressed recording as well - have to check. This is an external device, they are considering either Infiniband or dual link fiber channel of some sort.
RED-RAM - I'm guessing for short bursts of RAW or high speed recording; external as well. But it'll have a higher capacity than REDFLASH. Sufficient.
REDFLASH - internal 32-128GB flash memory. I need more info, but I'm guessing for shorter takes to REDcode, maybe RAW, maybe uncompressed?
RED-DRIVE - internal to camera, hard drive based (probably 2.5" drive mechanism)
...or out the HD-SDI, single or dual link depending on frame size and rate and what fits in the HD-SDI spec for frame and data rates and bit depth. At that point, you can hook up to conventional decks or DDRs or other recording gear.
If you assume a bitrate of 100 Mbit/sec, and assume a RED-DRIVE of at least 40 GB (and these are just safe assumptions about datarate and drive size for an example here), that'd be an hour or so of footage. Higher data rates and higher capacities obviously change that. But count on being able to do about an hour of recording on the RED-DRIVE.
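Spelling out that estimate (same illustrative assumptions as above - these are not announced specs):

```python
# The recording-time estimate above, spelled out. Both numbers are the
# same illustrative assumptions the post uses, not announced specs.
bitrate_mbps = 100        # assumed REDCODE data rate, megabits/second
capacity_gb = 40          # assumed RED-DRIVE capacity, decimal gigabytes

seconds = capacity_gb * 8 * 1000 / bitrate_mbps   # GB -> gigabits -> seconds
print(f"~{seconds / 60:.0f} minutes")             # about an hour of footage
```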
Again, that is all just to say this camera is all about choices and finding what works best for you.
Today at NAB, tons of new details were dropped about the RED camera, and so far it sounds just stunning. So much new information has come out, I'm going to drop it in a series of articles covering different facets of the camera.
Let's start off with what's new information that's relevant to potential buyers: The price and how to get one.
First off, drum roll please, the price:
$17,500 for a Red One camera body (no lens, no digital mag)
The price for the 300mm lens (f2.8, fixed focal length; details in other articles today): under $4750
Recording modules: Ted Schilowitz said that there would be a recording option under $1000, but did not specify storage type or capacity. I asked some more about it, and he said the product at that price point would offer a useful amount of recording capacity. (My personal interpretation of that statement is that it would be well more capacity than a P2 card.)
HOW DO I GET ONE?
If you're interested in reserving one today, they aren't taking orders, but they are taking deposits as a placeholder in line, so that you get a very clear indication of where you are in the queue for production, with no prevarications. In order to distinguish who is serious, they are asking for a $1000 deposit via credit card, which is fully refundable at any time, no questions asked. These are non-transferable - if you don't want to buy it, you can't give or sell that line position to somebody else - that place in line simply goes away.
This is not to raise cash to develop the camera, simply a way to tell who seriously wants one and reserve a place for those folks.
At this time, they are trying to limit orders to 5 units. If you insist on trying to get more, talk to Ted or Jim. Also the price is what they consider fair and firm - the price is the price. Note that they are selling directly and not through resellers at this time.
If you don't feel comfortable putting a deposit down on a camera that isn't finished yet, fine - later this year they'll be showing actual cameras, and then shipping them late this year/early next year (current plan) - you can always wait and put in an order later, but you'll be behind everyone else who put in an order before you.
So the optimists (with a deposit) get theirs first, the pessimists will have to wait their turn.
Update - as of noon, I think there were about 100 reservations placed, last I heard. So get on it if you want in!
Red Camera: Basic Specs
So without further ado, here's the specs (which are of course, subject to change, as the camera is under development) on the Red One camera.
I'll obviously have much more to say and comment about it over the next few days, but let's start here:
CAMERA UNIT ITSELF
-VERY small, modular in design.
-the body will weigh less than 7 pounds without battery, lens, or recording module.
-the basic body ($17,500) includes HD-SDI, dual HD-SDI, HDMI, XLRs, and a bunch of other inputs and outputs.
-a port for RAW recording will be included, but you'll need to buy a separate recording module for recording RAW - it'll be some kind of high speed serial connection, that's still being decided
-single CMOS sensor
-as said before, it's 4520x2540 pixels
-29 square micron pixels - that's BIG, so a lot of light per pixel - helps signal/noise, helps light sensitivity, helps contrast, etc.
-24.4mm x 13.7mm (Super 35mm sized)
-dynamic range >66db, aka depending on lenses used, etc., somewhere in the range of 11-15 stops
-depth of field is equivalent to using Cine lenses
-it can be windowed down to 2K to use Super 16mm lenses and frame rates up to 120 fps
FRAME RATES
-Variable 1-60 fps for 2540p, 4K, 2K, 1080p, 1080i, 720p
-Variable 1-120 fps for 2K, 1080p, 720p by windowing the sensor (using a smaller part in the middle) and using 16mm lenses
-And of course, all the usual suspects - 23.98, 24.00, 25.0, 29.97, 30.0, 50.0, 59.94, etc.
OUTPUTS
-single and dual link HD-SDI
-2K 4:4:4 RGB
-1080p 4:4:4 RGB
Digital Media Magazine:
-the digital mags will have FW400/800, USB 2.0, and eSATA ports, so you can plug it into your computer and just GO - copy the files off, or play from the mag.
RED-DRIVE will use hard disks in the 40-160 GB capacity ranges
REDFLASH will use flash memory in the 32-128 GB capacity ranges
REDCODE codec: will be a variable bitrate, wavelet based codec:
10 bit: 4:2:2 1080p/1080i/720p
10 bit log: 4:4:4 2K
(I'm GUESSING that it will be full raster, but I haven't seen that explicitly stated yet)
Audio: 4 channel uncompressed, 16/24 bit, 48 KHz minimum specs
Viewfinder: Built in high res LCD, with on screen display, focus assist, exposure assist
Construction: magnesium alloy (and I must say, the proto looks GORGEOUS, my first thought was "this is the alien spy satellite")
RED-RAID - high speed serial interface, 2540p RAW data recorder, 12V D.C.
UHD Lenses - in development - RED 35mm and 16 PL mount Cine lenses. An optional B4 mounting kit will also ship at some point
UHD Viewfinder - SurroundViewTM (lets you see what's out of frame - like an optical viewfinder, you can see what's about to come into frame, not just what's already in it)
Camera Cages: for tripod or on-shoulder use, powered accessory mount points (stainless steel mount points - very nice!)
Those are the basic specs.
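Two quick consistency checks on those spec numbers (my arithmetic, using the common 20*log10 convention for sensor dynamic range):

```python
import math

# Sanity-checking two of the spec numbers above:
# 1) photosite pitch/area from the sensor dimensions (24.4 mm / 4520 px),
# 2) >66 dB of dynamic range expressed in stops (one stop is ~6.02 dB
#    under the 20*log10 convention).
pitch_um = 24.4 / 4520 * 1000          # photosite pitch in microns
stops = 66 / (20 * math.log10(2))      # dB converted to stops of light

print(f"photosite area: {pitch_um ** 2:.0f} square microns")  # matches the 29 quoted
print(f">66 dB is about {stops:.0f} stops")                   # the low end of 11-15
```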
Right NOW, come down to booth SU1401 in the NAB SU (south upper) hall and find me. Black pants, black RED shirt, 6 foot 3, glasses, brown hair.
If you're coming in from south upper - come along down the left side past Sony, past the blue banner hanging from the ceiling that says Escalator to Multimedia Downstairs or something like that. About 100 feet past the escalators, start looking to your right for a big red ball with a silver ring around it (the RED logo), and a red and white tent. That's us.
If you're in South Lower hall, come up the escalators, turn right, and in 100 feet or so start looking off to your right diagonally for the above.
Wait a few hours and I will Tell All. I've been given tons of access and details, but somebody else has first dibs on posting details; I'll be posting mid-morning to lunch sometime.
And there's LOTS to tell!
Also, Apple's new 17" Core Duo laptop is out, and I hear it has FireWire 800.
I also saw, wandering through booths yesterday, that there is SOME new version of FCP, because I saw a station that said 24fps support for the JVC GY-HD100U. So that's all good stuff!
I'll be blogging on other than RED starting tonight probably - today is all about RED, several articles to post.
And with that, I'm late as hell and gotta blaze, see you at the booth!
Tell'em Mike sent ya.
Sunday, April 23, 2006
As usual, here's my raw notes:
NAB Digital Cinema Summit 2006, Day Two Notes: James Cameron keynote
KEYNOTE: JAMES CAMERON ON DIGITAL 3D
James Cameron is giving the keynote
Elizabeth Monk Daley is the dean of the University of Southern California School of Cinema-Television
(she's introducing Cameron)
talking about Cameron - he comes at it from the perspective of making better films.
-came to LA in 71 to study physics at Fullerton College, machinist and truck driver
-in 78 decided to be a filmmaker, hired by Roger Corman
-worked for Corman as VFX artist and designer, quickly went on to do his own films
-Terminator in 84 from his own script
-wrote and directed The Abyss, T2, True Lies, and Titanic in 97 (highest grossing movie in history, $1.8B in ticket sales, 14 Academy Award nominations); pleased that he's going for 3D
this is Cameron:
Human beings don't like change - if it ain't broke don't fix it. Our business may not be broke, but it could use some fixing. Falling box office and day & date release causes concern, how to get people back to theaters.
Can d-cinema provide the magic to do that? For me, yes. Digital is an enabling tech for 3D. Digital has a sound biz model for why it should be done. Those reasons are not why he's excited about it - for me, it's about 3D.
My vision is that it might be the most important part of d-cinema.
46 3d movies released from 52-55, then it died out completely. in next 25 years, only 3 titles. Why? People were fascinated by the stereo experience, but got turned off by crude cameras, producing stereo that created a lot of eyestrain even under the best projection conditions. The projection was dark and prone to error in the field. If it was 2 perfs out, the L&R eyes were reversed and you'd need years of therapy to recover. Plus, most of these movies sucked and you wouldn't have gone to see them without 3D.
The new 3D digital cameras have dynamic interocular and convergence, capable of producing perfect stereo images with no eyestrain. It's been solved. And the new 2K d-cinema projectors are a godsend. There's no point in making great 3D with no place to show it.
the only difference between digital and 35mm is one thing: Frame Rate. The reason you can show it on a single projector is because digital can do high frame rates (over 100fps) whereas film projectors are good for 24, maybe a bit more if you goose it.
Digital allows a triple flash per eye per frame (see yesterday's notes) - you're getting images at 144 fps, and the brain is fooled into seeing the images simultaneously even though you're really seeing them sequentially. Digital can do something 35mm CANNOT DO. This is different from color, res/image steadiness/etc.; this is binary. Film just can't do it. Can you project 3D w/a 35mm projector? You can, but you're back to the 50's for the 3D tech.
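The triple-flash arithmetic works out like this (a minimal sketch; the 24 fps base rate, 3 flashes per eye, and 2 eyes are the figures from the notes):

```python
# Triple-flash digital 3D projection: each of the 24 base frames is
# flashed 3 times per eye, alternating left/right, on one projector.
BASE_FPS = 24        # standard cinema frame rate
FLASHES_PER_EYE = 3  # "triple flash"
EYES = 2             # left and right images shown sequentially

flashes_per_second = BASE_FPS * FLASHES_PER_EYE * EYES
print(flashes_per_second)  # 144 flashes/s on screen
```

The viewer still sees 24 distinct frames per second per eye; the 144 Hz flashing is what lets one projector serve both eyes without visible flicker.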
Ghosts of the Abyss had over/under 35mm projection in 50 theaters. It was a nightmare process, dark murky pictures, really a non-starter. Anaglyphic is worse. For bright, stable, lucid 3D, you need digital projection.
6 years ago started down a path to shoot digital 3D.
Dissatisfied with the big bulky stuff at the time. Back then, the plan was to shoot digital and blow it up to IMAX. Partway in, had an epiphany: the new digital projectors could be made to run 3D and didn't cost $2.5M apiece, so there could be THOUSANDS of them. If the whole d-cinema conversion happened, there could be TENS of thousands of 3D projectors, not a few dozen. This was a glorious dream....that no one else shared.
got on the horn with Doug Darrow of TI's DLP program, has been working with them towards the goal of digital projector tech that is 3D capable. At first they thought of me as an intriguing annoyance, but after banging away that this was the one thing their projectors could do that film couldn't, they got interested. After d-cinema stalled (good as film?, doing tests, etc.), there wasn't a strong enough driver to do it. There wasn't a strong enough marketing hook to the public. No scratches, versatility, stability, etc. - it'll save studios a boatload of money over time, but the average person is not going to line up around the block so Fox can have a strong quarterly profit. But 3D, he told Doug, changes all that - now you have a very specific and marketable system to install in the near term, with the benefits to be harvested by exhibitors and studios, but also something to intrigue the public's imagination, a differentiator for theaters, and a catalyst for d-cinema. It might turn out that he could be right - no scientist would take two data points to make a solid decision (we usually use less data to make decisions in our business).
Polar Express 3D made history in the 3D world. $121M on 2D screens. On 68 3D screens it grossed $68M. 25% of the gross from 2% of the theaters. Lots of big films have been blown up to IMAX. None of them have shown such a difference between IMAX and regular screens. The difference? Feature length 3D that was sought out and paid extra for.
It didn't prove that success would translate to DIGITAL 3D. October '05, there were 43 digital screens, 14 were running 2K, none were running 3D. D-cinema had stalled for years due to chicken and egg conundrum. Chicken came first (chicken little). Disney installed 84 2K 3d installs. 6x more than before. 83 screens made 12.9M dollars. Average gross was 54K/screen. Average was $162K/screen for 3D theaters- vastly outweighs costs of doing 3D. (got figures from
3D sold out first and most often; overflow helped the 2D screens. 3D had stronger legs, ran as long as 13 weeks. 2% of the theaters did 10% of the gross in North America. That's a similar pattern to Polar Express but on a different platform. The commonality - feature length 3D. Confirmed audiences will seek out and pay extra for 3D, give good word of mouth, and the costs are outweighed significantly by the increased upside. Opinion Dynamics poll of 900 US adults - 12% (39M Americans) would go to movies more often if they were feature length 3D films. Isn't that what we want? People to go more often? Among 18-29 year olds, 29%. Some said it would depend on the film. 14% would pay 2-3 dollars more.
The numbers don't seem that high - 50% have never seen a 3D movie, so that is opinion based on PAST experience with 3D, which is a pretty sordid history. How many of these have seen digital 3D on a Hollywood blockbuster? Very few. There is a predisposition to want to see 3D. The numbers will go higher once more folks see it.
We know people want it, will actively seek it out, and pay more for it.
This all needs to be driven by content, not format. It's not about finding good scripts for 3D films - that would be like looking for scripts for good color movies in the 50s. When color was new, it was used on the A-list pictures; 3D should be thought of as that kind of enhancement. Don't put a turbo on a sewing machine, put it on a sports car. Put 3D on the top tier of the market, and market on the merits of the movie; at the end of the trailer or TV spot, you say "In 3D at selected theaters" - if you sell the movie as the ultimate 3D experience, people will feel like they are settling for less in a 2D theater. Most folks will still see the movie in 2D rather than 3D for the next several years.
Make 3D the upsell/upside. Give consumers credit for seeking it out, and it is value add.
So where's this 3d content coming from? 3 ways to make 3d movies:
1.) Shoot a live movie with stereoscopic
2.) make a cg movie
3.) can dimensionalize a movie shot in 2d and make it 3d
how he sees it:
1.) Live action movies - he's been shooting live action digital 3D for the last 5 years; it's all he's done - shot steadicam, hundreds of hours of handheld, helicopter, you name it. Nothing you can do in 2D you can't do in 3D. New cameras are mostly more sophisticated than cameras of the past, but are now electronic, so you're able to see it live and make changes, not bake in mistakes you can't undo later. Can do perfect stereo that produces no eyestrain. Camera package costs more, close to double (2 of everything); factoring in post and VFX has added 5-15% to the cost of the movie. But it can gross 30-40% more. He's decided to shoot Battle Angel, Project 880, (New Line and Walden's Journey to the Center of the Earth in 3D, the first live action 3D to market on digital screens; about 800 screens is their goal), and a number of other projects he can't talk about
2.) CG area - Zemeckis is doing Beowulf in motion capture 3D. He says this is the only way Zemeckis wants to do it. As long as you want to do it in 3D, it is ridiculously easy to do 2 virtual cameras. Twice the rendering, but easy. It is the low hanging fruit in the 3D market for a few extra million dollars. Disney's Meet The Robinsons is being made in 3D from scratch. 3D and CG go together like peas and carrots. Chicken Little was dimensionalized at the last minute at higher cost than dual rendering from the get-go.
3.) Dimensionalization - was skeptical at first, but did some tests with InThree and "I did a real 180" when he saw it.
He's seriously looking at doing Titanic in 3D, looking at doing T2 as well, maybe some of the others if the costs come down. Peter Jackson is looking at doing LoTR and King Kong in 3D.
Lucas is actively planning on doing the Star Wars films. It is technically feasible but very expensive. It is cheaper to shoot 3D than to shoot 2D and dimensionalize. It should be thought of as converting the top earners of all time. Raiders of the Lost Ark would be really cool. He supports it because it'll create content to show in 3D, and the exhibitors need to know there will be a steady stream.
Bono on glasses - "This is not a problem, this is a marketing opportunity."
There's two kinds of glasses - active and passive. Active shutters open and close 96 times a second, with a battery and a motor. New ones are light; in large volume they'll cost maybe $25 apiece. Need to get 'em back and run 'em through a dishwasher, and it is a pain in the ass. The big advantage is that you don't need a special screen and can use an existing screen. For a theater owner, if you want to move from a big to a small auditorium in week 3, it's no big deal since you can just move the glasses.
Passive - put an LCD filter over the projector lens, install a silver screen. Glasses are cheap, under $1, can be given away; if you're cheap you can collect 'em and use 'em again. All the Bono marketing possibilities are there - Ray Bans or Oakleys. New systems use circular polarization and it won't mess up the picture. The only real negative is the cost of the silvered screens. They cost in the same general range as the active glasses. Lots of grumbling about silver screens being no good for 2D - THAT IS WRONG. Somebody will disagree, but that's what he uses. Nattering nabobs of negativity nattering on about change. But I'm not too biased on that. : )
whatever biz model works best is fine - otherwise you're splitting hairs.
Both systems work fine with a single projector. Every 2K projector out there can potentially be a 3D projector. And we're talking about eventually converting ALL projectors over time. The upgrade from 2D to 3D once the projector is in place can be made overnight. Can make the decision downstream.
Anaglyphic 3D - "we need to stamp out this unholy practice" - People are tempted to use this to release films on 35mm. Nobody over the age of 10 thinks this 3D looks good. Anaglyphic releases could cause a setback to the industry by giving new 3D viewers a BAD experience. 3D as a market is only as good as the brand association we build over the next few years. "Remember - red blue BAD!"
Where is all this 3D stuff going? Do I think all movies will be 3D in 10 years? No. It isn't like the pervasive changes of sound and color. In a few years, studios will ask how many of the annual 4 or 5 tentpole movies should be 3D. Hopes it will be driven by filmmakers wanting to do it because it is cool, because they will want to create their own stereo aesthetic, like a new set of colors to play with. The major animated 3D releases will, as a rule, come out in 3D. Timeless classics will be
Will it be a fad that blooms and dies like 3D in the 50s? No - this is a fundamentally different class of movies, the must-see major releases from the major filmmakers. In addition to making the movies you'd see anyway, in some theaters in vivid glorious digital 3D, and the quality will be perfect - no reason for it not to be, no crossed eyes and headaches and grumbling - fundamentally different from anything that has come before. It's possible to dismiss it as a fad, but it isn't logical to do so. How big can this thing get? As long as the incremental cost is less than the incremental box office it creates, it'll be viable indefinitely. We have to get the projection infrastructure into place; once the titles are available and the theaters are available, it'll drive the economics, and it'll improve over time as you have more theaters, which will encourage more movies, which will feed back to encourage more moviemakers to shoot in 3D, etc., is his theory. His theories so far are working on the timeline he thought.
Another fundamental difference - digital projectors can show live feeds, which represent another biz opportunity. The digital 3D cameras can shoot live 3D that is hard to distinguish from human vision - what if you could use this installed base of digital 3D theaters to take part in live events across the world? The immediacy and power of that...."think about what you could charge..." : )
The home market - those who say you can't play it at home miss the point. Pirating major release movies for home viewing is already happening. But you won't be able to watch it in 3D. You won't be able to pirate it in 3D, and even if you could, there's nothing to watch it on at home!
"We're so scared of piracy we're ready to pimp out our mothers" with day and date
if you have a successful 3d movie, people will buy the 2d version to watch it at home.
Eventually, it'll be possible to watch this stuff at home if there's another market, and it'll create a market for your film
If they buy your movie in 2d, later the 3d version will come out and they'll buy that movie again..."Am I evil for liking that idea so much?"
Even without 3D, including all the squabbling that the studios do, the wave is coming; 3D can ride the d-cinema wave, and even help drive the wave, giving people a tangible near term reason for wanting the projectors. We're past the point where the fear of change outweighs the fear of NOT changing. He's not making movies for people to watch on cellphones. And he doesn't want day and date to erode the grand theatrical cinema experience.
"We're in a fight for survival here" - maybe we just need to fight back harder, not wither away and die - d-cinema can do that for a number of reasons, d-cinema is a catalyst for 3D, and it'll get people off their butts and back in theaters where they belong.
gonna run a clip of 10 or 11 minutes from ShowWest that made some buzz for the digital 3D concept. Compilation of 3D from a number of different sources. All CG, hybrid live/3D, straight live action production. Starts out with some stuff from T2 3D that was shot on film. All kinds of stuff. That'll take us into the panel discussion.
(watched a demo of a lot of stuff - looks GOOD, but want to see what can REALLY be done with Cameron shooting some first rate action stuff - Battle Angel!)
and that led into a round table discussion, I'll post those notes in a moment. Sorry for all the typos in this, was typing as fast as I could....
NAB Digital Cinema Summit 2006 Day Two: Round Table Discussion on Digital 3D with James Cameron and Others
Geoff Burdick, James Cameron, Joshua Greer, Vince Pace, and moderator Jon Landau are on this panel.
There's some GREAT info to be gleaned in here, note the bolded section where Jim Cameron himself weighs in on the whole 422 vs 444, compressed vs. uncompressed thing. See my comments at end about it.
Raw notes below, as I took'em, typos and all:
Round Table Panel
Jon Landau is the producer working with Jim at Lightstorm
Real D guy is here
Christie and QuVis helped do this today.
50 years ago, if there was a TV, it might have been color
Now, in cinema, there have been no serious advancements in the visual presentation of movies. 50 years, no serious advancements. Something to distinguish the home vs. theatrical experience.
Tech has enabled 3 things to come together -
1.) The capture technology - no longer encumbered with huge heavy cameras - T2 3D had to have stunt players go half speed because that was as fast as the cameras could be moved. Now can do steadicam
2.) Post technology is better - can do post convergence
3.) Digital cinema experience - 3D at a higher quality than has ever been seen
Let's start with Vince Pace of Pace Technologies developing 3D camera stuff -
what we've developed is a motion controller that introduces dynamic convergence and interocular control. No longer a mathematical guess of what it should look like; it is a creative interpretation of what's desired. Very complicated gear makes it easy to do good 3D. The gap from capture to what it'll look like in a theater is getting very short
-can go up to 20 miles, can do motion control remotely; the camera system is comprised of optical block, lenses, and dynamic control at the point of capture that lets 'em steadicam, remote crane, go 3 miles under the ocean, through fiber optics; allows 'em to go to whatever - disk, HDCAM SR, record with no degradation, but keep the POINT of capture very simple and film-like. Exciting to see all those functions come together.
"3D opens the door for filmmakers to mine completely new territories" - some filmmakers may be leery of new stuff to learn and worry they won't be able to do the things they know. Everyone will do it differently and see different opportunities - Jim doesn't like to do the "poke you in the eye" 3D; he likes to create a reality beyond the screen, remove the screen to make a window into the world the movie creates. None of the core stuff of character and whatnot changes; there are a few things to learn to do good stereo, you can do normal camera movement, all the normal tools are there, lighting is lighting, there are a few little mines to avoid, doesn't take long to pick this stuff up. You have live viewing and feedback, can see on a normal HD monitor on set, have a small 3D viewing station in video village or at the engineering station, and they have also chosen to have 2K projection near the set, 50 feet away, to watch EXACTLY what it'll look like in a movie theater. No dailies, no photochemistry, etc. The projection way is the BEST way to evaluate the footage.
In order to have fun with the new medium, there are lots of new areas to play with
sound has become important over the years to make the experience more immersive. 3D when done well makes it more immersive yet again.
Editorially, how does it work? Do you edit in 2D or 3D? Current plan is to cut in 2D on an Avid as you would any other film, and do an ongoing conform with the other eye so you can check the 3D at any point. Is there a difference between 2D and 3D screening? They did a year of testing to learn the language. Adapting a Terminator film for 3D - those are rapidly cut action movies - can you cut that fast? YES, but you have to control the flow of the audience's perception of the stereo; most is done at acquisition to prepare the audience. The basic principles of cutting don't change. The blizzard of super fast action cuts might want detuned 3D that doesn't rely on the stereo space. Shots work so well in 3D you tend to want to linger the way you might linger on a gorgeous master shot, as with Titanic
Cutting speed of Titanic was about half of other stuff - Terminator 2 had some # of cuts at about half the length
VFX and a lot of great production value/design - want to linger on those shots and enjoy that world, but don't hang so long that it'll get too slow in the 2D version.
When he (Jim) talked to Peter Jackson, he'll do it completely differently than Cameron will.
There haven't been new tools in a long time - it's exciting (cameron)
Vince Pace - is shooting 3D different? when shooting 3D, embracing all the layers of something forgotten in 2D experience - a good shot is a good shot at the end of the day, the 3D shot can be a more entertaining shot since there is so much more - as a DP, you use lighting to create layers, in 3D it is a big difference since that work is done for you. At end of day, don't leave any of the tools at home - you gotta bring all those to the table, but you need'em to make a compelling story.
Josh Greer - nobody's looked at as much 3D as him lately - RealD comes in after the hard work is already done. They come in wearing the RealD hat - wanted to make sure the presentation was as high quality as possible - make sure eyes are in sync; they come in and talk about what happens with the pipeline for 3D, how to color time for 3D; light is a big issue, so take that into it as well. Not trying to get out of mastering, but want it to be a "do no harm" kind of a thing to keep the integrity of the artist's vision and keep consistency from room to room.
Jim - how does 3D play into creating VFX - two parts of that - you don't have to make a $150M movie to justify 3D; in the first few years 3D will be driven by tentpole movies. The cost of shooting 3D isn't that big a bump, but doing FX is a bigger bump - compositing FX in 3D is more complex, adds a layer of complexity to the project - it has to work in Z space as well as X & Y - it isn't a showstopper (it isn't twice as hard or expensive; it's gonna go up by some factor depending on the FX you're doing) - there's a bell curve of expense. Base cost plus the kicker to go into 3D - that kicker for 3D falls on a bell curve. If you have a pure live action film, it is relatively inexpensive. If pure CG, relatively inexpensive to just render all shots twice. Mixing CG and digital FX and live action is the middle of the bell curve. Most of your big FX movies are going to fall into that category. Jon and Jim's movies over the next few years are in that category....but fortunately they could be the biggest earners. If a movie does $400M in 2D, but $450 or $500M in 3D, you've covered the costs in spades. And of course, as with all tech, as more people do it, economies of scale kick in and costs come down. All VFX companies except for a few are doing their first 3D, and it is harder. Once they know how to do it, it is easier and less costly. First time takes longer and costs more.
Above the line - set, props, wardrobe, etc. aren't affected. Cameras, VFX, post are the only areas affected. On a big picture, the post budget is not affected that much. FX budget will go up somewhat. Lots of greenscreen of them inside subs. Were doing those comps for $1500 or $2000/shot - good, high quality 3D greenscreen composites, done efficiently. On set, can see the composite in realtime, and can see the composite in 3D by end of day before you strike or leave the set.
portability of gear - on the acquisition end, lots of improvements - an SR deck that handles two data streams that are locked together (SRW-1) - any dailies that they need to see immediately are easy - just take the tape stock, rewind it, and play it back instantaneously for production - they are trying to save time - making cameras more expensive up front, but as far as efficiency on set, it has gone way up. It hasn't been tracked, but the point of having all the creatives on set together and showing 'em the footage on set the day of the shoot is big - the savings later on are better. More and more decision making on set rather than in post. After the cut is made, is the filmmaker still involved?
Josh Greer - at end of day it's about filmmaker involvement - the hope is they build trust and will trust it'll be consistent screen to screen - a lot of variation and a lot of systems out there - it has to be a premium experience. If anaglyph were gonna work, it would have by now. If the experience isn't great to your eyeballs, best case it is uncomfortable, worst case it is painful. Most work is done for us with servers and projectors, which are incredible. First epiphany was seeing it in Cameron's screening room, and realizing TI was going to roll out zillions of these. "My career turned on a dime at that moment."
In 2003 were 29 films over $100M. In 2004 went down to 24, in '05 went down to 19. We need to start presenting things that people can't get at home. For those skeptical folks, now is the time to embrace all that.
audience question - when we get into "normal" films for 3d, how is that going to go if it isn't family friendly fare? "It's gonna scare the crap out of 'em" - (Cameron)
Whatever you're trying to do, it's more in 3D. The clock rate in your brain goes up, you're more engaged and involved; the audience is committing more of themselves with 3D, you're gonna be more there. People will get more out of 3D, be more scared.
Once the infrastructure is there, and after the initial "ride film" types happen, after the tent pole stuff, people will start to experiment. You're not going to shoot My Dinner with Andre, but think about Titanic minus the boat sinking - it would have been beautiful in 3D.
What did color do - it brought you more into the experience - this is the next logical thing. (Greer)
Q: in film school - in 50s film industry had problems with an invasive home tech - TV. 50 years ago 3D was said to save the day, but it didn't do it. A: Cameron - 3D didn't work in 50s. What are we supposed to do, shrug and walk away? No, we can fight back and use this 3d, all the pieces are in place to do it.
3d doesn't represent what he tries to do on set - recreate human vision - people feel like they are more THERE (Pace)
Q: Vince - Sony F950 dual stream stuff - playing back in realtime on set - is the dual stream 4:2:2 or 4:4:4? One or two decks? Syncing decks?
-not trying to make a statement about what to do
-is popular to do 4:4:4 for L&R; can master on 4:2:2, piggybacking decks - all are possible. The neat thing is that depending on your needs, we can accommodate that very easily on set. Can do SRW-1 4:2:2 synchronous on one tape, and do two decks with 4:4:4; also can do digital disk drives - as for compression, it's up to the budget and the needs of the shoot (post heavy needs less compression).
Cameron on 422 vs 444 - you can do anything, can support whatever you want; filmmakers and VFX folks can do whatever they want - uncompressed is an awful lot of data (even HE says that!). Comparing HDCAM vs. SR 422 vs SR 444 in tough greenscreen composites, like water flowing in front of green, blowing smoke, etc.: the old 7:1 HDCAM format - not good (but Cameron used that for all his stuff to date); the SR format does MUCH better, with almost no difference between 422 and 444 for composite purposes... James is going to do 422 dual stream to an SR deck and play it back immediately. Unless there's a very very specific reason to do 444 for a very picky plate, he'll do 422 for his stuff.
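For a rough sense of why even Cameron calls uncompressed "an awful lot of data", here's a back-of-envelope sketch of uncompressed HD data rates. The 1920×1080 / 10-bit / 24 fps parameters are my assumed typical values, not figures from the talk:

```python
# Rough uncompressed data rates for one HD stream. Frame size, bit
# depth, and frame rate are assumed typical values, not from the talk.
WIDTH, HEIGHT, FPS, BITS = 1920, 1080, 24, 10

def gbit_per_s(samples_per_pixel):
    # 4:4:4 carries 3 samples per pixel (Y, Cb, Cr);
    # 4:2:2 averages 2 (Cb/Cr are shared between horizontal pixel pairs).
    return WIDTH * HEIGHT * FPS * samples_per_pixel * BITS / 1e9

rate_444 = gbit_per_s(3)  # ~1.49 Gbit/s per eye
rate_422 = gbit_per_s(2)  # ~1.00 Gbit/s per eye
print(round(rate_444, 2), round(rate_422, 2))
# A dual-stream 3D shoot doubles either figure.
```

So 4:2:2 buys back a third of the bandwidth of 4:4:4, which is part of why it fits comfortably on a single SRW-1 tape while 4:4:4 dual stream pushes you toward two decks or disk recorders.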
another panelist - they go through the decks and go to QuVis to record to get into an NLE environment as quickly as possible
-in 3D, he couldn't refocus his eyes on stuff that is out of focus. In the stereo experience in a theater, you have to decide where the audience should be looking, at the bottle or the crowd. If the audience is not looking where I want them to look, they'll be looking at the non-fused, unconverged spot. Have to direct the eye to the depth of field AND the convergence. The part of the shot that is out of focus is the Pips; focused over here is Gladys Knight. Gonna use shadow, gonna use focus and convergence to draw their eye where you want it. Depth of field, lighting, and the stereo experience all work together. That's the neat thing when doing cinematography for 3D.
Q: 15 years ago saw IMAX 3D - haven't talked about audio; what's in the future to make audio more complementary to the 3D? And what'll happen to IMAX 3D for all those theaters if all that is true? (Don't want to touch that second part, says Josh Greer) - the challenge for 10.2 audio and other stuff like that is that it is EXPENSIVE. Who's gonna pay for this stuff? Greer is wary of pushing for too much at once - there's only so much the exhibitors can pay. Exhibitors lost their shirts going through a major conversion to stadium seating. CAMERON: 5.1 is the standard and it is enough for now. You're more aware of the spatial aspects of the sound when you're watching 3D; it's a psycho-acoustic thing when you're watching 3D stuff move around, you're more aware of it. Right now sound is ahead of picture - "At a certain point you're mixing for dogs" - as for IMAX, there's only so much feature length content to show in 3D, and 45 min docs, and that's been a stable market for the last 20 years, but they've only ever played one feature that worked well. If filmmakers don't want to make a movie for 50 or 60 theaters, they would rather do 3D for thousands of 3D theaters and tens of thousands of 2D theaters. IMAX will still be a premium experience, but there will be more choices out there. Digital cinema will make 3D ubiquitous, so hopefully a rising tide will raise all boats. There's enormous startup torque - the more contentious things are in the 3D subset, the more it'll stall out and hinder the process. What's good for 3D is good for me and good for other filmmakers. He helps other filmmakers learn about the stuff.
End raw notes
Mike's commentary - wow, there's so much I could say. I'll just start with a coupla thoughts:
-when Jim was talking about 422 vs 444 vs compressed, I also find it interesting to note that in some circumstances, uncompressed can be CHEAPER than compressed tape formats, but you pay a substantial penalty in size, weight, power requirements, etc. But if you're on a big enough shoot to require a video village, that becomes moot.
-3d can't be bootlegged in theater, can't be distributed as 3d, can't be shown in homes. Helps the piracy thing - can't be easily replicated at home any time soon, like 10 years or more, if ever.
RAW notes begin:
NEXT PANEL - Digital cameras from $30 to waaaaaay expensive
if it has to look like film....shoot film.
-all else being equal, but all else is never equal -
his slides are on the NAB site already, so go find and link!, also on Entertainment Technology site
-lighting has a huge effect on the final image
-lens has a huge effect on the image
-imager is what most talk about
-camera image processing
-storage - has no effect except money, size, time, wait, life, labor, and labelling
image processing - if you plan to shoot and do everything in post, it DOES matter - filtering can help - the Tiffen Ultrapol helps - many things can't be fixed in post, so know better!
-prism prevents direct use of film lenses
-can use 'em indirectly with relay optics (but it builds a bazooka!)
-imager size affects depth of field and shot size
-in conjunction with resolution, diffraction limited res, dynamic range, etc.
-resolution is not so important in terms of detail - people see sharpness
-contrast is seen before resolution?
Foveon - does anybody use it?
Mark Schubin, Digital Cinema Sumit, 2006 April 23 <===google that
most cameras are using Bayer mosaic, in either square or diagonal
-Panavision Genesis does stripe filters, sorta like a Trinitron but for acquisition
MTF - Modulation Transfer Function - Modulation is light to dark, transfer is change, function is a chart or measurement of it
-the area under the curve of MTF is important. The square of the area is how much sharpness you get.
-Combining vision and technology - the wedge on the left is all that counts - you can't see the rest!
-Imager sizes - lp/mm is line pairs per millimeter
-The Format Factor - divide the dimensions of the format -
Diffraction is the real killer - low lp/mm is not a big deal, high #s are a big deal.
THIS IS A REALLY GOOD PRESENTATION...THAT I BARELY UNDERSTAND
the smaller the sensor, the better a lens it needs
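On that diffraction point: the standard cutoff for an ideal lens is 1/(wavelength x f-number), which shows why small sensors (which need high lp/mm) run out of headroom fast. A rough sketch - the sensor width figure below is my own ballpark, not from the talk:

```python
def diffraction_cutoff_lpmm(f_number, wavelength_nm=550):
    """Diffraction-limited cutoff frequency (lp/mm) for an ideal lens:
    cutoff = 1 / (wavelength * f_number). 550 nm is mid-green light."""
    wavelength_mm = wavelength_nm * 1e-6   # 1 nm = 1e-6 mm
    return 1.0 / (wavelength_mm * f_number)

# A 2/3" HD sensor is roughly 9.6 mm wide; 1920 pixels across that is
# ~100 lp/mm at Nyquist, so stopping way down starts to cost you resolution.
for stop in (2.8, 5.6, 8, 11, 16):
    print(f"f/{stop}: ~{diffraction_cutoff_lpmm(stop):.0f} lp/mm cutoff")
```

At f/16 the cutoff is already in the same neighborhood as what a small sensor demands, which is the "smaller sensor needs a better lens" point in practice.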
Presentations from 4 cinematographers
Daryn Okada, has over 30 credits, Lake Placid, Dr. Dolittle 2, Cradle 2 the Grave, Mean Girls, Stick It
Daryn - came here a coupla years ago with the StEM footage
showing a project that Howard Luck had put together with the Media Technology Board of Disney - the goal was twofold: to look at available digital cameras (he's going to show some behind the scenes stuff), and to help creatives and execs understand some of the technology. Then some shots from dailies will play unidentified, and afterwards he'll tell us which camera was which.
they shot for 2 days, very ambitious, in December, in the field - had to do it in two days under the same pressures as actual production. When he heard about it, he thought it'd be fun: 2 days prep, 2 days shooting. Had November open to prep - it took about 6 weeks to put this together. Explored each camera for a week - don't just go out and wing it and connect it - rescouting of locations like they would do on a feature film. The results were pretty surprising. Learned some things in the attempt. We're the first to see any of this; he saw it Wednesday after it was all put together. It isn't even one week old.
D20, F950, Dalsa, & Viper
shot tough shots in realistic conditions, they lit like they would for film, used same dollies and stuff
the live feedback is huge - the tools should match the workflow
each will work perfectly for a certain type of show
Howard Luck says it looks ready for a big show. Contrast and color rendition are still an issue - brightest brights and darkest darks is the challenge.
coming "real close" to traditional 35mm film according to Howard Luck (tech head of Disney stuff)
film is not dead and won't be for a long time, digital is not a replacement for film - on some shoots you may use a combo, or digital is better for one production and film for another.
knowing the tech and knowing what it can do for you
Dailies in a 709 color space, and trimmed up to fit on a Christie projector
Viper loooked good
I like Viper, then Dalsa, then D20, then 950....hmm. 950 did pretty good...nah,
ignore that ranking
wow - the crispness of the tree shot done on the Dalsa is INCREDIBLE
showing 5218 film - looks pretty much the same!
NO secondary color correction - no windows or dissolves
5218 is awfully nice, but not miraculously better
meant it not to be a shootout - more to see how fast they could work with them in the field. The manufacturers were very generous, everyone helped each other out.
Technicolor let'em color correct it, QuVis and Christie very carefully set up here to screen it, Howard Luck and Walt Disney let'em borrow the footage
QUESTION TO ASK - WHAT TO RECORD ON FOR EACH OF THESE?
Curtis Clark, ASC next preso - will show an example of a commercial shot with Genesis
-the Genesis was selected because the director, Victor Garcia, was fascinated with using a digital camera in a context it's not usually used in. Car commercial in the desert, downtown LA, Santa Monica, day and night, 4 day shoot w/2 day tech scout and prepping. One of the first commercials to use Genesis instead of film. Didn't have time to do much, got some test footage; wanted to use the ColorStream system at eFilm using the LUTher box, or something similar, which can simulate different looks - with the box, the 24" Sony, and a waveform, he was able to test stuff. He'd shot D20 first; this was his first Genesis shoot
Audi A4 commercial - last shot of the car used a Russian Arm remote control device that goes on an SUV - just need to not reflect off the shooting car. First time a Genesis was mounted on such a device - reliability was great - zero problems. Challenges as DP - the contrast in all these scenes. Digital can't be expected to have the same dynamic range as film - to keep relevant shadow and highlight detail, he had to use the waveform and stay on top of it - he wasn't operating, so he had the luxury of over-riding the f-stop.
Lack of frame rate control is an issue elsewhere, but wasn't so big a deal. Used Primo lenses that he usually does with film. Never intended
While some have concerns about using that camera, in this case it worked flawlessly.
(It's awfully interesting that CAMERON, by the way, is a big proponent of speed and convenience, and is willing to sacrifice some image quality for the payoffs.)
NEXT SPEAKER - THOMAS ACKERMAN - ASC - in addition to music videos and commercials, did Jumanji, Beetlejuice, many since then, just completed Scary Movie 4, will show us a little of Scary Movie 4, which was shot on Genesis
Early conversations were about film, and it wasn't until a month after he was hired, having dinner w/producer and director, that the question was raised. He would have been resistant had he not done a project where a buddy was raving about the Genesis, so he decided to do a side by side test: F900, film, and the Genesis. "Immediately apparent this was a formidable imaging tool, so off we went." Short prep period - adequate for the movie, but not for digital cinematography. Technicolor was doing their "lab" work - most of their testing was not for vetting the camera, more to do with workflow through the lab. Picture opened through the weekend, here's some clips
The nighttime stuff looks surprisingly good. The last nighttime sequence shows well "the phenomenal dynamic range we as cinematographers have to work with now"
-workflow on set was pretty much as expected. "I think we've reached the critical mass where photographers can choose to shoot digital, and the next discussion in the months and years to come is what happens to all this once it gets into the post production workflow - there are a lot of wonderful things we can do, and the way to maintain the authorship of our material is to maintain a certain authority over how it is handled in the post production workflow." - a priority for the ASC
Next up - David Stump - talking about shooting and using the Viper -
feature that he made for Mars Callahan called What Love Is, a romantic comedy. He wanted to be able to do very long takes, get all of his coverage on a short aggressive schedule, be able to coach the actors, and do fast paced overlapping dialog; that dictated multiple cameras (4 Vipers) for the project, and in order to accommodate long takes, recording to SRW tapes, so Mars could coach the actors and whip them along - go 10 minutes, stop, say "pick it up 25% and redo it"
Did LUTher boxes to determine print lights on set to do color correction for monitoring on set,
(see pic for workflow)
...and it looked GOOD!
15 or 20 pages of dialog were done, in a row, in a short period of time. Tape with long running loads, 10 minutes on a mag wouldn't have worked.
That, and they could body-mount the camera because it is a 6.5 pound camera - for steadicam, crane, etc. It took beautiful pictures; on-set monitoring and color correction made a very efficient workflow to achieve the needs of his director.
Mark Schubin will moderate:
-I asked Daryn about shooting modes - they recorded everything as best they could - Filmstream, uncompressed, etc. Daryn said something about uncompressed not working out of the D20
-eventually we're going to head for 4K said one DP
Is the metadata you capture going to make it to the DI session in post? The ASC CDL (Color Decision List) establishes the basic lift/gamma/gain settings for look systems, translatable between different color manipulation devices without getting messed up in the migration.
When shooting film, there's a certain native color characteristic that goes with it. The color space aspect of digital cameras matters if print is your primary reference for release - 709 color space just doesn't cut it.
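For reference, the ASC CDL mentioned above boils down to a simple per-channel transfer function - slope, offset, and power, roughly the gain/lift/gamma trio. A minimal sketch, with the clamping behavior simplified:

```python
def cdl(value, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL per-channel transfer: out = clamp(in * slope + offset) ** power.
    Values are normalized 0.0-1.0; applied independently to R, G, and B."""
    v = value * slope + offset
    v = max(0.0, min(1.0, v))   # clamp before the power (gamma-like) step
    return v ** power

# The point of the CDL: these three numbers travel with the footage, and any
# compliant device reproduces the same basic look from them.
print(cdl(0.5, slope=1.2, offset=-0.05, power=0.9))
```

The portability is the whole idea - the same slope/offset/power triplet should survive the trip from on-set monitoring to the DI suite.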
-Starting at 1:30p - Jim Cameron and John Fithian discussion (drat, which I will miss most of)
Saturday, April 22, 2006
So NAB starts on Monday. Apple is NOT having their usual Sunday press briefing. Part of me suspects it is because this is the first year I have a press badge, and The Universe Is Trying To Thwart Me. Then I remember to take my meds, and that also conveniently stops the little voices in my head telling me to clean the guns repeatedly.
But I digress (as you do). Back on track -
The rumors are flying that the new 17" MacBook Pros will be announced Monday. Or at least shown. I dunno - it'd be weird to have a new product just show up without a press event. In any case, one thing that is of extreme interest to me, but that hasn't been on the rumor radar, is this -
Readers Ask: What's up with Final Cut Pro 6, or whatever's next?
Mike Answers: I don't think Final Cut Studio 6 (or whatever they'll call it) is going to ship until this summer, possibly the fall. I think we'll see it demonstrated at NAB 2006, much as they showed Motion the other year and didn't ship it until October or so. So we'll know most of the features, but won't be able to lay hands on it for months. I think it'll be a paid upgrade as well, since the v5.1 upgrade was so cheap, and the crossgrades so cheap/generous. V6 - a full boat upgrade with LOTS of new features, such as (I hope) 10 bit 4:4:4 support, maybe log for 2K, more/better HDV support for the JVC and Canon 24P/24F modes, etc. And here's why:
Clue One: Final Cut Studio 5.1 is already shipping.
Final Cut Studio 5.1 shipped just a few weeks ago - why ship it then if that's all Apple's got to show off at NAB? Why not ship it at show and hype it up? You never want to go into a big tradeshow and say "Gee, look at all this stuff that we shipped three weeks ago..."
So they've got to have SOMETHING to show - and you don't give away your trade show ammo three weeks beforehand.
Clue Two: There are sales incentives particular to Final Cut Studio that end in June and September that are SPECIFIC to Final Cut Studio version 5.1
Several sources pointed out a recent sales promotion that gives benefits for buying Final Cut Studio with a decently fast Mac, running April 4 through June 26th. It is not specific to the version of FCS:
To celebrate the release of Final Cut Studio v5.1, Apple is kicking off a promotion that will run from April 4, 2006 to June 26, 2006. Customers who purchase a qualifying computer (an iMac with Intel Core Duo chip, MacBook Pro, or Power Mac G5) and Final Cut Studio (MA285Z/A) will receive an instant $300 savings. Visit http://www.apple.com/promo/ for full details.
For specifics, see this page on Apple's site.
Well, maybe that's not such strong evidence after all - not specific to the version. BUT...the part # they refer to is the version 5.1 SKU #. Then there's this that I was forwarded, that was sent out to resellers:
...offer all authorized resellers a special sales incentive reward for their sales associates. Individual sales people who sell five copies of Final Cut Studio v5.1 (MA285Z/A) may receive an iPod 30GB (MA146LL/A-Black) for their efforts. The sales incentive will run from April 4, 2006 to September 30, 2006. A maximum of three iPods may be claimed by each salesperson during the program timeframe. Please view the program terms and conditions on the claim form that may be downloaded on Apple Sales Web.
OK, now we're talking about a very specific version of Final Cut. And hmm...while I could see them running this promo beyond the FCS 6 (or v5.5 or whatever) launch date, the extremely distant end date for the promo, combined with the pre-NAB release of version 5.1, makes me think that it's going to be a while before we see FCS 6.
Part Three: Coupled with the lack of a Sunday Apple press event, and the scramble Apple must have been going through to get Intel native software and hardware to market, makes me think maybe they DON'T have huge new goodies to show us. I hope I'm wrong, but I wouldn't be surprised (just disappointed).
I don't have any hard evidence, just looking at tea leaves here. Just speculating based on evidence floating out there in the aether...but I'm not counting on, nor recommending to any clients to count on, having FCS 6 in hand for stuff they are working on this summer....
We'll all find out Monday.
A quick question: I've got a newfound realization that portability would be nice (going in and out of an office for the next few weeks, working there and here)...has me thinking quite seriously about a MacBook Pro instead of waiting for a desktop later this year. Obviously, a highend desktop is going to be more powerful, but I've read such amazing things about HD-capable Final Cut on the new MacBook, with the whole Boot Camp thing it's kinda got me wondering. (I'd love to see how well the DuoCore handles, say, Softimage XSI on the Windows side....)
Is the biggest down-side you see to this the notion of not having a PCI-x slot and the whole uncompressed in, via AJA/BM/or similar? I'm not sure how much uncompressed work is in my immediate future anyway...but I'd love your take on why else this might not be a good idea. (I can see the obvious advantages of portability, plus a few clean disadvantages....just an idea I'm debating. If you've time for your 2 cents, I'd appreciate it....)
...which I interpreted as basically boiling down to this -
What if, instead of waiting for a tower MacTel this fall/winter, I went ahead and got a MacBook Pro (or the 17" that is rumored to be announced Monday), and just used that as my primary machine? For Adobe products, I could just run Boot Camp and use'em under Windows when I needed speed.
Hmm, a very interesting question indeed. Here's what I sent back:
Hey Zane -
I'm sitting in the Austin airport waiting for my flight, so I've got a little time to write this out.
personal junk excised from email
Now, portable work:
1.) Expandability - short of an eSATA card in the Express34 slot, FireWire 400 is your fastest connection bus on that machine, which is a bit of a bummer. RAM costs more, internal hard drives of reasonable capacity cost more. 160 GB is biggest laptop drive I've heard of, and cost/GB is higher.
So the internal is just a boot drive and a little bit of project files, any "real" media files of substantive size probably need to go on a FireWire or eSATA disk.
RAM costs are much higher too, and your expandability is limited. While Apple might list a 2GB max RAM capacity (I think that is right, I'm just guessing), the price on those modules can be shocking - double check it for yourself either from Apple or from crucial.com or other reputable vendor.
Monitors - you can drive a large monitor, I think even the 30", but that's it - the laptop monitor is your second monitor. That can be a bit limiting.
2.) Functionality - you've got decent processor speed, you've got the ability to drive a second monitor. BUT....especially for 3D stuff, that graphics card isn't full-on compared to modern desktop GPUs. For 3D, that's a big deal.
3.) Compatibility - not all software is Intel native. Especially of note: After Effects. Performance is sloooow under Rosetta. And AE plugins don't work in FCP 5.1 on Intel. So that can be a limiting factor as well. There IS NO Intel based AE plugin API, nor will there be until sometime next year.
4.) Dual Boot - so if you ARE going to dual boot, you CAN use Windows After Effects, Photoshop, and Illustrator for snappy performance. But you're having to reboot and take minutes to switch modes. Definitely a slowdown for me where I'm used to keeping everything open at once and just bouncing around as needed.
...but that also implies buying Windows, and buying the Adobe Production Suite
And now, having committed a partition to Windows stuff, you need room for the full install plus the data. I have yet to hear whether you can write to the "other" partition when running either Windows or OS X - if it is a virtual Firewall, how are you going to share files? Oh yeah, there's that FireWire drive, you'll use that for data, right?
...except that if it is PC formatted (FAT32), you probably have a 4GB file size limitation. And if it is Mac formatted, when booted under Windows you won't be able to see it without additional third party software, such as MacDrive. This is actually what I'd recommend.
So if you had the MacBook Pro (and the rumor is that the 17" will be announced at NAB, maybe), AND WinXP, AND maybe the eSATA card and drive, AND Windows Adobe Production Suite, AND Final Cut Studio 5.1, AND a FireWire drive, AND MacDrive software, AND an LCD panel that you could tote around and use as your "real and big" desktop, AND an external full size keyboard, AND you upgrade the RAM and hard drive enough for production purposes and Boot Camp, THEN you've got yourself both a dual boot system that is a decent desktop replacement.
The things you won't be able to do:
1.) Have HD-SDI input/output (you could do uncompressed SD with an AJA Io breakout box, which is FW400 based, but then it needs to "dominate" the bus, so you gotta use eSATA storage)
2.) Have seriously high disk throughput unless the Express34 based stuff is good.
3.) drive two big monitors
4.) Add any PCIe card for whatever you might need
5.) add cheap, big, internal drives
That's all I can think of right now, I'm sure there'd be more. Wanna add that up and see what it costs?
That's how I see it so far, or at least as I saw it sitting in an airport. OK, now I'd like to hear your ideas, folks - what do YOU think? Click on the comment link below and chime in.
And if anybody has time to add up the costs of what I'm laying out here, please let me know, I want to know but haven't had time to add it up.
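One bit of back-of-envelope math on the uncompressed-over-FireWire point above (my own rough numbers, not measured figures) - it shows why the FW400 bus gets "dominated":

```python
def uncompressed_rate_mb_s(width=720, height=486, bytes_per_pixel=2, fps=29.97):
    """Sustained data rate for uncompressed 8-bit 4:2:2 video, in MB/s.
    Defaults are NTSC SD; 2 bytes/pixel is the 4:2:2 average."""
    return width * height * bytes_per_pixel * fps / 1e6

# FireWire 400 is 400 Mbit/s = ~50 MB/s theoretical; sustained real-world
# throughput is noticeably lower, and the same bus may carry your storage.
fw400_theoretical_mb_s = 400 / 8

sd = uncompressed_rate_mb_s()
print(f"Uncompressed SD: ~{sd:.1f} MB/s of a ~{fw400_theoretical_mb_s:.0f} MB/s FW400 bus")
```

Roughly 21 MB/s of video against a bus that realistically sustains maybe 30-something MB/s - hence the eSATA card for the media drives.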
If I'm reading this right, this is way cool -
-Flip4Mac now has PPC and Mactel versions.
-they're adding support for importing footage from Grass Valley Infinity cameras, which record to disk, as well as K2 media client and media server systems (I don't even know what those are, guessing stuff for Infinity?)
-initial support is for DV25 and uncompressed audio file transfer into FCP for editing
-additional format support is planned
It's VERY nice to see prompt support of MacTel hardware, even though the pro model desktops aren't even expected until later this year.
No Photoshop native on Intel Macs until sometime in the first half of 2007. And when they say first half, you can bet it is towards the end of that timeframe. If it were going to be January, they'd say early 2007. So - May/June it is, I'll betcha.
Oh, and for proof of the "lazy developer/small markets" argument about some apps only being available on Windows, and developers telling users to use Boot Camp? It's already happening - Chizen, head of Adobe, said that Mac users that want to use FrameMaker should use Boot Camp - they aren't going to make a new Mac version (nor have they for some time I think anyway). Hopefully, this will be a "keep it alive, sort of, on the Mac" solution to the problem of lack of availability rather than "we dropped Mac support because Boot Camp is there" kind of a thing.
First Blu-Ray computer drive, and it reads AND writes, even to the 50 GB disks.
Movie playback isn't possible yet - requires software that isn't done yet. And there will be Windows drivers, but no Mac drivers yet.
Japan price around $850 US equivalent, so it's pricey. Maybe these will be an option on the Intel based Mac desktop computers I'm hoping will ship this fall? Or at least from another manufacturer if not Panasonic?
These are some of the first plugins to ship for Intel based Macs. Very curious to see how fast they are in comparison to their G5 counterparts. Twixtor is one of the best bang/buck retiming tools out there, so it is great news that they have an Intel native version. RealSmart Motion Blur is pretty cool, too.
There are some issues with Intel and PPC file compatibility, so read up on it before diving in.
Synthetic Aperture has announced the Colorociter Colorist's Workstation, a four joyball hardware panel to control lift/gamma/gain and shadow/midtone/highlights when using Color Finesse software. Additional knobs and a 2 line alphanumeric readout complete the device, which connects via USB to your computer.
A shame that FCP doesn't seem to be supported by the device. It's about $4000 US.
I'd been really looking forward to this and Color Finesse 2.1, but it turns out that it has some major, major holdbacks:
-it does NOT play back in realtime
-you can definitely look at first/last frames, but I don't think you can scrub
-no windows or vignettes - as in no soft selections for color correcting only part of an image
These factors, on top of the price of the software, make it something I'm suddenly far less interested in, especially since Final Touch HD has come so far in the last year.
Perhaps in time, features such as realtime playback, vignettes, and scrubbing can be added that will make it a more competitive product.
Of course, I may be wrong about some of this - much of the commentary above comes from comments posted on my site from when the software,
Some tidbits while I catch up on mail - it appears Apple is underclocking graphics cards in the MacBook Pros in order to extend battery life: 35% on graphics card bus speed, 41% on memory, according to the AppleInsider article.
I'm not panicking about it - no reason Apple (or someone else, for that matter, hint hint third parties) might not come out with a Preference Pane/setting that would allow you to crank it up, either all the time or only when connected to a hard power source, much like the processor speed can be ramped depending on whether you're plugged into AC power, or the screen dims on battery power, etc.
"Some daring MacBook Pro users successfully used the third party ATITool software to uncap the full potential of the ATI chip. They found it reduced the battery life of their MacBook Pro by about 30 minutes, but did not over heat the notebooks or cause any other side effects such as display artifacts.
However, one user said the uncapping "took only couple of seconds" to cause the system's cooling system fan to spin at a speed he "never experienced before.""
(from the article)
another year, another Digital Cinema Summit. As usual, here are all my raw notes, as I took'em, high speed typos and all:
NAB DIGITAL CINEMA SUMMIT:
Overview of day:
-Symes is moderating today instead of Hobson
-New Projector Technology
11am - Post Production side of Digital Cinema - distro package etc.
1:30 - 3D
3:30 D-Cinema in practice
5pm it's over
Pete Lude - heads up Solutions Engineering for Sony
-their job to find solutions, software integration, etc.
-he works on DCI implementation stuff
-first commercially viable cinema projector was about 100 years ago
-pivotal year for 2006 - over 300 US auditoriums equipped with digital projection up and running
-he toured a bunch of SF projection facilities
-variety of infrastructures and environments that this stuff is deployed in, 16-24 screens/theater facility, fiber or high speed copper; the Metro on Union in SF, an art deco theater, had a 45 year old Philips projector that is still up and working
-will the projectors that go in today work as well in 45 years? (I sooooo doubt it)
-starting off w/overview with industry experts
-Pete Putman - president of ROAM Consulting (Pennsylvania), tests/consults for all kinds of display tech, HDTVexpert.com is his site (I should check that out), has written extensively for a ton of magazines, etc.
Cinema Projector is easier to define now - main projector for screening feature films, compliant with DCI 1.0, secure system, automated operation
-can also be used for ads, promos, etc.
-not technology specific, but standards specific
2048x1080 or 4096x2160 res
corner to corner uniformity of 85%
14 ft-L brightness
-white point .314x, .351y
-2000:1 minimum contrast (on/off)
-150:1 intra-frame contrast
-gamma of 2.6
-you want smooth gradation from black to white, but that is the hardest part (the grey)
-a flashlight has a 4000:1 contrast,
-30K lumens in DLP
-average contrast 400:1 DLP, 250:1 LCD/LCoS
-12 bits at LEAST is necessary
DCI requires 14 ft. lamberts
foot lamberts x 3.42=nits
how to figure if there's enough light on the screen?
crunching the math: a 7.6 foot screen needs 230 lumens, twice as big needs 890 lumens, 384 sq. ft needs 3600 lumens (in a dark room)
-60 foot diagonal screen needs 14,600 lumens from the projector
-"the brightness requirements pile up in a hurry as the screen gets bigger"
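The math above follows from the standard approximation lumens = ft-L x screen area / screen gain, plus the 3.42 nits conversion quoted earlier. A quick sketch - the example screen size here is my own, not one from the talk:

```python
def required_lumens(ft_lamberts, screen_area_sqft, screen_gain=1.0):
    """Projector lumens needed to hit a target screen brightness (dark room).
    Standard approximation: lumens = ft-L * screen area / screen gain."""
    return ft_lamberts * screen_area_sqft / screen_gain

def ft_lamberts_to_nits(fl):
    """Conversion quoted in the session: ft-L x 3.42 = nits (cd/m^2)."""
    return fl * 3.42

# Hypothetical example: DCI's 14 ft-L on a 40' x 17' screen, unity gain.
print(required_lumens(14, 40 * 17))
print(ft_lamberts_to_nits(14))
```

Since lumens scale with area, doubling the screen's linear dimensions quadruples the light you need - that's the "piles up in a hurry" point.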
-light bouncing off the stuff in the room and bouncing back also
-instantaneous contrast in your eyeball is about 100:1. Get older, and it drops to the low 60s
-we need more than 100:1 since our eyes adjust on the fly
-wide field of view is important in a darkened room, to avoid eyestrain from seeing that bright screen against a dark background
-ambient light degrades contrast in a big hurry
We need so much horsepower in order to:
-best grayscale performance not usually at max brightness - usually lower, need room to avoid crushing steps
-headroom needed for top/bottom
-lens apertures affect light outputs
-is easy to crush top or bottom
-if you push a projector that's not powerful enough to get bright enough, likely you're blowing out the highlights
-should be able to get bright enough, contrasty enough, AND be able to get all shades of grey too without crushing or blowing out
-color and white balance:
DCI gamut is MUCH bigger than 709 (HDTV)
-LOTS more green in DCI
-blue is about the same
-DCI has more red
CRT displays "pure" RGB color since we can balance the phosphors
-LCD/LCoS/DLP projectors are dependent on light source, and that has a color temperature
UHP/UHE lamps are cheaper, but are tougher to balance
-Xenon is much more costly, purer color, doesn't live as long either
-35mm projector w/Xenon has a spike in it
(See pic 1)
-lamps have 1000-3000 hours, 50% of lamps will last to 6000 hours, no guarantee, some lamps can burn out after 500 hours, poor consistency
-Xenon doesn't last as long, a bit more predictable
Lasers - coherent light source, high light at low wattage, balanced spectral energy, long life, but has problems with speckle, has safety issues
-lasers have been problematic - have to diffuse the light, and blend'em together to get color.
Stuff to consider:
DLP
LCoS
high temp polysilicon (HTPS)
other (lasers, etc.)
-1st 3 can do 2K, and 10K lumens or more
DLP uses little mirrors - digital micromirror device - which can be cycled super fast using pulse width modulation. They only make grayscale images, so you need 3 for R, G & B. Smaller systems use a color wheel, bigger ones use 3 chips. Can do different aspect ratios and can be anamorphic
-Pulse width modulation - two positions, on and off. The grayscale value is determined by how many cycles it is on in a given interval. More ons is brighter, fewer ons is darker.
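That pulse width modulation scheme can be sketched in a few lines. Note this is a naive even spread for illustration - real DLPs use binary-weighted bit-plane timing, not one on-block per frame:

```python
def pwm_frame(level, cycles=256):
    """One frame of binary mirror states: a gray `level` (0.0-1.0) becomes
    a proportion of 'on' cycles. Each mirror is only ever fully on or off."""
    on = round(level * cycles)
    return [1] * on + [0] * (cycles - on)

def apparent_brightness(states):
    """The eye integrates over the frame, so perceived brightness is just
    the average of the on/off states."""
    return sum(states) / len(states)

print(apparent_brightness(pwm_frame(0.25)))
```

The faster the mirrors can cycle, the more levels fit in a frame - which is also why 3D's triple-flashing eats into the grayscale budget.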
DMDs in many sizes and aspect ratios, 2048x1080 is highest in use, is a pure digital light modulator, pure monochrome, no color capabilities
-Christie CP-2000 can do 16 ft lamberts for 82' screen
Barco DP 100
-66 ft screen
NEC 2500S, 23,000 ANSI lumens, 2000:1 sequential contrast, up to 82'
Christie DW-6K - 7000 lumens, 1280x720 res, secondary projector
Panasonic PT-DW7000, 16 bits, 2000:1, 1366x768, secondary projector
LCOS- Liquid Crystal on Semiconductor
-reflective imaging system
-light goes through liquid crystal and gets bounced back out,
-has a high fill factor (fewer lines between)
-easier to achieve pixel density than with HTPS
-light has to be polarized
-LCOS optical engine tends to be more complicated than DLP
-reflects light into RGB,
-JVC 2K D-ILA panel - they have'em up to 4K, with an 8K demo privately shown
-fill factor is over 90%, can be used in front and rear projection. 48" rear projection is worth checking out, for post production color critical apps
-Sony makes LCoS, they call it SXRD, they have 2K and 4K panels, cinema projector is a 4K, 90% fill factor again
Sony SRX-R110 is top of line 10K lumens, 2000:1, 4096x2160, only good for up to 40 foot screens (not 60 or 80), Sony is probably working
SRX-R105 is good for 25' screen
JVC DLA-QX1 - 7K lumens, 1000:1 sequential contrast, for up to 25' screens
JVC DLA-HD10KU, 600 lumens, 2500:1 sequential contrast, 2048x1080 res, for post facilities and screening rooms, good for like a 10 footish screen. (good for post houses)
Transmissive High Temp Polysilicon - is not taken very seriously, the tech is getting better. Portables were shown in 1993, come a long way since then. Have 2K res systems now for front projection, 1920x1080, always uses three panels, using a prism, dichroic filters, precisely mounted to sync up RGB paths. Can fit in a tabletop
-Seiko Epson has a 1080p LCD panel, announced earlier this year, good for front and rear projection tech; first product is the Sanyo PLV-HD2000N, 10K lumens, 1000:1 sequential contrast, valid for 25' screens, a secondary projector
-any of these can be used for DCI IF they meet specs, all are suitable for secondary projection
-biggest hurdles: intra-frame contrast ratios, replicating the DCI color gamut and white point, doing smooth greyscales with no color shading - no tints anywhere across the image - and modulator efficiency vs. lamp power. Reflective systems have an advantage since the light doesn't get filtered down through the process.
What about temporal stuff up to 48 fps? DLP is fast enough; LCoS has a lag as the crystals twist - it is a combo of sample-and-hold and a motion blur artifact. Large, low temp LCD monitors have the issue, LCoS SHOULD be OK, but he's not sure. They can all refresh at 72Hz, and if you're blanking inbetween that'll "clear the palette" (my phrase/understanding). He hasn't noticed any bad temporal or motion blur artifacts with these technologies.
Brian Claypool - senior product manager at Christie Digital, is on the SMPTE DC28 standards committee (the DCI spec committee)
-first stuff was 1.3MPixel (no longer made) (since they didn't sell the 8000 units they'd hoped for)
-CP2000 - 2048x1080, the "X" model has a split head design to put parts in different places
-encrypted dual link HD-SDI interface
-approx. 23,000 lumens with the yellow notch filter
-14 ft. lamberts on 80 ft screens roughly
-2006 are doing 100-115/month
-nearly 1000 2K projectors installed
-extensive monitoring and troubleshooting stuff in the projector as part of the Christie/AIX deployment, they can forecast problems, keep datalogging of critical component info (power draw, lamp performance, temp of fans, etc.) - helps'em predict component failures.
-bandwidth and signal processing
-illumination - Xenon bubble lamp is best cost/performance now
-consolidation and integration to lower price and increase reliability
Bandwidth - projector-embedded watermark technology - for now the server will do that work, but eventually you'll need (or want) that in the projector
-3D technologies - takes more horsepower to do it, 48 fps, triple flashed up to 144 Hz
-4K and 8K source interfaces
-advanced optical input interfaces
-advanced gallium nitride "pure white" sources aren't bright enough, maybe in 10 years
-Xenon arc light is the way for the next several years
-laser has safety and speckle problems
-now, there are tons of parts to integrate and keep it working. Be nice if there were more going on in the projector - does it need a remote server, or do it itself?
JVC is NOT presenting today, nor is Seiko/Epson
Gary Mandle of Sony - he's worked on SXRD, he's focusing on digital cinema -
sony - Next Steps in 4K - Full Systems and More Light
-shipping a 4K DCI spec projector, LCoS tech they call SXRD
-have'em in several theaters, starting to roll out
-to meet DCI spec and address security, gone a different direction.
-they've designed an enclosure to meet the FIPS 140-2 spec required in DCI.
-entire system sits inside the enclosure. ALL of it.
-one touch - projectionist hits one button to power it up and it'll run its schedule, or it can be programmed.
-eliminates link encryption, can play with other vendors and use a non-link encrypted connection to give theater owners more choice on integration
-inside the enclosure, there's a media block (Sony made), using a generic fiber channel RAID, a proxy server - it manages what is happening in the device, as you monitor for service, there is one reporting system
-a computer that drives the screen management system, loading clips, turning on/off, handles the general operations of the projector.
-below that is the security watchdog hardware - sensors on the doors track open/closed state, with storage in flash RAM that can't be erased, so even if you unplug it, it is aware of being opened up - meets FIPS 140 reporting requirements.
-QuVis is a partner - QuVis removes their RAID, media block, and SMS and puts it into the system, and it can run the QuVis stuff instead of Sony's.
-has tons of physical security stuff
-reports out of the box for remote monitoring, control for running lights, draperies, etc. Even for fire alarm.
-as FIPS requires, you can't access any equipment, if you have to run a cable into it there's a trough on the bottom
-inside, to keep it one button push, there's a power management system that goes to each piece of equipment. Internally, it powers up in the proper sequence - the operator doesn't need to get involved in any of that. A single button on top fires it up.
-SRX-R220 system coming up - need brighter projectors. Current models are 10K lumens, OK for 40' screens; for bigger screens the SRX-R220 is 18K lumens (hope for more later). SXRD device, similar optical stuff, also a single-enclosure solution; physical packaging is along the same lines as the other model.
-prototypes shown this summer, shipping thereafter
-first 4K units out - 130 units in October, and production capacity is higher than that; the 18K lumen 4K res product ships later this year, and after that they get into the 3D stuff
NEXT UP - Dan Huerta - director tech & implementation for AMC theater chain. He participated in 1999 first digital cinema projections. Member of SMPTE & NATO, blah blah, theater owner/operator viewpoint
-milestones - in trying to give the best out-of-home moviegoing experience: until about 1995, theaters had sloped-floor auditoriums and analog sound - the status quo for a long time. In '95 the changes started - the megaplex vs. the multiplex - stadium seating, more comfortable seating, better projection and sound; digital audio came into play with Jurassic Park.
The Past - sloped floor
Present - 35mm presentation and digital sound, better screens, stadium seating
Future - 2006 and forward - true start of digital cinema from a deployment perspective in the industry
in '85 - analog automation, console or pedestal, platter, audio rack (6 ch), light dimmer - cost was $
2005 - 35mm projector, CPU automation, console or pedestal, platter for film, 8 ch audio, light dimmer, cost is $$
2006 and beyond - digital projector, CPU automation, console or pedestal, SERVER, audio rack 8+ channels, light dimmer, cache server, audio D to A conv, cost is $$$
cost per screen for 2005 level tech as shown above, $70K to $100K/screen for 35mm projector, automation CPU, console or pedestal, platter, audio rack, speakers, screen & masking (8 channel audio)
current digital systems have digital 2K or 4K projector, console or pedestal, CPU based automation, Server/s, Dto A converter, audio rack 8 channel, speakers, screening and masking - cost per screen $120K to $150K. Projector and server are the only differences in the equipment. About $50K/screen higher.
will need more power in the booth for that side-by-side gear, and sufficient heat exhaust (can limp by on 400-700 CFM); with larger digital projectors, will need to upgrade exhaust systems, esp. in older buildings; conditioned space requirements will increase
-need more networking LAN for data/content transmission paths
-fiber infrastructure from cache server to screen servers (cache server, I'm guessing, is where THIS screen's contents sits, not on central server)
-pre-show systems are beefy and complex, but currently based on 100 megabit networking - gotta go to 1000 megabit to handle feature content distribution
-How to improve the experience:
-comprehensive theater/screen surveys will eliminate many integration and deployment problems
-maintain screen and masking quality to ensure sharp image borders and light uniformity - clean, plumb, level masking edges, uniform lighting is needed, but if the screen isn't maintained/cleaned will not be as good
-adjust existing moveable masking to meet SMPTE masking stuff. Currently stuck based on lens design because they can't fix it. With digital zoom capabilities, will be able to get precise image sizes
-ensure audio systems receive equal attention during routine service and QA audits
-cross train existing tech staff on digital networks and projection/server hardware to ensure consistent content/product delivery - no dark screens! Current staff in the field aren't up to snuff to support all this new gear
-carefully review all Service Level agreements from Dig. Cinema manufacturers or service organizations - need to make sure it is all properly maintained, and make sure they know what's going on
-happy to be seeing deployments and progress as it is going today.
3D rollout - unlikely the Sony will be retrofittable, or it would be prohibitively expensive. Going to work with In-Three or others - not a Sony product (nor Real D, their competitor). Since these use a polarized system, have to be careful - have to be aware of linear vs. circular technology, and they're getting mixed answers on contrast between left eye/right eye. Which one's better? What would the client want? Then there are the issues of cost and making it work. They are waiting and seeing how the theaters react; they want to get the 18K out the door first to see how it all works.
-My question - are costs holding back deployment? AMC is still in fact finding mode and has joined up with others to form National Cine Media (NCM) that is evaluating all of the possible solutions. Long answer saying they are cautiously proceeding.
END OF SESSION
Fithian, head of NATO (National Association of Theater Owners), invented the phrase "ODS", has done tons of work in the field, etc.
-the size of this meeting is a good sign that this thing is working
NATO is trade group for motion picture theaters, represent top ten chains, hundreds of indies, members in 40 countries and all 50 states, coordinate with other groups in 25 other countries
-the letter he sent several years ago asked whether digital cinema was a workable thing.
-those in motion picture venue industry said they'd need interoperability standards, since then NATO has issued several other documents
-Nov 2004: adopted a resolution w/priorities for tech and rollout
-NATOonline.org has 'em all up there
-tech group put out tech requirements doc that fills in some gaps in the DCI implementation specs
-ODS = Other Digital Stuff (odious)
-he's going to give the broad overview
-Digital Cinema is biggest change since the advent of sound
-have been using same basic equipment for about 100 years
-given that broad of an industry to make that big of a change is a very complicated process
-two different tracks in new tech in cinemas: high end digital cinema projection, the other is everything else that happens electronically - pre-show, alternative content, etc.
-these two tracks are merging now
-4 different categories of players:
1.) Movie studios that make the product that is the driving force of the business
2.) movie theater operators
3.) techology companies that make this equipment
4.) other content generators that can be shown on big screen, not necessarily the studios
Understanding their needs paints the picture
1.) Movie studios want a better-looking product - a portion of the money goes to theater owners, and part to the studios. It's in everyone's best interest to have better results on screen.
Studios want lower production costs - making prints and shipping prints in cans is very expensive. In a fully digital model, 80-90% of those costs can be saved.
Studios could save nearly $1B/yr to make and ship film prints if they can pull it off
2.) Theater owners get a better product on screen - no reason to do it without that. Flexibility of programming is a real benefit: they have trouble filling Tues/Wed/Thurs, not Fri/Sat night. Being able to program differently - sports, rock concerts, church groups, whatever - up on the big screen. Bill Gates likes to hire cinemas to address all his employees. The potential for digital cinema is large. Film is great, but it is limiting - great image, not very facile, tough to mix up the programming schedule. RISKS - COSTS!!!! Once the transition is made, this is the technology for decades to come, so theater owners need to think about costs over the long term.
3.) Integration, upgrades, life expectancy, etc. Film projectors are STABLE tech that lasts decades - replace bulbs and a light assembly every once in a while. Theater owners are concerned about how often they'll have to change out their digital cinema equipment. Technology is moving fast - projector, server, satellite companies, etc. are all moving fast, and their goal was to start selling, and start selling fast. NATO wanted to make sure the equipment worked right before theaters started buying it.
4.) other folks making content - within the movie sector, indies! Gotta get money for the movie, distro, and marketing costs. Print costs are a major hurdle. In the digital era, you can shoot a movie, do the indie thing, and it can be put up on the screen. It'll become a much more democratic platform (UNLESS THERE ARE DIGITAL PRINT FEES!).
Nothing will ever replace major motion pictures as source of revenue. They just want to fill up the rest of the week.
2006 is the big year for digital cinema - actually happening, as opposed to about to. 4 reasons:
1.) The tech spec work has moved to the point that there are interoperable products with high standards. The digital sound transition did NOT go smoothly - the 3-4 standards that came out didn't work together. DCI was open to their comments, even after fighting for a couple of years. That groundwork made it all possible, and the supplemental specs they came out with will help it all work.
Transition is starting now.
2.) The quality levels "are there". A few years ago, digital wasn't as good as or better than film. Today, by all the ways of judging quality, digital makes a better experience - which lets you sell more tickets and/or charge more (otherwise why bother?). Film prints are so fragile that by week 3 the digital looks as good as the print ever did. Digital projection looks as good on the 100th screening as the first.
3.) The biz models are finally there. In the long run digital is more than twice as expensive as a film projector, and it wasn't possible for theater owners to make that work - couldn't sell enough tickets or charge enough. There are now several 3rd parties (AIX, Technicolor) - the 3rd party negotiates with studios individually, and deals with theaters individually to install the gear. Deals go both ways for the 3rd parties. Virtual Print Fees are the studios paying out the savings from a film print - the $$ covers the cost of what the studios are de facto paying for anyway. The theater agrees to contribute in some way to get digital cinema into their theaters.
4.) The reason why now is that the theater bidness needs it - had a bad 2005 at the box office. Mostly, the movies weren't as good as in previous years, but there are also substantial home-environment options - Internet, game platforms, HDTV, iPods, etc. A multiplicity of other entertainment options is out there, and theaters have to evolve to keep up. "Digital is way cool for that generation" of kids/people that go to movies and are into all the tech. This year, 2006 is up 5% or so, and May is expected to be the best May EVER. Digital cinema will help them keep up that growth.
HOW FAST WILL THIS HAPPEN?
a little over 400 digital cinema screens in the US today; he predicts between 1200 and 1500 by end of 2006, and 2007 will be a big takeoff year. 150,000 screens around the world, around 30,000 in the US now. Adoption will look like a bell curve. How long will it take? Nobody knows. He's guessing domestically the vast majority will be digital in about 10 years. When Europe and overseas adopt, there won't be film anymore.
couple things have to happen first - need beta market testing data this year. Some 3rd parties have announced they'll be doing that testing, and that's good. But we need real-world experience. There have been little beta screenings for years, but DCI-spec, integrated facilities have never been installed in the field. Need a coupla hundred screens running and getting beat up for 6 months. Are time stamps right? Are compression streams working? Did the keys arrive, since they were sent separately?
Worst thing would be that they fail and there's a dark screen up in front of audience. They need a beautiful transition. Right now, it is beautiful chaos.
This year is beta testing. If those tests are successful, next year it'll take off.
Will the conversion be universal? Will small town theaters survive? Big concern for NATO. Did NOT get assurance out of their stuff with a single third party. Competing 3rd parties to do a certain # of screens. Smaller theaters are combining to get purchasing and marketing clout, a buying cooperative, 4000 screens are ganging up for this.
What about the international space? In the US, these models are relatively simple financially, since most product is from US studios. In Europe, 65% of what's showing is US product.
Here, it is easy to do VPF (virtual print fees) since US studios make movies for US-based theaters. More complicated overseas, where that isn't so close to 1:1.
John Fithian's opinion: lots of debate about 2K/4K issues. This debate is overblown - resolution is important but not the only component; contrast, color, light, etc. are all crucial too. Don't turn the 2K/4K divide into a marketing issue. He wants to market that digital is better than film and better than the home experience; getting into a "mine's bigger than yours w/4K" fight is not helpful. You've gotta be in the front 5 or 6 rows to tell the difference, even if you're one of the "golden eyeballs" sitting in this group here today. Beyond 4K, nobody can tell the difference. We need to tell 'em digital cinema is better than film, better than home. He's psyched about 3D as an experience, though it's troubling from a cost perspective. It'll be in enough screens (NOT ALL!) to make it worth it for folks like Jim Cameron - 3D screens grossed 2 to 3 times as much as the 2D screens. 3D is really hard to do on film; digital makes it much easier. It is a very nice value-add, but not the driver for digital. Cost is a significant issue. Cameron will be keynoting tomorrow evening.
In short, 2K/4K is NOT helpful for the industry in his opinion.
NEXT UP: Equipping Post Facilities for D-Cinema, Wendy Aylsworth, Chair - DC-28 (the DCI spec group)
Post production for digital cinema
Wendy Aylsworth has degrees in comp sci and management planning; she used to do animation tech and theme park stuff, does a lot of standards work, has chaired DC28 for the last few years, head of something at Warner or Disney
What processes should I (a post facility) do?
what all should they do?
-should I always do the same master each time?
-how will square pixels affect my scanning and processing? (from usual 2K/4K work)
-possible to do work at 2K level when master is 4K?
-what tools are available?
What knowledge does my staff need?
...for interacting with other facilities?
....for doing QA?
Terry Brown - Technicolor Content Services
Jim Whittlesey - eFilm/Deluxe Digital Media - packaging expert
Dave Schnuelle - Dolby Labs - talk about QC and 3D proceses
Stuart Monksfield - Thomson/Grass Valley - talk about post workflow process (I should get his card!) talk about tools and changes
Terry Brown - Technicolor, talking about worklow and post production for digital cinema
talk about areas of concern, some things are easier, some things are troubling issues that are unresolved
(see picture 2)
-working on a model of scanning 4K, using a high-res proxy (2K or HD pixel res)
-(classic Mac/PC issue here - slide not right, made on Mac, played on PC)
can use the high res proxy for making masters for other deliverables
-10 bit log workflow
-how to archive all this? 16TB for 2 hour project (originals and final)
-studios want color seps
-9TB for DCDM - 16 bit file format
-what archived, how archived, how validate that the archive is there?
-gotta lotta storage - if one project, 20TB storage needed. For a big facility, how fast can we get the bits out of the big bucket. Need efficient archiving solutions - will it be something other than tape (hopefully so, w/better shelf life)
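Those storage figures pencil out, by the way - here's a quick sanity check. The frame geometry is my assumption (4096x2160, 3 channels at 16 bits, 24 fps, 2-hour feature), not from the talk:

```python
# Back-of-envelope check on the DCDM storage figure above.
# Assumptions are mine, not from the talk: 4K DCDM frames at
# 4096 x 2160, 3 color channels, 16 bits/channel, 24 fps, 2 hours.

def dcdm_terabytes(width=4096, height=2160, channels=3,
                   bytes_per_sample=2, fps=24, hours=2):
    frames = fps * 3600 * hours                            # 172,800 frames
    total = width * height * channels * bytes_per_sample * frames
    return total / 1e12                                    # decimal TB

print(round(dcdm_terabytes(), 1))  # ~9.2 TB - right at the quoted 9TB
```

Add the originals and intermediate versions on top and you're quickly into the ~16-20TB-per-project range mentioned above.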
-talk about how DCDM is made
-at end of DI process, have a 10 bit log master. gotta pass it through a transform to get a 12 bit, linearized signal, convert that to XYZ color space, gamma 2.6, take it to 2048 or 4096 and wrap in MXF container.
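That transform chain can be sketched in a few lines. This is a hedged illustration, not Technicolor's actual math: the log decode below is a simplified Cineon-style curve and the RGB-to-XYZ matrix is the Rec.709 one - both are stand-ins for whatever transforms the facility really uses:

```python
import numpy as np

# Sketch of the pipeline described above: 10-bit log code values ->
# linear light -> XYZ -> gamma 2.6 12-bit code values for the DCDM.

REC709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])  # stand-in matrix

def log10bit_to_linear(code, ref_white=685, density_per_cv=0.002):
    """Simplified Cineon-style decode: 10-bit log code -> linear light."""
    return 10.0 ** ((code - ref_white) * density_per_cv)

def dcdm_encode(rgb_linear):
    """Linear RGB -> XYZ -> 12-bit code values with gamma 2.6."""
    xyz = rgb_linear @ REC709_TO_XYZ.T
    xyz = np.clip(xyz, 0.0, 1.0)
    return np.round(4095 * xyz ** (1.0 / 2.6)).astype(int)

pixel = np.array([[685, 685, 685]])   # 10-bit log reference white
linear = log10bit_to_linear(pixel)    # decodes to 1.0, 1.0, 1.0
print(dcdm_encode(linear))
```

The resizing to 2048/4096 and the MXF wrap then happen downstream of this per-pixel math.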
LUT Master 2000 (does everything but resizing and MXF)
the resizing issue - one of the founding fathers of the DCI spec was about to do his first DCI delivery package and realized he had a problem - the scan was done at 2048, and the projection aperture is slightly smaller, so you've gotta resize it, and we don't want to do that.
W/in Technicolor, they talked about a possible solution:
(Pic 3 here)
Super35 camera aperture is .98"
Super35 projection aperture is .945"
various things just didn't line up, which is a problem since we don't want to resize
-Back In The Day, when figuring out how to scan film for VFX, they gave the scanner numbers to the machinists. They asked for 2K and 4K res, picked round base-2 numbers (2048 and 4096), and handed those off - but the machinist worked in metric, not inches, converted, and came up with two round pitches for the scanners: 12 or 6 micrometers. In use, you get an active width of .968" instead of .98" - another mismatch from standards conversion. Applying that to digital cinema, the scanner works out to 2117 samples per inch.
.945 * 2117 equals about 2000 pixels. If 2048 is the projection standard, you can crop 24 pixels off each side so that no resizing has to happen.
For anamorphic, scan at 4K does a slight reduction, taking anamorphic sizing into Super35 space. Everything we do ends up will fit into 2000 pixels of projection aperture. A uniform system that there will be a projection aperture for 2000 pixels wide.
(and no, I didn't follow all that either)
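For those trying to follow along at home, here's my reconstruction of that arithmetic - the 12 micron pitch and the aperture widths are from the slides, the rest is me walking it through:

```python
# Reconstructing the scanner-pitch arithmetic from Terry Brown's talk.

cam_aperture_in = 0.980    # Super35 camera aperture width (inches)
proj_aperture_in = 0.945   # Super35 projection aperture width (inches)

# 2048 samples across 0.980" would need a ~12.15 um pitch; the scanner
# was actually machined at a round metric 12 um pitch instead.
pitch_um = 12.0
scan_width_in = 2048 * pitch_um / 25400   # 25400 um per inch -> ~0.968"
samples_per_inch = 2048 / scan_width_in   # -> ~2117 samples/inch

proj_pixels = proj_aperture_in * samples_per_inch   # -> ~2000 pixels
crop_per_side = (2048 - round(proj_pixels)) / 2     # -> 24 pixels/side

print(round(scan_width_in, 3), round(proj_pixels), crop_per_side)
```

So the .968-vs-.98 mismatch, the 2117 samples/inch, and the 24-pixel crop all fall out of that one rounded pitch.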
What about 1920x1080???? (probably just pad it out, I guess)
Jim Whittlesey from deluxe - for making a digital cinema package
DSM - digital source master - gets converted to the DCDM
DI master goes to DCDM TIFF files, then JPEG 2000 compression, then MXF packaging
Equipping a facility for D-Cinema:
JPEG 2K compression:
DCDM image to .j2c
need 2K and 4K support (deluxe has done 2 4K projects, starting a third)
need SPEED for JP2K compression.
-the 4K solution runs about 2 frames/sec
-laid out a schedule for 10 TB of data and 8 reels of film, then suddenly had 16 reels and 20TB, since they were going to burn in the text and not use the subtitle features
-of course, worry about encode time vs. image quality
-variable bitrate that can vary the image quality for the size of the file you want to generate
-afterwards, wrap'em into MXF track files and encrypt them
-both of the packages they are looking at will do it too, third party tools are coming along to do it, and of course all has to be DCI spec compliant
-when encrypt files, have to make key files that are stored securely, and transport them securely. Using Key Delivery Message as a way to do it
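A quick note on what that ~2 frames/sec J2K throughput means in practice - the 2-hour, 24 fps feature here is my assumption for the arithmetic:

```python
# What ~2 fps of 4K JPEG 2000 encoding means for a feature-length job.

frames = 24 * 3600 * 2          # 2-hour feature at 24 fps = 172,800 frames
encode_fps = 2.0                # quoted 4K J2K throughput
hours = frames / encode_fps / 3600
print(frames, hours)            # a full day of encoding on a single box
```

No wonder the advice later on is that JPEG 2000 becomes a Big Render Farm task.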
digital cinema audio -
wave file, one channel per file
192 frame leader w/ 2 pop (8 seconds)
over HALF the audio files they get are wrong on arrival
-gotta have a workstation to verify the incoming files, since these specs are outside what is normally being done out there in the world
-need a D-Cinema theater for QC
-need a screen height of at least 10 feet to see the artifacts
-need a projector
-need one or more servers to test on different vendors
-need good audio B-chain is OFTEN OVERLOOKED
it is more than a theater....gotta be able to QC against original DCDM or for files used for filmout
-playback in realtime the uncompressed source (both the DSM and DCDM)
-what to do for 4K? there's no realtime playback!
-a client was concerned about noise, had to go back and check against source
-need to QC against a check print or release print
-the studios thought something was wrong with the audio, and it turned out the complaint was in the check print - so you need a film projector as well!
-EDL - need reel starts and in/out points, need the 192 frame header, 88 frames at the end for leader and trailer. They'd always hand over a little spreadsheet.
-since done 2 others from other post houses, and never gave the info on headers and tails, slows things down
-tech requirements to adhere to:
1.) the DCI spec, and the soon to be released Fraunhofer Test Plan/Procedures
2) ALL relevant SMPTE specs
interpretation of these specs is variable - so you need to test the packages, test the KDMs, and figure out those which are interoperable and those that are not.
-need to be gentlemanly about it when there are problems rather than yelling at each other. Don't be trying to establish a competitive advantage
-Digital Cinema mastering is data-centric - need a very fast network, very fast storage, and large storage systems; JPEG 2000 encoding will probably become a Big Render Farm task
-don't buy anything without a lengthy real world evaluation, it's all bleeding edge beta type stuff. hardware and software.
-be prepared to develop in house tools to supplement it all
Dave Schnuelle, Director, Image Technology, Dolby, talking about 3D
stereoscopic/3D in post
-concerns - low light levels, polarizer green shift
-3D exhibition systems use some kind of polarized system that filters light down; since each eye is only seeing one image at a time, you get 1/2 the light level. There's a big shift in gamma - you need to raise the lower levels for a 3D presentation to look acceptable compared to a similar 2D projection. 3-5 foot-lamberts were showing on screen during Chicken Little. Your brain works it out more or less OK - it's more acceptable than anyone thought it would be. In color correction, there is a green shift, so you've gotta remove some of the green (picture color correction operators wearing Chicken Little glasses). Crosstalk between eyes - with polarizers it is never perfect; with a silver screen and passive glasses it will always happen. RealD has a process they call "ghost busting" that helps. With active glasses you don't need to do that. Ghostbusting makes the master not suitable for 2D projection, FYI.
-double flash or triple flash - typically a pattern of left-right-left-right, so each eye gets two exposures. For an animated movie with no motion blur in frame, the 3D sites ran triple flash - 144 Hz. When watching dailies on a white (non-silver) screen you can't use RealD, and active glasses can't run triple flash, so you're looking at it in double flash mode. Active glasses are sync'd by an emitter behind the screen, and they don't go fast enough for triple flash. If it was shot on film, you have in-camera motion blur, so double flash is more tolerable. If you want to screen in triple flash, you've gotta use the Z-Screen.
the issues are related to QC and the display process
-can't use screening rooms for every step in the post process - there are lots of steps to be done before you tie up union operators, etc. CRTs are going away and aren't acceptable for this kind of thing, so they've been experimenting with other kinds of displays. His own experience: using an Apple 1920x1080 display with an eCinema Systems box - pixel-for-pixel master, and the box runs at 24Hz. The DCM 23 is a better product from the same guys; Dolby has been using it for QC. They can see 1 or 2 bits more on it than on a CRT, depending on content - much higher MTF, and a much better look at detail. When they first started doing compression work, they used a 24p HD-SDI plasma for screening. The first time he sat down to QC a real project on it, he started writing down artifacts - then they'd look in a screening room and it was OK. Plasma displays, in their dither or whatever, generate artifacts that look like compression artifacts. A plasma is more of a "presence indicator" than an actual QC device - good for seeing if your tape broke up or your server stopped working. Bad for QC!!!! Conventional telecine monitors are likewise WRONG for theater work - sharpness, color, etc. don't translate.
In 1999 they built a theater with a 10x24 foot screen. (THX standards for mastering room). Audio system is REALLY IMPORTANT for QC!
Film print to cinema is close, but film is the threshold that you have to be able to compare. gotta have film side by side in order to do the work these days.
question of white point - there is a standard now, but more research is needed on the subject, and we need some measurement tools.
(Pic 5) runs from $7,000 out to $30,000 for light measurement gear. None measure black levels - gotta use the LS-100 in the bottom row. The CL-200 is used for x & y color for film and is a good device for a broadband light source, but it'll give WRONG readings with digital projectors
Stuart Monksfield of Grass Valley
post production workflow before
dailies - audio ingest (storage - 10 bit log)
DAT audio or DVD - audio ingest, isolate takes
-get the files in, break up by tapes, put some log & metadata in to save time later
-audio shows up 6 or 8 hours before
-film shows up, ingest images, no color grade, full range, run it as fast as possible - transfer at 30fps to either 10 bit log HD or 2K 10 bit log, with keycode coming in as metadata
-while that data is coming in, take the audio and image, viewing through a print LUT simulator, can apply the dailies grade using an ASC CDL, 3D LUT print emulation, check sync, skip ahead.
So is done digitally, syncing audio on a SAN while capturing images
-after a few takes have been taken, a separate system can start playing them out through an emulation LUT. Media files - DNxHD, QT, whatever
Production metadata - FTL, ALE, etc.
months later, when doing the second scans: get the scan select list, put the film up a second time, scan a 2K or 4K version depending on requirements - speed takes a hit, maxing out around 7.5 fps for 10 bit RGB
-storage in the dailies process may or may not be the same, might be going to an LTO archive
-once images are in storage from the second scan, create proxies as a background task from the 4K stuff at 2K or HD res (still 10 bit log DPX files)
-do a preview conform (popular option), viewable at HD res
-do a preview grading on a DI grading solution, pulling in that ASC CDL that was created earlier during dailies, do a DI grade and loading up proxy versions, use a LUTher or other 3D LUT and lay it down to HD tape or a server to screen to the audience
days/weeks later, the decisions are made for final stuff, call up EDL from previous stuff, going back to 4K data, making any new versions as needed, rendering those out as needed, final DI grade will work using either HD or 2K res, once grade is set it pulls in the 4K res, applies same corrections, and renders that final high res to storage.
Can do a frame-by-frame QC process - fix dust, scratches, etc. While that slow process runs, they can take it into the system as a 4K master to do scale and crop, do RGB to XYZ space, gamma 2.6 correction, burn in a print emulation LUT (since they want the look to match the film), then send it to the DCDM mastering house.
dumped out as #'d TIFFs to a FireWire drive
1.) scanning once only, at 2K or 4K, is not practical for the whole workflow - given the time to turn around dailies, it doesn't work in terms of cost and data sizes (7.5 fps vs 30fps)
2.) exchanging the ASC CDL from dailies to the preview conform speeds up creating the final look (and the CDL can be made on set with a Viper or Genesis)
3.) final grading decisions can only be made using the final scans. The HD-res proxies are made from the scan selects, not the 2nd scans; gotta keep it an exact match
4.) storing and working with raw log images avoids costly mistakes in post and VFX. Keeping it log allows carrying metadata and 3D LUTs
5.) color management and previs with a 3D LUT is critical for matching the film look on each deliverable
6.) perf-to-perf 2K/4K scans will always need some amount of scaling when creating the image DCDM (projection is full aperture 2048 and 4096, which means a good chunk of slight scaling that takes a lot of time)
All of this is just one part of making the DCDMs - this doesn't include audio, multiple audio languages, text, all the other stuff
So who can do all this? Who's going to develop these products? A new breed of digital cinema post production houses? Post for digital cinema is not the same as traditional film post production
DPX vs TIFF? all are supported (OpenEXR, TIFF, Cineon, etc.)
Why can't you scan once? They can't sit on the film for 8 months until production comes back (I guess they dump the data in the meantime?) - too much to hold all that data. Plus, they can't scan fast enough - even on a Spirit 4K, even if the pipe were fast enough, you'd burn up the film. There's also a signal-to-noise problem if they tried to run it through the system that fast.
grain is sharper when 4K scanning. 2K scanning is often sufficient - scanning at 4K and dropping to 2K on the scan (oversampling)
So, in my usual Indie-Centric mindview, dailies are tough. Telecine dailies, do real scans later? gotta think about all this before I can vouch for it as solid - gotta think about viable film centric workflows (or just shoot on Viper/D20/Red!)
LUNCH BREAK - I went to go fetch my badge from Red, and saw the booth: as you enter South Hall Upper, walk down the left edge of the show (starting at the Sony booth), and after you pass the escalators going down (with the big blue banner overhead), start looking to your right for a big Red logo hanging from the ceiling (a big red button with a silver surround) over a red and white tent. Things were getting set up. It's going to be busy, it's going to be cool, and it's going to be really, really crowded in that booth, I predict.
AFTER LUNCH -
tomorrow morning, 8:15am breakfast, 9am stuff starts
-we've got 3 pairs of glasses for various 3D display technologies
-digital is a highly enabling technology for 3D
-presos from two organizations (with a 15-minute break to switch screens - the two technologies use two different screen technologies)
-Matt Cowan, Chief Scientific Officer, RealD
-Stereoscopic Vision Cues:
-Conflict of Cues
-Demo using single projector, and passive glasses
How we decode depth:
-monocular cues (to tell it's in front or behind)
-stereo cues (using our two eyes)
-physiological cues (?)
Light and shade - gives us cues in depth
-relative size cues - all soccer balls are same size, for instance - bigger must be closer.
-interposition cues - things that are on top or in front in the image, the stack depth and occlusion gives us a clue as to where something should be
textural gradient - in a field of wheat, big stuff is closer and small stuff is further - as the texture gets finer, that must be further away
perspective - depth cues from perspective - lines that converge, etc.
-shading cues - things far away are hazy in the distance
these were all monocular cues - you could close one eye and they still work
Stereo Cues Parallax
-eyes toe in or out
-brain references angle of eyes
positions interpreted relative to angle
-limits on amount of toe for comfort
-a dominant cue
our brains measure the angle we toe our eyes in to guess at distance - the closer something is, the more our eyes point towards each other.
over the age of six you can't focus on your finger an inch from your eyeballs
-stereoscopic is a DOMINANT cue, and a strong driver for us to see 3D
-the eye knows where it is focused by the contraction of the focus muscles
-cue stronger in younger people
-cue non existent in those of us who wear bifocals
when all cues line up, stereo is excellent - our 3D perception is 1st class
-when cues conflict, stereo breaks down
-not all cues have to be "perfect" but it helps - if most are in place, can have quite acceptable stereo
Size Cue - things at bottom at screen are usually closer
Breaking the frame:
-projection screen represents window
-window truncates objects - normal for objects behind window, objects inside window must stay inside frame
-action in front of screen that breaks frame degrades stereo image
-the screen itself is felt to be the window we're looking into.
PICTURE - two-Viper shot - can't break the edge as this does; next picture shows the two cars again - can't bring it into the room, since it's gotta live inside the frame, inside the window
Simultaneous motion - left and right eyes want motion to appear at the same time; sequential frame 3D naturally offsets motion timing, and parallax changes frame to frame
-double or triple flash required to improve motion - more flashes = better simultaneity = better parallax
Double flash is 96 fps - L1, R1, L1, R1, L2, R2, L2, R2, etc.
Triple Flash is 144 fps - L1, R1, L1, R1, L1, R1, then L2, R2, L2, etc.
flicker fusion rate - around 120 fps - above that you get smoother motion, and the brain integrates it into one thing
timing offset - the time between left and right eye image display. At 120 fps or more, left and right eye frames are perceived as occurring simultaneously; below that there is confusion as to where the object is in space, because the left eye thinks it is happening at one instant and the right eye at another.
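The arithmetic behind those rates is simple enough to sketch (my own illustration; the function name is made up):

```python
def display_rate(base_fps=24, eyes=2, flashes=2):
    """Projector refresh rate for sequential-frame 3D: every source
    frame is shown to each eye, repeated 'flashes' times."""
    return base_fps * eyes * flashes

print(display_rate(flashes=2))  # 96  - double flash
print(display_rate(flashes=3))  # 144 - triple flash
```

Triple flash clears the ~120 fps flicker fusion threshold, which is why it reads as smoother.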
Ghosting - the amount of light that leaks into the wrong eye
-caused by imperfections in the system
-gives confusion in visual system - it isn't sure where that part of the image belongs
-can be fixed somewhat by pre-processing to equalize the leakage
-the ghost is the remnant from one eye going into the other eye
-the way to get rid of it is to subtract some of the ghost from the image in advance
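That subtraction can be sketched as a toy model (made-up leakage fraction, not RealD's actual pre-processing):

```python
def deghost(left, right, leak=0.05):
    """Pre-subtract the predicted ghost (the fraction of the other
    eye's image that will leak through) from each eye's image,
    clamped so pixel values stay non-negative."""
    new_left = [max(0.0, l - leak * r) for l, r in zip(left, right)]
    new_right = [max(0.0, r - leak * l) for l, r in zip(left, right)]
    return new_left, new_right

# One pixel: a bright object present only in the right eye.
L, R = deghost([0.1], [0.9])
print(L, R)  # the left-eye value is reduced to offset the leakage
```

When projection then leaks roughly that fraction of each eye into the other, the pre-subtracted amount approximately cancels the ghost.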
Symmetry - image brightness and color - equal brightness and color in each eye is what you expect. Some systems introduce a color tint into one eye, and that's confusing (such as old-style red/blue anaglyph 3D)
Parallax and screen size - image at screen plane is converged; image at infinity appears with a 2 1/2" offset
-offset in the master means that a 2 1/2" offset for a 30' screen becomes a 5" offset on a 60' screen (2 1/2" is the distance between your eyes - more for supermodels : ) )
-need to adjust parallax offset according to screen size - so that you could ideally adjust that on a per screen basis, AT THE THEATER
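That scaling is linear in screen width - a quick sketch using the numbers above (the 30' master size is taken from the example):

```python
EYE_SEPARATION_IN = 2.5  # on-screen offset wanted for objects at infinity

def infinity_offset(screen_ft, master_screen_ft=30.0):
    """Offset (inches) you actually get if a master converged for a
    30-foot screen is shown unchanged on a screen_ft-wide screen."""
    return EYE_SEPARATION_IN * (screen_ft / master_screen_ft)

print(infinity_offset(30))  # 2.5 - correct
print(infinity_offset(60))  # 5.0 - double the eye spacing, so eyes must diverge
```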
Vertical parallax - eyes not designed to accommodate vertical parallax - if there's any vertical disparity between left and right eyes, your brain doesn't know how to deal with it
-there's a demo from Polar Express - we'll be watching a RealD Z Screen - there's a rectangular screen in front of the projector that alternates left circular to right circular polarization
-MDI next generation screen we're projecting on
-RealD eyewear to watch it
-this'll run through a QuVis server with a ghosting mitigation box plugged in also
-you can tilt your head and it will still "work" unlike some other systems
watched several minutes of RealD-converted Polar Express - that was good - sharp and clear, and the 3D worked quite well, except for very fast motion across the screen (in terms of crossing a large percentage of the screen over a frame or two)
-anaglyph has been used for finishing Imax stuff for stereoscopic proofing of the images
www.ray3dzone.com is the presenter's website for this portion
-audiences love 3D when it's done properly
-used to require two projectors to do some kinds of 3D
-the point is to get different pictures to each eyeball.
-if you want something to look infinitely far away, need it to be 2 1/2" apart on screen.
-if you want it on the screen, there is zero split
-if you want the image halfway between screen and viewer, you need L & R 2 1/2" apart, but with left and right swapped relative to the infinity case (crossed rather than uncrossed)
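Those three cases fall out of one similar-triangles formula - a sketch (my own function and variable names):

```python
import math

def screen_parallax(eye_sep_in, viewer_ft, object_ft):
    """On-screen left/right separation (inches) that places an object
    object_ft from a viewer sitting viewer_ft from the screen.
    Positive = uncrossed (behind screen), negative = crossed (in front)."""
    if math.isinf(object_ft):
        return eye_sep_in  # infinitely far: full eye separation
    return eye_sep_in * (object_ft - viewer_ft) / object_ft

print(screen_parallax(2.5, 20, math.inf))  # 2.5  - at infinity
print(screen_parallax(2.5, 20, 20))        # 0.0  - at the screen plane
print(screen_parallax(2.5, 20, 10))        # -2.5 - halfway out: swapped
```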
-3D filmmakers want to be able to capture images with everything built such that you have a 2 1/2" interocular (between-eyeballs) distance.
-in CG, it is no problem to do - just need to know scale
-stereo base, interpupillary distance, and interocular are all the same thing
-in the past, it was impossible to put 'em side by side since the cameras were so bulky. Used to use a 45 degree mirror between two cameras aimed at each other.
-Ramsdell configuration was two cameras at a 90 degree angle, one camera shooting through a half silvered mirror, the other shooting straight through it.
-underwater scenes in Creature from the Black Lagoon shot this way
-the lens-to-lens rig (first model shown) would allow you to control convergence - so not parallel shooting, but you could set up convergence. If you don't limit the background, the cues may make your eyeballs want to diverge, not converge (and that hurts your eyes).
Today, digital capture techniques - Rodriguez used the Sony HD 24p model for Shark Boy and Lava Girl and Spy Kids 3D, also for Aliens from the Abyss. Fujinon lenses used (smaller than Sony lenses), got 2.75" interocular.
-two JVC HD-10s (ugh! Sucky cameras!) used for an HDV based film, interocular of 3.75 inches. The fix if interocular is too big: use longer focal length lenses, and some kind of formula to keep from getting too close to your subject - if too close you get the Pinocchio thing, where people's faces look too long and stretched.
-they shot 720p at 30p, HDV format (which I know all about); he transcoded using Cineform
-top down L configuration - Night of the Living Dead 3D using two Sony cameras, can get interocular down to zero - keeping interocular changing helps keep it fresh. (ummm...OK)
Dynamic Interocular - Cobalt Entertainment used two F950's in an L configuration - while rolling, interocular can be changed in sports footage.
-this rig can do 6 inch to 0 interocular in 1.4 seconds. It is Steadicam-able, but boy, it looks awfully big and heavy!
1.) Hyperstereo (increased interocular) - if you overdo it (20 or 30 foot interocular), it makes for striking 3D pop, but it makes the scene look like a miniature because the cues are so odd
2.) Hypostereo (reduced interocular) - 0.25", makes things look and feel bigger
3.) Spatial ambiguity -
Digital Advances - CGI is inherently volumetric, so very capable of doing neat stuff
Cyberworld 3D - was 3D rendered; stereoscopic was done 100 ft to 0 - when doing 3D, very easy to change interocular
-Polar Express 3D
Stereo Conversion - DI=CG - there are all kinds of possibilities for manipulating stuff shot in 2D
-3D metadata - Rodriguez has done cool stuff with it to do live action filming with the RCS camera with green/blue screen; that metadata is sent to the CG studio for building conformed stereoscopic space for composites
NEXT UP -
the other model of 3D projection - runs at 96 fps per the DCI spec
the glasses are active - battery powered, with onboard brains. Glasses can do 200 Hz. The limits are in the projectors, not the glasses.
-InThree is using a straightforward white screen. Both use d-cinema projectors; the screen is a matte white screen. They have two infra-red emitters that send a sync signal that bounces off the screen. Disadvantage is cost - cost is 1/10th of anything on the market. The purpose of the glasses is to separate the left from the right eye: the left lens is clear during left eye display, the right during right eye display, and the "wrong" eye is shuttered at the right time. Theater owners do have to learn a new skill - managing the glasses
-we'll see some of the original Star Wars dimensionalized.
-the left eye is EXACTLY the same as the source material. The right eye view has been dimensionalized.
-exhibitors were not excited about digital, since they couldn't charge more, and there wasn't anything in it for them.
This kind of dimensionalization, and 3D in general, makes digital appealing. The economic model works for digital - you can't do 3D more efficiently in analog. D-Cinema lets 'em do it for 1/10th the cost. More gets pulled in over time.
-theatrical exhibition will let theater owners make more money
-watched a few minutes of dimensionalized Star Wars Episode 4 - WOW - it works! digitally projected from film source, it LOOKS GREAT. Only snag - at 96 Hz double flashed per eye, fast motion is confusing - maybe triple flash rather than double would help? Nothing is keeping that from working here - or maybe it is my hyper critical eye - won't the X-Box generation have the same kind of critical eye?
-dimensionalization - the way to do it is to provide a library of recognizable material and convert it over, as opposed to non-choice material. Wanna see some little indie, or choice material like Star Wars, The Matrix, etc.?
-It is a seriously non-real time process - takes a lot of help to do it. George Lucas and Rick McCallum helped 'em put this together (obviously) - when George first came in to see dimensionalized material, he said "I'm sold, I'm sold, I'm sold" over and over again.
-always in the vapor trails of all the other failed attempts. It is an uphill battle - they'd seen all the previous attempts to do it. They've done hours of dimensionalized material. It's not a gimmick: 3D doesn't have to be 2 inches in front of your face to get it. It needs to be a realistic, pleasing experience, the next step in motion picture imaging (and maybe for home video some day). So they studied up on good vs. bad 3D. Bad 3D to them is hard to look at for very long, causing eyestrain or visual confusion.
-wanted to educate themselves on what works, so they got their own 3D shooting rig to understand it. They take something totally flat and do a "depth restoration process" - you don't want cutouts, you don't want other problems. They analyzed 3D captures; to do that you have to understand the images and what causes eyestrain, stress, and confusion. Now that there's an outlet, there are quantifiable reasons eye fatigue is caused - just gotta know what they all are.
-anything that deviates away from what normal vision would do will cause confusion, and confusion can cause fatigue can cause a headache
-wanted everything they produce to be free of eyestrain
-with non-choice content it was very hard to watch - he calls that bad 3D. Not that it didn't have a neat effect, just that you couldn't watch it.
-how do you make it through a 90-180 minute movie if eyestrain sets in at 15-60 minutes?
-this is all their software that they've done from the ground up. Software developers are all PhD's at their shop - they understand the elements that cause bad 3D viewing. It isn't a copy of something off the shelf, it is their ever evolving tool. Went from 10 up to 200 employees in 13 or so months.
have been at the mercy of the glasses - it is perceived that they are competitors to RealD, but when they started out 6 years ago, there was no RealD or anything else out there. If they'd launched back then, the studios would have said "where are you going to show it?" since studios needed more than the Imax screens to show it. Teamed up with New Vision Technologies - they needed an active glasses solution since they didn't think tens of thousands of screens would convert back to silver screens. Active gives the best separation, 200:1, but they needed something where cost was no biggie. The glasses we watched on were about $25/pair. In time, it'll be cheaper - there's just been no reason to do it in the past.
-were at the mercy of the skepticism over digital cinema
-an exhibitor got up at '96 ShoWest and asked about digital: "Why should I spend $150K/screen to do what I'm already doing?" Exhibitors make their money getting folks to come in and see product, and folks aren't going to come just because it says digital. Exhibitors were suffering because the video release comes out so soon after the movie - folks wait until it's on video.
-Lucas will redo all six Star Wars films, starting with the 30th anniversary (so 2007), to be dimensionalized by InThree. It gives them a reason to re-release. Is there a reason to re-release The Godfather? Not really. In 3D? "Of course," he says. I say maybe.
HEY - BY THE WAY - AND YOU CAN'T SHOOT IT OFF THE SCREEN WITHOUT GETTING A MESSED UP COPY - UNLESS YOU FILTERED TO GET JUST THE LEFT EYE, IN SYNC, WHICH WOULD BE A HUGE PAIN
1.) Is it realtime?
2.) Is it frame by frame?
Yes and no - it is akin to VFX work. Lucas, Jackson, etc. say it is like where they were when they started out. He's saying they've only scratched the surface - making it better and faster, advancing to the point where there's no limit to how many they can do per year.
3.) Does every movie need to be dimensionalized?
No need - but they hope that good 3d/bad 3d is taken seriously.
-they don't even deal with interocular - they don't care about it.
For them, on every scene they think of things in and out of the screen. They have active workstations where they can judge it on the screen and change it on the fly. VFX you see; 3D you FEEL - it works or it doesn't, you have a gut sense of it. It makes sense or it doesn't. Vertical disparity, where is infinity, what is too far? Every shot has multiple keyframes where they establish something that changes - camera moves, something comes into frame, etc. It's judged by how long something is up on the screen, and it's looked at by at least 6 people that eyeball it and see how it works. When he started doing it, he was concerned he was adapting to something, but with half a dozen folks eyeballing it, it stays consistent. Content can go anywhere. Screen size is a factor. Less light comes through the active glasses - you wouldn't want to watch a 2D movie through them, but the 3D is more compelling. With passive systems' polarizer, half the light is filtered and never hits the screen. NOTE - PROJECTION LIGHT LEVELS ARE LOW IN 3D. HE SAYS THE GLASSES WILL GET BETTER - WHAT ABOUT GETTING BETTER/BRIGHTER PROJECTORS?
MOVING ON -
How is the rollout going?
Dave Schnuelle - was very involved in Chicken Little
Chicken Little -
-had a very short timeline
-opening date doesn't move
-84 stereoscopic installs, vast majority in last 2 weeks, install locations all over the US
-some install sites required an airplane ride
-gear didn't always show up on time, so required more than one trip.
-D-Cinema was running on 84 screens on opening day, 3000 screens on film prints
-3D screens did far more business per screen than film screens
-people forgot they were wearing the 3D glasses - obviously not a problem
-in order to do the movie -
-the basic system was a 2K projector and a pair of Dolby show players to decode left and right eyes, with a synchronizer between 'em. Done for time - no time to do anything else. In the future with JPEG 2000 units, a single unit can do 3D as a standard feature.
-MA10 unit to run automated curtains and lighting
-preassembled short racks and shipped to theaters as a unit to save time
-encrypted files sent as MXF interop format while DC28 work progressed
-MPEG-2 used as compression codec for this, still using same frozen version of packaging standard, for now using an interim solution, at some point you'll see the industry have to make a massive, simultaneous switch to JPEG2000 based stuff.
-Z-Screen can be rotated in and out as needed for 3D or 2D usage
-the switching rate of the projector helps - they did 144 Hz, triple flash. At that flash rate the strobing effect isn't apparent to the human eye
-why 84 screens? That was every projector they could get their hands on. They ran out of parts, they all ran late - logistics and a really good shipping manager are essential. Even then, gotta plan for interruption of the schedules. Customs held projectors for several days for food and drug inspections. A hurricane was going through Florida the week they were setting up. Had warehouses with no power, had theaters with no power that had to be installed without regular power (brought in gennies), had to coordinate installation of RealD and silver screen stuff
-for three weeks kept in close touch with Dolby and RealD
-best way to get these things done is get everyone in one room looking at the same charts
-gotta have a bulldog attitude - you can't walk away and deal with it later, there is no later. You have to have a plan every time you hang up the phone. A local shipping specialist, tracking and controlling independently of the manufacturers, shortened the shipping timetables by as much as a week
-don't trust anyone, check and recheck
-better to be annoying than miss something
-confirm and confirm, get every field guy's cell #
-people in the field are the key - it all falls apart if they aren't conscientious, competent, and diligent
-had to not just install but also train their on site staff
-put 50 people in the field towards the end of the project.
Chuck Goldwater, in charge of DCI when it was going, is now president and CEO of the Christie/AIX joint venture that is putting in projectors - one of the two main organizations making D-Cinema happen.
-DCI spec published last July, stopped the moving target from moving so much, gave a foundation to start building digital cinema
-just under 300 screens in under 30 locations, 550 screens in 70 locations by end of June, by end of year 1500 installs (all JPEG based, presumably JP2K based)
-heading into the big summer season, excited to see so many JPEG digital versions. Signed deals for close to 2500 digital screens in 38 states, including all of Carmike, Ultrastar, Galaxy, Cinetopia, plus a 9 screener in NYC as a beta site.
All these folks did heavy research before signing up.
-Christie/AIX is ready to go on this stuff.
....and with that, I'm tired, I'm done, I have lots to do and write, so I'm bailing a bit early. I don't think I'm missing too much that's too important.
Lots of details, some I've covered, some new, definitely worth reading. I'm off to the Digital Cinema Summit right now, no time to summarize for you folks.
Go read it!
And if you're attending NAB, find me on the floor in the Red booth all day Monday and part of Tuesday.
UPDATE - OK, NOW I'VE ACTUALLY READ IT - THERE'S LOTS OF STUFF I HAVEN'T DISCUSSED YET, SO GO READ IT..
Ted talks about wavelet based codecs, how it's a tapeless camera, the use of digital magazines, all kinds of good stuff.
Windowing the sensor down to get higher frame rates. Lots of juicy details!
Codex will be in the booth showing recording capabilities, and Assimilate will be showing how Scratch can work with 2K files in a viable way.
What about NLEs?
And then you'll most likely see a Final Cut station as a popular NLE choice. A lot of folks know a big part of my life at AJA was working with Apple, so that is a logical choice for me when it comes to what NLE to show first. We will of course support as best we can whatever NLE, DI products, storage products, etc. that our customers for the Red camera ask us to support.
(from Ted from article)
This little board goes in a drive enclosure to let you attach 5 SATA drives that connect to the host computer by only one SATA cable. Performance tops out at around 225 MB/sec, but that's OK since that's more than most HD folks ever need for single stream playback.
Their usual incredibly long detailed analysis. Check it out.
This thing is in, or will be in, more and more enclosures. I think port multiplying is DEFINITELY the way to go for MOST enclosure needs.
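For a sense of the headroom, a back-of-envelope bandwidth check (my own numbers - uncompressed 8-bit 4:2:2 averages 2 bytes/pixel):

```python
def stream_mb_per_sec(width, height, fps, bytes_per_pixel):
    """Raw data rate of an uncompressed video stream, in MB/sec."""
    return width * height * fps * bytes_per_pixel / 1e6

# Uncompressed 8-bit 4:2:2 1080p24:
print(round(stream_mb_per_sec(1920, 1080, 24, 2)))  # ~100 MB/sec
```

So one uncompressed HD stream fits under a ~225 MB/sec ceiling with room to spare, and compressed formats need far less.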
Friday, April 21, 2006
I'm in Vegas already, playin' with my girlfriend Melissa (we saw Avenue Q last night, and I still have the song "The Internet is For Porn" stuck in my head). So no news today, but I'll be taking notes at the Digital Cinema Summit as usual tomorrow and posting them when I can.
For those interested in checking out the Red camera, or finding me to say howdy, I'll be in the Red booth all day Monday and part of Tuesday, SU1401.
Until tomorrow, I'm going to go do the Viva Las Vegas thang.
Wednesday, April 19, 2006
Once again, Scott Kirsner's a great resource. See the article he refers to, but also note the stuff Scott lists the other guy missed.
AE 7 new features that we care about:
-support for 32 bit floating point compositing - this is HUGE, and will result in better looking, more realistic composites. They take longer to render (such is life, my friend...). Note that this feature is only in the Pro version.
-new retiming capabilities - again Pro only. Supposed to be good. I need to download the demo and check it out.
-new UI stuff that is groovy
-3D features are also Pro version only
-new UI more like DVD Studio Pro instead of Palette Spackle 6.5.
-full featured curves editor for animation properties - which is the serious mojo for any motion graphics and visual effects work.
-new Pixel Motion is that improved remapping stuff - uses motion vectors, is mo bettah but mo slo.
-Pro version has Timewarp effect - has more options
-OpenGL 2.0 support- can preview antialiasing, motion blur, and blending in real time - which used to be a huuuuuuuge time sucker in the past. So a honkin' graphics card makes a difference. I'd like to test this feature against the range of Macs I have. Maybe I'll make a test file to send out for everyone to play with and stopwatch.
-32 bit mode supports OpenEXR, HDR images, Cineon, 32 TIFF & PShop, and more (hopefully DPX is in there too)
-3D engine unchanged
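To see why float matters, here's a tiny generic illustration (nothing AE-specific): an overbright highlight survives an exposure pulldown in float, but a clamped 8-bit-style pipeline throws the detail away first.

```python
def clip8(v):
    """Simulate a fixed-range pipeline: clamp to [0, 1] at each stage."""
    return min(max(v, 0.0), 1.0)

highlight = 2.0  # an overbright value, e.g. a specular hit from an HDR source
exposure = 0.5   # pull exposure down one stop

float_result = highlight * exposure        # 1.0 - highlight detail preserved
int_result = clip8(highlight) * exposure   # 0.5 - clipped before the math

print(float_result, int_result)
```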
So Pro version is clearly the way to go if you want to do heavy VFX/compositing/color work with this toolset.
Note, VERY IMPORTANT - it is NOT Intel native for OS X, so if you need to do any heavy AE work, a Quad G5 is the way to go for now, until that new desktop Mac ships, which I'd say is no sooner than this fall.
I could go either way on this one - Apple is not having its usual Sunday press briefing, so that could be interpreted as either:
a.) no serious new hardware/software to show, or
b.) Apple is playing ever tighter to the vest, and waiting to splash stuff on the show floor
The article also has some thoughts about Final Cut Pro 6, but I'm going to share my own thoughts on that later....bwah hah hah haaaaaaaaaaa.........
So, the first US consumer decks are shipping. Anybody getting one that reads this?
A very small list of films available from the get-go:
The Phantom of the Opera
Million Dollar Baby - NOT - delayed!
by April 25th - Apollo 13, Doom, then Jarhead, Cinderella Man, Assault on Precinct 13 on May 9, Chronicles of Riddick, Bourne Supremacy, Van Helsing, and U-571 on May 23.
Not an inspiring line up.
Of those, I'd like to see:
Serenity, Apollo 13, maybe Jarhead and Chronicles of Riddick...feh.
UPDATE - here's a review of the first unit from Toshiba. (the HD-XA1)
-it's slow - 35 seconds to boot until the logo, over a minute until a disk inserted at power up shows the first menu
-nearly 40 secs from disk insert to start playing - a lot of consumers might think it's broken
-you can bring up a navigation menu (such as for chapters) WHILE IN THE MIDDLE OF PLAYING and it is a graphic overlay - that is, in the vernacular, "totally schweet."
-It'll be about $800 in the US
-reviewer was disappointed
I'm certainly not going to rush out and buy one of these.
Interesting to note that HD-DVD players SO FAR appear to be hundreds of dollars less than Blu Ray players - that alone may tip the scales in the format wars. The codecs are the same, so visual quality should be the same.
Tuesday, April 18, 2006
SYNTHETIC APERTURE NOW SHIPPING COLOR FINESSE 2.1 HD+ STANDALONE
Advanced Color Correction Software Adds New Self-Contained Application
SAN JUAN CAPISTRANO, CA -- April 19, 2006 -- Synthetic Aperture today
announced shipment of Color Finesse(R) 2.1 HD+, the standalone version of
their award-winning Color Finesse color corrector. Color Finesse will be on
display at NAB 2006 in Synthetic Aperture's booth, SL5138G.
"As a plug-in, Color Finesse brought advanced color correction to whatever
host application it was plugged into," said Robert Currier, President,
Synthetic Aperture. "But being a plug-in placed limits on what features we
could offer. As a standalone application, Color Finesse HD+ breaks free of
those restrictions, and we've been able to greatly expand its capabilities."
Color Finesse 2.1 HD+ manages color correction projects on its own, with
its own timeline rather than relying on the host application's. Still
images, QuickTime movies, and image sequences can all be loaded onto the
timeline for correction. Supported image formats include JPEG, TIFF, Targa,
Cineon, and DPX. Clips can be grouped into scenes so that they can be
corrected all at once, and individual frames can be assigned markers for easy reference.
Integration with Final Cut Pro
For projects being edited with Apple Final Cut Pro, Color Finesse 2.1 HD+
makes it easy to import the project into the color corrector. Color Finesse
accepts Final Cut Pro-generated XML files which it can read in as a project
onto the Color Finesse timeline. Once the color correction is complete, the
project, now with corrected footage, can be exported to a new XML file and
re-imported into Final Cut Pro for any final rendering.
Color Correction Tools
Other features include realtime preview playback, import and export of
color correction settings via the ASC CDL (American Society of
Cinematographers Color Decision List), and the ability to store up to eight
QuickGrades(tm) which can be recalled with a single keystroke.
Color Finesse 2.1 HD+ ships with over 500 color presets, including film
looks and emulation of lighting gels from Rosco, Lee, and GamColor. Users
can also create and save their own presets.
Color Finesse 2.1 HD+ is resolution-independent, handling standard-def to
HD to film-res projects. For those whose needs are more modest, Color
Finesse 2.1 SD, which is restricted to NTSC/PAL resolutions, is also available.
Color Finesse works entirely in floating-point, allowing proper handling of
overbright image areas, avoiding unsightly clipping problems. Color Finesse
does not require a specific video card, allowing it to run on every type of
system, from laptops to multi-processor workstations.
Color Finesse 2.1 HD+ is fully compatible with the just-announced
Colorociter CS-1 Colorist's Workstation.
Color Finesse 2.1 adds direct support for video previewing without the need
for additional software. Preview hardware support includes FireWire/DV and
boards from AJA and Blackmagic Design.
To ensure accuracy, all previews can use LUTs created within Color Finesse
or imported from Photoshop-, Discreet-, or Quantel-format files.
Pricing and Availability
The Color Finesse 2.1 standalone application is currently shipping for
Macintosh with the Windows version scheduled for release in Summer 2006.
List price for Color Finesse 2.1 HD+ is $1995. Also available is the
standard-def-only Color Finesse 2.1 SD for $995. Upgrade pricing is also
available for those who previously purchased the plug-in version of Color Finesse.
About Synthetic Aperture
Founded in 1995, Synthetic Aperture is a provider of practical tools for
digital video and audio. Synthetic Aperture publishes the Color Finesse
advanced color corrector, Echo Fire video previewing software for Adobe
After Effects and Photoshop, the Test Gear test instrument plug-in for
After Effects, and the popular free utility Test Pattern Maker.
Further information on Synthetic Aperture may be obtained by calling (949)
493-3444, accessing the company's Web site at http://www.synthetic-ap.com,
or by emailing firstname.lastname@example.org.
Some quick first impressions:
-can import FCP XML files to transport timelines in and out...but as I learned from my six intense months of daily work with Final Touch HD, XML import/export isn't guaranteed perfect.
-it DOES handle still image file formats - which is a plus over Final Touch HD, which doesn't let you preview and color correct them within FTHD (as of version 2.2.4, last version I checked this on, v2.5.3 or 2.5.4 is latest I've seen posted on their site)
-can preview out of an AJA or Blackmagic card
-Mac version shipping now, Windows version shipping this summer (last NAB, he said Mac version would ship summer 2005)
-HD version $2K, SD version $1K,
-doesn't say anything about realtime performance, so I'll have to check it out
-says it'll work on any machine, no specific video card needed - that may be the tradeoff versus Final Touch HD (currently $5000 for the HD version), which has STEEP hardware requirements and is aimed at the experienced colorist.
-I'll definitely be checking it out at NAB
-there are 4 or 5 viable Mac based color correction software packages, listed here very roughly in order of workflow simplicity:
- Final Cut Pro's built in tools
- Color Finesse plugin in FCP
- After Effects after using Automatic Duck to get timeline to AE
- Final Touch HD using XML
- Color Finesse HD standalone (presumably this easy)
- After Effects with or without Color Finesse a clip at a time
- Shake for color correction
I'm slow on the draw on this one - I got tipped to it Sunday night but now is the first time I could blog on it.
-Digital Cinema camera
-records direct to disk (hallelujah! No tape!) using the Cineform RAW format
-Cineform RAW is a visually lossless wavelet based codec, particularly designed to work with RAW format
-designed for indie filmmakers
-combines a 12 bit 1920x1080p camera with Cineform RAW
-10 F-stop dynamic range
-7" LCD touchscreen interface (NICE! hopefully a good UI on it)
-IT friendly connectivity
-4 hours continuous shooting on hotswap 160GB notebook drive
-single large CMOS sensor
-good DoF - almost as good as S35mm
-interchangeable optical assembly allows for PL mount lenses
-F and compact C mount lenses can work too
-can use follow focus, matte boxes, etc. on 15 or 19mm rods
-detach the camera head and connect over Gigabit Ethernet (OK, that is HOT!)
-supports 1080p24, p25, p30, and 720p up to 72fps
-can edit using Prospect HD over a LAN
-or connect removable drive magazine to edit from that (connectivity is...?)
-10 bit, full raster codec - no subsampling! 10 not 8 bits! That's great!
-using a large aperture Zeiss S16mm lens, similar to a 35mm DoF when shooting F5.6
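Two of the quoted specs pin down the average data rate - simple arithmetic from the stated numbers:

```python
drive_bytes = 160e9      # 160 GB hot-swap notebook drive
record_secs = 4 * 3600   # 4 hours continuous shooting

avg_rate_mb = drive_bytes / record_secs / 1e6
print(round(avg_rate_mb, 1))  # ~11.1 MB/sec average for the Cineform RAW stream
```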
Mike's (more) Comments - this all sounds GREAT.
The good stuff:
-10 bit, full raster, untethered, virtually lossless codec
-native editing in Prospect HD, a pimped version of Premiere Pro 2.0
-Depth of field akin to 35mm
-very reasonable price if the image quality is good
-another good entry in the not cheap, but not hideously expensive digital camera range
-no deck required - just connect drive to an appropriate WinXP system and you can copy it over. Decks that convey this kind of quality are $80,000-$100,000.
But I think this is a great thing and look forward to checking it out at NAB - I already have an appointment of these guys.
The Bad Stuff:
-ONLY edits natively in Prospect HD/Premiere Pro 2.0 - how to transcode that data so it'll work with Avid or FCP? Perhaps shoot it over HD-SDI into another station? Is there another way?
-IT based workflow means IT based backups required - something different than "stick the tape on the shelf" to be learned and mastered
-no deck means limited interoperability with other NLEs, unless you use a Prospect HD system as a deck...which isn't such a bad idea, except for issues of 9 pin deck control, etc.
Friday, April 14, 2006
HD4NDs Exclusive: Ted Schilowitz, now "Leader of the Rebellion" for Red Cameras, talks about Red & NAB
I've known Ted Schilowitz (formerly of AJA, see article below) for about 3 years and periodically had a chance to talk and work with him. He's smart as hell and very affable and easygoing and always very energetic and enthused. And now he has a new job title:
"Leader of the Rebellion" for RED Digital Cinema Camera Company.
So what does that mean? He's going to be the point man at RED, touching on areas of marketing, engineering, the development process, you name it.
I talked to him on Tuesday, so I'll just dive right in with the Q&A:
Q: (Mike Curtis, HD for Indies) - So you've told me you're leaving AJA. What will be your role at NAB, and for which company?
A: Ted Schilowitz, Leader of the Rebellion:
At NAB I will be wearing the RED hat, as of yesterday AJA knows I'm leaving. (that would be April 10th - mike)
Key AJA folks have known for a while so that it could be smooth and AJA was covered, the transition has been happening for some time now.
Q: What will your role be at RED?
A: "I'll be employee #1 of RED."
"'Leader of the Rebellion' is the official title. Jim and I are driving this thing to the finish line. One of the things I like about Jim is that he's REALLY involved day to day, NOT in any way just a figurehead. This is his job, he is directly working on the project, he's got job responsibilities. We work hand in hand together every day, calls and emails at all hours of the day and night - he's legitimately working hard day and night. Akin to my last life at AJA, where the head guy works every day - he's the head engineer and gets his hands dirty. It is part of his job and what he likes to do. I've been lucky enough to work for two different people that enjoy the process, not just being the figurehead of the company."
Q: What was your motivation to leave AJA and go to RED?
A: Motivated by a couple of things:
"First, it was opportunity to change the landscape of another industry after doing so in desktop digital video world."
"Secondly, the time is right to prove a digital cinema camera can be created at a viable price point, and to take it out of science fair/experiment mode and deliver something that is viable for a full spectrum of users."
Q: OK, on to product - I see that all content on the website has been pulled - are all the previous specs up in the air?
A: "Things are holding true to form, nothing has changed to the point where people will be disappointed. The ship has been tightened up, we've been getting the development process in line to make sense for the project, but most importantly 4K/2K/1080p/720p are all still in there."
I said that it sounded like a very ambitious undertaking on a tight timeline -
"We set the bar high, and now we're hanging on for dear life," Ted responded jovially.
Q: OK, on to bidness: What will we see at NAB?
A: "You will see prototypes of the camera, you will see protos of how the camera will be used in a workflow, you will see things that really shock the world and shock the competition at how evolved and how much thinking has gone into it."
"This is not just your average bunch of crazy people working on something - these are REALLY crazy people having a good time and taking some hardcore risks, and now is the time to put our money where our mouth is and we're gonna show you what we've got and what we've been working on, one of our big targets is the person that spends their time reading HD4NDs- that type of indie maverick spirit that wants no restrictions on their workflow, and in their tools, they want something that really works for them without restrictions, our goal is to make that work for them."
(he's being lighthearted - I've met and talked to these people, they are hard core industry veterans who know what they're doing)
Q: Along those lines, what about codecs - have you picked one, how is it going?
A: "Codec development is coming on strong - we won't have specifics because we don't want to commit to what is still in development - that curve moves quickly in terms of what's on the landscape and what people know and understand. What I will say is that we are looking very hard at the advantages of wavelet based codecs to achieve the type of image quality we want to see at viable data rates for today's technology."
Q: What about your development timeline - how is it coming along, are you still going to be able to make the previous projections of testing this year and shipping early next year?
A:"We are in development and we are proud of it - we make no bones about the fact that we are in development, and we're going to share with you where we are, and we hope you will be impressed."
"We are working hard to hit initial targets, but because of the development process, that stuff will be a little liquid - we'll do our best to hit those targets - that being said, with any engineering product these are just estimates - we don't want to claim a ship date and miss it over and over. We're taking our best shot at hitting a ship date, but at this point it's just our best guess before we're ready to call it once it is close. We're just trying to be honest with everyone and let'em know how we're doing and where we are."
My (Mike's) takeaway: once they get close to the finish line and can see it, they'll be able to call the ball and announce a ship date.
Q: OK. Now at NAB, are you going to have any announcements about strategic alliances or partners - NLEs, accessories, hardware, software, etc.?
A: "In terms of development partners, one of the companies we're already in extensive talks with, and moving to a development cycle with, is AJA Video. There will be an announcement of something of significance between AJA and RED, and we'll talk a bit about it at NAB."
"It's really exciting for me personally - nothing would give me greater pleasure than to see these two worlds working together. AJA has been pioneering the ability to really take desktop video tools and make them viable and reliable for mission critical, true broadcast use. A lot of maturing had to happen on the capture card, CPU and software side, and at the right time, AJA came in and made it a professional tool. A few years later, AJA desktop technology on a Mac was used for little insignificant broadcast events - things like the Super Bowl and the World Series. I'm very proud of that fact and very happy to keep those two worlds connected and working together."
Q: I saw that the previously posted information on the website was pulled, and that in its place is now a countdown, counting towards NAB on the 23rd. Very Armageddon-esque, by the way, and I mean that like the movie, not the Biblical event.
A: "The first site was just to let folks know what was going on. Hold your breath to see what comes next. What you'll see next is very impressive."
(So sounds like no new info to be revealed until NAB, the countdown ends on Monday the 23rd - when the tradeshow opens at NAB. -mike)
Q: We were talking about the camera and what it was for - who was the target audience? His response:
A: "Your user group (talking about HD for Indies readers -mike) that reads the blog will be waaaaaay into it. We are focusing our design and stuff at the people that read your blog, along with the broadcast markets and ultra high end digital cinema world, that I think will embrace what we are up to. I believe that most folks want a product with no restrictions, they want a product that will elevate their workflow and elevate their quality."
Q: I asked another question, that in other words boiled down to would it live up to the hype that they were creating. He laughed and said:
A: "This thing is so f**ing wicked cool - this thing will knock your socks off, this thing is like no other camera that has ever been built."
Q: I then asked about testing that I'd heard about - were they going to show any test footage at NAB from the camera, or the chip, or anything?
A: "Camera tests are going fine, we decided to not show anything in advance of when we're ready to show 4K. We're pushing our way up the test regime, and are not ready to show images outside of a lab environment yet with our s35 sensor (give us some time...we're young!), that's part of the dev process we'll see in the next couple of months to show that. We have some targets towards Amsterdam and Japan in the Fall that will be a better time to see what we've got on that front."
Q: And will we be impressed?
A: "Hit'em hard, hit'em fast, make'em wonder where we've been their whole lives."
Q: What about this Mysterium sensor - is it ready yet? Is it real?
A: "Mysterium is under development - the whole process, the whole thing, is still under development, and likely to be more refined over time. We're just going to show you where we sit today."
(so it doesn't exist yet as a finished product - mike)
I then got very specific with Ted about what we'd see and hear about at NAB. Here's the bullet point Q&A:
At NAB are you going to:
Q: reveal pricing?
Q: Reveal codec?
A: Not final specifics, just goals and targets
Q: Reveal details on lenses & their pricing?
A: "Check with us at NAB."
(I guess SOME things will be kept a mystery until the last minute -mike)
Q: Reveal details on accessories?
A: Some details yes
Q: Pricing for accessories?
A: Probably not
Q: Reveal any NLE PARTNER details?
A: Only the fact that we certainly understand there are a number of good choices for NLE these days, and we will want to support what our customers want.
Q: We talked about the ongoing development process, and making big leaps rather than incremental evolutions. Revolution not evolution. I mentioned that true, big progress comes from the little mammals busily scurrying around the feet of the big dinosaurs, not from yet another bigger dinosaur than the one before.
A: "RED is a new mammal - it's really all about innate understanding that new thinking doesn't come from the old dogs. It comes from people way out in left field willing to take risks no one else will, push limits and push the envelope. That's where the surprises come from. It's been proven over and over, the ones you think will be the leaders of the next generation of tech are NEVER the ones who do. Look at music - if you'd asked 10 years ago who'd be the leader in terms of players etc., 9 out of 10 would have guessed Sony. Oops, look at Apple now. How do you change the rules? Number One, you set out to change the rules."
"I had success doing that with AJA, proving what good hardware and a Mac can do, that they could break a lot of rules. It's time to turn up the heat and take it up a notch." said Ted.
I (Mike) mentioned the quote about the unreasonable man -
The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man. -George Bernard Shaw.
Ted shot back the one about Teddy Roosevelt and the man in the arena -
It is not the critic who counts, not the man who points out how the strong man stumbled, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena; whose face is marred by dust and sweat and blood; who strives valiantly; who errs and comes short again and again; who knows the great enthusiasms, the great devotions, and spends himself in a worthy cause; who, at the best, knows in the end the triumph of high achievement; and who, at worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who know neither victory nor defeat. - Theodore Roosevelt, "Citizenship in a Republic," speech at the Sorbonne, Paris (April 23, 1910)
Q: That led into discussions of risky products and ventures -
A: "I've been lucky enough to be associated with successful projects, and I have a very very high confidence level in this one, the fact that we're willing to go way out on the edge and take a risk is huge - It's all about giving it your best shot, and putting all the effort into making it happen."
Q: Last I heard, you guys had a booth lined up for NAB - is all that still good?
A: "NAB south upper hall hasn't changed - SU1401. We'll have a 20x20 booth, it'll be something unexpected/expected all at the same time. Expect something different."
Q: OK, so at the booth - how many prototypes?
A: "How many protos? We don't know yet, we're riding the edge to get it all ready. The drama is good and everyone's having a good time. We'll do our best and hopefully have more than one."
---end of interview---
...and so I think that will make for a very entertaining experience. Their originally stated goals of revealing a non-functional prototype and their future plans are still in place, which is as much as they ever promised. I'll be looking forward to checking it all out, and if you'll be at NAB, I encourage you to hustle on over there the first day or two and check it out.
John Thorne has been groomed to be his replacement, and will be taking over Ted's duties there. "He'll do great," Ted says.
OK, readers, so why do we care about staffing changes at AJA? Read the next article for an HD for Indies exclusive...
One of the things that is making people say "Oh, yeah, I didn't think of that." is the fact that when you switch to MacTel, you also need to switch all your performance oriented software - Rosetta is fine for Word, but not for anything that requires realtime performance...like compressed video playback. Fortunately, Popwire fills in the gap with the Intel version of their Windows Media Video 9 playback component.
so yesterday I got the studio set back up (somewhat) in yet another punt configuration, just Macs, monitors, drives, and UPS's up on the tabletop and got them Ethernetted together.
Transferring uncompressed HD over a GigE network takes a while - surprise!
And while the most obvious answer might be to cue them all up to copy at once, that's actually quite a bad idea - because if you are writing more than one file to a hard drive (or array) at the same time, it is quite likely that you'll be writing them in a fragmented fashion, and that would hamper playback performance. That's bad, m'kay?
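How long is "a while"? Here's a rough back-of-envelope sketch in Python - the ~100 MB/sec sustained GigE throughput figure is my assumption for a well-behaved network, not a measured benchmark:

```python
# Rough estimate: wall-clock time to copy uncompressed HD over GigE.
# The 100 MB/sec sustained GigE payload figure is an assumption.

GIGE_REAL_MBPS = 100        # assumed real-world GigE throughput, MB/sec
FOOTAGE_RATE_MBPS = 160     # 10-bit uncompressed 1080i60 footage, MB/sec

def transfer_minutes(footage_minutes):
    """Minutes of wall-clock time to copy N minutes of footage."""
    total_mb = footage_minutes * 60 * FOOTAGE_RATE_MBPS
    return total_mb / GIGE_REAL_MBPS / 60

for mins in (10, 30, 60):
    print(f"{mins} min of footage takes ~{transfer_minutes(mins):.0f} min to copy")
```

Under those assumptions, an hour of 1080i60 material is roughly an hour and a half of copying per destination - before you even start organizing anything.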
Keeping the media straight and NOT linking to the wrong folder is what's going to give me the heebie jeebies over the next day or two.
With RAIDs plugged into different computers than they were when capturing on set, and multiple copies of ever-evolving FCP project files that started on a half a dozen different machines, I'll be darn proud of myself if I can keep it all straight.
So just getting all the footage on the same drive, and getting it all linked up will be a logistical challenge in and of itself.
At this point I've got nearly 500 individual clips, and I haven't even captured all of the media yet. Yikes!
Also, I've been getting some flak about recording uncompressed and comparing the footage that way, folks saying this doesn't take compression into account, and isn't a realistic real world test.
Answer 1: Duh.
Answer 2: Yes, comparing uncompressed footage between cameras and giving opinions based solely on that would be VERY unrealistic and not applicable to untethered field work. That is why we recorded to BOTH native media (HDV tape, P2 cards, DVCPRO HD tape, XDCAM HD cartridges) AND uncompressed to disk at the same time for most of our test footage.
When we went to shoot at Auditorium Shores (outdoor location), obviously it would have been incredibly burdensome to lug all our computer junk down there, so we didn't. The purpose of recording uncompressed was:
1.) Is there a significant difference between tape and uncompressed?
2.) Is it a useful and meaningful difference?
3.) How hard is it, really, to set up and capture this way?
4.) After learning the quality difference, and knowing how hard it is to do, under what circumstances would it make sense to routinely operate that way?
A partial answer to # 4 is this: obviously only in a studio location, or maybe on a BIG set that was going to have a video village anyway.
I have a 3 foot cube that is a road case on wheels with compartments for G5 & RAID and breakout box, I sit a 23" LCD on the top of it for computer and video monitoring with a switch box. I didn't set it up this time because it takes about a day to rig & wire it all up, and didn't have time, and didn't consider it worth the effort in this case.
Thursday, April 13, 2006
Hey all -
so here's what's up (this is just logistics, no real data of serious interest - skip down for learnable stuff)
-we shot Friday through Sunday, reviewed Sat/Sun
-tore down on Monday, captured some more footage for Adam Wilt to take with, he took off
-packed up Monday and didn't drive off in the U-Haul until 8:30pm
-dropped by Neil Halloran's to return his gear that he had graciously loaned (and he himself helped out on Saturday)
-neighbors of his had dropped by and were enjoying wine and barbecue; I ended up showing them the stuff in the UHaul and sold the red chair that is Oh So Lovely. Perhaps when he is sober he may not want it any more. : )
-in any case, came home and conked out
-unpacked the UHaul Tuesday, returned it
-now I have ALL of the computer stuff I own in original boxes or swaddled in padded wrap sitting in my living room, breakfast room, and studio
-I also have about 13 hard drives to return, and one RAID enclosure
-before I do that, I need to copy off all data from the questionable RAID and drives
-I need to consolidate all the footage in one place and start organizing it
-I need to then make a backup copy of that data
-then I can put my world back in order - I "broke up" an 8 drive RAID to make two four drive RAIDs
-THEN I can start organizing it all and getting it in proper shape
-THEN I can start doing meaningful analysis
-so it'll take me some time before I have footage analysis, like next week
-in the meantime, check out DVInfo.net's thread, and at some point Adam Wilt will have some stuff to say about it too.
MEANINGFUL DATA STARTS HERE:
Some observations from the shoot:
-plan further ahead than I did. For anyone else crazy enough to do a 6 Mac live camera shoot and have to buy, assemble, borrow, configure, etc. to get it working, the day before the shoot is camera capture testing, the day before that is setting up/hooking up/plugging in computer gear, the day before that is transport, the day before that is packing & padding and boxing up. Don't forget to make a packing list of everything you've taken to location, and if there is more than one person's gear involved, don't forget to label EVERY SINGLE LITTLE PIECE OF GEAR with a piece of tape with your initials or some other identifying mark that you care about getting back. It isn't that anyone is going to steal it, just that at the end of the shoot when everyone's dead tired and stuff is getting unplugged...whose is this?
-at last year's shoot, I had a little pad that each operator took notes in for timecode in, timecode out, settings, scene and take #, etc. This year we decided to centralize this data capture and Li was in charge of it, just one person. One thing that didn't happen was timecode in/out records for each camera for each shot - it would have been nice to have so that I'd know WHERE to go find a shot now that I'm capturing! My fault for not finding a way to incorporate that, either on the record sheets that Li was using, or giving each operator another pad to write scene/take/timecode in/timecode out on.
-along those lines, I forgot until towards the end a handy way to differentiate multiple cameras when all shooting the same thing - assign #s to those cameras, and at the beginning of the take, have people stick their hands in front of the lens with that many fingers held out. It doesn't have to be in focus, you just have to be able to count how many fingers. Obviously, five is the max you can do with one hand. : )
-we were VERY LUCKY that everything worked as well as it did and we only had a few snafus
-the AJA HD-10A will only do 59.94 frame rates; the Multibridge Extreme can also do 50Hz frame rates, so plan accordingly. UPDATE - THIS IS WRONG - apparently, I just didn't set it up correctly. Our unit was not labelled 50i/60i. I'll go back and look and double check, but I clearly didn't use it to its fullest abilities. My bad.
-the AJA HD-10A is small, light, and easily taped to the sticks to hang underneath with cable strain relief; the MB-X is not
-the AJA HD-10A has a convenient HD-SDI passthrough that helps for other monitoring activities. The MB-X does too, and has a component analog out as well
-the issue of POSSIBLE (unproven as yet) performance differences on the MB-X for analog to digital conversion is being vigorously investigated - BlackMagic is aware of the potential but unproven issue, but we need to table it until after NAB; they are getting ready to come over from Australia in no time. They are adamant that there is no underperformance issue. I did happen to capture the 24p JVC GY-HD100U footage through both the MB-X and HD10A so I'll be able to do some analysis and see if I can tell the difference, either visually or via scopes etc. Further update - Grant Petty himself emailed to tell me they are using the exact same conversion chip as used in the HD10A, so there should be no significant differences.
-I found it very interesting that I was able to capture uncompressed 10 bit 1080i60 footage (160 MB/sec data rate) and 10 bit 720p60 (150 MB/sec) on 4 drive arrays without apparent problems. I'll have to analyze the footage for dropped frames, but in our capture tests when I had "Abort capture on dropped frames" set for everybody's Mac, they did OK. Now, in order to do that, I had pre-tested all the arrays and partitioned them at the point where I guessed performance would fall below 200 MB/sec, and assigned the capture scratch to the faster of the two partitions. OH, and the other huge point in that success - they were ALL FRESHLY INITIALIZED/CREATED ARRAYS. So there were ZERO files already on those arrays. And I don't mean that the arrays were emptied off and trash emptied, I mean RE-INITIALIZED. I've seen performance differences between emptied vs. re-initialized arrays. I'm not sure what voodoo is going on there, but it made a difference. So freshly stripe your arrays before mission-critical (e.g. live) capture.
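As a sanity check on those data rates, here's the arithmetic. I'm assuming v210-style packing for 10-bit 4:2:2 (6 pixels packed into 16 bytes) - that packing choice is my assumption, but it lands close to the quoted figures:

```python
# Uncompressed 10-bit 4:2:2 data rates, assuming v210-style packing
# (6 pixels in 16 bytes) -- an assumption, not a capture-card spec.

V210_BYTES_PER_PIXEL = 16 / 6

def rate_mb_per_sec(width, height, fps):
    """Uncompressed 10-bit 4:2:2 data rate in MB/sec."""
    return width * height * fps * V210_BYTES_PER_PIXEL / 1e6

print(f"1080i60: ~{rate_mb_per_sec(1920, 1080, 29.97):.0f} MB/sec")
print(f"720p60:  ~{rate_mb_per_sec(1280, 720, 59.94):.0f} MB/sec")
```

That works out to the mid-160s and high-140s in MB/sec - close to the ~160 and ~150 figures above, which shows why a 4-drive array needs every bit of its headroom.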
-uncompressed capture is trickier than it would first appear - what you get out of the cameras is NOT 24fps coming down the wire, be it HD-SDI or component analog.
What you get coming down the wire from these cameras, be it HD analog or HD-SDI:
Panasonic HVX200 -
720p24 - you get 720p60 coming down the wire with 2:3:3:2 or 2:3:2:3 repeating pattern, not sure which at the moment, gotta look it up - that'll make life interesting in post...
720p60 - is 720p60 coming down the wire - whew! At least there it is 1:1
1080p24 - comes down the wire as 1080i60 with classic 3:2 pulldown, mixed fields. I dub this 1080p24on60
1080i60 - comes across as 1080i60, 1:1, so that the fields will match between live and tape capture
...so clearly a little research or documentation to be done here
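Since the 1080p24 mode above comes down the wire with classic 3:2 pulldown, here's a toy sketch of what that cadence actually does - this is the generic pulldown pattern, not camera-specific firmware behavior:

```python
# Toy model of classic 3:2 pulldown: 4 progressive frames (A,B,C,D)
# become 10 fields, paired into 5 interlaced frames -- two of which
# mix fields from different source frames.

def pulldown_fields(frames):
    """Repeat each source frame for 2 or 3 fields, alternating."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return fields

fields = pulldown_fields(list("ABCD"))
i60_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(i60_frames)   # the 'BC' and 'CD' entries are the mixed-field frames
```

Those mixed-field frames are exactly what pulldown removal has to find and discard in post.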
This camera has no HD-SDI output, so we used either the Multibridge Extreme (I dub it MB-X for short) or an AJA HD-10A to convert the HD analog outputs to HD-SDI for input into either the MB-X, an AJA Kona2 card, or a DeckLink HD Pro Dual Link card (have to check notes to see exactly what we used)
JVC GY-HD100U
720p24 - you get 720p60 coming down the wire, with either a 2:3:3:2 or 2:3:2:3 frame repeating pattern. Gotta research that to find out which. UPDATE-WRONG! It's a 1:1:1:2 pattern according to David of Cineform
720p30 mode - comes down the wire as 720p60, and I think you're actually getting 60 true fps down the wire, no frame repeats (I need to confirm that by examining footage, but that's what I recall from NAB last year at the demo). So that also has exposure ramifications - you're limited to the exposure times/shutter speeds of a 60p shooting system even though you're only recording 720p30 to tape.
This camera had no HD-SDI out, so we used an AJA HD10A converter to convert to HD-SDI for ingest on a BMD or AJA card.
Sony Z1U - we blew off CineFrame since it has been documented that it suxors (throws out half the vertical resolution), so we shot 50i and 60i instead. They come down the wire, predictably, as 1080i50 and 1080i60. I'll be messing around with some deinterlacing and retiming tools to see what kind of a 24p result I can get.
Since this camera has no HD-SDI outputs for uncompressed capture, we used an AJA HD10A converter for 60i footage, then the MB-X for 1080i50 shots. We missed capturing one series of shots uncompressed (we used 50i when others were shooting 24p) when we realized the AJA HD10A couldn't do 50 Hz (or perhaps it can and we just didn't know how to switch it to do so - anyone know? We were too rushed on set to dig out the manual). We swapped it with another capture station, I think the GY-HD100U's, and henceforth used the MB-X to capture 1080i50 from the Z1U.
Canon XL H1
1080-24F: even though it has a 24F (not 24p) mode, it does NOT send a 1080pSF signal down the HD-SDI as a Sony F900 might. Instead, it adds classic 3:2 pulldown, so you get 1080i60 coming down the wire, what I dub 1080p24on60 (actually, 1080p24on60i would be more accurate, but since there is no 1080p60 in widespread use, except for the Sony SRW-1 decks, and possibly more at NAB 2006, I'm going to let it slide for now).
1080i60 - comes as normal 1080i60 down the HD-SDI wire
...but I can't recall if this was the camera that DIDN'T send audio down the HD-SDI. I'll have to check my notes and footage to verify.
Sony F350 XDCAM HD
1080p24 - surprise! Even though it has a true 24p mode, and HD-SDI, it STILL sends 1080i60 down the HD-SDI cable, not 1080pSF, even though it is a Sony CineAlta. Harrumph. Classic 3:2 pulldown added that has to be removed in post. I see this as a clear example of market segmentation, and it is the lack of these kinds of features that makes me a bit dour about seeing that "CineAlta" badge on the side. But it has HD-SDI, which the F330 does not (right? Too many camera facts in my head right now), so that's a big plus.
1080i60 - comes down the HD-SDI nice and clean as you'd expect.
Oh, and audio was included on HD-SDI - we checked.
Panasonic Varicam
720p24 - For this test, I had just installed the AJA Kona3 card that AJA was kind enough to loan me. It has a preset that specifically mentions Varicam for 720p23.98, so I used that to capture. We were so run and gun that I didn't have time to test as carefully as I would have liked. HOPEFULLY it detects the "flags" in the signal coming down the wire that mark original from duplicate frames (when shooting 24p, the camera is correctly sampling time at 24 times a second, but repeats frames on the HD-SDI signal, marking those that are originals rather than duplicates for extraction). But doing some quickie captures and playback it looked pretty good, I'll just need to double check. If I did something wrong, or the Varicam or Kona3 doesn't act or wasn't configured the way I wanted, my 24 frames a second may not be the 24 per second that I wanted, I might have been "off cadence" and have some kind of stuttery playback with incorrect frames showing (repeats instead of originals, or originals skipped). Again, I'll have to hook everything back up and play it back and compare tape to live capture to see if I did it right.
720p60 - OK, this one should be easy enough so long as I used the right preset - 720p60 shot, 720p60 coming down the HD-SDI.
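The repeat-and-flag scheme described above - 24 real frames per second padded out to a 60p stream, with the originals marked for extraction - can be sketched in miniature. The flag representation here is purely illustrative; the real Varicam signal encodes this differently:

```python
# Sketch of the 24-over-60p idea: repeat frames to fill a 60p stream,
# tag originals vs. duplicates, and extract only the originals.
# The (frame, flag) tuples are an illustration, not the real signal format.

def pad_24_to_60(originals):
    """Fill a 60p stream from 24p frames (2:3 alternating repeats),
    tagging each frame as original (True) or duplicate (False)."""
    stream = []
    for i, frame in enumerate(originals):
        stream.append((frame, True))                         # flagged original
        stream += [(frame, False)] * ((2 if i % 2 == 0 else 3) - 1)
    return stream

def extract_originals(stream):
    return [frame for frame, is_original in stream if is_original]

stream = pad_24_to_60(list("ABCD"))     # 4 frames of 24p -> 10 frames of 60p
print(len(stream), extract_originals(stream))
```

If the flags are read correctly, extraction recovers exactly the original 24p frames; if the cadence is missed, you keep duplicates and drop originals - the "off cadence" stutter described above.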
Post Considerations for tape/P2/XDCAM HD
At this time, all I have to use is Final Cut Pro 5.0.4, I don't have the 5.1 update yet. So here are my capture options:
Panasonic HVX200
all formats - connect a FireWire cable to the camera with the P2 cards inside that contain the footage you want. File==>Import==>P2, and then you get a list of clips to highlight and click import. They come in with random file names, so you have to sift and add metadata. Slates essential!
JVC GY-HD100U
-connect via FireWire
-720p24 does NOT work with Final Cut Pro 5.0.4, and I've read no indication that v5.1 does any better. My best advice: use LumiereHD instead - they have support for capture and playback from this camera
-720p30 (NOT 60, 60fps only comes out the analog ports) - HDV capture works just fine over FireWire at 720p30 with this camera. Can use Make New Clip on Start/Stop, HUGELY helpful
UPDATE-WRONG! It's a 1:1:1:2 pattern according to David of Cineform for 720p24. So for now, I just captured over the HD-SDI to uncompressed, I'll come back with LumiereHD if I can get a camera to work with - anybody in Austin area have one they can loan me for a day or so?
Sony Z1U
-connect FireWire cable
-boot camera as 50 or 60Hz as needed
-use FCP presets to capture, not a problem at all
-CineFrame, which we didn't use, comes in as 1080i60 with 3:2 pulldown
Canon XL H1
-1080i50 - captures fine over FireWire using 1080i50 HDV FCP preset
-1080i60 - captures fine over FireWire using 1080i60 HDV FCP preset
-1080-24F, however, does not - I captured over HD-SDI as 1080p24on60i, will have to remove pulldown in post
Sony F350 XDCAM HD
-since there is no XDCAM HD native support for this in either FCP 5.0.4 or 5.1, had to capture over HD-SDI.
1080p24 - comes in as 1080p24on60i, so gotta remove 3:2 pulldown in post, same challenges as live capture
1080i60 - comes in as 1080i60 over HD-SDI. Again, frustrating - proper codec support and drivers could let me use GigE or FireWire or USB on this camera, but I can't...yet.
Panasonic Varicam (rev H we used)
-use Panasonic 1200A deck, you CANNOT capture using FireWire from the camera. The camera lacks FireWire and 9 pin deck control, so any kind of controlled capture has to be done with a deck. The 1200A is inexpensive (as HD decks go, about $30K with HD-SDI and FireWire boards installed list price). FireWire is an OPTION on this deck, make sure it's installed if you're renting. I'm planning on some thorough testing - capture BlackMagic, capture AJA, capture FireWire, capture analog for fun too, and see if there are any differences. Anyway, here's what I expect is the usual route:
-connect HD-SDI and 9 pin deck control cables, or can even use FireWire for deck control (did it the other week myself so I can confirm this works)
-log and capture in a "normal" fashion for 720p60
-use the 720p23.98 preset for the 24p capture to remove redundant frames
-connect via FireWire, use presets for either 720p24 or 720p60 and it works just fine.
POST CONSIDERATIONS FOR LIVE CAPTURED FOOTAGE
Now that it has all been captured, I need to sit down and foodle with how to extract the 3:2 pulldown from the 1080p24on60i footage, or the 2:3:3:2 or 2:3:2:3 frame repeats from the 720p24on60p footage. Since timecode does NOT get captured correctly when doing live capture this way (even if timecode were on the HD-SDI from the cameras that have built-in HD-SDI - and not all of them include timecode on HD-SDI - FCP can't de-embed and read that timecode), the usual system won't work.
Normally, when trying to remove 3:2 pulldown, you can count on the pattern (3:2 pulldown on 480i60 or 1080i60 or the 2:3:3:2 or 2:3:2:3 on 720p60) to start on either the :00 and :05 frame counts for 60i formats, or at least the :00 frame counts for 2:3:3:2 for 720p, or :00's and :05's for the 2:3:2:3 720p60 formats (See kids? Isn't this FUN!).
UPDATE-WRONG! JVC GY-HD100U for 720p24 - it's a 1:1:1:2 pattern according to David of Cineform, not 2:3:3:2 or 2:3:2:3
...BUUUUUUUUUUUUUUT since our captures are "wild" and the timecode for each capture starts at "00:00:00:00" regardless of what the timecode on the camera said....I am hoserated - each and every shot will have to be analyzed to figure out where the start of the pulldown/padding pattern is. NOT a recommended workflow for bulk work.
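That per-shot hunt is essentially a phase-detection problem: the 2:3 field-repeat cadence has a period of 5 fields, so there are only five possible offsets to test. Here's a toy sketch using idealized duplicate fields - a real tool would have to compare actual field contents, which camera noise makes much harder:

```python
# Toy sketch of finding the pulldown phase on a "wild" capture with no
# usable timecode: test all 5 cadence offsets against the observed
# pattern of duplicated adjacent fields. Synthetic data, not a video parser.

def cadence_fields(n_frames):
    """Ideal 3:2 pulldown field sequence for n progressive frames."""
    fields = []
    for i in range(n_frames):
        fields += [i] * (2 if i % 2 == 0 else 3)
    return fields

def dup_mask(fields):
    """Which adjacent field pairs are duplicates."""
    return tuple(a == b for a, b in zip(fields, fields[1:]))

def detect_phase(fields, period=5):
    """Return the cadence offset whose duplicate pattern matches."""
    ref = cadence_fields(len(fields))   # reference longer than the capture
    for phase in range(period):
        if dup_mask(ref[phase:phase + len(fields)]) == dup_mask(fields):
            return phase
    return None

wild = cadence_fields(20)[3:18]         # a capture that starts mid-cadence
print(detect_phase(wild))               # recovers the offset we sliced at
```

Even in this idealized form, it has to be run per shot - which is exactly why wild capture is not a recommended bulk workflow.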
The only time 1080p24 capture has been relatively easy is when using an F900 that spits out 1080pSF down the HD-SDI (and by the way, pSF stands for, I think, progressive Segmented Frames).
So all my source captures of 160 MB/sec will be processed down to about 130 MB/sec after 3:2 pulldown is removed from 1080p24on60i. The 720p24on60 should shrink more considerably, down to around 60 MB/sec I think. But yet more huge files to store and manipulate.
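The arithmetic behind those numbers is just keeping the 24 unique frames out of every 30 (1080i60) or 60 (720p60) captured:

```python
# Storage math for pulldown removal: only 24 of every 30 captured frames
# (1080p24on60i) or 24 of every 60 (720p24on60) are unique.

def rate_after_pulldown(capture_rate_mb, captured_fps, true_fps=24):
    """Data rate in MB/sec once duplicate frames/fields are discarded."""
    return capture_rate_mb * true_fps / captured_fps

print(rate_after_pulldown(160, 30))   # 1080p24on60i -> 128.0 MB/sec
print(rate_after_pulldown(150, 60))   # 720p24on60   -> 60.0 MB/sec
```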
Before the really nitty gritty comparisons can be done, I need to process ALL live footage down to their true 24p resolution for playback, and to add the correct metadata to all - so that I'll have source footage, 24p versions, 50i converted to 24p, etc. and be able to compare all this stuff. And have log notes and a database with all settings (iris, shutter, etc.).
That, my chilluns, is gonna take a while, and I don't even know if I'll be done with that stuff before I leave for NAB next Thursday. (I'm going for fun for two days, then the Digital Cinema Summit for two days, then the trade show madness begins. There's an extra twist about the trade show I'll share sometime before the show starts, too....)
OK, THERE! That ought to give you enough to think about until tomorrow's Next Exciting Episode...
Wednesday, April 12, 2006
Great loooong review of the Sonnet E4P PCIe eSATA cards - so far, the best solution I'm aware of for high speed SATA RAID on a PCIe Mac. As far as I know, this is THE ONE to get.
These guys have done their usual incredibly detailed analysis, at least as good as I would have done if not better. I even learned some good stuff in here, including NOT to install it in the 8 lane PCIe slot... which I think I did, so I'll now have to go back and re-test to double check performance.
Anyway, if you have a PCIe Mac or are thinking of purchasing one, and want to know which card to get, how it works, and what kind of speed to expect, this is the best article I've seen so far.
It's all German to me - literally. But if you look at the graphs, clearly not only is FCP 5.1 faster than 5.0.4 on G5s, but Intel Macs stomp the daylights out of G5s, even dual G5s. Of course, I have no idea what particular feature they are testing that whomps on the older Macs.
I'll see if I can get a translation - somebody emailed me one, I gotsta find it.
Chris Hurd, who runs DVInfo.net, and was the primary organizer of the shoot, has a forum up over on his site that has an ongoing discussion about the shoot, with commentary from Chris, Pete Bauer, Greg Boston, and others who were at the shoot and were hands on camera operators.
Read it for details from the shoot.
Tuesday, April 11, 2006
Daring Fireball: Several Asinine and/or Risky Ideas Regarding Apple's Strategy That Boot Camp Does Not Portend
A nice and realistic assessment of what Apple will or won't do regarding OS X software and Windows hardware and software.
Monday, April 10, 2006
We captured footage from some different cameras:
1.) XL H1 24fps - while the HD-SDI output is 1080i60 (24p with 3:2 pulldown added, as 24p or 24pA), when in 24F mode, Adam says they have a proprietary solution that allocates data rate optimally to the 24F frames. This also means you CANNOT record it from Final Cut Pro 5.0.4. 1080i50 yes, 1080i60 yes, but no 1080 24F (frame mode) - so NOTHING comes in over FireWire, not even 60i with 3:2 pulldown. Same kind of problem with 24p on the JVC GY-HD100U - can't record anything over FireWire in that mode... yet.
2.) Z1U - I like the location and functionality of the big flipout screen and playback controls - they're quite nice. I like the button location and wide, easy-hit button layout. I didn't like that the FireWire port wasn't responding. So we used the XL H1 to capture the footage - which for some reason surprised me. But it worked fine for 1080i50 and 1080i60. As for the Z1U, somebody said "Dead FireWire? That doesn't surprise me." - so apparently we're far from the first to run into this.
I spent the rest of the day getting the place put back the way it was when we started, shutting down, disconnecting, and packing up my gear and my friend's borrowed gear (thanks again Neil Halloran!), and returning Omega Broadcast Group's borrowed gear - two G5's with monitors, 3 HD LCD monitors, lighting gear, a million cables, their HVX200, their Z1U, their Varicam, etc. If you need video stuff, they've got it.
Around 8:30 I finally pulled out of Omega's lot in the U-Haul and headed to Neil's to drop off his gear.
It's 12:32pm as I write this, I am beat but happy and satisfied - I'll learn a LOT in the weeks to come from all of this footage.
But I've dozed off twice, so definitely time to go.
The lesser compression artifacts on the HVX200 (it is still noisy, however) put the COMPRESSION in a different class than the HDV based cameras - not the overall quality.
Thinking about the three mid-price cameras - the HVX200, the GY-HD100U, the XL H1 - they are roughly the same price in an actual, usable configuration - the JVC lists for $6K, the Panasonic is $8-$10K or more depending on how many P2 cards you want and can live with (or without), the Canon lists for $10K. (Once the FireWire or P2 recording devices come out for the HVX, the price config will change for those willing to put up with big rotating disks).
If you gave all three manufacturers the same pile of parts to choose from with a variety of different encoders, recording media, imaging chips, lenses, body styles, etc....you'd get about what we have here - three cameras where the emphasis was put in different places. My PRELIMINARY gut read at this point -
Panasonic put their money into the P2 part of it and the color reproduction. In order to do this, they added to those piles and took away from the imaging chip pile to meet their price point. (More on that - they went with lower res chips to get better low light performance and less noise than higher res chips would have yielded, all things being equal). Not to say AT ALL that the imaging chips are bad, just that in the overall balance of things, as compared to the others, that's my gut read - the P2 tech and nice compression technology is where they put their effort.
JVC put their money into interchangeable lenses, true 24p, and a nice looking image. The color reproduction could be a LITTLE nicer, the imaging resolution could be 1080i or 1080p instead of 720p, and the frame rate could have been higher with some different technology (it records 720p24 or 720p30; 720p60 comes out the analog component outputs only). The standard lens is nice but could be nicer, though obviously that would push up the price point. It's a nicely balanced solution - a good bit of each.
Canon seems to have put their money/effort/time into 1080i instead of 720p imaging chips, a good codec, 24F technology (note: NOT 24p), HD-SDI output, nice pro controls (full color matrix), and interchangeable lenses. The color reproduction is highly tweakable, and we didn't mess with it as much as we could have. What we saw was good but not phenomenal - the way we had it set up, it was like Kodachrome colorimetry. To fully explore the matrix options for best results would be a whooooole other test series to run, and we didn't have time - like a week in and of itself.
I don't know which color reproduction I like best yet - I think probably the Panasonic, but that's a VERY PRELIMINARY gut read, NOT based on enough side by side comparison. But the Panasonic "just feels good," and as a post guy, I like the idea that I wouldn't have to muck around much with the colors in post, because, well, the colors are already looking pretty real, as in realistic, as in what you saw with your eyeballs.
As far as codecs go, based on Adam Wilt pointing out a lot of things, I think Canon has the nicest of the HDV codecs, while DVCPRO HD is preferable in terms of fewer compression artifacts, although it suffers in terms of resolution - 1920x1080 formats are recorded as 1280x1080 (50i at 1440x1080), and 720p formats are recorded as 960x720 instead of 1280x720.
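Those stored-vs-full raster sizes work out to simple horizontal squeeze factors (the player stretches the stored raster back out on display). A tiny sanity check, using just the numbers above:

```python
# DVCPRO HD stores a horizontally squeezed raster; on playback it is
# stretched back to full width. Stretch factor = full width / stored width.
formats = {
    "1080 (most)": (1920, 1280),  # stored at 1280x1080
    "1080i50":     (1920, 1440),  # stored at 1440x1080
    "720p":        (1280, 960),   # stored at 960x720
}
for name, (full, stored) in formats.items():
    print(f"{name}: {full / stored:.3f}x horizontal stretch")
```

So 1080 material carries only two-thirds (or three-quarters for 50i) of the horizontal samples most people assume - which is exactly the resolution trade-off against the lighter compression.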
As for glass, "they fail in different ways" says Adam - the Canon has good detail but lots of chromatic aberration. The HVX lens goes nice and wide, is fairly clean, and clearly passes a lot more detail than the chips are capable of recording. "We've got nice vigorous aliasing as far as the eye can see - it's a demo case for what aliasing looks like," says Adam. The JVC glass - for the price it's an alright lens. Around F4, middle of the zoom range, it is nice and clean, but at the end of the range it portholes and vignettes a bit at telephoto extremes, and has chromatic aberration. "It was the only real lens" of the bunch, as camera folks call it - it has mechanical zoom and mechanical focus. (Glass comments are from Adam mostly - he knows 'em better).
As for the Z1U, we are just now capturing footage from tape, and during the shoot it was the furthest machine from my capture station, so I don't have an opinion on it beyond my previous doodlings and testing with it over the last year.
And as for the Varicam, we were using Omega Broadcast Group's Varicam, rev or model H (I'm not sure of the correct designation). But the latest one. I watched during live capture as I was operating that capture station, and it looked gooooooood. More review and analysis to follow.
Zane Rutledge is a filmmaker and friend of mine, and was oh-so-incredibly kind to help me/us out with the camera testing that went on from Friday through Sunday.
He's posted his preliminary thoughts (link above) about the shootout, and I like his takeaway - there is no one best end-all be-all camera, there are matters of subjective opinion and taste and preferences. My personal thoughts - this is especially true amongst the JVC GY-HD100U, the Panasonic HVX200, and the Canon XL H1.
We continued our camera tests:
Sony F350 (newer, nicer XDCAM HD)
Canon XL H1
We met down at Auditorium Shores at 10am, and talent (same two, Stacy and Carol Lee) showed up about an hour later. Stuart English stopped by with his two little girls, so an impromptu kid shot was set up. I popped in and out between talent wrangling and fetching some supplies for the shoot, and had to leave to meet Allan Barnwell up at Omega so he could let me in (on a Sunday) to go capture footage from the F350, which had to leave that afternoon.
Later I saw more footage, which consisted of some across the water shots, some zooms of downtown, the classic Austin shot - Stevie Ray Vaughn statue, Town Lake, the downtown skyline. Shots of the models in broad daylight in close up, medium, and wide. Shots of a long canoe with 10 or more rowers cranking away.
Civilians kept wandering up to figure out what was going on. Li finally said "the body has floated away, there's nothing to see!" and Nate Weaver suggested this brilliant response for the overly inquisitive:
"We're filming a mayonnaise commercial."
...which should instantly disperse any curious crowd. What's less exciting than mayonnaise....
While most were still down by Auditorium Shores, I drove back to start capturing from F350 and HD100U.
Sony F350 XDCAM HD (tapeless optical disc in a shell)
Capturing from the F350 was interesting - it gives a lovely thumbnail screen out the HD-SDI (there is NO component HD analog output on this camera!) in HD, which makes it a snap to see what's going on - a NICE touch! I kept trying to click on the thumbnails with my mouse in the capture preview window just by instinct. Since it records to a nonlinear cartridge and not tape, you get thumbnails of your video clips - very much like the thumbnail viewing mode on digital still cameras. Just press the "thumbnail" button and you get the list, press Play to play the clip. Since there is no 9 pin cable outlet on the camera, we had to capture wild. FCP 5.0.4 (and 5.1 as well) do not support the XDCAM HD format yet. 24p mode unfortunately comes out as 24p on 30i with 3:2 pulldown. Since we have no deck, and it doesn't come out as 24pSF (progressive segmented frames), we had to capture as 1080i60 (1080p24on60i is what I call it, since 24p with 3:2 pulldown means 1080i60). PHAT files, since there's no native codec support - just saved as 8 bit uncompressed - biiiiiiiiiig files considering how small they could be. Again, a "GIF saved as a TIF" solution - saving the files in the massive uncompressed format, even though there are files on a disc inches away I can't read. Grrr. Soon I hope, but I fear FCP 6 won't be until the July-October timeframe.
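For a sense of just how "biiiiiiig" those uncompressed files get, here's a rough back-of-envelope - assuming 8-bit 4:2:2, which stores about 2 bytes per pixel (luma every pixel, chroma every other pixel):

```python
# Why "GIF saved as a TIF" hurts: uncompressed 8-bit 4:2:2 1080i60
# is roughly 2 bytes per pixel at 29.97 frames per second.
width, height, fps = 1920, 1080, 29.97
bytes_per_pixel = 2
mb_per_sec = width * height * bytes_per_pixel * fps / 1e6
print(round(mb_per_sec), "MB/sec")                 # ~124 MB every second
print(round(mb_per_sec * 60 / 1000, 1), "GB/min")  # ~7.5 GB per minute
# ...versus the 35 Mbit/sec (about 4.4 MB/sec) the XDCAM HD disc
# itself records - roughly a 28x size penalty for the same footage.
```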
Panasonic HVX200: P2 solid state memory cards
Capture was a snap - hook up FireWire to the camera from the Mac, select File==>Import, and there's a P2 option. Adam did it for me as we were pressed for time, via FireWire from the camera. The first batch came in with all random file names; he said he'd goofed somehow and redid it. Not sure if that was just putting Card 1 clips in a Card 1 bin or what - I'll need to look at the results tomorrow - hope everything's slated. Similar to the F350, you get thumbnails, but unfortunately no view of thumbnails out the FireWire or component output. All files were imported no problem. So all frame rates and recording options (at least the ones we used) appear to be no biggie to get into FCP.
JVC GY-HD100U: HDV with a 720p30 and a proprietary 24fps on 60 fps timebase mode
The JVC was the first one where I finally used the checkbox in Clip Settings for "Make New Clip on Start/Stop" when capturing native HDV over FireWire, which allows you to COMPLETELY automate and walk away while capturing a series of clips. Truly a "Sliced Bread 2.0" award winner in terms of time savings. I know, I know, DV folks have been doing this forever, but the HD-SDI type of capturing I've been focusing on for the last year didn't have that ability. The camera sends 720p59.94 out the component analog outputs when shooting live (we ran that into an AJA HD10A converter and then recorded the converter's HD-SDI output into an HD-SDI card, either BMD or AJA, interchangeably). HOWEVER, when recording to HDV tape, it skips every other frame and you get 720p29.97 - half as many frames. Harrumph. But it captured just fine over FireWire; I just need to go back and add metadata to those clips now. 720p24, however, is a COMPLETELY different animal. It puts 24 fps on a 60 fps timebase. There is no 720p60 HDV, so Final Cut Pro gets completely confused and doesn't work - AT ALL. I'd expected to be able to capture 720p59.94 and process out the extra frames, but that isn't an option over FireWire - it flat out doesn't work. Solution? LumiereHD will capture and go back to tape with 720p24 footage from the JVC. I ran out of time to set it up since the camera had to go back, so I just punted and captured the footage uncompressed over HD-SDI. Lots of data, not so much information. Grr.
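Conceptually, "processing out the extra frames" from a 24-on-60 capture is just dropping the repeats. A minimal sketch, assuming the repeated frames are identical (or that you threshold a difference metric in practice) - the exact repeat cadence (2:3 padding vs. the 1:1:1:2 pattern mentioned earlier) doesn't matter to this approach:

```python
# Sketch: recover the unique film frames from a 24fps-on-60fps-timebase
# capture by collapsing consecutive duplicate frames.

def drop_repeats(frames):
    out = []
    for f in frames:
        if not out or f != out[-1]:  # keep only the first copy of each run
            out.append(f)
    return out

# 4 film frames padded out to 10 video frames (a 3:2:3:2 stretch of cadence):
capture = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
print(drop_repeats(capture))  # -> ['A', 'B', 'C', 'D']
```

Real tools have to do this on decoded video where "duplicate" frames aren't bit-identical, which is why dedicated utilities (LumiereHD, Cinema Tools) exist for it.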
One of the useful things about doing this blog is the near realtime feedback - a reader from Germany wrote in to say that in their tests, the Multibridge Extreme recorded less and less useful info as the brightness got higher. The Cobalt converter they used didn't. I'm not swearing this is true, but based on charts he sent me it is believable. I need to study and read up on this, but basically be aware that not all HD analog to HD-SDI converters are the same. I'd be inclined to believe that if there were a problem of this nature, AJA seems to have an edge on these kinds of issues (totally unscientific gut reaction), and perhaps this justifies their higher price. In any case, I take that to mean our tests are non-definitive on this issue to some extent UNTIL I can look into this converter issue. If I get less via analog than I do from FireWire, why am I doing it again? UPDATE: OK, I had that a bit wrong - high FREQUENCY detail is recorded a bit soft, not bright information. It had been suggested by the writer that perhaps a lesser, consumer grade A/D chipset was being used. Grant Petty of BlackMagic personally wrote me back and said this is not the case - they are using a professional chipset as is used by other pro gear, and his engineers are looking into this possible discrepancy. I'll keep you posted. After receiving that info, I went back and recaptured 720p24on60 footage from the JVC through BOTH an AJA HD10A as well as the BlackMagic Multibridge Extreme - it'll be interesting to see how they compare on that footage (all of which came from tape).
Canon XL H1
- captured 1080i60 fine over FireWire with automatic scene detection ON, and the same no-problem deal for 1080i50 from the camera. But 24F? A complete stumper, as it is supposedly a proprietary format with padding frames. NO Lumiere will help at this party - you're on your own. And no analog output that was immediately obvious to me. Will have to use Cinema Tools, guessing A frames, to figure this one out.
Varicam - haven't started capturing yet, Omega owns a 1200A deck I'll use. HD-SDI or FireWire, not anticipating any problems.
Z1U - haven't started capturing yet since Omega owns one of those as well. Didn't shoot any CineFrame 24 mode, since it is known to blow. But since we shot a lot of 24F with the XL H1, maybe we should have. I'd rather shoot 50i and deinterlace and retime. But both 1080i50 and 1080i60 SHOULD capture no problem. (We had FireWire bus problems later.)
SOME OBSERVATIONS, OFF THE CUFF AND PRELIMINARY:
As we reviewed the footage later, by this time we were getting a pretty good idea of what to expect. Most of this is stuff Adam Wilt pointed out:
-JVC GY-HD100U - I was wrong in my assessment of the color rendition earlier - it actually looks quite nice seen on the Canon 1440xsomething res projector against a screen. Detail was good but not the best; with a square pixel sensor and a 1280x720 recording format, it held nice detail. The 13x lens, which has actually dropped in price to about $8K street - more than the camera WITH its standard lens - makes a BIG difference: buy your way into a sharper image with that lens. BUT... the codec is the weak point of the camera - there is some temporal sticking (in waving trees or other slow, fine motion, blocks of pixels won't update for 6 frames). If JVC were to update/replace the codec - best case by changing the firmware, but more likely, we guess, by developing new silicon with a new codec - it would make this camera a much more solid contender.
-HVX200 - great color rendition, but relatively low resolution in this crowd. You can see aliasing in this one you don't see in the others. Probably OK for present generation of home theater HDTVs, but not so good for theatrical or higher resolution home theater systems I speculated. GREAT codec performance, since there is no compression frame to frame, all frames are compressed freestanding, which means no frame to frame compression artifacts. This camera is also pretty noisy. Operators complained of how heavy it was to hold as a handheld rather than shoulder mount camera. But I love the way the color looks. You see color and contrast first, later possibly notice a lack of resolution.
-Canon XL H1 - the best HDV codec of the bunch. GREAT detail, decent but slightly funky color rendition. I'll need to check my notes, but even with carefully set up shots, skin tones were a little pinky/orangey maybe? And did highlights clip in a very video way?
-Z1U - footage not reviewed; Adam was already well familiar with it. He said Sony tends to have a very video-y look, with highlights that blow out quickly.
-Sony F350 - 35mbit looked better, but not hugely/amazingly better. I could very quickly tell the difference just looking between 18 and 35 mbit. 18 looks OK, but you see 35 and realize how much better it is (we didn't bother with 25 mbit - you either want quality or space, and probably won't split the difference that often, unless trying to maintain timeline compatibility). NICE SHARP DETAIL, decent but not outstanding color rendition. At this price, however, it should thoroughly rock. It does do quite well. But Adam noted it was closer in behavior (I think he was talking about focusing) to 1/3" chips than 2/3" chips. This camera was also the one with the most options, and they weren't sufficiently explored. There is probably more performance to be gained from this camera; we didn't have time to learn how to get it. We discussed that someone was told a Sony rep confirmed that while the SIGNAL is FORMATTED as 4:2:2 out of the HD-SDI, it only contains 4:2:0 worth of data. While some called this bit withholding or lame, I called it market segmentation. And Adam pointed out it might be something technical, like they are relying on a chip that converts RGB to 4:2:0 that's already built and working and just dropped in, and to do otherwise would be a ton of work. Oh, and highlights blew out disappointingly quickly for a camera of this price. But the image did look GOOD and SHARP. Definitely shoot 35mbit. Canon's 25 mbit CBR HDV looks, from a codec perspective, to be darned close in quality to Sony's 35mbit VBR XDCAM HD recording. Canon's codec is better from what I've seen and what has been pointed out. I'd expect the codec performance (blocking, quantization errors, temporal sticking issues) to be better on the Canon than the F350. But the F350 has bigger chips, a nicer lens, etc.
-Varicam - didn't review it, and didn't take it out in the field - we deemed that we'd seen enough footage from it, and it's expensive in case we dropped it (no production insurance that I'm aware of in place on this gig).
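For what it's worth, the budget math behind those 18/25/35 Mbit comparisons is simple - divide by 8 for megabytes. (The ~23 GB single-layer Professional Disc capacity in the comment is my recollection, so treat it as an assumption.)

```python
# Quick recording-budget math for the bitrates discussed above.

def mb_per_minute(mbit_per_sec):
    return mbit_per_sec / 8 * 60  # megabytes of recording per minute

for rate in (18, 25, 35):
    print(f"{rate} Mbit/sec -> {mb_per_minute(rate):.1f} MB/min")
# At 35 Mbit that's 262.5 MB/min, so a ~23 GB single-layer Professional
# Disc (if I have that capacity right) holds somewhere around 85-90 min.
```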
Some non-definitive brainstorming:
If I were doing something strictly for HDTV with a lot of fast motion, didn't have a deep budget for post and color correction, and had around a $10K camera budget (assuming purchase, or the rental equivalent), I'd think about shooting HVX200 and editing natively. The good color reproduction requires less tweakage in post, the post workflow is simple and works on inexpensive equipment without requiring HD-SDI, and it has low drive space and drive throughput requirements but is still a nice codec.
If, on the other hand, I was shooting something I was confident would be seen on a larger screen, and it included a lot of indoor shots or minimal fast pans or fast motion, and had an NLE that could handle 24F mode (or 1080i50 in a pinch for later conform to 24p), I'd consider the XL H1.
Those are just two examples - don't rush off and plan your project based on them. They ALL have various pros and cons, and none is outstandingly better than the others at a given/similar price point.
CLOSING SUMMARY FOR THE DAY:
OK, I can't keep my eyes open any longer, I think I just dozed for 5 minutes. So I'll punt until later, hope I don't have any typos -
-for home theater destined stuff, the HVX200 looks awfully good "right off the truck" as they say - meaning if well shot, uncorrected, it looks awesome. MOST home theater systems would have a hard time showing the shortcomings, except for some detail aliasing I THINK (need to verify more later)
-BUT for theatrical, the artifacts would DEFINITELY show if filmed out or done on proper high res truly 1080 res or 2K projector.
-XL H1 was nicely sharp, for instance, and would show better detail.
(see Adam's point about XL H1 like the Z1U but with 50% better detail)
Adam mentioned at one point that beyond about F5.6 you start getting less resolution out of these cameras - something about limits of diffraction something or other I'm too tired to remember all the details of.
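The diffraction point Adam made can be sketched with the standard Airy disk formula (spot diameter is roughly 2.44 x wavelength x f-number). The pixel pitch below is my guess for a 1/3" chip with about 1280 pixels across roughly 4.8mm - an assumption for illustration, not a measured spec - and "spot spans about two pixels" is a rough softness criterion, not a hard rule:

```python
# Why stopping down past ~f/5.6 costs resolution on small-chip cameras:
# the diffraction (Airy) spot grows with f-number, and once it spans
# roughly two pixels, no lens quality can bring the detail back.

def airy_diameter_um(f_number, wavelength_um=0.55):  # 0.55um = green light
    return 2.44 * wavelength_um * f_number

pixel_pitch_um = 3.7  # rough guess: 1/3" chip, ~1280 pixels across ~4.8mm

for f in (2.8, 4, 5.6, 8, 11):
    spot = airy_diameter_um(f)
    soft = spot > 2 * pixel_pitch_um
    print(f"f/{f}: Airy spot {spot:.1f} um{'  <- diffraction-limited' if soft else ''}")
```

With those numbers the spot overtakes two pixels right around f/5.6, which lines up with Adam's rule of thumb.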
Sunday, April 09, 2006
short version - LOVED the color on the HVX200, the JVC looked better than my first impressions indicated, and setting up gear is Fraught With Horror when you haven't done proper legwork. We got LUCKY that we got everything to work. Woops, lost like three paragraphs here. This'll have to do.
The facts: Zane met me a little after 8am and we planned on building a couple of RAIDs with new drives that had arrived priority FedEx the day before, using a couple of enclosures I brought from home that I'd disassembled and removed the drives from. We sat down with a couple of screwdrivers and 16 little bracket parts that require 4 screws each, and discovered that the new drives were... wrong. ATA, not Serial ATA. Sigh. Another priority overnight FedEx wasted. Ugh.
So we bailed on that and set out to get all the capture stations up and rolling smoothly, and started assigning computers to cameras. I'd asked a bunch of folks to show up and help capture, and in the end, I got 5 volunteers:
-Rita Sanders, editor on the doc Slam Planet that debuted at SXSW that I did some post work on
-Neil Halloran, local filmmaker and client of mine, who was INCREDIBLY gracious and generous to let me borrow not just his person for most of Saturday, but also his COMPLETE G5 based editing system, including his 19" JVC studio HD monitor.
-Lary Cotten, who deep geeks deeper than anyone I know (and I mean that in the best way), CTO of OpenLabs, makers of bogglingly phenomenal pro audio gear that mashes a computer with touchscreen into a musical keyboard
-Craig Negoescu, co-founder of OpenLabs, and like Lary and myself, an ex-frogdesign employee (he was one of the handful who started the Austin new media office where I worked)
-Jenn White, a friend and local DoP who ended up running the Varicam a good portion of the day while I captured the output from it
The plan was to capture the uncompressed output of the cameras - either directly via HD-SDI, or from the cameras' HD analog component outputs, through either an AJA HD10A converter (HD analog component to HD-SDI) or the analog HD inputs on my BlackMagic Multibridge Extreme, into an HD-SDI capture card, either BMD or AJA.
Why? Because you get ZERO compression artifacts, since all these sources ARE before compression; because you get the full raster (1080i HDV is only 1440 pixels wide, not the 1920 most assume, for instance); and because you get 10, not 8, bits of color depth (more subtle color choices available - can your camera reproduce them?). Of course, uncompressed capture is only viable under limited circumstances, such as on a studio set, greenscreen set, or a BIG set that already had a video village type of need.
Please keep in mind these are the random floaty brain bits of an exhausted guy after another 14-15 hour day on set. These are preliminary notes and observations, and don't have the qualifying wrapper of serious analytical comparison - these are my gut reactions at this late hour.
Duly disclaimered, read on:
NOTES ON SHOOT:
Adam Wilt, Camera King on this shoot, who has better training and more experience in these things than I, commented on some footage we viewed (I missed a bunch, but here is some):
-ADAM on 18 vs 35 mbit on the F350 - "18 looked like a bit more degradation when sitting 2 feet away from screen, 18 looks better than I would have expected" - (I'm curious about 18 here vs. JVC 19 mbit?). In motion they both looked good when viewed from 15 feet away, but looking at stills, BIIIIIIIIIG difference. (I thought 25mbit HDV from XL H1 held up surprisingly well against 35mbit F350. But F350 is a COMPLICATED camera, and it just started shipping a few weeks ago, and none of our crew had spent huge gobs of time with it).
-F350 harder to focus - something strange about pulling focus with this camera, a lot closer to 1/3" focusing behavior than 2/3" - is it the viewfinder or the depth of field? The viewfinder was hard to use - contrast was all the way down in the viewfinder, which made it tougher to focus
-The thing that Panasonic does better is naturalistic color, good gamma, and skin tones - going into overexposure it doesn't blow straight out like the Sony (even the F350); skin tones going into overexposure are NICE
-F350 reportedly has only 4:2:0 amount of information in the 4:2:2 stream coming out of the HD-SDI, when asked to explain at 10pm after an 8am crew call, I blurted "It's like saving a GIF as a TIFF." to which I'd now add "...and then smiling like you did a great thing."
-the question that popped into my head - after a CURSORY examination of some of the footage (I ended up running the Varicam capture all day today, so I thought all the footage looked great since that was what was in front of me), I went and looked at some Z1U footage. Comparatively, eww. But hey, whaddaya expect when one camera costs 15-20 times more than the other?
-I then thought about this - esp. on the cameras that don't produce color well at first blush (and based on EXTREMELY PRELIMINARY GLANCING WITHOUT EXTENDED TESTING - insert my extended "I RESERVE THE RIGHT TO CHANGE MY MIND" disclaimer here), I'd put the GY-HD100U and the Z1U in the "not so great/not so pleasing color rendition" category.
ANYWAY, for the cameras that don't do nice pleasing color with low noise, even capturing uncompressed (and I gotta look at the uncompressed footage), my gut says it is of lesser value to go to the trouble of capturing uncompressed (see the troubles below) from these cameras - better off going compressed (or uncompressed) from a better camera. That's my gut vibe now; I'll have to see how it turns out later. Because...
...recording uncompressed to disk is fairly complicated. You can say "Well hell, you just buy a G5, a Sonnet E4P card, a 4 or more drive RAID, a BMD or AJA card, and away you go!" To which I say, "Kinda sorta." On the fly, later in the day, we decided to shoot some tethered (record uncompressed to disk via Mac with HD-SDI capture card) footage from outside the building. Problem - it's about a 175 foot cable run from where the Macs were set up to where we wanted to shoot. We hooked up two HD-SDI cables and used a barrel adaptor to link 'em together. We took the BlackMagic Multibridge Extreme box (remember, all the brains 'n guts are in the box - the card is barely anything, just a connector) outside, and ran that long HD-SDI back to another capture Mac in the studio. Hooked it up, fired it up, and... nothing. Changed to an HD-SDI camera (not component analog) and... something, but gibberish. I then realized the problem could lie in:
1.) the long cable run, esp. with a barrel adaptor
2.) something misconfigured in the Multibridge
3.) If something were misconfigured in the Multibridge, I'd have to troubleshoot it, and possibly take my laptop out and hook it up via USB to configure it until I got it working...with the G5 200 feet away.
4.) Or it could be a problem with the capturing Mac
...so I bailed on tethered, uncompressed recording, since in addition to the trouble I was having at the moment, I'd be having to switch the HD-SDI cable from capture Mac to capture Mac to put in the right capture bins, or straighten it out later, or possibly troubleshoot any new difficulties.
Now, on a "normal" (if there is such a scenario) shoot wanting to shoot uncompressed, there would largely be one chunk of technical troubleshooting and then you'd be good until something went wrong or broke.
So I bailed on it. Tethered capture to uncompressed is a hassle - gotta have all that extra gear out on set, it's noisy, sucks a ton of power, requires another person to operate, whine whine bitch moan and YEAH, it is a pain. So think twice, nay nine times before going to the trouble.
And practice BEFOREHAND. I could have easily spent 2-5 more days prepping, testing, etc. Only because I've been playing with this stuff for 2+ years is it even sane for me to have tried this. To have only failed on a 1080i50 vs 1080i60 issue for a few shots, and had to cancel one test, ain't bad at all I figure - and I was LUCKY. There were many, many possible points of failure in this ordeal today.
FYI, unlike last year, switching frame rates on Varicam didn't crash Macs attached to it at the time. Either an AJA vs. BMD thing (BMD trouble last year), or
-when I lamented to Nate that I'd had a few problems today, Nate said don't sweat it, we'd done better than he'd expected when he heard what we were trying to do. Boy, that helped!
-along those lines, I hoserated myself by NOT Sticking To The Plan. I'd sat down with a spreadsheet and carefully calculated which Macs, with which capture cards, which RAIDs, and which RAID cards would be cobbled together to record from which camera. What size computer monitor, whether it could run out a computer LCD with the video image on it, the RAID speed, the card's capture capabilities - all factored in. It was very specific and limiting. At one point, I'd plotted out that the Z1U should go to the Quad G5 with the Multibridge Extreme. I figured heck with it and used a different Mac running through an AJA HD10A converter - big mistake: the HD10A doesn't do 50i, so that Mac had to sit out some of the capturing.
GEAR: Oh, MAN, I had a ton of different stuff! A sample listing of gear used:
-Quad G5 2.5 GHz
-Dual 2.3 GHz PCI-X G5
-Dual 2.0 GHz PCI-X G5 (multiple)
-Dual 2.5 GHz PCI-X G5 (multiple)
-AJA Kona2 (multiple)
-AJA Kona3 (and why does the Kona LH have analog HD input and this doesn't?)
-AJA HD10A (LOVE these! Had two on set)
-BMD Multibridge Extreme card/breakout box
-BMD DeckLink HD Pro Single Link PCI-X
-BMD DeckLink HD Pro Dual Link PCI-X
-BMD DeckLink HD Pro Dual Link PCIe
-MacGurus 5 bay eSATA port multiplying enclosure with five Seagate NL35 400GB drives
-MacGurus 5 bay eSATA port multiplying enclosure with five Hitachi 400 GB drives
-MacGurus Burly Box enclosures (two 4 bay enclosures) with 8 unknown drives (Seagate 7200.8's if he got what I recommended)
-two arrays comprised of Firmtek Seritek 2 bay enclosures with Seagate 7200.8 drives (set up two of'em, for 4 drive array, partitioned at 560GB to be sure it's fast enough)
-LaCie Biggest S2S 5 bay enclosure (went unused, didn't need after all)
-Trans International Mini-G 4 bay SATA enclosure (one I did a review of)
-huge, dim, blurry, 80-90 pound Power Computer PowerTron 24" CRTs that I never, ever, EVER want to lift again
-Apple 15 or 17" LCD, old plastic style (went unused)
-Apple 23" LCDs (multiple)
-Dell 2405 24" LCDs (multiple)
-Apple 30" LCD (ooooooooooh, I want one - it's so big you could crawl into it through the screen.)
-Sonnet Tempo 4+4 SATA cards
-Sonnet Tempo 8 port eSATA card
-Sonnet E4P cards (in both Quad G5s)
-JVC 19" broadcast HD CRTs
-Panasonic 17 or 18" broadcast CRTs with HD-SDI and component inputs, and built-in waveform - LOVED these, but Jordan said they cost $-$5K
-then of course the cameras:
Canon XL H1
Sony F350 (the new XDCAM HD)
HD10A thoughts -
GOOD: rockingly small, lightweight, easy to use, has HD-SDI passthrough (smart!)
BAD: no 50i support (I got bit by that), requires use of dip switches on bottom (doesn't auto-detect in some cases), a touch pricey these days
Multibridge Extreme - a bit complicated since it does so much. Took me about 15 minutes of troubleshooting, even after working extensively with BMD products for a couple of years, to figure out that selecting analog inputs had to be done in the Preferences Pane, not in FCP (or if it is in FCP, I didn't know where to look. Of COURSE, I didn't read the instructions, why do you ask?)
-AJA HDP, BlackMagic Multibridge Extreme's DVI output, and BMD's HDLink - great for checking focus on set, not so great for interlaced or true 24p footage, and not sufficient for very fine/finicky color correction work - poor black levels, impossible to calibrate by standard video means, and a JVC 17" CRT is too close in price to compete with. For COLOR, just get the real thing (a broadcast CRT) rather than a simulator - CRTs can't be beat for color accuracy, fidelity, black levels, etc. But for checking DETAIL on set, these DVI solutions can't be beat.
Is it worth capturing lower level cameras uncompressed? I'm kinda thinking not, just seeing their compressed output. Even without compression artifacts, there are still artifacts.
The HVX200, which was ranking low in resolution tests, and was described as noisier than many if not all others (I'll have to prove that, may well be wrong), looked GREAT as raw footage. Colors were just a bit off from other cameras. I'd think about capturing this uncompressed. Or at least, the Z1U I probably wouldn't; dunno on the GY-HD100U. The XL H1 looks pretty darned good with HDV footage.
BUT is the Varicam worth capturing uncompressed, ever? In theory the answer is a clear yes - you gain:
10 vs 8 bits/pixel color depth
none vs. moderate compression
1280 vs. 960 recording pixels
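For the curious, here's the back-of-envelope math behind why uncompressed capture is such a data hog compared to tape - a quick Python sketch (my assumptions, not gospel: 4:2:2 sampling so 2 samples per pixel on average, video only, no audio or container overhead):

```python
# Rough comparison: uncompressed 4:2:2 capture vs. DVCPRO HD's fixed tape rate.

def uncompressed_mbps(width, height, fps, bits_per_sample, samples_per_pixel=2):
    """Approximate video-only data rate in megabytes/sec for 4:2:2 capture."""
    bits_per_second = width * height * fps * bits_per_sample * samples_per_pixel
    return bits_per_second / 8 / 1_000_000

# Uncompressed 1080i59.94 (29.97 frames/sec) at 10 and 8 bits, full 1920 raster:
rate_10bit = uncompressed_mbps(1920, 1080, 29.97, 10)
rate_8bit = uncompressed_mbps(1920, 1080, 29.97, 8)
# DVCPRO HD records onto a fixed 100 Mbit/s stream regardless of content:
dvcprohd = 100 / 8  # megabytes/sec

print(f"10-bit uncompressed: {rate_10bit:.0f} MB/s")  # ~155 MB/s
print(f" 8-bit uncompressed: {rate_8bit:.0f} MB/s")   # ~124 MB/s
print(f"DVCPRO HD tape:      {dvcprohd:.1f} MB/s")    # 12.5 MB/s
```

That roughly 12x gap between tape and uncompressed 10-bit is why every capture station needed its own striped RAID.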
Also, specs aren't everything. While the HVX200 kept sinking in the ratings in the objective tests (Chroma DuMonde, etc.), when I saw footage I LOVED it - colors look vivid and real in Cine-Like D. Now, liking the color reproduction of a camera CAN be a subjective thing. But nobody chimed in saying they liked the HD100U, Z1U, or XL H1 color reproduction better. I likes it.
But I'll have to refer to footage and gauge for myself.
OK, I just hit the point of about to fall asleep on keyboard.
this should keep you busy for awhile.
No pics tonight, too much work to do now.
All this is totally random, incomplete, and inconclusive.
....and this picture was so blatantly dorkified, I couldn't pass it up. So completely, ludicrously, deliriously stupid, that in the infinite wisdom of sleep deprivation after four or five 16 hour workdays, I feel I must post it on the net. For all those who think I'm an idiot, here, indeed, is, if not proof, then your low hanging evidentiary fruit. Nay, a t-ball shot, with bat gracefully proffered.
Take your best shot.
What it is - we shot a bunch more test footage Saturday, including some outdoor shots in broad daylight. I would walk, then run past camera, loop around, and sprint at camera (or as well as my saggy 37 year old self could do after 6 cameras and takes). The reason: to provoke compression artifacts from the pebbly building wall, the grass, the trees waving in the wind, to see if blockiness occurred in the motion blur during fast pans, etc. For kicks I made this, um, let's just call it "face" at the camera running by on one of the last ones.
While reviewing footage tonight in the shooting area on Omega's 50" HD plasma display, Adam Wilt (in front with camera in lap) happened to pause on this winner of a frame and we all immediately burst out laughing, so Chris Hurd needed to take a picture of a video still on a TV coming from a breakout box with loopthrough attached via HD-SDI to camera, all while I'm taking a picture of him taking a picture of me but it is a picture of me not me.....to bed.
It was suggested that we use this as the cover art for the DVD of all the material from the shoot. I said it would then have to be called Extreme Indie Filmmaking.
I am dead tired, 2am as I finish this. I'm out, shoulda gone to bed an hour ago, gotta get up in 7 hours at the latest...
Saturday, April 08, 2006
Friday, April 07, 2006
Hey all -
it's nearly midnight and I'm BEAT, was on location at 6:45 this morning. Not my usual timing!
Short Version - see annotated pictures from the shoot.
So here's what happened:
PREAMBLE, WHICH YOU CAN SKIP, MOSTLY ME WHINGING ABOUT HOW LATE AND POORLY I PLANNED, DESPITE MY "HEROIC" (read: frantic and ill-advised) EFFORTS:
Saturday through Wednesday - order gear, try to get organized, email shot list ideas back and forth with Adam Wilt, Chris Hurd, etc. Try to find recruits to help run the capture stations - we've got SIX cameras that we are going to record from, simultaneously, uncompressed to disk. Gotta get six G5's together, with RAIDs, HD-SDI cards, etc. UGH! Mucho mucho mucho logistics to be run. Arrange for camera talent. Buy/rent duvetyne backdrop. Do we have a trumpet or trombone available? Will it rain, and if so what day, and if so what fallback contingencies? Try to arrange an F900, and provide what the rental house requires to make them WANT to donate it instead of rent it to a paying customer. On and on and on.
What actually happened: Gear shows up Wednesday not Tuesday. Drive model desired isn't available, go with new replacement (DANGER! Unproven!). Order a disk array enclosure from one vendor, drives from another (Danger again! No one stop conflict resolution!). Major, major trouble with array. Vendor goes to HEROIC lengths to help, emailing in the wee, wee hours Thursday morning. Thursday more gear shows up via FedEx, start hooking up yet more gear. Don't start packing up until after noon. (I should have arrived at Omega by that time at the LATEST). Don't get UHaul truck until about 2, only with HUGE HELP from Rhonda Schneider, sweet friend that she is, who in fact Rocks Like Slayer for dropping everything and helping me out for several hours. Zane Rutledge shows up and helps me load the truck with basically everything that has an electrical plug in my house - 3 G5s, 5 RAIDs, every monitor I own, somebody ELSE'S entire HD editing station (Thank You Neil Halloran!) including 19" HD monitor. I don't arrive at Omega until about 20 minutes before they are supposed to close.
Jordan Hristov (in charge of rentals) from Omega Broadcast Group helps me unload the truck into Area 1, but I am massively behind. I am so hosed.
Brief aside - Omega was INCREDIBLY generous with us in terms of giving us space to work in, letting us borrow an ungodly amount of gear for the testing, and staying late to let us do our thing. Big, BIG props to Jordan from rentals and Allan Barnwell from sales (who babysat our mad crew into the late hours), and David Fry (owner) for letting us come play in his yard, so to speak. Omega has cameras to buy or rent, they rent and sell edit systems, do training, all kinds of stuff. If you're in central Texas and need stuff, they are absolutely worth talking to. Jordan Hristov is in charge of rentals, and Allan Barnwell is in charge of sales.
Chris Hurd shows up, and eventually we all (me, Nate Weaver, Adam Wilt, Li, Greg Boston, Pete Bauer with his XL H1, Mike Devlin who came down with his own F350 XDCAM HD, and I forget who all else at this late hour as I type this) rendezvous at Omega Broadcast, the incredibly generous hosts of this event, opening up their space and their rental department to us (the fools! : ) ). We more than took them up on it, looting, I mean pilfering, I mean borrowing a lot of cables and whatnot. Seriously though, David Fry and Allan Barnwell and Jordan (runs rentals, The Guy To Know) of Omega have been a huge help, they have tremendous resources ("You need 4 HD monitors with HD-SDI inputs and built in waveforms? Yeah, we got at least that many sitting around available...").
We decide to relocate into the smaller, shorter, but air conditioned adjacent space. Realize I need to hump 600-800 pounds of gear into the next room. EEK.
FRIDAY MORNING, SHOOTING DAY ONE:
I arrive extra early at 6:45am (traffic's a snap if you get up before God), and start setting up gear. There are so many untested, newly arrived, or never configured solutions it is just ridiculous. I could have spent 2 full days just transporting, setting up, and testing everything to be ready. With Zane helping, we managed to get through the day.
(Tidbit learned - unlike some configs I've worked with, you can change the input format on an AJA Kona2 without rebooting and it'll recognize different flavors of video input - not always the case in the past)
We get a couple of capture stations up and running while Adam and crew (Boyd Ostroff showed up, Mike Devlin, Nate Weaver, Pete Bauer, Li who came down with Adam, who am I forgetting?) set up the set. Stuart English (Mr. Varicam) dropped by for a bit to check in on us and talk stuff.
About the time things were moving smoothly, a band started playing on the other side of the wall and we lost power to all the computer stuff. Fortunately, UPS' were plugged into everything mission critical and we were able to save and quit and shut down....mostly. Well, at least save anyway, so then we had to source power from elsewhere in the building using extension cords. Sigh. The area where we were wasn't really designed to do a 6 camera shoot with lights and 6 HD uncompressed capture stations all with HD monitors as well as computer monitors.
SHOOTING AND TESTING
Today was setup, charts, and high dynamic range testing. Here's random thoughts about the day that are flopping out of my head.
I'll have more to say later, but some tidbits:
-HVX200 wasn't as good as I'd hoped, confirming earlier reports
-XL H1 HD-SDI contains neither timecode nor audio - repeat, NO AUDIO ON HD-SDI! Grr.
-24F mode - blah - loses vertical resolution. That, or 50i and a good software deinterlacer? We'll find out in post.
-XL H1 - looks pretty darned good, esp. with good glass on it. We did some lens tests.
-Adam mentioned something like kinda sorta ballpark 10% better res in 1080 rather than 720 mode with HVX200 - confirming my gut vibe from earlier tests, 1080 doesn't offer a HUGE advantage. Now, that's live feed. From tape, to be seen and figured out...
-BlackMagic Multibridge Extreme is darned handy - ran it as a standalone converter at first - HD analog component in, HD-SDI out to a "plain" HD-SDI card (forget whether AJA or BMD, wouldn't matter I'd imagine). THEN also ran component outputs to a JVC 19" CRT.
-looks like I was wrong - the JVC 19" CRT does not appear to do 1080i50, unless I was doing something else wrong. All moving too fast to verify, so I need to double check that. If that's the case, that puts a dent in my "shoot 1080i50, deinterlace, and conform to 24p, you'll be fine" theory and plan for affordable(ish) hardware for a post workflow.
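For anyone wondering what that "shoot 1080i50, deinterlace to 25p, conform to 24p" idea actually costs you, here's a quick sketch of the arithmetic (hypothetical clip length, just illustrating the math - conforming relabels 25 fps frames as 24 fps, so every frame is kept but playback stretches by 25/24):

```python
import math

def conform_stats(shoot_fps=25.0, play_fps=24.0, clip_seconds=60.0):
    """Runtime stretch and audio pitch change when conforming 25p to 24p."""
    slowdown = shoot_fps / play_fps           # playback stretch factor
    new_seconds = clip_seconds * slowdown     # runtime after conform
    # Audio conformed the same way drops in pitch; expressed in semitones:
    pitch_shift = 12 * math.log2(play_fps / shoot_fps)
    return slowdown, new_seconds, pitch_shift

slow, secs, semis = conform_stats()
print(f"stretch factor: {slow:.4f}")              # ~1.0417, i.e. ~4.2% longer
print(f"60s clip runs:  {secs:.1f}s")             # 62.5s
print(f"audio pitch:    {semis:+.2f} semitones")  # about -0.71
```

That ~0.7 semitone pitch drop is why conformed audio usually gets pitch-corrected (or the production just lives with it, film-to-PAL style in reverse).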
-but at the same time, also running to a 17" LCD for pixel-for-pixel preview of GY-HD100 output. My generic and non-spectacular Dell 17" LCD (1280x1024) is a GREAT preview for 720p. Right width, 60Hz. Not a color accurate solution, but very good for discerning detail and focus (I thought anyway, have to ask others what they thought).
-this thing would be a great way to evaluate gear in the field - so many pieces in use. It'd take hours to days to log all learned, talk to folks, document what got used how...I'll have to save a Profile Report for each station, too much to keep track of!
-tearing down is going to be a HUGE pain - even though I spent hours putting tape with names on stuff (and others did too), disassembly and return to rightful owners will take a long time, with inevitable mistakes.
-AJA HD10A rocks - used it on the GY-HD100U in 720p59.94 mode, on the Z1U in 1080i59.94, and on the HVX200 in 1080i mode.
-ALL of the cameras tested that do some kind of purportedly progressive 24 mode send it down HD analog as 1080i60 - NOT 1080pSF. So no true 24p capture; gotta do pulldown removal in post. TBD the best way to do that, which will probably be a bear.
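To illustrate what pulldown removal has to undo, here's a little sketch of the standard 2:3 cadence (assuming the usual AA BB BC CD DD field pattern; real tools also have to detect where in the stream the cadence starts, which is part of what makes it a bear):

```python
# 2:3 pulldown: 4 progressive frames become 5 interlaced video frames
# (10 fields). Pulldown removal inverts this mapping to recover the
# original progressive frames.

def two_three_cadence(num_film_frames):
    """Return (field1_source, field2_source) film-frame indices per video frame."""
    pattern = [(0, 0), (1, 1), (1, 2), (2, 3), (3, 3)]  # per group of 4 frames
    frames = []
    for group in range(num_film_frames // 4):
        base = group * 4
        for f1, f2 in pattern:
            frames.append((base + f1, base + f2))
    return frames

video = two_three_cadence(4)
print(video)  # [(0, 0), (1, 1), (1, 2), (2, 3), (3, 3)]

# Video frames whose two fields come from DIFFERENT film frames are the
# "dirty" ones a pulldown-removal tool has to detect and recombine:
dirty = [v for v in video if v[0] != v[1]]
```

Those two dirty frames per group of five are why a straight deinterlace of pulled-down footage looks wrong: you have to recombine fields across frame boundaries, not within them.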
-FINAL CUT 5.1 is starting to arrive to folks - Omega got theirs in the mail today, my friend Charlie Wan did too.
-more to say later, that's a start.
-the Apple 30" display is MASSIVE. Seven inches makes a BIG difference. I want one BADLY now. No time to test BMD Multibridge Extreme on it, but I will.
-Kona3 works like a charm so far - more on it later
-Multibridge Extreme has a specific capture mode for HD analog as a preset - THANK GOD! I was worried about how to config it.
-Kona3 - tiny little mini-HD-SDI plugs - kinda cool. Yet another unique cable you have to keep track of - not so cool.
-same for the unique power cable of Quad G5's - not generic, not interchangeable. Thought I'd lost one, which would have meant I couldn't use that computer - a good example of why Standard Gear Matters: if the system relies on having exactly one particular cable, and that is a hard-to-find, takes-days-to-order-and-get thingie, that is BAD
-note how convenient that 90% of rest of gear uses a standard computer power plug - no worries at all, I've got bags of'em.
-There is No Such Thing as having Enough Cables. Of any sort.
-AJA HDP is darn handy on set, and will scale a 720p signal to fill a 1920x1200 monitor (less black edges top and bottom, which is OK and correct) - great for getting focus Just So for chart tests. Not so great for super color accuracy, but it's not supposed to be. We used it on a Dell 2405 and it worked fine.
-this was my first field deployment of a Burly Box 5 bay hotswap port multiplying enclosure with 5 400GB drives. Working well so far in limited testing to capture uncompressed 1080i60 10 bit video - heaviest data rate you're likely to use.
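Here's the rough sanity check I run for that kind of array (the per-drive number is my conservative guess for SATA drives of this class, not a measurement - drives slow down significantly as they fill toward the inner tracks):

```python
# Can a 5-drive stripe sustain uncompressed 10-bit 1080i capture?
# Assumed numbers, for illustration only.

TARGET_MBPS = 155    # ~uncompressed 10-bit 4:2:2 1080i59.94, video only
PER_DRIVE_MIN = 40   # conservative sustained MB/s per drive (worst case)
DRIVES = 5

# Ideal striping scales roughly linearly with drive count:
array_floor = PER_DRIVE_MIN * DRIVES
headroom = array_floor / TARGET_MBPS
print(f"array floor: {array_floor} MB/s, headroom: {headroom:.2f}x")
```

A headroom factor around 1.3x is thinner than it sounds once you add filesystem overhead and the occasional slow seek, which is one argument for partitioning off the slow inner tracks of the drives the way the Seritek arrays above were set up.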
-Got all six stations running, amazingly well considering how little prior proper planning was done on my part. Right now, one Mac's RAID is twitchy (keeps going offline), one RAID is having trouble and drops frames (I'll try and replace it by building up another one on set tomorrow), one RAID works but mysteriously drops offline from time to time.
-Everything Takes Longer When You Try To Do It Right. We basically shot charts for all cameras in various frame modes: 24p or its closest equivalent (24F, CineFrame), 50i, and 60i/60p. Then we set up and lit and shot high dynamic range test shots, shooting one camera at a time from the same sticks to minimize angle variance.
-tomorrow we have waaaaaaaaaaay more to shoot than we could ever finish, so we'll do some serious battle triage to our proposed shot list.
-excited to have uncompressed to disk results of a bunch of stuff to compare to stuff on tape. Today was all static tests; tomorrow's motion tests will tax the MPEG-2 codecs much more heavily.
-if someone (uhhhh....me) forgets to bring tape stock, it's awfully nice when the facility has'em in stock for everything you need.
-slates are sooooo handy - even logging stuff into Final Cut as we shoot, even doing shot logs (Li is Script Supervisor, aka Shot Nazi, while Adam and I share Clock Nazi status, Nate and Pete are slate ninjas). Getting off track - point being you HAVE to have multiple places of collecting metadata, because it just takes so much effort to get it right on the fly in these situations with 9000 things going on, that you need LOTS of redundancy to be sure you get it right. Slates, log notes, AND logging info in as you go are the only defense as far as I can tell.
-lots of Adam conferring with others I didn't hear since they were at cameras and I'm at capture stations - what interesting info did we miss and not record that they were talking about? I'm not talking footage, I'm talking around-the-campfire good info about things they noticed, liked, disliked etc. Just too hard to record every bit of speech, notes, thoughts, etc. Too much going on, too fast.
-Chris Hurd rocks as social director
-really hard work can be really fun - I'm exhausted but having a ball.
Tomorrow the talent shows up - model/dancers, actor/martial arts guy. Motion studies, low light performance, and other good stuff. All tethered and recorded uncompressed as well. Talent was told to bring camera-unfriendly attire - we WANT to see how badly reds bleed, stripes jitter, etc., to see which cameras work best. So test footage won't be pretty, but it'll be informative.
OK, too tired to say more. Go look at annotated picky-churrs from the shoot today.
Thursday, April 06, 2006
Canon XL H1
Sony F350 (the new XDCAM HD, the nicer of the two)
Adam Wilt of DV Magazine and Chris Hurd of DVInfo.net are my co-conspirators - they'd already planned this shoot so I'm jumping in with them. We'll be capturing a lot of footage uncompressed to disk as well, so should be quite interesting and exciting.
And Omega Broadcast Group is going to graciously let us shoot in their space and raid all of their parts bins - we'll be using their Z1U, their HVX200, and their Varicam, as well as lighting gear and who knows what all else.
Here's Chris' Announcement from DVInfo.net:
Announcing the Texas HD Shootout -
Starting today, powerful forces are converging on Austin, Texas for a four-day round up of all the major HD camcorders you guys like to discuss so much around here.
Adam Wilt of DV Magazine, Mike Curtis of the popular 'HD For Indies' mega-blog, and elements of the DV Info Net team will convene in a series of studio and location camera tests in a variety of different situations. Everything will be recorded, of course, and the plan is to offer feedback to interested parties through Adam's articles for DV Magazine, Mike's online journal at HD For Indies, and right here on our message boards at DV Info Net.
The camera line-up includes the Panasonic AG-HVX200 DVCPRO HD P2 camcorder, the Canon XL H1, JVC GY-HD100 'A' with 16x and 13x lenses, and Sony HVR-Z1U HDV camcorders, and the Sony PDW-F350 XDCAM HD camcorder. Plus we'll also have a Panasonic AJ-HDC27 'H' VariCam DVCPRO HD camcorder for reference.
Adam is our technical lead and will be directing with Nate Weaver assisting. Mike is co-directing and managing a truckload of uncompressed capture stations, and DV Info Net regulars Boyd Ostroff, Greg Boston, Pete Bauer and Mike Devlin are lending a hand with all of the cameras and other production requirements. There are a lot of other folks involved who are helping out in several capacities. I can't wait to tell you more about them. We'll keep you guys advised on how this is going day by day.
Meanwhile, although I can't promise this will happen, I'd like to encourage any Austin area members who might be up for dinner and drinks on Friday or Saturday night to please shoot me a private email. Our crew is already talking about going long, so a public get-together might not actually happen, but let me know at least if you're interested and if we can pull it off then we'll manage that by email. Hopefully we can get some Shiner Bock flowing.
Updates follow as this thing progresses -- and be sure to also watch for Mike's entries over at his site, http://www.hdforindies.com."
-I'll blog when I can as we go.
Busy, no time, read if interested.
This hints that we should see some killer performance once we get Intel native apps for OS X.
Wednesday, April 05, 2006
Well, interestingly enough, just a few weeks after the indie geeks got XP running on Intel Macs, Apple has announced their own bootloader that will allow an official and supported path to get Windows XP running on Intel based Macs (Intel Minis, MacBook Pros, Intel iMacs).
Apple will not sell or support Windows, but you'll be able to install your own separately purchased copy of WinXP on an Intel based Mac. Officially, this is just beta software. Leopard, the next major OS release due next year, will include this feature AFAIK.
From Apple's site:
Here's how it works:
First, you need to make sure your Intel-based Mac has the latest version of Mac OS X and the latest firmware update. These provide technologies that make Boot Camp possible. It's also wise to print out the Installation & Setup Guide.
Boot Camp burns a CD with the drivers Windows needs to recognize Mac-specific hardware. It is very important to do this before starting the Windows installation.
The software also helps you set aside hard drive space for the Windows installation, without moving any of your Mac files around. Just drag the intuitive slider to choose the size that's right for you. Boot Camp also helps you remove the Windows partition, should you so desire.
Next, insert your Windows installation disc, restart and follow the Windows installation process. The only tricky part is selecting the C: drive manually. Be sure to get this right, or you could erase your Mac files accidentally. Remember, Apple Computer does not sell or support Microsoft Windows.
After the installation process is complete and your Mac has booted Windows, you'll need the Macintosh Drivers CD you burned previously. When you insert the CD, it will automatically install the drivers. Follow the instructions in the Installation & Setup Guide for helpful hints.
Other goodies: it'll include drivers for:
The Eject key (on Apple keyboards)
Brightness control for built-in displays
-it'll let you install WinXP Home Edition or Professional with Service Pack 2.
-USB mouse and keyboard (Apple's for sure, third parties unknown); definitely no Bluetooth
-gotta have latest firmware and 10.4.6
-gotta have at LEAST 10GB free on your drive
-it will let you partition a drive WITHOUT reformatting it, so if you have your current Intel Mac already set up, it'll "make room" for a Windows partition - THAT'S a biggie! Dunno if FireWire is supported for boot.
Be sure to check out the Requirements, Installation, and FAQ document as well.
Think Secret - Apple adds Windows booting feature to Mac OS X
AppleInsider | Analyst says Apple's Boot Camp could be "game changer"
MacInTouch: timely news and tips about the Apple Macintosh
Macworld UK - Apple software lets Intel Macs run Windows XP
Lengthy user discussion on MacinTouch.
Mike's comments -
Well, this totally rocks, and for creative professionals this could be a Really Big Deal.
Somebody asked me what the big deal was when the indie geeks got it working a few weeks ago with a complex workflow, and I said this would allow creative professionals to have one box to do all their work.
In a perfect world, you'd have virtualization software that would let you run an instance of XP at the same time as OS X, but we aren't there yet.
What you COULD do, for instance, is run software under Windows that you can't under OS X, especially 3D software. Imagine doing 3D renders with Max, or heavy After Effects composites (since no Intel native OS X AE until next year) under Windows - you render to QuickTime, then reboot and can edit in Final Cut. Smooth as possible? No, but it does give you the option of running ONE box instead of two for the budget impaired.
For business users, the ability to run their Windows only apps and still be able to reboot to run the other stuff will be a nice feature as well. Indie/freelance types that buy their own laptops would consider it as well - Windows during the day, OS X at night. But I wouldn't expect most purchasing offices to be so generous as to pay for Apple's premium hardware AND a freestanding (NOT cheap!) Windows XP install just for convenience's sake. But this WILL make it easier for creatives in large companies to get Macs I should imagine. They can just stipulate that it's a Windows box with special needs or something. : )
However, this does expose Apple to a risk akin to the dire fate of OS/2 from IBM years ago - if you can run Windows on the box, and developers already have Windows apps, why should the developers write OS X native apps for that "extra bit of convenience?" (Truthfully it is WAY more convenient, but that may not be the perception). The move to Intel makes it easier to make OS X Universal Binaries, and to port Windows apps to OS X, but the risk of Boot Camp is that it may disincentivize them to do so.
So Final Cut Pro - no, excuse me, Final Cut STUDIO version 5.1 isn't supported on Intel Mac Minis.
Some thoughts on this -
Some are speculating it is intentionally hamstrung so you can't run the software on the cheapie box. Nah, I don't think so - I think it most likely has to do with the built in graphics chipset not being fast enough or easy to support, either for Motion (most likely) and/or for FCP's realtime capabilities.
That is NOT to say that Final Cut Pro won't run at all - it might run OK but simply have no support - I have no data on this as yet. It WILL install according to Apple's page. Will it run? Or crash? Or will some things not work right? We'll have to find out.
Final Cut Studio 5.1 is not supported on Intel Mac Mini (early 2006) is how they describe it on their site. I glean/guess the following from that:
1.) There might be a mid- or late-2006 model
2.) that later model might run FCS 5.x (or later)
3.) Nothing says that Final Cut Studio 6.0, if such a thing were announced at NAB, might not run on Early 2006 Mac Minis.
4.) I've heard conjecture that built in graphics are being used mostly due to lack of graphics chipsets/boards that would fit in the tiny form factor of a Mini - maybe they'll figure out how to later this year.
So I'd think a late-2006 Mini might run Final Cut Studio 6.0.1 or something later in the year....maybe. So I'm not shutting out hope for Final Cut on Intel Minis, the only fact we have is that version 5.1 won't be supported on current Intel Minis.
The rest is just speculation on my part based on...nothing. No facts, just guessing. Some things I know and say, some things I know and can't say, but I don't have any definitive knowledge of Apple's hardware plans.
Cinema Minima: Personal Digital Cinema. News for movie makers - Apple Motion 2.1 unexpectedly quits when opening more than one project
under 10.4.6, Motion 2.1 will unexpectedly quit if you try to open a second project.
Intel Macs only I THINK.
Tuesday, April 04, 2006
Some folks are wondering whether to update now or wait for an NAB update.
Here's my read on it:
1.) Apple is offering an update to FCP just a few weeks before NAB. Clearly, they want this out, done, and away before they announce whatever they are going to announce at NAB....otherwise they'd just announce or ship at NAB.
2.) Somebody asked in the comments about fear of updating now out of concern for paying for another update for v6 (or v5.5 or whatever's next). Without any prior knowledge, but just from a business sense perspective, v5.1 is the update for Intel Macs, some new camera (XL H1) support, and some overdue bug fixes. Version Next will offer new features above and beyond these things, and I have no doubt, after buying software for 15 years, that it'll be its own "full sized" upgrade price, probably no less than $200, but probably no more than $500. Historically, $299 is where I'd peg it.
3.) Because they are shipping v5.1 NOW, my guess is that they'll announce the next version, but it might not ship at the show. Remember how v5 took until May or June? Remember how Motion 1.0 took until October? v5.1 is the holdover until v6 ships - otherwise they would have waited three more weeks and shipped v5.5 or v6 with Intel support at NAB. Because they DID ship v5.1 now, that TELLS ME that the next version is less likely to ship at NAB or immediately thereafter. And that's OK - because v5.1 will work on iMacs and MacBook Pros for now.
Apple Ships Final Cut Studio 5.1
On laptops, the Intel MacBook Pro is up to 2.5x faster than the 1.67 GHz G4 based laptops according to Apple.
There are also some VERY SIGNIFICANT improvements to FCP above and beyond Intel compatibility, such as some vital Media Manager fixes, support for the Canon XL H1 (which FCP 5.0.4 does NOT recognize), and a bunch of other stuff I'll write about later tonight.
Salient quote from the page:
*DV and HDV rendering from the Timeline is up to twice as fast on a MacBook Pro with 2.16 GHz Intel Core Duo than a 15-inch PowerBook® G4 with 1.67 GHz PowerPC. MPEG-2 encoding from DV and HDV for DVD authoring is up to 2.5x as fast on a MacBook Pro with 2.16 GHz Intel Core Duo than a 15-inch PowerBook G4 with 1.67 GHz PowerPC.
They've discovered issues where AirPort does not automatically rejoin a preferred network after waking up from sleep and where the signal will randomly and abruptly drop to single-bar strength.
Some of the first MacBook Pros also packed LCD displays that have a tendency to flicker uncontrollably when the brightness is set to the lowest setting. Then, of course, there is the issue of heat.
According to DailyTech, some MacBook Pros get so hot they can barely be handled or placed on a bare lap. The notebooks are especially hot in the area above the F keys and underneath the notebook itself.
However, Apple told the publication that the MacBook Pro should never become this hot. A representative said 'that should not happen. If it is, bring back your MacBook Pro and we'll give you an updated version.'
...so if you're having trouble with yours, complain and get it swapped out.
You paid good money for it, it should work right.
This is another reason to never buy a 1.0 product right out of the gate (ahem...I seem to have trouble following that advice myself with tower Macs, but haven't been burned....yet). Same reason you don't buy a new car in Year One of its production run - let'em sort out the issues and fix'em, then get it once it is stable. NOTHING ever goes as smoothly as folks hope when doing something new. Deadlines are fixed, problems are not, and the first ones out the door of any product are RARELY as good as the ones six months later.
UPDATE - This link has more info, including which serial numbers are likely to have the problems, what Apple is doing about it, more about the specific problems, etc. Apple is up to a Rev E (5th revision) since they started making them. If you buy a used one, early models not so great.
The report claims that the new iBooks will have a 13.3 inch LCD display and that they'll start selling in June.
Hopefully 13" MacBook Pros will start shipping not too long afterwards - I want one to replace my 12" PowerBook G4 ASAP!
....although the article mentions 12" MacBook Pros. Whatevah - I just want a small Mac Intel based laptop of my very own.
I will hug it.
I will squeeze it.
I will, in fact, call it George.
Remote Edit Session
Shane Ross over at little frog in HiDef wrote about doing a remote edit session using iChatAV with a remote producer.
He aimed an iSight at the monitor in Digital Cinema Preview mode.
A Better Idea (if you have the hardware for it) -
If you have an AJA or BMD card with analog outputs, you can use those, regardless of whether you're doing an HD or SD edit. If you're doing HD, just use the downconvert function to send SD out the analog outputs.
Configure the unit to send composite or Y/C, not the usual component, through the analog outs.
I just checked - using the AJA control panel, you can click on the Analog Out tab and select Analog Format and use "Composite + Y/C."
On the Blackmagic stuff (I'm checking this on my Multibridge Extreme)....hmmm, I don't see an option for that in the PrefPane. I'll have to dig around to see if it can do it. So assume for the moment you can't with BMD, but DEFINITELY can with AJA -
So take your composite or s-video output from the card, and run that into a DV hardware codec. I have a Sony DVMC-DA2. What is a hardware codec? It's a box that converts a FireWire based DV stream to s-video, composite and RCA stereo pair, or vice versa (analog to FireWire DV signal).
Now connect the hardware codec (or use a camera, if it happens to have an analog to FireWire DV conversion capability) via FireWire cable to a second Mac.
In iChatAV, use FireWire Camera as your video source for a video chat. Run the audio from the edit if that's what your client wants to hear and be on the phone or speakerphone with them, or use a microphone and feed that into the hardware codec to do simultaneous voice chat with the client as well. A sneaky person might mix down the edit audio to one channel and use the other audio channel for chat purposes. The editor wears headphones to avoid feedback.
I thought of this sometime last year and I think I even blogged on it; I just haven't had to use it yet, but I should test it.
Somebody go try this, I don't have time today otherwise I would.
UPDATE - and, duh, if you're just doing a DV based edit, run the FireWire out directly into another Mac running iChatAV and define that FireWire source as a FireWire camera (camera, edit Mac - it doesn't know the difference! Just a DV stream coming in) and that's all there is to it.
Monday, April 03, 2006
Need an ULTRA fast laptop? Check this super fast Dell out that has 2 GB of RAM, Intel Core Duo, and will drive TWO additional monitors.
Read on for Charlie White's full review.
More on Netflix - they have a guy in charge of original acquisitions. If you don't get theatrical distribution, fret not - think about talking to these folks.
This is a nice little example of "It doesn't have to be perfect to be Teh Bombinest!"
Stop motion joy and the ever popular Pulp Fiction title theme song.
Oh, and to make it relevant to HD stuff....hmmm...yeah - he coulda used a digital still camera to get gorgeous hi res stills 1920x1080 or better. Yeah - see? This isn't off topic - it's still HD (or COULD be).
And just for geeky thrills, I noticed that this is my 2000th post to HD For Indies in just over two years. Happy, um, bimillennial postings to me.
Other site stats -
From Sept. 8, 2004 until now as I type this, I've had 1,490,775 pageviews (that's when I started tracking; the blog itself started March 2004), so I should cross the 1.5M mark this week no problem. So it looks like some of you people have almost as much spare time as me to read this stuff. For the last year or so, traffic is running around 1M pageviews/year.
And now, class - DISCUSS! (in the comments, link below).
PS - I meant discuss the stop motion stuff, but discuss whatevah ya wanna
As usual, Scott covers this stuff faster and better than I do.
Read on for the full scooby on Movielink, the official, legal way to download movies.
It has its issues (Movielink, not Scott's coverage).
And by the way, if you're interested in following the evolving digital distribution stuff going on in Hollywood and around the world, Scott's coverage is the best I've found on the net. Dude knows his stuff and gets on it promptly.
Ah - and here is a Yahoo news article about Movielink as well.
Some of my least favorite things about it - Windows only, DRM'd for only 3 computers, you can burn a DVD that plays on a computer but NOT on a TV, there's no feasible/rational way to play on a PSP, video iPod, etc., it includes no DVD extras, and it costs as much as or MORE than a DVD.
Folks carped about the iTunes Store being expensive, but at least you could burn a CD that plays anywhere, load CDs and stuff from elsewhere into it...etc.
Flooey on that. But it is at least a start...
Rescheduling launch of Blu Ray from May 23 to June 25th.
HD DVD pushed back recently too.
Blu Ray holdups are expected to be at the core of why Sony delayed the PS3 game platform until fall (November, I think it is to be - somebody bust me if I'm wrong on that).
Yeah yeah yeah - that's their headline. To me the interesting bit was this -
income in first FIVE days of theatrical release: $66 million ($75 million anticipated)
income from first SIX days of DVD sales - $100M
No matter how you slice it, DVD sales were higher than theatrical sales for about the same length of time.
Now, theaters are harder to get to than DVDs are. I'd be curious how many of those sales were from pre-orders that shipped on Day Zero, rather than straight-up retail sales.
For instance, how many Amazon.com backorders shipped out when it finally became available?
In any case, note that this is a very clearly demonstrable data point as to why DVD sales are important for a movie's success, more so than the theatrical run. While indies dream of getting theatrical release, DVDs are where the $$$'s at. Of course, theatrical is your marketing release, so that's all complicated, but, grumble grumble grumble...
There is a version of Final Cut Pro now shipping that supports Intel based Macs - iMacs and MacBook Pros, not Minis (more on that below). And it is FAST.
From a Creative Cow posting (read it for full details), Peter Wiggins tested a dual 2.0 GHz G5 with a Radeon X800XT against a MacBook Pro. Both machines had two 2.0 GHz processor cores (dual-processor G5 vs. dual-core Intel), both had 2 GB RAM, and both were running 10.4.5 and the Universal Binary of Final Cut Studio.
In Motion tests, the MacBook Pro was consistently FASTER than the dual G5 (18 vs. 23 fps in one test, 54 sec vs. 1 min 24 sec in another).
In some rough Final Cut Pro rendering tests, the laptop was faster as well by modest margins (5-15%....ish)
As I mentioned the other day, there are some significant new features, which you can read all about here.
P2 import (for the HVX200)
Using a new option in the Import Panasonic P2 window, you can now remove duplicate frames when transferring footage to your scratch disk. This option provides several benefits: the transferred QuickTime files require less disk space and they can be edited at their native frame rate.
So the on-disk data rates drop to the native frame rate's share of the recorded bandwidth:
- 720p24 ends up as 40 Mbit files on your drive, not 100 Mbit
- 720p30 drops from 100 to 50 Mbit
- 24pA for SD is 20 Mbit, not 25
- 1080p24 (24pA mode) is 80 Mbit, not 100
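The math behind those numbers is just the ratio of native frames to recorded frames applied to the nominal data rate. A quick back-of-envelope sketch (the nominal rates and frame counts here are the ones from the list above, nothing official):

```python
# Effective data rate after FCP 5.1's duplicate-frame removal on P2 import:
# the codec records at a fixed frame rate, so native footage only keeps
# native_fps / recorded_fps of the nominal bandwidth.
cases = [
    # (label, nominal Mbit/s, recorded fps, native fps)
    ("720p24  (DVCPRO HD)", 100, 60, 24),
    ("720p30  (DVCPRO HD)", 100, 60, 30),
    ("SD 24pA (DVCPRO)",     25, 30, 24),
    ("1080p24 (24pA mode)", 100, 30, 24),
]

for label, nominal, recorded, native in cases:
    effective = nominal * native / recorded
    print(f"{label}: {effective:.0f} Mbit/s (was {nominal})")
```

Running it reproduces the 40/50/20/80 Mbit figures above.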
The DVCPRO HD Frame Rate Converter plug-in is now available in the Extras folder of the Final Cut Studio 5.1 installation disc. This plug-in processes 720p60 DVCPRO HD footage shot with a variable frame rate.
Canon HDV XL-H1 Support
Final Cut Pro 5.1 recognizes the Canon HDV XL-H1 camcorder for most Final Cut Pro operations, including Log and Capture and Print to Video.
-It'll do 1080i50 and 1080i60, but NO MENTION is made of 24F mode. That may be a v6 feature, or a 5.1.x feature.
-It'll record the first 2 channels of audio over FireWire even though the camera can do 4; you have to use HD-SDI capture for 4 channels. It's probably possible to do a mixed capture, video over FireWire and audio over HD-SDI.
-fixes some problems with HDV during output and with long 720p clips (over 10 minutes); no longer crashes when fast forwarding like it did with 5.0.4
-tweaked/improved some Multi-clip behavior
-improved Paste Attributes behavior (uses start frame in subclip more accurately)
-using scale and motion blur on stills - artifact issue fixed
-XML reading is improved (maybe Final Touch HD will benefit, or perhaps this in response to requests?)
This is a biggie for me so I'll quote it all:
Final Cut Pro 5.1 resolves several issues related to the Delete Unused option in the Media Manager:
- Working with clips with speed changes: Choosing Create Offline and selecting Delete Unused no longer results in sequences with gaps between edits. Choosing Copy or Recompress and selecting Delete Unused now correctly deletes media from angle 1 of a multiclip.
- Processing clips with negative speed: Choosing Copy or Create Offline and selecting Delete Unused no longer results in media with shortened audio.
- Processing offline clips: Selecting the Delete Unused option no longer results in a new project in which offline clips are untrimmed, so you no longer need to recapture the original media file duration.
Similarly, Reconnecting Media has been improved:
In previous releases, Final Cut Pro detected when a clip’s media file was altered and the clip went offline. The only exception to this behavior was when you modified a clip’s media file by using the Open in Editor command. In this case, Final Cut Pro would automatically reconnect a clip’s media file when you returned to Final Cut Pro from the external editor application.

In Final Cut Pro 5.1, you can configure Final Cut Pro to always reconnect modified media files, even if you don’t use the Open in Editor command. This means that when you modify media files in other applications, the clip does not go offline when you return to Final Cut Pro.
The Bad News
-STILL no 24p mode support for the JVC - maybe in v6? LumiereHD can help you use 24p mode, but it's not as slick/obvious/direct as native FCP support. But it works!
-no 24F mode direct support for XL H1 mentioned, only 1080i50 and 1080i60, so do what? Hullo, LumiereHD?
-only 2 not 4 tracks of audio over FireWire with XL H1
-no After Effects plugin support on Intel based Macs - this is contentious; some developers are saying it could be easily done and might come soon. See the Creative Cow thread linked to above for more details; search for Pierre to find his posts.
-Final Cut Pro v5.1 projects CANNOT be opened in FCP v5.0.x
-Final Cut Studio won't work on an Intel Mini according to this Apple page - officially, it's not supported. It'll install but it doesn't meet the minimum specs (why isn't clear and I'll need to look into this more - anybody got a clue? Email me!) Might this change with a future upgrade? I don't know. This is all so oddly vague - why not just say it lacks the right video card (or whatever)? Maybe there will be an upgraded Mini that WILL work later.
....and you still have to send in your original install discs and wait to upgrade, which is kinda lame.
Thanks to numerous readers for sending in much of this info and links!