Most 35 mm films that have gone through a digital scanning process have been scanned at 2K, which carries only one quarter the pixel count of 4K.
Only extremely high-end VFX work is generally scanned at 4K. Viewers who see RED footage for the first time describe the quality as either 65 mm film or “grainless 35”.
No mystery, no hidden message: RED is simply the color that represents strength, power and passion, and it also happens to be a favorite color of a high percentage of the population. Research the color red and you learn that it “grabs attention”, gets people to “take action” and suggests “speed combined with confidence and perhaps even a dash of danger”, all apropos for the RED ideology.
At RED, we flip the usual question about our price and ask, “why is everyone else so expensive?” The RED philosophy is a “guerrilla” attack by a small, highly elite group of visionaries, designers and engineers, driven by a passion for the technology and a desire to change an industry by challenging all of its conventions.
The sales model is factory-direct, eliminating the middleman, and the marketing is advertising-free, relying on the greatest marketing tool of all: word of mouth.
By creating a camera that transcends conventional demographics, RED achieves production volumes orders of magnitude greater than its competitors', thereby further reducing its cost.
Canon's update to the wildly popular full-frame EOS 5D is here, and it's better than ever. The EOS 5D Mark II has a stunning 21.1-megapixel full-frame CMOS sensor with the DIGIC 4 Image Processor, a vast ISO range of 100-6400 (expandable to L: 50, H1: 12800 and H2: 25600), plus EOS technologies like Auto Lighting Optimizer and Peripheral Illumination Correction.
It supports Live View shooting, including HD video capture, and more. It can shoot up to 3.9 fps, has 9 AF points plus 6 AF assist points, a new 98%-coverage viewfinder, a 3.0-inch Clear View LCD (920,000 dots/VGA) and a rugged build. Full-frame shooters, rejoice!
Here's a quick sample video taken by Jeff Lewis using the Canon 5D Mark II's video setting at a USC game. This is played back in 1280x720 HD, but the camera can record full 1920x1080 HD.
I think the Red One is more professional for filmmakers, because it is designed for this. The Canon 5D Mk II is a jello-cam (if you pan the camera quickly you'll see what I mean). Also, even after post-production, high frame rates look like "video" instead of the "usual movie quality".
Yeah, they're not really comparable in terms of functionality and use in a professional environment. But that being said, the 5D can produce some stunning motion pictures, especially in low light.
Another plus for the Canon is its compact size and affordability. The Red is a monster and very impractical for handheld use.
Aside from video quality, the Red also has nice cinema features like 120 fps and a proper full-frame preview. I would have the RED any day, but then again, I would miss the 5D Mark II - it's awesome.
We had hoped to put up a good old-fashioned Friday afternoon smackdown between a commercial shot by photographer David McClain and Jerome Thelia of Merge using the 4K Red One camcorder and the now-legendary "Reverie" commercial shot by Vincent Laforet using the Canon 5D Mark II, but it looks like Canon has already taken down Vince's commercial due to heavy demand.
Of course, this would be an "apples to oranges" comparison, since the Red One shoots at roughly four times the resolution of high definition while the 5D peaks at 1080p HD, but we're still intrigued by both videos and by how quickly the photographers were able to adapt to shooting commercials in HD+. Videographers should be afraid... very afraid.
Even if they've never shot a single frame of digital video in their lives, most photographers have probably heard the name Red One.
Hailed as the "next big thing" in camcorders since it was only a whisper of a rumor three years ago, the Red One aimed to do for digital cinema cameras what the Nikon D1 did for digital SLRs in 1999: create an easy-to-use and affordable tool for capturing beautiful, high-resolution, digital imagery.
To say "high-resolution" in conjunction with the Red One is a wild understatement. The camcorder can capture what is called "4K" digital video, which is more than four times the resolution of high definition, all at a price ($17,500) roughly one-sixth that of comparable digital cinema cameras.
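To put "more than four times" in concrete terms, here is a quick back-of-the-envelope pixel count. The frame dimensions below are nominal assumptions (DCI 4K and 1080p HD), not the Red One's exact recording rasters, which vary with aspect ratio.

```python
# Rough pixel-count comparison behind the "more than four times HD" claim.
# Frame sizes are nominal assumptions (DCI 4K and 1080p HD), not RED's exact
# recording rasters.
hd_pixels = 1920 * 1080        # ~2.07 million pixels per 1080p frame
four_k_pixels = 4096 * 2160    # ~8.85 million pixels per DCI 4K frame

print(f"1080p HD: {hd_pixels:,} pixels")
print(f"4K      : {four_k_pixels:,} pixels")
print(f"ratio   : {four_k_pixels / hd_pixels:.1f}x")   # ~4.3x
```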
I'm convinced that the HD DSLR is a truly great camera.
Entering the world of the videographer means learning to pull focus if you want cinematic shots, and thinking about movement as well as framing. It's a big jump, but a fun one.
Using an HD DSLR is a very different experience from running about with a miniDV camera; there's more to think about, but it is also a very rewarding experience.
High-end cameras designed specifically for the digital cinematography market often use a single sensor (much like digital photo cameras), with dimensions similar to those of a 35mm film frame or even (as with the Vision 65) a 65mm film frame.
An image can be projected onto a single large sensor exactly the same way it can be projected onto a film frame, so cameras with this design can be made with PL, PV and similar mounts, in order to use the wide range of existing high-end cinematography lenses available. Their large sensors also let these cameras achieve the same shallow depth of field as 35 or 65mm motion picture film cameras, which is important because many cinematographers consider selective focus an essential visual tool.
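For a rough sense of why sensor size matters for selective focus, the sketch below runs the standard format-equivalence arithmetic: to match the framing and depth of field of a larger format, a smaller sensor needs both focal length and f-number scaled down by the crop factor. The sensor widths and the example lens are illustrative assumptions, with Super 35 standing in for the 35mm film frame mentioned above.

```python
# Format-equivalence sketch: what a smaller sensor needs in order to match the
# framing and depth of field of a Super 35 frame. Widths are approximate and
# used purely for illustration.
formats = {                      # format name -> sensor/gate width in mm
    "Super 35 film": 24.9,
    "2/3-inch video": 9.6,
}
reference_width = formats["Super 35 film"]
focal_mm, f_number = 50, 2.8     # an example lens choice on Super 35

for name, width in formats.items():
    crop = reference_width / width
    print(f"{name}: {focal_mm / crop:.1f}mm at f/{f_number / crop:.2f} "
          "for similar framing and depth of field")
```

The small-sensor camera needs a dramatically faster lens to match the look, which is why large single-sensor designs make shallow depth of field practical.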
George Lucas discusses his ongoing effort to shape the future of digital cinema.
George Lucas began his career as an editor and a cinematographer (he was one of several cameramen on the Rolling Stones concert film Gimme Shelter), but his frustration with the tools of his trade, coupled with his desire to tell stories that were galactic in scope, drove him to seek new ways to make films.
Today it can be said that few have impacted the craft of filmmaking more than he has. The ultimate noodler who enjoys seeing his films come together in editing, Lucas has transformed the medium into a postproduction fantasia.
Cinema was the art form that helped define the 20th century, and Lucas is passionate in his conviction that its expression in the 21st century will be digital.
The latest installment in the Star Wars saga, Episode II-Attack of the Clones, was the first major Hollywood feature to be captured digitally, on 24p high-definition video cameras.
But in his determination to push the medium of cinema with new technologies and techniques, Lucas has encountered both support and skepticism. Episode II's extremely high profile situated him at the center of a raging debate over the merits and drawbacks of digital technology, which is seen by some as a viable alternative to traditional film-based methods, and by others as a concept that still requires a great deal of refinement.
Lucas recently discussed his impressions of the evolving technology with Ron Magid, American Cinematographer's visual-effects editor.
American Cinematographer: If there indeed is going to be a digital revolution, what will it look like?
George Lucas: I've always [said], "This is like the film industry in 1902," so the advances are going to be huge, because what we did on Episode II, we did in essence by ourselves. We had to talk Sony into it, [but] they built the cameras and they tried really hard to make this work; we also had to talk Panavision into committing a lot of money to build those lenses.
Both companies really went out on a limb. This was a giant experiment for everybody, and nobody knew if it was going to work or if they were pouring money down a rat hole.
George Lucas: Now that the whole medium is opening up, there are lots of lens manufacturers out there building lenses and lots of other camera people building cameras, so you've got competition. And once you've got competition, you're going to get a lot of people making vast improvements on the system. They've already got a 10-million-pixel camera, and that's just happened in the last year, so [the potential has] gone from 2 million to 10 million - and that's much, much higher quality than film. Any other issues out there will eventually be addressed, because a lot of cameramen are going to use the technology [and] say, 'I want it to do this or that.'
There is so much misinformation being put out there by people who have interests other than the quality of film. They're determined to slow this down or stop it, but they can't. It just won't happen. It was the same with digital editing - for the seven years that we had EditDroid, [almost] nobody would use it, and even after we sold the company to Avid, another two or three years passed before they got anybody to use it. All [digital technology] does is give you more to work with.
It's a much more malleable medium than film, by far; you can make it do whatever you want it to do, and you can design the technology to do whatever you want to do. This whole field is really going to ramp up in the next 10 or 20 years.
Many filmmakers have a wait-and-see attitude, which is to be expected. Some early filmmakers resented sound, and Orson Welles insisted till his dying day that a good color movie had never been made.
Lucas: And those criticisms are valid. There is the very real issue that you are going from a photographic medium to a painterly medium, and for those who are really wedded to the photographic process, that's going to be a tough thing to get around. It's very much like going from frescoes to oils - one is very rigid, very disciplined, very definite about the way it works, and the other is much more open, offers you more options and enables you to manipulate the pictures more, and I think that bothers people. But audiences can't tell the difference. We knew that right from the beginning because we shot [parts of] Phantom Menace digitally, and nobody could tell which shots were digital and which weren't.
It seems as though a new style of filmmaking is evolving, particularly in terms of the stunt and effects sequences, which felt more believable because there was less cutting around to hide the tricks. Have digital tools allowed you to develop a different style?
Lucas: The reality of how you're shooting and the limitations of what you have to work with sometimes determine how you shoot a scene, and now that's less of a factor with stunts. I don't know if it's a different style, but it's a very different process now. I've refined the process of working more visually; I shoot for a period of time, about 60 days or so, and then I stop and work on the film for a while. Then I come back and shoot for another 10 days or so, and then I stop and go back and work on the film, rewrite and change things, and then I come back and shoot for another week. I do it in pieces rather than in one long shoot.
That way I can actually look at what I'm doing, cut it and study it. The previsualization process [allows me to] put scenes together without having to shoot them, see how they fit in the movie and then, if they work, I can cut them in and actually go out and shoot them. There's a lot of freedom and malleability that didn't exist before. It's easy to move things around in the frame, to change various visual aspects of the film, which just wasn't possible before. It's the same kind of thing that you find in still photography if you use Photoshop.
Lesley Vanderwalt, the Episode II makeup artist, has said that sometimes the hi-def images were so clear that smudges and brushstrokes were visible in the actors' makeup. What strategies did you employ for makeup, costuming, props and sets to work with hi-def as opposed to film?
Lucas: We used filters to soften the image and make it a little less sharp so we could get away with more, but you do have to be very careful [because] you can't get away with as much fudging as you used to. The sets, costumes and makeup have to be more finished. It's going to require refinement in all the crafts because the digital image is so much sharper.
It's easy to degrade the image. You can hide all the little seams and imperfections that inevitably show up on props and sets and costumes simply by putting a filter on the camera so that the image is a little smudged, or you can have everybody come up a notch so you can do a really sharp close-up on somebody's face without seeing the brush marks on the makeup.
Were there any other restrictions you had to address when working with hi-def?
Lucas: There are certain issues with action perpendicular to the lens, when you're panning with the [actors], which created effects that we weren't completely happy with, but part of that is because I shoot in pieces. I'll shoot a person running across the screen, and then we'll put the background in, and it's really only when you get all the elements pulled together that you finally get to see what you've created. So there are certain [shots where] I wouldn't pan with people running completely perpendicular to the camera because it strobed. We also had little problems with softness on certain lenses, because these are all Beta cameras and Beta lenses, but that's been fixed.
We've heard reports that the cameras were less efficient on location than they were in the studio, particularly during the shooting of multiple-camera exterior sequences. Is hi-def practical for extensive location work or for extremely mobile camera setups?
Lucas: We never had a problem with the cameras, ever. We shot in very difficult locations and in 135-degree heat; we were shooting in and around the water and in the rain, and we had no breakdowns or problems at all. We were running cables because this is Beta, so everything was backed up six ways from Sunday; on location, we were running cables farther or over difficult terrain. But we were just as fast and efficient on the locations in the middle of Tunisia [as] we were in the studio.
Did all of the cables limit the kinds of shots you wanted to achieve?
Lucas: Not really. You can go a long way with those cameras. Obviously, if we were on location and we wanted to go a quarter mile up the road, we just unhooked the umbilical cord because we didn't need it. The cameras have recorders built in, exactly like TV cameras. But because these were the first seven cameras that were built, everybody was very nervous - I think Panavision and Sony were more nervous than we were - so we were double-backing up our recorders with two recorders. We did shoot some material without backing it up. The second unit didn't want to bother, and it worked fine. We now own other cameras that don't have recorders in them, and we use them at ILM, where they want the cameras to be very small. They're about the size of a small book, and we do umbilical them because [the effects cinematographers] prefer to use them that way for motion-control or [to get into] very small [places].
Why did you opt to use hi-def instead of VistaVision for miniature effects photography? We understand that this choice created some problems.
Lucas: The only problem it created was that we had to reinvent the system. We had to get new cameras and build the system rather than just use the system we had. But I wanted Episode II to be consistently digital; I didn't want to have to use film. Film ultimately is very cumbersome. It's like working with the lights out ? you can't see the work until the next day. Being able to look at what you're doing while you're doing it, without having to run to the lab or [hurrying] because you want to break down the setup and all that, makes hi-def a much more efficient way of shooting visual effects.
But the visual-effects cinematographers on Episode II said it was a struggle to get the proper depth of field on miniatures because the cameras can only capture at 24 frames per second, so they couldn't do 1-fps exposures. They said the model sets were sometimes melting because of all the light required.
Lucas: To be honest, I never heard of a set melting. They didn't have to pump all that light in there. You can shoot at extremely low footcandles and make hi-def look excellent. And you can actually maintain the same depth of field with hi-def that you can with film. You still have a higher range than you do on film, so if you have to light it up for digital, you have to light it up twice as much for film. But regardless of whether we had to do several passes at ILM, we saved millions and millions of dollars shooting digitally.
Was any portion of Episode II shot on motion-picture film?
Lucas: Yes. We took a scene in the Jedi temple out of Phantom Menace, which was shot on film, and we erased all the characters and put in a new background and new actors, so it's an actual photographic set with digital characters and digital backgrounds outside the window. So in a way, we had a filmed set, but that's the only film in the movie.
Given that you now have the ability to endlessly tweak and repurpose shots like the one you just described, is there such a thing as having too much control over your images?
Lucas: I don't know, you should ask a painter. Having lots of options means you have to have a lot more discipline, but it's the same kind of discipline that a painter, a novelist or a composer would have. In a way, working in [digital] is much less frustrating than working in film, but it's not as though it's limitless no matter how you go. The artist will always push the art form until he bumps up against the technology - that's the nature of the artist. Because cinema is such a technological medium, there's a lot of technology to bump into, and I think as more people use digital they're going to find [it has] a lot more limitations. Some of those limitations will be [equivalent to] the limitations they had with film, and some of those limitations will just be because they've gone so far that they finally bumped into the technological ceiling.
Because there are no set standards for digital cinema, did the Episode II digital film file have to be tailored to each of the digital-projection systems currently in use? How do the differences among the servers affect the image quality?
Lucas: I haven't really checked out the various digital theaters yet, but I've seen the film digitally projected on several different systems and they're comparable. You'd have to run back and forth to actually see whether there's a difference, and if there is, it would be very highly technical. But whenever you have different projectors, you're going to have differences. You have that with film - you go to one theater to see a film and it's fine, and you go to another theater and the footcandles are way down, [the image is] fuzzy and the left corner of the frame is completely out of focus. That just has to do with that particular projector and that particular theater, and you're going to have that with digital projection, too.
I don't know whether anyone has actually done any studies [comparing the various digital servers].
The important thing is that they're all compatible in terms of us doing our transfers and sending the files. You have to make several different kinds of transfers for different media, anyway, whether they're sending by satellite, disc or cable. I don't think we made many adjustments to the various versions of Episode II that we did for the various systems.
Is it true that you recently assembled a forum to explore the state of the digital art with a number of directors?
Lucas: Yes. Because there are so few of us working in the theatrical digital medium, about a half-dozen, we decided we should all come together to talk about our experiences and share information. It was a two-day conference, and there was a lot of discussion, mostly from the point of view of the director. This was before Attack of the Clones was released. I showed that movie, Robert Rodriguez showed part of Spy Kids 2, Francis Coppola showed part of the film that he's shooting, Jim Cameron showed his 3-D underwater movie, and Michael Mann showed a little bit of Ali because parts of it had been shot [with digital cameras]. Pixar showed some [footage] digitally and on film [to demonstrate] what happens to [images] after about three weeks of being on film; you could really tell the difference between the film version and the digital version.
We invited a bunch of directors, including Ron Howard, Bob Zemeckis, Steven Spielberg, Marty Scorsese and Oliver Stone. There were a lot of skeptics, and Marty, Steven and Oliver asked very hard questions. But when you get the answers, you say, 'Oh, it's not the big boogeyman that everybody says it is.'
James Cameron recently told the Hollywood Reporter that you showed him tests that led him to believe 'the Sony HD 900 series cameras are generating an image that's about equivalent to a 65mm original negative.' How can you acquire more information on HD tape than on 35mm film, given that no one else shooting HD is able to capture that much information?
Lucas: We don't have a bias. For the time being, the test is really Attack of the Clones. You [watch it] digitally projected and say either, 'It looks like s**t' or 'It looks great.' If that isn't enough, then wait till Spy Kids 2 comes out. In the end, cinematography is not about technology; it's about art, it's about taste, it's about understanding your craft, it's about lighting and composition, and anyone who gets off on technological things is missing the point. I care about good lighting and good composition. I'm not interested in an engineer who knows a lot about the technology; you get into these kinds of arcane discussions about 'black curves' and things that no audience is ever going to see.
All of us [working in digital] are using different styles of photography, different kinds of conditions and different kinds of lighting. Coppola has just shot some unbelievably gorgeous material, wide shots of cities with incredible detail at magic hour and all kinds of available-light material, whereas Rodriguez has lit his to be very bombastic color, really exuberant and wild. So it doesn't have to do with the technology, it has to do with the eyes of the filmmakers working in the medium and what they want to do with it.
What happens after Episode III?
Lucas: I'm going to do other types of projects, things that I've wanted to do for a long time, definitely a very different kind of filmmaking than what I've been engaged in for the last few years. I'm just somebody who's trying to tell stories, and in order to tell the kinds of stories I've wanted to tell I've had to push the medium. But all the directors and cameramen I know push the medium. They're always trying new things, trying to get a different look or push something a little further by using a new trick or a new technology. That's the nature of the business. Everybody does it, but I get more attention for it.
To begin with, the Red One is not a film or HD camera; it is more like a large digital SLR, except that it captures raw image data (plus metadata) at 24-30 fps in 4K, 60 fps in 3K, and up to 120 fps in 2K resolution.
Filmmakers who consider their work art had better become technicians fast if they intend to use the Red One, as it does not offer the color palette for creating the nuanced, finessed brushstrokes you are accustomed to getting from film. You simply cannot be green if you want to go Red. The Red One currently uses a CMOS digital sensor that captures images at 4,520 by 2,540 pixels of resolution.
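As a rough sense of scale for those numbers, the sketch below multiplies out the photosite count quoted above and the pixel throughput implied by each capture mode. The 4K, 3K and 2K frame dimensions are assumed 16:9 rasters used only for illustration; the camera's actual recording dimensions depend on the aspect ratio chosen.

```python
# Rough arithmetic on the Red One's capture modes, using the sensor dimensions
# quoted above (4520 x 2540 photosites) and assumed 16:9 rasters for the
# recording modes. All figures are illustrative, not official specs.
sensor_px = 4520 * 2540
print(f"Full sensor: {sensor_px / 1e6:.1f} megapixels")   # ~11.5 MP

modes = {                         # mode -> (width, height, max fps), assumed
    "4K @ 30 fps":  (4096, 2304, 30),
    "3K @ 60 fps":  (3072, 1728, 60),
    "2K @ 120 fps": (2048, 1152, 120),
}
for name, (w, h, fps) in modes.items():
    throughput = w * h * fps
    print(f"{name}: {throughput / 1e6:.0f} megapixels per second")
```

The throughput comes out roughly constant across the modes, which suggests the higher frame rates are bought by reading out fewer pixels per frame rather than by a faster sensor.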
It is fitted with a film-style PL mount, which means its digital sensor can gather light through the finest optical-quality cinema lenses ever made. This gives the Red One the equivalent of 35mm film's narrow depth of field in terms of focus.
The most exciting aspect of the Red is that it lets you watch and approve dailies instantly, whereas film needs to be processed and scanned before it can be output and viewed.
Furthermore, the Red's proprietary hard drives allow takes of up to three hours, compared with roughly ten minutes for a traditional film magazine. Without a doubt, the Red One's tapeless acquisition workflow is unparalleled in easing post-production schedules and budgets, but at what cost visually?
On film shoots you are able to overexpose your negative, blow out the highlights and recover them later. Personally, I overexpose film negative by two-thirds of a stop in order to get a thicker negative, in other words, more image information.
There is an assumption that with the Red One the look is not "baked" into the raw files when you shoot, and that with color-correction software you have infinite information to manipulate. Not true. The Red One is not immune to digital exposure rules, and it has a hard floor for the blacks and a hard ceiling for the whites.
If you blow out the pixels on a Red digital sensor, they will "hard clip," leaving you with no image information to manipulate. The reason digital cameras have limited exposure latitude is that they capture information in a linear space, as opposed to film, which captures information in a logarithmic space.
The human eye perceives contrast, light and detail logarithmically, so it is no surprise that audiences still have a natural disposition toward a "filmic" rather than "digital" look this deep into our digital generation.
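The practical difference between the two response curves is easiest to see in code values. Here is a minimal sketch, assuming a 12-bit linear capture purely for illustration:

```python
# Why linear capture runs out of precision in the shadows: in an N-bit linear
# encoding, each stop down from clipping has half as many code values as the
# stop above it. A 12-bit depth is assumed here purely for illustration.
bits = 12
max_code = 2 ** bits                      # 4096 code values in total

print("stops below clipping -> linear code values available in that stop")
for stop in range(8):
    upper = max_code // (2 ** stop)
    lower = max_code // (2 ** (stop + 1))
    print(f"  stop {stop + 1}: {upper - lower:5d} code values")
# Stop 1 (the brightest) gets 2048 values; stop 8 gets only 16. A logarithmic
# encoding, like film's response, spreads the values far more evenly per stop.
```

Half of all code values describe the single brightest stop, and each stop below gets half as many again, which is why digital shadows fall apart long before digital highlights do.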
In defense of the Red One, I will say its wavelet compression and 12-bit color space do hold highlights in a fairly pleasing and organic manner for a digital camera, but nothing has ever compared to the controlled, pleasing way negative film's highlights roll off over the exposure curve.
So here's the rub: on digital cameras, many filmmakers skew toward slight underexposure in order to preserve highlights. Although underexposure in the digital world is a fantastic technique for both protecting whites and achieving great-looking velvety blacks, this is not the case with the Red One.
When you underexpose the Red, you essentially waste linear bits of data the camera could have captured. When you open up underexposed Red One footage in the raw conversion, you end up with digital noise, milky blacks and posterization. Posterization is the undesirable effect of stretching too little continuous tonal information across too much of a digital camera's linear range, resulting in bands that run across the image, particularly where two colors without sufficient tonal information meet.
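A small sketch of the arithmetic behind that waste, again assuming a 12-bit linear capture for illustration:

```python
# Why underexposing a linear capture invites posterization: pulling exposure
# down by k stops means the brightest scene tone lands k stops below clipping,
# so the whole image has to fit into a fraction of the available code values.
# A 12-bit linear capture is assumed purely for illustration.
bits = 12
total_codes = 2 ** bits

for under in range(0, 4):                      # 0 to 3 stops underexposed
    usable = total_codes // (2 ** under)       # codes left below the top tone
    print(f"{under} stops under: {usable:4d} of {total_codes} code values "
          f"({100 * usable / total_codes:.0f}%) left to describe the image")
# Stretching those fewer values back to a full-range image in grading is what
# shows up as banding where neighboring tones no longer have distinct codes.
```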
The Red has a sensor balanced to 5000K daylight. This is an important point to grasp because the sensor captures the cleanest images under daylight sources.
This does not mean you can't shoot the Red in tungsten mode, but if you do, you are more susceptible to digital noise because tungsten light gives the Red sensor's blue channel very little to work with. To feed the blue channel while in tungsten mode, you can introduce a blue backlight. However, having to resort to blue light in a warm, motivated tungsten scene to avoid potential digital noise is aesthetically ridiculous.
The Red sensor's bias to daylight is without a doubt my single biggest peeve about it.
Further exacerbating my gripe: while traditional daylight film stocks and daylight fixtures are rated at 5600K, RED chose a sensor balanced to 5000K.
I can't fathom why; however, my task is not to question and complain, but rather to find solutions.
In response to the Red sensor's bias, I add 1/4 Color Temperature Orange (CTO) gels to my daylight HMI fixtures and carry 81-series filters to help balance the color temperature as needed. As a side note for filmmakers who are fans of daylight-balanced fluorescent and LED lights: you are in fine shape, as those sources - like the Red - are balanced to 5000K exactly. The 81A filter converts 5600K to 5000K with a one-third-stop loss; the 81EF converts 7500K to 5000K with a two-thirds-stop loss.
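Conversion filters like these are easiest to reason about in mireds (one million divided by the Kelvin temperature), because a given filter shifts any source by a roughly constant number of mireds. The sketch below only computes the shift each of the quoted conversions implies; it does not assign official mired values to the Wratten 81A or 81EF filters themselves.

```python
# Mired arithmetic for the two conversions quoted above. A mired is
# 1,000,000 / Kelvin; warming filters add a positive mired shift.
def mired(kelvin):
    return 1_000_000 / kelvin

def shift_needed(source_k, target_k):
    return mired(target_k) - mired(source_k)

print(f"5600K -> 5000K needs a shift of {shift_needed(5600, 5000):+.0f} mireds")  # ~+21
print(f"7500K -> 5000K needs a shift of {shift_needed(7500, 5000):+.0f} mireds")  # ~+67
```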
These filters are very useful when shooting with the Red because I find the camera's sensor shifts blues toward the purple end of the spectrum as the color temperature climbs from 7000K to 11,000K.
When shooting with the Red, I am not afraid to stack filters to control exposure and enhance in-camera images as I desire. I use polarizers to saturate skies or control reflections when shooting at angles through glass or water. I use graduated neutral-density (ND) filters to control exposure, and straight ND filters to reduce the amount of light hitting the sensor, allowing a larger aperture and therefore shallower depth of field. But be careful: multi-stop ND filters exacerbate the Red camera's sensitivity to light in the IR spectrum, which, though not visible to the human eye, can cause color shifts and prevent the capture of true blacks. To help prevent this oddity when using NDs on the Red, I pair them with a Tru-Cut IR-750 filter, which corrects the potential color shifts.
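For reference, the exposure bookkeeping behind trading ND for a wider aperture looks like this; it assumes the usual convention that every 0.3 of ND density costs one stop, and the ND 0.9 and f/8 starting point are just example values.

```python
import math

# Bookkeeping for using a straight ND to open up the aperture for shallower
# depth of field. Assumes the common convention that each 0.3 of ND density
# cuts one stop of light; the ND 0.9 / f/8 figures are example values only.
def nd_stops(density):
    return density / 0.3

def wider_aperture(f_number, stops):
    # Opening up by one stop divides the f-number by the square root of 2.
    return f_number / math.sqrt(2) ** stops

density = 0.9
stops = nd_stops(density)
print(f"ND {density} cuts about {stops:.0f} stops of light")
print(f"A shot exposed at f/8 can open up to roughly f/{wider_aperture(8, stops):.1f}")
```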
As a cinematographer who likes to move quickly, I find that neither film nor the Red gives me an edge in speeding things up on set. Interestingly enough, my lighting and grip package for a Red shoot would be the same as for a film shoot using 320 ASA stock.
Film has loading-time issues; the Red has battery-changing issues. The Red can capture longer takes than film, but more footage invariably takes more time to shoot and review, so timewise the two formats cancel each other out. However, I do concede that with regard to post-production, there is no doubt the Red is the most exciting tapeless-acquisition camera around. It is going to revolutionize the speed at which we move through post.
In conclusion, I don't think the Red is a compromise - I think it is merely another professional tool we ought to embrace and enthusiastically add to our repertoire.
For a linear-capturing digital camera it is no slouch. It is capable of capturing beautiful images, and beauty ultimately and always rests in the eye of the beholder, our audience. The biggest difference between shooting film and shooting with the Red is still the way the two media respond to capturing light.
And no matter what the future holds for film, we should always remember that film responds to light the same way our eyes do.
In the new film Collateral, a lonely taxi driver, Max (Jamie Foxx), agrees to chauffeur the smooth-talking Vincent (Tom Cruise) around Los Angeles for an entire evening. He soon discovers that Vincent is running an unusual errand indeed: he is a mercenary who is methodically eliminating five witnesses scheduled to testify against a drug cartel in federal court. Unable to escape Vincent’s grasp, Max quickly becomes the chief suspect in the murders, and as federal and local law-enforcement officials close in on the duo, Max realizes his only way out is to prevent Vincent’s final murder.
Collateral director Michael Mann had experimented with high-definition (HD) video for a few scenes in Ali (see AC Nov. ’01), and he went on to produce the television drama Robbery Homicide Division, which was shot entirely with Sony/Panavision 24p CineAlta HDW-F900 cameras (AC Feb. ’03).
Intrigued by the format’s potential for feature filmmaking, Mann decided to use it on the extensive night-exterior work in Collateral to make the most of available light in and around Los Angeles. “Using HD was something Michael had already settled on by the time I came aboard,” recalls director of photography Paul Cameron (Man on Fire, Gone in 60 Seconds), who prepped Collateral and shot the first three weeks of principal photography. “He wanted to use the format to create a kind of glowing urban environment; the goal was to make the L.A. night as much of a character in the story as Vincent and Max were.”