Are Digital Movies Really Better than Analog? 68

Beatlebum asks: "I have watched two digital presentations of AOTC, the first at AMC1000 San Francisco and the second at the Metreon S.F. I did notice a few digital artifacts; however, what bothered me most was the lack of clarity of the colors. Many scenes seemed to be very slightly foggy. I expected the colors to be clear, crisp and rich. The Matrix Reloaded trailer looked significantly better in this regard. Am I crazy or did anyone else notice the same thing? I'm especially interested in hearing from those of you that have seen both analog and digital versions."
This discussion has been archived. No new comments can be posted.
  • The picture loses all of its warmth.

    You really need to use IMAX film and the screen itself needs to be of very high quality.
    • I was under the impression that IMAX was just 70mm run sideways. At least, initially it was. Do they use special film stock as well now?
      • by OneFix ( 18661 ) on Monday May 20, 2002 @05:33AM (#3549022)
        I was under the impression that IMAX was just 70mm run sideways.

        No, as this link [1570films.com] shows, it is actually 3 times bigger than 70mm film... however, the film runs at such a high speed that a significant part of each reel is "wasted" on camera spin-up, and to keep the cameras small enough to lug around, they only get 15 minutes of film per reel.

        This, along with a few other reasons, is why most IMAX films are under an hour long and feature little or no scripted characters (documentary films). Why documentaries? Because you can't really get a dolphin to perform better for a different take...

        For insight into IMAX film making, try one of the many IMAX DVDs with commentary... one of the best at describing the unique IMAX film making process is Super Speedway: Mach II Edition [digitallyobsessed.com]...

        The IMAX film format is actually the best quality you can find for DVD transfers.

        Like it or not, anything that Lucas used on EP2 will not compare to the film quality of IMAX!!!
        • That would be why they call it IMAX 70mm then?

          It is basically just 70mm film run sideways at a much higher speed than normal 70mm (or 35mm) film.

          However, while the quality of IMAX is very difficult to match or surpass, I have also enjoyed, to a similar degree, some of the 360° theatres (shot with 6 [IIRC] 35-70mm cameras...)
        • ummm, no, it says that it *is* 70mm; there just happens to be a wide variety of 70mm standards. There's:

          academy, which is like 35mm, just 4 times as big,

          widescreen (the 5-perf version they mention, similar to Techniscope, which is 3-perf 35mm. Created as a way to save money on film; it turned out to be strangely unpopular, considering that the normal way to get widescreen [not counting Panavision] is to just put a metal slide in the projector to cut the top and bottom off*)

          VistaVision, 70mm run sideways, the basis for all IMAX gear, and the original ILM optical printers

          there's a Panavision squish lens around for the academy format too (Hollywood does love overkill)

          VistaVision is the one that has all the history, as all the equipment suddenly jumped astronomically in price in 1977-78 (you may draw your own conclusions).

          The first VV movie I saw was "White Christmas" (Bing Crosby et al.). I just remember feeling really weird as I watched the movie and realizing about 10 minutes in that it was because the scratches on the film were running sideways.

          If you want to be really picky about it, 70mm is really only a print stock, as the film is generally shot on 65mm and then printed onto 70mm, leaving 5mm for the soundtrack.

          Regarding the length of IMAX films being limited to an hour or less: there was a documentary about it years ago where one of the producers claimed that they were short because the experience was so intense that it seemed longer. Personally, I think this is just an excuse.

          ------------
          *this is the reason why a lot of video versions of movies have so many instances of the microphone intruding into the frame. It was supposed to be behind the metal slide, but the transfer was done full frame rather than pan-and-scan.

          • sorry, my mistake, I was up too late, and the coffee machine is gone.

            VistaVision (of White Christmas and ILM fame) is 8-perf 35mm running sideways, not 70mm.

            IMAX is the same principle, but with their own cameras.

          • I realize this, but the original poster seemed to suggest that IMAX offered no advantage over standard 70mm, and that is simply not true...

            And as to your final comment... yes, a lot of directors started shooting in "open matte" after the popularity of home video. One of the more popular directors who films open matte is James Cameron. And it is true that there are often "goofs" in the full frame version, and this is why the matted (theatrical) versions of these films are still preferred. Actually, the whole Widescreen vs. Pan & Scan thing would be a lot simpler if open matte didn't exist.

            And as is obvious from the comparisons, IMAX films can fill the whole screen on a standard TV.

            So where are we...well, the point of this was that the "IMAX is just 70mm sideways" comment seems to be missing a huge part of the picture (no pun intended) :)
  • Well, you really can't compare the two if you haven't seen both. Truth be told, I've only seen it digitally -- 4 times projected, 3 times on hi-def monitors -- but I'm pretty happy with the color.

    I could see it analog, but probably won't as my favorite local theatre is showing it digitally too. (why see it an 8th time? My wife hasn't caught it yet; and I haven't seen it in a crowded theatre.)

    You're not going to get me to say digital is better, because like I said, I haven't seen both. But I do think it does the picture justice.
    • I haven't seen it in a crowded theatre.

      You'd better hurry... I saw it opening night, and again on Saturday night, and the crowd was considerably less enthusiastic the second time. This wasn't a cow-town tiny theater, either; it was the Fox in Westwood.

      • I saw it at 12:01 on opening morning. Lots of cheering and all. At 10:00 that morning the theater was half full, and there was very little audience participation.

        Something odd at the first showing, though: the theater filled with nervous laughter during Anakin's dream scene. Anyone else experience that? I think it was the giant nipple waving around on the screen.

        Oh, and will you do a "special" 1.0 release of QT Mozilla? Can you start linking against a newer version of QT?

        -Peter
  • Even on the analog version I noticed some of those black ovals, the ones that appear in film movies, in some important scenes. So what you noticed in the artifact department may be Lucasfilm's fault, not the digital equipment's or the theatre's.
    • If you mean the black ovals that appear in the top right corner of the screen, these are used to tell the person showing the film when to switch to the next reel. You will notice they flick up twice; the first time to indicate to get ready, and the second time to indicate to flick the switch. The transition always occurs at a scene change so that the audience doesn't notice any jumps. I have never worked in a cinema, so I have no idea whether this is still the way films on reels work today.

      • I learned that little piece of trivia watching Fight Club.
      • I believe these are still fairly common, even though larger theaters have moved to platter systems rather than 2 projector systems that require reel changes. With a platter, I believe the film is looped back to the center of the spool (like an old 8-track tape) so the reels don't require rewinding between showings.

        IIRC the first dot is "start the motor of the second projector" and the second is "open the shutter". (they're actually circles on the neg, but due to lens distortion to push 35mm to wide-screen, they often become ovals...) In general you won't see these on digital projections. In fact, I recall seeing them at times in films on TV years ago, but I think they've mostly disappeared there... gone back to original negs or cleaned up existing prints for the telecine transfer. Don't remember if I've seen them pop up in DVDs or not. I would think the only case that would happen would be if the only source prints of the film already had the dots and they didn't want to pay to clean them up.

        *shrug*
        • I'm a projectionist. In fact, I own and operate a theatre.

          I believe these are still fairly common, even though larger theaters have moved to platter systems rather than 2 projector systems that require reel changes. With a platter, I believe the film is looped back to the center of the spool (like an old 8-track tape) so the reels don't require rewinding between showings.

          This is all correct. However, the changeover dots are still present on film reels because not everyone has a platter. Old theatres use two-projector systems with 2000-foot reels (the ones that the films are actually shipped on), and some "newer" ones use two-projector systems with 6000-foot reels, which allow for fewer changeovers but still have to be rewound.

          Platters allow for no rewinding between shows and no reel changes.
    • Was this a joke? The "artifacts" are to signal reel change.
  • Also worth noting... (Score:3, Informative)

    by cei ( 107343 ) on Monday May 20, 2002 @03:40AM (#3548732) Homepage Journal
    If you saw the feature digitally, the trailers were digital also. So if you liked the color in the Matrix trailer, then you liked the color of a digital projection. The trailers were loaded in a playlist before the feature (along with the THX logo, the DLP logo, and perhaps one other depending on which playback device was in use by that particular theatre.)
    • The trailers were displayed digitally, perhaps. The real question: was the entire production digital? I think not.

      George Lucas wanted the original master to be done digitally to improve the lifespan of the film. Thus the film will never need to be digitally remastered, and will never fade.

      Perhaps other films have more brilliant, full colours because they are expected to fade? But this is only an uneducated guess.
    • Hey, idiot, I know that. The point is The Matrix was shot on film.
  • ...and render unto 35mm that which is analogue.

    What you noticed is actually the *opposite* of what is generally the case regarding *AotC*. Most likely, if the digital projections looked worse in some way--more blurry, less saturated, etc.--than the film projections, then this is entirely the fault of the digital projector or some other element in the theater being set up wrong. It's probable that the digital projector's settings were not all adjusted optimally, since the tech is so new. Hell, my local multiplex often sets their standard film projectors sub-optimally, and that tech is ancient...

    The fact is, assuming the digital projector is set up correctly, *AotC* will look better on a digital screen because it is an entirely digital movie. The masters are digital, and when you see a digitally projected version, it should be as pristine as the masters (or nearly so, if resolution has to be adjusted). If you see a standard 35mm print of the film, you're seeing a digital->analogue conversion which will not be as crisp and vibrant.

    This is not true of most films, though, because most are primarily shot on film and not digital--even films with a lot of digital effects everywhere are generally primarily film. This has the advantage that a 35mm print has far superior resolution to even the special custom digital camera which Sony made for George Lucas to shoot his digital movies with. 35mm film also has much greater sensitivity to a broader spectrum of colors than current digital cameras allow--50 years of development on the color film stock front has produced some amazing things. So, while there is generational loss in the analogue->analogue transfer from master to new print, it hardly matters, since the resolution is so vastly greater than the resolution of digital, and since the color spectrum is wider than current digital video camera sensitivities. This is why people like film critic Roger Ebert, and even me, can't stand digital projection for 35mm movies--even with my 20/100 vision, I can see the inferior resolution and color saturation of a film that was intended for 35mm when it's projected digitally on a very big screen.

    So, naturally, it would seem that all-digital films like the new *SW* movies and digital animation like *Shrek* are best viewed on digital projectors, while movies which are primarily 35mm are best viewed on a traditional 35mm projector. And the fact is, until digital technology makes resolutions and color spectra approaching those of 35mm film possible on both the shooting and the projecting ends, I don't believe digital should be adopted as the standard.

    Don't get me wrong--the time will come to go digital. But until its resolutions and color sensitivities can truly rival 35mm, that time is not now.

    As an aside, here's film critic Roger Ebert's take--he's an outspoken critic of current digital projection for films shot on 35mm, but he shows an even-handedness when it comes to allowing that digital films naturally look better on digital projectors: http://www.suntimes.com/output/eb-feature/cst-ftr-star15.html http://www.suntimes.com/ebert/ebert_reviews/2002/05/051001.html

    The fact is, except for all-digital special effects films like the *SW* movies, the current push for digital is coming out of economic penny-pinching rather than better quality. There was a time when Hollywood was interested in greater quality and experimented with impressive 70mm filmstocks and 48fps speeds. 35mm and 24fps stuck around because they're cheaper, albeit less visually stunning. And now digital, which has 1/5 to 1/7 the resolution and less color sensitivity at the moment, is chomping at the bit. For all-digital special effects flicks like *SW* and *Shrek*, naturally it makes sense, since rendering isn't yet done at very high resolutions (compared to 35mm). For other films, it doesn't, especially when projected on a really impressive screen where the resolution, saturation, and intensity will be exposed for their inferiority.
    • > If you see a standard 35mm print of the film,
      > you're seeing a digital->analogue conversion
      > which willn not be as crisp and vibrant.

      It is my understanding that for some time, feature films have been edited digitally. Not that they digitized it, came up with an EDL, then used the EDL to cut the film, but rather that they digitize it, edit it, then print back to film.

      I don't know what exactly was done for Star Wars. But a lot of digital work isn't done at the highest quality possible. I think that film is a much more flexible medium for the foreseeable future, but most people don't work in a way that takes advantage of it. Star Wars would have been one film that should have, though.
      • > It is my understanding that for some time, feature films have been edited digitally. Not that they digitized
        > it, came up with an EDL, then used the EDL to cut the film, but rather that they digitize it, edit it, then
        > print back to film.

        No, it is not the standard practice to digitize the film, edit the digital copy, and then print it back to 35mm. This has been done, but was not the norm last time I looked, being reserved for almost-all-digital-effects films. Most films will thankfully never fit this bill, focusing on story content rather than special effects--I think this has been Lucas's downfall with his prequels; he focuses on the digital effects to the detriment of the dialogue, acting, and all the rest that goes into a great film.

        Aside from which, film can be digitized at a higher resolution than current digital cameras can capture, since specialty equipment has been developed for years to do just that.

        > I don't know what exactly was done for Star Wars.

        *AotC* was filmed with a special digital camera made by Sony especially for George Lucas--state of the digital art. Then the rendered scenes and characters and the digital film of the live actors were all edited together and then special effects where the live actors interact with the digital characters and scenery were added to that. The resulting "masters" are completely digital, and have been at every step of the process; IIRC the "masters" of *AotC* and *PM* are digitally preserved across a bunch of magneto-optical, optical, and magnetic media, redundantly, for posterity.

        > But a lot of digital work isn't done at the highest quality possible.

        Absolutely true. To cut costs, rendering is typically done at only a quarter or a half of film's full resolution. The result is usually not noticeably inferior, since that's still a decent resolution, and if done well it will look good even when printed back to 35mm. However, there are many exceptions--like Gladiator. Its digital elements, like the Coliseum background and its digital audience, looked very poor to people with a discerning eye when projected at theaters--muddy, indistinct, undersaturated, and not matching up seamlessly with the action in the foreground. It was all the more noticeable in the action sequences, which were shot deliberately at a lower FPS (I think offhand 18fps) in order to produce that famous "tearing" effect during some fight sequences. And a lot of films with lower-quality digital effects just look *awful* when projected onto a really huge theater screen, even if the effects were done well and would have looked pristine on a smaller theater screen like you find at your average multiplex.

        At any rate, yeah, digital effects are typically done at far less than full 35mm resolution. It makes me long for the days of special effects artists who'd spend weeks on a single detailed model... Today, the same effect is usually done digitally, with the loss of a "real" object and its associated weight and presence. Digital characters and effects often look less realistic precisely because they lack weight and mass, or don't seem to be affected by gravity, or other things which would just "be there" if they were done with highly detailed real-world models, with special effects artists at the helm instead of CGI artists.

        > I think that film is a much more flexible medium for the foreseeable future, but most people don't work in a way that takes advantage of it.

        You're right in general terms. A whole lot of movies are using lower-res digital effects, or not using their 35mm capabilities to the utmost. But then again, I think we have to look to those few films that really do transcend anything digital has to offer today. Look at a truly great, well-shot film like *American Beauty*. Its cinematographer has had 50 years of experience behind a 35mm camera, and it shows--no scene is shot poorly, the lighting is so perfect as to generally not be noticed, and the effects that different film stocks and camera settings provide were used to great advantage. Digital (except of course in the scenes where a consumer-level DV cam was used to show the kids' use of that camera) would never have produced that richness and texture--things would have looked, well, more ordinary. In addition, while digital special effects had to be used for some scenes for obvious reasons--the rose petals flying from Mena Suvari's chest, e.g.--most things were done with good old-fashioned "real" effects. The scene in the steamy bathroom between Spacey and Suvari, for example, used real steam and real lighting effects. Theoretically, digital effects could have been used to produce the steamy fog and lighting--but it wouldn't have looked so real and so striking as when done with the real thing.

        Let's also not forget that film just has a "look" of its own, distinct from either digital processes or "real life." That look of film is what we're more or less used to seeing on the big screen, and even on the small screen when it comes to movies. It's stylized in a soft, "natural" way, due in great measure to the lighting required when working with film. Digital film seems to me more harsh and less forgiving, showing off every blemish and making things look like harsh reality rather than the escapism of a film. This can be used to great effect, as in the boot-camp war movie *Tigerland*, which was shot on digital and looks the better for it, because the harshness and "reality look" of digital matches the harshness and reality of the situation. But in most cases, that "harsh" look would be far less welcome. Even George Lucas has gone to great lengths to make his digital movies "look" like they were shot on film. Ironic.

        > Star Wars would have been one film that should have, though.

        I do miss the old *Star Wars* movies. First off, because as much attention was paid to dialogue and character development and acting as was paid to the special effects. I think this is Lucas's biggest problem with his latest films. But I also miss that the special effects were "real," done with very elaborately detailed life-size models of aliens and real people inside costumes--because they have a weight and real presence, and little unintentional movements which make them look natural and real. You don't get unintentional movements with digital effects, and consequently they often look unrealistic, and they always lack realistic responses to gravity, since weight is very complex and doesn't usually figure into digital effects yet except as a "guesstimate" by some animators. Before digital effects can look realistic, gravity and mass will have to be accurately modelled for digital elements instead of guessed at by an animator.

        just my opinions though. :-)
    • excellent post!


      I'd like to argue one point: assuming the standard digital dimensions of 1080 high by 1920 wide, I have seen studies which point out that 35mm is notably better at first generation--the negative. But at the distribution level, which is at least fourth generation, 1080x1920 with 10-bit luminance far exceeds 35mm in all aspects: spatial resolution, evenness of illumination, evenness of color balance, color saturation, contrast, noise.


      provided, of course, the production is done properly!

    • Great post, but your theories on resolution needing to be higher are really flawed...

      I do a lot of "hobby" work in the home theater field, and I have to say, a crappy old CRT-based video projector will run circles around almost any LCD, DLP, or other digital projector. Why?

      Not resolution at all, but CONTRAST. The resolution factor is a must, but to say that 35mm film is "higher resolution" than the DLPs they use in high-end digital cinema is flawed as well. Maybe the original masters are pristine, but the film we're watching is flawed, because it could be 5th or 6th generation, and it's going to be old (scratches, warps, missing spliced frames, etc.). Also, the film has a definite grain that can be seen even on 70mm prints. With digital, it's the same movie, every time.

      The problem with digital is that the color gamut is not as great as film's (although it soon will be beyond what film can produce at the generations of film we actually watch). The other problem is that the contrast generally sucks. The reason old low-resolution CRTs (worth $500 used) that are properly calibrated can look better than LCDs that cost $50,000 is that the color gamut is better, the contrast is WORLDS better, and there is no screen-door effect.

      In the next few years, I believe digital theatres will overwhelm analog, mostly because of the cost controls the studios can impose (controlling how often the film is played), distribution costs being lower, and eventually, the medium will be better quality.

      Already we're into our third generation of digital theatre projectors, and the ones that are up and coming literally blow film away in many ways, with contrast, and yes, even resolution getting better at each step.

    • Hell, my local multiplex often sets their standard film projctors sub-optimally, and that tech is ancient...

      The problem in many larger theatres is that they don't change the xenon lamp (the "bulb" that puts the light onto the screen) often enough. After they reach a certain age, those lamps tend to get blackened and start putting out less light. The result is a picture that is not as bright as it should be. Some also try to "save the lamp" by turning the power input down, making the light dimmer still. The lamps also have to be properly focused after being changed, and, guess what--many of the people who change them don't know how to focus them afterward. Result: dim light on the screen.

      Couple that with folks who don't know how to focus a picture or who just don't give a damn, screens that have had too much popcorn and candy thrown at them, and sometimes the picture at your local multiplex will be less than what it should be, indeed.
  • How far we've come (Score:3, Interesting)

    by ptomblin ( 1378 ) <ptomblin@xcski.com> on Monday May 20, 2002 @09:23AM (#3549828) Homepage Journal
    3 years ago I was working on a project that took film, scanned it at 4kx3k resolution with laser scanners, did digital post production at that resolution, and then printed it back on film at 4kx3k resolution. The powers that be at this company canned the project, even though it's still the best in its class, because it was expensive and slow compared to the lower resolution competition. People who care about quality, like the people doing post production on all three Lord of the Rings movies, are still using our software even though we pretty much abandoned them.

    3 years later, I'm back at the same company, and now we're working on a way of delivering digital movies to theatres and presenting those movies on the screens. Guess what the resolution for the first generation is? 1280x1024. A resolution I consider barely adequate on a 17-inch monitor, and wouldn't even put on a 21-inch monitor, and they're going to blow it up to a huge theatre screen. Yuck.

    • Ugh. That totally sucks. (Thinking... what would be the actual DPI on a movie screen?)
      • Thinking... what would be the actual DPI on a movie screen?

        For a 50-foot screen, with a 1280-pixel wide print, each pixel would be about half an inch across. OK, you'd probably not see it if you were a good way back, but in the front row you would see individual pixels. They'd be about the size of your fingernails.
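The arithmetic behind that estimate is easy to verify; here's a minimal sketch in Python, using the 50-foot width and 1280-pixel count quoted above (real screens and projector setups will of course vary):

```python
# Width of one projected pixel, given the screen width in feet and the
# projector's horizontal pixel count.
def pixel_width_inches(screen_width_feet: float, horizontal_pixels: int) -> float:
    return screen_width_feet * 12 / horizontal_pixels

# 50-foot screen, 1280 pixels across, as quoted above:
print(f"{pixel_width_inches(50, 1280):.2f} inches per pixel")  # 0.47 inches
```

So the "about half an inch" figure checks out.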
  • DLP will NOT look better than normal film stock. The reason is simple: DLP uses thousands of tiny, independent mirrors to project the image. On a screen the size of a public movie theater, you're going to see all sorts of aliasing, because you'll be able to discern each individual mirror that makes up the image.

    This is normal for DLP; it always has been. Go back and research home theater discussion groups for the past two years or so, and read all the complaints about DLP.

    I saw AotC at the AMC1000 also. It was fuzzy. It was blurry. IT WAS ALIASED. And it was all because of DLP, not because something wasn't "set up properly". That's the way DLP looks. It's aliased. It's pixellated. It does NOT look better than good quality film stock.
    • You can't compare high-end DLP with home theater DLP as they do in the discussion groups (I follow all of these regularly).

      The home versions use a single DLP with a spinning color wheel to "recreate" the colors you need. The professional versions use 3 DLPs with a prism to get more exacting colors.

      The home theater versions use weaker bulbs, much lower quality lenses, and power supplies that all contribute to lower contrast ratios. The digital theater projectors use MLAs and other techniques to get as much information as possible onto the screen.

      Last month, I saw a DVD movie (basically 720x480 resolution) upconverted to 1080p (1920x1080 resolution) and then projected on a Sony G90 CRT projector, in a guy's basement, onto a screen measuring 140" horizontally.

      I was literally blown away, and I know good quality when I see it in theatres. This low resolution DVD absolutely gave me the most breathtaking, goosebumping movie experience I had ever seen, and it was in this guy's basement, accomplished with less than $40,000 in hardware.

      If Star Wars uses 1280x1024 resolution (which is nearly 4 times the resolution of the DVD source I watched), it's not resolution that will bother me, not even on an 80-foot-wide screen...
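For what it's worth, that resolution comparison can be checked in a couple of lines of Python; the pixel counts below are the nominal figures from the formats mentioned above, ignoring aspect-ratio masking and overscan:

```python
# Nominal pixel counts for the formats discussed above.
dvd = 720 * 480    # DVD source resolution, before upconversion
dlp = 1280 * 1024  # digital-cinema DLP resolution

print(dvd, dlp)             # 345600 1310720
print(f"{dlp / dvd:.2f}x")  # 3.79x -- just under 4 times the DVD source
```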

  • An important point that everyone seems to have missed is that AotC was shot on video, not on film. This is really a major step forward for the movie-making industry. Shooting on video instead of film should decrease post-production time and expense while preserving quality. I got a demo of a hi-def editor using hi-def monitors, and the fact that this wasn't film probably wouldn't have been noticeable had I not been aware that it was video. When shooting hi-def you get almost the same color space and definitely the same (if not better) quality, without having to pull down to video to edit and then back up to film for the final cut. Long live video. The real reason people think digital is inferior is the digital projection, not the quality of the output "film".
  • Roger Ebert mentioned a certain fuzziness [suntimes.com] in his review.
  • I was unable to see the movie on a digital projector but settled for seeing it on analog. I was watching the movie paying particular attention to the sharpness of the images. I seemed to notice that on the analog version, the flesh-and-blood actors seemed much more out of focus than the CGI images. This may have been an improperly adjusted projector, but it did seem that the actors--especially the whiny Anakin--were out of focus. Maybe Anakin was out of focus on purpose?

    Is it just me or does anyone else want to see the 3rd episode just to see all those arrogant, light-saber wielding fanatics wiped out. Unfortunately Anakin gets to live, that is the greatest injustice of the series. The least likable character - yeah he surpassed JarJar in my mind - is the focal point of this series.
  • Is digital better? (Score:5, Insightful)

    by mosch ( 204 ) on Monday May 20, 2002 @10:13AM (#3550240) Homepage
    Digital projection in its current incarnation, DLP, is terrible. Theoretically the color depth should be fine--the spec is for 45 bits of color--though AotC demonstrated well that apparently they don't use all those bits, because at least at the theatre I was at, the blacks weren't very black.

    The real problem is the incredibly shitty resolution, 1280x1024. 35mm film is roughly equivalent to 20 million pixels, a wee bit more than digital. Watch a slow pan of a detailed scene carefully (the waterfall scene would work), and you'll actually see everything moving pixel by pixel.

    Oddly enough, the digital projector should be able to get an equivalent or better contrast ratio than film. 35mm film is generally specced to get about 1000:1, but the Barco DLP Projectors can get up to 1250:1.

    The storal of the mory is that, contrary to popular opinion, adding the word 'digital' to a technology does not make it better.
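Putting the parent's numbers side by side in a quick Python sketch (the 20-million-pixel figure for 35mm is the comment's own rough estimate, not a measured value):

```python
# Comparing the DLP pixel count to the comment's estimate for 35mm film.
dlp_pixels = 1280 * 1024   # DLP cinema resolution quoted above
film_pixels = 20_000_000   # rough estimate for 35mm, per the comment

print(dlp_pixels)                          # 1310720
print(f"{film_pixels / dlp_pixels:.1f}x")  # 15.3x
```

On those figures, "a wee bit more than digital" is an understatement of roughly 15x.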

    • Is this true? Is the resolution used for projecting AOTC in DLP really 1280x1024? That's horrible! Stretching 640x480 resolution to display on a 21" monitor is bad enough. Stretching a conventional monitor's resolution to display on a theatre screen is just plain ludicrous.

      When I went to see AOTC last night, I noticed that the pixellation was rather bad during the closing credits, but wasn't distracted by it during the main movie. I guess I was just too interested in the film itself to care.
    • The only 35mm film that can get 20 million pixels' worth of resolution is a single snap frame of a perfectly lit scene, shot on a tripod with a high-end lens that weighs a ton.

      Add motion, irregular lighting, and the vibrations of the camera onto the film media as it passes, and you really end up with somewhere between 3 million and 5 million "pixels."

      Go through a few generations of editing, color correction (very important), and other steps, and you may be closer to 1 to 3 million.

      You will NEVER see a film with 20 million pixels, or even half that. It just won't happen.
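The degradation chain described in this comment can be put in rough numbers; all the per-stage figures below are the commenter's estimates (mid-points of the quoted ranges), not measurements:

```python
# Effective resolution of 35mm through the production chain, using
# the rough figures from the comment above (all estimates).

stages = [
    ("ideal static frame (tripod, perfect light)", 20_000_000),
    ("with motion, lighting, camera vibration",     4_000_000),  # 3-5 MP range
    ("after editing / color-correction passes",     2_000_000),  # 1-3 MP range
]

prev = None
for name, pixels in stages:
    note = "" if prev is None else f" ({pixels / prev:.0%} of previous stage)"
    print(f"{name}: ~{pixels / 1e6:.0f} MP{note}")
    prev = pixels
```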

  • One thing that careful observers of the 70mm print of TPM might have noticed: it had sections that were out of focus. This would have been caught in the rushes, but Lucas insisted on using HD to review the daily material. Thus, the problem wasn't detected until post-production, by which point it was too late to fix.

    Lucas's conclusion appears to have been that if everything in the chain is HD, this problem would not have occurred. He's not a guy I talk with, so I wouldn't know his real thinking.

    However, his push for HD, while interesting, is wrapped up in the particular economics of Star Wars: he's produced a license to print money, and the money he gets has little or nothing to do with the movie itself, its technical quality, or lack thereof. The last three movies were largely licensing constructs: whether the movie made or lost money at the box office, it made tons and tons of money in product licenses. As a result, the idea would be to continue to create movies that contained lots and lots of licensable elements, and then cut costs everywhere else you can. So, you spend money on effects using cool spaceships, light sabers, and digitally synthesized characters (much better, though not actually cheaper, than those pesky actors!), and cut it on the costs of printing and distribution.

    Lucas has preached how the inspiration for the Star Wars series was the Republic serials of the 20s and 30s. However, as the NYT has noted, those serials didn't cost $140 million (per episode) to make. Nor, I would note, did those serials have the licensing tie-ins.
    • Careful; HD is not digital, and digital is not HD. The two reference separate aspects of delivery. One deals with resolution and presentation, the other deals with encoding and delivery.

      This already causes problems in the HD space, with cable companies and home entertainment vendors using the two interchangeably, and it's dangerous. "Digital" can be whatever resolution you want, and can be presented just as (if not more) crappily than analog (just look at digital cable, where the cable co's have used the "higher quality" to instead lower the resolution in order to squeeze more bandwidth out of the available pipes).

      What Lucas is on about is digital encoding and delivery. One only has to walk into a DLP cinema and watch the dreck that is the poorly-presented AotC to see that it's NOT high-definition by any stretch of the imagination.
  • I saw it last night - I did think there was one scene in particular, when Obi-Wan is going through the caves, where the background looks like he's walking in front of a smooth painted wall. Maybe it's just me... that, and the Yoda lightsaber battle was clearly sped up. It looked cool, but had the same lack of definition.
    • The guy who played Yoda could not move that fast in real life. "The magic of Hollywood."
      • Um, wasn't Yoda CG in this episode? As in, Frank Oz only did the voice, not the usual puppetry?
        • No, it was Warwick Davis. Also, the scenes on Kamino were not shot in real rain. They set up a sprinkler between the camera and the actors, so it just looks like rain.
  • I remember seeing TPM in New Jersey on a digital projector, and it was incredible--the best picture I've ever seen.

    Last night I saw ATotC in Framingham, MA in the digital theater, and it just wasn't as good as I would expect from analog. Sure, there was no bouncing in the picture, but there was pixelization. It was really obvious during the credits, and was distracting in some of the other scenes.

    I'm not sure what processes were involved, but clearly digital can produce an outstanding picture; due to the costs involved, though, they aren't using the technology necessary to produce the best picture possible.
  • My biggest worry about digital projection is the calibration - who is going to do it? The theater manager? Is s/he going to be trained to do it properly? Who is going to train them? This is someone whose main job is to manage a bunch of kids making 6 bucks an hour, not an image professional (and that goes back to the near-elimination of the projectionists' union). How often do you go to the theater now and find the film out of focus? Can we trust that every morning someone is going to calibrate the projector according to specific standards, i.e. color balance, brightness/contrast, etc.? What I'd like to see is a digital projector that refuses to project unless it's calibrated once a day, but that won't happen. Even the most expensive Fiery laser printers out there don't produce constant output - identical files printed hours apart can have significantly different color balance - and one can only assume the same thing happens with a digital projector: it must look different in its first hour of operation versus after it's been on for 12 straight hours.

    Maybe film has its benefits in that if film prints are timed and struck properly, they'll look pretty much how the director/cinematographer wanted them to look, across different prints in different theaters. Of course, there will be variations due to the projectors, light bulbs, and film fading (though not with new prints, since new film stocks are pretty rugged), but there's a constancy in film that you can't expect from digital projection.

    With digital, there is so much potential for it to be seen incorrectly, that you can't trust what you're seeing on the screen. With digital, the image is in the hands of non-professionals.
  • I saw it digitally at the Van Ness AMC and, from the second row center, thought just about everything looked great. What stood out the most, however, was when the credits rolled: you could see the pixels of the text moving up the screen. I assume that all credits are done digitally these days, so it would seem the problem stems from the projector, not the source material.

    The thing with digital is that at this point it may not be better than film, but soon enough the resolution will surpass film. The purists will still complain about motion blur and "warmer colors" however...

    Very soon, digital projection is going to be very affordable compared to film once you include the cost of making prints of a movie. If it costs less, the quality available today is plenty to keep the average moviegoer happily buying popcorn.
    • Very soon, digital projection is going to be very affordable compared to film when you include the cost of making a print of a movie.

      That's part of the problem. Who pays for the new (and expensive!) digital projection equipment?

      The film companies pay for the prints of a movie. The theatres pay for and own the projectors and so on. So the savings from digital distribution will accrue to the studios, while the cost of upgrading will be shouldered by the theatres???

      See the problem?
  • Saw it last night in a Loews, in analog. Terribly, terribly blurry. Maybe it was the projectionist, but the odd thing is, no one else seemed to complain. The theater was also only half full. Now, this was at 6:45 pm on a Sunday at a very large, new, not very popular theater, but that was still a surprise.
  • I've seen both versions of Star Wars II. I had not been to a movie theater for half a year. On opening day I went to see AOTC projected digitally at Universal City. My impression while watching it was that it looked super clean and very sharp, but not a night-and-day jump from film - perhaps 15-20% better. The next day I saw the film version at a respectable AMC9 theater in Burbank. WOW, I could not believe how inferior it was in comparison. I think everybody complaining about the "shitty" 1280x1024 is crazy. The resolution of the film version *looked* terrible; it reminded me of a PC game running at 512x384 vs 1024x768. It was very blurry.

    Interestingly, I also thought the special effects/CG characters looked more fake on film than on digital. Most of the special effects were worthless in the film version; you couldn't even see anything. For example, in the scene where the camera pans around the cloning factory, the background in the film version was extremely blurry, with a terrible amount of motion blur that diluted the effect, whereas it was super clean, detailed, and impressive in the digital version. Plus the amount of DIRT on the film projection was unbelievable: clicks, pops, hairs everywhere, and a nice hole in the upper right corner that popped up every 20 minutes. It was really bad. Instead of 15-20% better, I came out of the film version with the opinion that the digital is 40-60% better quality. I suppose it's possible that the film projectionist had a bad focus and a bad print (always a possible problem with film), but I honestly felt bad for my friend who went with me to the film version and had not seen the digital one. It's really amazing how crappy the film version was after half a year of being conditioned to watching DVDs on nice televisions.

    This is the opinion of a more or less regular guy when it comes to video, I didn't even know the actual resolution of DLP projection before reading the comments on this story.
    • One thing to remember, though, is that you are dealing with a movie shot on HD video and then transferred to film. So basically, it has all the disadvantages of the film medium (can be scratched, hairs, etc.) and none of the advantages (better color depth, greater clarity, etc.).
  • I've done some work with Panavision on Hi-Def and in the process got to see some very nice uncompressed HD footage shot at Panavision's studios. The quality of raw HD is very nice, and practically no different in resolution from the CGI rendered for adding effects into traditionally shot movies.

    Unfortunately, nobody in town is showing AotC in digital, but I did see the film version, and it was a bit iffy. I saw some evidence of aliasing, but it was slight, and some shots were grainy. Other shots in similar lighting situations were not, so I'm not quite sure what was causing that.

    I also went to a SMPTE conference where they showed HD vs. film under the same shooting and viewing conditions, and HD held its own in that test. HD has much better low-light results than film but, equally, loses a little to film in the bright areas.

    The main problem with using HD for movies is that projectors cannot do 1920 x 1080, which is the resolution of HD. Watching HD on an HD monitor, especially the new Sony 24p ones is quite something.

    HD is certainly the future for movies, but first the projectors need to get up to the right standard, and then HD needs about twice as many pixels, IMHO.
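To put the resolutions discussed in this thread side by side (the 35mm figure is the rough real-world estimate from an earlier comment, not a measured value):

```python
# Pixel counts for the formats discussed in this thread. The 35mm
# figure is an estimate quoted earlier, not a measured value.

formats = {
    "DLP cinema (1280x1024)":     1280 * 1024,
    "HD (1920x1080)":             1920 * 1080,
    "HD with twice the pixels":   2 * 1920 * 1080,
    "35mm real-world (estimate)": 4_000_000,
}

for name, pixels in formats.items():
    print(f"{name:28s} {pixels / 1e6:5.2f} MP")
```

On these numbers, HD with roughly double the pixels would be in the same ballpark as the real-world 35mm estimate, which lines up with the parent's "about twice as many pixels" suggestion.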
  • Fellow geeks and gnurds,

    I just got back from viewing AotC in digital projection format. After all the chatter on /., I had to go a third time and check it out in this venue. Last week, I saw AotC twice, in two different theatres, in film format. As expected, the digital presentation was better. The image on the screen was far more consistent than at either of the two film presentations. The sound was also better throughout the digital venue. During my first film viewing on Thursday, the sound system kept going wacky, collapsing from surround into mono - I should have complained and asked for my money back. On Friday, during the first reel, there was a vertical strip that kept rolling by, and the right-hand side of the screen was out of focus for the whole duration - again, I should have complained and asked for my money back. During both film presentations, across different shots in the same scene, you could see graininess and general inconsistency. In digital format this did not happen - the movie had the same visual texture throughout. Now, the digital projection did texture the image. If you have ever seen a DLP-based projector or some high-end widescreen TVs, you will know of what I write. The best way I can describe it is a pastelization of the image.

    The good part was, the movie got better the more I saw it. For whatever reason, it flowed better once you'd already seen it - something Lucas may want to consider. I also spotted a few more mistakes: a) during the flying pear scene with Anakin and Padme, he slices the top part of the pear, but it's the bottom part Padme bites into; b) when Padme biffs the sand and the clone trooper comes to her, the toe of his foot fails to make a depression in the sand at one point.

  • "Digital movies don't have that infamous pube running around the screen."
