Hardware

Is CRT Burn-In Still a Problem?

coloth asks: "We've all been told for many years that monitor burn-in is a thing of the past, that CRTs use a different kind of phosphor now, and that screensavers are more toys than practical safeguards. After a few minutes with Google, nearly every PC advice site I found said as much. Well, I just realized tonight that I've got burn-in from the Seti@Home screensaver on my Dell P991. I took a picture with my digital camera. (disregard the bar of interference) I added the arrows with Photoshop and enhanced the image a bit, but the burn-in is clear. Here is the image of the "screensaver" to compare the pattern. Is my monitor sub-par? Is the conventional wisdom about burn-in untrue? Are most people doing anything specific to avoid burn-in?"
This discussion has been archived. No new comments can be posted.

  • CRT burnin is virtually non-existent on modern color monitors as long as you don't have the exact same image on the screen forever. It was a significant problem in old monochrome monitors.
    • I think the Sony tech that told me that was wrong. Or lying.
    • as long as you don't have the exact same image on the screen forever

      I thought that was the problem in the first place...
      • I think the issue is the time-thresholds involved. Older monitors could burn in relatively quickly. Modern displays take a bit more "persistence" to burn in a specific pattern. Old projection TVs were really bad about burn-in -- playing video games with "unnaturally white" objects onscreen could scar the CRTs. (That probably had something to do with the light levels involved in order to be able to project the image.)

        The problem hasn't gone away, but I'd say it's lower magnitude than it used to be. For instance, you don't see too many modern arcade games with "INSERT COIN" burned into the CRT tube any more, but it was very common 20 years ago. How many Win9x and later machines have you seen with the Start bar burned in?

        --Joe
    • Old monitors, in the days before GUIs, had the problem that everything was text; the text changed, but it degraded the CRT nevertheless. The effect was that after a few years of operation you could see from the lines on the screen where the text had been positioned. A good screensaver back then would simply have turned off the screen.

  • Damn you (Score:3, Funny)

    by Cyclone66 ( 217347 ) on Wednesday December 18, 2002 @08:36PM (#4919619) Homepage Journal
    Now I'm going to have to turn off my monitor all the time. Thanks for making me paranoid again!! (I never did trust screensavers. If your computer crashed you would have the nice screen saver image burned into your screen.. black is the only way!)
    • About screensavers (Score:5, Informative)

      by MacAndrew ( 463832 ) on Wednesday December 18, 2002 @09:09PM (#4919774) Homepage
      Yes, Virginia, phosphors do wear out over time. The problem used to be much, much worse.

      I hate screensavers that run more than ten minutes -- they rarely seem clever any more, and more importantly they seduce a lot of people into thinking they are somehow saving energy. In fact, if the tube is fired up the box is consuming nearly full power and releasing nearly full heat.

      MUCH better is any kind of sleep mode, which might reduce a 75-watt load to 5 watts. Or ... turn the darn monitor off for a power consumption of maybe 2 watts (off doesn't quite mean off). A friend and I went to ridiculous lengths during the California energy crisis to persuade his tech company IT dept. to tell everyone to turn off their 3,000 machines when not in use, or at least overnight, or AT LEAST over the weekends and holidays.

      They told workers to leave them on because of the old tale that electronics last longer that way. After more research, I eventually reached a very friendly engineer with the Sony monitor operations. In short, the leave-'em-on philosophy made some sense before all-solid-state surface-mount design prevailed, because of thermal shock and "creep," which made ICs rise in their sockets. Now everything is soldered and the components are many times more reliable. In terms of monitor wear, it's probably a toss-up. No one would seriously leave their TV on 24 hours a day to save money, right? And even if wear were somehow accelerated with the machines on 1/3 as long, the amount of energy consumed would have bought a ton of hardware.

      We calculated the excess electrical cost in the hundreds of thousands of dollars, and that's before factoring in the added load of the air conditioning to remove the waste heat (most of the energy used goes to waste heat). (A rough guess: the best estimate I could find suggested it takes 1 watt of A/C consumption to remove 3-4 watts of heat.) There was also a citizenship angle to energy conservation at the time, as there were rolling blackouts to ration energy the closer the system came to overload. (You all remember the stories.)
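      The back-of-the-envelope arithmetic above can be sketched in a few lines. All the inputs are illustrative assumptions rather than the campaign's actual figures: 3,000 machines, ~75 W per monitor, nights-and-weekends idle time, an assumed $0.12/kWh, and ~1 W of A/C per 3 W of heat removed.

```python
# Rough sketch of the monitors-left-on cost estimate.
# All inputs are assumptions for illustration only.

machines = 3000
monitor_watts = 75                  # per monitor, tube fired up
idle_hours_per_year = 128 * 52      # nights + weekends, roughly
price_per_kwh = 0.12                # assumed electricity rate

monitor_kwh = machines * monitor_watts * idle_hours_per_year / 1000
ac_kwh = monitor_kwh / 3            # ~1 W of cooling per 3 W of waste heat
total_cost = (monitor_kwh + ac_kwh) * price_per_kwh

print(f"{monitor_kwh:,.0f} kWh of monitor waste, ~${total_cost:,.0f}/yr with cooling")
```

      Even with conservative inputs, the total lands in the hundreds of thousands of dollars.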

      Um, anyway, I hope I made a point. Er, my point. One thing Apple did right was to support Energy Star early on. "By design," [microsoft.com] MS Windows NT 4.0, which his company uses, does not support power mgmt [energystar.gov] even though the newer monitors typically do. Yes ... they could upgrade the O/S ... but what they have works. There are cheap utilities that persuade NT to play along, but that would require getting the IT people to install it, and if they were unwilling to listen about turning monitors off, well.... you get it.

      I think they did start telling employees to turn the monitors off just recently, maybe 18 months after our email campaign and a half-million in electricity. Could the computers be next? They're just dumb workstations and don't do anything in the off hours.

      Last nail in the coffin: To give you a sense of his IT department, they sent a tech down once who could not be made to understand, by several engineers, the difference between a SCSI and a parallel port ("Well, it should fit."). No shit. :)
      • [...]most of the energy used goes to waste heat[...]
        All the energy used goes to waste heat.

        In fact, the only way some energy you use doesn't go to heat is if

        1. you increase mechanical potential energy (lift something higher than it was before and leave it there, or pressurize a vessel)
        2. you increase chemical potential energy (like charging a battery) or latent heat (like boiling water off as steam)

        Even heat pumps (like an air conditioner) heat the whole system (indoors + outdoors) by the amount of energy put into the heat pump to run it. Luckily they are fairly efficient, and can move 3-5 times the heat they waste.
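        That bookkeeping can be checked with a couple of lines of arithmetic (the COP of 3 is an assumed, illustrative figure):

```python
# Heat-pump energy bookkeeping with an assumed COP of 3:
# 1 kWh of electricity moves 3 kWh of heat from indoors to outdoors,
# but the whole system (indoors + outdoors) still gains exactly the
# 1 kWh the pump itself consumed.

electric_in = 1.0                   # kWh consumed by the pump
cop = 3.0                           # assumed coefficient of performance
heat_moved = electric_in * cop      # heat shifted indoors -> outdoors

indoor_change = -heat_moved                 # indoors loses the moved heat
outdoor_change = heat_moved + electric_in   # outdoors gains it, plus the input
system_change = indoor_change + outdoor_change

print(system_change)  # net heating of the whole system
```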

        • Look, nothing's perfect.

          Some of the energy is turned into light by the phosphors, which then travels out a nearby window and off into space (energy lost from system).

          Or maybe some of the monitor radiation gives the engineer a tan (chemical changes).

          Or engineer takes data disk brimming with data home and loses it.

          There, happy? :) It is safest to avoid unnecessary absolutes.
      • by shepd ( 155729 )
        As far as heat and monitors goes, here's the story, straight up from what I remember in EET:

        Leaving a tube (not the monitor) on all the time, even if it is displaying a blank image, will shorten its life because the heater in the back will eventually wear out (just like a light bulb). Again, just like a light bulb, there is a thermal shock to the heater in the tube which can cause it to fail prematurely at power-on, so there is a tradeoff as to how long you leave it on before it is worth turning off (I have a thread where I discussed this; only a physicist can tell you the numbers, and sorry, I'm not one). It is a very safe bet to say it is best left off overnight.

        Now, we have the electronics. It's more than parts jumping out of their sockets due to thermal expansion and contraction. It's also poor solder joints cracking, and very poorly designed circuit board traces frying up from the turn on current surge, and crappy capacitors boiling.

        Overall, you should leave the tube off when not in use, and the electronics on, since unlike the tube the electronics generally do not perform worse over time. I.e.: use your power-saving mode.
  • arrows (Score:3, Funny)

    by dostick ( 69711 ) on Wednesday December 18, 2002 @08:37PM (#4919622) Homepage Journal

    You must do something about those terrible black arrows on your monitor!

  • Yes...... (Score:5, Informative)

    by Smidge204 ( 605297 ) on Wednesday December 18, 2002 @08:38PM (#4919627) Journal
    Every computer at my (ex)university had that same problem. Seti@Home (and, actually, all of the distributed-computing project 'screen savers') make extremely POOR screensavers. If you're going to set it up like that, you should enable the "turn monitor off" option if your monitor is capable. (Saves electricity, too)

    =Smidge=
  • Gateway VX900 (19") sitting on a KVM. Due to some funkiness, "Invalid Scan Frequency" often shows up on the monitor when things go to sleep... After having left the monitor on for probably pushing a year or so, I can barely see "Invalid Scan Frequency" on the monitor when it is powered off.

    I'm guesstimating the monitor is no older than 3 years, probably less than 2.

    It's probably not nearly the problem it used to be, judging from some of the old junker VT100 displays we have sitting around with VERY prominent burn-in on them. It looks like it's still something to worry about for monitors that show the same thing day in and day out for months on end.
  • by dacarr ( 562277 ) on Wednesday December 18, 2002 @08:39PM (#4919638) Homepage Journal
    What Brunson said (above). To wit, it became a thing of the past largely because of durability, but also keep in mind that for the most part phosphor burn doesn't happen much any more because there tends to always be something happening on screen.

    Also keep in mind that your monitor sucks down a lot of power anyway - you'd save power just powering it down.

    • I have a feeling those sites that say burn-in isn't a major problem anymore meant to say that monitors are far less susceptible to the problem nowadays, especially since most of them now go into a standby mode if you don't use your computer for a given amount of time.

      If your monitor doesn't auto-off and you leave your computer running all day with no screensaver or a screensaver that leaves regions of pixels unchanged (such as the Seti@home client), you can still develop burn-in over time.

      The best solution is to just turn your monitor off when you aren't using your computer - not only does it avoid burn-in, but monitors suck a whole lot of electricity, so you will also save money on electricity and be helping to not destroy the planet so quickly, too.
  • by Yarn ( 75 ) on Wednesday December 18, 2002 @08:40PM (#4919643) Homepage
    Apparently plasma screens are particularly susceptible to screen burn, and the so-called 'DOG' (don't know what it stands for; it means the logo of the station/programme you're watching) often gets burnt in if you watch one channel frequently. The displays in the BBC TV newsrooms used to show this effect, but they've either been replaced or recalibrated.

    I just wish they'd stop cluttering *my* screenspace.
    • Bug (Score:1, Informative)

      by Anonymous Coward
      You mean "bug", not "dog".
      • No, "dog" is also correct - stands for "Digitally Originated Graphics". There are several campaigns running to try to persuade broadcasters to remove them for reasons such as burned TV screens [logofreetv.org.uk] and aesthetic damage to programmes and films infected with them. See here [uk.com] and here [logofreetv.org.uk] for more info.
  • by drivers ( 45076 ) on Wednesday December 18, 2002 @08:49PM (#4919686)
    I've seen the line "PowerMenu has shut off the screen to prevent burn in" (paraphrased) burned into each of the 25 text lines on a monitor. The program moved the text between lines to prevent burning in one line, and burned in 25 lines instead.
  • The SETI screensaver may be an extreme example, if you leave it running 24x7 without putting your monitor into powersaving mode. I tend to believe that burn-in is not the problem it once was. However, I think a lot of the problem has been 'solved' by the GUI. In old terminal-based systems, the same exact text always sat at the exact same spot. Windows and other graphical interfaces have only a small area that is static, and that area is frequently covered by other windows and moved about the screen. Putting up SETI without having the computer put the monitor in powersave mode definitely sounds like it would cause some burn-in.
  • You can pay a good $15 less for a used monitor at a used computer store when it has either "burn-in" or is "slightly dim."

    Even our 36" Sony WEGA says it will burn-in when used with video games too much... and that's a CRT.

    KRis
  • by Anonymous Coward on Wednesday December 18, 2002 @08:55PM (#4919709)
    You're worried about CRT burn-in? The US Federal Government still requires a two-person check to verify that LCD panels aren't burned in. And nobody calls it a dumb rule, except when there is only one person for every 3,000 employees certified to allow a monitor to be removed from a secure environment.

    Top two images most often burned into monitors: WinNT/Win2k "Press Ctrl-Alt-Delete for a secure login." and the BSOD. :-)

    • Actually, the WinNT/2K logon screen moves around during inactivity.

      Try finding some webcam of a computer lab and watch the MPEG timelapse video. Then make funny noises with your mouth as the NT login boxes fly around when the room is dark.

      Weeeeeeeeeeeeeeeeeeeeeeee!

  • On the 6-year-old 14" CTX cheap$#it monitor that I am typing this reply on, there is a very clear burn-in of the Windows taskbar. In the same manner, check out the color CRTs on ATMs. At home I use power management and no comparable burn-in seems to occur.
  • Burn in. (Score:4, Funny)

    by Twintop ( 579924 ) <david@twintop-tahoe.com> on Wednesday December 18, 2002 @09:07PM (#4919758) Homepage Journal
    My desktop monitor has the outline of the /. main page burnt on it. o_O~
  • Maybe my screen's the one with problems, because all I see on your jpg are the arrows that you drew on it.

    -Andrew

    P.S. Be aware that photographing a CRT isn't too accurate unless you can manually adjust your exposure... Otherwise, you'll get the scanning emitter making it look weird - like yours.
  • I got it! (Score:5, Funny)

    by Glonoinha ( 587375 ) on Wednesday December 18, 2002 @09:33PM (#4919934) Journal
    They should put a button on the front of the monitor that would make the screen turn off. It would save electricity and prevent any screen burn in too!

    • First off, I understand your sarcasm completely. Secondly, to prove myself the boring realist that I am, some monitors (including mine) that support power saving soft-off do not feature such a button on the front. They have moved it to the back, which is annoying--if only I could turn off that darn amber LED that says it is sleeping...
  • Running SETI without the screensaver app enabled, or even better, running it from the command line, gives you faster work units anyway.

    Since you were less worried about your total units completed than about having a 'cool' screensaver, this is the price you pay. :)
  • Actually, I've always thought about 'burning in' my name/phone number/email/etc. into my monitor for security purposes (pretty hard to remove!). If you adjust your monitor right (maybe make a screensaver that changes the screen mode to some obscure one where you have your monitor adjusted so that your "security message" will be mostly offscreen), you should be able to keep it off the monitor's normal screen area.

    Of course, you have the undesirable type of burn in. Here's an idea. Take that screenshot of SETI@HOME, reverse the colors, and make some lame VB app to make that the screensaver (be sure to adjust your monitor so that the screensaver lines up perfectly with the burn in). And, from here on out, don't use the GUI version of SETI@HOME...it's terribly inefficient. Use the console version instead.

    Or, just sell the monitor to some lamer, claiming the burn-in is from the next-generation Trinitron aperture grille in the monitor, for a jacked-up price.

    • by GigsVT ( 208848 ) on Wednesday December 18, 2002 @10:30PM (#4920231) Journal
      If you really want to burn something in to a monitor, follow these instructions.

      NOTE: These instructions should not be done by anyone with an IQ under 100. This will damage your monitor! That's the whole point! Permanently! In fact, no one should ever do this. Except maybe cats. Cats are weird like that.

      1. Boot into a console (or DOS on a differently-abled system). I think you Mac users are out of luck (yet again), seeing as you can't exit your GUI.

      2. Write a batch file or script or something that clears the screen and puts out ANSI codes for high intensity white, and your message wherever you want it on the screen.

      3. Open up your monitor*. Find the flyback transformer. It has a big red wire coming out of the top of it most likely. That red wire has 20,000 or so volts running through it, be careful, it bites.

      4. That transformer likely has two adjustment knobs on the side of it, which probably have heads for a Phillips screwdriver. They are called the focus and screen adjustments; they are variable resistors.

      5. Use whiteout to carefully mark the original position of the focus and screen. If you don't know which is which, it's OK, you will find out as soon as you turn one of them.

      6. Slowly and carefully turn one of them while the monitor is running*, and make sure you can see the screen. If it goes out of focus, you have the wrong one. Slowly turn it back to your white-out marked position. If it gets brighter or darker, that's the one you want.

      7. Turn up the front panel brightness all the way. Then turn the screen knob up slowly until the black part is pretty light too. The white text should be extremely bright right now, and may bloom some. Don't turn the knob up so much that the X-ray protection kicks in*. If the CRT turns off, you went too far, dipshit. Try rebooting the monitor if you didn't fry anything.

      8. Once you have the black level so it is pretty bright, and the text is nice and bright but not blurred out completely, let that thing sit for a few hours. Keep an eye on it, but don't hang out in front of it, X-rays, remember? :)

      9. Try turning the monitor off and see if the phosphors are cooked yet. If you can see the text with the monitor off, then you have succeeded.

      10. Undo all the changes you made to the settings. Put the cover back on. Any leftover screws are a bonus from the Gods. Sell the monitor to your enemy, etc.

      *In case you haven't noticed, if you screw up, you might die. This whole thing is dangerous for someone who doesn't know their way around electricity. Don't be a dipshit, and don't do this on a monitor you don't want to destroy. In fact, just don't do this... ever!

      Oh BTW... to the original poster, you can't reverse burn-in by displaying a reverse image. All you will do is burn the rest of the phosphors to the same darkness as the burned ones, and your monitor will get too dark to use sooner.
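      For the curious, step 2 of the instructions above (the script that paints a bright white message at a fixed spot) can be sketched like this. It's a hypothetical example assuming an ANSI-capable console; the message and coordinates are placeholders:

```python
# Build the ANSI escape sequence for step 2: clear the screen, move the
# cursor to a fixed position, and emit the text in bold (high-intensity)
# white. Assumes an ANSI-capable console; text and position are placeholders.

ESC = "\x1b"

def burn_screen_codes(text, row=12, col=30):
    return (f"{ESC}[2J"                    # clear the whole screen
            f"{ESC}[{row};{col}H"          # move cursor to row;col
            f"{ESC}[1;37m{text}{ESC}[0m")  # bold white text, then reset

print(burn_screen_codes("INSERT COIN"))
```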
      • 1. Yes, you're probably right... the monitor he was showing had an annoying yellow tint where the burn-in was.

        2. Solution?

        3. Burn in other elements and call it modern art!!! Then, sell it!

        4. PROFIT!!!

      • >>Boot into a console (or DOS on a differently-abled system) I think you Mac users are out of luck (yet again), seeing as you can't exit your GUI

        We can't?

        Really?

        I wonder what this strange console-like white text, black background, full screen, no-quartz, tcsh shell I get when I login as ">console" is then.
        • A framebuffer?
        • Well, considering the GUI is in the BIOS, I'd think it would be pretty difficult to completely get rid of it.
          • by ZigMonty ( 524212 ) <slashdot.zigmonty@postinbox@com> on Thursday December 19, 2002 @01:48AM (#4921074)
            No it isn't and it hasn't been in years. All new world macs (iMacs and above) only have OpenFirmware in their ROMs. OpenFirmware may be a bit more sophisticated than a traditional PC BIOS but it certainly doesn't contain the GUI. OS 9 (which needs the Toolbox code in ROM) uses a trick: It loads a file called Mac OS ROM from the hard disk and uses it instead.

            Even on old world machines (beige G3 etc) Mac OS X ignores the GUI code in the ROM as it is a completely different architecture (QuickDraw rather than Quartz) and it would be pretty impossible to use from a Unix environment anyway.

            Logging in as >console drops you into a text console. No GUI. Is that so hard for you to believe? If you hold down command+v, Mac OS X will boot in verbose mode ie. it dumps pages of text to the console just like linux does. Command+s will boot you into single user mode, which is sh with nothing else running and / mounted as read only.

            Here's a trick: install XFree86, login as >console, run startx. No Apple GUI code in sight. Hell, if you had half a clue you'd know that Apple releases Darwin free without Quartz et al. The console is all you have unless you install XFree86.

          • >> Well, considering the GUI is in the BIOS, I'd think it would be pretty difficult to completely get rid of it.

            Good Lord. When was the last time THAT was true??
      • 1. Boot into a console (or DOS on a differently-abled system) I think you Mac users are out of luck (yet again), seeing as you can't exit your GUI.

        Wrong about the Mac users again. At the GUI login screen, just enter '>console' as the username. You're immediately dropped into a console.
  • The case is clear. Their "screen saver" did just the opposite. You live in America don't you? :)
  • by almightyjustin ( 518967 ) <(dopefishjustin) (at) (gmail.com)> on Wednesday December 18, 2002 @10:05PM (#4920101) Homepage
    SETI@home has a built-in option to blank the screen (without turning off the monitor or doing any fancy Energy Star stuff) after a certain amount of time. They even recommend this because the video code takes some time away from the CPU. There's no excuse for not using this.
  • by quintessent ( 197518 ) <my usr name on toofgiB [tod] moc> on Wednesday December 18, 2002 @10:22PM (#4920189) Journal
    Now we can all breathe a little less pollution when you start having your computer shut down the monitor after 30 minutes of non-use.
  • ..I took a picture [pilafidis.org] with my digital camera.

    Yeah, I can just make out a breast, hmm, two.. hold on - there's four!
  • Definitely Yes (Score:4, Insightful)

    by penguinboy ( 35085 ) on Wednesday December 18, 2002 @10:29PM (#4920227)
    I've seen an entire lab of computers with the Windows 2000 login box burned into the monitor of every single one. This may be something of an extreme case, since the displays were left on 24x7, even on nights and weekends and they were relatively cheap monitors to begin with, but burn-in is certainly still a reality.

    The bottom line is that it probably doesn't matter if you leave the same thing up for a little while, but the screen should definitely be blanked/turned off when it isn't going to be used for any significant period of time - say, a half hour or more. Besides eliminating the problem of burn-in, simply turning the display off when it isn't in use will save a significant amount of energy.
  • Monitors should, before you power them down, show a white screen, and somehow scan it for imperfections. Then, if there's burn in, just blast it out with a HUGE electron gun jolt on the rest of the screen to balance things out.
    • Makes perfect sense. The only snafu I see is the "somehow scan it for imperfections". I can see it now:
      "System shutting down, please place screen quality assurance adapter on the front of your monitor and press ENTER."
  • by dmorin ( 25609 ) <dmorin@@@gmail...com> on Wednesday December 18, 2002 @10:39PM (#4920271) Homepage Journal
    I think popular thought on burn-in was that rotating screensavers made the problem go away because the pattern never stayed in one place. This is why ATMs usually suffer from the problem, because they show the same screen all day long.

    My company recently went over to big-brother-esque screen savers that rotate through the company mission statement. The funny thing is that they decided to put the company logo in the exact same spot on every slide, so it's now burned into my screen. Lovely.

  • The exec. assistant in my office has a very clear outline of what would appear to be the standard NT login box on her 17" Sony tube. This monitor is maybe 3 years old.

    I'd always been told from monitor vendors, engineers, everyone down the line that burn-in was a thing of the past, but don't believe the hype. It's not going to happen in minutes, my guess is that she probably went on vacation and left it or something. But then, that box jumps around, so I'm unsure what it really is. Perhaps she just left it logged in with a window there for a week.

    But regardless, it's not impossible.

  • Hmmm. (Score:5, Funny)

    by pete-classic ( 75983 ) <hutnick@gmail.com> on Thursday December 19, 2002 @12:22AM (#4920770) Homepage Journal
    Let me get this straight.

    Until recently you didn't believe in CRT burn-in, but you became a believer while looking for freaking space-men?

    Your system of beliefs is totally fucked.

    -Peter
    • you became a believer while looking for freaking space-men?

      Yeah, uh huh! You got it straight now.

      However, I just can't emphasize enough that it's "freaking spacemen" that I'm looking for-- not your average, easy-going spacemen.

  • While the burn-in problem has certainly improved for CRTs over the last few years, be aware that there are certain types of TFT panels which can develop a (partially reversible) memory effect very similar to the burn-in problems known from CRTs.

    I talked about this topic with a product manager from Samsung who told me that current panels are far less affected by this problem compared to the panels available one or two years before, but I wouldn't be surprised if the problem is even present in current products.

    Specialist
  • It just takes a while longer to happen. On the order of 1 week to 1 month. I've seen _many_ modern colour monitors with screen burn. Do not leave static images on them.

    The worst are computers with login screens that have no screensavers or screen-blanking functions. These ALWAYS end up burned in at my work (in this case with the Novell client) because people get lazy/forgetful about turning the screen off.
  • I have my monitor set to screensaver in 15 minutes, standby in 45, which for me is the best tradeoff in savings without my monitor shutting off too abruptly if I get up for a little while. I use a Dell/Sony Trinitron from '98 and it still works great (hell, the Dell PII 350 is still my main box, and you call 700MHz computers OLD). Leaving your monitor running all the time will only waste electricity, burn in the screen, and burn out the CRT. Then you end up with those monitors like you see at libraries and schools which have no brightness left at all and are barely readable.
  • I know projection TVs and CRTs are different animals, but the voices told me to share this story, so here it goes:

    My Dopey Inlaws (who have never heard of Slashdot) bought an expensive rear projection TV and burned in the Fox News logo and "Live" in the upper left corner.

    They left it on all day every day... to entertain their parrot.

    Now, they did not think the parrot was a news junkie. That just happened to be where they usually left the channel set when they got up from watching TV.
  • I work in a petrol station, where the terminals have all been upgraded. But before that we (the cashiers) had two terminals each. One which was the till, with an LCD for display, and another for taking the fuel transactions off the pumps. This used a conventional green-on-black CRT for display, which displayed 16 boxes (of the pumps) and their status by embossing them or highlighting, inverting etc. The time and date were at the top. The boxes got really nicely burnt out, unsurprising, considering this was a 24 hour station, open 364 days of the year.

    Once I tried turning off one of the monitors to admire the screenburn. When I turned it back on I couldn't get the display timing to lock, and just saw a highly corrupted picture that cycled. The monitor wouldn't work again until it had been switched off for half an hour.

    I don't know if turning it off affected its lifespan, but it was definitely buggered until it cooled.
  • On my monitor at work: I went away for a two-week vacation and accidentally left my monitor on, and when I came back it had the "Windows 2000 Professional, this workstation has been locked... press ctrl-alt-del to unlock, etc." screen burned into it. It's had it ever since; my co-workers were quite surprised to see a modern monitor with screen burn-in. So yes, it can happen, but it'll take a few weeks.
  • This is unsubstantiated and purely speculation: degauss every week. All those crazy shaking colors gotta clean up something. :)
    • Degaussing is there to reverse damage caused by magnets too near the screen. It basically dumps lots of AC into a big coil around the screen for a few seconds; if the shadow mask/aperture grille has been magnetized, this will help to demagnetize it and eliminate purity (funky-color) problems.

      Phosphor burn-in is the result of the fact that the phosphors (the things in the monitor which light up when an electron beam hits 'em) will lose brightness when they've been worked for a while. If they've been worked disproportionately, like to display a light-colored logon box in the middle of a dark screen, then you might get an "image" of burned-out phosphors sitting amongst the non-burned ones.

  • by DGolden ( 17848 )
    No one has EVER told me that burn-in "stopped being a problem" (I'm in Europe). I've seen monitors that were bought early this year with noticeable burn-in after about a year's (ab)use by clueless office drones.

    Must be some propaganda campaign by monitor manufacturers in America, or something.
  • Back in 1999 at my last job, one of our clients had us order them a huge plasma display. They had nowhere to store it while waiting for the custom roadie case to be built that it was going to be riding in while it travelled to different trade shows around the country, so we hung on to it and, uh, "tested" it for them.

    One day, a video crew came to shoot some tape of our execs for some promotional video or something they were putting together. We didn't have a really cool backdrop, so we were asked to set up the plasma display and just put a huge company logo on the screen, and the talking heads would have that behind them.

    The shoot went almost all day. When they left, I went in to take apart what we rigged up for them. When I powered off the plasma display, I was startled to see that the company logo had burned in.

    If memory serves, the burn-in faded a bit over time (with further use of the unit) and was no longer noticeable. This happened after the roadie case finally showed up, but the client changed their minds and tried to weasel out of the expense and stick us with a display we didn't need.

    ~Philly
  • only 104 units in 3930 hours?

    man, did you get *Delled* ;)
