
Where Did 1280x1024 Come From? 42

Alan Shutko asks: "I was playing with different resolutions recently, and got confused. 640x480, 800x600, 1024x768, 1152x864, 1400x1050, 1600x1200, they all have a 4x3 aspect ratio. But 1280x1024 has a 5x4 aspect ratio. What's up with this? Somewhere in the annals of computing history, someone must have come up with 1280x1024. Why did they choose such an odd aspect ratio?"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Ideally, you'd have 1280x960, but since 960 isn't a multiple of 128 (or 256), it messes up various hardware blitting methods.

    Is this still true, or did it only apply to older hardware? I've run X in 920x690 mode with no problems, and 690 isn't a multiple of 128.

    One of the popular VGA modes was 320x256 (which is in the 5:4 ratio). It meant that you could have an array of scanlines, and index them with a single byte (with no wastage).

    What programs used that screen mode? 320x200 was the standard mode for games (mode 13h). 320x240 was available with a hack ("Mode X"). Neither of these are multiples of 128, and I've never heard of 320x256 being used.

  • Actually, I'm running 1440x1080 on my 19" ViewSonic monitor :-). E-mail me if you want my modeline to play with...
  • why would you only run 16bit color? can they do full color?
  • same here, even the monitors that sgi ships running with 1280x1024 as the standard res.
  • ibm (and dell?) both make laptops that can handle it, the one from ibm (a20p) is expensive.
  • It's a good resolution -- my old Iiyama monitor handled it well, with a little modeline tweaking.

    Pity my laptop only does 1024x768... but I did need a machine I could carry around.

  • actually, i think your sun monitor still only does 1152x864. The monitor "does" 1152x900 but won't display it all

    but, IMBW
  • Whatever is in the Dell Inspiron 7500 with the XGA+ screen... it runs at 1400x1050...

    just a note

  • The TV is 640x480, but standard TV transmissions use less resolution (483 visible lines). Further, since TVs weren't designed to be monitors, resolutions less than 640x480 may be better for TVs.

  • That's not my point. XoXus says 1280x960 was not (originally) feasible because 960 is not evenly divisible by 128.

    However, none of the other very common resolutions we use, like 640x480, 800x600, 1152x864 or 1600x1200 (for those lucky enough :P) are divisible by 128, so his theory doesn't seem to make much sense.
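The divisibility objection is easy to check with plain arithmetic; a quick sketch (nothing hardware-specific assumed, just the common vertical resolutions mentioned in the thread):

```python
# Which common vertical resolutions are actually multiples of 128?
heights = [480, 600, 768, 864, 960, 1024, 1200]
multiples = [h for h in heights if h % 128 == 0]
print(multiples)  # [768, 1024] -- only these two are 128-aligned
```

So of the heights in common use, only 768 and 1024 are 128-aligned, which is consistent with the point that 480, 600, 864, and 1200 are not.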
  • Okay, that's plausible... except it doesn't explain 640x480, 800x600, 1152x864 or 1600x1200.
  • It was a very popular hack on amigas...
    but i've also seen it on the pc...
    another nice one is 256x256
  • Aaaaaghk!

    Now THERE's a flashback - QBasic. I got carpal tunnel from QBasic... at the tender age of 18...

    Was your for-poke-next method any faster than PSET?

  • Back when desktop publishing was really heating up on the Macintosh, Apple came out with the Full Page Monochrome (later RGB) Display that supported one resolution:



    Yes, a "backwards" display ratio - 3:4 - one third taller than it was wide. This was designed to show a single sheet of paper at 100%, and it worked, too: you could just about line up a sheet of paper to the display.

    And others will remember the Radius Pivot display, which could "swing both ways", so to speak; you could tilt it over horizontal for spreadsheets and tilt it back vertical for page layout. The neat thing on the Mac was that the computer would automagically detect this and resize your desktop to accommodate.

    Ahhhhh, those were the days....

  • The monitor "does" 1152x900 but won't display it all

    No, it does. I noticed the discrepancy when using Xvnc for the first time. I remembered that the root window size on the target machine was 1152x?, so I set it to 1152x864 by doing the 4:3 math.

    But then I couldn't figure out why I couldn't fit the same number of windows on the screen as I do when using X on the target's console. I had to force vncviewer to open at 1152x900 in order to get the same results. [*]

    Others are using the same X setup as me, with similar boxen (older sparcs mostly) and they have the same res and effects.

    [*] I actually had to walk upstairs to the target and run X on it, check the root window size, close X, and walk back downstairs to reopen the XVNC client. It was silly. Luckily it was a temporary necessity.

  • Incidentally, my Sun's monitor has a resolution of 1152x900, which is... uh... a 32:25 aspect ratio. Normally (insofar as 1152 x anything is a normal res) you see 1152x864.

  • I'm actually running at 1152 x 870 right now. Yeeha! 192:145 aspect ratio!
  • Was your for-poke-next method any faster than PSET?


    The PSET function called a BIOS routine to do the write. Terribly slow.

    Not that mine was THAT fast.. just a lot faster. (-:
  • Ya know, I never could figure out why reverse-aspect monitors never caught on. When I hack I always end up fullscreening emacs and running follow mode (kicks butt!) and horizontal split, which turns it into a two-column display.

    The scrolling isn't perfect, which is why I want a reverse aspect ratio monitor.

    Does anyone have a software solution? Like an X driver (server?) that just rotates the display 90 deg? Then you just stand the monitor on its side...

  • This isn't 4:3 either, it's 8:5. I think this was to fit in the original mapped VGA 64K (320*200=64000, 320*240=76800). 1280*1024 could be a similar situation (4-bytes/pixel gives you 5MB?)

    Might it be possible that the 3 color phosphors were wider than they were tall? I've seen 160x240 done for that reason (small LCD).
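The segment arithmetic behind the parent's guess can be spelled out; a minimal sketch:

```python
# Mode 13h is 320x200 at one byte per pixel: the whole frame fits in a
# single 64 KB real-mode segment. 320x240 ("Mode X") does not, which is
# part of why it needed the VGA's planar tricks.
SEGMENT = 64 * 1024                       # bytes addressable without segment math
print(320 * 200, 320 * 200 <= SEGMENT)    # 64000 True
print(320 * 240, 320 * 240 <= SEGMENT)    # 76800 False
```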

  • I'm going to say that 1280x1024 came about because of the need for a more "square" work area, namely in CAD environments. That's the first thing that popped into my head when I saw this question.

    Could it also have been brought about due to memory constraints? 1280x1024 will fit into 1Meg of video memory at 4-bit color depth, maybe 1364x1024 (close to the 4:3 aspect ratio) doesn't fit into one meg of video RAM as well?

    Again, I'm guessing here, but isn't 1280x1024 the first "XGA/PGA" resolution? Could this resolution be held to a different standard than the "(S)VGA" resolutions?

    All I know is, I like my resolutions high, and 1280x1024 suits me just fine on medium monitors (17", high end 15").

    You want a messed up resolution? Dell's 1400x1050 LCD screens (XGA+)... I've got one of those puppies, and MAN, are they nice! :-) X on that laptop is almost heaven at 16-bit color depth! Oh, and it's a 4:3 aspect ratio. Probably for good DVD playback, but since I'm boycotting DVDs, I'll never know.
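The 1 MB guess above is checkable; a quick sketch (1364x1024 taken from the parent's hypothetical):

```python
# Framebuffer size at 4 bits/pixel (two pixels per byte) against 1 MB.
MEG = 1024 * 1024

def bytes_at_4bpp(w, h):
    return w * h // 2

print(bytes_at_4bpp(1280, 1024))  # 655360 -- fits in 1 MB
print(bytes_at_4bpp(1364, 1024))  # 698368 -- also fits, so at 4 bpp memory
                                  # alone doesn't rule out the 4:3-ish mode
```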
  • Hmm: lspci thinks it's a 'ATI Technologies Inc 3D Rage P/M Mobility AGP 2x (rev 64)'. I've got to agree, 1400x1050 is a great resolution. I'd have bought a non-Dell laptop if I could have gotten better than 1024x768 in December, but at the time the only one I could find that did SXGA was the 7500, and it did SXGA+ (Dell's name for 1400x1050) for just a little bit more money. Well worth it.

    BTW, if you're looking at the 5000 (slimmer, not as expandable version of the Inspiron 7500) think very carefully about a Celeron or something else that runs cooler. My 7500 is incredibly stable and stays reasonably cool, but the PIII/650 Inspiron 5000's my company bought have real heat problems. Your lap getting uncomfortably warm is one thing, but having drive and cpu flakiness that trashes a filesystem when you run it too hot is a little outside what I'd consider acceptable. If you can hack the extra weight grab the 7500.

    If you do want the 5000 look at Sceptre []. They source the chassis from the same manufacturer Dell does.

  • 1152x864 was probably used because it's the highest convenient resolution that's a multiple of 32 and fits in 1 megabyte in 8-bit modes.
  • I believe the Hercules monochrome cards ran at 720 x 348 (only mentioning that so you don't calculate an incorrect aspect ratio, of course ;) ) Just for a laugh, one of these days I'm going to break mine out of storage and try it in my Windows 98 box to see if it'll be autodetected. I guess I'd better do it soon - it seems unlikely that the next motherboard I own will be physically able to support an 8-bit ISA card.

    It had better resolution than CGA, but I have to say it sucked being the only one amongst my friends without color. I was so happy the day I got a copy of "SIMCGA" - a TSR program that allowed us Hercules users to fool games into thinking they were running on a computer with a CGA card installed; a requirement for most PC games back then. I owe whoever wrote "SIMCGA" (and who released it as public-domain) a big thank-you.

    Didn't the Hercules cards allow you to have a VGA card installed simultaneously?
  • Hmmm ... after a quick search at the PC/Blue Disk Library [] hosted by the awesome folks at the OAK Software Repository [], if anyone cares (yeah, I know) here are the first few sections from the SIMCGA manual:

    SIMCGA - Simulate CGA with Hercules Monochrome Card

    Written in September 1986 by
    Chuck Guzis
    153 North Murphy Ave.
    Sunnyvale, CA 94086

    This memory-resident utility allows you to "fool" most software requiring a Color Graphics Adapter into using your Hercules (or compatible) monochrome adapter in the graphics mode. Graphics images are reproduced in normal aspect ratio, using as much of the available screen area as is possible.

    The trick used here is to program the HGC to display 3 lines per character time instead of 4 (the CGA displays 2). A service routine hooked into the hardware timer interrupt (int 8) copies one line to the third displayed line to give a filled-out image.

    If you're out there Mr. Guzis ... thank you. :)
  • Heh, that beats 1152x882 (64:49), which was present in some old Matrox Millennium drivers.
    It was later replaced by 1152x864 (4:3).

    -- Sig (120 chars) --
    Your friendly neighborhood mIRC scripter.
  • I've actually always wanted a resolution between 1280x1024 and 1600x1200 for my 19inch monitor. 1400x1050 seems to fit the job well, but it would appear my current video board (v3 3000) doesn't support it. Do any current video boards support this mode?

    Sometimes you by Force overwhelmed are.
  • Anyone remember the old Hercules monochrome adapter? 720x350. I recall Win3.0 & 3.1 having drivers for it. And the old EGA stalwart: 640x350x16 (out of a 64 color palette).

  • The BBC micro had a virtual screen resolution of 1280x1024 - it had a number of different screen modes (colour depths, resolutions), but the graphical modes all had a virtual resolution of 1280x1024. This meant drawing a line (in BBC BASIC) from the bottom left to the top right was always: MOVE 0,0: DRAW 1279, 1023 (Yes, (0,0) was bottom left, as opposed to top left)
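A sketch of that logical-to-physical mapping (the 320x256 physical mode is an assumption for illustration; the BBC's actual modes varied, but all shared the 1280x1024 logical space with the origin at bottom left):

```python
# Map BBC logical coordinates (origin bottom-left, 1280x1024 units)
# to physical pixels (origin top-left) for an assumed 320x256 mode.
LOGICAL_W, LOGICAL_H = 1280, 1024

def to_physical(x, y, phys_w=320, phys_h=256):
    px = x * phys_w // LOGICAL_W
    py = (LOGICAL_H - 1 - y) * phys_h // LOGICAL_H  # flip the y axis
    return px, py

print(to_physical(0, 0))        # bottom-left  -> (0, 255)
print(to_physical(1279, 1023))  # top-right    -> (319, 0)
```

This is why `MOVE 0,0: DRAW 1279, 1023` draws the same diagonal regardless of which mode's physical resolution is active.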
  • Okay, so it isn't a nice multiple of good numbers, but here's the resolution I've come up with: 1320x992. The number of pixels on the screen is within approximately 0.1% of 1280x1024, so it should theoretically work with any monitor that supports 1280x1024.

    I have pasted a modeline below. The most important number for a lot of people is the dotclock (120 in the example). You can bring that down or up, depending on how high a refresh rate your system can handle. IIRC, this runs at about 60 Hz, but it may be a bit higher (65 or so). Please also realize that xvidtune may be of use.

    Also note that I'm not a genius when it comes to this stuff, and it could cause bad things to happen (though most modern displays can shut off when fed a bad signal..)

    Modeline "1320x992" 120 1320 1348 1516 1752 992 994 999 1036
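The refresh rate implied by a modeline is just the dotclock divided by the total pixels per frame; for the modeline above:

```python
# Vertical refresh = dotclock / (htotal * vtotal); horizontal = dotclock / htotal.
# Modeline "1320x992" 120 1320 1348 1516 1752 992 994 999 1036
dotclock = 120_000_000        # 120 MHz
htotal, vtotal = 1752, 1036   # last timing number of each half of the modeline
print(round(dotclock / htotal / 1000, 1))      # 68.5 (kHz horizontal)
print(round(dotclock / (htotal * vtotal), 1))  # 66.1 (Hz vertical)
```

So the mode actually runs at about 66 Hz, consistent with the poster's "maybe 65 or so" guess.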
  • "it should theoretically work with any monitor that supports 1280x1024."

    Whoops.. If you have a fixed-freq monitor, this probably won't work, but it should work for any multisync monitor. I came up with this because my monitor is only spec'd to do 1280x1024, and I didn't want to try my luck at a higher resolution (which would have probably forced me to use lower refresh rates).
  • 640x480 is the resolution of a North American TV screen, which is in a 4:3 ratio. In fact, you'll find that all the standard settings are in the same 4:3 ratio.

  • Interesting, but is that really the case? I just measured mine, and it has a 4x3 aspect ratio.
  • 320*200=64000

    I think that this mode was popular because the complete video memory fit into one memory segment. No need for page flipping a-la ModeX.

    I still remember the days of bypassing PSET and using my own Pixel Routines in QBASIC.

    SCREEN 13 'mode 13h: 320x200, 256 colors
    DEF SEG = &HA000 'the VGA segment
    'fill screen with red:
    FOR i = 0 TO 63999
        POKE i, 4
    NEXT i

    Those were the days. Direct memory access under DOS. Without fear of the BSOD, without fear of infringing on another process' memory... [drool]
  • Perhaps I'm wrong, but wasn't 1280x1024 the original size (for whatever reason) of the default X10/X11 desktop?

    WWJD -- What Would Jimi Do?

  • Matrox G200/G400 will do that mode nicely. Your monitor may not like it, however. I had to tweak my Sony 21/XF86Config for an hour to get it to sync up correctly. I had some luck with it on an ATI Rage Pro, but the card has so many non-redeeming qualities I'd shy away from it, even if it is cheap.

    OT, but I ran some benches of my new V3 3000 (replacing a burned out ATI) against my old SLI V2s.. The V2's kicked its ass, by up to 40%. I sincerely hope the 'Bigger, Badder' V3 models really are.. (Scariest part? I have 64M of video subsystem memory in there now)
  • Most games don't list all the other resolutions. If you have a GeForce 2, you can't play Homeworld or Half-Life in 1280x960 if it runs basically the same with a better view at 1280x1024. GeForce 2s still support 1280x960, but with that kind of video card, why not go all the way up to what your monitor supports? If you have a crazy high-res monitor, you're doubling that res, though.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • (Let's try that again)

    The comments about memory addressing make sense for the horizontal size, but not necessarily for the vertical. That said, it is definitely easier to get a 4040-like counter to reset every 1024 ticks than every 960 ticks - but this is trivial.

    I think the reason may be that older systems actually used 24-bit color, often with no underlay/overlay/alpha channel. This gives 3 bytes per pixel, for a total of 3.75 Meg for the display. This fits comfortably within a 4M framebuffer. The next higher multiple of 128 for the horizontal is 1408, with a 1056 vertical (for 4:3), and that is too much for a 4M framebuffer.

    I know that back in 1987 (and probably before), the SGI Iris (with a whopping 25MHz R3000 and 32M RAM!) had a 1280x1024x24bpp display. (of course, that was 24 bits color, 24 bits z-buffer, 2 bits overlay, 2 bits underlay, and another 24bit+24bit rendering buffer, for a total of 100 bits/pixel!)

    This may be another one of those "They're doing it, so we might as well, too" kind of things.
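The parent's framebuffer arithmetic checks out; a quick sketch (with 1408x1056 as the next 128-multiple step, per the parent):

```python
# 24 bpp framebuffer sizes against a 4 MB framebuffer:
# 1280x1024 fits; the next 4:3 step at a 128-multiple width does not.
FOUR_MEG = 4 * 1024 * 1024
print(1280 * 1024 * 3, 1280 * 1024 * 3 <= FOUR_MEG)  # 3932160 True
print(1408 * 1056 * 3, 1408 * 1056 * 3 <= FOUR_MEG)  # 4460544 False
```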

  • The simple answer is that while TVs have a 4:3 aspect ratio, PC monitors have a 5:4 aspect ratio. That's why DVDs always seem just a little stretched. Some decoder cards actually letterbox the display to retain the approximate ratio of actual TV screens, which inevitably results in scaling artifacts unless you're cranking it out at 1080i resolution.

    Anyways.. 1280x1024, in addition to offering memory-aligned scanlines as stated in every other comment, provides perfectly square pixels, which comes in pretty handy for graphics work.
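The square-pixel claim follows directly if the tube really is 5:4 (the parent's premise); a sketch:

```python
from fractions import Fraction

# Pixel aspect ratio = physical aspect of the tube / aspect of the resolution.
physical = Fraction(5, 4)       # assumed 5:4 tube, per the parent comment
res = Fraction(1280, 1024)      # reduces to 5/4
print(res, physical / res)      # 5/4 1 -> exactly square pixels
```

On a genuinely 4:3 tube the same arithmetic gives a pixel aspect of 16/15, i.e. slightly non-square pixels, which is the stretching the parent describes.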
  • Various Macintosh computers have supported various popular resolutions, but the one I always run into is Apple's 1152x870. When I render a 3D graphic as a full-size desktop picture, I must go out of my way to support 1152x870. I have to trim off 6 scan lines when moving it to a PC.

    Another is 800x600 (4:3) and Apple's 832x624. Apple supports 800x600 on some Macs, 832x624 on others, and both on other models.
  • by XoXus ( 12014 ) on Thursday August 03, 2000 @02:17PM (#881317)
    Ideally, you'd have 1280x960, but since 960 isn't a multiple of 128 (or 256), it messes up various hardware blitting methods.

    One of the popular VGA modes was 320x256 (which is in the 5:4 ratio). It meant that you could have an array of scanlines, and index them with a single byte (with no wastage).

  • by dvd_tude ( 69482 ) on Thursday August 03, 2000 @09:17PM (#881318)
    For mostly obscure hardware reasons, early DRAM/VRAM graphics controller implementations favored horizontal resolutions that were a multiple of the row size. 1280 is divisible by 256, the row size of a 64K DRAM.

    One of those obscure reasons was address translation. If you form the linear framebuffer address as (2048*y)+x, it made doing blt hardware much easier: just map x and y onto the appropriate row and column bits.

    Another of those reasons was being able to load the video shift registers at the same times each line. This made the timing control easier to do in the logic of the day (think MSI counters and gates.)

    Modern gfx controllers refresh the display using periodic burst DRAM access instead of actual shift registers, and they have hardware to help deal with the x-y to linear address translation. So the whole issue of row size pretty much goes away.
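The (2048*y)+x trick works because 2048 is a power of two: the multiply is just a shift, and x and y land in disjoint bit fields that can be routed straight onto column and row address lines. A sketch:

```python
# With a 2048-byte pitch, x occupies the low 11 bits of the linear
# address and y the bits above them, so no real multiplier is needed.
PITCH = 2048  # bytes per scanline (power of two, >= 1280 visible bytes)

def linear(x, y):
    return y * PITCH + x       # the textbook formula

def linear_bits(x, y):
    return (y << 11) | x       # same result: shift + OR, since x < 2048

x, y = 1279, 1023              # bottom-right pixel of a 1280x1024 frame
print(linear(x, y) == linear_bits(x, y))  # True
```

The cost is wasted address space (2048 - 1280 bytes per line), which is exactly the trade-off the comment describes the old blt hardware making.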


    "I'm ANN LANDERS!! I can SHOPLIFT!! " - Zippy
