Graphics Software

Viewers for Large Images?

mateub asks: "Before setting off to write something of our own, we have been looking for an image viewer that can deal with large (e.g. 10k by 10k pixel) CMYK TIFF images. Note that this is not necessarily the same thing as saying that the file is large, but usually it will be. A smart program could allocate enough memory to show the 1k by 1k pixels of a normal monitor and read other parts of the file when the user scrolls. Not fast, but functional. We've tried ImageMagick, and it isn't that smart--it runs out of memory even on my 1GB RAM, 4GB swap workstation. It appears The Gimp and xv can't even handle CMYK. Are there any programs that can display these images?"
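
For reference, the strip-at-a-time scheme described above maps naturally onto libtiff's strip interface, as long as the files are written in reasonably small strips. A minimal, untested sketch in C (error handling mostly omitted; assumes an 8-bit, contiguous-planar CMYK file):

    #include <tiffio.h>

    /* Decode only the strips covering rows [top, top + height) of a
       strip-based CMYK TIFF, instead of reading the whole image. */
    int read_window(const char *path, uint32 top, uint32 height)
    {
        TIFF *tif = TIFFOpen(path, "r");
        uint32 length, rows_per_strip, first, last, s;
        tdata_t buf;

        if (!tif) return -1;
        TIFFGetField(tif, TIFFTAG_IMAGELENGTH, &length);
        TIFFGetFieldDefaulted(tif, TIFFTAG_ROWSPERSTRIP, &rows_per_strip);

        buf = _TIFFmalloc(TIFFStripSize(tif));
        first = top / rows_per_strip;
        last = (top + height - 1) / rows_per_strip;

        for (s = first; s <= last && s * rows_per_strip < length; s++) {
            /* one strip = rows_per_strip full-width CMYK scanlines */
            if (TIFFReadEncodedStrip(tif, s, buf, (tsize_t) -1) < 0)
                break;
            /* ... hand the decoded rows to the display code here ... */
        }

        _TIFFfree(buf);
        TIFFClose(tif);
        return 0;
    }

A viewer built this way never holds more than the visible window plus one decoded strip in memory.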
This discussion has been archived. No new comments can be posted.

  • xzgv (Score:3, Interesting)

    by kraf ( 450958 ) on Thursday April 04, 2002 @04:43AM (#3282973)
    xzgv is my viewer of choice; it's fast and efficient, and it should be able to handle these big files.

    • Re:xzgv (Score:5, Informative)

      by sydb ( 176695 ) <[michael] [at] [wd21.co.uk]> on Thursday April 04, 2002 @07:28AM (#3283380)
      I was going to mod you up, but you could have quoted this:

      Now adapts rendering method for big images. When the number of pixels in the image exceeds the value set by image-bigness-threshold (as set in config file or on command-line, defaulting to 2 million pixels), it's drawn piece-by-piece on demand rather than all-at-once. The all-at-once behaviour is worth keeping around for smaller images, as it gives much nicer scrolling - but for big images it's just impractical, hence this feature.

      Which sounds like just the ticket. You could also have linked to the website [browser.org].

      Here ends sydb's lesson in karma whoring.
      • In all your, I must say, wonderful ability to karma whore, I must suggest a link to a most interesting site on the subject [cybernothing.org].

        Also, this site [linux.com] has some information on the software suggested: xzgv.

        Hopefully you will find this useful.

      • Hello again,

        Having looked at xzgv, it seems the main problem would be getting it running on HP-UX, Windows, and Linux, which are our platforms. I edited that important detail out of my original post, sorry.

        So I'm back to thinking that we could write something more easily with the Java Advanced Imaging library, or libtiff. What do you think?

        thanks,
        Mateu

        • Are you sure you can't run it on those? It should be OK on HP-UX, Windows... well, run it off a Unix box with Exceed or something?

          I've stopped using GQView since I read the original post...
    • Thanks, I had never heard of it before. I'll check it out.

      goodbye,
      Mateu

  • by Pathwalker ( 103 ) <hotgrits@yourpants.net> on Thursday April 04, 2002 @04:44AM (#3282974) Homepage Journal
    I seem to recall hearing from some people who create visual effects for movies that Discreet Combustion [discreet.com] is a very good software-only solution for working with extremely high resolution images.

    It's overkill because of the price, the fact that it is an editing tool, and the fact that it is designed to work with video at higher resolutions than the still pictures you are dealing with, but it would probably work for what you want if you can't find anything else...
  • Try ImageFX 4.0 running on Amithlon!

    http://www.novadesign.com/fxinfo.htm

  • by Matthias Wiesmann ( 221411 ) on Thursday April 04, 2002 @05:00AM (#3283022) Homepage Journal
    As far as I know, Photoshop is very good at handling huge images, although I don't know if it could handle such a monster. Photoshop also does CMYK. One of the main problems when handling very large images is that the default VM policy of the OS is not adapted to graphic data.

    Photoshop does its own VM to handle the memory needed for the image. As far as I remember, it needs four times as much disk space as the original picture (this is needed for things like filtering and undo).

    I don't know of a cheaper alternative that would do the trick. If I had to write such an app, I would look into mechanisms to install a custom pager that would handle the image data - using the image file as backing store - this is basically what you describe (loading the parts that are needed), but integrated with the VM mechanism.

    If the app needs some basic functionality like zooming, things are a little trickier: let's say you want to display a 1/10 zoomed-out version of the image; then you need to process the whole file to calculate the reduced image. As this is expensive, you would probably want to cache the result.

    This could again be done using a custom pager that simply discards the memory when the pages are swapped out and regenerates the needed pages when they are swapped in again.
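
    A minimal sketch of the no-zoom half of this idea: rather than a true custom pager, simply mmap() the pixel data and let the kernel's existing pager do the fetching and eviction. This assumes an uncompressed, contiguous dump of the pixels; a compressed TIFF would first have to be decoded into such a scratch file:

        #include <fcntl.h>
        #include <stdint.h>
        #include <sys/mman.h>
        #include <sys/stat.h>
        #include <unistd.h>

        /* Map an uncompressed CMYK dump (4 bytes/pixel). The file itself is
           the backing store, so untouched regions are never read, and clean
           pages can be evicted by the kernel for free. */
        uint8_t *map_image(const char *path, size_t *len)
        {
            struct stat st;
            int fd = open(path, O_RDONLY);
            if (fd < 0 || fstat(fd, &st) < 0) return NULL;
            *len = (size_t) st.st_size;
            uint8_t *base = mmap(NULL, *len, PROT_READ, MAP_PRIVATE, fd, 0);
            close(fd); /* the mapping stays valid after close */
            return base == MAP_FAILED ? NULL : base;
        }

        /* address of pixel (x, y) in a width-pixels-wide image */
        const uint8_t *pixel(const uint8_t *base, size_t width,
                             size_t x, size_t y)
        {
            return base + 4 * (y * width + x);
        }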

    • If they want to view zoomed out images often, and they want to go through the effort of writing their own tool, storing the data in a quadtree [nist.gov] would probably be a good idea to allow fast access to the data needed to generate the overview without having to process the whole file.

      Of course, it would suck if they wanted to manipulate the data, but for viewing only it might be a good idea.
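
      One level of such an overview is just a 2x2 box filter, applied recursively for the coarser quadtree levels. A sketch for a single 8-bit channel (a hypothetical helper, not from any tool mentioned in this thread):

          #include <stdint.h>
          #include <stdlib.h>

          /* Average each 2x2 block of one 8-bit channel into a
             half-resolution overview; apply recursively for coarser levels. */
          uint8_t *downsample2x(const uint8_t *src, size_t w, size_t h)
          {
              size_t ow = w / 2, oh = h / 2, x, y;
              uint8_t *dst = malloc(ow * oh);
              if (!dst) return NULL;
              for (y = 0; y < oh; y++)
                  for (x = 0; x < ow; x++) {
                      unsigned sum = src[(2*y) * w + 2*x]
                                   + src[(2*y) * w + 2*x + 1]
                                   + src[(2*y + 1) * w + 2*x]
                                   + src[(2*y + 1) * w + 2*x + 1];
                      dst[y * ow + x] = (uint8_t) (sum / 4);
                  }
              return dst;
          }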
    • As far as I remember, it needs four times as much disk space as the original picture (this is needed for things like filtering and undo).

      Almost right. The setting is adjustable under Preferences - Memory & Image Cache. Change the cache levels from 4 to, say, 1 if you're working with large images (e.g. 4000dpi medium-format RGB slide scans - ooooh, Nikon Coolscans rock [grin]). You can also set the amount of physical memory available to Photoshop here, as well as which disks to use for swapping under Plug-ins & Scratch Disks.

      Oh, it's also worth mentioning that you can purge this cache (and the clipboard, etc.) through "Edit - Purge - All", which is handy if you're already swapping out and are about to do a memory-intensive operation.

      I don't know of a cheaper alternative that would do the trick.

      It might be worth taking a look at Photoshop Elements [adobe.com]; I think you can get it for as little as $40 USD or so if you hunt around. I don't think it supports CMYK though, but then if you're working with press images you'll probably need the full Photoshop anyway.

  • by hanjo ( 82884 ) on Thursday April 04, 2002 @05:27AM (#3283085)
    If you decide to write the application yourself, and you are using an Intel architecture, then you might consider using the Intel Image Processing Library [intel.com].

    The library provides a set of low-level image manipulation functions in DLL and static form. Part of the API deals with tiling of big images, so that only the viewable fraction of the image is loaded into memory. The library comes with a demo app that demonstrates its capabilities, including the tiling of images.
  • http://www.enlightenment.org/pages/imlib2.html [enlightenment.org]

    Methinks Rasterman and the Enlightenment team may not have had this in mind, but if they did, it'll work.

    That previous post about the VM being important is spot on too.
  • by Bazzargh ( 39195 ) on Thursday April 04, 2002 @05:53AM (#3283148)
    The TIFF spec originally used striped chunks of image, i.e. exactly what you might expect to be dumped directly from the buffer of a fax machine or scanner. (Later they added square tiles, which made it easier to embed JPEG-encoded images in TIFF.) While it is technically possible to encode the whole TIFF as a single strip (of compressed image data), thus defeating any attempt at random access, the standard says you shouldn't do this and aims for strips that are small in terms of memory.

    This has nothing to do with how large the strips are in pixels, though. If your viewer wanted to load a 100-pixel-square region and the page width is 1M pixels, you would actually be forced to load at least 100x1M pixels into memory.
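
    To put rough numbers on that worst case (4 bytes per CMYK pixel, with a 128x128 tiled layout for comparison; illustrative figures only):

        #include <stdio.h>

        int main(void)
        {
            /* a 100x100 window on an image 1,000,000 pixels wide */
            unsigned long long img_w = 1000000ULL, win = 100, bpp = 4;

            /* strips span the full image width:
               decode win rows of img_w pixels */
            unsigned long long strip_bytes = win * img_w * bpp; /* 400 MB */

            /* 128x128 tiles: only the tiles the window touches (at most 2x2) */
            unsigned long long tile = 128, per_axis = win / tile + 2;
            unsigned long long tile_bytes =
                per_axis * per_axis * tile * tile * bpp;        /* 256 KB */

            printf("strips: %llu bytes, tiles: %llu bytes\n",
                   strip_bytes, tile_bytes);
            return 0;
        }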

    Bottom line: in general, TIFF is a poor format for very large images. Other image formats differ, but they, too, are often not designed for random access to areas of a much larger image. If you really have large images, you should consider storing them in a different format, or even creating a client-server viewer so you don't have to install a gig of RAM on the desktop of everyone who wants to look at the image. It depends on your app, though. For photoshopping, just add more RAM; for a database of x-ray images, you would probably want a format designed for random access to small areas by remote client machines, plus various thumbnail levels for image navigation.

    BTW, I deal with A1 scans, and I'm pretty sure the (proprietary and fairly crap) image viewer we use takes the client-server approach. I don't speak as an expert in this area, but I have written software to convert TIFFs to other formats in the past.

    -Baz
    • Hello,

      You are right, in general. Fortunately, we need the viewer for an application we are developing that outputs TIFF, so we can control the strip length as well.

      We will look into changing formats if necessary, but as this application is for in-house use and most of our users will have fast computers with lots of memory, reading in, say, 900 TIFF strips that are 10,000 pixels wide should be acceptable in most cases. Regardless, you are right that another approach would be better. I have thought about changing to tiles--it would be tricky, but possible, for this application.
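
      If we keep writing with libtiff, switching the writer from strips to tiles looks like it is mostly two extra tags; an untested sketch (error checking and edge padding omitted):

          #include <tiffio.h>

          /* Write an 8-bit CMYK image as 256x256 tiles instead of full-width
             strips, so a viewer can later decode only the tiles it needs. */
          void write_tiled_cmyk(const char *path, const unsigned char *pixels,
                                uint32 width, uint32 height)
          {
              uint32 x, y;
              TIFF *tif = TIFFOpen(path, "w");
              tdata_t buf;

              TIFFSetField(tif, TIFFTAG_IMAGEWIDTH, width);
              TIFFSetField(tif, TIFFTAG_IMAGELENGTH, height);
              TIFFSetField(tif, TIFFTAG_SAMPLESPERPIXEL, 4);
              TIFFSetField(tif, TIFFTAG_BITSPERSAMPLE, 8);
              TIFFSetField(tif, TIFFTAG_PHOTOMETRIC, PHOTOMETRIC_SEPARATED); /* CMYK */
              TIFFSetField(tif, TIFFTAG_PLANARCONFIG, PLANARCONFIG_CONTIG);
              TIFFSetField(tif, TIFFTAG_TILEWIDTH, 256);  /* multiples of 16 */
              TIFFSetField(tif, TIFFTAG_TILELENGTH, 256);

              buf = _TIFFmalloc(TIFFTileSize(tif));
              for (y = 0; y < height; y += 256)
                  for (x = 0; x < width; x += 256) {
                      /* ... copy the 256x256 region at (x, y) from pixels
                         into buf, padding the right and bottom edges ... */
                      TIFFWriteTile(tif, buf, x, y, 0, 0);
                  }
              _TIFFfree(buf);
              TIFFClose(tif);
          }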

      I will think about your client-server idea, though. Thanks.

      goodbye,
      Mateu

  • ACDSee is the image browser of choice when it comes to any kind of image type. It's fast, efficient, and performs many functions that are an absolute must for anyone who works with images or graphics.

    It supports just about any file type you can think of, including video and audio as well as compressed archives, and it even scans unrecognized files.

    It's available for PC and Mac, but the Mac version is more limited and slower than the PC counterpart. There's also a slimmed-down version of the original image viewer that is fast as lightning.

    check it out [acdsystems.com]


    • by b_pretender ( 105284 ) on Thursday April 04, 2002 @10:55AM (#3284046)
      I remember being very impressed with the size of images that ACDSee can handle. It was my image browser of choice at my old job, where the average TIFF files were ~35MB and the PGM files were much larger.

      ACDSee does choke, however, if the directory has a few thousand images in it. We were training image recognition algorithms, and a typical truth category might have as many as 10,000 images. With that many images in a directory, ACDSee became unusable, because it would constantly update the directory contents, file sizes, and file types (on a network drive) and wouldn't accept user input until it was finished.

      • Also, several functions of ACDSee will choke if you feed it more than 32k images (think of the "Find Images" function, for example).
        The "Duplicate File" finder won't even load more than 32k images.

        OT: does anyone know of a good "find duplicate files" tool?
        • For the same job that I commented on above, I wrote a Perl script to find duplicate files. It worked well. If I remember correctly, it was based on a similar program in the "Perl Cookbook" (Christiansen). It would output a text file, which I would then cat in a shell script to modify the duplicated files (e.g. rm them). One problem was that the IRIX 'rm' (or any shell command) barfs if it gets passed too many arguments (i.e. 32k filenames). This wasn't too hard to work around.

          Conclusions:
          1. GNU tools are the shiznit when it comes to managing large batches of data (the CLI beats the GUI every time).
          2. Even when GNU tools are awkward to use (i.e. with 32K files), there are workarounds, and they still blow the pants off GUI file management tools.

          I used to get all upset, because NONE of the "file managers" work worth a darn when it comes to actually managing files!

  • While this topic is well outside my usual area of (in)expertise, I think these guys' [lizardtech.com] products might be able to help you out... that is, if you are not absolutely tied to the TIFF format for your viewable files. As others have pointed out, TIFF is a poor format for very large images.

    Anyway, LizardTech's technology is used by places like the USGS and the Library of Congress to allow instant access to very large scanned maps and other documents. In particular, you might want to check out their software aimed directly at working with big photos [lizardtech.com].

    • I've seen a demo of their software in action: an entire CD-ROM (or more) of overlapping, geographically positioned images (aerial photography, in this case) could be viewed with their software. You could pan and zoom in near real time, and this was about 4 years ago! Their software uses wavelet compression.

      The software was only on Windows at the time, but according to their download page [lizardtech.com] they have both a browser plug-in and standalone viewer for Linux.
      • It's also a wavelet system.

        The difference is that the free ECW SDK can do decompression of any size and compression of images up to 500MB. If you want to compress anything larger than 500MB, you have to buy the software.

        Last I checked, MrSID's free kit could only decompress; any compression required buying the software.

        ECW is a little easier to try out.

        ERMapper [ermapper.com]
      • I've used the Linux plugin; it works great.
  • If you are considering commercial products, RealTimeImage [realtimeimage.com] is certainly worth a look. They provide a collaborative application based on a networked viewer that can work with really large images very fast.

    Have a look at their demo [realtimeimage.com]. Viewing 0.5GB images on your local computer is very impressive.

    • We use the RealTime plugin for one of our websites, and it's ideal for showing large images (we use it for Scitex Brisque-ripped PDF files). It's got 'notation' and 'color dropper' tools, which allow a certain degree of manipulation online. OK, it's a lil' bit slow on a 56k modem, but a few seconds' delay to view a 120MB PDF file ain't that much!
  • Large Image Reader (Score:2, Interesting)

    by sparkst24 ( 442777 )
    Check out Remote View [sensor.com]. It's a tool designed for really large satellite images, but it can be used for any standard type of image. It is optimized for HUGE images, upwards of 1GB apiece. I've had a lot of luck with it.
  • If you're on an SGI, use the ImageVision libraries/tools. We were using them to process big images (60K x 20K x 5, 16 bits per channel) nearly 10 years ago. ImageVision reads only as much of the image as is needed and caches some of the tiles, so going back over an area won't require another disk read. If the output is to an SGI video card, some image processing operations are hardware accelerated. ImageVision homepage [sgi.com]

    • Of course, 90% of what you could do ten years ago on a high-end SGI can now be done on a palmtop or generic PC, because Moore's law has made both CPUs and graphics processors much faster.
  • Regardless of platform, I've successfully dealt with 24x36-inch, 300 ppi CMYK images before... usually movie poster files with multiple layers, but a flattened TIFF should be even easier to work with.
  • Irfanview (Score:2, Informative)

    by orty.com ( 81360 )
    If you're on a Wintel platform, Irfanview (www.irfanview.com [irfanview.com]) is a great freeware tool that can open damn near any image format (including some that other viewers can't) and is a small download. I've opened some pretty dang big images in it (a 42MB JPEG that was ungodly huge), and it loaded fine. It can also open CMYK images and do basic manipulation: resizing, color adjustment, etc. I've been using the program since version 1.2 (it's at 3.61 now) and love it, and I recommend it to anybody who needs a simple graphics viewer (it's a helluva lot faster than ACDSee).
  • Photoshop 6 opens 21600x21600 TIFFs in about 2-3 minutes on my Athlon 1.3 with 384MB of DDR. Once it's loaded, you can zoom all the way in and out at will with very little lag.

    Check out the Blue Marble [nasa.gov] satellite images from NASA! They're huge and really sweet. You can see individual sand dunes in the Sahara!

  • Is that the kind of thing you had in mind?
    If so, cf:

    http://www.lizardtech.com/
  • The short answer is: use a command-line tool like "tifftopnm" to convert into RGB color space; it should do the conversion pixel-wise, so it doesn't need much memory at all. The resulting file will be about 300MB uncompressed (10k x 10k x 3 bytes). Once you are in RGB space, many image viewers should work (ImageMagick probably won't--it's not very efficient). The Gimp in particular uses a tiling cache.

    If you are going to do this sort of thing a lot, don't use TIFF. There are a few tiled formats around for GIS applications, and hierarchical formats like JPEG2000 or PhotoCD should also be a reasonable choice (though there isn't much tool support yet).

    Of course, rather than writing your own, if you really need CMYK, why not add CMYK support to the Gimp? The Gimp already does the tiling and has lots of other useful features.

    • Well, the Gimp also has the problem for us of requiring lots of work to get it running on HP-UX, and we would also need something for Windows. The Gimp's web page does not inspire confidence in their Windows stability--or are they just being modest?

      TIFF has a tiled format as well. Is there something better about the JPEG2000 or PhotoCD formats than the TIFF one? Especially given that TIFF tool support is quite broad.

      But really, format elegance doesn't matter if no intelligent viewer exists that knows not to try to read the whole file into memory. Are there viewers like this for JPEG2000 or PhotoCD?

      From what I read a few months ago, adding CMYK support to the GIMP, while desirable for the whole graphic arts industry, would be heroic amounts of work. Has the situation improved? Is anyone working on it? How would that compare to the work needed, say, to use the Java Advanced Imaging library (here [sun.com]) to write a viewer?

      goodbye,
      Mateu

      • The Gimp's web page does not inspire confidence in their Windows stability--or are they just being modest?

        I've always found the Win32 port to be rock solid, but I'm not a graphics professional, and I only run 98. I've never tried it on other platforms.

        Though if all that's needed is a viewer, not an editor, Xzgv is great for big images. GTK has been ported to Windows (enough for the Gimp to run, anyway), and it's probably less work to add CMYK to such a small program than to the Gimp.

        With Xzgv on my puny K6-400 with 384MB RAM, I can load 50MB JPEGs and TIFFs in under 30 seconds, and panning is done in real time. A modern workstation should handle bigger files even faster. It might be worth the effort to adapt it, if adding CMYK doesn't break the speed.

      • TIFF has a tiled format as well. Is there something better about the JPEG2000 or PhotoCD formats than the TIFF one? Especially given that TIFF tool support is quite broad.

        Both JPEG2000 and PhotoCD are not merely tiled, they are hierarchical: you can extract a complete, scaled-down view of the image without ever looking at the high-resolution version. You can also make some global adjustments (color, cropping, etc.) without touching every pixel. And for viewing detail, only small portions need to be decompressed.

        From what I read a few months ago, adding CMYK support to the GIMP, while desirable for the whole graphic arts industry, would be heroic amounts of work.

        It doesn't look that hard to me. The Gimp already has support for arbitrary numbers of channels, and CMYK is just a four-channel image. All you would need to do is write code that loads and saves it, plus some code that converts between CMYK and RGB on the fly for display and for RGB-based filters.
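
        The display conversion really is only a few lines if you skip color management entirely; the usual naive formula (fine for an on-screen preview, but real proofing needs ICC profiles):

            #include <stdint.h>

            /* Naive CMYK -> RGB, 8 bits per sample, using the TIFF
               convention that 0 means no ink. Preview quality only. */
            void cmyk_to_rgb(const uint8_t cmyk[4], uint8_t rgb[3])
            {
                unsigned k = cmyk[3];
                rgb[0] = (uint8_t) ((255 - cmyk[0]) * (255 - k) / 255);
                rgb[1] = (uint8_t) ((255 - cmyk[1]) * (255 - k) / 255);
                rgb[2] = (uint8_t) ((255 - cmyk[2]) * (255 - k) / 255);
            }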

        I think the main reason this hasn't happened is that nobody who understands color can figure out why anybody would want to do such a thing, or even what "viewing or editing CMYK" is supposed to mean. Just convert to RGB and use RGB tools; conversions to CMYK are best left to the device drivers that actually do the printing. What the Gimp needs much more than CMYK is 16-bit support, IMO.

        Well, the Gimp also has the problem for us of requiring lots of work to get it running on HP-UX, and we would also need something for Windows. The Gimp's web page does not inspire confidence in their Windows stability--or are they just being modest?

        I have had no problems with the Gimp on Windows, but it is a slightly older version. Why not just download it and give it a try? As for HP-UX, "configure" generally works pretty well, but HP-UX is also pretty weird.

        Your best bets for high-end graphics (unless you want to pay an arm and a leg) are MacOSX, Solaris, and (for free or research software) Linux.

  • by t ( 8386 )
    The tools aren't quite up to snuff, but what would be ideal is to re-encode the TIFFs using JPEG2000's lossless (integer wavelet transform) mode, probably specifying no color transform (the spec only really talks about RGB-to-YUV). Since it is lossless, there are no concerns over data. JPEG2000 was designed for random spatial access, with automagic support for viewing the image at lower resolutions in powers of two (e.g. 256 pixels, to 512, to 1024, etc.).

    It's possible that you could use one of the source-available implementations from jpeg.org, like JasPer, as a backend. You can specify command-line options to get reduced-resolution versions from the encoded files. This has the benefit that you'll never have to decode the data for the higher resolutions if you don't need them.

    t.

  • There is a reason why graphics professionals use them. I just pushed a 25000 x 25000 pixel CMYK test through my G4 400: PS7 b43 under 10.1.3 worked just fine. Chunky, but my machine is a few years old. BTW, that's a 2.3GB file.
  • Just FYI, Photoshop will support any image up to 30,000x30,000 pixels in size; I've done it personally on a 512MB box.

    For 10k^2 with 1GB of RAM, like you say, it shouldn't be a problem (after all, 10k^2 CMYK is only 381MB uncompressed).

    Stick with an industry standard, full-featured program unless you can't.

    MadCow.
  • Look around for Irfanview. Not only is it free, but it does a good job of displaying just about any image format. Of course, at 10k x 10k, even it might choke on the file... I've never tried it with something so large.

    BTW: What OS?
  • If you are comfortable working with Java, and feel like writing up your own viewer application, check out Java Advanced Imaging [sun.com].

    It offers some pretty cool tile caching strategies for displaying large images. We've written an app that opens images on the order of several hundred MB, and it handles them pretty well.

  • When I was doing my undergraduate studies in physics, I used a program called I.R.A.F. (http://iraf.noao.edu/iraf-homepage.html) to process large astrophysical images. It doesn't support TIFF (yet), but if you can convert the file format, maybe you could use it to view/manipulate your image. You can download the program from the NOAO page for almost any UNIX flavour.
