What Has Happened To Fractal Image Compression?
Dennis Thrysøe asks: "In 1995/1996, Iterated Systems (Michael Barnsley) made a program that compressed and decompressed images with fractal compression technology. It was for real, pretty fast, and it really worked. It was even free, unless you wanted to use their libraries to build your own program that compressed images. What on earth happened to this field of technology? You can still find the same old version of 'Fractal Imager 1.1', but has it been developed since? Has anybody else implemented anything (open/free) that really works? Fractals / iterated function systems are REALLY amazing for compressing images, so why aren't they being used more?"
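For readers who haven't seen how it works: fractal (partitioned IFS) compression stores, for each small "range" block of the image, a pointer to a larger "domain" block plus a contrast/brightness map, and decoding just iterates those contractive maps from any starting image until they converge. Below is a toy sketch, not Iterated's patented algorithm; the fixed contrast of 0.5, the non-overlapping domain pool, and the absence of rotations/flips are all simplifying assumptions:

```python
def avg_pool(img, x, y, size):
    # Shrink the (2*size x 2*size) domain block at (x, y) to size x size
    # by averaging each 2x2 group of pixels.
    return [[(img[y + 2*j][x + 2*i] + img[y + 2*j][x + 2*i + 1] +
              img[y + 2*j + 1][x + 2*i] + img[y + 2*j + 1][x + 2*i + 1]) / 4.0
             for i in range(size)]
            for j in range(size)]

def encode(img, rsize=2):
    # For every rsize x rsize range block, record the best-matching
    # domain block and an affine (contrast, brightness) map onto it.
    n, dsize, code = len(img), 2 * rsize, []
    for ry in range(0, n, rsize):
        for rx in range(0, n, rsize):
            rng = [row[rx:rx + rsize] for row in img[ry:ry + rsize]]
            best = None
            for dy in range(0, n - dsize + 1, dsize):
                for dx in range(0, n - dsize + 1, dsize):
                    dom = avg_pool(img, dx, dy, rsize)
                    s = 0.5  # fixed contrast < 1 keeps every map contractive
                    o = (sum(map(sum, rng)) - s * sum(map(sum, dom))) / rsize ** 2
                    err = sum((rng[j][i] - (s * dom[j][i] + o)) ** 2
                              for j in range(rsize) for i in range(rsize))
                    if best is None or err < best[0]:
                        best = (err, dx, dy, s, o)
            code.append(best[1:])  # (dom_x, dom_y, contrast, brightness)
    return code

def decode(code, n, rsize=2, iters=12):
    # Start from ANY image and repeatedly apply the stored maps; the
    # contraction property drags it toward the encoded attractor.
    img = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        out, k = [[0.0] * n for _ in range(n)], 0
        for ry in range(0, n, rsize):
            for rx in range(0, n, rsize):
                dx, dy, s, o = code[k]; k += 1
                dom = avg_pool(img, dx, dy, rsize)
                for j in range(rsize):
                    for i in range(rsize):
                        out[ry + j][rx + i] = s * dom[j][i] + o
        img = out
    return img
```

Note that the encoder searches every domain for every range block while the decoder just applies the stored maps, which is exactly the asymmetry complained about later in this thread: encoding is slow, decoding is fast.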
No good algorithm for compression (Score:1)
Re:I think its the technology. (Score:1)
It just doesn't make sense for me (or, I think, for your average computer user) to compress image files, given the currently available hardware.
That argument makes absolutely no sense. JPG and PNG are two examples of compressed image formats which are used extensively today, in a world of gigahertz processors and tens of gigabytes of storage on a typical machine. People still compress files to transfer them (self-extracting archives, .zip, .rar and .tar.gz), even though their connections are many times faster than before.
I believe the question the author wanted answered is what happened to fractal compression? Tech seems to have locked on to wavelet compression (for images, I believe), and still uses LZW and the more "normal" forms of compression for binary files.
Personally, the only fractal compressor I ever saw was around '95 or so, and all it did was store the compressed file as bad sectors on the disk and indicate where and how many were in the "fractally-compressed archive" itself. OLW, I think it was called.
Re:OLW (?) (Score:1)
I don't know whether that's the acronym or not, but I'm pretty sure I ran across that program. Fractal data compression on the order of 100:1, almost as quick as pkzip, [this was around '93, I think...I was young and naive then :( ] and all of this by way of simply indexing the "compressed archive" to the existing larger file, and padding it with random info. Soo.... if you deleted the old file, poof, no more working archive. And that's, I think, where tzanger got the "bad sectors" thing; rather than giving "hey, you deleted the file I needed to 'restore' your archive" as an error message, it spat some hoopla about file corruption and bad disk sectors.
Yup; that's the one. Was it really '93? Wow, that kinda dates me, hehe... I remember downloading it skeptically (back at 2400 baud...) and running it on a test program. I'm pretty sure it hid the original file (not deleted it, as I'd originally stated) and then reported wondrous compression. I viewed the "archive" with good old Norton Commander (I still use this program today, and its Linux counterpart, Midnight (I call it Morton) Commander) and wondered how the hell it could store a 300k program in approximately 27 bytes (I believe all the archives were about 27 bytes long, regardless of the original size), when 15 of those bytes were taken up by the path and filename.
Wow this is starting to bring back memories... 232 characters a second at a time. :-)
Wavelets (Score:1)
Anyway, I guess this is why most interest is in wavelets now -- and I saw a JPEG2000 implementation listed on freshmeat the other day!
Fractal compression didn't really work well. (Score:1)
fractal (Score:1)
For mainstream acceptance it needs browser support (Score:2)
Apart from that, fractal image compression probably is being used all 'round the place, just in internal applications and closed communications that you and I don't see.
I vote for the "shiny-thing!" explanation. (Score:3)
I have doubts about that. The license would have been for one specific type of fractal compression; however, the idea was known to enough people that if licensing was a big issue, several other groups would have created their own replacement based on any of a variety of algorithms. I'm sure I'm not the only hobbyist who played with my own compression programs in those days.
I think that the first poster's explanation is the most plausible. Years ago, fractal compression was the latest toy that programmers played with in their free time. Now it's wavelet compression, or something other than compression entirely.
There will still be people playing with fractal compression in their free time. If it's useful, we may see results eventually.
uhm... (Score:1)
Re:Inventor `sitting on patent' (Score:1)
Re:OLW (?) (Score:1)
God, that's way too much information on such a tidbit. I just hope it's more informative than it is off-topic.
Fractal Image Compression - massively over hyped (Score:1)
1. lots of marketing claims on performance and availability that didn't hold up
2. not very robust; certain images created visible artifacts. The algorithm needed more work and less marketing
3. highly proprietary implementation prevented more people from playing with it or improving it
4. complex and difficult mathematics, not taught in very many places, limited the ability to incorporate it into custom hardware
5. did I mention the heavy BS factor?
Photoshop Plugin (Score:2)
I sometimes use it to send high-quality images by email; you just have to be sure the person you are sending to has the same program.
Not sure what happened to Iterated's own app, although I seem to remember reading on their site way back that they were working with someone else on a streaming video application for it?
Maybe they sold it, or maybe just shut up about it till they had their final product (and if there are better methods now it could just have been dropped).
Inventor `sitting on patent' (Score:1)
I also remember that the _compression_ algorithm was, erm, slooooow. So slow that it needed some huge machines chugging for a long time to spit out one fractally-compressed image. But the decompression was pretty quick (although slower than jpeg).
There was a floppy with a quiz using JFIF pictures a few years back on one of the UK PC magazines.. shame I haven't still got the floppy, as the pictures were _very_ nice quality and incredibly compressed.
Re:Inventor `sitting on patent' (Score:1)
I completely agree, as this is pretty obvious. However, the amount of math involved in generating the compressed images was prohibitive, so the bits saved transmitting or storing the image(s) could easily be outweighed by the additional cost of compressing the image to achieve that saving.
Besides, at the time, the algorithm they were using was supposed to be `highly optimized' by some very clever maths types, but it still needed several days of runtime on a Cray (this was a good few years ago) to compress a single image. I'd call that a prohibitive cost.
Too proprietary-minded (Score:1)
Their image compression claims were very overblown, and their examples were always the same few images that compressed really well (that iguana image being the most prominent).
The lesson here for new codecs is wrap it in QuickTime, and then the content-holders can try it out with very little effort.
Re:I think its the technology. (Score:2)
You are compressing (or at least decompressing) images. JPEG and GIF and PNG and others are all compressed image formats. If they weren't, you would be downloading something more than ten times as much data to view every webpage.
Example: The Slashdot logo, top left on every page. It's a 3,473 byte GIF. Uncompressed, it's 59,848 bytes.
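To put numbers on the logo example (figures taken from the sentence above):

```python
# Compression ratio and savings for the quoted Slashdot logo GIF.
compressed_bytes = 3_473
uncompressed_bytes = 59_848

ratio = uncompressed_bytes / compressed_bytes        # times smaller
savings = 1 - compressed_bytes / uncompressed_bytes  # fraction of bytes saved

print(f"ratio:   {ratio:.1f}:1")    # 17.2:1
print(f"savings: {savings:.1%}")    # 94.2%
```

So GIF's LZW coding alone saves roughly 94% of the bytes on that one small image, before any lossy scheme like fractal or wavelet coding enters the picture.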
Even with today's technology, data compression is a BIG DEAL, for images or anything else.
I think that the licensing issue *might* be what killed it, or at least stunted its growth. Would we be using GIFs so much had Unisys been a tyrant about its little patent back when GIFs were catching on? (I sure hope Unisys wasn't a tyrant back then, because otherwise my argument is dead...)
Wavelets vs. Fractal Image Compression (Score:1)
- Rei
Elaboration (Score:1)
When asking the above question, I actually hoped that someone who knew a lot about the licensing and patents in this field would come forward and fill the rest of us in. I know for a fact that Michael Barnsley from Iterated (former Iterated?) has a bunch of patents on this.
Does that in effect mean that nobody else can implement an open (or commercial for that matter) library based on IFS?
How to do it is pretty accessible knowledge, but I don't know if it's legal at all.
-dennis
Photoshop plug-ins in use today (Score:1)
It's called Genuine Fractals 2.0, and it's made by the Altamira Group [altamira-group.com]. It's also a Photoshop plug-in. They make some pretty huge claims, but I've seen this stuff work, and if their output is anything near as good as Mr. Sid, it should be sweet.
Cellular Automata Transform compression (Score:1)
I think its the technology. (Score:1)
My 1995 computer:
66MHz 486
8MB Ram
1GB HD
3.5" 1.44MB Floppy
5.25" Floppy
4x CD-ROM
Slow (I forget how slow) modem.
My current computer:
533MHz Celeron
128MB RAM
13GB Hard-Drive (Two Drives)
3.5" Floppy
4x4x32 CD-RW
T1 Internet Access (College LAN)
In 1995, I couldn't easily transfer or store large files. The hassle factor of compressing and decompressing files was negligible compared to the hassle factor of transferring the files to my PC and storing them on removable media. Today, I can transfer large files to my computer over a number of network protocols, and can store them on a CD-R very easily. It just doesn't make sense for me (or, I think, for your average computer user) to compress image files, given the currently available hardware.
Re:I think its the technology. (Score:1)
Re:I think its the technology. (Score:1)
Re:I think its the technology. (Score:1)
Re:Inventor `sitting on patent' (Score:1)
..._compression_ algorithm was, erm, slooooow.
A fast compression algorithm is not as important as fast decompression, because compression happens only once, when the image is created, while decompression happens every time the image is displayed. Because of this, the compression algorithm was probably not heavily optimized.
Windows happened.... (Score:1)
But this was in the DOS days. When we wanted to develop Windows 3 software, there were no decompression libraries, so we had to use something else as a stopgap: JPEG for large images and simple RLE for small images (this was for an image database on CD-ROM project).
And then when the Windows software finally came out, it was terrible: images that used to decompress in memory in a fraction of a second now took 10, 20 or 30 seconds to decompress! This was no use whatsoever, so we just gave it up as another good technology gone bad, where the developers they hired after the initial startup just had no idea....
Tim