FEAD Compressing Compressed Files by 50-75%?
An anonymous reader asks: "I just installed Acrobat Reader and found that it was using FEAD, which claims: 'FEAD© Optimizer© significantly reduces the size of application programs on average by 50% (in some cases up to 75%, depending on the specific software), even when they are already compressed with common compression technology like ZIP or CAB.' It seems that they optimize each application individually at their labs. But an average of 50% compression on already-compressed binary files seems too good to be true. Is anyone familiar with how someone might achieve this?"
Old compilers (Score:1)
Other than that ? Probably just marketing.
Re:Old compilers (Score:2)
This is a common hoax. (Score:2, Informative)
Re:This is a common hoax. (Score:2, Funny)
It's not compression (Score:5, Informative)
The thing they tout as FEAD is basically a load-over-network-on-demand thingy. They haven't actually developed anything that does compression; they're just storing some of the app on a server somewhere to be downloaded on demand. The hype at their site misled you, like it was meant to.
Re:It's not compression (Score:4, Informative)
That is NOT what is being discussed here.
Even if you bypass using the download manager, it still uses FEAD to decompress and install AcroRead.
One could easily disprove your theory by unplugging their net connection during the FEAD decompression... Done... no adverse effect.
Nonetheless, the installer is VERY slow, and is still bigger than the AcroRead 5.1 installer, which did not use FEAD.
Making users go through this many steps (download the download manager, run it, wait for it to download, wait for FEAD...) with this much slowness is insane.
Re:It's not compression (Score:2)
Well, the information on FEAD available by going where the
Compression is easy (Score:5, Funny)
You can compress all your files down to a single bit using this patented two step process:
1. Discard all zeros.
2. Use one to represent any length sequence of ones.
This is as reliable a compression scheme as most backups to tape I've ever seen, and you can fit a huge number of files onto a single floppy.
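Played straight, the "scheme" above is a one-liner, and a quick sketch shows why it can never be undone: every nonempty input collapses to the same single bit (function name made up for illustration).

```python
def joke_compress(bits: str) -> str:
    """Step 1: discard all zeros. Step 2: one '1' stands for any run of ones."""
    ones = bits.replace("0", "")   # discard all zeros
    return "1" if ones else ""     # any number of ones becomes a single '1'

# Two different files "compress" to the same bit, so decompression is hopeless.
print(joke_compress("1101101001001"))  # '1'
print(joke_compress("1111111"))        # '1'
```

The pigeonhole principle in miniature: many inputs, one output, no way back.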
Re:Compression is easy (Score:1)
Re:Compression is easy (Score:1)
Re:Compression is easy (Score:1)
Re:Compression is easy (Score:5, Funny)
First, you expand the 1 to the requisite number of 1s:
1 -> 1111111
Then, reinsert the 0s:
1111111 -> 1101101001001
Thus, 1 -> 1101101001001
I have a better algorithm (Score:3, Funny)
Re:Compression is easy (Score:1)
Re:Compression is easy (Score:1)
Uhm, yeah. The original post was a joke? It sounds like an algorithm that should go into LZip.
--Joe
Re:Compression is easy (Score:2)
Am I the only person who looked down at his t-shirt for comparison?
Re:Compression is easy (Score:1)
Re:Compression is easy (Score:2)
Re:Compression is easy (Score:1)
Of course, the player itself is 65 gig...
(guess I shouldn't have used Delphi)
Marketing obfuscation (Score:4, Insightful)
From the article: "Netopsystems specialists combine and customize these tools and processes for each individual software product so that optimal size reduction results are achieved."
Note the following from the whitepaper: "Usually software producers compress their data by generating cabinet files or the like...Applying a conventional compression tool like WinZip or WinRAR on such data does not lead to appreciable - often negative - results."
Read strictly, this says what we know: compressing a compressed file generally doesn't work. They aren't saying they compress the compressed file here.
Note that towards the bottom, they are comparing 'lossless compressed' data to what they do.
So, here's my bet: they probably do something like crack open a cab or zip and parse a PDF, for example, for 'magic things' that can be ignored without changing the functionality ('lossy', but nothing of significance lost), or take an HTML file and strip all spaces and newlines between tags. Similar things could be done for other file types: removing quotes and instead magic-quoting commas in a CDF, etc., ad infinitum.
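The inter-tag whitespace trick is easy to sketch. Here's a minimal (and deliberately naive) version; a real minifier would have to leave `<pre>` blocks and other significant whitespace alone:

```python
import re

def strip_between_tags(html: str) -> str:
    # Collapse runs of whitespace that sit between a closing '>' and an
    # opening '<'. Rendering is unchanged for most (not all!) markup.
    return re.sub(r">\s+<", "><", html).strip()

doc = "<html>\n  <body>\n    <p>Hello</p>\n  </body>\n</html>\n"
print(strip_between_tags(doc))  # <html><body><p>Hello</p></body></html>
```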
All in all, it's lame, but so is most software.
If you have a gigantic amount (hundreds of gigabytes to terabytes) of different files to back up or move around, with so many file formats that you can't keep them straight, then it might be worth it. If you are lazy and it's cheap, it might be worth it. Other than that, I fail to see the real utility here: disk is cheap, bandwidth is getting cheaper, and reasonably assuming the bulk of this data is generated (an adequate assumption), you can do very similar things by fiddling around with the output formatting in code.
J
Re: Marketing obfuscation (Score:2, Funny)
> Similar things could be done for other file types: Removing quotes [...]
When I have files with lots of quotes in them I reduce the size by using single quotes instead of double quotes.
Re: Marketing obfuscation (Score:2)
Obligatory lzip mention (Score:2)
Lossy compression eh? Get LZip [sourceforge.net] for all your lossy file compression needs! It can reduce
Re:Spaces/newlines in HTML add up QUICKLY (Score:1)
Re:Spaces/newlines in HTML add up QUICKLY (Score:2)
My moral is that HTML from MSOffice is bloated like crazy and sucks badly in many other ways. I made a nice FAQ file, about 100k total, with some very simple CSS as the only styling. Another guy uses Word to edit it, it goes to 180k, and is so full of nested tags and other markup that it's impossible to edit the source any more.
How to do anything (Score:5, Funny)
"Lying through one's teeth" comes to mind...
Re:How to do anything (Score:2, Informative)
Download on demand:
Think QuickTime/IE/etc.: download a small stub, then it downloads the components they want. I expect they'd also use a protocol similar to rsync, which makes downloading a lot faster.
Code Inspection:
This is where they say they can decrease the size of the executable. If you've done programming then you know that you can make executables a lot smaller. Here's some examples: Remove inline macros and make th
i know this one (Score:3, Funny)
Wow. (Score:3, Funny)
Re:Wow. (Score:4, Funny)
It may be difficult to read, but it sure is easy to FEAD© !!!
Re:Wow. (Score:4, Funny)
Re:Wow. (Score:2)
(c) 2003 Cgenman (tm) All rights reserved.
Re:Wow. (Score:2)
EXE compressor? (Score:3, Informative)
Re:EXE compressor? (Score:2)
Re:EXE compressor? (Score:3, Insightful)
zip an
this is why people zip up movie files on their sites; it does make a difference, and if you only save one meg on your 40 meg movie and 1,000 people download it, you just saved yourself one gigabyte of transfer fees, during
Re:EXE compressor? (Score:2)
-rw-r--r-- 1 tort tort 337648 Jun 11 02:27 libslang.so.1.4.4
-rw-r--r-- 1 tort tort 149671 Jun 11 02:26 libslang.so.1.4.4.gz
-rw-r--r-- 1 tort tort 148000 Jun 11 02:30 libslang.so.1.4.4.gz.gz
-rw-r--r-- 1 tort tort 148051 Jun 11 02:31 libslang.so.1.4.4.gz.gz.gz
Double compressing saved 1671 bytes. On triple compressing, that's when you end up with a net loss. You learn something new ev
Re:EXE compressor? (Score:2)
First zip, no compression.
Second zip, zip the first zip.
Can be significantly smaller than compressing on the first zip.
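The reason this works is that compressing the files one at a time (as zip does) can't exploit redundancy *between* files, while compressing one big uncompressed archive can. A small sketch with zlib (the "file" contents here are made up):

```python
import hashlib, zlib

# Two "files" that share most of their content, like two builds of the
# same binary. The shared chunk is pseudorandom, so it is incompressible
# on its own.
common = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(32))
file_a = common + b"version 1"
file_b = common + b"version 2"

per_file = len(zlib.compress(file_a, 9)) + len(zlib.compress(file_b, 9))
solid = len(zlib.compress(file_a + file_b, 9))  # "tar first, then compress"

print(per_file, solid)  # the solid archive is roughly half the size
```

This is the same reason `tar | gzip` often beats per-file zip on trees of similar files.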
Re:EXE compressor? (Score:2)
> First zip, no compression.
Or use another uncompressed archive (e.g. tar)...
> Second zip, zip the first zip.
> Can be significantly smaller than
> compressing on the first zip.
Yes. That's why a
You do lose something, though; a
Re:EXE compressor? (Score:2)
Re:EXE compressor? (Score:2)
Re:EXE compressor? (Score:2)
Translation : It didn't work anymore, but I recovered 10% of the disk space that it used to use.
Hell that's easy, just go to any random directory, pick the largest file in there and delete it. Pretty much the same result.
Re:EXE compressor? (Score:2)
For the non-developers who can't run "man strip": "strip" removes the *optional* fluff added to a binary executable (including libraries). That fluff is really only useful when debugging the code, something a user doesn't do that often and that certainly shouldn't be done on a production box, so strip will remove it for you. It does not stop the executable from running. When you
Re:EXE compressor? (Score:2)
Re:EXE compressor? (Score:1)
Re:EXE compressor? (Score:2)
A production box probably does want debug info--when things start going awry there you want to track them down _fast_. Attaching a debugger to the misbehaving app can save tons of downtime.
Ideally you wouldn't have any bugs on the production machine, but in the real world...
Sumner
Re:EXE compressor? (Score:2)
In short: Executable files are far more than just streams of bytes.
Compressing data exe compressors don't (Score:4, Interesting)
I suspect these guys are going in and manually altering the code to perform a decompression. This would certainly produce a benefit.
Here's something for you to try: Take an executable and zip it. If it compresses, then there's probably SOME give in it. And most executables I see are compressible.
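You can check this on whatever binary is handy; here the running Python interpreter stands in for "an executable", and zlib stands in for zip's deflate (a sketch, not a measurement of FEAD):

```python
import sys, zlib

# Read the running interpreter's binary and see how much give it has.
raw = open(sys.executable, "rb").read()
packed = zlib.compress(raw, 9)

ratio = len(packed) / len(raw)
print(f"{len(raw)} -> {len(packed)} bytes ({ratio:.0%})")
```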
Re:Compressing data exe compressors don't (Score:2)
--Dan
Hmm... (Score:3, Interesting)
Using ZIP -9 gives a 20MB file.
So, FEAD offers slightly better compression. (I know there's other crap, including the installer, registry settings, icons, ...)
Still, is it worth the annoyance of the greatly increased install time?
Also, how is FEAD saying they are 50% better than other compressors?
How I'd do it (Score:2)
A large portion of shipped executables are blocks of standard code from the compiler. If you're using a Microsoft compiler, you can strip out the standard chunks and pull those chunks in from the binaries that are already in Windows.
If you're using another compiler, you can still probably do the same kind of thing: some intelligent block compression with the included code that'll do better than the "dumb" compression from "zip" or oth
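zlib's preset-dictionary feature is one off-the-shelf way to get this "compress against blocks both sides already have" effect. A sketch, with pseudorandom bytes standing in for the shared runtime chunks:

```python
import hashlib, zlib

# A chunk both sender and receiver already have on disk,
# e.g. standard compiler runtime code (simulated here).
shared = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(8))

# A payload that contains a lot of that shared material.
payload = b"header " + shared + b" trailer"

def deflate(data: bytes, zdict: bytes = b"") -> bytes:
    if zdict:
        c = zlib.compressobj(level=9, zdict=zdict)
    else:
        c = zlib.compressobj(level=9)
    return c.compress(data) + c.flush()

plain = deflate(payload)
with_dict = deflate(payload, shared)
print(len(plain), len(with_dict))  # the dictionary version is much smaller

# The receiver needs the same dictionary to decompress.
d = zlib.decompressobj(zdict=shared)
assert d.decompress(with_dict) == payload
```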
Re:How I'd do it (Score:2)
Otherwise known as JIT...
Re:How I'd do it (Score:2)
Actually, this is what the "sliding dictionary" (sliding window) is in LZ77-family compressors; LZW builds its dictionary differently.
Re:How I'd do it (Score:2)
Actually, it's common to have compiler switches to optimize for either space or performance, and even feature-specific switches
And a lot of the time smaller _is_ faster--getting more into icache saves costly trips to RAM. This is becoming more and more true with modern processors, though there are still obvious space/time tradeoffs in many cases.
Sumner
Speculating... (Score:2)
> It seems that they optimize each application individually at their labs. But an average of 50% compression on already-compressed binary files seems too good to be true. Is anyone familiar with how someone might achieve this?
Maybe they're just removing the bloat. I've read on comp.risks about a guy who disassembled Windows regedit and found embedded strings and even images, which were not actually used in the application program.
But the link is not at all clear about what they are actua
Statistical encoders (Score:2, Insightful)
Re:Statistical encoders (Score:2)
??? - compression increases entropy density . . . a well-ordered file (little entropy) can be heavily compressed. The compressed version has more entropy per n bits than the original file.
Re:Statistical encoders (Score:2)
Re:Statistical encoders (Score:2)
Re:Statistical encoders (Score:2)
bzip2 (included in most Linux distributions and well past the experimental phase) also uses a Burrows-Wheeler transform (with Huffman coding).
bzip (not 2) used BW with arithmetic coding but was withdrawn because of potential patent problems with that combination (the bzip ari coding wasn't LZW and no concrete patent on bzip's ari coding was known, b
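For the curious, the forward transform is only a few lines in its naive form (real implementations like bzip2 use suffix-array-style tricks instead of sorting full rotations). A sketch with a sentinel character:

```python
def bwt(s: str) -> str:
    """Naive Burrows-Wheeler transform: last column of sorted rotations."""
    s += "\0"                                    # unique end-of-string sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def ibwt(last: str) -> str:
    """Invert by repeatedly prepending the last column and re-sorting."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    return next(row for row in table if row.endswith("\0"))[:-1]

print(repr(bwt("banana")))  # 'annb\x00aa' -- like characters cluster together
```

The clustering is the whole point: the transformed text is much friendlier to a simple move-to-front plus Huffman (or arithmetic) back end.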
is UPX the same thing? (Score:2)
http://upx.sourceforge.net
it generally gets 50-75% too. IIRC it makes a really fast (faster than an HD read) decompressor prepended to the compressed program.
The only thing I can think of to do better is to actually rearrange the binary in a more efficient order, or go in at the assembler level and replace any repeated sequence of five or more instructions with a function call.
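The UPX idea (a tiny decompressor glued onto a compressed image) can be sketched in a few lines of Python; obviously UPX itself does this in machine code on the executable's sections, not on source text:

```python
import base64, zlib

# The "program" we want to ship.
program = b"RESULT = 6 * 7\nprint('payload ran, RESULT =', RESULT)\n"

# Pack it: compressed payload plus a tiny self-decompressing stub.
blob = base64.b64encode(zlib.compress(program, 9)).decode()
stub = (
    "import base64, zlib\n"
    f"exec(zlib.decompress(base64.b64decode('{blob}')))\n"
)

# Running the stub decompresses and runs the original program.
ns = {}
exec(stub, ns)
print(ns["RESULT"])  # 42
```

The decompressor runs once at startup; after that the program behaves exactly as if it had never been packed, which is the UPX trade-off (slower start, smaller file).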