rex files

Discuss music production with Ableton Live.
clipperer
Posts: 593
Joined: Tue Jan 11, 2005 4:24 pm

rex files

Post by clipperer » Fri Jul 15, 2005 12:14 am

can somebody tell me the truth about rex files? how can a file be ~46% smaller than its original size with no quality loss?
i still don't get those lossy/non-lossy compression things, can anybody explain?

Lo-Fi Massahkah
Posts: 3604
Joined: Fri Jun 04, 2004 2:57 pm
Location: The south east suburbs of Malmö, Sweden.

Post by Lo-Fi Massahkah » Fri Jul 15, 2005 10:14 am

I'd think the answer is on Propellerhead's forum...

/Mikael

stuffe
Posts: 55
Joined: Wed Jun 29, 2005 11:55 am
Location: Chesterfield, UK

Post by stuffe » Fri Jul 15, 2005 11:44 am

Same reason a zip file can crunch 10 Word files down to half their size: it's just compression.

Reducing size by half is fairly easy for most compression algorithms on typical uncompressed data, and at such a low compression ratio there is no need to lose quality.

To reduce by 10 times or more (e.g. compressing a bitmap to a JPEG, or a wave file into an MP3) is only possible if you start discarding bits of data to allow the compression to be more effective.

Here's a (fairly crap) analogy; the following string is 30 characters long:

aaaaabbbbbcccccbccccaaaaababaa

Here is the same 30 characters, but compressed:

5a,5b,5c,b,4c,5a,b,a,b,2a

That's now down to 25 characters, but it holds the same information without loss. Here it is again, but with some tweaking of the compression to lose bits of info in order to compress further:

5a,5b,10c,10a

Now down to 13 characters, but we've lost 3 characters of the data (the last 3 "b"s).

Imagine that those 3 "b"s represented a sound, or a pixel colour, that sounded or looked so close to the ones next to it that you could lose it without noticing too much. That's all a lossy compression algorithm does.

Of course, the more you compress, the more you lose, the worse it gets, but the above should prove that you can compress and guarantee not to lose quality if you want to.
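
If it helps, here's the same idea as a quick Python sketch (just my toy run-length scheme from above, nothing to do with how REX actually encodes audio):

    from itertools import groupby

    def rle_encode(s):
        # Collapse each run of identical characters into "<count><char>",
        # dropping the count when the run is a single character.
        parts = []
        for char, run in groupby(s):
            n = len(list(run))
            parts.append(f"{n}{char}" if n > 1 else char)
        return ",".join(parts)

    def rle_decode(encoded):
        # Reverse the encoding: expand "<count><char>" back into runs.
        out = []
        for part in encoded.split(","):
            out.append(part[-1] * int(part[:-1]) if len(part) > 1 else part)
        return "".join(out)

    original = "aaaaabbbbbcccccbccccaaaaababaa"
    packed = rle_encode(original)            # "5a,5b,5c,b,4c,5a,b,a,b,2a"
    assert rle_decode(packed) == original    # lossless: round-trip is exact
    print(len(original), "->", len(packed))  # 30 -> 25

Because the decoder gets back exactly what went in, that's lossless. A lossy version would be an encoder that merges nearly identical runs before counting them.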

clipperer
Posts: 593
Joined: Tue Jan 11, 2005 4:24 pm

Post by clipperer » Fri Jul 15, 2005 1:40 pm

thanks stuffe, i see something now

so it means i can compress one time without quality loss?

stuffe
Posts: 55
Joined: Wed Jun 29, 2005 11:55 am
Location: Chesterfield, UK

Post by stuffe » Fri Jul 15, 2005 1:44 pm

Any time :)

clipperer
Posts: 593
Joined: Tue Jan 11, 2005 4:24 pm

Post by clipperer » Fri Jul 15, 2005 2:09 pm

but the more i compress, the smaller the file gets, right? and it loses bits... and there's no quality loss?

hell, i don't get it

Meffy
Posts: 327
Joined: Fri Jul 02, 2004 1:21 pm

Post by Meffy » Fri Jul 15, 2005 2:22 pm

A compressed file works together with a compression/decompression program. Together the two encode and decode data streams. The compressed file represents instructions for reconstructing the original data stream. The program provides the algorithms and data-handling routines to create the compressed file and to perform the reconstruction.

As long as the original information can be reconstructed perfectly from the compressed version, there is no loss. There's a limit to how much you can compress without losing any information. If you're willing to sacrifice some of your original data, you can squeeze tighter, get a smaller compressed file.
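
Here's a tiny Python illustration of that round trip, using the general-purpose zlib compressor (nothing REX-specific, just to show what "reconstructed perfectly" means):

    import zlib

    # Some highly repetitive fake data, standing in for the original stream.
    data = bytes([0, 1, 2, 3]) * 25_000    # 100,000 bytes

    packed = zlib.compress(data)            # the "compressed file"
    restored = zlib.decompress(packed)      # the reconstruction

    assert restored == data                 # bit for bit identical: no loss
    print(len(data), "->", len(packed))

The compressed blob is useless on its own; you need the matching decompression routine to turn it back into the original stream, which is exactly the file-plus-program pairing described above.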

In the case of JPEG graphics, you can easily see the result of data loss: artifacts -- the squarish, blocky things -- appear when compression becomes excessively lossy. When you compress music too much, your MP3 (or whatever) will have audible artifacts in the form of distorted sound and weird background ringing.

When using lossy compression, the trick is to reach a comfortable balance between file size and reconstructed data quality. It's up to the individual compression user to decide where that balance lies, based on his or her needs. If you must have high fidelity sound, you'll have to compress less and accept larger files. If bandwidth is the paramount concern, you must accept lower quality. To some extent, improved compression algorithms can help -- but they really just push the borderline further toward high quality at small file sizes. You'll still have to make the same decision, but the results will be better.

I've deliberately omitted two parameters: the time required to compress and decompress files. That's a separate issue that would only add confusion. I've also been a bit cavalier in using "data" and "information" interchangeably. They ain't the same thing! But this is an informal description and I think no harm is done by fudging. :-)

HTH!
Last edited by Meffy on Fri Jul 15, 2005 2:24 pm, edited 1 time in total.

stuffe
Posts: 55
Joined: Wed Jun 29, 2005 11:55 am
Location: Chesterfield, UK

Post by stuffe » Fri Jul 15, 2005 2:24 pm

The more you compress using a lossy compression algorithm, the more bits you lose, and the smaller the file gets, but the worse it looks or sounds. You get to choose the quality (e.g. 128 kbps, 320 kbps) and thus the ratio used by lossy formats like MP3 and JPEG.
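
As a rough illustration of that quality knob, here's a toy Python quantizer: it rounds a sine wave to fewer and fewer levels, the way a lower bitrate keeps less detail (real MP3 encoding is far more sophisticated, this is just the principle):

    import math

    # 1000 samples of a 440 Hz sine wave at a 44.1 kHz sample rate.
    samples = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(1000)]

    for levels in (256, 16, 4):               # the "quality" setting
        half = levels // 2
        quantized = [round(s * half) / half for s in samples]
        error = max(abs(a - b) for a, b in zip(samples, quantized))
        print(f"{levels} levels: worst-case error {error:.4f}")

Fewer levels means fewer bits per sample (a smaller file) but a bigger gap between the original and what you get back, i.e. worse sound.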

Rex files (and FLAC, search for it, it's interesting) are lossless, so they hit a limit on how small they can go before they'd have to start losing data, which they refuse to do. That limit is typically about half the original size. You usually don't get to choose the ratio: the encoder just does what it can, and you won't be able to get the file any smaller.
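
You can see that data-dependent limit with any lossless compressor, for example Python's zlib at its highest effort setting:

    import os
    import zlib

    repetitive = b"drumloop" * 10_000    # lots of redundancy to exploit
    random_ish = os.urandom(80_000)      # no patterns at all

    for label, data in (("repetitive", repetitive), ("random", random_ish)):
        packed = zlib.compress(data, level=9)
        print(f"{label}: {len(data)} -> {len(packed)} bytes")

The repetitive buffer collapses to a tiny fraction of its size, while the random one barely shrinks (it can even grow by a few bytes). A lossless coder simply stops when there's no redundancy left, which is why formats like REX and FLAC settle at whatever ratio the audio itself allows.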

clipperer
Posts: 593
Joined: Tue Jan 11, 2005 4:24 pm

Post by clipperer » Fri Jul 15, 2005 2:51 pm

oh, thanks all

Meffy
Posts: 327
Joined: Fri Jul 02, 2004 1:21 pm

Post by Meffy » Fri Jul 15, 2005 3:02 pm

Don't feel bad if it's hard to wrap your mind around. Data compression isn't exactly intuitive! It is a complicated subject, and I don't begin to understand some of the more advanced kinds of algorithms and file formats used.

For a real hoot, check out Iterated Systems, Inc.'s FRACTAL compression and the .fif format. Using .fif, you can achieve 50:1 compression ratios with good fidelity... and you can actually zoom IN, apparently revealing details not in the original image! It's a trick, really, a clever hand-wave -- but it works surprisingly well on a wide variety of source images.
