Amiga.org
Amiga computer related discussion => Amiga Software Issues and Discussion => Topic started by: Brian on August 09, 2013, 02:10:17 PM
-
I'm searching for a better archiver than LZX, with a small unarchive tool compatible with a 68000 w/o FPU. I've looked into many options, but they either don't compress as well as LZX, or they compress better but their unarchive tools are huge and/or need huge library file(s), so the better compression doesn't compensate enough (when both the data and the decompression tool need to fit onto a single floppy).
Any suggestions, or is LZX the best option?
-
I'm searching for a better archiver than LZX, with a small unarchive tool compatible with a 68000 w/o FPU. I've looked into many options, but they either don't compress as well as LZX, or they compress better but their unarchive tools are huge and/or need huge library file(s), so the better compression doesn't compensate enough (when both the data and the decompression tool need to fit onto a single floppy).
Any suggestions, or is LZX the best option?
http://aminet.net/package/util/arc/xz-utils
-
http://aminet.net/package/util/arc/xz-utils
Main program with library is like 300K... not an option.
-
Main program with library is like 300K... not an option.
Then your only option is unrar.
http://aminet.net/package/util/arc/unrar-68k-amigaos-bin
-
Then your only option is unrar.
http://aminet.net/package/util/arc/unrar-68k-amigaos-bin
Can't get RAR to compress better than LZX and the unrar executable is still too big.
-
Then I guess you are stuck with LZX.
-
Can't get RAR to compress better than LZX and the unrar executable is still too big.
Stupid question: Have you tried RAR's 'Create solid archive' option?
Also, try compressing executables and libraries with Power Packer to save more space.
-
Stupid question: Have you tried RAR's 'Create solid archive' option?
Also, try compressing executables and libraries with Power Packer to save more space.
To be honest, probably not... if you could give me a compression-rate-optimized command line example for RAR, I'd appreciate it.
I found Imploder (with merged library) to compress better than PowerPacker, so I used that to compress UnLZX to about 13K, but since LZX has a better compression rate than these on-the-fly extraction options (and requires no library), just about everything else gets the LZX treatment.
Knowing that LZX2 (.CAB) gives better compression, I had hoped to find a good implementation for the Amiga, but although the extraction tool cabextract can be compressed to about 35K, it still requires a 175KB library, so the gain is lost.
-
if you could give me a compression-rate-optimized command line example for RAR, I'd appreciate it.
Sorry, I only use the Windows GUI version :o I don't even have it on the Amiga, and always use LZX. Just read about RAR's features and how to use them in the manual, or compress the files on the PC. Not much help, I know :o
-
Try crunching everything in C:, L:, Libs: and Devs: with this.
http://aminet.net/package/util/shell/lzma-exe
-
Try crunching everything in C:, L:, Libs: and Devs: with this.
http://aminet.net/package/util/shell/lzma-exe
Requires a 90KB program and a 215KB library to work... everything like that is already LZX-compressed.
-
What are you trying to do exactly, and how much more space do you need? I assume everything must fit on an 880KB formatted floppy? Perhaps it can still be done with just LZX.
-
What are you trying to do exactly, and how much more space do you need? I assume everything must fit on an 880KB formatted floppy? Perhaps it can still be done with just LZX.
Basically, I'm trying to fit something onto a DD floppy that it doesn't quite fit on, so I have skipped parts, but I'm trying to minimize the amount of skipped parts as much as possible. It works as it is, but if I could save some KB by using a different archiver, I'd like to do that.
I know there is better compression out there, but nothing seems to rival LZX when it comes to the size of its extraction tool. UnLZX, Imploder-compressed to 13KB, needing no external library and working on a 68000 w/o FPU, is impressive.
Basically it's a question of whether I'm doing it right using LZX or whether I have overlooked something (I don't think I have, but I would love to be proven wrong on this, as it would mean freeing up some space for further improvements). :D
-
But, what exactly is the data you're storing? Archivers are general purpose and may not be optimal for the things you're compressing.
-
But, what exactly is the data you're storing? Archivers are general purpose and may not be optimal for the things you're compressing.
Indeed. It would be useful to know exactly what he's trying to compress as one size does not fit all when it comes to compression.
-
Indeed. It would be useful to know exactly what he's trying to compress as one size does not fit all when it comes to compression.
Texts, pictures, executables etc... a general mix of things that LZX seems to do a good job of compressing (about 60%, I think). As said, I'm pleased with LZX's compression. I know LZX2 (.cab) compresses it better by about 50KB, but with an extraction tool four times that size it's not an option, unless someone has a 68000-compatible, library-free LZX2 extractor that takes less than 63KB.
-
LZX is the best and fastest compression tool for the Amiga
sorry but nothing superior exists
-
sorry but nothing superior exists
As far as Amiga file system specific archivers go, that's probably true. But when compressing specific file formats you can almost always do better than any archiver can. For example, for raw audio, FLAC will always beat any general purpose archiver hands down, no questions asked. For non-lossy images you need the PNG format (and that's not fantastic either). For text files you can do better as well if you have a specialized text compressor, and so on.
In this case, the problem is the multiple file formats, but it could've been just audio, or just text, and then we could've done better than archivers.
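As a rough sketch of what I mean (assuming a FLAC build is available for your setup; '--best' is simply FLAC's highest standard preset, and music.wav is a placeholder name):
flac --best music.wav  ; writes music.flac alongside the original, usually far smaller than LZX'ing the raw .wav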
-
All versions of LZX are bugged.
LZX should never be used for anything since they are all bugged and you never know when the bug will strike and ruin your archive.
LZX is most definitely not the fastest at creating archives.
If you want something that can quickly create an archive and that actually works all the time, you need LHA.
-
For example, for raw audio, FLAC will always beat any general purpose archiver hands down, no questions asked.
For audio, Shorten (.SHN) will always beat FLAC hands-down. Approximately the same compression, but Shorten uses massively less CPU power to decompress.
-
If you want something that can quickly create an archive and that actually works all the time, you need LHA.
Not very quick nor working all the time. I was trying to create a big archive from about 40k files totaling 800M, and after 40 minutes on a 50MHz 68030 LHA hadn't even reported on the first file, but had instead managed to corrupt memory and crash AmigaOS.
So, if anyone's doing big archives you might want to look at yet another option.
-
1. You forgot to install TLSFmem into your OS. You absolutely must do this.
2. You forgot to assign T: to your hard drive somewhere. I know it's dumb and not your fault, but that is just how it is with LhA making large archives.
-
All versions of LZX are bugged.
How do you know this?
For audio, Shorten (.SHN) will always beat FLAC hands-down. Approximately the same compression, but Shorten uses massively less CPU power to decompress.
And has far fewer features than modern lossless codecs. I also doubt Shorten is as good as modern codecs. And last but not least, I use a fast peecee for playback, and FLAC uses up an infinitesimal amount of CPU time here. On Amigas I use WAV. Much faster.
Anyway, seeing how the OP wants the best compression rates and it's for archiving, FLAC would be the way to go if their data was mostly audio.
2. You forgot to assign T: to your hard drive somewhere. I know it's dumb and not your fault, but that is just how it is with LhA making large archives.
Why would anyone want to assign T: to their HD?
Until you can prove LZX is bugged, I'm gonna stick to LZX :p
-
7zip ?
-
7zip ?
...is a memory hog compared to the others on Amiga.
-
I used the EPU14 Packer in the past alongside the NUKE compression algo. It worked really well for me. I was able to put all of WB1.3 and Extras on a single DD floppy.
http://aminet.net/package/util/pack/epu14
You copy the base libs over (really not much is needed) and then execute them very early in the startup. You have to make sure to leave uncompressed anything that needs to execute, but this isn't much. You can use XPK-style compressors like xpkNUKE and you don't even need the large xpkmaster.library file. You only need the following files on the disk in an uncompressed state:
DF0:S/Startup-Sequence [variable number of bytes]
DF0:C/EPU [8916 bytes]
DF0:L/EPU1.handler [3496 bytes]
DF0:L/EPU2.handler [10100 bytes]
DF0:L/EPU3.handler [5708 bytes]
DF0:Libs/epu0.library [4192 bytes]
DF0:Libs/epu1.library [1564 bytes]
DF0:Libs/lh.library [2864 bytes]
DF0:Libs/xpkNUKE.library [2900 bytes]
Total: 39740 bytes
Everything else is compressed at over 40% when using a library like NUKE and it's possible that epu1.library and perhaps some of the other numbered handlers may not be required either. This also works on OS 1.x, 2.x and 3.x.
The compression and decompression is completely transparent and you don't need any other files hanging around to decompress anything. You can even place all the files on the disk while the compression flag is enabled and then once it's all set, you boot the disk with compression off and only decompression on. This prevents you from worrying that someone will compress one of the critical startup files.
-
7zip ?
Apart from the main program it also requires a 175KB library... not an option.
-
Texts, pictures, executables etc... a general mix of things that LZX seems to do a good job of compressing (about 60%, I think). As said, I'm pleased with LZX's compression. I know LZX2 (.cab) compresses it better by about 50KB, but with an extraction tool four times that size it's not an option, unless someone has a 68000-compatible, library-free LZX2 extractor that takes less than 63KB.
If you are only compressing for archiving/backup purposes, why do you need to fit the unarchiver on the same floppy? You could create a single boot floppy with all the xad/xfd/xpk libs/decrunchers you need and use it to boot up and decrunch the stuff that is on your other disks.
Have you tried the '-3' argument with LHA btw? Might shave off a few bytes here and there and be comparable to LZX.
Save your bitmaps as PNG, your photos as JPEG, your text and exes/libs/devices/handlers etc. with XZ/NUKE/RAR/EPU (or whichever) and you should be able to fit a lot more on a floppy than with just plain old LZX for everything.
-
@Brian
If you are using KS 3.0 in your machine, you can format your floppy disks using FFS instead of OFS. This gives you a bunch of extra storage space for free.
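Something like this should do it (syntax from memory, so check the Format command's docs; the disk name is just a placeholder):
Format DRIVE DF0: NAME "Project" FFS NOICONS  ; FFS = FastFileSystem; NOICONS leaves out the Trashcan drawer/icon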
-
You could create a single boot floppy with all the xad/xfd/xpk libs/decrunchers you need and use it to boot up and decrunch the stuff that is on your other disks.
Good idea!
In addition, you can also use a 1MB file system. Disks formatted with that give you almost a megabyte of free space (the filesystem uses the whole disk, OFS and FFS don't).
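If I remember the numbers right: a DD disk holds 2 sides x 80 tracks x 11 sectors x 512 bytes = 901,120 bytes (880KB) raw. An OFS data block carries a 24-byte header and only 488 bytes of actual data, while FFS uses the full 512 bytes per block, which is where most of the OFS/FFS difference comes from, and the 1MB-style filesystems squeeze out even more by using the whole disk, as mentioned.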
-
If you are only compressing for archiving/backup purposes, why do you need to fit the unarchiver on the same floppy? You could create a single boot floppy with all the xad/xfd/xpk libs/decrunchers you need and use it to boot up and decrunch the stuff that is on your other disks.
Have you tried the '-3' argument with LHA btw? Might shave off a few bytes here and there and be comparable to LZX.
Save your bitmaps as PNG, your photos as JPEG, your text and exes/libs/devices/handlers etc. with XZ/NUKE/RAR/EPU (or whichever) and you should be able to fit a lot more on a floppy than with just plain old LZX for everything.
It is not a question of backup purposes (but I don't want to alter the files). This project of mine requires everything to fit on a single self-sustaining DD floppy.
So the question is simply whether LZX is the best option (compression rate, extraction tool size, CPU/no-library requirements)... looking at this thread I think I have that question answered, but I wanted to ask to be sure.
-
It is not a question of backup purposes (but I don't want to alter the files). This project of mine requires everything to fit on a single self-sustaining DD floppy.
Can you give us a bit more detail to work with? Then we might be able to provide better solutions for you, i.e. what exactly is your project, what are its goals, and what are you trying to achieve as an end result?
So the question is simply whether LZX is the best option (compression rate, extraction tool size, CPU/no-library requirements)... looking at this thread I think I have that question answered, but I wanted to ask to be sure.
It's quite possible, but we can't be sure without more information from you.
-
Yo, Brian - long time no see! :)
I assume you're unpacking something into RAM: and need something with a very small memory footprint in order to fit everything?
-
Can you give us a bit more detail to work with? Then we might be able to provide better solutions for you, i.e. what exactly is your project, what are its goals, and what are you trying to achieve as an end result?
I need an archiver optimized for compression rate (speed is not a priority) that works with mixed file types. The extraction tool has to work on just a 68000 w/o FPU and require no external libraries.
There is more data than actually fits unless the compression rate can somehow be increased. The archived data needs to fit onto a single DD floppy together with the extraction tool, hence the extraction tool's size is a factor that has to be taken into account.
More info shouldn't really be necessary.
-
Yo, Brian - long time no see! :)
I assume you're unpacking something into RAM: and need something with a very small memory footprint in order to fit everything?
Hi!
Some of it is extracted to RAM: (startup files), so although a small memory footprint is nice, it isn't the biggest priority; compression rate is (together with extraction tool size).
-
I need an archiver optimized for compression rate (speed is not a priority) that works with mixed file types. The extraction tool has to work on just a 68000 w/o FPU and require no external libraries.
There is more data than actually fits unless the compression rate can somehow be increased. The archived data needs to fit onto a single DD floppy together with the extraction tool, hence the extraction tool's size is a factor that has to be taken into account.
More info shouldn't really be necessary.
Well, obviously it is necessary for me to know more so I can understand better, or I wouldn't be asking, would I?
Anyway, one thing you can try to increase the compression ratio is to first create a completely uncompressed archive with all the files you want inside it, then compress this single file at the highest compression rate with LZX.
It'll probably save you quite a few kilobytes. It's the same concept behind *.tar.gz on UNIX systems rather than gzipping each individual file.
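Roughly like this (just a sketch: the 'a' add commands and LhA's store-only switch are from memory, so check the docs, and the paths are placeholders):
lha -r -0 a RAM:bundle.lha Work:Project/#?  ; pass 1: gather everything into one archive, stored without compression
lzx -3 a DF0:Project.lzx RAM:bundle.lha  ; pass 2: compress the single bundle at LZX's highest 68000-level setting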
-
Well, obviously it is necessary for me to know more so I can understand better, or I wouldn't be asking, would I?
Anyway, one thing you can try to increase the compression ratio is to first create a completely uncompressed archive with all the files you want inside it, then compress this single file at the highest compression rate with LZX.
It'll probably save you quite a few kilobytes. It's the same concept behind *.tar.gz on UNIX systems rather than gzipping each individual file.
Thanks for the suggestion, however since LZX supports merging files it only saved me 1.2KB out of 1.7MB compared to letting LZX handle it all in one go, so it's not worth it.
-
All versions of LZX are bugged.
I have used LZX for many years now, and apart from the Y2K bug in unpatched LZX versions, I have not encountered any major problems.
Why would anyone want to assign T: to their HD?
This might be required if:
1. Files that are larger than the currently free RAM on the Amiga are to be added to an archive (with compression).
2. Files are to be deleted from a large archive whose size is bigger than the currently free RAM.
The above goes for most archivers, including LhA, LZX and Zip.
Thanks for the suggestion, however since LZX supports merging files it only saved me 1.2KB out of 1.7MB compared to letting LZX handle it all in one go, so it's not worth it.
Try increasing the maximum merge size in LZX by using the -M option, for example -M8000.
Also make sure to use -3 compression as well.
If you create the archive on an Amiga with a 68020 CPU or higher, you can also use the 68020 version of LZX, which allows you to use -9 compression instead. This, along with increasing the maximum merge size, will give you the absolute maximum compression that can be achieved with LZX.
The resulting archive cannot be extracted with 68000 LZX since it doesn't support -9 (de)compression, but it will work fine with UnLZX.
The above methods are the ones used on my Amiga911 disks.
http://amiga911maker.site11.com
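To make it concrete, the command line ends up looking something like this (the 'a' add command is from memory and the paths are placeholders; the options are the ones described above):
lzx -9 -M8000 -r a RAM:Project.lzx RAM:Files/#?  ; needs the 68020 LZX to create; extract on the 68000 with UnLZX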
-
Thanks for the suggestion, however since LZX supports merging files it only saved me 1.2KB out of 1.7MB compared to letting LZX handle it all in one go, so it's not worth it.
Go with what PanterHZ said, he's been perfecting this art for years with his 911 disks. He's probably the best expert on using LZX on this forum.
-
Try increasing the maximum merge size in LZX by using the -M option, for example -M8000.
Also make sure to use -3 compression as well.
If you create the archive on an Amiga with a 68020 CPU or higher, you can also use the 68020 version of LZX, which allows you to use -9 compression instead. This, along with increasing the maximum merge size, will give you the absolute maximum compression that can be achieved with LZX.
The resulting archive cannot be extracted with 68000 LZX since it doesn't support -9 (de)compression, but it will work fine with UnLZX.
The above methods are the ones used on my Amiga911 disks.
http://amiga911maker.site11.com
Thanks for confirming these options are right. I'm using the options "-e -f -F -m -M8000 -Qf -r -9", as I've found those to be the optimum (yes, I need empty archives); can you confirm these as well (setting the priority doesn't seem to change anything)?
Also, compressing files from RAM: to RAM: seems to improve the compression rate a bit, and the same goes for higher CPU speed (I use WinUAE to get insane MIPS/MFLOPS). Still, LZX's output size is a bit of a gamble... compressing the same files a few times often results in different-sized archives, removing files from the archive can sometimes result in a bigger archive, and so on.
-
Go with what PanterHZ said, he's been perfecting this art for years with his 911 disks. He's probably the best expert on using LZX on this forum.
Oh I don't know if I can call myself an expert, because although I have done a bit of research regarding this topic, I don't know much about the internal workings of LZX. :)
Thanks for confirming these options are right. I'm using the options "-e -f -F -m -M8000 -Qf -r -9", as I've found those to be the optimum (yes, I need empty archives); can you confirm these as well (setting the priority doesn't seem to change anything)?
Also, compressing files from RAM: to RAM: seems to improve the compression rate a bit, and the same goes for higher CPU speed (I use WinUAE to get insane MIPS/MFLOPS). Still, LZX's output size is a bit of a gamble... compressing the same files a few times often results in different-sized archives, removing files from the archive can sometimes result in a bigger archive, and so on.
The options you are using seem fine, but for speeding up compression a bit you may consider increasing the output buffer size by using the -bo option as well, for example -bo256. As for setting the priority, it is mainly useful if you are running other CPU-intensive tasks at the same time as compressing the archive. It tells the system what priority LZX shall have in relation to the other stuff that is running (either higher or lower).
In my experience, the best compression results are achieved when absolutely all files are added to a completely new archive. LZX will then examine the files and group similar ones into separate "blocks", and each block will be compressed and added to the archive. Typically, you may have one block for executables and libraries, one block for icons, and one block for text files. This is actually how the file merging functionality in LZX works.
Now if you were to add another file to the archive, let's say an executable, it would not become part of the already existing "executables and libraries" block, and thus optimal compression is not reached.
When removing files from an LZX archive, it is important that you use the -9 option, since the remaining files in the block will need to be re-compressed. If you fail to specify this, LZX will use the default -2 compression instead, which may then lead to a larger archive than the original.
There are also a couple of other things you can do for reducing the size of your project:
1. Don't use the UNLZX version that is included with the LZX distribution, use this smaller one instead: http://aminet.net/package/util/arc/UnLZX2
2. If your project contains executables and library files, you might consider using StripHunk for reducing the file sizes on them, you can find it here: http://aminet.net/package/dev/misc/StripHunk
Personally I use it like this:
StripHunk FILE REPLACE DREL32 ZEROS DEBUG SORT
Where FILE is the name of the file to be stripped.
-
StripHunk is only a valid option for uncompressed programs and libraries, since the hunk structure seems to become invalid for crunchers after a strip.
-
StripHunk is only a valid option for uncompressed programs and libraries, since the hunk structure seems to become invalid for crunchers after a strip.
Yes, this is true. After using StripHunk on files, they can in most cases not be safely crunched with PowerPacker, Imploder, XPK etc. But they can be included in file archives like LhA, LZX and Zip without any problems.