
Author Topic: Better archiver than LZX with small unarc tool


Offline PanterHZ

  • Sr. Member
  • Join Date: Jul 2009
  • Posts: 295
  • http://www.rhz1.com
Re: Better archiver than LZX with small unarc tool
« on: August 13, 2013, 09:38:01 PM »
Quote from: ChaosLord;744494
All versions of LZX are bugged.

I have used LZX for many years now, and apart from the Y2K bug in unpatched LZX versions, I have not encountered any major problems.

Quote from: Thorham;744516
Why would anyone want to assign T: to their HD?

This might be required if:
1. Files that are larger than the currently free RAM on the Amiga are to be added to an archive (with compression).
2. Files are to be deleted from a large archive whose file size is bigger than the currently free RAM.

The above goes for most archivers, including LhA, LZX and Zip.
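
If T: has to be moved to the hard drive, something like this should do the trick (I'm assuming the system partition is called DH0: here, so adjust the path to suit your setup):

MakeDir DH0:T
Assign T: DH0:T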

Quote from: Brian;744610
Thanks for the suggestion, however since LZX supports merging files it only saved me 1.2KB out of 1.7MB compared to letting LZX handle it all in one go, not worth it.

Try increasing the maximum merge size in LZX by using the -M option, for example -M8000.

Also make sure to use -3 compression.

If you create the archive on an Amiga with a 68020 CPU or higher, you can also use the 68020 version of LZX, which allows you to use -9 compression instead. This, along with increasing the maximum merge size, will give you the absolute maximum compression that can be achieved with LZX.
The resulting archive cannot be extracted with 68000 LZX since it doesn't support -9 (de)compression, but it will work fine with UnLZX.
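
For reference, a complete command line could then look something like this (if I remember the syntax correctly, and the archive and directory names are of course just examples):

lzx -9 -M8000 -r a DH0:MyArchive.lzx MyFiles/#?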

The above methods are the ones used on my Amiga911 disks.
http://amiga911maker.site11.com
 

Offline PanterHZ

Re: Better archiver than LZX with small unarc tool
« Reply #1 on: August 19, 2013, 02:41:19 AM »
Quote from: nicholas;744620
Go with what PanterHZ said, he's been perfecting this art for years with his 911 disks. He's probably the best expert on using LZX on this forum.

Oh I don't know if I can call myself an expert, because although I have done a bit of research regarding this topic, I don't know much about the internal workings of LZX. :)

Quote from: Brian;744662
Thanks for confirming these options are right. I'm using options "-e -f -F -m -M8000 -Qf -r -9" as I've found that to be the optimum options (yes, I need empty archives), can you confirm these also (setting priority doesn't seem to change anything)?

Also, compressing files from RAM: to RAM: seems to improve compression rate a bit, same goes for higher CPU speed (use WinUAE to get insane Mips/MFlops). Still, LZX output size is a bit of a gamble... compressing the same files a few times often results in different size archives. Removing files from the archive can sometimes result in bigger archives etc.

The options you are using seem fine, but to speed up compression a bit you may consider increasing the output buffer size by using the -bo option as well, for example -bo256. As for setting the priority, it is mainly useful if you are running some other CPU intensive tasks at the same time as compressing the archive. It tells the system what kind of priority LZX shall have in relation to the other stuff that is running (either higher or lower priority).
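
With the output buffer added, the whole command could then look something like this (the archive and file names are of course just examples):

lzx -e -f -F -m -M8000 -Qf -r -9 -bo256 a RAM:Project.lzx RAM:Files/#?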

In my experience, the best compression results are achieved when absolutely all files are added to a completely new archive. LZX will then examine the files and group similar ones into separate "blocks", and each block will be compressed and added to the archive. Typically, you may have one block for executables and libraries, one block for icons, and one block for text files. This is actually how the file merging functionality in LZX works.
Now if you were to add another file to the archive, let's say an executable, it will not become a part of the already existing "executables and libraries" block, and thus optimal compression is not reached.
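
So rather than updating an existing archive with a command like this (the names are just examples):

lzx -9 a RAM:Project.lzx NewTool

...it is usually better to delete the archive and build a fresh one containing all the files, as in the command line shown earlier.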

When removing files from an LZX archive, it is important that you use the -9 option, since the remaining files in the block will need to be re-compressed. If you fail to specify this, LZX will use the default -2 compression instead, which may then lead to a larger archive than the original.
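
In other words, something like this, assuming the usual d (delete) command and an example file name:

lzx -9 d RAM:Project.lzx SomeOldFile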

There are also a couple of other things you can do to reduce the size of your project:

1. Don't use the UnLZX version that is included with the LZX distribution; use this smaller one instead: http://aminet.net/package/util/arc/UnLZX2

2. If your project contains executables and library files, you might consider using StripHunk to reduce their file sizes. You can find it here: http://aminet.net/package/dev/misc/StripHunk
Personally I use it like this:

StripHunk FILE REPLACE DREL32 ZEROS DEBUG SORT

Where FILE is the name of the file to be stripped.
 

Offline PanterHZ

Re: Better archiver than LZX with small unarc tool
« Reply #2 on: August 25, 2013, 09:19:37 PM »
Quote from: Brian;745318
StripHunk is only a valid option for uncompressed programs and libraries since the hunk structure seems to become invalid for crunchers after a strip.

Yes, this is true. After using StripHunk on files, in most cases they can no longer be safely crunched with PowerPacker, Imploder, XPK etc. They can, however, be included in file archives like LhA, LZX and Zip without any problems.