
Author Topic: New Replacement Workbench 3.1 Disk Sets www.amigakit.com


Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« on: November 13, 2014, 11:24:59 AM »
Quote from: Thomas Richter;777232
Despite legal constraints: There's also a serious size constraint here to place 3.9 (or parts thereof) in ROM. The workbench.library is considerably larger than the 3.1 version, and so is the Shell. One way or another, something has to go from ROM. Workbench is a candidate, audio is a candidate, mathieesingbas is a candidate (the latter for more than one reason). It remains a bit unclear which compatibility issues may arise from this, which is probably the reason why nobody wants to do it.
Actually, if one were to build a new ROM, there could be sufficient free space to keep all components in it. I tested this with a fully native build that uses SAS/C for almost all components. That change allows for considerable space savings: for example, the A4000T ROM gains enough room for both the SCSI and ATA scsi.device flavours and workbench.library to fit. Taken a step further, the disk-based icon.library that was part of AmigaOS 3.5/3.9 could fit, too. The same could be done for the A1200 V40 ROM, which has almost no free space left.

However, this native build is pretty much untested. If one were to use its components, mixing them with existing, proven components, the results might be more mature and robust than a fully native build would be.
 

Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« Reply #1 on: November 13, 2014, 08:17:41 PM »
Quote from: Thomas Richter;777269
Strange, so what's so incredibly huge in the 3.1 ROMs that CBM got so tight on ROM space? Intuition alone? OK, there *is* indeed some headroom left in the A2000 ROMs (and the A500's, of course), but as soon as the SCSI controller and/or the on-board IDE had to go into ROM, CBM really tried to cram in the bytes.
As far as I recall, the only two V40 ROMs which had little room to spare were those for the A1200 and A4000T models.

The A1200 ROM basically covers everything which the A500/A600/A2000 machines need, which includes PCMCIA hardware support. The big difference is in the graphics.library: due to AA support, the A1200 version is significantly larger than the ECS version used in the A500/A600/A2000 ROM.

The A4000T ROM has even more crammed into it: the ATA scsi.device, the really large A4091 scsi.device (which itself includes the bootstrap script for the NCR SCSI controller) and the large AA graphics.library. The combination of these components left no room for workbench.library, which was moved out to disk.

V40 was built almost exclusively using Lattice 'C' 5.04, which did not feature the more refined and effective optimization functionality that became available later in SAS/C. Commodore did not use SAS/C for production code, nor, for that matter, 68020 code generation, because of code generation maturity issues. Because of this, all the compiled 'C' code targeted the plain 68000, and no 68020-specific optimizations were available to help reduce the code size for the A1200/A3000/A4000/A4000T.
 

Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« Reply #2 on: November 14, 2014, 09:35:45 AM »
Quote from: matthey;777318
Please leave the icon.library out of Kickstart. Everybody is using PeterK's icon.library because it's much faster, smaller and supports most Amiga icon types.
Well, I'm not using it ;) So far I'm reasonably satisfied with the icon.library which I wrote for the OS 3.5 update. If there is a problem badly in need of a solution, it's in how workbench.library and icon.library interact for directory scanning and display. It just does not scale: larger icon files, more files, no matter what, the performance and responsiveness quickly go down the drain.

Quote

SAS/C "refined and effective optimizations"? You have to be kidding.
Hey, I wrote "*more* refined and effective", and the reference was Lattice 'C' 5.04. SAS/C 6 was definitely an improvement in the quality of its code optimization. However, it did take a couple of years to mature (1995-1996), by which time Commodore could no longer put it to good use.

I was told that Commodore was a driving force in getting SAS, Inc. to improve the code generator and the optimizer. They would submit samples of code as produced by the Green Hills compiler (obviously, they could not share compiler source code) and ask the compiler developers at SAS to replicate the results. Step by step the compiler improved.

Quote
The icon.library was compiled with SAS/C and PeterK's optimized version is now about 35% smaller with much added functionality (my record library reduction is 43% but that was an early version of GCC/EGCS which I could take to half size with some effort).
I can't comment on the size and functionality of the replacement icon.library, as I have never used it. I only spent a couple of months rewriting the icon.library from scratch, integrating NewIcons support, colour icon support, etc., making it work better with workbench.library, building new APIs, etc. The focus was not on optimizations for size or speed, because icon loading is pretty much restricted by what the file system can do (and that isn't much). My focus was more on making the whole thing as robust as I could, and on opening up the API.

Quote
I would say that SAS/C is better for size than speed. I have a working and well tested workbench.library which is 191168 bytes without any hand optimizations from me (it has bug fixes applied). I bet I could optimize away another 10kB or so with basic hand optimization (getting rid of that slow SAS/C copymem routine would probably save 500 bytes alone). Granted, the code quality is nowhere near as bad as the intuition.library. It might be worth trying vbcc for small executable sizes. Vbcc's features:

+ best 68k peephole optimizing assembler ever in vasm
+ uses optimized inlined assembler functions (the default)
+ sophisticated optimizations that exceed SAS/C (some don't seem to work)
+ cross-assembler for fast compiles on faster computers
+ good Amiga and 68k features and support (Amiga hunk output, IEEE math libraries for fp)
+ actively maintained by knowledgeable and helpful people who know the 68k and Amiga
+ source code available and compiles on a 68k Amiga with few dependencies
+ free for Amiga use
+ easy Amiga installation
+ good c99 support
? some of the link code is highly optimized (hit and miss)
- the 68k backend is average at best
- no 68k instruction scheduler
- lacking tools although many GCC and SAS/C tools are compatible (CPR debugger)
- slow at compiling
- memory hungry
- no C++ support

There should be a much improved version of vbcc out in the next few weeks. SAS/C is a dead-end, last-decade compiler. How about giving the new version of vbcc a try?
Colour me curious. Where do I start?

The lack of an interactive source debugger is something of a dealbreaker, though. I'd hate to go back to where I was back in 1987. Life's too short for peppering your code with printf()s and assert()s, rerunning it, watching it crash, modifying it and rerunning it all over again. Now, CodeProbe may not be much fun, but it's not as big a productivity sink as "old school" debugging is.
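
To illustrate, "old school" debugging in 'C' amounts to something like the following, repeated until the bug gives up (a made-up fragment, not from any real project):

/* Hypothetical example of "old school" debugging: instrument,
   rebuild, rerun, watch it crash, repeat. */
#include <stdio.h>
#include <assert.h>

static int find_slot(const int *table, int size, int key)
{
    int i;

    for (i = 0 ; i < size ; i++)
    {
        printf("find_slot: i=%d, table[i]=%d, key=%d\n", i, table[i], key);

        if (table[i] == key)
            break;
    }

    /* If this triggers, add more printf()s above and rerun... */
    assert(i < size);

    return(i);
}

int main(void)
{
    int table[3] = { 10, 20, 30 };

    return(find_slot(table, 3, 20));
}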
 

Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« Reply #3 on: November 15, 2014, 08:58:26 AM »
Quote from: matthey;777337
I appreciate that you are proud of your own work, and I'm not saying it's bad, but PeterK's icon.library really is significantly faster and it supports PNG and AmigaOS 4 icons as well as everything it did before, while shrinking by over 1/3. There is good and then there is amazing ;).
I understand that PeterK's icon.library is written entirely in assembly language. Given the complexity of the whole design, warts and everything, that's nothing short of very impressive. Even the PNG decoder is truly integrated into the library code and not merely latched onto it. It appears that the library has been under constant development for almost four years now, and it shows.

Suffice it to say that spending four years on polishing an assembly language implementation of icon.library is the kind of luxury that few are able to afford, and which in the context of the OS 3.5 project would not have been an option.
Quote
The guys that did SAS/C were professional, fixing a lot of bugs and giving a lot of Amiga support. The basic code generation was OK, but they did some weird stuff like branching into CMP.L #imm,Dn instructions for little if any advantage, and they loved the double memory indirect addressing modes like ([d16,An],od.w), which were used more with later versions (IBrowse has 1968 uses). These didn't hurt 68020 code as much as 68040 and 68060 code, where instruction scheduling is sorely needed. There are also way too many byte and word operations for the 68060, which is most efficient with longword operations. The direct FPU code generation is poor for the 6888x and worse for the 68040+. It should be possible to generate good quality code for the 68020-68060, excluding the 16-bit 68000.
It's possible that this development path was eventually taken. SAS, Inc. acquired the compiler so that it could more easily port its statistical analysis package to more platforms and produce better quality ports. At the time, the interest was less in making a better Amiga compiler than in providing other 68k platforms (namely the Apple Macintosh) with the SAS flagship software.

As far as I know, the Amiga compiler business did not actually make much money (it probably lost money), but it became a convenient test bed for compiler development. At the time, there were likely more users of the SAS/C compiler for the Amiga than there were customers for the SAS software built using the same compiler technology. Commercial support for SAS/C ended long before the last Amiga patch was released, and it's possible that further enhancements to the code generation were made that never saw integration into SAS/C for the Amiga.
Quote
Looking at other compilers' code generation is a good start. It's hard to imagine that the Green Hills compiler was once better, after looking at the intuition.library disaster. The Green Hills compiler is still around and pretty well respected in the embedded market for its optimizing capabilities. They still have a ColdFire backend, but I couldn't tell whether they had dropped 68k support.
As far as I know, the Green Hills compiler (referred to as "metacc" in the slim "AmigaDOS developer's manual", ca. 1985) had a major advantage back in 1984/1985, not just in performing data flow analysis, but also in generating code sequences.

This was an optimizing 'C' compiler intended for use on Sun 2 / Sun 3 workstations, adapted so that it emitted Amiga-compatible 68k assembly language source code (as an intermediate language). That source code was then translated, by a 'C' language precursor of the ancient "assem" assembler, into object code suitable for linking. Mind you, this was not an optimizing assembler, just a plain translator. All optimizations happened strictly within the 'C' compiler.

What exactly rubs you the wrong way with Intuition?
« Last Edit: November 15, 2014, 11:06:47 AM by olsen »
 

Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« Reply #4 on: November 16, 2014, 11:27:19 AM »
Quote from: matthey;777555

There isn't anything wrong with how the Green Hills compiler works, but there are signs of a lack of maturity in the compiler, like this:

...


When I see MOVE.L Dn,Dn, I know the compiler has problems with its register management. Compilers repeat the same mistakes, of course. Add in all the function stubs, because it can't do registerized function parameters, and it's pretty ugly. It might be passable for a 68000, but for a 68020+ there are a lot of places where EXTB.L could be used, MOVE.L mem,mem instead of MOVE.B mem,mem, and index register scaling in addressing modes.
I think I understand your criticism. My experience with 68k assembly language could charmingly be described as "finding exceedingly clever ways not to use it", so I'm not an expert in writing or optimizing it ;)

From what I gather, an optimizing assembler would have had difficulties improving upon the sequences the compiler emitted, because the relationships between the sign extension and the repetition thereof are not easy to spot.

Shuffling register contents around in order to avoid pushing them to the stack looks like a reasonable strategy by 1983 standards. This sort of thing only started to become a problem when it caused pipeline stalls, didn't it? For a 68010 or 68020, which would have been the targets of the compiler, it should have worked fine.

The compiler seems to be restricted to emitting 68000 code only, so no "extb.l" for you ;)

Given its age (the 68000 didn't become available until 1979, if I remember correctly, and the core of the compiler seems to date back to 1983), it is still a pretty good compiler, which no doubt continued to mature over the 1980s as the 68000 family saw widespread adoption, not just in desktop computers.

Aside from the ABI issues (function parameters passed on the stack), I would criticize the compiler for being obtuse in error reporting (at least, the logs which I saw are not particularly helpful). It follows (of course) K&R syntax rules, with a few oddities included. For example, you could pass structures "by value", and the compiler would cleverly pack 'struct Point { SHORT x; SHORT y; }' into a 32-bit scalar value which would be passed on the stack. The problem is, Intuition *assumes* that the 'struct Point' parameter will be passed as a scalar value, and if you change compilers (say, to SAS/C 6.50), this assumption no longer holds.
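
To make the hazard concrete, here is a rough sketch; the function names are hypothetical, and none of this is the actual Intuition code:

/* Sketch of the 'struct Point' packing assumption (made-up names). */
#include <exec/types.h>

struct Point { SHORT x; SHORT y; }; /* 2 x 16 bits = one longword */

/* The callee declares its parameter as a plain scalar... */
BOOL is_origin(packed)
LONG packed; /* really a 'struct Point' in disguise */
{
    /* x == 0 and y == 0, tested with a single comparison. */
    return((BOOL)(packed == 0));
}

/* ...while the caller passes the structure "by value", relying on
   the compiler packing it into exactly that 32-bit stack slot. */
BOOL test(p)
struct Point p;
{
    /* Fine with the Green Hills compiler; a different compiler (or
       registerized parameters in SAS/C) may lay this out
       differently, and the code silently breaks. */
    return(is_origin(p));
}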
 

Offline olsen

Re: New Replacement Workbench 3.1 Disk Sets www.amigakit.com
« Reply #5 on: November 17, 2014, 09:00:44 AM »
Quote from: Thomas Richter;777620
Yes, except that the shuffle is here used to sign-extend a variable which is already sign-extended to begin with, so all the register rearrangement is superfluous.

Actually, if you follow ANSI C and pass struct Point as a parameter, the compiler is required to pass this argument as a copy. I'm not sure what SAS/C does in this case, but it is not unreasonable to assume that it would simply copy it onto the stack, too. In that case, the two options are pretty much equivalent: whether the two members of the struct are on the stack because the compiler pushed them there as a single 32-bit longword due to the stack-based ABI, or because the compiler copied them there, does not matter much. You probably cannot enforce the order in which they appear on the stack with Green Hills.
The problem with Intuition and its relationship with the Green Hills compiler is that Intuition depends upon the 'struct Point' being passed as a scalar value.

I spent considerable time porting Intuition to SAS/C, so that it could be built natively on an Amiga, and that's when I found out how "fast and loose" Intuition plays with function parameter passing.

Given the quality of the code in general, I reckon it is not an accident or oversight how the K&R functions and the call-back routines used in Intuition's state machine treat their parameters. These must have been deliberate design choices.

Quote
Anyhow, I wonder what actually depends on this.

If I remember correctly (it's been a while), most of the operations which involve planar geometry (is this point within this area? do these areas overlap? scale this area to this size) use the packed 'struct Point', and Intuition uses these both in function parameters and in BOOPSI messages.

It's not as if these were the foundations upon which Intuition rests, but the use of this type of data structure is so pervasive that fixing up the code to make it less dependent on compiler implementation details will quickly get on your nerves.
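
For illustration, the operations in question look roughly like this (a sketch with a made-up function name; 'struct IBox' is the regular Intuition type):

/* Hypothetical point-in-box test, K&R style; the 'p' parameter
   arrives packed into a single longword on the stack. */
#include <exec/types.h>
#include <intuition/intuition.h>

struct Point { SHORT x; SHORT y; };

BOOL point_in_box(p, box)
struct Point p;
struct IBox *box;
{
    return((BOOL)(p.x >= box->Left && p.x < box->Left + box->Width &&
                  p.y >= box->Top  && p.y < box->Top  + box->Height));
}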

Quote

One thing that unfortunately depends on the stack-based ABI (though I don't remember whether it depends on struct Point) is BOOPSI, which for that reason felt like an alien to the system, too (besides the other obvious one).

I suppose this is true for every case of Smalltalk methods and practice transplanted to a different system (Objective-C comes to mind) ;)

Quote
The BOOPSI callbacks receive parameters on the stack, unlike everything else, where one would probably have used registerized parameters or a struct Hook * to call into the user functions.
That's the Smalltalk legacy. It could have been worse: imagine these had been TagItem lists, and how slow the parameter processing would have been.
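
For comparison, a TagItem-based callback would have to walk a list to dig out every single parameter, instead of picking it up from a fixed stack offset. Roughly like this (the tag names are made up; real code would use utility.library's NextTagItem() so that TAG_MORE and friends are honoured):

#include <exec/types.h>
#include <utility/tagitem.h>

/* Hypothetical method attributes; these tag values are made up. */
#define MA_Left (TAG_USER + 1)
#define MA_Top  (TAG_USER + 2)

LONG get_left(struct TagItem *tags)
{
    struct TagItem *ti;
    LONG left = 0;

    /* One linear scan per parameter lookup; this simple loop
       ignores TAG_MORE/TAG_SKIP for the sake of brevity. */
    for (ti = tags ; ti->ti_Tag != TAG_END ; ti++)
    {
        if (ti->ti_Tag == MA_Left)
            left = (LONG)ti->ti_Data;
    }

    return(left);
}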