Not quite sure what you mean. We have "symbolic" assignments that evaluate by "name" and not by "lock". That's precisely the "PATH" option.
I did some more tests with the PATH option of assign and you are correct. My AmigaDOS manual describes PATH as being useful for removable media (which it is) but fails to adequately explain what it does or what else it can be used for. The PATH option of assign keeps the relative path instead of converting it to an absolute path (a lock on the directory). If the original assigns for LIBS:, DEVS:, L: etc. were made with the PATH option, and the S:Startup-Sequence were changed to add PATH to the assigns:
Assign PRINTERS: DEVS:Printers PATH
Assign KEYMAPS: DEVS:Keymaps PATH
Assign LOCALE: SYS:Locale PATH
Assign LIBS: SYS:Classes PATH ADD
Assign HELP: LOCALE:Help PATH
Assign REXX: SYS:Rexx PATH
then we could do:
Assign SYS: CD0:
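and everything hanging off SYS: would retarget in one step, since a PATH assign re-resolves its target by name each time it is used. A hypothetical check (CD0: is just a placeholder volume name here):

Assign SYS: CD0:
List LIBS:     ; assuming the base LIBS: assign also used PATH, this should now search CD0:Libs plus CD0:Classes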
One minor flaw of the assign PATH option is that it does not work well with ADD. For example:
Assign SYS: CD0: ADD
My tests would not locate data via the newly added paths (e.g. with "Assign LIBS: SYS:Libs PATH", LIBS: should now also look in CD0:Libs). Is this a bug or a limitation?
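For reference, a minimal sketch of the failing case as I understand it (CD0: is a placeholder volume and the library name is hypothetical):

Assign LIBS: SYS:Libs PATH
Assign SYS: CD0: ADD           ; SYS: is now a multi-assign
List CD0:Libs/test.library     ; the file is visible directly...
List LIBS:test.library         ; ...but not through LIBS:, which only seems to search the original SYS: entry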
I don't think developing a CPU able to handle Big and Little Endian is "throwing everything away." The rest of your conclusions are "an" opinion.
Bi-endian CPUs are not a good idea, IMO. Instructions for endian conversion are a useful part of modern CPU support, though. Any choice of CPU endianness is a minor issue and has little to do with anything I talked about.

Some of my comments are opinionated. It's clear that Motorola made the wrong decisions, but it is not clear what the right choices were. However, if Motorola could have seen the future of processors, they would have kept the 68k and devoted more resources and "positive" marketing to it. That's not to say that PPC was a bad choice at the time, but they shouldn't have bet the farm on it. I went to an Amiga show in St. Louis during the '90s where there was a developer conference with several Motorola employees. They told us PPC was the future and hyped up how great it was. The 68k was already being talked about as an old, dead legacy processor, but the 68060 was faster per clock, used 50% less memory and ran cooler than the "more advanced" PPC replacements. The 68060 was still for sale but Motorola was anti-marketing it. Some of my "opinions" are observations as I recall them from the time.
I'm not sure I'd call amd64 inferior to m68k; maybe in your nonexistent parallel universe where m68k was picked by IBM to power the first IBM PC and enjoyed the kind of attention that x86 has had. But in our universe, no.
How many processors or byte-code ISAs since the x86 are based on an 8-bit variable-length encoding? Java has an 8-bit byte code (it's so efficient that most Java compilers don't even use it), and the Java processors that were created have largely failed due to inefficiency. How many processors and byte-code ISAs are based on a 16-bit variable-length encoding like the 68k? ARM CPUs with Thumb-2, i.e. the majority of processors produced. Dalvik byte code, as used by Android, is based on a 16-bit encoding; it is less well known but more widely used than Java byte code.

The 68k has a more efficient encoding, more powerful addressing modes and better code density, and it's easier to read and program than x86 or x86_64. The 68k has not been developed in decades, but further development would actually be easier (and carry less baggage) now. The x86/x86_64 has excellent, proven and well-known CPU designs produced from huge amounts of cash flow, while the 68k has not been developed and has no funding. The potential of the 68k is as good, if not better, though.
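To make the encoding-granularity point concrete, here is the same small constant load in three ISAs (byte counts from the standard encodings; a one-instruction comparison obviously proves nothing by itself about whole-program density):

moveq #1,d0     ; 68k: 2 bytes (one 16-bit word, $7001)
movs  r0,#1     ; ARM Thumb: 2 bytes ($2001)
mov   eax,1     ; x86: 5 bytes (B8 01 00 00 00); shorter idioms exist but clobber flags or need extra instructions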
Well, SYS:Classes is multi-assigned to LIBS:, so effectively it really is.
True. It may also be more efficient and user-friendly to have shallow directory trees. The datatype drivers being in DEVS: might be the more questionable decision, as there are no devices related to datatypes.