Amiga.org
Coffee House => Coffee House Boards => CH / General => Topic started by: Super TWiT on April 14, 2010, 03:55:37 PM
-
I am not a programmer, but I have decided to learn C (not C++, which is overly complicated, and everything can be done in C anyway). I have seen AMAZING things done on Amigas and C64s and even Apple IIs. How was it done with such SMALL resources? Were most programs of that era written in C? Or assembly? (I am not interested in learning assembly, as it would be impossible to take advantage of newer chip flags, and I don't want to be tied to only one architecture.) EDIT: This should be moved to Science & Technology. I didn't see it there, SORRY!
-
I am not interested in learning assembly as it would be impossible to take advantage of newer chip flags
Why do you think so? Assembler is translated into machine language just as C is. So why should it be impossible to do something in assembler that you can do in C?
Actually, you can do even more in assembler than in C, because assembler translates 1:1 into machine language, while C has a fixed feature set that translates into only a subset of all available assembler/machine-language instructions.
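To sketch that 1:1 idea (my example, not from the post above): here is a trivial C function, with the kind of 68000 assembly a compiler might emit for it shown in comments. The exact instructions depend entirely on the compiler, its flags, and the calling convention, so treat the comments as illustrative only.

```c
/* A small C function. The 68000-style output a compiler might
   produce (arguments on the stack, result in d0) is sketched in
   the comments -- illustrative, not actual compiler output. */
int add_scaled(int a, int b)
{
    return a + 2 * b;
    /* roughly:
       move.l 8(sp),d0   ; fetch b
       add.l  d0,d0      ; 2*b
       add.l  4(sp),d0   ; + a
       rts               ; result in d0 */
}
```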
-
Most Amiga software was written in assembly, or in C (with inline assembler) using the old venerable Lattice C compiler set. The Amiga custom hardware really helped, once you knew how to utilise it.
These days the best assembly language to learn is ARM, in my opinion, if you want to use it in your career. For a hobby you could learn 6502 or Z80 8-bit assembler, then 68000. It gives you a good understanding of what happens on the metal.
-
Why do you think so? Assembler is translated into machine language just as C is. So why should it be impossible to do something in assembler that you can do in C?
Actually, you can do even more in assembler than in C, because assembler translates 1:1 into machine language, while C has a fixed feature set that translates into only a subset of all available assembler/machine-language instructions.
What I meant was that it would be nearly impossible for me to take advantage of all the new CPU features and flags on my own; the compiler would write better assembler than I would. Also, I don't want to write programs that are architecture specific.
-
I don't want to write programs that are architecture specific.
That's very welcome. But I am sure that in 1985 they didn't care.
Also I think if you start developing nice effects on let's say an AmigaOne with fast PPC processor and modern graphics card, the same program will be dead slow and stuttering on a classic Amiga. Nowhere near what they did in 1985.
-
I guess what I was wondering is how they wrote programs in C that used so few resources. It seems today that computers that are much faster are still slow.
-
C is a very simple language. Critics even call it a collection of assembler macros. If you look at the assembler output of a C compiler, you'll see that this is almost true. And C does not have a runtime environment, which makes C programs as small as assembler programs (if you avoid static link libraries).
Of course you can write bloated programs in C, but you can do that in assembler too; it's just more work. And you can write very efficient programs in C, almost as efficient as in assembler. As you said, the C compiler creates better assembler than you do. So why do you think C programs have to be bigger than assembler programs?
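The "collection of assembler macros" jibe is easiest to see in the classic K&R string copy (my example, not from the post): each C operation here corresponds almost directly to a single machine instruction, with no hidden runtime behind it.

```c
/* The classic K&R string copy. Each step of the loop body maps
   almost 1:1 onto machine instructions: a load, a store, two
   pointer increments, and a branch on the copied byte. */
void str_copy(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        ;   /* empty body: the condition does all the work */
}
```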
-
I guess what I was wondering how they wrote programs in c that took such low resources.
Thomas explained it quite well, but I want to add that you should also realize that many things don't need large quantities of cpu power. There are often multiple ways of doing things, and some of those ways are fast, while others are slow.
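A common illustration of "multiple ways, some fast, some slow" (my example, not from the thread): two correct ways to upper-case a string, where one quietly re-scans the whole string on every loop iteration.

```c
#include <string.h>
#include <ctype.h>

/* Both functions are correct, but the first calls strlen() on
   every pass, making it O(n^2); the second computes the length
   once and is O(n). On megabytes of text the difference is huge. */
void upper_slow(char *s)
{
    for (size_t i = 0; i < strlen(s); i++)      /* rescans every pass */
        s[i] = (char)toupper((unsigned char)s[i]);
}

void upper_fast(char *s)
{
    size_t n = strlen(s);                       /* length computed once */
    for (size_t i = 0; i < n; i++)
        s[i] = (char)toupper((unsigned char)s[i]);
}
```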
It seems today that computers that are much faster are still slow.
Some programs, like certain text editors for example, are dead slow on even a Pentium 3 (Pspad), while text handling is nothing a 68030 can't handle at comfortable speeds, even with a few megabytes of text. It just depends on how well something has been implemented, and it seems that few people care about software speed these days :(
Computers get faster, software gets slower :furious:
-
Some programs, like certain text editors for example, are dead slow on even a Pentium 3 (Pspad), while text handling is nothing a 68030 can't handle at comfortable speeds, even with a few megabytes of text. It just depends on how well something has been implemented, and it seems that few people care about software speed these days :(
It depends on what kind of text handling it actually does. The text handling that big IDEs like Visual Studio, Eclipse, or NetBeans do is understandably very heavy, because they scan tons and tons of text. It is heavy because it can be heavy. But a program nowadays should know that on a low-spec computer it shouldn't load/perform all the features it has. So, considering today's range of computer-system performance, scalability is IMHO a necessity, rather than performance optimization. Also, you know, making the right choices in your design can make your program a gazillion times faster than any code optimization could. Even if you're using Java vs. Assembly.
So, learn how to design, rather than how to code.
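The "design beats optimization" point can be sketched with a textbook comparison (my example, not from the post): looking up a key in a sorted table. For a million entries, a linear scan may need on the order of a million comparisons; a binary search needs at most about twenty. No micro-optimization of the linear scan can close that gap.

```c
#include <stddef.h>

/* Linear scan: checks entries one by one, O(n) comparisons. */
int linear_search(const int *a, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (a[i] == key)
            return (int)i;
    return -1;
}

/* Binary search over a sorted array: halves the range each
   step, O(log n) comparisons. */
int binary_search_idx(const int *a, size_t n, int key)
{
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;   /* avoids overflow */
        if (a[mid] < key)       lo = mid + 1;
        else if (a[mid] > key)  hi = mid;
        else                    return (int)mid;
    }
    return -1;
}
```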
-
The amount of data being handled has increased almost as much as CPU speed and memory.
Most 8-bit computers only have a few kilobytes of video RAM and only throw a few thousand bytes at the screen per second, while running only one program at a time.
A modern computer with a 1280x1024 true-color screen has to shift about 150 megs of data per second to scroll smoothly, all while running dozens of programs.
I think the main reason people's computers seem slow, though, is that they have too much crap running in the background. Adware and viruses are bad enough, but there are larger, legitimate programs running in the background that the user may never actually use.