
Author Topic: 486dx2 System Question


Offline da9000

  • Hero Member
  • *****
  • Join Date: Mar 2005
  • Posts: 922
Re: 486dx2 System Question
« on: November 19, 2007, 11:53:25 PM »
Follow what BlackMonk says, and to reiterate:

If it says "DX2" on the chip, it's _DEFINITELY_ not a chip that runs at a 50MHz *BUS* clock (I'm not talking about the internal/core speed). Those chips were very rare and were marked DX-50, not DX2-50. They ran at a 50MHz BUS frequency (very fast) and a 50MHz CORE frequency.

Now, the DX2 chips used a 25MHz, 33MHz or 40MHz BUS frequency and had a fixed internal multiplier, like BlackMonk listed. The bus frequency was determined by the motherboard (and this in fact means you could even use a 16MHz or 20MHz clock, which would severely underclock the chip - I've done that in the past while trying to figure out what the jumpers do).
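
To put numbers on it: core speed = bus speed x multiplier. The DX2's multiplier is fixed at 2, so a DX2-50 is a 25MHz bus x 2 = 50MHz core, a DX2-66 is 33MHz x 2 = 66MHz, and a DX2-80 is 40MHz x 2 = 80MHz. Drop that same DX2-80 onto a 20MHz bus and the core only does 40MHz - that's the underclocking I mentioned.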

So if you want your system to be the fastest, no matter whether you use the DX2-50 or the DX2-80, set the bus frequency to 33MHz (or higher if the board allows it) and see what happens. Of course, use a fan to make sure you don't have a melt-down, though in general the 486s were tough - I've never seen one melt (internally or externally - usually the melt-down is internal, which just makes the CPU not work, like a brick). The next best thing, to find out the actual CORE frequency, is to run some benchmarking software (Dhrystone or somesuch - it's been a while since I've run any, so I don't remember names). Some cache/memory testing programs will also give you a MHz rating, either via the CPUID instruction or by doing instruction timing. Come to think of it, the easiest way is to boot a Linux distribution like Knoppix and read the BogoMIPS line the kernel prints while booting - or, faster yet, a Slackware boot disk will show the same thing as soon as the kernel comes up.
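
If you'd rather roll your own than hunt down old benchmark disks, here's a toy sketch of the instruction-timing idea in plain C (illustration only, not a real benchmark - the loop count is arbitrary): time a fixed empty loop and compare the loops-per-second figure between machines, which is more or less what the kernel's BogoMIPS number does.

Code:
/* Toy instruction-timing sketch: time a fixed empty loop and
   compare loops-per-second between machines (BogoMIPS-style). */
#include <stdio.h>
#include <time.h>

int main(void)
{
    volatile unsigned long i;       /* volatile so the loop isn't optimized away */
    const unsigned long n = 10000000UL;
    clock_t start = clock();
    double secs;

    for (i = 0; i < n; i++)
        ;                           /* empty loop */

    secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("%.0f empty loops/sec\n", (double)n / secs);
    return 0;
}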

 

Offline da9000

  • Hero Member
  • *****
  • Join Date: Mar 2005
  • Posts: 922
Re: 486dx2 System Question
« Reply #1 on: November 24, 2007, 11:26:55 PM »
Quote

BlackMonk wrote:
I like old hardware!  Sorry for bumping the topic.
You may wish to look for some old demos, too:
http://scene.org/


Ah, an old DOS demo-scener! Should have figured as much...

I like old hardware as well, although to be honest I like old non-DOS hardware better (with some exceptions, like GUS, hehe)


Quote

BlackMonk wrote:
...from Future Crew should work.  That's how I got into the mod/demo scene, through my ol' DOS machine.  The BEST sound card for mods and demos is probably the old Gravis UltraSound (GUS).  But for games, the SB16 is golden.


Future Crew! Ah, good days! (what's going on with Trug and PSi?? BitBoys?)

I've got a couple of those GUSs still, and SoundBastards...


Quote

BlackMonk wrote:
I wish I still had my GameBlaster/Creative Music System (CMS).  It was an AM-based sound card, the one that Creative Labs made RIGHT before the SoundBlaster which was FM-based.  You could even buy CMS chips for the earlier SB's that added GameBlaster compatibility.


You mean the "Adlib" cards? :-)
Got a programmer's manual for that somewhere, still... I think...


Quote

BlackMonk wrote:
SB16+WaveBlaster (or WaveBlaster 2 or that Yamaha one) was pretty hawt back in the day, too.

My preferred audio setup was SB16ASP + WB2 and a GUS ACE (later a GUS PnP Pro with 8 MB of RAM).  Ah the SB16ASP.  Back before Creative Labs got sued for the ASP part of the name and had to call it CSP instead.  Plus, mine had a crappy proprietary CD-ROM interface!  


Got some of those GUSs and SBs. I think I also have one or two with the proprietary not-quite-IDE CDROM interfaces. Maybe I even have such a CDROM. Not sure. I also have a PAS sound card with a SCSI interface on it.

Quote

BlackMonk wrote:
Hardware back then was cool.  Hell, ever heard of the 3DOBlaster?  Or that Stacker card, which was for hard disk compression with a hardware-based compressor?


Yeah, some of it was cool. Never had the Stacker card :-( Just tons of floppies!

Quote

BlackMonk wrote:
Or even a hard drive on an ISA card (HardCard by Quantum)?  Hardware was...


Hahaha! Got one, with a 20MB drive on it :-) Anyone want to buy it? It still works!


Quote

BlackMonk wrote:
The stuff nowadays is just... boring.  It's been the same as it has been for 10 years, just faster.  On one hand it's nice that things are faster, more efficient, and standardized, but on the other hand there's little sex appeal left for hardware nerds.


I feel the same way a lot of the time... Not sure if it's age, though, or some hard "fact" - meaning: things have changed so much that even a newbie (as we were then) wouldn't find them as appealing. I dunno. I sometimes think kids feel the same way about their "old" Pentium 4 now that they've upgraded to a Core 2 Duo... but perhaps that isn't so. Perhaps it's just a different "kind" of feeling they get with their old tech. Not sure. Perhaps asking the Homebrew Computer Club guys about their past and their feelings for it might help clarify perspectives... Ah well, gotta wait until the next VCF to do that! (Or during the Tramiel talks on December 5th - or is it the 9th?!? Anyone going? Wanna buy a carton of eggs just in case?? :-D )

Rambling over
 

Offline da9000

  • Hero Member
  • *****
  • Join Date: Mar 2005
  • Posts: 922
Re: 486dx2 System Question
« Reply #2 on: November 27, 2007, 04:30:38 AM »
Wow, lots of good posts!

Let's dissect...

@BlackMonk:
You're right about the whereabouts of the Future Crew crew :-)
I've got a one-degree connection to Purple Motion (he makes music for my friend's game company), and to some of the other guys (good friends with Gore).

Indeed, the BitBoys were bought out by ATI for $11-30 mil. (I forget right now) as a "reflex motion" when NVidia bought out another Finnish mobile 3D company, Hybrid, for about the same price (I'm friends with the two founders). All in all, they did pretty well, even if they never had any success in the consumer hardware sector :-(

CMS: now that you and DamageX have explained it in detail, I remember it! It was the competitor to Adlib. Indeed, not the same.

SB16 SCSI, huh? Don't think I've seen it - or at least my memory doesn't say so!

Very interesting info about the proprietary CDROM with the whacked-out throughput rate. All I can say is WTF!? :-)

Scary experience, your Win98SE on an ISA hard-card! I refuse to punish myself like I did back in the M$ days! :-p

Punch cards & tubes: see what I mean? :-D

Silpheed was a great game! I remember my friend had it and we used to play it. And DeathTrack was great too! I remember the day a friend and I went to a computer show and bought it, then took the bus home, eager to play it! Ah, those were good times!



@DamageX:
Thank you for the low-level details on sound cards! You seem to have some very good knowledge on this arcane subject. It'd be nice to see it added to your site! (hint, hint)

Now I'll add some of my own low-level details for anyone like BlackMonk looking to refresh their old skool memory :-)

These hardcards had their own ROM (much like SCSI cards, or even Ethernet cards, etc.), usually mapped into the low 1MB memory space (C0000-F0000, where A0000-BFFFF is the video card buffers and F0000-FFFFF is the IBM PC BIOS - like the Amiga Kickstart, in a way) via a jumper set (and possibly I/O ports, I forget right now). Then you had to instruct the "OS" (DOS or Windows - a joke, not an OS) or the relevant memory managers (EMM386) to avoid using that memory area, since it was mapped to the card's ROM. No such thing as AutoConfig in the crappy PC architecture :-) After that, the BIOS would communicate with the card's ROM to access the disk(s) on it, and your software would talk to the BIOS to do its work, so in the end it was transparent.
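
For anyone who's forgotten what that dance looked like, here's a hedged CONFIG.SYS sketch (the C800-CBFF window is only an example address - the real one depended on your card's jumpers):

Code:
DEVICE=C:\DOS\HIMEM.SYS
REM exclude the hardcard's ROM window so EMM386 won't map EMS pages over it
DEVICE=C:\DOS\EMM386.EXE RAM X=C800-CBFF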

As an aside, this is why "BIOS shadowing" made those old computers and their crappy software faster (shadowing copies the ROM chip contents into RAM, which is much faster - like Amiga BlizKick, in a way): the software relied on the BIOS to do its work and didn't have direct hardware access, aka drivers. At the same time, that's yet another reason BIOS shadowing won't help you when you're running a modern OS, like Linux for example.



@koaftder:
If I'm not mistaken, that (16-color EGA game) was John Carmack's first entry into the 3D labyrinth / FPS genre, and the precursor to Wolfenstein 3D!

EDIT: memory still functions: check!
http://en.wikipedia.org/wiki/Catacomb_3D

@Invisix:
I would agree with your B) reason for enjoying playing with these old systems, but you are insane if you think this is true:  "A) DOS and Win 3.11 are pretty stable and a hell of alot safer for internet use than modern systems (due to OS flaws, holes, 'bugs', etc)"  :-D

I won't even comment. It's troll-bait waiting to happen. Suffice to say: you need to either LAY OFF'EM DRUGS SON, or study a bit more about computer operating-system technology :-)

To add to the validity of the nostalgia feelings: I was very impressed recently, after seeing my old 3D code again, doing 30+ fps with 1k gouraud polys on a completely unaccelerated crappy architecture like a 486 PC. Today's CPUs are 100x faster and multi-core, but you could still do a "decent amount of work" with those old boxes.

Now as for Macs, all I'll say is that they were cool little machines. Lots of good stuff in them, and the OS, even though technically crap, was excellent feature-wise and UI-wise. I wish some of the UI features - the resolution handling, the font stuff and the "UI consistency" - had been brought to the Amiga. Other than that, the Mac had nothing going for it against the Amiga. (PS. I have lots of Mac stuff if anyone's looking for parts, and some rare accelerators too)

Ataris were somewhat cool as well (especially when the Carebears demo group got their hands on them), but there wasn't much point to them when the Amiga was there at the same time, far superior hardware- and software-wise.

And of course the King of Kings (no, not Alexander the Great), or the Queen of Queens (no, not Cleopatra of Egypt), was and is the venerable Amiga.

Cheers to all the old skoolers!
 

Offline da9000

  • Hero Member
  • *****
  • Join Date: Mar 2005
  • Posts: 922
Re: 486dx2 System Question
« Reply #3 on: December 01, 2007, 03:09:09 AM »
@DamageX:
Interesting info. You win the crown for most esoteric and arcane subject matters ;-)


@Invisix:
I agree with your 486ophilia :-) And of course your Amiga comments! :-D  You have a nice Amiga system there, for this side of the pond. I've still not managed to procure anything faster than an 040 :-(

As for the Roland cards, like BlackMonk said: just too damn expensive at the time, so I never got one, never heard one, and never looked back :/  The SB was cheap and did more than the PC speaker, so it was good enough. But then, when I got hold of a Gravis UltraSound with the extra RAM for the wavetable samples and 32-voice hardware mixing - man, that was it! I was done! I actually couldn't afford it at the time either, but at one computer show I found one in a used-parts bin for $40 (at the time it was around $140 or so), and I almost jumped into the box to grab it :-D

As for your halting problem, I can't think of anything. Try to notice whether anything else correlates with it (e.g. does it only happen when certain games play? only when music is playing? only when drivers are loaded? what happens when you use a mod player like Cubic or something? how about demos like Second Reality? etc.)


@BlackMonk & Speelgoedmannetje:
I have to agree with BlackMonk: DOSBox is great, but it ain't the real thing for me either. I believe the difference comes down to people's personalities. Certain people are more prone to enjoying physical objects - touching, feeling, smelling (hey, stop thinking what you're thinking, you dirty pervs! :-p ). Anyways, I'm the type that enjoys putting his hands on things (round and soft with an inverted dimple at the center, har har har). For me DOSBox is great when I'm on a Mac and have nostalgia withdrawals, but if I have a real DOS box next to me, I'll prefer using that. Also, on the PPC Macs it seriously needs a JIT - it's hard to get sound that isn't choppy, and it definitely doesn't work well with many demos.

@BlackMonk:
It's interesting you mention Zone 66. It was coded by Tomasz Pytel (scene nick Tran!) - a long-time friend - when he was still a teen, for Tim Sweeney (who went on to Unreal / Unreal Tournament fame) of Epic MegaGames. He used his PMode32 code in that game, a "DOS extender" that let software use 32-bit flat memory mode under DOS in protected mode (so true 32-bit coding). He made a decent amount of cash selling PMode32 (he and DareDevil), because it was way faster and smaller than DOS/4GW and every other alternative.

Also, I agree with your other points - Win 3.11 vs Win95, etc. As for Novell's stuff, it wasn't that bad! It was mostly autoexec/config setup and some DOS-based menu-driven config stuff. It was my first networking experience - with IPX and Doom, of course ;-)
 

Offline da9000

  • Hero Member
  • *****
  • Join Date: Mar 2005
  • Posts: 922
Re: 486dx2 System Question
« Reply #4 on: December 03, 2007, 01:54:21 PM »
OK guys, nostalgia is good, but ignorance is not bliss - it's poverty (of the mind and the intellect, in this case).

I don't know of any dual core that is rated as low as 1.5GHz. If it's sold as 1.5GHz, then that means BOTH cores run at that speed. Most dual cores are in the 2-2.5+ GHz range right now, which gives a COMBINED or TOTAL throughput roughly equivalent to 4-5GHz.

Now, the issue is not that those machines aren't 4-5GHz fast. They *ARE* - if you have the RIGHT software. So the problem is with software. You see, hardware has advanced much faster than software, although the hardware advance, to be honest, didn't require as much intellect - just take one CPU core and stitch it together with another on the same die, big wooptie doo.

Now back to software: software is DUMB DUMB DUMB (disclaimer: I'm a software engineer). You see, even though these monster machines have 2 or 4 or 8 or however many cores, software isn't at the stage of intelligence where it can distribute itself across those cores to take advantage of them.

This is the current state of how multi-cores are being utilized:

Normally, when you run a modern OS, you have multiple applications running. They usually run on different cores (sometimes they migrate between cores, which is time-consuming and wasteful - forcing a process onto a certain core gets rid of this waste; that's called CPU affinity). This works well for servers, where multiple instances of a web server (e.g. Apache) or other software are run (for dynamic pages you'd run multiple Python, PHP, Ruby, Lua, etc. scripts, one for each page that gets "hit").
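
For the curious, here's what forcing affinity looks like on Linux - a minimal sketch using glibc's sched_setaffinity (the core number is arbitrary):

Code:
#define _GNU_SOURCE                    /* for sched_setaffinity */
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);                  /* allow core 0 only */

    /* pid 0 = the calling process; after this the scheduler
       won't migrate it between cores */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("pinned to core 0\n");
    return 0;
}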

But if you're a desktop user, you don't normally run many power-hungry apps. Even if you run 10 programs "at the same time", like office productivity tools, they're still just WAITING for you to type or move the mouse. When you *really* want performance is when you've got one power-hungry app - a hardcore game like Crysis, Quake 4, etc.

As previously mentioned, one way to utilize more than one core in such a circumstance is to use threads. Threads basically allow a program to run parts of itself as separate streams of execution, and thus on more than one core. The problems, though, are that thread programming is 1) hard (very prone to what are known as races and deadlocks), 2) wasteful, as you need to spend a lot of time synchronizing threads (because they must communicate their results to each other, or to the main thread), and 3) limited - there's only so much you can do with threads before things get unwieldy.
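
To make 1) and 2) concrete, here's a toy POSIX threads sketch of my own (compile with -lpthread): two threads each sum half of an array, and the joins and the final merge are exactly the synchronization overhead I mean - get the ranges wrong and you've got a race.

Code:
#include <pthread.h>
#include <stdio.h>

#define N 1000000

static int data[N];

struct range { int lo, hi; long sum; };

static void *worker(void *arg)
{
    struct range *r = arg;
    int i;
    for (i = r->lo; i < r->hi; i++)
        r->sum += data[i];
    return NULL;
}

int main(void)
{
    struct range a = { 0, N / 2, 0 };
    struct range b = { N / 2, N, 0 };
    pthread_t ta, tb;
    int i;

    for (i = 0; i < N; i++)
        data[i] = 1;

    /* split the work in half: each thread can land on its own core */
    pthread_create(&ta, NULL, worker, &a);
    pthread_create(&tb, NULL, worker, &b);

    /* the joins and the merge below are the synchronization overhead */
    pthread_join(ta, NULL);
    pthread_join(tb, NULL);
    printf("total = %ld\n", a.sum + b.sum);   /* prints total = 1000000 */
    return 0;
}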

An example of 3) is a game that uses thread A for main game logic, thread B for sound, and thread C for graphics. If most of your execution time is spent on thread C (typically the case) and you have 4 cores, then having only 3 threads isn't good enough. You really want to break thread C into multiple threads, so the work gets divided among all the CPUs. But even with 4 threads, you soon realize that sound processing doesn't fully use its core, so you want to break the expensive graphics thread down into even more sub-threads. And thus the nightmare begins: should it work on 2 cores? 4 cores? 8? Heck, you say, let's make it adjust dynamically depending on the core count. But then, as already mentioned, you hit the dreaded "iso-efficiency" problems: more "overhead work" versus "useful work", because you're spending all your time distributing work that comes in packets too small for the effort, when it would have been more efficient to keep larger packets of work on fewer cores. This is known as the granularity level. Too fine a granularity and you've got more overhead than actual work being done. Too coarse a granularity and you're under-utilizing your cores, because the chunks aren't split into enough pieces to fill them.
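
Made-up numbers to show the trade-off: if handing a chunk to a thread costs, say, 10 microseconds of scheduling and synchronization, and the chunk itself takes only 5 microseconds to compute, two thirds of your time is overhead. Make the chunks 1 millisecond each and that same 10 microseconds becomes about 1% - but with only a couple of 1ms chunks, some cores sit idle. And the sweet spot moves with every core count, which is exactly the nightmare.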

This was in effect a very real problem with the PPC Amigas too: without extra effort by the programmers to split the work between the 68k and the PPC, there was no benefit to the "dual CPU" PPC cards. And just as real was the fact that switching between the PPC and the 68k carried a lot of overhead.

As you can see, this quickly becomes a nightmare of huge proportions. So people thought: why not make the computers solve this problem? On to the next section:

One of the *real big* problems that a major portion of the software industry and lots of us computer science majors are facing today, as far as advancing towards the multi-core/super-parallel future goes, is the compilers. If compilers were smart enough to break down the execution of code so that the programmer didn't have to spend tons and tons of hours writing parallelizable code, then you could simply recompile your code to use more cores and it would work faster and better. Unfortunately, for some of the reasons mentioned above, such compilers are extremely hard to get right, and no one has really accomplished this to any great extent yet.
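
The closest thing we have in practice isn't a truly automatic compiler but hint-driven parallelism like OpenMP, where you mark a loop as safe to split and the compiler/runtime does the distribution. A minimal sketch (compile with something like gcc -fopenmp):

Code:
#include <omp.h>                /* OpenMP runtime */
#include <stdio.h>

#define N 1000000

static double a[N];

int main(void)
{
    int i;

    /* one hint and the runtime splits the iterations across cores;
       every iteration is independent, so this is safe to parallelize */
    #pragma omp parallel for
    for (i = 0; i < N; i++)
        a[i] = i * 0.5;

    printf("up to %d threads available\n", omp_get_max_threads());
    return 0;
}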

Now some theory (ramblings). Part of the problem, I believe, is in our programming paradigms. We view programming like we always have, using the languages we've always used. As with common speech, I believe language is both an enabler and an inhibitor. If your language isn't capable of letting you express a certain class of thoughts, you might never have thoughts of that class in your brain; it can hold you back. On the other hand, if the language enables thoughts of higher levels, through higher levels of complexity and expression, then I believe you'll be endowed with more expressive and thus more complicated and possibly more intelligent thoughts. I believe this holds for computer languages as well.

We're currently stuck with some very, very bad technologies, as mentioned before. Although I enjoyed my x86 years, and I did years of Intel assembly, the instruction set was horrible compared to the 68000's. As a programmer I always wanted the 16 general-purpose registers the 68000 offered - but only got 8. This stupid x86 ISA is *still* here, in all the 64-bit chips (although they are internally RISC-like: they convert the x86 ISA into micro-ops and execute those).

Another major problem is the von Neumann architecture. Computers work on the principle of fetch instruction - decode instruction - execute instruction - store result. This limits us to certain subsets of problems, or approaches to problem solving (for example, SIMD units by default operate on multiple data chunks - this changes the way you program when using SIMD: you think in parallel by default - see the little SSE sketch below). Think of anti-machines as a totally inverted example of how computing can be achieved (that field looks ever more hopeful with the advent of FPGAs and reconfigurable chips). Then there are biological computing devices, which work on different principles (one small example: the neurons in your brain don't just work on a "flat model" of connections - where on the neuron's three-dimensional surface a connection is made makes a real difference to the results), and quantum computing devices on yet other principles.

Another major problem, in my opinion, is the stagnation of the masses, pioneered by none other than Microsoft and their technologies. They have dominated the software market for decades with C++, an extremely unclean and really retarded object-oriented language that has shown very little innovation or ability to break through the old programming-paradigm expressions onto a new playing field.
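
Here's the little SSE sketch I promised, in C (assuming an x86 compiler that ships the xmmintrin.h intrinsics - the values are arbitrary): one instruction adds four floats at once, so you lay out your data, and your thinking, in 4-wide chunks from the start.

Code:
#include <xmmintrin.h>   /* SSE intrinsics, x86 only */
#include <stdio.h>

int main(void)
{
    /* _mm_set_ps takes its arguments highest-lane first */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);     /* (1,2,3,4)     */
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f); /* (10,20,30,40) */
    __m128 c = _mm_add_ps(a, b);   /* one instruction, four additions */
    float out[4];

    _mm_storeu_ps(out, c);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);  /* 11 22 33 44 */
    return 0;
}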

Anyways, enough about the supposedly non-existent "multi-core hype". It's no hype - it's real. We've got screaming machines sitting idle and we "don't know what to do with them" as far as normal desktop use is concerned. Server people and internet companies know very well what to do with them. So do all the physicists and scientists doing massive data crunching. For the desktop, it's primarily games that will push the envelope. From my personal experience, I also believe Apple is heading in the right direction with all their new apps and APIs exploiting the underlying hardware architecture more and more (Core Animation as a small example - and not just Time Machine, but also LLVM used in the OpenGL core and other parts).

Now I *really* feel very nostalgic for the good ol' SIMPLE days of single-core, single-CPU, non-memory-protected multitasking! ;-) Sigh.....