Author Topic: Google Acquires Rights to 20 Year Usenet Archive


Waccoon
Re: Google Acquires Rights to 20 Year Usenet Archive
« on: January 12, 2005, 09:53:11 AM »
Quote
MiAmigo:  So, the Amiga today would sport not one, but, actually three equally powerful, equally advanced, state of the art Motorola chips, each designed specifically for jobs in sound, and graphics

Refer to Sony's "Grid" architecture.  Data processing, graphics, and audio all require much the same instruction set.  The reason we have dedicated chips for that work is that they are hard-wired to run the same instructions over and over without fumbling with complex decoders.

GPUs really are complete computers, but they are still used as "accelerators."  Why not have three CPU cores in a single chip, or ten separate coprocessors?  It's not the complexity of the calculations you want to get done, it's just how fast you need them done, at a given cost.

Old arcade machines didn't have dedicated CGI boards.  They just had a half-dozen ordinary CPUs:  68K, Hitachi, Texas Instruments, AT&T, and so on.  My favorite arcade game, Hard Drivin', had a very demanding graphics and audio system, but was driven by several standard CPUs running in parallel -- one 68K to do the physics and run the game, and two Texas Instruments processors working in parallel on the polygon calculations.  Sega machines are infamous for using 68K chips for audio mixing and effects instead of dedicated DSPs.  You had more programming flexibility that way, instead of being restricted by the limited capabilities of a hard-wired DSP.

Flexibility sometimes wins out over speed.  Dedicated sound chips are being phased out in favor of CPU mixing.  Audio effects on a Pentium 4 are easier to implement and far more capable than the effects possible with a standard EAX-compliant chip, like the EMU10K found on a typical Sound Blaster Audigy card.
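
And CPU mixing really is just arithmetic.  Here's a minimal sketch (my own illustration, not code from any real driver) of the inner loop: sum two 16-bit PCM streams with saturation, the kind of work an EMU10K-class chip would otherwise do in hardware.

Code:
/* Minimal sketch (illustrative only) of software audio mixing: sum two
 * 16-bit PCM streams with saturation -- the inner loop a dedicated DSP
 * would otherwise handle in hardware. */
#include <stdint.h>
#include <stddef.h>

static void mix_pcm16(const int16_t *a, const int16_t *b,
                      int16_t *out, size_t samples)
{
    for (size_t i = 0; i < samples; i++) {
        int32_t s = (int32_t)a[i] + (int32_t)b[i];  /* widen to avoid overflow */
        if (s >  32767) s =  32767;                 /* clamp to the 16-bit range */
        if (s < -32768) s = -32768;
        out[i] = (int16_t)s;
    }
}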

nVidia has removed their much-applauded audio engine from their nForce chipset, and they have publicly stated that hardware-accelerated audio is dead.

I saw a demo where an ATI GPU was performing realistic fluid simulation as a smoke demo -- in the GPU itself.  A few years ago, only powerful, full-featured CPUs with floating point support could do stuff like that.
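
To give a rough idea of what that demo is doing (this is just a sketch of the math, not ATI's code), each cell of a density grid gets blended with its neighbors every frame -- exactly the kind of per-cell arithmetic a fragment shader runs in parallel.

Code:
/* Toy diffusion (blur) pass over a smoke density grid.  A GPU would run
 * this body once per cell in parallel; the grid size and the 0.25f factor
 * are made-up values for illustration. */
#define W 256
#define H 256

static void diffuse_step(const float src[H][W], float dst[H][W])
{
    for (int y = 1; y < H - 1; y++) {
        for (int x = 1; x < W - 1; x++) {
            float neighbors = src[y][x - 1] + src[y][x + 1] +
                              src[y - 1][x] + src[y + 1][x];
            dst[y][x] = src[y][x] + 0.25f * (neighbors - 4.0f * src[y][x]);
        }
    }
}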

Quote
Cymric:   System Configuration: Also, the OS would be completely configurable at start up

Is it really necessary to have the same hardware configuration for each of the tasks you specified?  Why not have a dedicated OS for each task?  Note that many OSes come in single processor or multiprocessor editions, depending on your hardware.  Making one OS that will work on both types of architectures, at least at this time, results in a major loss of efficiency for either type of system.

Quote
Cymric:  Not a bad idea, but AFAIK FPGAs are not fast enough to meet today's demanding specifications

My feelings, too.  It's unlikely that "Flash" memory will ever be used like RAM, simply because it is more complicated and will always be at a disadvantage.  Simpler parts are always cheaper and more plentiful than complex ones.

It would be nice if computers could mix many types of memory and prioritize them appropriately.  Who said you had to break things down into RAM and a hard drive?  Why is on-die memory only used for state caching?  Why can't we have high-speed RAM for calculations, cheaper years-old RAM for scratch work, flash RAM for result storage, and then page things out to the hard drive (with no filesystem) at leisure?  The virtual memory and filesystem models used in today's OSes are far too simple.
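
As a thought experiment (the tier names are made up -- no OS exposes anything like this today), the allocation policy could look something like this:

Code:
/* Hypothetical sketch of prioritized memory tiers: a policy function that
 * decides where a piece of data should live.  The tiers are invented names
 * for illustration only. */
#include <stddef.h>

enum mem_tier { TIER_FAST_RAM, TIER_BULK_RAM, TIER_FLASH, TIER_DISK };

static enum mem_tier pick_tier(size_t bytes, int is_hot, int is_persistent)
{
    if (is_hot)              return TIER_FAST_RAM;   /* live calculations   */
    if (is_persistent)       return TIER_FLASH;      /* result storage      */
    if (bytes > 1024 * 1024) return TIER_DISK;       /* page out at leisure */
    return TIER_BULK_RAM;                            /* scratch work        */
}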

Quote
Bring back the Video Toaster, and make it an integral part of the Amiga's standard hardware.

Many graphics cards have video capture built-in.  Making an application that can read standard video streams makes sense, but it's doubtful that every machine needs the hardware.

Then again, sound cards with line input were a rarity a few years ago, and now even budget machines have them.  When all TVs go digital in a few years, everything will be on a serial connection, so broadcast channels and video feeds will all be 2-way, and you won't even need dedicated video capture hardware anymore.  Death of S-Video?  Why not?

Quote
Bloodline:  My Gfx card and sound card are computers in their own right!

True.  A typical top-tier GPU has more transistors (by several million) and performs more calculations than the highest-spec Pentium!  A graphics card has its own memory and bus architecture, registers, caching... it's really a computer inside your computer -- complete with its own dedicated power feed and regulators.  :-)

Quote
Rogue:  You can't say, oh heck, two hours work gone, but hey, I can reboot in five seconds.

AROS?

Sorry, but it just crashes too damn much.

Quote
MiAmigo:  What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer.

EROS?

Which, by the way, has been discontinued.  That experimental OS didn't get anywhere.

Quote
MiAmigo:  I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.

Yes, as do many interface designers.  However, the only solution to many complex data organization problems is caching indexes, and for that, you need memory storage.  Solid-state computers will probably never really exist.  We'll have computers that hibernate with a battery feeding the memory, at best.

I remember using a text editor written in AMOS that wrote data to the floppy drive on a character basis.  Very reliable, but responsiveness was calamitous.

The idea of backing up memory to a storage device is a bit far-fetched, too.  Maintaining all the data integrity of each component in the system (such as the register state of your graphics sub-system) is unlikely, due to driver issues and non-standards compliance.  I've never owned a computer -- PC, Mac, or Linux -- that would always reliably wake up from sleep mode.  You can minimize boot time, but you can't get rid of it unless it's a purpose-built machine that only does a tiny number of things and requires very little storage.

"Always On" also implies that the hardware never changes.  That's fine for proprietary systems where all the chips are soldered together, but not realistic for open architecture, like the PC.  Otherwise, the machine must verify that the hardware has not changed every time it turns on, and that's where a lot of startup delays occur.  There was a time in history where new hardware would cause the machine to lock up and everything was configured with jumpers.  I don't think people want to go back to those dark days.

Don't forget the rule of resource, either.  If a resource is available, programmers will use it, whether they really need it or not.  Efficiency is too much to ask.  ;-)

Quote
Cymric:  ...But nobody is expecting them to run Doom III or Half Life 2 at 1600x1200x32 at 8xFSAA and over 80 Hz refresh rate either. Whether you need or want such insane specs is quite another question. (I would not, my monitor cannot cope, so why bother?)

Elegance directly conflicts with human nature.  Elegance is about necessity, but people want things they don't need.  Yeah, you don't HAVE to have a 3GHz machine to do certain things, but people will buy them anyway because they want them.  Servers demand elegance and efficiency, and so do dedicated devices like the microcontrollers in your microwave and home thermostat.  Such standards really cannot apply to PCs, hand-held phones, and other multifunction devices.  Hell, you can take photos with cell phones these days, and next will be movies, GPS, wireless banking, etc.

Kinda scary, really.

 

Waccoon
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #1 on: January 12, 2005, 10:27:35 AM »
Quote
Now that is a good idea. I wonder when it will become mainstream to include a line like RAM: 256/1024/128/lots MB in advertisements

Well, the problem is getting it all to sync up.  I suppose having multiple memory types on the FSB would be like comparing RISC to CISC.

But, I'm not going near that can of worms.  :-)
 

Waccoon
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #2 on: January 13, 2005, 05:09:38 AM »
Quote
Hammer:  Note, NVIDIA is currently revaluating SoundstormII i.e. to compete with the joint Intel/Dolby Lab’s Dolby Live initiative.

Interesting.  I'd still rather do everything in the CPU, but if it's built into the hardware, there's no software license fees, either.  :-)

Quote
Cymric:  And the Wildcats already have stereo support too! *Drool*. Now this is beginning to look interesting. Playing Doom III not on a flat screen, but with 3D goggles (or a good wide-screen projector) and force feedback harness. Killing demons will never be the same again...

Will Doom3 even work?  Drivers are massively different for high-end CGI boards compared to home GPUs.

Oh yeah, and don't those cards cost, like, $6,000?  :-)

Quote
The current technological limitations (many listed here in this thread) are merely obstacles to be overcome

Note that most "technological limitations" are merely design flaws in disguise.  That's why turning hardware into software is such a boon.  It pains me that virtual machine technology like Java has been so slow to catch on.  After having worked with AMOS, I thought compilers would be dead by now.  It really bothers me that until recently, pixel shaders on GPUs still had to be programmed in assembly.  Pixel shading is a technology that was definitely not ready for release, and it is still quite a mess.
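
For comparison, here's roughly what a simple pixel shader computes, written as plain C instead of GPU assembly (my own toy example, not real shader code): per-pixel diffuse lighting.

Code:
/* Toy example of the per-pixel work a simple shader does: diffuse (Lambert)
 * lighting.  A real shader runs this once per fragment, in parallel, on the
 * GPU; this is just the same math expressed in C. */
struct vec3 { float x, y, z; };

static float dot3(struct vec3 a, struct vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* normal and light_dir are assumed to be unit-length vectors */
static float shade_pixel(struct vec3 normal, struct vec3 light_dir)
{
    float n_dot_l = dot3(normal, light_dir);
    return n_dot_l > 0.0f ? n_dot_l : 0.0f;   /* ignore back-facing light */
}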

Quote
No problem is unsolvable

This also means that there is no "correct" solution.  Many existing solutions are poorly thought out and could be improved before we introduce newer, faster, more powerful technology.  More technology isn't always the answer.

Quote
How large (and hot) does a graphics accelerator have to get, before its too unwieldy?

As large as the market demands.  Pentiums are already unwieldy, and it's not because they're inefficient.  They're hot because people value speed over temperature and efficiency.  Efficient products don't always succeed, because there are other priorities, including marketing.

I cringe when people bring up the old VHS vs Betamax debate.  Yes, Beta was the superior technology, but VHS still won because the tapes were longer and more practical for things like selling commercial movies.  Commodity often wins over quality.

Quote
Another reason why better design is a requirement for continued evolution of these machines, or, it’s a scientific fact, they will reach of point where evolution is no longer possible.

Well, we regularly invent new forms of math to solve "impossible" problems.  Note that evolution depends on the conditions of the environment, so going faster isn't going to be the #1 goal of computing forever.