
Author Topic: Google Acquires Rights to 20 Year Usenet Archive  (Read 4443 times)


Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #14 from previous page: January 12, 2005, 10:10:44 AM »
Quote
Waccoon wrote:
It would be nice if computers could mix many types of memory and prioritize them appropriately.  Who said you had to break things down into RAM and a hard drive?  Why is on-die memory only used for state caching?  Why can't we have high-speed RAM for calculations, cheaper years-old RAM for scratch work, flash RAM for result storage, and then page things out to the hard drive (with no filesystem) at leisure?  Virtual Memory and filesystem models used in today's OSes are far too simple.

Now that is a good idea. I wonder when it will become mainstream to include a line like RAM: 256/1024/128/lots MB in advertisements ;-).
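Just to make the idea concrete, here is a minimal sketch in C of what such a tiered memory model might look like to a programmer. Everything here (the tier names, the tier_alloc call) is made up purely for illustration; no real OS exposes this API.

Code:
/* Hypothetical sketch of a "tiered" allocator, as mused about above.
   None of these names exist in any real OS; in this sketch every
   request just falls back to the ordinary heap. */
#include <stdio.h>
#include <stdlib.h>

enum mem_tier {
    TIER_ONDIE,    /* small, fastest: on-die memory for hot calculations */
    TIER_FAST_RAM, /* main high-speed RAM                                */
    TIER_OLD_RAM,  /* cheaper, years-old RAM for scratch work            */
    TIER_FLASH,    /* flash for result storage                           */
    TIER_DISK      /* raw paging area on the hard drive, no filesystem   */
};

/* Ask for memory with a *preferred* tier; a real OS would be free to
   fall back to a slower tier and migrate pages later at leisure. */
void *tier_alloc(size_t bytes, enum mem_tier preferred)
{
    printf("alloc %zu bytes, preferred tier %d\n", bytes, (int)preferred);
    return malloc(bytes);
}

int main(void)
{
    double *work    = tier_alloc(4096, TIER_FAST_RAM);   /* live calculations */
    char   *scratch = tier_alloc(1 << 20, TIER_OLD_RAM); /* scratch buffer    */
    char   *result  = tier_alloc(4096, TIER_FLASH);      /* keep after run    */

    free(result);
    free(scratch);
    free(work);
    return 0;
}

The point of the sketch is only that the programmer states a priority and the system decides where the bytes actually live, which is exactly the kind of decision today's simple RAM-plus-swap model never lets you express.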
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.
 

Offline Waccoon

  • Hero Member
  • *****
  • Join Date: Apr 2002
  • Posts: 1057
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #15 on: January 12, 2005, 10:27:35 AM »
Quote
Now that is a good idea. I wonder when it will become mainstream to include a line like RAM: 256/1024/128/lots MB in advertisements

Well, the problem is getting it all to sync up.  I suppose having multiple memory types on the FSB would be like comparing RISC to CISC.

But, I'm not going near that can of worms.  :-)
 

Offline MiAmigo (Topic starter)

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #16 on: January 12, 2005, 11:19:55 AM »
Well thought out post. And well written. And much too long to quote here!  :-o I'd just like to say briefly that what you see as limitations or roadblocks are actually the shortcomings of current technology. And those usually don't last long once they stand in the way of progress. How many innovations do we have today that were deemed impossible just a few years ago? Too many to list here. I maintain that each of these limitations will fall in its time, so that the computers of ten years from now will be full of 'stuff' that we, here, today, would call 'impossible!' if we could travel to the future to see them. And, of course, our machines will seem as big, slow, old and 'clunkety' as the Osborne portables of old.

 

Offline Hammer

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 1996
  • Country: 00
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #17 on: January 12, 2005, 11:41:46 AM »
Quote
The Processor Core: The Amiga would have stayed with its parallel processor core design. We always knew that one of the things that made this such a versatile and powerful machine was the fact that, even though main processor speed was relatively slow, much of the burdensome work of processing graphics and sound was done by not one, but two other chips, which when combined, made a very powerful factory for multimedia apps. So, the Amiga today would sport not one, but actually three equally powerful, equally advanced, state-of-the-art Motorola chips, each designed specifically for jobs in sound and graphics. Also, each of these powerful processors should have direct control and access to the system resources they require, including proprietary and system-wide RAM and transport buses. In other words, no cards!

NVIDIA's GeForce 256 GPU murders any current Motorola CPU when it comes to 3D acceleration.

Note that modern DX9 VPUs are massively parallel VLIW(1)/SIMD(2)/MIMD(2) pipelined cores. They are regarded as some of the fastest 'hidden' DSPs on the market (when loaded with BionicFX apps).

Examples
1. NVIDIA Geforce FX family.
2. NVIDIA Geforce 6x00 family and ATI Radeon Xx00 family.

The ATI Radeon X800 has a theoretical peak of 200+ GFLOPS, i.e. roughly equal to one Sony Cell. "Rage Max" is an ATI reserve technology for multi-VPU cores.

3DLabs' Wildcat Realizm 800 is quite dangerous, since this sucker has a theoretical ~700 GFLOPS (shame about the drivers).
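For anyone wondering where headline GFLOPS figures like these come from, here is a back-of-envelope sketch in C. The clock speed, pipeline count and ops-per-clock below are assumed round numbers chosen only for illustration, not the official specification of any particular card.

Code:
/* Back-of-envelope peak-GFLOPS arithmetic for a GPU, purely illustrative.
   All figures below are assumptions, not vendor specifications. */
#include <stdio.h>

int main(void)
{
    double clock_hz      = 500e6; /* assume a ~500 MHz core clock            */
    int    pixel_pipes   = 16;    /* assume 16 pixel pipelines               */
    int    flops_per_mad = 8;     /* a 4-wide vector multiply-add = 8 flops  */
    int    ops_per_clock = 2;     /* assume two co-issued vector ops/clock   */

    double peak = clock_hz * pixel_pipes * flops_per_mad * ops_per_clock;
    printf("theoretical peak: %.0f GFLOPS\n", peak / 1e9);
    /* 500e6 * 16 * 8 * 2 = 128 GFLOPS with these assumptions; add vertex
       engines and special-function units and the marketing numbers climb
       past 200 GFLOPS. Peak figures say nothing about sustained rates. */
    return 0;
}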
Amiga 1200 PiStorm32-Emu68-RPI 4B 4GB.
Ryzen 9 7900X, DDR5-6000 64 GB, RTX 4080 16 GB PC.
 

Offline Hammer

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 1996
  • Country: 00
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #18 on: January 12, 2005, 11:59:14 AM »
Quote
nVidia has removed their much-applauded audio engine from their nForce chipset, and they have publicly stated that hardware-accelerated audio is dead.

Note, NVIDIA is currently re-evaluating SoundStorm II, i.e. to compete with the joint Intel/Dolby Labs' Dolby Live initiative.

Quote
I saw a demo where an ATI GPU was performing realistic fluid simulation as a smoke demo -- in the GPU itself. A few years ago, only powerful, full-featured CPUs with floating point support could do stuff like that.

Particle physics and blood simulation on NV GPUs were nice.

GPU large caches such as ATI's HyperMemory and NV's TurboCache are just the beginning for large-cache-equipped GPU cores.

The next evolution for DX-class VPUs is Unified Shader Model 4.0, probably under a DirectX 9d moniker.
Amiga 1200 PiStorm32-Emu68-RPI 4B 4GB.
Ryzen 9 7900X, DDR5-6000 64 GB, RTX 4080 16 GB PC.
 

Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #19 on: January 12, 2005, 12:47:33 PM »
Quote
Hammer wrote:
3DLabs' Wildcat Realizm 800 is quite dangerous, since this sucker has a theoretical ~700 GFLOPS (shame about the drivers).

Holy smokes... Very, very impressive. A genuine wild kitty, that's for sure. I'm almost inclined to look for a 380 V three-phase socket in my home once computers are fitted with this purring furball :-). And not all that expensive either. Makes you wonder what you can get if you really spend some cash.

And the Wildcats already have stereo support too! *Drool*. Now this is beginning to look interesting. Playing Doom III not on a flat screen, but with 3D goggles (or a good wide-screen projector) and force feedback harness. Killing demons will never be the same again...
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.
 

Offline bloodline

  • Master Sock Abuser
  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 12114
    • http://www.troubled-mind.com
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #20 on: January 12, 2005, 01:06:10 PM »
Quote
GPU large caches such as ATI's HyperMemory and NV's TurboCache are just the beginning for large-cache-equipped GPU cores.


TurboCache means "using system memory" :-(

Offline MiAmigo (Topic starter)

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #21 on: January 12, 2005, 06:19:29 PM »
Quote
NVIDIA's GeForce 256 GPU murders any current Motorola CPU when it comes to 3D acceleration...

Could that possibly have anything to do with the fact that nVidia chips (and ATI's) have been constantly evolving, driven by the gaming market (by far the most powerful force behind the growth of any computer technology), while Motorola chips have not? It's been shown throughout our industry that what one company (or person) can do, so can another. I think that, given the market (and the reason) to push the envelope of their hardware development, Motorola (or any current chip manufacturer) could also boast similar technologies. In that sense, it's like comparing the current state of the machines (the Amiga, stuck in the past) and the PCs that host those chips and technologies directly to each other right now. It is, in effect, not a valid comparison. And who knows? Commodore, which held rights to CMOS technologies, may have developed its own custom chips.
 

Offline MiAmigo (Topic starter)

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #22 on: January 12, 2005, 07:07:46 PM »
Quote
Cymric wrote:
@MiAmigo:

I note that your emphasis has shifted from technical matters to matters of 'elegance'. Which is fine, but does not quite address the issue at hand. You claim PocketPCs and notebooks even get quite a lot of bang for the buck, and even run cooler too. True, very true. But nobody is expecting them to run Doom III or Half-Life 2 at 1600x1200x32 with 8xFSAA and over 80 Hz refresh rate either.

Not yet, but, given time and the nature of the market, they soon will. Just as notebooks have been 'forced' from the business arena into the gaming one. Why? Because people decided they wanted their notebooks to be as capable as their desktops in this area. Which demanded a better way to implement the technology ('more elegant'), a more efficient way of solving all the problems desktops had, to be able to port these games reliably to notebooks. Smaller devices, such as Pocket PCs, will follow in time. As a matter of fact, they already are; just look at what Sony's doing.
 

Offline MiAmigo (Topic starter)

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #23 on: January 12, 2005, 09:22:25 PM »
Sometimes there is too much of a willingness to gloss over, or completely dismiss, the concept behind the term 'elegance' when used to refer to current computer technologies. Maybe the term itself is misleading; perhaps 'efficiency' is a better word. In these posts, the Amiga has been used as an example of what computer technology could have become, simply because (a) this is an Amiga forum, and (b) in its time it so far eclipsed what then-current computers were doing that it makes a good candidate for 'what if' scenarios.

Throughout these discussions, I have taken a few basic tactical positions.

1). The current technological limitations (many listed here in this thread) are merely obstacles to be overcome, which is, rightfully, all they can be seen as, or we're stating that we are basically 'finished' when it comes to the evolution of computing machines. I, too, am impressed with the specs of current graphics accelerator technology, but I also understand that being too enamored of the achievements of the present can endanger, or at least severely hamper, whatever possibilities may exist for the future. The proper attitude should always be 'that's good, but how can it be done even better?'

2). Every technological advantage we enjoy and take for granted today was once seen as an impossible or insurmountable challenge. 'No problem is unsolvable' is the mantra that must always be adopted when pushing this envelope (be it graphics manipulation, power sources, memory and storage issues, or power problems) to the next level, or there simply will be no 'next level'.

3). As technology moves forward, it becomes more and more important to get these jobs done in the most efficient manner possible (read: elegant). Why? How large (and hot) does a graphics accelerator have to get before it's too unwieldy? Right now, they've almost doubled in size and require their own power supplies. Will CPUs follow suit? Even though hard drives now sport massive amounts of storage, their size hasn't really changed that much: a good thing. Hence, they still fit in current machines. But what about reliable storage solutions for smaller devices, which are the future? There is an entire host of them (CF cards, Secure Digital, USB devices, and of course notebook-sized hard drives), but none of them has, as of yet, approached the truly massive capacities, and reliability, that today's users and applications expect and demand. And there's the heat problem with them as well. Again, better designs, and new solutions to old, 'insurmountable' problems, are absolutely necessary. It could be that somehow combining these two technologies into an as-yet-unforeseen new hybrid will be the answer.

Another reason why better design is a requirement for the continued evolution of these machines: it's a plain fact of physics that they will otherwise reach a point where evolution is no longer possible. Example: ever fired a gun, shot a bow and arrow at a distant target, or even lifted an extremely heavy weight? What these things all have in common with computer design is an increasing need for accuracy and precision as magnitude, unit of measure, and quantity increase. Although it's very easy to hit a target that's only a few feet away, hitting that same target becomes extremely difficult when it's several dozen yards away, since the greater distance magnifies errors of aim and execution from insignificant ones into formidable ones. The slightest misjudgment, the merest shaking or movement of the bow (or gun), is greatly multiplied as the distance is traversed and the projectile travels farther and farther from the point of origination (see the rough numbers sketched below). (Anyone who has ever done any power lifting or bodybuilding knows the same holds true: errors of sloppy execution are greatly magnified as the weight increases, to the point where it becomes impossible to perform an extremely heavy lift without perfect technique.) These same laws and principles hold true, of course, for any increased amplitude of performance, such as speed and overall efficiency, when applied to machines as well, and computers, for all their glory, are nothing more than very precise machines. As they grow in power, that precision must become ever more accurate and efficient.
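To put rough numbers on the archery analogy: a fixed angular aiming error produces a lateral miss that grows roughly linearly with distance. The half-degree error and the distances in this little C illustration are arbitrary.

Code:
/* Error magnification with distance: the same small angular error
   becomes a large miss as the target moves away. Numbers are arbitrary. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI        = 3.14159265358979;
    const double error_deg = 0.5;                 /* half a degree of wobble */
    const double error_rad = error_deg * PI / 180.0;
    const double distances_m[] = { 1.0, 10.0, 50.0, 100.0 };

    for (int i = 0; i < 4; i++) {
        double miss = distances_m[i] * tan(error_rad); /* lateral miss */
        printf("%6.0f m -> miss by %.3f m\n", distances_m[i], miss);
    }
    /* The same half-degree error is about 9 mm at 1 m but almost 0.9 m
       at 100 m: magnitude magnifies imprecision. */
    return 0;
}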

So, at what point do current machines become so bloated that they can no longer (efficiently) support their own weight? Too noisy, too hot, too much storage required, too much power? These are all valid concerns, since poor solutions could easily move them out of the realm of practicality and affordability. A lot of these problems could be solved right now if better, more efficient ways were used to implement them. While it's true that sometimes "elegance directly conflicts with human nature", it's also equally true that everyone can generally 'have their cake and eat it too' with a little more innovation in the areas of better, more efficient technology implementation, instead of 'hack' solutions, i.e. 'just make it bigger, stick on another power socket, shove in more RAM, add more transistors'. There is a point of no return where not only will these solutions no longer be enough, they'll actually begin to 'trip over themselves', collapsing under their own weight and severely compromising the very systems they support. This is happening right now: a well-known fact to CPU designers at Intel and AMD, and to all hard drive manufacturers, who must constantly come up with new ways to 'shoe-horn' in more storage capacity. They all realize that soon they must 'hit the wall' and come up with a newer, more innovative, better way, or die. It's just another example of how increased quantities demand more and more accuracy of execution, and more efficient solutions.

 

Offline Waccoon

  • Hero Member
  • *****
  • Join Date: Apr 2002
  • Posts: 1057
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #24 on: January 13, 2005, 05:09:38 AM »
Quote
Hammer:  Note, NVIDIA is currently re-evaluating SoundStorm II, i.e. to compete with the joint Intel/Dolby Labs' Dolby Live initiative.

Interesting.  I'd still rather do everything in the CPU, but if it's built into the hardware, there are no software license fees, either.  :-)

Quote
Cymric:  And the Wildcats already have stereo support too! *Drool*. Now this is beginning to look interesting. Playing Doom III not on a flat screen, but with 3D goggles (or a good wide-screen projector) and force feedback harness. Killing demons will never be the same again...

Will Doom3 even work?  Drivers are massively different for high-end CGI boards compared to home GPUs.

Oh yeah, and don't those cards cost, like, $6,000?  :-)

Quote
The current technological limitations (many listed here in this thread) are merely obstacles to be overcome

Note that most "technological limitations" are merely design flaws in disguise.  That's why turning hardware into software is such a boon.  It pains me that virtual machine technology like Java has been so slow to catch on.  After having worked with AMOS, I thought compilers would be dead by now.  It really bothers me that until recently, pixel shaders on GPUs still had to be programmed in assembly.  Pixel shading is a technology that was definitely not ready for release, and is still quite a mess.
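For the curious, this is roughly the kind of per-pixel arithmetic a simple diffuse-lighting pixel shader performs, written out in plain C purely for illustration. A real shader of that era would be a few lines of GPU assembly or, more recently, HLSL/GLSL; the structure and names here are invented for the example.

Code:
/* What a very simple pixel shader effectively computes for each pixel:
   modulate the texture colour by an N.L diffuse lighting term.
   Plain C stand-in for illustration only. */
#include <stdio.h>

struct vec3 { float x, y, z; };

static float dot3(struct vec3 a, struct vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static struct vec3 shade_pixel(struct vec3 texel,
                               struct vec3 normal,
                               struct vec3 light_dir)
{
    float d = dot3(normal, light_dir);
    if (d < 0.0f) d = 0.0f;              /* clamp, like a saturate op */
    struct vec3 out = { texel.x * d, texel.y * d, texel.z * d };
    return out;
}

int main(void)
{
    struct vec3 texel  = { 0.8f, 0.4f, 0.2f };   /* sampled texture colour */
    struct vec3 normal = { 0.0f, 0.0f, 1.0f };   /* surface normal         */
    struct vec3 light  = { 0.0f, 0.0f, 1.0f };   /* direction to the light */
    struct vec3 c = shade_pixel(texel, normal, light);
    printf("shaded pixel: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}

Writing that sort of thing by hand in register-level assembly for every material in a game is exactly why the early shader situation felt so unready.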

Quote
No problem is unsolvable

This also means that there is no "correct" solution.  Many existing solutions are poorly thought out and could be improved before we introduce newer, faster, more powerful technology.  More technology isn't always the answer.

Quote
How large (and hot) does a graphics accelerator have to get, before its too unwieldy?

As much as the market demands.  Pentiums are already unwieldy, and it's not because they're inefficient.  They're hot because people value speed over temperature and efficiency.  Efficient products don't always succeed, because there are other priorities, including marketing.

I cringe when people bring up the old VHS vs. Betamax debate.  Yes, Beta was superior technology, but VHS still won since the tapes were longer and more practical, such as for selling commercial movies.  Commodity often wins over quality.

Quote
Another reason why better design is a requirement for continued evolution of these machines, or, it’s a scientific fact, they will reach of point where evolution is no longer possible.

Well, we regularly invent new forms of math to solve "impossible" problems.  Note that evolution depends on the conditions of the environment, so going faster isn't going to be the #1 goal of computing forever.