
Author Topic: Google Acquires Rights to 20 Year Usenet Archive  (Read 4435 times)


Offline MiAmigoTopic starter

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Google Acquires Rights to 20 Year Usenet Archive
« on: January 11, 2005, 01:13:40 AM »
Check this out.

Using it, I found this old post I wrote some years ago (April, 2000) in an Amiga forum.

(DISCLAIMER: Since this posting is a 'bit' long, and dated, "it does not necessarily STILL represent the opinions of me, or the other guy mentioned in its pages.")

The Next Phase:

Amiga Rebirth, Resurrection, and Development Into the next Millennium

        The Amiga's hardware and architecture are what made it
different, and hence, vastly superior to the PCs of its day. Back when
PCs could only 'beep' and 'boop', with hardware best described as
'clunky', the Amiga (and even its ideological and technological
predecessor, the Commodore 64) could already do the most amazing
things in graphics and sound. One product of this amazing
technological leap forward was the Video Toaster, which was, at one
time, (with the vast powers of the Amiga at its core) the industry
standard for producing much of the video broadcast content of many
small and large in-house studio graphics departments.
        Because of its corporate woes, managerial ineptitudes, and
just plain "ball-fumbling," Commodore, shortly after snapping up the
'Lorraine' (and with precious few years of development under its
belt), was unable to bring this machine to the full fruition of its
latent powers. And sadly, during its struggles in the limbo of that
unrealized potential, the lowly PC (once easily considered with some
contempt by Amiga owners as the lobotomized love-child of low-tech
'first cousins') had evolved to what it is today, something somewhere
between a technological wonder and a logistical nightmare.
        So where would (or should) the Amiga be now if it, too, had
had the chance to grow and evolve with the technological advances in
board and chip design? Well, a couple of friends got together for a
long weekend of driving (from Chicago, Illinois to Centsible Software
in Berrien Springs, Michigan) and palaver, and came up with the
following scenario:

 The Processor Core: The Amiga would have stayed with its
parallel processor core design. We always knew that one of the things
that made this such a versatile and powerful machine was the fact
that, even though main processor speed was relatively slow, much of
the burdensome work of processing graphics and sound was done by not
one, but two other chips, which, when combined, made a very powerful
factory for multimedia apps. So, the Amiga today would sport not one
but three equally powerful, equally advanced, state-of-the-art
Motorola chips, each designed specifically for jobs in sound and
graphics. Also, each of these powerful processors would have direct
control of and access to the system resources it requires, including
proprietary and system-wide RAM and transport buses. In other words,
no cards!

        The OS: Currently, each of us is a PC user, having one by one
(and out of sad necessity) given up on the platform that at first
seemed a gift from the god of technology, and eventually ended up its
orphaned stepchild. But one thing we all remember is this: the Amiga
was always 'ready to go'. Boot-up time was inconsequential, as most of
the Operating System seemed to be hard-wired into the machine! (This
is amazing, in light of today's boot-up times, which can take five
minutes or more, depending on the configuration of the hardware and
software.) Today's Amiga would have most of its OS on a PCMCIA card
(or an array of such cards). It would still be completely upgradeable,
by 'flashing' either patches or entire rewrites, and many, many times
more accessible at system start. We think that a Linux OS would be
very well suited to this type of adaptation. So much for the GUI
aspect of the OS. For programmers and developers, we also envisioned a
return to the command line interface, with a fully realized DOS-like
(or UNIX-like) command structure.

        System Configuration: Also, the OS would be completely
configurable at start-up. What does the user specialize in? Sound
apps? Graphics? Multimedia? Games? Weather pattern predictions, or
other 'super-computer' sims? The user would be able to choose, from
the Boot-Up Menu, the optimal configuration for the job at hand,
with the operating system then customized for that work, reserving
all system resources for what's being done and used at the time, and
nothing else. This would not only bring more sys resources to bear on
the job; it should ideally result in more stability, and less downtime
from crashes and hang-ups. One way we discussed this might be
achieved was the use of a 'partitioned' operating system, one that
could easily and readily step between different modes, each for a
different job, like re-booting with a new configuration, but 'on the
fly', without any loss of data and no downtime. (Sort of like the
way programs pass parameters to subroutines in a chained program, or
an overlay.)

        Open Source: One of the great ideas of today would certainly
not only work on this new Amiga; it actually may have had its start in
those early days of computing, when, more likely than not, a user was
also a programmer. After all, back then, when there were no 'Windows'
and no vast software support, many users became (out of either
interest, necessity, or both) programmers. Those who didn't go that
far were at least able to 'type in' programs from magazines. This, in
turn, led not only to an intuitively better understanding of the way
computers and their components worked, but also gave some insight as
to how to 'tweak' them. Therefore, in the spirit of those early days,
'open source code accessibility' is something that should definitely
be a part of this newly evolved machine. The Amiga, at its best, was
not only a user's platform, but a developer's and programmer's as
well. (By contrast, many Windows users today are just that, 'users',
who have no knowledge of how their machines work. How many people
drive cars, and yet have no knowledge of how a car works, or how to
maintain one? This aptly describes the typical Windows user.)

        Storage Capacity: Back to hardware. As multimedia machines of
the first order (and the first generation), the new Amigas would have
to have vast storage capacity, on the order of two to three times that
of today's PCs. This could be achieved several ways, one of the most
logical and accessible being a RAID array, with some
modifications. Hard drives should be fully hot-swappable, and arrayed
in multiples. The Amiga hardware could include a bare-bones (or empty)
drive array rack, which the user could then build upon to meet storage
needs. Today's prices in storage solutions easily support such a
system. Eventual capacities of 50 to 100 GIG are conceivable at
moderate comparable cost.

        RAM: As far as RAM is concerned, more is always better. But
how much does one invest in RAM before it becomes unwise in terms of
cost versus projected time of use? One would hesitate to put one GIG
of RAM into a machine, when cost and the time until the next machine
upgrade are taken into consideration. One of the problems plaguing PC
users today is that, at some point, RAM evolves beyond a point where
it can be used by motherboards of a certain age (make and
manufacture). Most would balk at spending top dollar for lots and lots
of RAM, and then not being able to use it in the next machine.
Therefore, RAM for the new Amiga should be of a new type, upgradeable
by 'motherboard flashing', or a 'governor chip' acting as a
RAM controller and, ideally, extending the life of these chips
considerably by making them as variable and upgradeable by software
and firmware as other elements of the computer. For that matter,
since the new Amiga's sound and graphics would both be handled by
proprietary processors, they, too, should be, to some extent,
upgradeable in this manner.

        Secondary storage: There are, and will be, many, many options
for this type of storage. Floppies of varying speeds and capacities,
ZIP disks, and backup storage devices (tape) all clamor for support
and space in today's machines. The new Amiga would be no different.
However, it could be different in how it approaches the matter.
Instead of being designed for one type (or a couple) of these
secondary storage devices, the Amiga would sport a new, proprietary
specification: a controller device (a secondary backup 'manager')
which could be configured by software, and a multiple-array docking
station, fully user-upgradeable outside the case (or at least easily
accessible), to handle not only each of these devices, but also any
which may come along. This, added to the 'hot-swappable' (see System
Configuration, above) OS, could give the Amiga much more flexibility
than today's typical PC machines. (As far as CDs and DVDs go, it goes
without saying that the Amiga would definitely support these devices,
and other optical media, with the same type of infrastructure, if so
desired by the manufacturer or user.)

        The Motherboard: A few words. The motherboard for the newly
evolved Amiga would actually be somewhat less capable than
motherboards currently in use in today's PCs. Why? By off-loading much
of the stifling specifications from the mainboard to the
processor-chips, (which would be removable and hence, fully
upgradeable) its use would be extended beyond that of today's systems.
Therefore, increases in bus speed, AGP specs, and over-clocking
tolerances could conceivably be lessened, if not completely dismissed,
since it would be the chips that need upgrading and replacing, and not
the entire motherboard.

        Multimedia Apps & Closing: Bring back the Video Toaster, and
make it an integral part of the Amiga's standard hardware. As with
other add-ons, it could come in different configurations, or none at
all, according to what the user would need, and be willing to pay for.
But it should definitely become a basic part of the machine, like the
sound chip, and the video chip.
        Well, that's about it. That was a long day of driving, and
many, many other ideas were discussed, brainstormed, and tossed
about. As with all such sessions, we did not limit ourselves with such
pesky concerns as what is 'possible' or 'feasible', or what it would
'cost'. Such thoughts only kill good ideas, and oftentimes create a
negative atmosphere in which such ruminations may never see the light
of day. Sometimes, 'not possible' really only means no one has thought
of a way of making a thing work. Having said all that, we hereby
present the core of our talk. If anyone's taken the time to read this,
we are grateful, for it is more than we could ask or hope for.
Curtis M. Harrell, Jr. & Sterling Hankins.

 8-)
 

Offline neofree

  • Sr. Member
  • ****
  • Join Date: Dec 2002
  • Posts: 467
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #1 on: January 11, 2005, 01:28:37 AM »
AFAIK google got archives back to the original Usenet post a few years ago...
 

Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #2 on: January 11, 2005, 08:51:04 AM »
That's a weird computer you're describing, even for the year 2000. I can see the need for a dedicated graphics processor, but a dedicated sound processor... Hrm. No cards is of course a Bad Thing, unless you want your users to be locked in and not be able to upgrade properly or replace broken parts. (I know, card replacement doesn't happen very often.)

Boot time is inconsequential. It never ceases to amaze me that people can see this as a plus. I would rather have a system which does not need to boot (read: reboot) at all: it's always on, doesn't crash, and only needs to be shut down in case of severe hardware problems. Quick reboots lead to sloppy programming and sloppy use: I saw this in an OS4 video presentation. (A PDF reader was not very responsive. Ugh, what is causing it to be so slow... Dunno. Let's reboot and restart. Bad!)

OS on PCMCIA... That would be a USB stick nowadays, but the idea is nice. Although I'm not sure what you would gain with it unless you need to perform system rescues/installs often.

System configuration: nice idea, not necessary in practice because the requirements of many programs are not that demanding on the system. A good OS always maximises availability of resources to the applications, there is simply no need to hold things in reserve. What would be useful is easy automatic/manual tweaking of the task scheduler, but you don't need a partitioned OS for that.

Storage capacity: overkill. The group of people who use their machine for video editing is small, so having a rack as standard is not necessary. But a version of the hardware which incorporates RAID is definitely something to be considered.

RAM: aha, you're thinking about FPGAs for graphics, sound and bus controller (or something equivalent). Not a bad idea, but AFAIK FPGAs are not fast enough to meet today's demanding specifications. (They sacrifice speed for generality.)

Secondary storage: forget proprietary systems. Way too expensive, and you still end up building interfaces for the regular devices which could have easily been moved to the main board. The new Amiga would only offer USB and Firewire, and do away with all those ancient RS-232 and Centronics connectors.

Mobo: I don't think you realised how important the electrical infrastructure of the mobo is. You cannot simply upgrade the relevant chips and be done with it: simply look at the incredible data transport speed which now occurs in mobos. That is not doable with older hardware simply because the specifications don't allow it. And then it doesn't matter whether certain core chips are removable or not.

Video Toaster: make that an add on for those willing to pay for it. I don't intend to do video with my computer, ever---it simply doesn't interest me. So why should I pay for it?


It would have made an interesting device, but still one which would not be competitive.
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.
 

Offline MiAmigoTopic starter

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #3 on: January 11, 2005, 09:40:24 AM »
Quote: That's a weird computer you're describing, even for the year 2000. I can see the need for a dedicated graphics processor, but a dedicated sound processor... Hrm. No cards is of course a Bad Thing, unless you want your users to be locked in and not be able to upgrade properly or replace broken parts. (I know, card replacement doesn't happen very often...
__________________________________________________________________________________________________________________________________________

Well, it has been a few years since I wrote that, and some of my ideas 'may' have changed, or maybe not. And five years in the development of technology is a really long time, as we all know. Still, I do remember the frame of mind I was in after our visit to Centsible. And in many ways, I still feel the same way. The original Amiga was a truly revolutionary machine, which used almost none of the then-current solutions for designing and building computers. In that same sense, I was willing, in my 'pretend' design stage, to knowingly leave current design convention behind as well, and attempt to find newer, more innovative ways of creating a powerful machine.

I have a lot of problems with the way typical PCs do almost everything they do today, from graphics, to sound, to motherboard traffic. Such machines are (overly) large and powerful because they aren't very elegantly designed, or realized, so they almost have to be. I don't know if it's because clunky OSes demand hardware on steroids, or if it's the other way around – it's probably a little bit of both. But just because nVidia decided that 'this' is the way to do graphics doesn't mean it's the right way, or the only way. (You can lift a boulder with a crane that costs tens of thousands of dollars, or you can lift it with a lever that costs next to nothing.) Speaking again of current graphics accelerators, look at what the result is: a card with more memory than whole computers had 5 years ago, gobbling up power, generating heat (and burning money!), and manipulating huge amounts of texture data, at the cost of overall system performance. Not very efficient, or elegant, and extremely demanding on the wallet and the resources of the machine. But this technology evolved out of the PC's way of doing things, back when the Amiga already had a better way. What would we have now, if that 'better way' had been allowed to evolve? If the 'video card' were a true, full-blown video processor, for instance, it could use complex calculations to create and modify graphics ('micro-fractals'?), instead of storing or sampling any raw data. Still, 'it is beyond the scope of this post' (always wanted to say that!) to fully go into all the intricacies of how such graphics accelerators work, but, needless to say, the way they work is probably not the be-all and end-all of solutions, nor the most efficient.

The same could be said for the motherboard; I'm sure we haven't seen the end of their evolution, or the best solutions to many of the challenges faced by mobo designers.  In this instance, in the interest of the goal of finding a better, more revolutionary way, I would again fully advocate throwing out both the bath water AND the baby, and starting over from scratch. After all, isn't that what they did back when the Amiga was originally conceived?

P.S. I still think the 'no cards' idea is also a good one. The idea of a processor suite does not necessarily mean that they (the CPUs) have to be hard-wired to the board. They could just as easily be removed and upgraded at any time. And the evolution of technological devices, which always tends towards smaller, not larger (pocket PCs, phones and notebooks?), would almost ensure that, at some point, we'd have to drop the big, clunky cards for something a lot more modular.
 

Offline bloodline

  • Master Sock Abuser
  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 12114
    • http://www.troubled-mind.com
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #4 on: January 11, 2005, 10:09:21 AM »
Modern PCs are the technological evolution of the Amiga idea.

My Gfx card and sound card are computers in their own right!

Offline MiAmigoTopic starter

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #5 on: January 11, 2005, 10:17:20 AM »
Quote

bloodline wrote:
Modern PCs are the technological evolution of the Amiga idea.

My Gfx card and sound card are computers in their own right!


I would tend to disagree. Even though I, too, use these 'newfangled' PC cards, I would compare them, in this way, to what I feel the Amiga would have/could evolve into: basically, the difference between Bruce Lee (the Amiga) and Hulk Hogan (the PC).
 

Offline bloodline

  • Master Sock Abuser
  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 12114
    • http://www.troubled-mind.com
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #6 on: January 11, 2005, 10:38:38 AM »
Quote

MiAmigo wrote:
Quote

bloodline wrote:
Modern PCs are the technological evolution of the Amiga idea.

My Gfx card and sound card are computers in their own right!


I would tend to disagree. Even though I, too, use these 'newfangled' PC cards, I would compare them, in this way, to what I feel the Amiga would have/could evolve into: basically, the difference between Bruce Lee (the Amiga) and Hulk Hogan (the PC).


The problem is that the Amiga's hardware addressed the graphics and audio problems of its time... It didn't do anything that special, but it did do what it did for an extremely low price.

The graphics of the era centred around moving large rectangular blocks at 50fps, at about the resolution of a TV. They had to do this using the least amount of memory, as memory was very expensive... and minimal CPU intervention was required, as CPUs weren't very fast and weren't that good at shifting around large chunks of data.
The solution was to use a blitter chip (to move the blocks of graphics without the CPU), sprites (for fast-moving gfx) and planar graphics (to allow scalable colour depths in low-memory situations). The Amiga also had the Copper (with various memory-saving uses) and HAM mode, which allows more colours without extra memory load too.

Modern graphics revolve around generating extremely high-definition 3D environments, and they have to do this between 60 and 120 times every second. They do not have to worry about memory (it's cheap!), so the solution is to build a 3D graphics processor with blitter/2D acceleration onto a card with its own memory.

Unless a new game is invented that requires something other than 2D or 3D graphics, the modern gfx board is the only solution.

Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #7 on: January 11, 2005, 10:50:50 AM »
This is sidestepping a little, but I think you're placing too much faith in graphics alternatives. There is a reason why we are manipulating textures, bump maps and working our asses off to get affordable and quick lighting in a chip, and that is simply because the alternatives are even more demanding and expensive. Do you have any idea what it takes to render a decent-looking 3D scene? In other words, have you ever written out the math (traditional, fractal or otherwise)? Because I have a hunch you haven't.

Second, blaming the PC (graphics) architecture for turning out the way it did 'even when the Amiga already had a better way' is really nonsense. The Amiga showed the power of a coprocessor doing video or sound work, and unless I miss my guess, that's exactly what nVidia, ATi, Matrox, 3dfx, Creative, Terratec, and all the others were/are/will be doing. In fact, they make exchangeable coprocessors. On cards, true, but exchangeable nonetheless. So please, enlighten me, what is the 'better way of the Amiga'?
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.
 

Offline MiAmigoTopic starter

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #8 on: January 11, 2005, 06:39:18 PM »
Quote

Cymric wrote:
This is sidestepping a little, but I think you're placing too much faith in graphics alternatives. There is a reason why we are manipulating textures, bump maps and working our asses off to get affordable and quick lighting in a chip, and that is simply because the alternatives are even more demanding and expensive. Do you have any idea what it takes to render a decent-looking 3D scene? In other words, have you ever written out the math (traditional, fractal or otherwise)? Because I have a hunch you haven't.

Second, blaming the PC (graphics) architecture for turning out the way it did 'even when the Amiga already had a better way' is really nonsense. The Amiga showed the power of a coprocessor doing video or sound work, and unless I miss my guess, that's exactly what nVidia, ATi, Matrox, 3dfx, Creative, Terratec, and all the others were/are/will be doing. In fact, they make exchangeable coprocessors. On cards, true, but exchangeable nonetheless. So please, enlighten me, what is the 'better way of the Amiga'?


Well, actually...

No, I've never done any of the 'actual' math for scene rendering of any type, and I doubt many of us have. But I have written my own games, and I do have some idea of what it takes to make things move as fluidly as possible. (From a programmer's point of view, and simply put, I would say lower-level languages over the higher ones, the lower the better, with pure machine code being the most obvious, though also most cumbersome, choice.) And yes, these modern cards do use onboard co-processors to do lots of the processing work, but, and I repeat this again, at great cost system-wise, money-wise, and in terms of the need for more and more heat dissipation. Look at pocket PCs: for their size, use and cost, they have sound and graphics that are far superior to desktops of a few years ago. And, depending on type and cost of model, they can handle phone calls, play games, perform a host of other computing tasks, and do a pretty sophisticated job of it, all with no cards and not much heat generation. And, of course, their cost is going down all the time. The same, to a greater degree, can be said for today's notebooks. So, it is possible to be more conservative of system requirements, and still have a robust system, if there's both a need and desire for it.

Necessity IS the mother of invention, and can lead to greater innovation, as it did in the Amiga's time. True, system resources for graphics and sound were minimal to nonexistent, memory was expensive, as was storage, etc., etc. But the end result was a capable, powerful machine that managed to 'do without'. Had the machine been allowed to evolve and survive, operating along those same design principles, the face of computer technological development would be very, very different today. But it's always easier to just throw a lot of money, superfluous hardware and bloated code at a problem, rather than solve it elegantly. There's that word again. 'Elegant' is a term old programmers used to describe a job, hardware, or programming, that was extremely tight, small, and well done, with as few resources as possible. I defy anyone using PCs today (mostly Wintel-based) to find anything 'non-bloated', or elegant, about how those machines operate. I would say this goes for every aspect of their operation, from the software solutions to the hardware ones as well. In the constant rush, down through the years, to get (buggy) code and system-hungry hardware to the market, we have evolved computer systems that are as bloated, inefficient, and money-hungry as, perhaps, the society at large that created them.

As for what the better way of the Amiga is? I would say, rather, 'what that better way could have been, or should be'. Sadly, we may never know. It's probably a bad idea to try to resurrect the machine; many have tried, and few if any have succeeded. It could be that some of the things I'd like to see will eventually come about from the constant evolution of the smaller devices; it's been a historical fact that innovation in these areas often 'trickles up' to the larger devices, after time has proven their worth. As for enlightening 'us' on that better way, sir, I could do that for hours and hours and hours, but who would listen? Not many are open to ideas as words and words alone; most people, myself included, would like to eventually see something in hardware, before long, to prove or disprove ideas. But the best ideas often start as words, a coin this medium is most suited for, so like everyone else, and for now, it's all I have.
 

Offline Rogue

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 566
    • http://www.hyperion-entertainment.com
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #9 on: January 11, 2005, 08:54:32 PM »
Quote
Quick reboots lead to sloppy programming and sloppy use:


Sorry, but I think that's nonsense. Reboot time is completely irrelevant if your data just got blown away by a crash. You can't say, oh heck, two hours work gone, but hey, I can reboot in five seconds.
Look out, I've got a gun
 

Offline MiAmigoTopic starter

  • Arbiter of Succession
  • Sr. Member
  • ****
  • Join Date: Dec 2004
  • Posts: 391
  • Country: us
  • Thanked: 1 times
  • Gender: Male
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #10 on: January 12, 2005, 01:07:18 AM »
Quote

Rogue wrote:
Quote
Quick reboots lead to sloppy programming and sloppy use:


Sorry, but I think that's nonsense. Reboot time is completely irrelevant if your data just got blown away by a crash. You can't say, oh heck, two hours work gone, but hey, I can reboot in five seconds.


What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer, which is always 'on' and ready to go, and which only needs to be reset in the event of an upgrade, a service call, or a catastrophe so profound that resetting is the only way to get it working again. I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.
 

Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #11 on: January 12, 2005, 08:15:24 AM »
@MiAmigo:

I note that your emphasis has shifted from technical matters to matters of 'elegance'. Which is fine, but does not quite address the issue at hand. You claim PocketPCs and notebooks get quite a lot of bang for the buck, and even run cooler too. True, very true. But nobody is expecting them to run Doom III or Half-Life 2 at 1600x1200x32 with 8xFSAA and over 80 Hz refresh rate either. Whether you need or want such insane specs is quite another question. (I would not; my monitor cannot cope, so why bother?)

In addition there is always the tendency of people to buy 'for the future': buy a too-fast card today so it can be normal spec tomorrow and, while not perfect, adequate the day after. It takes a strong will to buy a lower-spec model just because you know you won't need anything faster, when you can just as easily buy the faster model! 'Elegant' and tight solutions are much harder to upgrade: I have yet to see people upgrade their PocketPC or notebook. Instead they sell off the old one and buy a newer model. The Amiga, 'elegant' as it was, proved to be a {bleep} to upgrade, as many A500 owners found out to their detriment as magazines began to show all the benefits of faster machines. (A CPU upgrade usually caused problems with the caches or conflicted with stock standard hardware on the expansion port, a Kickstart upgrade messed up games and demos, and if you wanted different graphics or sound, well, you would be very much stuck.)

With that all in mind, and going back to 'hot' and 'noisy' nVidias, I am not at all sure whether you could do much better than is the case now, given, of course, our current state of technology, and the fact that it is a consumer product, so cannot be overly expensive. That is elegant, in a way, but of course not the 'elegant' you had in mind.
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.
 

Offline bloodline

  • Master Sock Abuser
  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 12114
    • Show only replies by bloodline
    • http://www.troubled-mind.com
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #12 on: January 12, 2005, 08:39:59 AM »
Quote

MiAmigo wrote:
Quote

Rogue wrote:
Quote
Quick reboots lead to sloppy programming and sloppy use:


Sorry, but I think that's nonsense. Reboot time is completely irrelevant if your data just got blown away by a crash. You can't say, oh heck, two hours' work gone, but hey, I can reboot in five seconds.


What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer, which is always 'on' and ready to go, and which only needs to be reset in the event of an upgrade, service call, or catastrophe so profound, the machine needs to be reset to work. I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.


Ahhh, you'll be wanting XP then! :-D

Offline Waccoon

  • Hero Member
  • *****
  • Join Date: Apr 2002
  • Posts: 1057
    • Show only replies by Waccoon
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #13 on: January 12, 2005, 09:53:11 AM »
Quote
MiAmigo:  So, the Amiga today would sport not one, but, actually three equally powerful, equally advanced, state of the art Motorola chips, each designed specifically for jobs in sound, and graphics

Refer to Sony's "Grid" architecture.  Data processing, graphics, and audio all require much the same instruction set.  The reason we have dedicated chips for that work is that they are hard-wired to do the same instructions over and over, without fumbling with complex decoders.

GPUs really are complete computers, but they are still used as "accelerators."  Why not have three CPU cores in a single chip, or ten separate coprocessors?  It's not the complexity of the calculations you want done that matters, it's just how fast you need them done, at a given cost.

Old arcade machines didn't have dedicated CGI boards.  They just had a half-dozen ordinary CPUs:  68K, Hitachi, Texas Instruments, AT&T, and so on.  My favorite arcade game, Hard Drivin', had a very demanding graphics and audio system, but was driven by several standard CPUs -- one 68K to do the physics and run the game, and two Texas Instruments processors running in parallel to do the polygon calculations.  Sega machines are infamous for using 68K chips for audio mixing and effects instead of dedicated DSPs.  You had more programming flexibility that way, instead of being restricted by the limited capabilities of a hard-wired DSP.

Flexibility sometimes wins out over speed.  Dedicated sound chips are being phased out in favor of CPU mixing.  Audio effects on a Pentium 4 are easier to implement and far more capable than the effects possible with a standard EAX-compliant chip, like the EMU10K found on a typical Sound Blaster Audigy card.
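The core of the "CPU mixing" mentioned above can be sketched in a few lines: summing sample streams and saturating the result, which is exactly the kind of repetitive work that used to be farmed out to a DSP. This is a toy illustration, not the algorithm of any particular driver or sound card:

```python
def mix_16bit(a, b):
    """Mix two lists of signed 16-bit samples with saturation."""
    out = []
    for sa, sb in zip(a, b):
        s = sa + sb
        # Saturate instead of letting the sum wrap around,
        # which would produce harsh clicks in the audio.
        s = max(-32768, min(32767, s))
        out.append(s)
    return out

voice1 = [1000, -2000, 30000, -30000]
voice2 = [500, -500, 10000, -10000]
print(mix_16bit(voice1, voice2))  # [1500, -2500, 32767, -32768]
```

In a real software mixer this loop would run over buffers of thousands of samples per callback, which is trivial work for a modern general-purpose CPU.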

nVidia has removed their much-applauded audio engine from their nForce chipset, and they have publicly stated that hardware-accelerated audio is dead.

I saw a demo where an ATI GPU was performing realistic fluid simulation as a smoke demo -- in the GPU itself.  A few years ago, only powerful, full-featured CPUs with floating point support could do stuff like that.

Quote
Cymric:   System Configuration: Also, the OS would be completely configurable at start up

Is it really necessary to have the same hardware configuration for each of the tasks you specified?  Why not have a dedicated OS for each task?  Note that many OSes come in single-processor and multiprocessor editions, depending on your hardware.  Making one OS that works on both types of architecture, at least at this time, results in a major loss of efficiency for one system or the other.

Quote
Cymric:  Not a bad idea, but AFAIK FPGAs are not fast enough to meet today's demanding specifications

My feelings, too.  It's unlikely that "Flash" memory will ever be used like RAM, simply because it is more complicated and will always be at a disadvantage.  Simpler is always cheaper and more plentiful than complex.

It would be nice if computers could mix many types of memory and prioritize them appropriately.  Who said you had to break things down into RAM and a hard drive?  Why is on-die memory only used for state caching?  Why can't we have high-speed RAM for calculations, cheaper years-old RAM for scratch work, flash RAM for result storage, and then page things out to the hard drive (with no filesystem) at leisure?  Virtual Memory and filesystem models used in today's OSes are far too simple.
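The tier-mixing idea above can be sketched as a toy allocator: each request lands in the fastest tier with free capacity and spills down to slower tiers as they fill. The tier names and sizes here are invented purely for illustration, not taken from any real system:

```python
class TieredMemory:
    """Toy allocator that places requests in the fastest tier that fits."""

    def __init__(self):
        # (name, capacity in MB), fastest tier first -- invented numbers.
        self.tiers = [("on-die", 8), ("fast-ram", 256),
                      ("old-ram", 1024), ("disk", 10**6)]
        self.used = {name: 0 for name, _ in self.tiers}

    def allocate(self, size_mb):
        """Return the name of the fastest tier that can hold the request."""
        for name, cap in self.tiers:
            if self.used[name] + size_mb <= cap:
                self.used[name] += size_mb
                return name
        raise MemoryError("no tier can hold the request")

mem = TieredMemory()
print(mem.allocate(4))  # fits in the on-die tier
print(mem.allocate(6))  # on-die is now too full, spills to fast-ram
```

A real implementation would also migrate cold data downward and hot data upward, which is where the "prioritize them appropriately" part gets hard.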

Quote
Bring back the Video Toaster, and make it an integral part of the Amiga's standard hardware.

Many graphics cards have video capture built-in.  Making an application that can read standard video streams makes sense, but it's doubtful that every machine needs the hardware.

Then again, sound cards with line input were a rarity a few years ago, and now even budget machines have them.  When all TVs go digital in a few years, everything will be on a serial connection, so broadcast channels and video feeds will all be two-way, and you won't even need dedicated video capture hardware anymore.  Death of S-Video?  Why not?

Quote
Bloodline:  My Gfx card and sound card are computers in their own right!

True.  A typical top-tier GPU has more transistors (by several million) and performs more calculations than the highest-spec Pentium!  A graphics card has its own memory and bus architecture, registers, caching... it's really a computer inside your computer -- complete with its own dedicated power feed and regulators.  :-)

Quote
Rogue:  You can't say, oh heck, two hours work gone, but hey, I can reboot in five seconds.

AROS?

Sorry, but it just crashes too damn much.

Quote
MiAmigo:  What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer.

EROS?

Which, by the way, has been discontinued.  That experimental OS didn't get anywhere.

Quote
MiAmigo:  I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.

Yes, as do many interface designers.  However, the only solution to many complex data organization problems is caching indexes, and for that, you need memory storage.  Solid-state computers will probably never really exist.  We'll have computers that hibernate with a battery feeding the memory, at best.

I remember using a text editor written in AMOS that wrote data to the floppy drive on a character basis.  Very reliable, but responsiveness was calamitous.

The idea of backing up memory to a storage device is a bit far-fetched, too.  Maintaining all the data integrity of each component in the system (such as the register state of your graphics sub-system) is unlikely, due to driver issues and non-standards compliance.  I've never owned a computer -- PC, Mac, or Linux -- that would always reliably wake up from sleep mode.  You can minimize boot time, but you can't get rid of it unless it's a purpose-built machine that only does a tiny number of things and requires very little storage.

"Always On" also implies that the hardware never changes.  That's fine for proprietary systems where all the chips are soldered together, but not realistic for an open architecture like the PC.  Otherwise, the machine must verify that the hardware has not changed every time it turns on, and that's where a lot of startup delays occur.  There was a time when new hardware would cause the machine to lock up and everything was configured with jumpers.  I don't think people want to go back to those dark days.

Don't forget the rule of resource, either.  If a resource is available, programmers will use it, whether they really need it or not.  Efficiency is too much to ask.  ;-)

Quote
Cymric:  ...But nobody is expecting them to run Doom III or Half Life 2 at 1600x1200x32 at 8xFSAA and over 80 Hz refresh rate either. Whether you need or want such insane specs is quite another question. (I would not, my monitor cannot cope, so why bother?)

Elegance directly conflicts with human nature.  Elegance is born of necessity, but people want things they don't need.  Yeah, you don't HAVE to have a 3GHz machine to do certain things, but people will buy one anyway because they want it.  Servers demand elegance and efficiency, and so do dedicated devices like the microcontrollers in your microwave and home thermostat.  Such standards really cannot apply to PCs, hand-held phones, and other multifunction devices.  Hell, you can take photos with cell phones these days, and next will be movies, GPS, wireless banking, etc.

Kinda scary, really.

 

Offline Cymric

  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 1031
    • Show only replies by Cymric
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #14 on: January 12, 2005, 10:10:44 AM »
Quote
Waccoon wrote:
It would be nice if computers could mix many types of memory and prioritize them appropriately.  Who said you had to break things down into RAM and a hard drive?  Why is on-die memory only used for state caching?  Why can't we have high-speed RAM for calculations, cheaper years-old RAM for scratch work, flash RAM for result storage, and then page things out to the hard drive (with no filesystem) at leisure?  Virtual Memory and filesystem models used in today's OSes are far too simple.

Now that is a good idea. I wonder when it will become mainstream to include a line like RAM: 256/1024/128/lots MB in advertisements ;-).
Some people say that cats are sneaky, evil and cruel. True, and they have many other fine qualities as well.