
Author Topic: Google Acquires Rights to 20 Year Usenet Archive


MiAmigo (topic starter)
Google Acquires Rights to 20 Year Usenet Archive
« on: January 11, 2005, 01:13:40 AM »
Check this out.

Using it, I found this old post I wrote some years ago (April, 2000) in an Amiga forum.

(DISCLAIMER: Since this post is a 'bit' long, and dated, it does not necessarily still represent my opinions, or those of the other person mentioned in it.)

The Next Phase:

Amiga Rebirth, Resurrection, and Development Into the Next Millennium

        The Amiga's hardware and architecture are what made it different, and hence vastly superior to the PCs of its day. Back when PCs could only 'beep' and 'boop', with hardware best described as 'clunky', the Amiga (and even its ideological and technological predecessor, the Commodore 64) could already do the most amazing things in graphics and sound. One product of this technological leap forward was the Video Toaster, which was at one time (with the vast powers of the Amiga at its core) the industry standard for producing broadcast video content in many small and large in-house studio graphics departments.
        Because of its corporate woes, managerial ineptitude, and just plain 'ball-fumbling', Commodore, shortly after snapping up the 'Lorraine' (and with precious few years of development under its belt), was unable to bring this machine to the full fruition of its latent powers. And sadly, during its struggles in the limbo of that unrealized potential, the lowly PC (once easily dismissed with some contempt by Amiga owners as the lobotomized love-child of low-tech 'first cousins') evolved into what it is today: something somewhere between a technological wonder and a logistical nightmare.
        So where would (or should) the Amiga be now if it, too, had had the chance to grow and evolve with the technological advances in board and chip design? Well, a couple of friends got together for a long weekend of driving (from Chicago, Illinois to Centsible Software in Berrien Springs, Michigan) and palaver, and came up with the following scenario:

        The Processor Core: The Amiga would have stayed with its parallel processor core design. We always knew that one of the things that made this such a versatile and powerful machine was the fact that, even though main processor speed was relatively slow, much of the burdensome work of processing graphics and sound was done not by one, but by two other chips, which, combined, made a very powerful factory for multimedia apps. So, the Amiga today would sport not one but three equally powerful, equally advanced, state-of-the-art Motorola chips, each designed specifically for its job in sound or graphics. Also, each of these powerful processors should have direct control of, and access to, the system resources it requires, including proprietary and system-wide RAM and transport buses. In other words, no cards!
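
        Here is a rough sketch, in C, of how a program might hand work to such a processor suite. The 'coprocessors' are simulated, and every name and number in it is invented purely for illustration:

Code:
/* Hypothetical sketch: a main CPU posting work to dedicated sound and
   graphics coprocessors, each with its own private RAM -- no cards.
   The coprocessor "hardware" is simulated; all names are invented. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    const char *name;
    unsigned char *private_ram;  /* proprietary RAM, not shared with the CPU */
    size_t ram_size;
} Coprocessor;

static Coprocessor make_coprocessor(const char *name, size_t ram_size) {
    Coprocessor c = { name, malloc(ram_size), ram_size };
    return c;
}

/* The main CPU only posts a command; the dedicated chip does the heavy
   work out of its own memory, leaving the CPU free for program logic. */
static void post_command(Coprocessor *c, const char *command) {
    printf("[%s] accepted '%s' (working from %zu bytes of private RAM)\n",
           c->name, command, c->ram_size);
}

int main(void) {
    Coprocessor gfx = make_coprocessor("graphics", 8 * 1024 * 1024);
    Coprocessor snd = make_coprocessor("sound",    2 * 1024 * 1024);

    post_command(&gfx, "render frame 42");    /* both chips grind away  */
    post_command(&snd, "mix audio block 42"); /* in parallel, in theory */

    free(gfx.private_ram);
    free(snd.private_ram);
    return 0;
}

        The point is the division of labor: the main processor posts a command and moves on, while each dedicated chip works out of its own RAM.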

        The OS: Currently, each of us is a PC user, having one by one (and out of sad necessity) given up on the platform that at first seemed a gift from the god of technology, and eventually ended up its orphaned stepchild. But one thing we all remember is this: the Amiga was always 'ready to go'. Boot-up time was inconsequential, as most of the operating system seemed to be hard-wired into the machine! (This is amazing in light of today's boot-up times, which can take five minutes or more, depending on the configuration of the hardware and software.) Today's Amiga would have most of its OS on a PCMCIA card (or an array of such cards). It would still be completely upgradeable, by 'flashing' either patches or entire rewrites, and many, many times more accessible at system start. We think that a Linux OS would be very well suited to this type of adaptation. So much for the GUI aspect of the OS. For programmers and developers, we also envisioned a return to the command-line interface, with a fully realized DOS-like (or UNIX-like) command structure.

        System Configuration: Also, the OS would be completely configurable at start-up. What does the user specialize in? Sound apps? Graphics? Multimedia? Games? Weather-pattern prediction, or other 'supercomputer' sims? The user would be able to choose, from the boot-up menu, the optimum configuration for the job at hand, with the operating system then customized for that work, reserving all system resources for what's being done and used at the time, and nothing else. This would not only bring more system resources to bear on the job; it should ideally result in more stability, and less downtime from crashes and hang-ups. One way we discussed this might be achieved was through a 'partitioned' operating system, one that could easily and readily step between different modes, each for a different job, like re-booting with a new configuration, but 'on the fly', without any loss of data, and no downtime. (Sort of like the way programs pass parameters to subroutines in a chained program, or an overlay.)
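
        A loose C sketch of the 'partitioned OS' idea; the modes and resource percentages are all made up for illustration:

Code:
/* Hypothetical sketch of the 'partitioned OS': one resource profile per
   job type, switched on the fly instead of by rebooting. The modes and
   percentages are invented for illustration. */
#include <stdio.h>

typedef struct {
    const char *mode;
    int ram_share_pct;  /* share of RAM reserved for the active job */
    int gfx_share_pct;  /* share of the graphics chip reserved      */
} Profile;

static const Profile profiles[] = {
    { "sound apps", 60, 10 },
    { "graphics",   50, 80 },
    { "games",      40, 70 },
    { "simulation", 90,  5 },
};

/* Stepping between modes is just re-pointing at another profile --
   like re-booting with a new configuration, but with no downtime. */
static void switch_mode(const Profile *p) {
    printf("now in '%s' mode: %d%% RAM, %d%% graphics chip reserved\n",
           p->mode, p->ram_share_pct, p->gfx_share_pct);
}

int main(void) {
    switch_mode(&profiles[1]);  /* spend the morning on graphics work  */
    switch_mode(&profiles[2]);  /* then step into game mode on the fly */
    return 0;
}

        Switching profiles is just re-pointing at another table entry, which is what makes the 'no downtime' claim plausible, at least on paper.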

        Open Source: One of the great ideas of today would certainly not only work on this new Amiga; it actually may have had its start in those early days of computing, when, more likely than not, a user was also a programmer. After all, back then, when there were no 'Windows' and no vast software support, many users became (out of interest, necessity, or both) programmers. Those who didn't go that far were at least able to 'type in' programs from magazines. This, in turn, not only led to an intuitively better understanding of the way computers and their components worked, but also gave some insight into how to 'tweak' them. Therefore, in the spirit of those early days, 'open source code accessibility' is something that should definitely be a part of this newly evolved machine. The Amiga, at its best, was not only a user's platform, but a developer's and programmer's as well. (By contrast, many Windows users today are just that, 'users', who have no knowledge of how their machines work. How many people drive cars, and yet have no knowledge of how a car works, or how to maintain one? This aptly describes the typical Windows user.)

        Storage Capacity: Back to hardware. As multimedia machines of the first order (and the first generation), the new Amigas would have to have vast storage capacity, on the order of two to three times that of today's PCs. This could be achieved several ways, one of the most logical and accessible being the RAID array, with some modifications. Hard drives should be fully hot-swappable, and arrayed in multiples. The Amiga hardware could include a bare-bones (or empty) drive-array rack, which the user could then build upon to meet storage needs. Today's prices in storage solutions easily support such a system. Eventual capacities of 50 to 100 GB are conceivable at comparatively moderate cost.

        RAM: As far as RAM is concerned, more is always better. But how much does one invest in RAM before it becomes unwise in terms of cost versus projected time of use? One would hesitate to put a GIG of RAM into a machine once cost, and the time until the next machine upgrade, are taken into consideration. One of the problems plaguing PC users today is that, at some point, RAM evolves beyond the point where it can be used by motherboards of a certain age (make, and manufacture). Most would balk at spending top dollar for lots and lots of RAM and then not being able to use it in the next machine. Therefore, RAM for the new Amiga should be of a new type, upgradeable by 'motherboard flashing', or by a 'governor chip' acting as a RAM controller, ideally extending the life of these chips considerably by making them as variable and upgradeable by software, and firmware, as other elements of the computer. For that matter, since the new Amiga's sound and graphics would both be handled by proprietary processors, they, too, should be, to some extent, upgradeable in this manner.

        Secondary Storage: There are, and will be, many, many options for this type of storage. Floppies of varying speeds and capacities, ZIP disks, backup storage devices (tape): all clamor for support and space in today's machines. The new Amiga would be no different. However, it could be different in how it approaches the matter. Instead of being designed around one type (or a couple) of these secondary storage devices, the Amiga would sport a new, proprietary specification: a controller device (a secondary backup 'manager') which could be configured by software, plus a multiple-array docking station, fully user-upgradeable outside the case (or at least easily accessible), to handle not only each of these devices, but also any which may come along. This, added to the 'hot-swappable' (see System Configuration, above) OS, could give the Amiga much more flexibility than today's typical PC machines. (As far as CDs and DVDs go, it goes without saying that the Amiga would definitely support these devices, and other optical media, with the same type of infrastructure, if so desired by the manufacturer or user.)
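
        Conceptually, the software-configurable 'manager' could be little more than a driver table. A hypothetical C sketch, with invented device names:

Code:
/* Hypothetical sketch of the software-configurable storage 'manager':
   one docking station, many bays, and a driver table that can learn new
   device types later. All names are invented for illustration. */
#include <stdio.h>
#include <string.h>

typedef struct {
    const char *type;                 /* "floppy", "zip", "tape", ...   */
    void (*backup)(const char *what); /* how this device stores data    */
} StorageDriver;

static void floppy_backup(const char *what) { printf("floppy: wrote %s\n", what); }
static void zip_backup(const char *what)    { printf("ZIP: wrote %s\n", what); }
static void tape_backup(const char *what)   { printf("tape: streamed %s\n", what); }

/* The manager is just a table; supporting a brand-new device means
   adding one entry, not redesigning the machine. */
static StorageDriver bays[8] = {
    { "floppy", floppy_backup },
    { "zip",    zip_backup },
    { "tape",   tape_backup },
};

static void backup_to(const char *type, const char *what) {
    for (int i = 0; i < 8; i++)
        if (bays[i].type && strcmp(bays[i].type, type) == 0) {
            bays[i].backup(what);
            return;
        }
    printf("no '%s' device docked\n", type);
}

int main(void) {
    backup_to("zip",  "project files");
    backup_to("tape", "nightly archive");
    backup_to("dvd",  "movie");  /* not docked yet -- but could be added */
    return 0;
}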

        The Motherboard: A few words. The motherboard for the newly evolved Amiga would actually be somewhat less capable than motherboards currently in use in today's PCs. Why? By off-loading much of the stifling specification work from the mainboard to the processor chips (which would be removable and hence fully upgradeable), its useful life would be extended beyond that of today's systems. Therefore, increases in bus speed, AGP specs, and over-clocking tolerances could conceivably be lessened, if not dismissed entirely, since it would be the chips that need upgrading and replacing, and not the entire motherboard.

        Multimedia Apps & Closing: Bring back the Video Toaster, and make it an integral part of the Amiga's standard hardware. As with other add-ons, it could come in different configurations, or none at all, according to what the user would need and be willing to pay for. But it should definitely become a basic part of the machine, like the sound chip and the video chip.
        Well, that's about it. That was a long day of driving, and many, many other ideas were discussed, brainstormed, and tossed about. As with all such sessions, we did not limit ourselves with such pesky concerns as what is 'possible' or 'feasible', or what it would 'cost'. Such thoughts only kill good ideas, and oftentimes create a negative atmosphere in which such ruminations may never see the light of day. Sometimes, 'not possible' really only means no one has thought of a way of making a thing work. Having said all that, we hereby present the core of our talk. If anyone's taken the time to read this, we are grateful, for it is more than we could ask or hope for.
Curtis M. Harrell, Jr. & Sterling Hankins.

 8-)
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #1 on: January 11, 2005, 09:40:24 AM »
Quote: That's a weird computer you're describing, even for the year 2000. I can see the need for a dedicated graphics processor, but a dedicated sound processor... Hrm. No cards is of course a Bad Thing, unless you want your users to be locked in and not be able to upgrade properly or replace broken parts. (I know, card replacement doesn't happen very often...
__________

Well, it has been a few years since I wrote that, and some of my ideas 'may' have changed; or maybe not. And five years in the development of technology is a really long time, as we all know. Still, I do remember the frame of mind I was in after our visit to Centsible. And in many ways, I still feel the same way. The original Amiga was a truly revolutionary machine, which used almost none of the then-current solutions for designing and building computers. In that same sense, I was willing, in my 'pretend' design stage, to knowingly leave current design convention behind as well, and attempt to find newer, more innovative ways of creating a powerful machine.

I have a lot of problems with the way typical PCs do almost everything they do today, from graphics, to sound, to motherboard traffic. Such machines are (overly) large and powerful because they aren't very elegantly designed, or realized, so they almost have to be. I don't know if it's because clunky OSes demand hardware on steroids, or if it's the other way around; it's probably a little of both. But just because nVidia decided that 'this' is the way to do graphics doesn't mean it's the right way, or the only way. (You can lift a boulder with a crane that costs tens of thousands of dollars, or you can lift it with a lever that costs next to nothing.) Speaking again of current graphics accelerators, look at the result: a card with more memory than whole computers had five years ago, gobbling up power, generating heat (and burning money!), and manipulating huge amounts of texture data at the cost of overall system performance. Not very efficient, or elegant, and extremely demanding on the wallet and the resources of the machine. But this technology evolved out of the PC's way of doing things, back when the Amiga already had a better way. What would we have now, if that 'better way' had been allowed to evolve? If the 'video card' were a true, full-blown video processor, for instance, it could use complex calculations to create and modify graphics ('micro-fractals'?), instead of storing or sampling any raw data. Still, 'it is beyond the scope of this post' (always wanted to say that!) to fully go into all the intricacies of how such graphics accelerators work, but, needless to say, the way they work is probably not the be-all and end-all of solutions, nor the most efficient.
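
To illustrate the 'micro-fractals' notion (and this is only a toy, not a claim about how a real video processor would do it), a few iterations of a fractal formula can conjure an entire 'texture' out of pure calculation, with no stored pixel data at all:

Code:
/* Toy sketch of 'graphics from pure calculation': a texture computed on
   demand from a fractal formula, with no stored pixel data. A few
   iterations of the Mandelbrot recurrence stand in for whatever a real
   video processor might calculate. */
#include <stdio.h>

/* Iterate z = z^2 + c for texel (x, y); the iteration count (0..10)
   doubles as a shade. */
static int texel_shade(int x, int y, int width, int height) {
    double cr = (x - width  / 2.0) * 3.0 / width;
    double ci = (y - height / 2.0) * 3.0 / height;
    double zr = 0.0, zi = 0.0;
    int i;
    for (i = 0; i < 10; i++) {
        double t = zr * zr - zi * zi + cr;
        zi = 2.0 * zr * zi + ci;
        zr = t;
        if (zr * zr + zi * zi > 4.0) break;  /* escaped: outside the set */
    }
    return i;
}

int main(void) {
    /* No texture memory at all: each texel exists only as a calculation. */
    for (int y = 0; y < 20; y++) {
        for (int x = 0; x < 60; x++)
            putchar(" .:-=+*#%@"[texel_shade(x, y, 60, 20) % 10]);
        putchar('\n');
    }
    return 0;
}

The Mandelbrot recurrence here is just a stand-in; the point is that the image exists only as math until the moment it's drawn.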

The same could be said for the motherboard; I'm sure we haven't seen the end of its evolution, or the best solutions to many of the challenges faced by mobo designers. In this instance, in the interest of finding a better, more revolutionary way, I would again fully advocate throwing out both the bath water AND the baby, and starting over from scratch. After all, isn't that what they did back when the Amiga was originally conceived?

P.S. I still think the 'no cards' idea is a good one. The idea of a processor suite does not necessarily mean that they (the CPUs) have to be hard-wired to the board. They could just as easily be removed and upgraded at any time. And the evolution of technological devices, which always tends toward smaller, not larger (pocket PCs, phones, and notebooks?), would almost ensure that, at some point, we'd have to drop the big, clunky cards for something a lot more modular.
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #2 on: January 11, 2005, 10:17:20 AM »
bloodline wrote:
Modern PCs are the technological evolution of the Amiga idea.

My Gfx card and sound card are computers in their own right!


I would tend to disagree. Even though I, too, use these 'newfangled' PC cards, I would compare them this way to what I feel the Amiga could have evolved into: basically, the difference between Bruce Lee (the Amiga) and Hulk Hogan (the PC).
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #3 on: January 11, 2005, 06:39:18 PM »
Cymric wrote:
This is sidestepping a little, but I think you're placing too much faith in graphics alternatives. There is a reason why we are manipulating textures, bump maps and working our asses off to get affordable and quick lighting in a chip, and that is simply because the alternatives are even more demanding and expensive. Do you have any idea what it takes to render a decent-looking 3D scene? In other words, have you ever written out the math (traditional, fractal or otherwise)? Because I have a hunch you haven't.

Second, blaming the PC (graphics) architecture for turning out the way it did 'even when the Amiga already had a better way' is really nonsense. The Amiga showed the power of a coprocessor doing video or sound work, and unless I miss my guess, that's exactly what nVidia, ATi, Matrox, 3dfx, Creative, Terratec, and all the others were/are/will be doing. In fact, they make exchangeable coprocessors. On cards, true, but exchangeable nonetheless. So please, enlighten me, what is the 'better way of the Amiga'?


Well, actually...

No, I've never done any of the 'actual' math for scene rendering of any type, and I doubt many of us have. But I have written my own games, and I do have some idea of what it takes to make things move as fluidly as possible. (From a programmer's point of view, and simply put, I would say lower-level languages over higher ones, the lower the better, with pure machine code being the most obvious, though also the most cumbersome, choice.) And yes, these modern cards do use onboard co-processors to do a lot of the processing work, but, and I repeat this again, at great cost system-wise, money-wise, and in terms of the need for more and more heat dissipation. Look at pocket PCs: for their size, use, and cost, they have sound and graphics that are far superior to desktops of a few years ago. And, depending on the type and cost of the model, they can handle phone calls, play games, perform a host of other computing tasks, and do a pretty sophisticated job of it, all with no cards and not much heat generation. And, of course, their cost is going down all the time. The same, to a greater degree, can be said for today's notebooks. So, it is possible to be more conservative with system requirements, and still have a robust system, if there's both a need and a desire for it.

Necessity IS the mother of invention, and can lead to greater innovation, as it did in the Amiga's time. True, system resources for graphics and sound were slim to none, memory was expensive, as was storage, etc., etc. But the end result was a capable, powerful machine that managed to 'do without'. Had the machine been allowed to evolve and survive, operating along those same design principles, the face of computer technological development would be very, very different today. But it's always easier to just throw a lot of money, superfluous hardware, and bloated code at a problem, rather than solve it elegantly. There's that word again. 'Elegant' is a term old programmers used to describe a job, hardware or programming, that was extremely tight, small, and well done, with as few resources as possible. I defy anyone using PCs today (mostly Wintel-based) to find anything 'non-bloated', or elegant, about how those machines operate. I would say this goes for every aspect of their operation, from the software solutions to the hardware ones as well. In the constant rush, down through the years, to get (buggy) code and system-hungry hardware to market, we have evolved computer systems that are as bloated, inefficient, and money-hungry as, perhaps, the society at large that created them.

As for what the better way of the Amiga is? I would say, rather, 'what that better way could have been, or should be'. Sadly, we may never know. It's probably a bad idea to try to resurrect the machine; many have tried, and few, if any, have succeeded. It could be that some of the things I'd like to see will eventually come about from the constant evolution of the smaller devices; it's been a historical fact that innovation in these areas often 'trickles up' to the larger devices, after time has proven its worth. As for enlightening 'us' on that better way, sir, I could do that for hours and hours and hours, but who would listen? Not many are open to ideas as words and words alone; most people, myself included, would like to eventually see something in hardware, before long, to prove or disprove ideas. But the best ideas often start as words, a coin this medium is most suited for, so like everyone else, and for now, it's all I have.
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #4 on: January 12, 2005, 01:07:18 AM »
Rogue wrote:
Quote: Quick reboots lead to sloppy programming and sloppy use.

Sorry, but I think that's nonsense. Reboot time is completely irrelevant if your data just got blown away by a crash. You can't say, oh heck, two hours' work gone, but hey, I can reboot in five seconds.


What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer, which is always 'on' and ready to go, and which only needs to be reset in the event of an upgrade, a service call, or a catastrophe so profound that the machine must be reset to work at all. I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #5 on: January 12, 2005, 11:19:55 AM »
A well-thought-out post, and well written. And much too long to quote here!  :-o I'd just like to say, briefly, that what you see as limitations or roadblocks are actually the shortcomings of current technology. And those usually don't last long once they stand in the way of progress. How many innovations do we have today that were deemed impossible just a few years ago? Too many to list here. I maintain that each of these limitations will fall in its time, so that the computers of ten years from now will be full of 'stuff' that we, here, today, would call 'impossible!' if we could travel to the future to see them. And, of course, our machines will seem as big, slow, old, and 'clunkety' as the Osborne portables of old.

 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #6 on: January 12, 2005, 06:19:29 PM »
Quote: NVIDIA's GeForce 256 GPU murders any current Motorola CPU when it comes to 3D acceleration...

Could that possibly have anything to do with the fact that nVidia chips (and ATI's) have been constantly evolving, driven by the gaming market (by far the most powerful force behind the growth of any computer technology), while Motorola chips have not? It's been shown throughout our industry that what one company (or person) can do, so can another. I think that, given the market (and the reason) to push the envelope of their hardware development, Motorola (or any current chip manufacturer) could boast similar technologies. In that sense, it's like comparing the current state of the machines (the Amiga, stuck in the past) and the PCs that host those chips and technologies directly to each other, right now. It is, in effect, not a valid comparison. And who knows? Commodore, which owned MOS Technology, may have developed its own custom chips.
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #7 on: January 12, 2005, 07:07:46 PM »
Cymric wrote:
@MiAmigo:

I note that your emphasis has shifted from technical matters to matters of 'elegance'. Which is fine, but does not quite address the issue at hand. You claim PocketPCs and even notebooks get quite a lot of bang for the buck, and run cooler too. True, very true. But nobody is expecting them to run Doom III or Half-Life 2 at 1600x1200x32 with 8xFSAA and over 80 Hz refresh rate either.

Not yet; but, given time, and the nature of the market, they soon will, just as notebooks have been 'forced' from the business arena into the gaming one. Why? Because people decided they wanted their notebooks to be as capable as their desktops in this area. That demanded a better, 'more elegant' way to implement the technology, and a more efficient way of solving all the problems desktops had, to be able to port these games reliably to notebooks. Smaller devices, such as pocket PCs, will follow in time. As a matter of fact, they already are; just look at what Sony's doing.
 

MiAmigo (topic starter)
Re: Google Acquires Rights to 20 Year Usenet Archive
« Reply #8 on: January 12, 2005, 09:22:25 PM »
Sometimes there is too much of a willingness to gloss over, or completely dismiss, the concept behind the term 'elegance' when used to refer to current computer technologies. Maybe the term itself is misleading; perhaps 'efficiency' is a better word. In these posts, the Amiga has been used as an example of what computer technology could have become, simply because a) this is an Amiga forum, and b) in its time, it so far eclipsed what then-current computers were doing that it makes a good candidate for 'what if' scenarios.

Throughout these discussions, I have taken a few basic tactical positions.

1). The current technological limitations (many listed here in this thread) are merely obstacles to be overcome, which is all they can rightfully be seen as; otherwise we're stating that we are basically 'finished' when it comes to the evolution of computing machines. I, too, am impressed with the specs of current graphics accelerator technology, but I also understand that being too enamored of the achievements of the present can endanger, or at least severely hamper, whatever possibilities may exist for the future. The proper attitude should always be 'that's good, but how can it be done even better?'

2). Every technological advantage we enjoy and take for granted today was once seen as an impossible, or insurmountable, challenge. 'No problem is unsolvable' is the mantra that must always be adopted when pushing this envelope (be it graphics manipulation, power sources, or memory and storage issues) to the next level, or there simply will be no 'next level'.

3). As technology moves forward, it becomes more and more important to get these jobs done in the most efficient manner possible (read: elegant). Why? How large (and hot) does a graphics accelerator have to get before it's too unwieldy? Right now, they've almost doubled in size, and require their own power supplies. Will CPUs follow suit? Even though hard drives now sport massive amounts of storage, their size hasn't really changed that much: a good thing. Hence, they still fit in current machines. But what about reliable storage solutions for smaller devices, which are the future? There is an entire host of them, CF cards, Secure Digital, USB devices, and, of course, notebook-variety HDs, but none of them has, as of yet, approached the truly massive capacities, and reliability, that today's users and applications expect and demand. And there's the heat problem with them as well. Again, better designs, and new solutions to old, 'insurmountable' problems, are absolutely necessary. It could be that somehow combining these two technologies into an as-yet-unforeseen new hybrid will be the answer.

Better design is also a requirement for the continued evolution of these machines for another reason: otherwise they will reach a point where evolution is no longer possible. This is a plain fact of physics. Example: ever fired a gun, or shot a bow and arrow at a distant target, or even lifted an extremely heavy weight? What these things all have in common with computer design is an increasing need for accuracy and precision as magnitude, unit of measure, and quantity increase. Although it's very easy to hit a target that's only a few feet away, hitting that same target becomes extremely difficult when it's several dozen yards away, since the greater distance magnifies errors of aim and execution from insignificant ones to formidable ones. The slightest misjudgment, the merest shaking or movement of the bow (or gun), is greatly multiplied as the distance is traversed and the projectile travels farther and farther from the point of origination. (Anyone who has ever done any power lifting, or bodybuilding, knows the same holds true: errors of sloppy execution are greatly magnified as the weight increases, to the point where it becomes impossible to perform an extremely heavy lift without perfect technique.) These same laws and principles hold true, of course, for any increased amplitude of performance, such as speed and overall efficiency, when applied to machines as well; and computers, for all their glory, are nothing more than very precise machines. As they grow in power, that precision must become ever more accurate and efficient.
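
To put rough numbers on the archery example (the half-degree of error is an arbitrary figure, chosen only for illustration):

Code:
/* Back-of-the-envelope for the archery analogy: the same small aiming
   error grows linearly with distance to the target. The half-degree
   figure is arbitrary, chosen only for illustration. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double PI = 3.14159265358979;
    const double error_deg = 0.5;
    const double error_rad = error_deg * PI / 180.0;
    const double ranges_m[] = { 2.0, 20.0, 50.0, 100.0 };

    for (int i = 0; i < 4; i++) {
        /* miss distance = range * tan(angular error) */
        double miss_m = ranges_m[i] * tan(error_rad);
        printf("at %5.1f m, a %.1f-degree error misses by %.3f m\n",
               ranges_m[i], error_deg, miss_m);
    }
    return 0;
}

The same half-degree that misses by less than an inch at two meters misses by nearly a meter at a hundred: the error itself never changed, only the magnitude over which it operates.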

So, at what point do current machines become so bloated that they can no longer (efficiently) support their own weight? Too noisy, too hot, too much storage required, too much power? These are all valid concerns, since poor solutions could easily move them out of the market of practicality and affordability. A lot of these problems could be solved right now, if better, more efficient ways were used to implement them. While it's true that sometimes 'elegance directly conflicts with human nature', it's equally true that everyone can generally 'have their cake and eat it too', with a little more innovation in the areas of better, more efficient technology implementation, instead of 'hack' solutions, i.e., 'just make it bigger, stick on another power socket, shove in more RAM, add more transistors'. There is a point of no return where not only will these solutions no longer be enough, they'll actually begin to 'trip over themselves', collapsing under their own weight and severely compromising the very systems they support. This is happening right now: a well-known fact to CPU designers at Intel and AMD, and to all hard-drive manufacturers, who must constantly come up with new ways to 'shoe-horn' in more storage capacity. They all realize that soon they must 'hit the wall', and come up with a newer, more innovative, better way, or die. It's just another example of how increased quantities demand more and more accuracy of execution, and more efficient solutions.