Amiga.org

Amiga News and Community Announcements => Amiga News and Community Announcements => Amiga Hardware News => Topic started by: seer on September 08, 2004, 07:07:16 PM

Title: Intel Announces 65nm Breakthrough
Post by: seer on September 08, 2004, 07:07:16 PM
Intel claims to have achieved a significant milestone in developing next-generation chip manufacturing technology.

In what is proving to be a self-fulfilling prophecy, Intel has managed to shrink transistors enough that chips can hold more of them. It was Intel's founder, Gordon Moore, who claimed in 1965 that the number of transistors on a chip would roughly double every two years. This prediction has come to be known as Moore's law, and this latest achievement, Intel claims, confirms its founder's prediction.
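The doubling claim is easy to sanity-check with a little arithmetic. A minimal sketch (the 4004 and Prescott transistor counts below are rough publicly quoted figures, assumptions on my part, not from the article):

```python
# Moore's law as a rule of thumb: transistor count doubles roughly
# every two years.

def projected_transistors(base_count, base_year, target_year, doubling_years=2):
    """Project a transistor count forward under Moore's law."""
    doublings = (target_year - base_year) / doubling_years
    return base_count * 2 ** doublings

# Intel's 4004 (1971) had about 2,300 transistors.  Projecting to 2004:
estimate = projected_transistors(2300, 1971, 2004)
print(f"Projected 2004 count: {estimate:,.0f}")
# The Pentium 4 Prescott shipped with roughly 125 million transistors --
# the same order of magnitude as the projection, which is about all a
# rule of thumb can promise.
```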

More here (http://www.megagames.com/news/html/hardware/intelannounces65nmbreakthrough.shtml)

I suppose IBM is looking towards doing the same.
Title: Re: Intel Announces 65nm Breakthrough
Post by: mikeymike on September 08, 2004, 11:01:18 PM
Considering they haven't properly 'broken through' with 90nm technology*, I'm skeptical.

* - leakage problems, much more wasted energy than with previous processors.
Title: Re: Intel Announces 65nm Breakthrough
Post by: Glaucus on September 08, 2004, 11:30:25 PM
Ummm....  what are they at now???

  - Mike
Title: Re: Intel Announces 65nm Breakthrough
Post by: chsedge on September 08, 2004, 11:54:45 PM
Leakage is a problem with MOSFET transistors. You can't keep reducing the feature size, because there's a physical limit to the materials involved. Beyond that, the MOSFETs aren't under control anymore...
Title: Re: Intel Announces 65nm Breakthrough
Post by: KennyR on September 09, 2004, 12:03:00 AM
They've just started to investigate diamond and silicon carbide transistors. Maybe Intel and IBM should invest in that direction. A CPU core built from these would suffer much less leakage.
Title: Re: Intel Announces 65nm Breakthrough
Post by: minator on September 09, 2004, 12:16:10 AM
They'll all be working on this, and probably have been for many years. If you read the right stuff you'll see the odd comment about plans for 22nm, which is 7+ years away.

I can remember reading about IBM experimenting with 65nm while Commodore were still around, but only at the lab level.

Freescale (and partners) announced a while back that they'll be going into pilot production in under a year.


Quote
Considering they haven't properly 'broken through' with 90nm technology*, I'm skeptical.


Intel will be using SOI for 65nm so their leakage problems should reduce considerably.
Title: Re: Intel Announces 65nm Breakthrough
Post by: Waccoon on September 09, 2004, 08:15:28 AM
Funny, after decades of research some things are still theories, like the Theory of Evolution, the Big Bang Theory, and the Theory of Relativity...

...but in the fast paced world of computers, we have Moore's Law.

Sorry, that just always ticks me off.
Title: Re: Intel Announces 65nm Breakthrough
Post by: bloodline on September 09, 2004, 09:16:12 AM
Well, "Moore's Rule of Thumb" would be more accurate... but less of a soundbite, in an industry built on soundbites.
Title: Re: Intel Announces 65nm Breakthrough
Post by: mikeymike on September 09, 2004, 09:26:33 AM
Quote
Ummm.... what are they at now???

They're at 90nm at the moment, but the Prescott core is a total bar heater (~130W wasted energy when the CPU is being run hard?).  IBM have also been trying to get a 90nm Power5 core out but have been suffering from similar leakage problems.  I haven't heard much about AMD's attempts.

I think the announcement is stockholder candy.
Title: Re: Intel Announces 65nm Breakthrough
Post by: itix on September 09, 2004, 11:35:34 AM
Interesting Amiga news I must say ;-)

But seriously, it is unfortunate that Moore's law doesn't hold in the PPC world.
Title: Re: Intel Announces 65nm Breakthrough
Post by: Hammer on September 09, 2004, 11:58:06 AM
AMD uses Low-K Black Diamond in its 90nm process. All of the current 90nm AMD64s (Q3 2004) are being shipped to the mobile market (one of the premium processor markets).
Title: Re: Intel Announces 65nm Breakthrough
Post by: Waccoon on September 09, 2004, 07:27:12 PM
Quote
But seriously it is unfortunate Moore's law is not true in the PPC world.

Or in the GPU world.  For quite a while GPUs were tripling in performance every 18 months.  Of course, competition is much tougher in that industry.
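The difference between those two rates is bigger than it looks once you annualize them. A quick sketch (the "tripling every 18 months" figure is the claim above, not a measured number):

```python
# Annualized growth rates: transistor count doubling every 24 months
# (Moore's rule of thumb) versus GPU performance tripling every
# 18 months (the figure claimed in the post above).

def annual_growth(factor, months):
    """Growth multiple per 12 months, given `factor` growth every `months` months."""
    return factor ** (12 / months)

cpu = annual_growth(2, 24)   # 2^(1/2)  ~ 1.41x per year
gpu = annual_growth(3, 18)   # 3^(2/3)  ~ 2.08x per year
print(f"CPU transistors: {cpu:.2f}x/year, GPU performance: {gpu:.2f}x/year")
```

Note that the two curves measure different things (transistor count versus performance), so this is only a rough comparison.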
Title: Re: Intel Announces 65nm Breakthrough
Post by: billt on September 09, 2004, 08:06:29 PM
>Funny, after decades of research some things are
>still theories, like the Theory of Evolution, the
>Big Bang Theory, and the Theory of Relativity...

They've come a long way since they were mere hypotheses... ;)

They've been working on this for a few years at least. The same goes for even smaller technologies, already being worked on; there was a recent story on Slashdot about 35nm. Lab experiments have also created transistors using only 5 atoms, which I read about some months ago. That's when Moore's Law will truly hit a wall: when there are no more atoms to leave out and have the thing still work. We'll need a profound paradigm shift of some sort at that point to improve performance again.

I'm looking forward to what a diamond substrate can make possible. I saw a documentary a while ago about artificial pure diamonds that can be grown to pretty much any size, which is needed for chip wafer production. You can't sensibly make chips using those little natural jewelry-sized stones... Of course, there's always waiting for that diamond production to pick up; the producers are nearly an underground thing due to fears of retaliation from De Beers and friends, who are of course extremely unhappy about the idea of truly pure and easy-to-make "artificial" diamonds. A diamond substrate will allow far better heat tolerance and dissipation, both of which are needed for these extremely small transistors to make really complex chips that will get very, very hot.

It's pretty cool to hear about, though. I'm still designing at 180nm for the chips I work on. The company is moving to 130nm, but it'll be a while before we're pumping the things out of the fab.
Title: Re: Intel Announces 65nm Breakthrough
Post by: The_Power_of_the_Ginger on September 09, 2004, 10:27:06 PM
Reminds me of the story I heard back in 2000 that by 2090 computer chips will have to generate small nuclear explosions to get the processing power required by Moore's Law (by then, I'm sure, history will have changed so that it was Sir Roger Moore who made the original statement  :-D ).

Question is, why bother? Soon you will have all the power you need. A car doesn't need unlimited horsepower, and you only need a few hundred nuclear bombs to destroy a planet, so why make more?

You'd have to have some INCREDIBLY inefficient programs to require that much processing power by then.

Actually, Windows 2090 (Oh God, I hope not) would be just the application. It would still run slowly and crash...
Title: Re: Intel Announces 65nm Breakthrough
Post by: Valan on September 10, 2004, 06:16:11 AM
Who will buy the next round of chips?

Even the 3d video industry will be satisfied soon with the progress of software efficiency bringing down rendering times.

Hardware is at the stage where most of the market is already satisfied.
Title: Re: Intel Announces 65nm Breakthrough
Post by: mikeymike on September 10, 2004, 10:51:07 AM
There are potential uses for computers that have been put on hold because the processing power hasn't been available.  Some functions can be accomplished in a completely different fashion just because (for example) the host machine can be relied upon to have 2GB of RAM.

It's not necessarily a case of just getting more eye candy in new software.  It is annoying when that happens, admittedly.

There are other downsides to increased system resources.  Users tend to waste them as well.  The size of the average systray icon collection is increasing with new customers' machines that I see.  The 1GB RAM mark seems to make my life more difficult :-)

I'm sure similar things were said about passing the 100, 200, or 500MHz barriers... "why do we need that much processing capacity?"... "who's going to use this then?".  Sadly, the reason tends to be Windows, rather than increased application capability.  Games seem to be the only real signs of the hardware frontiers being pushed.  I think we're just as unproductive as ever.
Title: Re: Intel Announces 65nm Breakthrough
Post by: toRus on September 10, 2004, 11:11:12 PM
Move over Newton and Einstein. It's Moore's "Law" that drives this world. :lol:
Title: Re: Intel Announces 65nm Breakthrough
Post by: Leo42 on September 13, 2004, 02:34:04 AM
Quote

Or in the GPU world. For quite a while GPUs were tripling in performance every 18 months. Of course, competition is much tougher in that industry.


Moore's law has nothing to do with speed, but rather with the number of transistors fitted into a processor. Even though this has a direct effect on speed, these are two different things.

Leo.
Title: Re: Intel Announces 65nm Breakthrough
Post by: mikeymike on September 13, 2004, 06:44:49 AM
And of course, the transistor count increasing is a bad thing if you're trying to keep down heat wastage and power consumption...
Title: Re: Intel Announces 65nm Breakthrough
Post by: whabang on September 13, 2004, 01:10:03 PM
Quote

I'm sure similar things were said about passing the 100, 200, or 500MHz barriers... "why do we need that much processing capacity?"... "who's going to use this then?". Sadly, the reason tends to be Windows, rather than increased application capability. Games seem to be the only real signs of the hardware frontiers being pushed. I think we're just as unproductive as ever.

Exactly!

100 MHz: The 060 was still on par with early Pentium CPUs. 3D-rendering at home got a serious boost.

200 MHz: Amiga users started to complain that no one needs that much CPU power (until the PPC boards were released, that is).

500 MHz: Amiga users had given up whining about CPUs (and bus architectures). People who couldn't afford modern equipment complained about unnecessary eye-candy.

1 GHz: Gamers cheered. Soon we will be able to pirate even more advanced games that we can't play because the HW demands will be too high.
Intel lost the GHz barrier battle, but rushed P4 development by aiming at MHz instead of performance.

2 GHz: Gamers cheer again. Owners of 1 GHz machines complain about XP's eye-candy.
Mac users say that you only need a 1 GHz CPU, but go out the next day to buy a dual-CPU system instead.

3 GHz: Gamers had barely stopped cheering about the 2 GHz barrier before Intel was at it again. Unable to push clock speed further, Intel proved the Mac crowd right by putting dual cores into its CPUs.

Now, ask yourself how much the functionality of productivity applications has increased during this period.
 :-D

Personally, the only major benefits I got from a newer system were a quieter machine and a higher maximum amount of system memory.
I went from AGP to PCI on the gfx front, as I doubled the amount of video RAM. The new card was considerably faster, BTW.