Amiga.org
Amiga computer related discussion => Amiga Hardware Issues and discussion => Topic started by: takemehomegrandma on October 31, 2012, 02:35:09 PM
-
This post is probably most relevant to AROS at the moment (and to everyone else interested in general computer technology evolution of course ;)):
Just days after the Exynos 5 Dual (http://www.theverge.com/2012/8/9/3231616/samsung-exynos-5-dual-specs) CPU, the world's first Cortex-A15 ARM CPU, made its debut in Samsung's new Chromebook showing truly impressive performance (we will most certainly see it in a future Galaxy IV phone as well), ARM officially launched the "Next Generation" 64-bit ARMv8 architecture (backwards compatible with ARMv7):
"ARM Launches Cortex-A50 Series, the World’s Most Energy-Efficient 64-bit Processors"
http://www.arm.com/about/newsroom/arm-launches-cortex-a50-series-the-worlds-most-energy-efficient-64-bit-processors.php
These are the first two ARMv8 cores available for licensing from ARM themselves. Here is a summary:
Cortex-A53: ~1x "today’s superphone" performance using 1/4 of the power
"The most efficient ARM application processor ever"
Cortex-A57: ~3x "today’s superphone" performance using 1x of the power
"Provides computer performance comparable to a legacy PC, while operating in a mobile power budget"
And like before with big.LITTLE, these cores can be combined in various ways in a single CPU chip: "The Cortex-A53 processor combined with the Cortex-A57 and big.LITTLE processing technology will enable platforms with extreme performance range while radically reducing the energy consumption"
Among the CPU manufacturers that are going to make their own products based on these cores are Broadcom, Calxeda, HiSilicon, Samsung and STMicroelectronics, and, a bit surprisingly, AMD:
"AMD has signed a license for a 64-bit processor design from ARM, ending its exclusive commitment to x86"
http://www.techworld.com.au/article/440450/amd_sell_arm-based_server_chips_2014/
Among the other big players making CPUs based on this new 64-bit ARMv8 architecture you'll find Applied Micro, Cavium and of course nVidia.
nVidia kind of shook the ground in the industry when they announced their future ARM strategy (http://pressroom.nvidia.com/easyir/customrel.do?easyirid=A0D622CE9F579F09&version=live&releasejsp=release_157&xhtml=true&prid=705184) together with Microsoft (http://www.microsoft.com/presspass/press/2011/jan11/01-05SOCsupport.mspx) almost two years ago.
This one is IMHO the most interesting, the one I'm personally most curious about. These CPUs (the "Denver" core) will be "designed to support future products ranging from personal computers and servers to workstations and supercomputers", "we are designing a high-performing ARM CPU core in combination with our massively parallel GPU cores to create a new class of processor"
According to this blog (http://blogs.nvidia.com/2011/01/project-denver-processor-to-usher-in-new-era-of-computing/) they are looking to go head-to-head with x86: "Denver frees PCs, workstations and servers from the hegemony and inefficiency of the x86 architecture. For several years, makers of high-end computing platforms have had no choice about instruction-set architecture. The only option was the x86 instruction set with variable-length instructions, a small register set, and other features that interfered with modern compiler optimizations, required a larger area for instruction decoding, and substantially reduced energy efficiency.
Denver provides a choice. System builders can now choose a high-performance processor based on a RISC instruction set with modern features such as fixed-width instructions, predication, and a large general register file. These features enable advanced compiler techniques and simplify implementation, ultimately leading to higher performance and a more energy-efficient processor."
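For anyone who hasn't bumped into "predication" before, here's a rough C sketch of my own (not from the nVidia material) of the kind of data-dependent branch a conditional-select instruction lets the compiler drop:
[code]
/* My own illustration, not nVidia's: a small data-dependent branch.
 * With a conditional-select/predication-style instruction available
 * (e.g. CSEL on 64-bit ARM), a compiler can turn this into a
 * branch-free compare + select, so the pipeline never has to guess
 * which way the branch goes.  Whether that beats a well-predicted
 * branch depends on the workload - this just shows the pattern. */
#include <stdio.h>

static int clamp_to_zero(int x)
{
    /* Either a compare + conditional branch, or a single compare +
     * conditional select, depending on what the target ISA offers. */
    if (x < 0)
        return 0;
    return x;
}

int main(void)
{
    printf("%d %d\n", clamp_to_zero(-5), clamp_to_zero(7)); /* prints "0 7" */
    return 0;
}
[/code]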
And I think this is a lot bigger deal for nVidia than most people think, they are making something completely different out of it than "just" putting out another "CPU core with GPU" to the market. In this very interesting blog/interview (http://venturebeat.com/2011/03/04/qa-nvidia-chief-explains-his-strategy-for-winning-in-mobile-computing/) (read it, really, do it!), nVidia chief Jen-Hsun Huang describes it as an upcoming paradigm shift, and a "re-invention" of the whole company:
Nvidia 1.0 was PC graphics (made possible by "fab-less production").
Nvidia 2.0 was the creation of the "GPU"
Nvidia 3.0 (about to happen) is about parallel processing (in a "newish" way, as I read it)
Nvidia is for the first time designing both the CPU core(s) and the GPU on their own, in-house, towards goals and a purpose they have defined themselves. That interview is 1.5 years old, and in it Mr. Huang mentions that they (a few hundred engineers) had already been working internally on this for 3.5 years. I have seen a post somewhere (can't remember where) suggesting that these new chips will integrate the "GPU" and "CPU" on the silicon in a previously unseen manner: the on-chip data bandwidth between those parts will be so enormous that the previously "separate parts" will in practice kind of merge. Again, nVidia themselves are labeling what they are now doing with ARM/Denver as being equally significant as fab-less chip production and the rise of the GPU concept; it will be the next step, and will constitute "Nvidia 3.0".
That's why I'm so curious to see it. It's definitely going to be more than "just another CPU-core/GPU SoC bundle, only faster". Making a faster "Tegra" won't exactly warrant a "Nvidia 3.0" label; there has got to be more to it than that.
"Every single one of this project are fully funded and the expectation is within the next three or four years [1.5 years ago] we’re going to bring to the mobile market performance that is nearly a hundred times higher than today’s’ PC. And that’s the roll out if you will of our Nvidia 3.0 strategy."
"ARM is now the only CPU in the world that will have deep penetration in the mobile devices, the PC, servers and supercomputers."
1.5 years ago he said that it takes about 5 years to design a custom CPU, and at that time they had already been working on it for 3.5 years. There is a lot of speculation about release dates; some say 2013, and that seems kind of probable.
ARM Naysayers - REPENT!
:)
-
Very interesting times, indeed! :)
-
Acorn fans are, I expect, feeling slightly vindicated now.
Now if they'll only start putting any of this tech in useful form-factors such as laptops...Chromebook's a start, I guess, if you're not bound to run Google CollectAllYourDataOS on it.
-
Acorn fans are, I expect, feeling slightly vindicated now.
Now if they'll only start putting any of this tech in useful form-factors such as laptops...Chromebook's a start, I guess, if you're not bound to run Google CollectAllYourDataOS on it.
Bad consumer! No biscuit! :razz:
And yeah, I imagine there's all sorts of things that /could/ be done with this, that'd be wicked cool, but "consumer gadgets" are a safer market than the traditional computer market.
-
Acorn fans are, I expect, feeling slightly vindicated now.
Now if they'll only start putting any of this tech in useful form-factors such as laptops...Chromebook's a start, I guess, if you're not bound to run Google CollectAllYourDataOS on it.
Just as long as that useful form factor has 4 modems and accompanying rj-11 ports :D
-
Just as long as that useful form factor has 4 modems and accompanying rj-11 ports :D
I personally refuse any consumer electronics without at least 8 phone jacks
-
According to this blog (http://blogs.nvidia.com/2011/01/project-denver-processor-to-usher-in-new-era-of-computing/) they are looking to go head-to-head with x86
They can try to go head to head with x86, but they only really compete with the Atom. So it's more likely to keep x86 out of the embedded space than it is to increase market share.
-
From a business perspective, I imagine embedded devices are the money maker currently, so that might well be on purpose.
-
According to this blog they are looking to go head-to-head with x86:
"Denver frees PCs, workstations and servers from the hegemony and inefficiency of the x86 architecture. For several years, makers of high-end computing platforms have had no choice about instruction-set architecture. The only option was the x86 instruction set with variable-length instructions, a small register set, and other features that interfered with modern compiler optimizations, required a larger area for instruction decoding, and substantially reduced energy efficiency.
Denver provides a choice. System builders can now choose a high-performance processor based on a RISC instruction set with modern features such as fixed-width instructions, predication, and a large general register file. These features enable advanced compiler techniques and simplify implementation, ultimately leading to higher performance and a more energy-efficient processor."
OH, bulls...
nVidia tried for years, unsuccessfully, to get an x86 licensing deal... Then they tried to get around the legal issues and designed Denver so that it translates x86 code into a custom internal machine code (that design feature will also be present in the ARM Denver core, a legacy of its x86 beginnings)... And when Intel said "sorry, we'll still sue your asses", they switched to ARM. Now it's all roses, it seems...
-
OH, bulls...
According to the nVidia president in that interview, these are the things that matter to them, and the reasons behind their decisions (made a couple of years ago):
- Energy efficient computing (of growing importance on all markets, not just mobile)
- Mobile computing (one of the sides of "energy efficient computing")
- A common platform, ranging from tiny handhelds all the way up to supercomputers
- Unique control over the IP of that platform (the whole "eco-system" actually)
- Not making some x86 commodity that won't ever be able to compete with Intel
ARM brings this, x86 doesn't.
"Now inside the company we say the way we distilled Nvidia 3.0 down into actions is three arms. We say go parallel, go mobile and go ARM. Now a lot of you have asked me over the years what is our CPU (central processing unit, or computer’s brain) strategy and I’ve said over the years it was ARM. And I said it so matter of factly and it almost sounded like a joke. But it was the truth. It was the same thing I was telling our company inside that our CPU strategy is ARM. I believe that ARM will do for CPUs what Taiwan Semiconductor Manufacturing Co. did for foundries (the contract chip manufacturers)."
"Instead of hundreds of millions of devices, it’s several billion. And so that would make the ARM processor the most valuable instruction set architecture (or chip processing architecture) in the world."
"our strategy with Project Denver was to extend the reach of ARM beyond the mobile, the handheld computing space. To take the ARM processor, partner with them to develop a next-generation 64 bit processor to extend it so that all of computing can have the benefits of that instruction set architecture. It is backward-compatible with today’s ARM processors."
"And so everyone now sees the picture that our CPU strategy really is ARM, that we intend to take the ARM for mobile devices all the way to supercomputers. ARM is now the only CPU in the world that will have deep penetration in the mobile devices, the PC, servers and supercomputers."
"People still thought a cloud over our heads was our big battle with Intel. People said that Nvidia’s Intel chip set (MCP) business is going away and of course we announced that our dispute with Intel has been resolved. We’ve extended our cross license with Intel and the licensing revenues that would come to our company would be approximately $1.5 billion over six years. That by and large replaces and some the business that we lost with MCP."
"There are two reasons why we decided not to do x86. Aside from, well the second reason is what I said earlier that in fact it’s the wrong instruction set architecture. The first reason is simply very large of course. The world’s not waiting for us to build yet another x86 and we’re not going to go hire a bunch of the world’s best engineers so that we can wake up in the morning to go do something that somebody else has already done 25 years ago. It’s not logical. And so it’s another way of saying it’s a commodity. Intel has got every single price point covered from $10 all the way up to $1000. There is not one nook and cranny we can cover by ourselves. AMD has covered everything else."
"At AMD, they actually make perfectly good CPUs. I’ve never met a CPU at AMD that I didn’t like. They are all fine. They just can’t win."
"even if they give me rights to make an x86 chip, I will be building a commodity that at every price point they have an alternative to. And if they have an alternative to everything that I make, and it’s easier to buy from Intel, it’s just really not possible to distinguish yourself in an x86 world. And so that’s sort of the reason why, that’s one of the negative reasons why you don’t do it. But the positive reason is we all want to go make a contribution to something and make a difference in the world. I mean you’re going to go spend $1 billion in r&d, you go spend $1 billion building something that matters."
Everything makes perfect sense in my eyes.
:)
-
From a business perspective, I imagine embedded devices are the money maker currently, so that might well be on purpose.
To meet x86 head on you need to be able to beat them in all of its strong areas, not just defend your own.
They can complain about x86 being the wrong instruction set, but to prove it they need to launch an ARM core that can beat an i7 and they can't.
As soon as ARM is faster as well as cheaper and less power hungry than x86, then everyone will switch everything to it.
-
ARM is now the only CPU in the world that will have deep penetration in the mobile devices, the PC, servers and supercomputers.
Arguably, Intel already has that with Medfield + Clover Trail and Ivy Bridge variants.
nVidia only really wanted to make CPUs to pair with their Tesla GPUs for a specific market and specific uses... Since Denver will not match Intel's Haswell/Broadwell performance, it kinda begs the question how it differs from the x86 Denver:
aside from a different architecture, they will always be playing second fiddle to Intel... And on the ARM side of things there are more big players than on x86, albeit maybe none with the advantages and monopoly Intel has on x86...
And let's not forget, Intel owns an ARM license, and should they decide one day to re-enter that market with their fab/process advantage they could easily decimate the competition...
The power usage thing is a myth; Intel proved with Medfield and Clover Trail that x86 scales to sub-1W TDP while delivering A9/A15-rivalling performance - and that with an ancient in-order Atom core.
-
These are significant leaps forward for ARM.
But...a couple of things need to be pointed out.
We're still talking about sub-2.0 GHz processors here (for the moment).
PPCs were at 2.7 GHz years ago when Apple abandoned them because they weren't getting faster quickly enough. And x86 currently cruises happily above 4 GHz.
Both PPC and X86 have had 64 bit variants for several years now (while 64 bit ARM processors have been announced, but are not in production).
And Nvidia's president is known for colorful hyperbole, but the company has only limited experience designing and manufacturing CPUs (while making snide comments about AMD - the only company whose products have ever topped Intel's).
Intel was so sure it didn't need ARM that it sold off its ARM designs.
And while IBM has produced some ARM processors at its foundries, you don't see them falling over themselves to pursue this either.
So Via, Nvidia, and a host of other small players say this is the next big thing.
This doesn't necessarily make it so, and until ARM has at least performance parity with other ISAs, it's a little premature to crow about this.
-
To meet x86 head on you need to be able to beat them in all of its strong areas, not just defend your own.
They can complain about x86 being the wrong instruction set, but to prove it they need to launch an ARM core that can beat an i7 and they can't.
As soon as ARM is faster as well as cheaper and less power hungry than x86, then everyone will switch everything to it.
Power, power, power! The same insane obsession again and again. Computers aren't cars! When will this madness end?
The raw-power approach is ugly, inefficient and... boring as hell.
High-end Intel processors (i7 or whatever, I don't care) may be impossible to beat in raw processing power. But ARM processors can't be beaten for low-profile designs where power needs are crucial. And I'm not only talking phones here.
As an example: take my damn noisy and hot Intel i5-based laptop and give me a cold, silent, efficient ARM-based laptop with 1/4 the raw power. Keep the raw power for yourself! All I need is a video core with a HW geometry transformation engine and HW full-HD H264 & MPEG2 decoding.
You see? The ARM computer is more capable, better designed and respects the environment by not needing a thermonuclear plant for itself as Intel chips usually do.
Oh, I forgot! We have FULL bloated desktops with ZERO optimization or software/hardware integration, call it Windows, Mac OSX or Linux, which need RAW processing power.
Well, here's the solution: Windows and OSX are for slaves. Since I'm not a slave but a free person with the ability to use and understand how a computer works, I use Linux (and AmigaOS, and Open RISC OS). What's more, I'm experimenting with Linux + Wayland these days, so no bloated Xorg for me anymore: KMS+DRM allows my new, experimental but light desktop to run with VERY few resources. That's how it's meant to be in the upcoming years.
Then again, why should I explain it in an Amiga forum? The Amiga design itself was FAR superior to the stupid "raw power" idea... It's just that hearing people defend this primitive idea of what a desktop computer is makes my blood pressure rise...
-
IBM is only interested in custom products for specific customers. And they would probably do such a thing with ARM if somebody came knocking at their door with a big enough bucket of money...
Freescale, on the other hand, has falling revenue and is bleeding money. It's only a matter of time before they switch to supporting just one viable architecture, and it's a no-brainer that PPC is not it.
-
As an example: take my damn noisy and hot Intel i5-based laptop and give me a cold, silent, efficient ARM-based laptop with 1/4 the raw power. Keep the raw power for yourself! All I need is a video core with a HW geometry transformation engine and HW full-HD H264 & MPEG2 decoding.
You see? The ARM computer is more capable, better designed and respects the environment by not needing a thermonuclear plant for itself as Intel chips usually do.
Underclock your i5 and it'll use very little power while still having very good performance. No need for a CPU switch :D
That's what I did with my Turion; at 900 MHz it's noiseless and produces very little heat - still sufficiently fast for everyday computing.
When I need power, I bring it back to its native 1.8 GHz clock.
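If the laptop happens to run Linux, the standard cpufreq sysfs files make this easy to script; here's a rough sketch of mine (assuming the usual /sys/devices/system/cpu/.../cpufreq paths are exposed and you have root - in practice you'd repeat it for every core):
[code]
/* Rough sketch (my own, assuming a Linux laptop with the standard
 * cpufreq sysfs interface; needs root): cap the maximum clock to
 * "underclock", then restore the native maximum when you want the
 * performance back.  Real use would loop over cpu0..cpuN. */
#include <stdio.h>

static int set_max_khz(long khz)
{
    FILE *f = fopen("/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq", "w");
    if (!f)
        return -1;                /* no cpufreq support, or not root */
    fprintf(f, "%ld\n", khz);     /* must be a frequency the driver offers */
    fclose(f);
    return 0;
}

int main(void)
{
    set_max_khz(900000);          /* ~900 MHz: cool, quiet, still usable */
    /* ... later, when you need the speed back: set_max_khz(1800000); */
    return 0;
}
[/code]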
-
To meet x86 head on you need to be able to beat them in all of its strong areas, not just defend your own.
They can complain about x86 being the wrong instruction set, but to prove it they need to launch an ARM core that can beat an i7 and they can't.
As soon as ARM is faster as well as cheaper and less power hungry than x86, then everyone will switch everything to it.
I think we will see something novel regarding parallel processing, like a combination of a "traditional" CPU (probably a few of them on chip) with more than enough horsepower for most traditional applications, connected to a massively parallel, high-speed GPU that will join in on "general computing". I think the key here would be the removal of any bottlenecks in between them, to get *massive*, super fast, direct, on-chip bandwidth between them. That would bring a whole new meaning to CUDA (http://www.nvidia.com/object/cuda_home_new.html), and this together would bring *a whole new class* of processors.
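To make the "bottleneck" point a bit more concrete, here's a minimal CUDA sketch of my own (plain vector scaling on a discrete GPU, nothing beyond the standard runtime API): today the data has to be copied across the bus into the GPU's own memory and back again, and it's exactly those copies that a tightly merged, shared-memory CPU/GPU design would make unnecessary or nearly free.
[code]
// My own illustration of today's discrete-GPU flow, using the standard
// CUDA runtime API: allocate on the device, copy data in over the bus,
// run a trivial kernel, copy the result back out.  The two cudaMemcpy
// calls are the "bottleneck" the post above speculates would disappear
// on a merged CPU/GPU with shared on-chip memory.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i)
        host[i] = 1.0f;

    float *dev = NULL;
    cudaMalloc((void **)&dev, bytes);

    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // copy in
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);           // compute
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // copy out

    printf("first element: %f\n", host[0]);                 // 2.0 if it worked

    cudaFree(dev);
    free(host);
    return 0;
}
[/code]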
Could be wrong of course! But there are interesting things ahead...
:)
-
Freescale, on the other hand, has falling revenue and is bleeding money. It's only a matter of time before they switch to supporting just one viable architecture, and it's a no-brainer that PPC is not it.
PPC has been Doomed Forever since 2005, hasn't disappeared yet.
-
IBM is only interested in custom products for specific customers. And they would probably do such a thing with ARM if somebody came knocking at their door with a big enough bucket of money...
Freescale, on the other hand, has falling revenue and is bleeding money. It's only a matter of time before they switch to supporting just one viable architecture, and it's a no-brainer that PPC is not it.
That would explain the flood of new PPC designs including two new 64-bit cores (a first for Freescale). Yep, looks like they're abandoning the market. :laugh1:
-
I think we will see something novel regarding parallel processing, like a combination of a "traditional" CPU (probably a few of them on chip) with more than enough horsepower for most traditional applications, connected to a massively parallel, high-speed GPU that will join in on "general computing". I think the key here would be the removal of any bottlenecks in between them, to get *massive*, super fast, direct, on-chip bandwidth between them. That would bring a whole new meaning to CUDA (http://www.nvidia.com/object/cuda_home_new.html), and this together would bring *a whole new class* of processors.
Could be wrong of course! But there are interesting things ahead...
:)
That's an interesting approach and it allows you to retain the performance for when you need it.
Basically, I decided that most of the time I just didn't need it so my current primary machines are two Intel Atom powered netbooks (I'm typing on a 1.8GHz one right now).
They were cheaper than a Chromebook, they have real hard drives (not cloud or flash storage), and they probably perform as well as or slightly better than a similar ARM-based system.
Plus they run x86 Windows apps (no small selling point).
And, of course, when I still want my Amiga flavored fix I can use my PPC-based MorphOS system.
-
Intel Haswell.
-
That would explain the flood of new PPC designs including two new 64-bit cores (a first for Freescale). Yep, looks like they're abandoning the market. :laugh1:
They will be forced to in the near future. They're losing market share left and right, revenues are down, most divisions are selling fewer products... It's pretty obvious that a point where they won't be able to support and develop two distinct architectures is not far away...
-
That would explain the flood of new PPC designs including two new 64-bit cores (a first for Freescale). Yep, looks like they're abandoning the market. :laugh1:
If they have to cut somewhere (which they probably will), then it won't be their ARM section. They are even replacing some PPC-based applications with ARM themselves, and this is happening in many embedded industries which have been traditional PPC strongholds, like automotive. AFAIK.
Not that PPC matters anymore to anyone interested in desktops/laptops, but anyway...
;)
-
It's only a matter of time before Microsoft abandons PPC in its Xbox. It makes no sense in the unified Windows 8 world to support a dead-end processor that's different from both the tablet/phone and the desktop...
-
They will be forced to in the near future. They're losing market share left and right, revenues are down, most divisions are selling fewer products... It's pretty obvious that a point where they won't be able to support and develop two distinct architectures is not far away...
Rumors from my buddies at the Intel campus in Beaverton aren't very rosy either. ARM is making a dent.
-
Google CollectAllYourDataOS :roflmao:
That's so true. I wonder if the idiots buying anything running that know just how much of their info is getting sent back to Google?
Good news on the ARM stuff... Competition is ALWAYS good. Hope to see some 16-core ARM processors soon. :anger::)
Steven
-
It's only a matter of time before Microsoft abandons PPC in its Xbox. It makes no sense in the unified Windows 8 world to support a dead-end processor that's different from both the tablet/phone and the desktop...
By that logic, it wouldn't have made sense to move to a PPC in the 360 when the original Xbox used an Intel processor.
Nothing about the next generation of video games is well known, except that the first to market (the Wii U) is again PPC based.
-
Rumors from my buddies at the Intel campus in Beaverton aren't very rosy either. ARM is making a dent.
2? You obviously don't know their product line very well.
There are at least four distinct ISAs currently supported (more if you consider discontinued but still listed products like the 68060).
-
It's only a matter of time before Microsoft abandons PPC in it's XBox.
As I've said before, I wouldn't be totally surprised if the new Xbox turns out to be ARM-based, powered by Denver/nVidia. Watch it happen! ;)
We'll see in time I guess...
:)
-
"We're still talking about sub-2.0 GHz processors here (for the moment).
PPCs were at 2.7 GHz years ago when Apple abandoned them because they weren't getting faster quickly enough"
http://www.theregister.co.uk/2012/05/08/tsmc_28_nanometer_cortex_a9_arm/
I guess they can scale the clock speed for markets that demand it.
-
As I've said before, I wouldn't be totally surprised if the new Xbox turns out to be ARM-based, powered by Denver/nVidia. Watch it happen! ;)
We'll see in time I guess...
:)
Nope, I'll bet you on this one as Denver isn't due to be released in time.
And the PS4 is probably some weird AMD X86 based APU/Cell hybrid.
http://www.theregister.co.uk/2012/05/08/tsmc_28_nanometer_cortex_a9_arm/
I guess they can scale the clock speed for markets that demand it.
This, on the other hand, is neat. I wonder why they did this with a dual-core (and not a quad), and why they didn't consider the A15 instead.
Something tells me 3.1 GHz is probably pushing it (considering 1.4 GHz is typical for this CPU at 40nm).
Still, it's a great step forward.
-
By that logic, it wouldn't have made sense to move to a PPC in the 360 when the original Xbox used an Intel processor.
It only made sense for Microsoft to go for a processor from IBM and a graphics chip from ATI, because Microsoft didn't have a great relationship with Intel or nVidia after the original Xbox.
Plus they got the CPU cheap because Sony helped pay for it.
None of the next-gen consoles are going to be ARM based, it's just not good enough. It's OK for tablets or a NAS, maybe even for a computer that isn't going to be pushed too hard.
-
Yes, my bet is that the Xbox will return to Intel, or play Intel and AMD off one another and go for the cheapest. Splitting the company across three completely different processors makes zero sense.
-
None of the next-gen consoles are going to be ARM based, it's just not good enough. It's OK for tablets or a NAS, maybe even for a computer that isn't going to be pushed too hard.
Thank you. I'm glad somebody here realizes this. Even the new Cloudbook is less powerful than its Intel-powered predecessor.
Yes, my bet is that the Xbox will return to Intel, or play Intel and AMD off one another and go for the cheapest. Splitting the company across three completely different processors makes zero sense.
Rumor has it that Sony will use an AMD APU in its PS4 along with a Cell processor. The GPU that Microsoft is going to use has been announced, but I could see them doing something similar and using both GPUs.
-
Rumor has it that Sony will use an AMD APU in its PS4 along with a Cell processor.
Well, not just one Cell; the rumour was that it has double the cores of the PS3. If they want backwards compatibility with the PS3 then they'll need to include the Cell though, no way that is going to be emulated.
They could probably emulate the PS2 on the PS4 as well, but whether they will is another matter.
-
They could probably emulate the PS2 on the PS4 as well, but whether they will is another matter.
Since they eliminated PS2 emulation on the most recent PS3s, that's unlikely, but they could do it via software.
-
Since they eliminated PS2 emulation on the most recent PS3s, that's unlikely, but they could do it via software.
Not exactly the most recent; they removed PS2 backward compatibility 5 years ago. It was only in the models available in the first 12 months. Software compatibility was hidden in them for ages, but if you hack the console to use it then you'll find that a lot of games run really slowly.
I guess it depends on how quick the PS4 is.
-
Thank you. I'm glad somebody here realizes this. Even the new Cloudbook is less powerful than its Intel-powered predecessor.
I guess you meant Chromebook, and AnandTech (http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6) seems to disagree with your statement.
Staf.
-
We won't see ARM in any of the next-gen consoles, no; they are still far too slow.
I hope both M$ and Sony go for AMD since AMD needs the cash; they are sinking fast and can't compete on the high end anymore, which leaves Intel in a monopoly position, which I do not like.
EDIT: Latest rumors say one of the latest dev systems from Sony holds AMD's HSA CPU, the A10 "Trinity", so if this is the CPU used, AMD will get some direly needed cash for their x86 family. I hope to God they won't use the integrated GPU though.
-
I guess you meant Chromebook, and AnandTech (http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6) seems to disagree with your statement.
Staf.
Well Staf, the first benchmarks I saw compared the new Chromebook to the older Intel Celeron-based model, and in all tests it was slower, so yeah, I disagree with their opinions.
http://gigaom.com/mobile/intel-v-arm-the-chromebook-performance-battle/
-
Well Staf, the first benchmarks I saw compared the new Chromebook to the older Intel Celeron-based model, and in all tests it was slower, so yeah, I disagree with their opinions.
http://gigaom.com/mobile/intel-v-arm-the-chromebook-performance-battle/
So I would like to know why these two have opposing results? I don't know GigaOM well, but I do know AnandTech always does a thorough investigation, and categorizing something they do as opinion is IMHO not doing them justice.
greets,
Staf.
-
So I would like to know why these two have opposing results? I don't know GigaOM well, but I do know AnandTech always does a thorough investigation, and categorizing something they do as opinion is IMHO not doing them justice.
greets,
Staf.
It's obvious that the A15 does well when compared to an Atom (and very well when compared to all other ARM processors).
It's just that other x86 processors have a much greater lead in performance.
No big deal if your concern is performance per watt.
-
I was lucky enough to grab a new Chromebook (I do like the Cloudbook ref) this past week. I've seen the benchmarks and, yes, the new ARM model is a tick slower than the Intel in tests. From user experience I've found it quite snappy. Much faster on the internet than my first-gen iPad.
I picked it up to see how much you could do for $249. Its screen could be better, the camera could have more pixels, but all of that (including an Intel) would have raised the price beyond $249.
ARM can help reduce costs and that will put a computer into the hands of more folks. All good!
One of the things I originally loved about the Amiga and the Mac was the marriage between the OS and hardware makers. ChromeOS seems headed in the same direction.
-
@beller
The only real complaint I've heard so far is the same as yours (that the screen could have a better quality display).
$249 is a good price, although I paid less for my Atom-based netbooks and I got real hard drives in them.
My main concern would be the dependency on cloud based applications and storage.
But I'll admit that the A15 is a nice step forward.
-
Once again there are rumors of a potential, upcoming architectural switch for Apple Macs:
http://www.bloomberg.com/news/2012-11-05/apple-said-to-be-exploring-switch-from-intel-chips-for-the-mac.html
This is far from the first rumor of this kind; many similar ones have circulated over the net during the last few years. One almost begins to wonder if there is a certain threshold at which one can actually start using the old "where there is smoke, there is fire" saying. ;)
The usual response to these rumors is: "Bullsh!t, ARM doesn't have enough performance", and the people saying so are usually sitting in the car of the present, driving down the road of time, looking at the various ARM CPUs in their rear-view mirror. A natural thing to do (looking in the rear-view mirror, that is), since the road goes over a hill a bit further on, a hill blocking the view of what's on the other side of the crest. So since you don't have a picture of the future, but you do of the past, your comments couldn't possibly be anything other than the one above.
But just because you and me and all the other common people can't see at all what's on the other side of the hill, it doesn't mean that there aren't people there doing stuff already, things we are about to see in a year or two when we actually drive over that hill and get our first view over the previously hidden horizon.
If I may speculate, I think Mr. Jen-Hsun Huang (http://venturebeat.com/2011/03/04/qa-nvidia-chief-explains-his-strategy-for-winning-in-mobile-computing/), founder and chief of the nVidia corporation, will be standing there with a few new CPUs based on their "Denver" concept. "within the next three or four years we’re going to bring to the mobile market performance that is nearly a hundred times higher than today’s PC." (That was 1.5 years ago) "ARM is now the only CPU in the world that will have deep penetration in the mobile devices, the PC, servers and supercomputers."
They have been working on this for years, and should almost be ready. He is the guy who brought us a true computer graphics evolution (through competition and fab-less production); he is the guy who brought us the GPU, the concept that took graphics and gaming to a whole new dimension through its enormous performance in running massively parallel code. He has delivered. He has reformed and evolved a whole industry. Twice. Now he is very confident it will happen again with a whole new class of processor. And while you and I can't really see this yet (we will come over that hill in a year or two), I have no doubt whatsoever that both Apple and Microsoft are sitting in on nVidia's briefings and presentations (or by now, maybe even demonstrations) of the thing. And in a context like this, "migration" rumors like the one mentioned above don't sound like "bullsh!t" anymore.
What did the nVidia chief have to say about a possible Mac OS migration to ARM?
Q: Is ARM on the Mac OS possible?
A: "I don’t know their plans but if you look at it from 10,000 feet, it seems to make sense, right? Because if they go Mac on ARM, they could address some of their concerns with their own SOC. So instead of paying $150, they can pay $15."
Nothing in his answer about whether it would be technically doable at all, or whether it would make technological sense; that seems to be *a given* in his reply (and he has unique inside knowledge of the products ahead, nobody can deny that). No, instead he focuses on IP and the economic side of doing their own SoC. Which Apple seem to be quite aware of themselves, releasing their own SoC CPUs every half a year or so.
So it will indeed be very interesting to see what's behind that crest of the hill that the road of time will pass over in a year or so.
:)
-
If it leads to other companies offering all this swanky new ARM tech in useful form factors (laptops and desktops) I'm all for it. If it just means they're looking at merging their tablet and desktop/laptop lines, meh. One Windows 8 was way the hell more than enough.
-
If it leads to other companies offering all this swanky new ARM tech in useful form factors (laptops and desktops) I'm all for it. If it just means they're looking at merging their tablet and desktop/laptop lines, meh. One Windows 8 was way the hell more than enough.
While I think there will be desktops and (particularly) laptops for some time still, there is a shift going on where both the importance and the impact of "traditional" computers like these decline. It has been going on for quite some time now (all the statistics prove this, but you don't have to read boring statistics; it's enough to observe your surroundings for one day and you'll see what I mean), and it will accelerate in the near future, that's a safe bet. And this no matter the underlying architecture. The user patterns change, as simple as that.
-
I wonder if they still maintain Rosetta internally.
-
While I think there will be desktops and (particularly) laptops for some time still, there is a shift going on where both the importance and the impact of "traditional" computers like these decline. It has been going on for quite some time now (all the statistics prove this, but you don't have to read boring statistics; it's enough to observe your surroundings for one day and you'll see what I mean), and it will accelerate in the near future, that's a safe bet. And this no matter the underlying architecture. The user patterns change, as simple as that.
Lord, here we go again...first off, "observing [my] surroundings" paints a very different picture than "all statistics." I know (and see in the general public) at least as many people who use laptops primarily as use tablets, and I know many more people who are primarily desktop users. I don't know a single person who is exclusively a tablet user, but I know a lot of people who use "traditional" computers but don't have a tablet (myself included.) So if we're going simply off of personally observed reality, as you suggest, I'd have to say that "all statistics" are fairly bunk.
Even giving them the benefit of the doubt, my experiences suggest that the often-quoted trends don't tell close to the whole story. Certainly tablet sales have boomed in the past couple years, but I suspect that's as much because the past couple years is how long it's been since tablets started being not total crap and had the full force of the Apple marketing machine to make them look cool. And laptop and desktop sales have declined. But that doesn't say anything about actual day-to-day use patterns.
I have laptops that are five to ten years old that work like new, and that are perfectly usable for daily basic-use stuff like web browsing and email, not to mention the wide assortment of other software they can run perfectly well. Desktops, even moreso. So the simple fact that people aren't buying as many laptops and desktops as they used to doesn't really say anything about how many people are using them (particularly when you consider that desktops reached a pretty fair saturation point for basic use somewhere in the mid-Core 2 era.)
And I don't think it's anything like a "safe bet" that tablet growth is just going to continue to accelerate forever and ever. Tablets are in a boom phase right now; it started when the iPad made them suddenly cool, and it's been fed by the fact that damn near every manufacturer in the industry has been trying to get in on the Hot New Thing. But every boom eventually goes bust, or sometimes just settles down quietly. When the novelty rush wears off, tablet sales will stabilize at a level the market can actually support, long-term. I don't know where that will be, but I know quite certainly that it won't continue indefinitely at the growth rate it's seen over the two and a half years since the iPad's release, much less accelerate, because if it does it will outpace global population growth. That is quite simply not going to happen.
Use patterns change, but only to the extent that users let them. In the end, people will settle on the solutions that are best for them, whether or not that's what tablet evangelists want to see.
-
Thanks John,
Because if it doesn't have a keyboard I really don't want it.
And to be honest, I don't care what kind of CPU it has as long as it's powerful enough for my day-to-day uses.
-
I wonder if they still maintain Rosetta internally.
What would be the point?
-
And to be honest, I don't care what kind of CPU it has as long as it's powerful enough for my day-to-day uses.
Well, personally I would like to see some more variety in computer hardware again. I've just been perpetually frustrated by the fact that all these manufacturers are rolling out progressively sweeter ARM hardware and refusing to do anything with it besides tablets and smartphones...
What would be the point?
The same could be asked about why they maintained x86 builds of OSX before they even had any plans to switch to Intel. Not every internal project at a company is directly tied to current business prospects.
-
Arm leaps forward, elbow dives under, shoulder hits square on...
-
What would be the point?
To run x86 code on ARM obviously.
-
And interestingly enough, the idea of switching Macs to ARM is lighting up the news now....
http://www.decryptedtech.com/news/is-apple-planning-to-move-their-macs-to-arm.html
http://www.macobserver.com/tmo/article/apple-mulls-switching-dropping-intel-for-macs
http://www.zdnet.com/apple-semiconductors-brave-new-macs-7000006977/
-
I can imagine Apple's legacy support... an ARM emulating an Intel emulating a PowerPC emulating a 68K!
-
The same could be asked about why they maintained x86 builds of OSX before they even had any plans to switch to Intel. Not every internal project at a company is directly tied to current business prospects.
I think it was always the plan, in the meantime it served as a stick to beat Motorola/IBM with.
Back when people knew about computers, they had to differentiate on technology. Apple didn't use those stupid Intel processors that IBM picked, because they were different from IBM.
They needed to wait until enough people wouldn't care that it had an Intel chip instead of something from Motorola/IBM, and now they just rely on the case design and the UI to differentiate.
They also had to wait for Intel to bury the P4. It wouldn't surprise me if Apple motivated Intel to produce something better.
-
The same could be asked about why they maintained x86 builds of OSX before they even had any plans to switch to Intel. Not every internal project at a company is directly tied to current business prospects.
Steve Jobs planned and wanted Apple to switch to Intel x86 as early as the late '90s. He was personally very dissatisfied with Motorola and their PPC chips, and since Motorola also lost a lot of money when Jobs killed the clone market, there was no love between the two.
-
And interestingly enough, the idea of switching Macs to ARM is lighting up the news now....
http://www.decryptedtech.com/news/is-apple-planning-to-move-their-macs-to-arm.html
http://www.macobserver.com/tmo/article/apple-mulls-switching-dropping-intel-for-macs
http://www.zdnet.com/apple-semiconductors-brave-new-macs-7000006977/
Does that mean I'll have to jailbreak my computer to do anything useful?
I was running Geekbench on one of my older dual-processor computers and I searched the net to see what else was equivalent. The iPhone 5 had the exact same score.
So you have the power of a computer in your hand, but you are sandboxed into Apple's little world where you can't even download a file off a basic HTML website like Aminet (which I do with my 1200 all the time).
-
I think it was always the plan, in the meantime it served as a stick to beat Motorola/IBM with.
That really doesn't make a lot of sense, considering how many years they spent on PowerPC and how much they loved to place themselves in opposition to Intel. I seriously doubt that twelve years of PowerPC was just a phase they were going through while they plotted how best to dump the technology they'd invested all that time and money in.
Steve Jobs planned and wanted Apple to switch to Intel x86 as early as the late '90s. He was personally very dissatisfied with Motorola and their PPC chips, and since Motorola also lost a lot of money when Jobs killed the clone market, there was no love between the two.
Got a reference for that? Granted, Jobs was a capricious whacko when it came to picking directions for the company, and I wouldn't be surprised if he just up and decided that he wanted to change architectures, but the period from the late '90s to the laying to rest of PPC Macs in 2006 spanned a whole three new generations of PowerPC Macs (G3, G4, and G5), all of which were touted as the best thing since sliced bread and way cooler than pokey old x86. These claims that Apple was secretly planning to get all buddy-buddy with Intel even while they were roundly abusing them in the press really just don't seem to fit.
-
Got a reference for that?
Sure, it's in Isaacson's biography of Jobs.
-
Got a reference that doesn't require I buy a book I'm not really interested in reading?
-
That really doesn't make a lot of sense, considering how many years they spent on PowerPC and how much they loved to place themselves in opposition to Intel. I seriously doubt that twelve years of PowerPC was just a phase they were going through while they plotted how best to dump the technology they'd invested all that time and money in.
Apple were using PowerPC before Jobs came back. It wouldn't make any sense to ditch it when Intel and their customers weren't ready.
Jobs switched from Motorola to Intel at NeXT.
-
Steve Jobs planned and wanted Apple to switch to Intel x86 as early as the late '90s. He was personally very dissatisfied with Motorola and their PPC chips, and since Motorola also lost a lot of money when Jobs killed the clone market, there was no love between the two.
perhaps this project?
http://lowendmac.com/orchard/05/star-trek-mac-os-intel.html
-
Intel P4 - the hottest thing since the direct electric heater :D
Does it make sense business-wise for Apple to go ARM?
-
It might in the future, at least for laptops.