Those of you who are waiting to compare the One X to the GSIII


vitaminxero

Senior Member
Apr 11, 2011
328
37
San Francisco, California
They did do something different. The current Apple TV uses a dual-core A5 with one core disabled, built at 32nm. The A5 in the 2012 16GB iPad 2 is a fully functional A5, also at 32nm. A 32nm die is roughly half the size of the 45nm A5X in the iPad 3, which means half the heat, or less.
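The area arithmetic here can be sketched quickly. This is a minimal Python sketch assuming ideal geometric scaling (die area goes with the square of the feature size) and that heat roughly tracks area at the same clock and voltage; real process shrinks never scale this cleanly. The 162.94 mm² figure is AnandTech's measurement of the 45nm A5X linked later in the thread.

```python
# Rough die-area scaling when moving a design from 45nm to 32nm.
# Assumption: area scales with the square of the feature size, and
# heat roughly tracks area at the same clock and voltage.

def scaled_area(area_mm2: float, old_nm: float, new_nm: float) -> float:
    """Ideal area after a process shrink (classic geometric scaling)."""
    return area_mm2 * (new_nm / old_nm) ** 2

# AnandTech measured the 45nm A5X at about 162.94 mm^2.
a5x_45nm = 162.94
a5x_32nm = scaled_area(a5x_45nm, 45, 32)
print(f"Ideal 32nm A5X die: {a5x_32nm:.1f} mm^2 "
      f"({a5x_32nm / a5x_45nm:.0%} of the 45nm die)")
```

So under ideal scaling a 32nm A5X would come in at roughly half the 45nm die, which is where the "half the heat" ballpark comes from.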

I see no reason why a 32nm A5X, which sports a dual-core A9 and a quad-core GPU, couldn't fit in an iPhone 5. But you are correct that even at 32nm it may still be too hot. A quad-core GPU gives off more heat than a dual-core GPU or a quad-core CPU.

With some OS and speed tweaks it is possible, if the iPhone 5 is slightly bigger with a bigger screen. A good GPU matters more to iOS devices because iOS 5 uses hardware acceleration at more than three times the rate of Android devices that are not yet running 4.0 (ICS). ICS takes much greater advantage of hardware acceleration and does a good job of it. Not at the level of iOS 5 yet, but close. ICS is really impressive to say the least. Biggest update to Android to date.

Tegra 3 fits in a phone just fine, and that is 40nm. But it is a quad-core CPU, not GPU. The Exynos 4412 is also a quad-core CPU, not GPU, at 32nm. The A5X is the reverse of those two: dual-core CPU and quad-core GPU, currently at 45nm, which has something to do with the excessive heat. At 32nm it would not get that hot.

We may also see the first Cortex A15 SoC with the A6. The Snapdragon S4 is a custom design that uses the Cortex A15 reference designs, but it is pretty much an in-house custom-designed SoC. Qualcomm licenses the reference designs only, that's it.


Whereas chip designers like Samsung and Texas Instruments (TI) license the architecture for ARM's Cortex cores. Companies can also obtain an ARM architectural license to design their own, different CPU cores using the ARM instruction set. Qualcomm designed its own ARM-compatible cores based only on the reference designs and custom-builds the chip.

The Snapdragon S3 was based on the Cortex A8 and A9 reference designs, not just the A8 as most believe. Reference designs can be used even before ARM licenses out the architecture, because there are not many SoC manufacturers out there that can do much with just a reference design. Qualcomm is one of them. I know ARM is worried that Qualcomm will start to use its own designs, and that may be the case with the Snapdragon S5, which may or may not be based on ARM. The Cortex A15 and the Snapdragon S4 are both about 30 to 40 percent faster than a Cortex A9, clock for clock and core for core. Both scale to 2.5GHz and are also made to be used in laptops (ultrabooks) as well as tablets.

Most don't realize that the Snapdragon S4 is a more advanced design than even the Cortex A15. More efficient clock for clock, and it uses less power. There are no better chip designers than those who work at Qualcomm, and that includes Intel. Intel only has to worry about the CPU and GPU; SoC makers have to worry about a lot more than that.

There isn't a lot of difference, design-wise, between a Cortex A8 and A9 ARM chip aside from scalability. There is a lot of difference between a Cortex A15 and a Cortex A8 or A9, just as there is a big difference between a Snapdragon S4 and a Cortex A8 or A9.

A small point that doesn't affect your claims. The Mali 400 in the Exynos 4412 is a quad core. In terms of GPUs, cores don't really mean much, as each core can have varying processing capability. Some cores are smaller, some are bigger. I think the PowerVR 543MP4 is much larger than the Mali 400, but I don't have the numbers to back me up. Regardless, heat comes down to a lot of things like the build process, clock speed, size, etc.
 

iamthedudeman

Member
Apr 15, 2012
28
29
A small point that doesn't affect your claims. The Mali 400 in the Exynos 4412 is a quad core. In terms of GPUs, cores don't really mean much, as each core can have varying processing capability. Some cores are smaller, some are bigger. I think the PowerVR 543MP4 is much larger than the Mali 400, but I don't have the numbers to back me up. Regardless, heat comes down to a lot of things like the build process, clock speed, size, etc.


I don't know as much about the Mali 400, but I am pretty sure it's not a true quad core, just as the Tegra 3 is not a '12-core' GPU. A lot of internet sites are claiming that the Mali 400 is quad core, which isn't really the case. Mali-400 MP is scalable from 1 to 4 cores. It is multi-threaded, which can give the impression of a quad-core chip. For instance, Imagination states that their PowerVR Series 5 parts come as MP2 and MP4: multi-processing GPU cores x2 and x4. Those are true dual-core and quad-core GPUs.

http://www.differencebetween.com/difference-between-mali-400mp-gpu-and-vs-tegra-2/#ixzz1tYp8YnWu

Maybe Samsung has a quad-core model; we will not know for sure until it is released. I don't think so, but we will see, as I don't see how they would fit a true 'quad core' GPU and a quad-core CPU in a phone.

More likely is that people don't really know what is meant by GPU cores. Graphics processors typically use an array of basic computing “building blocks” that are also called “cores”. And because computer graphics is highly parallelizable, adding more cores is one way to scale graphics performance. GPU vendors like AMD, PowerVR, Qualcomm, NVIDIA and others have done this for more than a decade, and a modern GPU for desktop computers can host 512+ “cores”.

The thing is, GPU cores have been marketed so much that the term doesn’t mean much anymore. It’s useful when comparing several GPU models within a single generation, but from one brand to the next, the definition of a graphics core can vary greatly. For instance, AMD usually declares having many more “cores” than NVIDIA, but in the end the actual performance does not reflect the difference at all.
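The point about core counts can be illustrated with made-up numbers. Everything below is hypothetical (the "vendors" and per-core ALU counts are not real specs for any shipping GPU): two designs with the same total ALU width, marketed with wildly different "core" counts.

```python
# Why raw "core" counts don't compare across GPU vendors:
# the same total ALU width can be packaged as a few fat cores
# or many thin "cores". Numbers below are illustrative only.

gpus = {
    "Vendor A": {"cores": 4,  "alus_per_core": 16},  # few fat cores
    "Vendor B": {"cores": 64, "alus_per_core": 1},   # many thin "cores"
}

for name, g in gpus.items():
    total = g["cores"] * g["alus_per_core"]
    print(f"{name}: {g['cores']} cores x {g['alus_per_core']} ALUs"
          f" = {total} ALUs total")

# Both have 64 ALUs of raw width, yet one is marketed as "quad core"
# and the other as "64 core".
```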

What many vendors count as a GPU 'core' is what most engineers would call an 'undercore' or 'fragment processor', meaning it is not a complete GPU core. The A5 is a true dual-core GPU, and the A5X is a true quad-core GPU. Sadly, the current Mali 400 is not.

It says here single core which is what I thought it was:

http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-400-mp.php

The PowerVR 543MP4 is a true quad-core GPU; the Mali 400 MP is not. The PowerVR 543MP4 is larger, much larger, and gives off more heat because of the 45nm process and the full quad-core layout. The difference in die size between the two is enormous. The A5X is as big as some desktop processors.

http://www.anandtech.com/show/5685/apple-a5x-die-size-measured-16294mm2-likely-still-45nm

Here is the layout of the Mali 400. That whole diagram is a single full GPU core. They count the 'fragment processors' as 'cores', hence the MP designation. In contrast, the A5X has four full GPU cores, each with multiple 'undercores' or 'fragment processors'. Notice the Mali 400 has four of these 'fragment processors'.

http://www.arm.com/images/Mali-400_1000px.jpg

Regardless, heat comes down to a lot of things like the build process, clockspeed, size, etc.

Being an engineer, I would not know this. Lol. :) (Just kidding).

We will not know what the Galaxy S3 has in it until it ships.
 

vitaminxero

Senior Member
Apr 11, 2011
328
37
San Francisco, California
We will not know what the Galaxy S3 has in it until it ships.

Whatever it is, Samsung is going to tweak the clocks to make sure it benchmarks better than the competition. ;)
 

k4p741nkrunch

Senior Member
Jun 14, 2010
512
153
Florida
First of all, AWESOME thread! Thank you everyone for the informative posts, especially you Iamthedudeman.

I'm on the fence about buying the One-XL this weekend, as I can't afford the One-X and didn't know if it was really an upgrade from my G2x. This thread has definitely shown me that what I will be getting is truly the next generation of smartphones. At this point I don't even want the Tegra 3 version.

On a side note, do you think it is possible that HTC used such debate-inspiring SoCs as a way to draw more attention and fill the media with info-hungry potential consumers? I mean, personally I have never scoured YouTube looking for comparative benchmarks like I have today.
 

Maroon Mushroom

Senior Member
Jul 15, 2010
1,172
444
Portland, Oregon
There aren't even real benchmarks out for the S3.

I also don't see what's wrong with the quad-core A9. Sure, it's not the dual A15 with the new Mali, but it's still an incredible SoC.
 

iamthedudeman

Member
Apr 15, 2012
28
29
No problem everyone, just trying to contribute to the board.

At least some people find my comments informative, my wife sure doesn't. :D

---------- Post added at 10:47 PM ---------- Previous post was at 10:14 PM ----------

On a side note, do you think it is possible that HTC used such debate-inspiring SoCs as a way to draw more attention and fill the media with info-hungry potential consumers? I mean, personally I have never scoured YouTube looking for comparative benchmarks like I have today.

Well, the Tegra 3 is a very powerful SoC. You have to remember that most of these SoCs are 'overkill' for most applications. Having a Tegra 3 vs a Snapdragon S4 in the One X will be transparent to the user, because the user experience for both will be, all things considered, 'equal' in most tasks.

You might see a slight advantage for the S4 if you had them side by side, and even that is questionable. The only reason the Snapdragon was chosen was the integrated LTE and the low power draw while using LTE, and its performance matches the Tegra 3 in terms of user experience. You don't want to ship an LTE-capable chip in a market where LTE can't be used, which is the case in most European and other international markets.

The S4 was fast-tracked by Qualcomm for an early release to gain market share over the impending rival chips: the Exynos 5520 and the OMAP 5.

http://www.ti.com/general/docs/wtbu...teId=6123&navigationId=12864&contentId=103103



My firm got marketing materials almost a year ago for the Snapdragon S4! With the ever-increasing performance-per-watt demands and power draw of current top-end smartphones, I can tell you that manufacturers were very happy to get the Snapdragon S4 with integrated LTE, which can perform on par with or exceed its Cortex A9 quad counterparts.

For the end user, lucky for us who want the One XL, we get the future of SoCs now, minus the Adreno 225.

Qualcomm is now having yield problems with the 28nm process, since it did not spend enough time in the development stage. Even so, the gamble seems to have paid off, since the S4 contract with Samsung is huge, as is HTC's. You will most likely see an S4 in the American Galaxy S3 with LTE.
 

k4p741nkrunch

Senior Member
Jun 14, 2010
512
153
Florida
Dudeman, you are like the encyclopedia of all things relevant!

I asked my question mainly to spark discussion, only because I believed the topic was speculative. It's nice to have a definitive explanation for what some people might consider HTC's greatest mistake.

Honestly, when I first saw the sign on the wall for the One X at RadioShack, I thought back to those MWC articles about the Tegra 3 quad-core smartphone and said, "Wow, I can actually afford the first quad-core phone!" Later I came home, found out about the XL vs. X debacle, and was crushed. I immediately started preparing for two more months of dealing with my not-so-bueno phone until the Galaxy S3 came out. After much deliberation and info cramming, I'm actually MORE excited to get my hands on this revolutionary dual-core. You're totally right, the benefits beyond it are transparent. Unless you stick a floating FPS meter on your games, the XL should still get over 35fps.

I'm really interested in trying one thing out once this puppy gets rooted. If any of you have ever used Chainfire3D you'll know what I mean. I used to trick Tegra games into running on my Evo, quite smoothly might I add. I loaded a plugin with Chainfire3D Pro and fixed the market settings so Tegra apps were available. I wonder if I can do the same thing here. I really want to try Glowball, and Dark Meadows when it comes out.
 

iamthedudeman

Member
Apr 15, 2012
28
29
Not really. You should see my casual wardrobe, which was relevant back in the 90's. :eek:

Just wanted to update this thread, as we are seeing some real, full reviews of the One XL. I posted a few of the links in another thread but thought they would be more relevant here. As some of you know, the Samsung Galaxy S3 is being revealed today. The specs will be very similar to the One XL's.

Full review:

http://www.anandtech.com/show/5779/htc-one-x-for-att-review/1

Summary:

http://blog.gsmarena.com/htc-one-x-for-att-turns-out-faster-than-the-international-version/

Battery life:

http://blog.htc.com/2012/05/improving-battery-life/

http://www.engadget.com/2012/05/03/htc-one-x-battery-life/

Especially the US version. Last month a few marketing execs from ASC (Arrow Semiconductor Corporate) were telling me the American version of the Galaxy S3 was to have the same SoC as the One XL. I could not confirm it, but they have been accurate in the past. Seems that is indeed the case.

American S4 version of the Galaxy S3:

http://pocketnow.com/android/samsung-galaxy-s3s-dual-core-cpu-for-verizon-confirmed

Other than the form factor and memory, these two will be very close in the spec department. Another question is build quality, which is the reason I picked the HTC One XL. HTC's build quality is second to none, on par with Apple's.

Samsung, well... not so much. But it is getting better. They might surprise us today. Will it be better than the HTC One X? I doubt it. As good? Maybe.
 

mesko402

Member
Nov 20, 2007
15
5
Atlanta
Thanks dudeman!

That was really informative with great references. Love the descriptive architectural differences between all the current SOCs on the market. If you have any more information please share. I think there are several of us who appreciate your insight and knowledge!
 

darkamikaze

Senior Member
Aug 7, 2010
1,524
174
Candyland
VERY disappointing SGS3...
It looks ugly, not to mention they are more software oriented... software I'd want to change regardless of features.
 

carporsche

Senior Member
Jul 17, 2010
69
0
Wooohooo. One X it is then. Way better looking, better build quality, and dare I say a better screen with respect to color saturation than the AMOLED.

Good to know I shall get the One X for $50 from Amazon :)

The S3 has a 2100mAh battery, but reviews have still shown that the 1800mAh battery on the S4 One X does an excellent job.
 

UMGixxer

Senior Member
Jun 30, 2008
391
14
I made the right decision with the HOX, no regrets.

Edit: Can't believe I waited all this time just to hear this. With all the hype you would have figured it would be superior to even laptops or something, lol, or able to read my mind. Really, like 6 months of hype, all for nothing but face and voice recognition. Total fail to me.
 

vioalas

Senior Member
Apr 3, 2012
670
97
Lakeland, Fl
Ugly... GNex with buttons. I'm so happy!!!! And "release in summer"? Um... that's vague.

2100mAh battery; the extra 300mAh will be wasted tracking my eyes.
 

    While Qualcomm's S4 Krait is a good upgrade from their Scorpion core, it's actually much slower than A15 at the same clock. About the same as A9 but a little more efficient. It's good at Vellamo and Linpack, but neither is any indication of a phone's actual performance in use. AnTuTu is the best benchmark for Android, and the S4 scores about half of what Tegra 3 does. So saying the S4 is comparable to Tegra 3 in performance is far from the truth.

    Same as A9, slower than A15? What are you smoking?

    AnTuTu is the worst benchmark there is, because it is not based on real-world applications. The Snapdragon S4 is a more complex design than a Cortex A15.

    http://www.anandtech.com/show/4940/qualcomm-new-snapdragon-s4-msm8960-krait-architecture

    ARM's Cortex A15 design by comparison features a 15-stage integer pipeline. Qualcomm's design does contain more custom logic than ARM's stock A15, which has typically given it a clock speed advantage. The A15's deeper pipeline should give it a clock speed advantage as well. Whether the two effectively cancel each other out remains to be seen.

    and:

    At 3.3, Krait should be around 30% faster than a Cortex A9 running at the same frequency. At launch Krait will run 25% faster than most A9s on the market today, a gap that will only grow as Qualcomm introduces subsequent versions of the core. It's not unreasonable to expect a 30 - 50% gain in performance over existing smartphone designs. ARM hasn't published DMIPS/MHz numbers for the Cortex A15, although rumors place its performance around 3.5 DMIPS/MHz.

    A 3.3 rating vs a rumored 3.5 for the Cortex A15: they are on par with each other. The A9 is old in comparison; no comparison to a Snapdragon S4 clock for clock.
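The clock-for-clock arithmetic behind those DMIPS/MHz ratings can be sketched in a few lines. The 3.3 (Krait) and rumored 3.5 (A15) figures are the ones quoted above; the 2.5 figure for the Cortex A9 is a commonly cited number that I'm assuming here, not one from the thread, and DMIPS/MHz is a crude synthetic metric, so treat the ratios as ballpark only.

```python
# DMIPS at a common clock, to isolate per-clock efficiency.
# A9 figure (2.5 DMIPS/MHz) is an assumed, commonly cited value;
# 3.3 and 3.5 are the figures quoted in the AnandTech excerpt.

ratings = {"Cortex A9": 2.5, "Krait (S4)": 3.3, "Cortex A15": 3.5}

mhz = 1500  # same clock for every core
for core, dmips_per_mhz in ratings.items():
    print(f"{core}: {dmips_per_mhz * mhz:,.0f} DMIPS at {mhz} MHz")

gain = ratings["Krait (S4)"] / ratings["Cortex A9"] - 1
print(f"Krait over A9, clock for clock: +{gain:.0%}")
```

The ~32% result lines up with the "around 30% faster than a Cortex A9 running at the same frequency" claim quoted above.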

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    If you want to abstract by one more level: Krait will be faster regardless of application, regardless of usage model. You're looking at a generational gap in architecture here, not simply a clock bump.


    Occasionally we'll see performance numbers that just make us laugh at their absurdity. Krait's Linpack performance is no exception. The performance advantage here is insane. The MSM8960 is able to deliver more than twice the performance of any currently shipping SoC

    http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3

    AnTuTu looks at CPU, memory, SD card reading, 2D graphics and 3D graphics, amongst other things. In what scenario do you use all of that at the same time? Is that really a real-world benchmark?

    For pure CPU performance, Linpack is the best.

    A higher score is better. Tegra 3 has the advantage because it has four cores. The bad thing is that there isn't much a smartphone does that stresses all four cores; most tasks are single- or dual-threaded.


    http://www.pcmag.com/article2/0,2817,2400409,00.asp
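The "not much stresses all four cores" point is essentially Amdahl's law: if only part of a workload runs in parallel, extra cores give diminishing returns. A quick sketch, where the 30% parallel fraction is an assumed illustrative value, not a measurement of any phone workload:

```python
# Amdahl's law: overall speedup when only a fraction of the work
# parallelizes across cores. The 30% fraction is illustrative.

def speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.30  # assume only 30% of a typical phone workload parallelizes
for n in (1, 2, 4):
    print(f"{n} core(s): {speedup(p, n):.2f}x")
```

Under that assumption, going from two to four cores buys only about 10% more overall speed, which is why quad-core wins in multi-threaded benchmarks don't show up in everyday use.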


    The Snapdragon S4 beats it in just about every other test with only two cores. There wouldn't be a comparison if the Snapdragon S4 had four cores.
    There isn't one benchmark anywhere on the web where a dual-core Cortex A9 beats a Snapdragon S4.

    http://www.extremetech.com/computin...ragon-s4-has-the-competition-on-the-defensive

    With two 1.5GHz Krait cores, Qualcomm’s new part was able to thoroughly thrash dual-core ARM chips in most tests, and even beat Nvidia’s Tegra 3 quad-core SoC in benchmarks that don’t rely heavily on multi-core optimization.


    Nvidia might have thought getting to quad-core first assured it victory, but Qualcomm’s Krait core is more similar to the next generation Cortex-A15 than it is to Tegra’s A9.

    Clock for clock, the S4 literally destroys any Cortex A9 on the market. The Tegra 3 keeps up with the S4 because it is a quad-core chip. Of course, in benchmarks which display raw horsepower, the Tegra 3 will win some.

    Will you see a difference between a Tegra 3, a quad-core Exynos, or a dual-core Snapdragon S4? Probably not. The Tegra 3 is fast, the quad Exynos is fast, and so is the 'dual' Snapdragon S4. If and when the quad-core Snapdragon S4 comes out, this would not even be a conversation. No contest.

    You are saying that a last-generation Cortex A9 is just as fast or faster than a Snapdragon S4, which is an A15-class SoC, clock for clock. Is that right? The same S4 which beats any dual-core A9, and badly. The Snapdragon S4 design is even more advanced and more efficient than the next-gen Cortex A15. Let's compare the quad Tegra 3 to the dual Snapdragon S4.

    The Tegra 3 loses six of seven tests:

    http://www.engadget.com/2012/04/02/htc-one-s-review/

    The software for both phones was pre-production, about ten days old, so take those benchmarks with a grain of salt. The numbers for the Transformer Prime and the MSM8960, which run on mature hardware and software, would be more accurate. The One XL has the MSM8960, not the MSM8260A.

    If you compare the chip that is actually in the One XL, the MSM8960, against the Tegra 3, the MSM8960 wins four of six tests to the Tegra 3's two. Basically 4-2.
    If you compare the HTC One S vs the Tegra 3, they split 3 to 3.

    http://www.anandtech.com/show/5584/htcs-new-strategy-the-htc-one

    http://www.stuff.tv/news/phone/news...ne-x-vs-htc-one-xl-–-tegra-3-vs-snapdragon-s4

    He states that the Snapdragon S4 and the Cortex A15 should be close in performance clock for clock. Dhrystone MIPS is what the chip manufacturers use to gauge performance when they are designing and clocking a chip, to finalize its performance-to-wattage ratio. It is the de facto test for manufacturers.
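The performance-to-wattage figure of merit described here reduces to DMIPS per watt. A minimal sketch; the clock and power-envelope numbers below are illustrative assumptions, not vendor specs:

```python
# DMIPS per watt: the kind of perf/W figure of merit used when
# finalizing a core's clock. All numbers here are illustrative.

def dmips(dmips_per_mhz: float, mhz: float) -> float:
    return dmips_per_mhz * mhz

def dmips_per_watt(dmips_per_mhz: float, mhz: float, watts: float) -> float:
    return dmips(dmips_per_mhz, mhz) / watts

# Hypothetical core: 3.3 DMIPS/MHz at 1.5 GHz in a 0.75 W envelope.
print(f"{dmips_per_watt(3.3, 1500, 0.75):,.0f} DMIPS/W")
```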

    Linpack does reflect real-world performance. It stresses the CPU and only the CPU. How could you get more real-world than that?

    How does any dual-core Cortex A9 compare to the S4 in these benchmarks?

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    The Snapdragon S4 in the One XL is the MSM8960, in case you were wondering.

    He clearly states that there isn't a Cortex A9 that comes close clock for clock, and gives the percentages.

    ARM's Cortex A15 design by comparison features a 15-stage integer pipeline. Qualcomm's design does contain more custom logic than ARM's stock A15, which has typically given it a clock speed advantage. The A15's deeper pipeline should give it a clock speed advantage as well. Whether the two effectively cancel each other out remains to be seen.

    At 3.3, Krait should be around 30% faster than a Cortex A9 running at the same frequency. At launch Krait will run 25% faster than most A9s on the market today, a gap that will only grow as Qualcomm introduces subsequent versions of the core. It's not unreasonable to expect a 30 - 50% gain in performance over existing smartphone designs. ARM hasn't published DMIPS/MHz numbers for the Cortex A15, although rumors place its performance around 3.5 DMIPS/MHz.

    A 3.3 rating vs a rumored 3.5 for the Cortex A15: they are on par with each other. The A9 is old in comparison; no comparison to a Snapdragon S4 clock for clock.

    What dual core A9 comes close to the Snapdragon S4?

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    If you want to abstract by one more level: Krait will be faster regardless of application, regardless of usage model. You're looking at a generational gap in architecture here, not simply a clock bump.

    Occasionally we'll see performance numbers that just make us laugh at their absurdity. Krait's Linpack performance is no exception. The performance advantage here is insane. The MSM8960 is able to deliver more than twice the performance of any currently shipping SoC


    I cannot see how you can say a Cortex A9, core for core, is as good as a Snapdragon S4. That is like saying a Cortex A9 is as good as a Cortex A15. The next round of chips will be dual-core A15s to compete with Qualcomm's Snapdragon S4.

    The only reason you are seeing quad-core Cortex A9s is because they don't have Cortex A15 SoCs ready. Qualcomm released their Snapdragon S4 early and caught Nvidia and Samsung with their pants down. The Snapdragon S4 was not originally supposed to ship with an Adreno 225, which is just an enhanced Adreno 220; it was supposed to ship with an Adreno 320. The original plan was to release the S4 in late Q2 2012, not late Q1 2012. They are now going to call the Snapdragon S4 with the Adreno 320 the MSM8960 'Pro'.

    They knew a year ago that Qualcomm might release the S4 early! Samsung rushed their Exynos 4412 to market because their A15-based 5520 was not ready yet. That is why it still runs the same Mali 400 as the Galaxy S2. The future is not 6-to-8-core smartphones any time soon; you will see more advanced, lower-wattage, lower-power chips first. Yes, there are going to be quad S4s and A15s. The Cortex A9 will be gone by the end of this year or early next, replaced by Cortex A15 and Snapdragon S4 SoCs.
    Maybe I can help......

    I am typing this on an Asus Transformer Prime, the first quad-core Tegra 3 tablet. We can talk numbers all day long, and I have read many threads comparing benchmark numbers between Sammy, Qualcomm, Nvidia, etc. But they are just numbers... like us grease monkeys comparing dyno numbers for our cars, they don't tell you what the real-world experience is.

    While I love my Prime, it can be extremely frustrating at times. I was hoping with the power of the Tegra 3 that this thing would be lightning fast.

    Tegra 3 with its 5 cores may be fast in benchmark tests, but when you throw in ICS and real-world applications it can stumble. On heavy web pages the stock browser won't respond, scrolling will lag and be choppy, and it will force close... often.

    I am nowhere near as smart as 99% of you guys, so I am giving a basic user experience, kinda just a real-world deal. I freaking love my Prime, and would ABSOLUTELY LOVE to show it off to friends, but doing so would make me nervous.

    A lot of Prime users have found ways around these problems, i.e. changing settings, changing how it renders, flashing to "on demand", changing the browser, etc., etc. BUT why, WHY would I have to do this on a $700 (I have the keyboard dock also) Tegra 3 quad-core device that should be able to crush such basic functions?


    Not sure if I helped any of you guys. Since I can't help with anything technical, I thought maybe I could give some of you a real-world sense of what a Tegra 3 is like.

    Personally I am in the same boat as you guys... I am reading every rumor about the GS3 and the One X. My 2-year-old beloved HTC Evo is ready for an upgrade...