Those of you that are waiting to compare the One X to the GSIII


JayFReaZy

Senior Member
Apr 8, 2011
259
31
I'd say you're going with the wrong phone. The HOX outperforms the S3 in most benchmarks, and it has a better display, better build quality, and a better feel in general. Besides, if storage is so important to you, you do know the HOX gets 25GB of free Dropbox too, right? http://www.androidcentral.com/att-htc-one-x-will-have-free-dropbox-space-company-spokeswoman-says

I'd love to own the One X, and I'm still toying with the idea in the back of my head. However, storage space is important to me because I'd like to have most of my music with me on the go. I do understand there is 25GB of Dropbox; capped at 2GB of data a month, though, it doesn't help me much. Plus, when I do listen to music, most of the time it's in the subway, where there is no cell service.

I know Google Music has a way for you to listen to music offline, but from what I've read, that takes up local storage. I also take a lot of pics and videos.

The options I'm contemplating are whether to wait for the S3, or to see how much my termination fee would be and get the EVO One for Sprint, which seems better as far as options go.
 
D

Deleted member 1154637

Guest
Glad to be back on HTC. Rocking my One X since Friday; battery is pretty good.
SGS3? Not a chance. TouchWiz sucks.
 

louis.b

Senior Member
Feb 17, 2012
238
67
Sydney
LOL

12 MP camera! 2 GB RAM! Super AMOLED HD screen! Ceramic coating! Beautiful new TouchWiz!
Yeah, right. Overhyping kills a decent phone. But it's not the end of the world; the iPhone 4S was overhyped too, and it was a best-seller. But wait, this is Samsung. So who knows.
 

vioalas

Senior Member
Apr 3, 2012
670
97
Lakeland, Fl
I know Google Music has a way for you to listen to music offline, but from what I've read, that takes up local storage. I also take a lot of pics and videos.

I'm using Google Music. I just cycle the albums I want to listen to, so it doesn't use much storage. I upload my pics automatically to Dropbox too.
 

vioalas

Senior Member
Apr 3, 2012
670
97
Lakeland, Fl
. . shortage? So you're implying they would have used S4s on all HTC One X versions if it weren't for the shortage, when it is known that the only reason they bothered with S4s is LTE compatibility? ... Interesting.

Samsung is also using a quad-core chip (which, according to benchmarks against the S4, loses to the S4) because S4s are in short supply and not because of LTE? Alright...

I'll wait for more optimized quad-core benchmarks, though, before adding to this topic.

He was not directly implying that.

If you had addressed the rest of his message, you would have reached his statement that part of why they are using the quad-core processor is marketing strategy. The general public won't care if a dual core is faster; four is more than two. Just look at all the noobs who bash Windows Phones over their single-core processors, even though Windows Phone is really smooth.
 

sschrupp

Senior Member
Jul 27, 2008
371
10
Iowa
He was not directly implying that.

If you had addressed the rest of his message, you would have reached his statement that part of why they are using the quad-core processor is marketing strategy. The general public won't care if a dual core is faster; four is more than two. Just look at all the noobs who bash Windows Phones over their single-core processors, even though Windows Phone is really smooth.

Very true. It's amazing how many posts on various forums/blogs/etc. make statements like "OMG, the US is getting a disabled phone with only 2 cores instead of 4!!!" They often add, "just like the HTC One X in the US was gimped!"

4 is always better than 2, obviously. /sarcasm
 

eallan

Senior Member
Apr 23, 2010
1,428
113
Very true. It's amazing how many posts on various forums/blogs/etc. make statements like "OMG, the US is getting a disabled phone with only 2 cores instead of 4!!!" They often add, "just like the HTC One X in the US was gimped!"

4 is always better than 2, obviously. /sarcasm

To be fair, 2 is far better than 1; the jump from 2 to 4 is not quite as profound. The ability to run a single-threaded app at full throttle and still have an entire core left over is excellent.
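The diminishing return from two cores to four is easy to sketch with Amdahl's law. A quick back-of-the-envelope in Python (the 80% parallel fraction below is purely an illustrative assumption, not a measured figure for any of these chips):

```python
# Amdahl's law: overall speedup on n cores when only a fraction p
# of the workload can run in parallel.
def speedup(p: float, n: int) -> float:
    return 1 / ((1 - p) + p / n)

p = 0.8  # assume 80% of the work parallelizes (illustrative only)

gain_1_to_2 = speedup(p, 2) / speedup(p, 1)  # adding a second core
gain_2_to_4 = speedup(p, 4) / speedup(p, 2)  # doubling again to four

print(round(gain_1_to_2, 2))  # 1.67 -> a big jump from 1 to 2 cores
print(round(gain_2_to_4, 2))  # 1.5  -> a smaller jump from 2 to 4
```

Unless a workload is almost perfectly parallel, each doubling of core count buys less than the last, which is the "not quite as profound" jump described above.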
 

Top Liked Posts

    A small point that doesn't affect your claims: the Mali 400 in the Exynos 4412 is a quad core. With GPUs, core counts don't really mean much, since each core can have varying processing capability; some cores are smaller, some are bigger. I think the PowerVR 543MP4 is much larger than the Mali 400, but I don't have the numbers to back me up. Regardless, heat comes down to a lot of things: the build process, clock speed, die size, etc.


    I don't know as much about the Mali 400, but I am pretty sure it's not a true quad core, just as the Tegra 3 is not a "12-core" GPU. A lot of internet sites claim the Mali 400 is quad core, which isn't really the case: Mali-400 MP is scalable from 1 to 4 cores, and it is multi-threaded, which can give the impression of a quad-core chip. By contrast, PowerVR designates its Series 5 parts MP2 and MP4, multi-processing GPU cores x2 and x4; those are true dual-core and quad-core GPUs.

    http://www.differencebetween.com/difference-between-mali-400mp-gpu-and-vs-tegra-2/#ixzz1tYp8YnWu

    Maybe Samsung has a quad-core model; we will not know for sure until it is released. I don't think so, but we will see, as I don't see how they would fit a true quad-core GPU and a quad-core CPU in a phone.

    More likely, people don't really know what is meant by GPU "cores". Graphics processors typically use an array of basic computing "building blocks" that are also called "cores", and because computer graphics is highly parallelizable, adding more cores is one way to scale graphics performance. GPU vendors like AMD, PowerVR, Qualcomm, NVIDIA and others have done this for more than a decade, and a modern GPU for desktop computers can host 512+ "cores".

    The thing is, "GPU cores" have been marketed so heavily that the term doesn't mean much anymore. It's useful when comparing several GPU models within a single generation, but from one brand to the next, the definition of a graphics core can vary greatly. For instance, AMD usually declares many more "cores" than NVIDIA, but in the end the actual performance does not reflect the difference at all.
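    Why adding "cores" scales graphics at all: each pixel (fragment) depends only on its own inputs, so the work divides cleanly across however many processing units exist. A toy sketch, using CPU threads to stand in for GPU cores; the shading function here is made up purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 8, 8

def shade(x: int, y: int) -> int:
    # Hypothetical per-fragment function: each pixel depends only on
    # its own coordinates, never on its neighbours.
    return (x * 31 + y * 17) % 256

def render_serial() -> list:
    return [shade(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

def render_parallel(workers: int = 4) -> list:
    # Split rows across workers -- the same idea a GPU applies with
    # hundreds of "cores" instead of a handful of threads.
    def row(y):
        return [shade(x, y) for x in range(WIDTH)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        rows = pool.map(row, range(HEIGHT))
    return [px for r in rows for px in r]

# Same image either way; only the degree of parallelism differs.
assert render_serial() == render_parallel()
```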

    What gets counted as a GPU core here is referred to as an "under-core" or "fragment processor" by most engineers, meaning it is not a complete GPU core. The A5 is a true dual-core GPU, and the A5X is a true quad-core GPU. Sadly, the current Mali 400 is not.

    It says single core here, which is what I thought it was:

    http://www.arm.com/products/multimedia/mali-graphics-hardware/mali-400-mp.php

    The PowerVR 543MP4 is a true quad-core GPU; the Mali 400 MP is not. The 543MP4 is larger, much larger, and it gives off more heat because of the 45nm process and the full quad-core layout. The difference in die size between the two is enormous; the A5X is as big as some desktop processors.

    http://www.anandtech.com/show/5685/apple-a5x-die-size-measured-16294mm2-likely-still-45nm

    Here is the layout of the Mali 400. That whole diagram is one full GPU core; they count the "fragment processors" as "cores", hence the MP designation. In contrast, the A5X has four full GPU cores, each with multiple "under-cores" or "fragment processors". Notice the Mali 400 has four of these "fragment processors".

    http://www.arm.com/images/Mali-400_1000px.jpg

    Regardless, heat comes down to a lot of things: the build process, clock speed, die size, etc.

    Being an engineer, I would not know this. Lol. :) (Just kidding.)

    We will not know what the Galaxy S3 has in it until it ships.
    No problem, everyone; just trying to contribute to the board.

    At least some people find my comments informative, my wife sure doesn't. :D

    ---------- Post added at 10:47 PM ---------- Previous post was at 10:14 PM ----------

    First of all, AWESOME thread! Thank you everyone for the informative posts, especially you Iamthedudeman.

    I'm on the fence about buying the One XL this weekend, as I can't afford the One X and didn't know if it was really an upgrade from my G2x. This thread has definitely shown me that what I will be getting is truly the next generation of smartphones. At this point I don't even want the Tegra 3 version.

    On a side note, do you think it is possible that HTC used such debate-inspiring SoCs as a way to draw more attention and fill the media with info-hungry potential consumers? I mean, personally I have never scoured YouTube looking for comparative benchmarks like I have today.

    Well, the Tegra 3 is a very powerful SoC. You have to remember that most of these SoCs are overkill for most applications. Having a Tegra 3 vs. a Snapdragon S4 in the One X will be transparent to the user, because the user experience for both will be, all things considered, "equal" in most tasks.

    You might see a slight advantage for the S4 if you had them side by side, and even that is questionable. The only reason the Snapdragon was chosen was its integrated LTE and low power draw under LTE, with performance that matches the user experience of the Tegra 3. You don't want to ship an LTE-capable chip that cannot be used in a non-LTE country, which most European and other international markets are.

    The S4 was fast-tracked by Qualcomm for an early release to gain market share over the impending rival chips: the Exynos 5250 and the OMAP 5.

    http://www.ti.com/general/docs/wtbu...teId=6123&navigationId=12864&contentId=103103



    My firm got marketing materials for the Snapdragon S4 almost a year ago! With the ever-increasing performance-per-watt demands of current top-end smartphones, I can tell you that manufacturers were very happy to get the Snapdragon S4 with integrated LTE, which can perform on par with or exceed its quad-core Cortex A9 counterparts.

    For those of us end users who want the One XL, lucky us: we get the future of SoCs now, minus the Adreno 225.

    Now Qualcomm is having yield problems on the 28nm process, since it did not spend enough time in development. The gamble seems to have paid off, though, since the S4 contracts with Samsung and HTC are huge. You will most likely see an S4 in the American Galaxy S3 with LTE.
    While Qualcomm's S4 Krait is a good upgrade from their Scorpion core, it's actually much slower than the A15 at the same clock, and about the same as the A9 but a little more efficient. It's good at Vellamo and Linpack, but neither is any indication of a phone's actual performance in use. AnTuTu is the best benchmark for Android, and the S4 scores about half of what the Tegra 3 does. So saying the S4 is comparable to the Tegra 3 in performance is far from the truth.

    Same as an A9, slower than an A15? What are you smoking?

    AnTuTu is the worst benchmark there is, because it is not based on real-world applications. The Snapdragon S4 is a more complex design than a Cortex A15.

    http://www.anandtech.com/show/4940/qualcomm-new-snapdragon-s4-msm8960-krait-architecture

    ARM's Cortex A15 design by comparison features a 15-stage integer pipeline. Qualcomm's design does contain more custom logic than ARM's stock A15, which has typically given it a clock speed advantage. The A15's deeper pipeline should give it a clock speed advantage as well. Whether the two effectively cancel each other out remains to be seen.

    and:

    At 3.3, Krait should be around 30% faster than a Cortex A9 running at the same frequency. At launch Krait will run 25% faster than most A9s on the market today, a gap that will only grow as Qualcomm introduces subsequent versions of the core. It's not unreasonable to expect a 30 - 50% gain in performance over existing smartphone designs. ARM hasn't published DMIPS/MHz numbers for the Cortex A15, although rumors place its performance around 3.5 DMIPS/MHz.

    A 3.3 rating vs. a rumored 3.5 for the Cortex A15: they are on par with each other. The A9 is old in comparison and is no match for a Snapdragon S4 clock for clock.
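    The DMIPS/MHz ratings quoted above make the arithmetic easy to check. (Krait's 3.3 and the A15's rumored 3.5 come from the quote; the 2.5 figure for the Cortex A9 is ARM's published rating, used here as an assumption for the comparison.)

```python
# Dhrystone throughput scales as rating (DMIPS/MHz) x clock (MHz).
RATINGS = {
    "Cortex-A9": 2.5,             # ARM's published figure (assumed here)
    "Krait (S4)": 3.3,            # from the quote above
    "Cortex-A15": 3.5,            # rumored, per the same quote
}

def dmips(core: str, mhz: float) -> float:
    return RATINGS[core] * mhz

# Clock for clock at 1.5 GHz:
a9 = dmips("Cortex-A9", 1500)      # 3750 DMIPS
krait = dmips("Krait (S4)", 1500)  # 4950 DMIPS

print(f"Krait vs A9 at the same clock: +{krait / a9 - 1:.0%}")
# ~ +32%, matching the "around 30% faster at the same frequency" claim.
```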

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    If you want to abstract by one more level: Krait will be faster regardless of application, regardless of usage model. You're looking at a generational gap in architecture here, not simply a clock bump.


    Occasionally we'll see performance numbers that just make us laugh at their absurdity. Krait's Linpack performance is no exception. The performance advantage here is insane. The MSM8960 is able to deliver more than twice the performance of any currently shipping SoC

    http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3

    AnTuTu looks at CPU, memory, SD card reading, 2D graphics and 3D graphics, amongst other things. In what scenario do you use all of that at the same time? Is that really a real-world benchmark?

    For pure CPU performance, Linpack is the best.

    A higher score is better. The Tegra 3 has the advantage because it has four cores. The catch is that there isn't much a smartphone does that stresses all four cores; most workloads are single- or dual-threaded.


    http://www.pcmag.com/article2/0,2817,2400409,00.asp
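    For context on what Linpack actually exercises: it times the solution of a dense system of linear equations, a tight floating-point kernel that runs on a single core unless the code is explicitly parallelized. A minimal pure-Python sketch of that kind of kernel (real Linpack uses an optimized LU factorization, not this naive version):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Work on copies so the caller's data is untouched.
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n):
        # Pivot: swap in the row with the largest entry in column k.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        # Eliminate column k below the diagonal.
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    # Back-substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```

    Nothing in those inner loops spreads across cores on its own, which is why a quad core gets no free win here while per-core speed shows through directly.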


    The Snapdragon S4 beats it in just about every other test with only two cores. There would be no comparison if the Snapdragon S4 had four cores.
    There isn't one benchmark anywhere on the web where a dual-core Cortex A9 beats a Snapdragon S4.

    http://www.extremetech.com/computin...ragon-s4-has-the-competition-on-the-defensive

    With two 1.5GHz Krait cores, Qualcomm’s new part was able to thoroughly thrash dual-core ARM chips in most tests, and even beat Nvidia’s Tegra 3 quad-core SoC in benchmarks that don’t rely heavily on multi-core optimization.


    Nvidia might have thought getting to quad-core first assured it victory, but Qualcomm’s Krait core is more similar to the next generation Cortex-A15 than it is to Tegra’s A9.

    Clock for clock the S4 simply destroys any Cortex A9 on the market. The Tegra 3 keeps up with the S4 only because it is a quad-core chip; of course the Tegra 3 will win some of the benchmarks that measure raw horsepower.

    Will you see a difference between a Tegra 3, a quad-core Exynos, or a dual-core Snapdragon S4? Probably not. The Tegra 3 is fast, the quad Exynos is fast, and so is the "dual" Snapdragon S4. If and when the quad-core Snapdragon S4 comes out, this would not even be a conversation. No contest.

    You are saying that a last-generation Cortex A9 is just as fast as, or faster than, a Snapdragon S4, which is an A15-class SoC, clock for clock. Is that right? The same S4 that beats any dual-core A9, and badly? The Snapdragon S4 design is even more advanced and more efficient than the next-gen Cortex A15. Let's compare the quad Tegra 3 to the dual Snapdragon S4.

    The Tegra 3 loses six of seven tests:

    http://www.engadget.com/2012/04/02/htc-one-s-review/

    The software on both phones was pre-production, about ten days old, so take those benchmarks with a grain of salt. The numbers for the Transformer Prime and the MSM8960, which run on mature hardware and software, are a more accurate benchmark. The One XL has the MSM8960, not the MSM8260A.

    If you compare the chip that is actually in the One XL, the MSM8960, against the Tegra 3, the MSM8960 wins four tests out of six to the Tegra 3's two. Basically 4 to 2.
    If you compare the HTC One S vs. the Tegra 3, they split 3 to 3.

    http://www.anandtech.com/show/5584/htcs-new-strategy-the-htc-one

    http://www.stuff.tv/news/phone/news...ne-x-vs-htc-one-xl-–-tegra-3-vs-snapdragon-s4

    He states that the Snapdragon S4 and the Cortex A15 should be close in performance clock for clock. Dhrystone MIPS is what chip manufacturers use to gauge performance when designing and clocking a chip, to finalize its performance-to-wattage ratio. It is the de facto test for manufacturers.

    Linpack does reflect real-world performance. It stresses the CPU and only the CPU; how could you get more real-world than that?

    How does any dual-core Cortex A9 compare to the S4 in these benchmarks?

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    The Snapdragon S4 in the One XL is the MSM8960, in case you were wondering.

    He clearly states that there isn't a Cortex A9 that comes close clock for clock, and he states the percentages.

    ARM's Cortex A15 design by comparison features a 15-stage integer pipeline. Qualcomm's design does contain more custom logic than ARM's stock A15, which has typically given it a clock speed advantage. The A15's deeper pipeline should give it a clock speed advantage as well. Whether the two effectively cancel each other out remains to be seen.

    At 3.3, Krait should be around 30% faster than a Cortex A9 running at the same frequency. At launch Krait will run 25% faster than most A9s on the market today, a gap that will only grow as Qualcomm introduces subsequent versions of the core. It's not unreasonable to expect a 30 - 50% gain in performance over existing smartphone designs. ARM hasn't published DMIPS/MHz numbers for the Cortex A15, although rumors place its performance around 3.5 DMIPS/MHz.

    A 3.3 rating vs. a rumored 3.5 for the Cortex A15: they are on par with each other. The A9 is old in comparison and is no match for a Snapdragon S4 clock for clock.

    What dual-core A9 comes close to the Snapdragon S4?

    http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/2

    If you want to abstract by one more level: Krait will be faster regardless of application, regardless of usage model. You're looking at a generational gap in architecture here, not simply a clock bump.

    Occasionally we'll see performance numbers that just make us laugh at their absurdity. Krait's Linpack performance is no exception. The performance advantage here is insane. The MSM8960 is able to deliver more than twice the performance of any currently shipping SoC


    I cannot see how you can say a Cortex A9 is, core for core, as good as a Snapdragon S4. That is like saying a Cortex A9 is as good as a Cortex A15. The next round of chips will be dual-core A15s, to compete with Qualcomm's Snapdragon S4.

    The only reason you are seeing quad-core Cortex A9s is that Cortex A15 SoCs are not ready. Qualcomm released its Snapdragon S4 early and caught Nvidia and Samsung with their pants down. The Snapdragon S4 was not originally going to ship with the Adreno 225, which is just an enhanced Adreno 220; it was supposed to ship with an Adreno 320. The original plan was to release the S4 in late Q2 2012, not late Q1 2012. They are now going to call the Snapdragon S4 with the Adreno 320 the MSM8960 "Pro".

    They knew a year ago that Qualcomm might release the S4 early! Samsung rushed its Exynos 4412 to market because its A15-based Exynos 5250 was not ready yet; that is why it still runs the same Mali 400 as the Galaxy S2. The future is not 6-to-8-core smartphones any time soon; you will see more advanced, lower-power chips first. Yes, there are going to be quad S4s and A15s. The Cortex A9 will be gone by the end of this year or early next year, replaced by Cortex A15 and Snapdragon S4 SoCs.
    Maybe I can help...

    I am typing this on an Asus Transformer Prime, the first quad-core Tegra 3 tablet. We can talk numbers all day long, and I have read many threads comparing benchmark numbers between Samsung, Qualcomm, Nvidia, etc. But they are just numbers... like us grease monkeys comparing dyno numbers for our cars, they don't tell you what the real-world experience is.

    While I love my Prime, it can be extremely frustrating at times. I was hoping that with the power of the Tegra 3 this thing would be lightning fast.

    The Tegra 3 with its five cores may be fast in benchmark tests, but when you throw in ICS and real-world applications it can stumble. On heavy web pages the stock browser won't respond, scrolling lags and gets choppy, and it force closes... often.

    I am nowhere near as smart as 99% of you guys, so I am just giving a basic user experience, kind of a real-world deal. I freaking love my Prime and would ABSOLUTELY LOVE to show it off to friends, but doing so would make me nervous.

    A lot of Prime users have found ways around these problems, i.e. changing settings, changing how it renders, flashing to "on demand", changing the browser, etc. BUT why, WHY should I have to do this on a $700 (I have the keyboard dock also) Tegra 3 quad-core device that should be able to crush such basic functions?


    Not sure if I helped any of you guys. Since I can't help with anything technical, I thought maybe I could give some of you a real-world sense of what a Tegra 3 is like.

    Personally I am in the same boat as you guys... I am reading every rumor about the GS3 and the One X. My beloved two-year-old HTC EVO is ready for an upgrade...