Originally Posted by CDB-Man
Interesting thought. Though, you could always use something like Antutu, or get an FPS meter app?
It's a bit different - it measures real playback performance, not some abstract number. Even if it were measuring pure CPU performance, different architectures have different efficiency at video decoding (think of extensions like MMX/SSE/AVX on x86), plus decoders get better over time (you get more fps for the same CPU performance).
Is 40k in Antutu enough to play 720p hi10p flawlessly? "It depends".
An FPS meter will (at best) show only frame drops - when the player was not fast enough to draw a frame. If you play a 30 fps video and it says 20 fps, that doesn't mean you can play a similar video at 20 fps, or that you need 50% more speed. And if it plays without frame drops, you'll never know how much extra performance you have.
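To make the idea concrete, here's a minimal sketch in Python of the number I mean - "percent of realtime" - assuming you could time how long the decoder spends on each frame (the function name and inputs are my own illustration, not any player's actual API):

```python
def playback_headroom(frame_decode_times_s, video_fps):
    """Estimate decode speed as a percentage of realtime.

    frame_decode_times_s: wall-clock seconds the decoder spent on each frame
    video_fps: the video's nominal frame rate
    """
    decode_time = sum(frame_decode_times_s)
    video_duration = len(frame_decode_times_s) / video_fps
    return 100.0 * video_duration / decode_time

# 90 frames of a 30 fps clip (3 s of video) decoded in 2.5 s of wall time:
speed = playback_headroom([2.5 / 90] * 90, 30.0)  # ~120% of realtime
```

Anything over 100% means spare headroom; at exactly 100% the decoder only just keeps up, and drops are one thermal hiccup away.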
But that way it would be possible to do such things:
1) Run video and say:
- "hey, it runs at >120%, I don't need to touch anything to be happy".
- "it runs at 100%, which means it can barely keep up - I need to do something".
- "it runs at <80%, nothing will help so it's better to give up".
2) Change settings and say:
- "switching to yuv/rgb32/rgb16 made it 10% faster, so I should probably use it if I'm happy with the quality"
- "I needed some extra performance and the speed-up tricks got me an extra 30% - just what I needed"
3) Share a video, ask others to benchmark it, and then judge how capable the device is (I've seen people call 15 fps "flawless"/"watchable"/"playable").
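The three cases above could even be boiled down to a trivial rule of thumb - a sketch with the same cut-offs I used, which are of course my own arbitrary thresholds:

```python
def verdict(headroom_pct):
    """Turn a percent-of-realtime figure into advice (hypothetical thresholds)."""
    if headroom_pct > 120:
        return "fine as-is"          # comfortable headroom, touch nothing
    if headroom_pct >= 80:
        return "borderline - try tweaking settings"  # maybe settings can save it
    return "give up"                 # no setting will bridge a >20% gap

verdict(130)  # "fine as-is"
```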
For example, I've wasted hours testing hi10p performance on my Z3c - sometimes it plays flawlessly, sometimes it overheats (drops CPU freq), sometimes it lags... and there are different settings to play with, let alone videos of different complexity (and subtitles).