
[PROJ] Overclocking the Adreno GPU on Snapdragon Devices

#1 - Geniusdog254 (Retired Recognized Developer, OP)
(Last edited by Geniusdog254; 12th July 2010 at 01:05 AM.)

INFO:
Let me save you the time of reading all this. In its current form, this is impossible. An inside contact at Qualcomm has told us we can't change the GPU clock from userland or the kernel. That means we're basically screwed. Here is the post by Jack_R1:
Quote:
Checked, as promised. Bad news. To sum it up in one sentence: a GPU overclock without touching the rest of the system is plainly impossible, a GPU overclock while playing with the whole system is most probably still impossible (pending a further, deeper check), and it's definitely impossible for anyone who doesn't have the clock diagram, which is under NDA and thus won't be available. The reason isn't software, and don't ask what it is; I won't give direct explanations.

I've written a long post with some explanations of clock networks, to educate those who want to learn and to prevent some of the big mistakes I've seen along this thread. It's pending approval, since I don't want to accidentally spill more than I can share. I hope it gets approved so I can post it; it should give some more insight.

The only good news is that a global overclock affecting the 1GHz CPU clock will affect the GPU too.

I'm sure we've all heard about being able to overclock the GPU on some of the old MSM devices, but the Snapdragon handles the graphics chip in a different way. The goal of this thread is to try to overclock the GPU on Snapdragon devices as well.

As far as I can tell, there is nothing GPU related in acpuclock-scorpion.c, at least not for setting clock speeds.

In board-mahimahi.c there is some kgsl init code, but as far as I can tell it isn't setting the clock there; instead it seems to point to PWR_RAIL_GRP_CLK to set the clock. The variable is defined in board-mahimahi.c, but I'm not sure where it gets set, since it doesn't seem to appear in any of the other board files. I could be completely off here, though.

In drivers/video/msm/gpu/kgsl/kgsl.c there is a function called kgsl_clk_enable that seems to be called whenever the GPU is enabled. It looks like this:

Code:
/* the hw and clk enable/disable funcs must be either called from softirq or
 * with mutex held */
static void kgsl_clk_enable(void)
{
	//clk_set_rate(kgsl_driver.ebi1_clk, 128000000);
	clk_set_rate(kgsl_driver.ebi1_clk, 245000000);
	clk_enable(kgsl_driver.imem_clk);
	clk_enable(kgsl_driver.grp_clk);
}
The commented-out line is the original value; I replaced it with my own value on the line below it in a failed attempt to overclock. Probably a naive effort on my part, and I doubt it's that simple, but it was worth a shot.

According to the clk.h headers in the mainline Linux kernel, clk_set_rate is the function that sets a clock's rate. The first argument is a pointer to the struct clk identifying which clock to set, and the second is an unsigned long giving the desired rate in Hz. Is it setting the right clock there for Snapdragon chips, or only the clock for older chips? Actually, I've looked again: the kgsl files are ONLY for the newer Qualcomm chips, the QSD8x50 Snapdragons and the midrange MSM7x27 series that replaced the old MSM7x00a chips found in the Dream and Magic. At least that's the way it seems to me, judging from what Qualcomm and AMD have written in the source.
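
For reference, the consumer side of that API is small. Here is a minimal sketch of how a driver grabs and reprograms a clock with the old (pre-common-clock-framework) clk API; the clock name "grp_clk" and the target rate are placeholders for illustration, not values confirmed to work on the QSD8x50:

Code:
#include <linux/kernel.h>
#include <linux/clk.h>
#include <linux/err.h>

static int example_bump_clock(struct device *dev)
{
	struct clk *clk;

	/* look up the clock by its board-specific name */
	clk = clk_get(dev, "grp_clk");
	if (IS_ERR(clk))
		return PTR_ERR(clk);

	/* request a new rate in Hz, then gate the clock on */
	clk_set_rate(clk, 256000000);
	clk_enable(clk);

	pr_info("example: grp_clk now %lu Hz\n", clk_get_rate(clk));
	return 0;
}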

I'm in way over my head with this source. I'm but a lowly Java dev, but I really want to solve this. Can anyone with a little more experience than me throw in some more info? Sorry if this doesn't make much sense; if anything is unclear, just ask and I'll try to explain a little more.

Regards,
Jesse C


EDIT: Okay, I did a little more digging, and those kgsl settings should apply to QSD8x50 chips. In the config file, under Drivers > Graphics Support, you can enable 3D acceleration for QSD8x50 and MSM7x27 chips. The tag for that is CONFIG_MSM_KGSL_MMU. If you check kgsl.c, it tests whether that option is enabled in the config, and if it is, kgsl.c and all of the kgsl code gets compiled and used. That tells me that either the clock is not being set, or the wrong clock is being set. I'm adding some debug code right now so I can see in dmesg what code is actually being run.
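
As an aside, that kind of gating is the standard Kconfig pattern: the enabled option becomes a preprocessor define that the source can test. A generic illustration only; the function here is hypothetical, not from the MSM tree:

Code:
#ifdef CONFIG_MSM_KGSL_MMU
	/* compiled only when the option is enabled in the kernel config */
	kgsl_mmu_init();	/* hypothetical call, for illustration */
#endif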

EDIT 2: Another status update. Adding the debug code showed that the clock is being set to my value whenever kgsl initializes. Also, as IntersectRaven pointed out, clocks.h in arch/arm/mach-msm nicely lays out what each clock is:

Code:
grp_clock = Graphics clock
ebi1_clock = External Bus Interface 1 clock
imem_clock = Internal Graphics Memory clock
If you want to add your own debug code, call the function pr_info with the string you want and it will be written to the kernel log, which you can view with dmesg. An example would be pr_info("kgsl: clock set at 245MHz\n");

EDIT 3: I now know the values for all the GPU-related clocks.
Code:
<6>[   70.681793] kgsl: grp_clock= 256000000
<6>[   70.682464] kgsl: imem_clock= 256000000
<6>[   70.683441] kgsl: ebi1_clock= 128000000
If you want to read these values yourself, the following code in the kgsl_clk_enable function will output them to the kernel log:

Code:
	unsigned long clkg = clk_get_rate(kgsl_driver.grp_clk);
	unsigned long clki = clk_get_rate(kgsl_driver.imem_clk);
	unsigned long clke = clk_get_rate(kgsl_driver.ebi1_clk);

	/* read each clock's current rate and print it to the kernel log */
	pr_info("kgsl: grp_clock = %lu\n", clkg);
	pr_info("kgsl: imem_clock = %lu\n", clki);
	pr_info("kgsl: ebi1_clock = %lu\n", clke);
Just add that into the function I mentioned and it will print the values to the kernel log (quite often, in fact), so I wouldn't leave it in any longer than it takes to see the values.


EDIT 4: 7-2-2010
Qualcomm has now released a new driver, and it looks a lot more promising for our purposes. I'm looking through it today, but I have other things I need to do. I will look at it, but I can't promise much time on it until at least Sunday.

EDIT 7-7-2010:
I've been collaborating with storm99999 over GTalk, and things aren't looking good. Here's what it comes down to:
  • If we try to set it with any of the clk_set_rate methods, there is no effect. No matter how we change it, it just stays at the original value. This either means it's set in the radio (impossible for us to change) or that it's set once at boot and never changes, but we just don't know where it's actually set (more likely, but I'm not sure either way).
  • If we try to change it with msm_proc_comm, which is a direct interface to the hardware, the kernel doesn't boot at all. This is really, really strange. We can monitor the clock with pr_info as before, and if we read the data1 argument it prints to the kernel log fine, but if we try to read the data2 argument passed to msm_proc_comm, it freezes on boot as well. (A hypothetical sketch of that call follows below.)
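
For anyone who wants to poke at the same interface, here is a minimal sketch of that second approach. The msm_proc_comm() prototype matches the one in arch/arm/mach-msm, but the clock ID P_GRP_CLK and the assumption that PCOM_CLKCTL_RPC_SET_RATE will take it are unverified; on our devices this path froze the boot:

Code:
#include <mach/proc_comm.h>

/* hypothetical sketch: ask the baseband to set a clock rate over proc_comm */
static int try_set_grp_rate(unsigned rate_hz)
{
	unsigned id = P_GRP_CLK;	/* assumed clock ID, unverified */
	unsigned rate = rate_hz;

	return msm_proc_comm(PCOM_CLKCTL_RPC_SET_RATE, &id, &rate);
}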

We seem to be out of ideas here. We're open to any reasonable suggestions, so if you have any, PLEASE let us know in this thread or PM one of us. You can also email me at geniusdog254@gmail.com.
#2 - jlevy73 (Senior Member)
(Last edited by jlevy73; 26th June 2010 at 03:30 AM.)
This is a great idea for sure. I will certainly lend a hand where I can.

That being said, I don't know if it is possible to OC the GPU. I remember discussing this with kmobs in the past, and he didn't think so. Perhaps tweaking OpenGL might prove more fruitful. All of this is definitely worth thinking about.
 
#3 - Geniusdog254 (Retired Recognized Developer, OP)
Quote:
Originally Posted by jlevy73 View Post
This is a great idea for sure. I will certainly lend a hand where I can.

That being said, I don't know if it is possible to OC the GPU. I remember discussing this with kmobs in the past, and he didn't think so. Perhaps tweaking OpenGL might prove more fruitful. All of this is definitely worth thinking about.
Hmm, well, that may be kind of a downer...

I know it sets the values somewhere, but I can't find anywhere they could be set other than the kgsl code, and that has no effect on performance at all. Unless it could be set in hardware?

I don't really care about harming my Nexus at this point, now that I've upgraded to an Evo, so I'm willing to try dangerous kernels on my Nexus (read: any kernel changes I make).

If someone could give me more ideas about where to look, I'd be grateful. I've gone through everything I can think of, and it doesn't seem to be getting me any closer.

P.S. I love your avatar jlevy! Simpsons FTW!
 
#4 - jlevy73 (Senior Member)
This is definitely one for the kernel masters like Cyanogen, pershoot, and kmobs to weigh in on. In the meantime, I am going to rip through the code and see what I can find (if anything). Too bad you are moving on to the EVO; we'll miss all of the great ideas/solutions you provide (well, N1 users will).
 
#5 - Geniusdog254 (Retired Recognized Developer, OP)
Quote:
Originally Posted by jlevy73 View Post
This is definitely one for the kernel masters like Cyanogen, pershoot, and kmobs to weigh in on. In the meantime, I am going to rip through the code and see what I can find (if anything). Too bad you are moving on to the EVO; we'll miss all of the great ideas/solutions you provide (well, N1 users will).
Thanks! I'll still be hanging around the N1 forums, since so much of the code is interchangeable, both phones having QSD chips and all. If you're looking through the source, get with me on GTalk or Wave and we can work together on it. Also, I've updated the first post with a little more info.
 
#6 - jlevy73 (Senior Member)
Quote:
Originally Posted by Geniusdog254 View Post
Thanks! I'll still be hanging around the N1 forums, since so much of the code is interchangeable, both phones having QSD chips and all. If you're looking through the source, get with me on GTalk or Wave and we can work together on it. Also, I've updated the first post with a little more info.
Sounds good, I will PM you my gtalk address.
 
#7 - Geniusdog254 (Retired Recognized Developer, OP)
So I added the debug code. It shows that my code is called every time the GPU is initialized, which means it is setting the EBI1 clock to 245MHz instead of the original 128MHz. However, I guess that is the wrong clock. It also defines two other clocks along with the EBI1 clock, but EBI1 is the only one it sets a value for; the other two it just turns on (hardware values?).

 
#8 - intersectRaven (Recognized Developer)
(Last edited by intersectRaven; 26th June 2010 at 05:51 AM.)
After looking at the code, the clocks seem to be initialized in kgsl_platform_probe by clk_get. (Correct me if I'm wrong, since I'm a newbie at tracing through the GPU...)

*Nope... wrong analysis... that was the clock SOURCE instead of the clock RATE...
**After looking through a bigger part of everything, I don't think you can overclock it specifically. The way I see it, the clock source used by kgsl is shared, with different rates being set on it elsewhere. This is why you're not getting it to work: when the kgsl part you modified is called, it sets the modified rate, BUT the moment it exits, some other part that uses the same clock source resets it to 128MHz. If you want to change that, you'll pretty much have to modify everything that uses the same clock source. (Again, correct me if I'm wrong, but I think that pretty much describes what I'm seeing in the code. A toy illustration of the effect follows below.)
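
To make the described failure mode concrete, here is a toy sketch under the assumption that two consumers resolve to the same underlying clock; the second consumer is hypothetical, and "ebi1_clk" is just the name kgsl uses:

Code:
#include <linux/clk.h>

/* toy illustration: two handles pointing at the same physical clock */
static void clock_sharing_demo(void)
{
	struct clk *kgsl_clk  = clk_get(NULL, "ebi1_clk");
	struct clk *other_clk = clk_get(NULL, "ebi1_clk");	/* hypothetical second consumer */

	clk_set_rate(kgsl_clk, 245000000);	/* the overclock takes effect... */
	clk_set_rate(other_clk, 128000000);	/* ...until another user of the same
						 * source programs the rate back */
}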
 
#9 - jlevy73 (Senior Member)
Quote:
Originally Posted by Geniusdog254 View Post
So I added the debug code. It shows that my code is called every time the GPU is initialized, which means it is setting the EBI1 clock to 245MHz instead of the original 128MHz. However, I guess that is the wrong clock. It also defines two other clocks along with the EBI1 clock, but EBI1 is the only one it sets a value for; the other two it just turns on (hardware values?).

Yeah, I think the other two values are hardware related. Now to find the right clock...
 
#10 - jlevy73 (Senior Member)
Quote:
Originally Posted by intersectRaven View Post
After looking at the code, the clocks seem to be initialized in kgsl_platform_probe by clk_get. (Correct me if I'm wrong, since I'm a newbie at tracing through the GPU...)

*Nope... wrong analysis... that was the clock SOURCE instead of the clock RATE...
**After looking through a bigger part of everything, I don't think you can overclock it specifically. The way I see it, the clock source used by kgsl is shared, with different rates being set on it elsewhere. This is why you're not getting it to work: when the kgsl part you modified is called, it sets the modified rate, BUT the moment it exits, some other part that uses the same clock source resets it to 128MHz. If you want to change that, you'll pretty much have to modify everything that uses the same clock source. (Again, correct me if I'm wrong, but I think that pretty much describes what I'm seeing in the code.)
I think you are reading it correctly, and one would have to manipulate all the other values to ensure it won't reset to 128MHz. However, even if you modify all of that, there could be a master override that keeps the clock at a preset value.

One interesting thing I did come across is that one can write their own OpenGL code: http://android-developers.blogspot.c...el/OpenGL%20ES

So if one could manipulate the values for OpenGL, it would be almost equivalent to OCing the GPU. Just a thought.
