How to make your app Xperia Play optimized


bigbison

New member
Dec 19, 2011
3
15
Hello, I am the developer of Zeus Arena, an Xperia Play optimized port of the ioquake3 engine for Android.

It was a bit of a headache to get Xperia Play controls working when I first started working on Zeus Arena about 4 or 5 months ago. Recently, however, I decided to try to support more devices with Zeus Arena by adding touch screen controls. Since all of Zeus Arena's graphics were being done in native code, using a native activity no less, this meant I would either have to maintain two completely separate applications or rewrite most of Zeus Arena. I did the latter.

In the process I found an easy way to add Xperia Play controls to an existing application (for those of you unfamiliar with Zeus Arena, it is built upon kwaak3). So this post will be a brief tutorial on adding Xperia Play controls to an existing application.

First, a note: this tutorial will not tell you how to set up or use the NDK; there are plenty of tutorials for that already. The hardest part of this process should be setting up the NDK.

Adding Xperia Play controls to your existing application:

The main problem with adding Xperia Play controls to your application is the touchpad (if you don't need to support the touchpad, ignore this tutorial and look online for SE's tutorial, it's easy). The touchpad requires that you use a native activity to get your input, so the main purpose of this tutorial is how to use a native activity while changing your existing code as little as possible.

First you will need to write a native activity in C code. This activity will poll for input from the touchpad and load references to the methods in your Java code that deal with the touchpad input (see the sample code below).

Next, open your main activity and change it from extending Activity to extending NativeActivity.
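One thing these steps don't spell out: once your activity extends NativeActivity, the framework reads the android.app.lib_name meta-data from the manifest to decide which lib<name>.so to load and where to find ANativeActivity_onCreate (which native_app_glue provides once you define android_main). A rough sketch of the manifest entry; the activity and library names here are assumptions based on the sample code further down, so adjust them to your own project:

Code:
<activity android:name=".Game"
          android:label="@string/app_name"
          android:configChanges="orientation|keyboardHidden">
    <!-- Tells NativeActivity which shared library to load (lib<name>.so).
         "kwaakjni" matches the System.loadLibrary() call in the Java sample below. -->
    <meta-data android:name="android.app.lib_name"
               android:value="kwaakjni" />
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>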

Now, and this is the key part really, as soon as possible after your call to super.onCreate(savedInstanceState); add this line of code: getWindow().takeSurface(null);

That one magical line of code allows you to keep doing your graphics in your Java code, meaning you don't have to change your existing Java code any more than this.

However, we aren't quite done yet. As mentioned above, the native code is getting the input for the touchpad; we probably want to send this to the Java code where all the other event handling takes place.

This is simple: make a Java method that accepts touchpad input, look it up in your native activity, and then call it whenever the touchpad is touched.

Example code from Zeus Arena (if you are familiar with Zeus Arena's code (it's open source) the example code won't look too familiar, because it has been modified to make it a bit simpler and some of it is from a new update coming to Zeus Arena soon):

Java code:
import android.app.NativeActivity;
import android.os.Bundle;
import android.os.SystemClock;
import android.view.MotionEvent;

public class Game extends NativeActivity {

    private KwaakView mGLSurfaceView; // the custom GLSurfaceView-based view used by Zeus Arena

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().takeSurface(null);
        RegisterThis();
        mGLSurfaceView = new KwaakView(this, this); // a custom-made view for Zeus Arena
        setContentView(mGLSurfaceView);
        mGLSurfaceView.requestFocus();
        mGLSurfaceView.setId(1);
    }

    // gives the native activity a reference to this object so it can call OnNativeMotion
    public native int RegisterThis();

    // loads the .so, change the library name to whatever your library is called
    static {
        System.loadLibrary("kwaakjni");
    }

    // called by the native activity whenever touch input is found
    public void OnNativeMotion(int action, int x, int y, int source, int device_id) {
        if (source == 1048584) { // 0x00100008, i.e. InputDevice.SOURCE_TOUCHPAD
            // Obtain a MotionEvent object
            long downTime = SystemClock.uptimeMillis();
            long eventTime = SystemClock.uptimeMillis() + 100;
            // List of meta states found here: developer.android.com/reference/android/view/KeyEvent.html#getMetaState()
            int metaState = 0;
            MotionEvent motionEvent = MotionEvent.obtain(
                    downTime,
                    eventTime,
                    action,
                    x,
                    (366 - y), // 366 = hard-coded touchpad height, used to flip the y axis
                    metaState
            );
            mGLSurfaceView.onTouchPadEvent(motionEvent); // custom-made method for dealing with touchpad input
        } else {
            // Obtain a MotionEvent object
            long downTime = SystemClock.uptimeMillis();
            long eventTime = SystemClock.uptimeMillis() + 100;
            // List of meta states found here: developer.android.com/reference/android/view/KeyEvent.html#getMetaState()
            int metaState = 0;
            MotionEvent motionEvent = MotionEvent.obtain(
                    downTime,
                    eventTime,
                    action,
                    x,
                    y,
                    metaState
            );
            // Dispatch the touch event to the view
            mGLSurfaceView.dispatchTouchEvent(motionEvent);
        }
    }
}
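The 366 above is the touchpad height hard-coded for the Xperia Play. If you would rather not hard-code it, something along these lines should let you read it at runtime by enumerating input devices. This is only a sketch (the method name is mine, not from the Zeus Arena source), using the API level 9 era InputDevice calls since the Play shipped with Gingerbread:

Code:
// Sketch: find the touchpad device and read its Y range instead of hard-coding 366.
private int findTouchPadHeight() {
    for (int deviceId : android.view.InputDevice.getDeviceIds()) {
        android.view.InputDevice dev = android.view.InputDevice.getDevice(deviceId);
        if (dev != null
                && (dev.getSources() & android.view.InputDevice.SOURCE_TOUCHPAD)
                        == android.view.InputDevice.SOURCE_TOUCHPAD) {
            android.view.InputDevice.MotionRange range =
                    dev.getMotionRange(android.view.InputDevice.MOTION_RANGE_Y);
            if (range != null) {
                return (int) range.getMax();
            }
        }
    }
    return 366; // fall back to the hard-coded Xperia Play value
}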

Native code:

#include <dlfcn.h>
#include <stdio.h>
#include <string.h>
#include <android/log.h>
#include <jni.h>
#include <errno.h>
#include <android_native_app_glue.h>
#include <time.h>
#include <unistd.h>
#include "quake_two_android_Quake2.h"

#define EXPORT_ME __attribute__ ((visibility("default")))

static JavaVM *jVM;

typedef unsigned char BOOL;
#define FALSE 0
#define TRUE 1

//|------------------------------------------------------ NATIVE ACTIVITY ------------------------------------------------------|
static jobject g_pActivity = 0;
static jmethodID javaOnNDKTouch = 0;
static jmethodID javaOnNDKKey = 0; // looked up in JNI_OnLoad below; this declaration was missing from the original paste
/**
* Our saved state data.
*/
struct TOUCHSTATE
{
int down;
int x;
int y;
};

/**
* Shared state for our app.
*/
struct ENGINE
{
struct android_app* app;
int render;
int width;
int height;
int has_focus;
//ugly way to track touch states
struct TOUCHSTATE touchstate_screen[64];
struct TOUCHSTATE touchstate_pad[64];
};

void attach(){

}

/**
* Process the next input event.
*/
static
int32_t
engine_handle_input( struct android_app* app, AInputEvent* event )
{
JNIEnv *jni;
(*jVM)->AttachCurrentThread(jVM, &jni, NULL);

struct ENGINE* engine = (struct ENGINE*)app->userData;
if( AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION )
{
int nPointerCount = AMotionEvent_getPointerCount( event );
int nSourceId = AInputEvent_getSource( event );
int n;

for( n = 0 ; n < nPointerCount ; ++n )
{
int nPointerId = AMotionEvent_getPointerId( event, n );
int nAction = AMOTION_EVENT_ACTION_MASK & AMotionEvent_getAction( event );
int nRawAction = AMotionEvent_getAction( event );
struct TOUCHSTATE *touchstate = 0;

if( nSourceId == AINPUT_SOURCE_TOUCHPAD )
touchstate = engine->touchstate_pad;
else
touchstate = engine->touchstate_screen;

if( nAction == AMOTION_EVENT_ACTION_POINTER_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_UP )
{
int nPointerIndex = (AMotionEvent_getAction( event ) & AMOTION_EVENT_ACTION_POINTER_INDEX_MASK) >> AMOTION_EVENT_ACTION_POINTER_INDEX_SHIFT;
nPointerId = AMotionEvent_getPointerId( event, nPointerIndex );
}

if( nAction == AMOTION_EVENT_ACTION_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_DOWN )
{
touchstate[nPointerId].down = 1;
}
else if( nAction == AMOTION_EVENT_ACTION_UP || nAction == AMOTION_EVENT_ACTION_POINTER_UP || nAction == AMOTION_EVENT_ACTION_CANCEL )
{
touchstate[nPointerId].down = 0;
}

if (touchstate[nPointerId].down == 1)
{
touchstate[nPointerId].x = AMotionEvent_getX( event, n );
touchstate[nPointerId].y = AMotionEvent_getY( event, n );
}
int handled = 0;
if( jni && g_pActivity ){
//send the event to java code, sends both touch screen and touch pad events, I think the java code will still intercept touch screen events
//so sending them probably isn't needed. If it is needed intercepting key events will be needed in native code as well.
(*jni)->CallVoidMethod( jni, g_pActivity, javaOnNDKTouch, nRawAction, touchstate[nPointerId].x, touchstate[nPointerId].y, nSourceId, 0 );
}
}

return 1;
}
return 0;
}

/**
* Process the next main command.
*/
static
void
engine_handle_cmd( struct android_app* app, int32_t cmd )
{
struct ENGINE* engine = (struct ENGINE*)app->userData;
switch( cmd )
{
case APP_CMD_SAVE_STATE:
// The system has asked us to save our current state. Do so if needed
break;
case APP_CMD_INIT_WINDOW:
// The window is being shown, get it ready.
if( engine->app->window != NULL )
{
engine->has_focus = 1;
}
break;

case APP_CMD_GAINED_FOCUS:
engine->has_focus = 1;
break;

case APP_CMD_LOST_FOCUS:
// When our app loses focus, we stop rendering.
engine->render = 0;
engine->has_focus = 0;
//engine_draw_frame( engine );
break;
}
}

/**
* This is the main entry point of a native application that is using
* android_native_app_glue. It runs in its own thread, with its own
* event loop for receiving input events and doing other things (rendering).
*/
void
android_main( struct android_app* state )
{
struct ENGINE engine;

// Make sure glue isn't stripped.
app_dummy();

memset( &engine, 0, sizeof(engine) );
state->userData = &engine;
state->onAppCmd = engine_handle_cmd;
state->onInputEvent = engine_handle_input;
engine.app = state;

//setup(state);
//JNIEnv *env;
//(*jVM)->AttachCurrentThread(jVM, &env, NULL);

if( state->savedState != NULL )
{
// We are starting with a previous saved state; restore from it.
}
// our 'main loop'
while( 1 )
{
// Read all pending events.
int ident;
int events;
struct android_poll_source* source;
// If not rendering, we will block forever waiting for events.
// If rendering, we loop until all events are read, then continue
// to draw the next frame.
while( (ident = ALooper_pollAll( 100, NULL, &events, (void**)&source) ) >= 0 )
//while( (ident = ALooper_pollAll( 100, NULL, &events, (void**)&source) ) >= 0 )
{
// Process this event.
// This will call the function pointer android_app::onInputEvent(), which in our case is
// engine_handle_input()
if( source != NULL )
{
source->process( state, source );
}
// Check if we are exiting.
if( state->destroyRequested != 0 )
{
return;
}
usleep(17000); //17 milliseconds
}
}
}
jint EXPORT_ME
JNICALL Java_quake_two_android_Quake2_RegisterThis(JNIEnv * env, jobject clazz){
g_pActivity = (jobject)(*env)->NewGlobalRef(env, clazz);
return 0;
}
jint EXPORT_ME JNICALL
JNI_OnLoad(JavaVM * vm, void * reserved)
{
JNIEnv *env;
jVM = vm;
if((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK)
{
return -1;
}
const char* interface_path = "quake/two/android/Quake2";
jclass java_activity_class = (*env)->FindClass( env, interface_path );
javaOnNDKTouch = (*env)->GetMethodID( env, java_activity_class, "OnNativeMotion", "(IIIII)V");
javaOnNDKKey = (*env)->GetMethodID( env, java_activity_class, "OnNativeKeyPress", "(III)V");
return JNI_VERSION_1_4;
}
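Since the post deliberately skips NDK setup, here is roughly what the build file needs so that the native_app_glue sources get compiled in and the android/log libraries get linked. The module name must match the System.loadLibrary() call in the Java code; the source file name is just a placeholder for your own files:

Code:
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
# Module name must match System.loadLibrary("kwaakjni") in the Java code.
LOCAL_MODULE := kwaakjni
LOCAL_SRC_FILES := native_activity.c   # plus your existing engine sources
LOCAL_LDLIBS := -llog -landroid
LOCAL_STATIC_LIBRARIES := android_native_app_glue
include $(BUILD_SHARED_LIBRARY)

# Pulls the glue sources in from $NDK/sources/android/native_app_glue
$(call import-module,android/native_app_glue)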

Licences:
/*
* Copyright (c) 2011, Sony Ericsson Mobile Communications AB.
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* * Neither the name of the Sony Ericsson Mobile Communications AB nor the
* names of its contributors may be used to endorse or promote products
* derived from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*/

/*
* This example uses the NDK and a helper library available in the NDK called 'native app glue',
* which is available in %NDK_ROOT/source/android/native_app_glue. If you are new to NDK or to
* the NativeActivity, look through the native_app_glue source to see how you should set up your
* native app and handle callbacks and messages from Android. Note that the callbacks registered
* in the ANativeActivity_onCreate() entry-point must return in a timely manner, as does
* ANativeActivity_onCreate() itself. The Native App Glue does this by creating a pipe() and
* synchronization objects to handle communication between the Android, the NativeActivity and
* the game/sample logic.
*
* In this example, we read the 'pointer' information from touch events from both the touch-screen
* and the touch-pad (if available). We store their positions and state (up or down), then draw
* the touch positions scaled to the screen.
*
* Although we are using hard-coded values for the touch-pad resolution, you can and should read
* those values at runtime in java by enumerating InputDevices and finding the touchpad device.
*
*/

/*
* Kwaak3 - Java to quake3 interface
* Copyright (C) 2010 Roderick Colenbrander
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 2
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
 

bigbison

New member
Dec 19, 2011
3
15
Yeah, I just had to scroll down to see whether there were any replies, and it is damn long.

Most of it is example code which can be skipped unless you're actually going to use it; the first section tells you what you need to know.

Also, anyone can take this tutorial and put it anywhere people may want to see it.
 

Reapman

Senior Member
Oct 5, 2010
115
19
Sweet, thanks for posting! I've only done a bit of development, and eventually want to get a game going - this is bookmarked and thanked!!!
 

wiffeltje

Member
May 24, 2009
26
6
... of it is example code which can be skipped unless you're actually going to use ...

bigbison,

I've also been playing a bit with the native app example. And I still have some questions and didn't find a good place to ask them. So, maybe you can help me out ?

If I understand it correctly ...
The onAppCmd and onInputEvent are called from within the main application thread.
The android_main is running in a separate thread.

But I don't see any mutexes (or other locking mechanism) in place to prevent simultaneous access to the same application data from both threads. Could it be that your code (and also the standard example code) is not thread-safe?

Are we in for some random crashes? Or am I missing something? (I hope so ...)
 

twe69

Senior Member
Dec 24, 2008
593
31
Nice post, hopefully helps someone....
Just to let you know there is an emoticon in your code that should be fixed....
thanks.
 

Hogwarts

Senior Member
Dec 6, 2011
376
116
Just wanted to say I'm a big fan of Zeus Arena. It's really nice.

Very fun to play offline too.
Great work man, you got the Xperia controls working perfectly.
 

wiffeltje

Member
May 24, 2009
26
6
bigbison,
... Or am I missing something? (I hope so ...)

Just answering my own question. :eek:

After posting my question I took yet another look at the android_native_app_glue code and how ALooper_pollAll interacts with it. And looking at it again, I think I get it.

It looks like the threading is OK and that (most of) the main loop won't be able to run while the callback functions are running. So, that should be OK.

So, I did overlook some things :eek:
Sorry to disturb you a little bit too early with my question.
 

ninjatjj

Member
Sep 6, 2013
5
0
OnNativeKeyPress

Hi,

I am trying to follow this a little. I can access the touchpad just fine in my NativeActivity, but I have lost all access to onKeyDown and onKeyUp using the Sony tutorial. Your code mentions OnNativeKeyPress but I can't see you calling it anywhere; is this expected?

Thanks for any pointers you can offer!

 
Hi,

I am trying to follow this a little. I can access the touchpad just fine in my NativeActivity, but I have lost all access to onKeyDown and onKeyUp using the Sony tutorial. Your code mentions OnNativeKeyPress but I can't see you calling it anywhere; is this expected?

Thanks for any pointers you can offer!

If you're still searching:

Code:
static
int32_t
engine_handle_input( struct android_app* app, AInputEvent* event )
{
	JNIEnv *jni;
	(*jVM)->AttachCurrentThread(jVM, &jni, NULL);

	struct ENGINE* engine = (struct ENGINE*)app->userData;
	if( AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION )
	{
		int nPointerCount	= AMotionEvent_getPointerCount( event );
		int nSourceId		= AInputEvent_getSource( event );
		int n;

		jboolean newTouch = JNI_TRUE;
		for( n = 0 ; n < nPointerCount ; ++n )
		{
			int nPointerId	= AMotionEvent_getPointerId( event, n );
			int nAction		= AMOTION_EVENT_ACTION_MASK & AMotionEvent_getAction( event );
			int nRawAction	= AMotionEvent_getAction( event );
			struct TOUCHSTATE *touchstate = 0;

			if( nSourceId == AINPUT_SOURCE_TOUCHPAD )
				touchstate = engine->touchstate_pad;
			else
				touchstate = engine->touchstate_screen;

			if( nAction == AMOTION_EVENT_ACTION_POINTER_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_UP )
			{
				int nPointerIndex = (AMotionEvent_getAction( event ) & AMOTION_EVENT_ACTION_POINTER_INDEX_MASK) >> AMOTION_EVENT_ACTION_POINTER_INDEX_SHIFT;
				nPointerId = AMotionEvent_getPointerId( event, nPointerIndex );
			}

			if( nAction == AMOTION_EVENT_ACTION_DOWN || nAction == AMOTION_EVENT_ACTION_POINTER_DOWN )
			{
				touchstate[nPointerId].down = 1;
			}
			else if( nAction == AMOTION_EVENT_ACTION_UP || nAction == AMOTION_EVENT_ACTION_POINTER_UP || nAction == AMOTION_EVENT_ACTION_CANCEL )
			{
				touchstate[nPointerId].down = 0;
			}

			if (touchstate[nPointerId].down == 1)
			{
				touchstate[nPointerId].x = AMotionEvent_getX( event, n );
				touchstate[nPointerId].y = AMotionEvent_getY( event, n );
			}
			int handled = 0;
			if( jni && g_pActivity ){
				(*jni)->CallVoidMethod( jni, g_pActivity, javaOnNDKTouch, nRawAction, touchstate[nPointerId].x, touchstate[nPointerId].y, nSourceId, 0, newTouch);
			}
			newTouch = JNI_FALSE;
		}

		return 1;
	}
	else if (AInputEvent_getType(event) == AINPUT_EVENT_TYPE_KEY){
		int action = AKeyEvent_getAction(event);
		int keyCode = AKeyEvent_getKeyCode(event);
		if(jni && g_pActivity){
			if((*jni)->ExceptionCheck(jni)) {
				(*jni)->ExceptionDescribe(jni);
				(*jni)->ExceptionClear(jni);
			}
			(*jni)->CallIntMethod(jni, g_pActivity, javaOnNDKKey, action, keyCode, AKeyEvent_getMetaState(event));
		}
	}
	return 0;
}

taken from: https://play.google.com/store/apps/details?id=zeus.arena.source&hl=en&rdid=zeus.arena.source

Btw: The application should not be run. It doesn't work and will waste your time. Open the APK with a file browser and get the 7z out of the assets manually. There are plenty of places to copy that code from, but I guess it wasn't on Sony's developer page :/
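One note on the Java side, which the snippet above leaves out: the original JNI_OnLoad looks up OnNativeKeyPress with signature "(III)V", while this snippet invokes it via CallIntMethod, so make sure the Java return type and the Call*Method variant agree (either a void method called with CallVoidMethod, or an int-returning method registered as "(III)I"). A minimal sketch of a void handler that forwards the raw values back into the usual Activity key callbacks; the names here are assumptions, not taken from the Zeus Arena source:

Code:
// Called from native code for every key event; 'action' and 'keyCode' are the
// raw values from AKeyEvent_getAction() / AKeyEvent_getKeyCode().
public void OnNativeKeyPress(int action, int keyCode, int metaState) {
    KeyEvent event = new KeyEvent(action, keyCode);
    if (action == KeyEvent.ACTION_DOWN) {
        onKeyDown(keyCode, event);
    } else if (action == KeyEvent.ACTION_UP) {
        onKeyUp(keyCode, event);
    }
}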
 
Thanks for the code snippet, I managed to get this working already: github.com/ninjatjj/btjoypad

Alright. I just added support to reicast and stumbled on this while trying to find the Sony page that moved. This is a very basic copy-paste implementation. It should really explain the what and why if it's going to leave out so much.

I'll take a look at how you did it shortly and let you know if there's anything I did that might help. I rewrote a lot of the native code to allow skipping over the input handling for everything but the Play.
 

ninjatjj

Member
Sep 6, 2013
5
0
Alright. I just added support to reicast and stumbled on this while trying to find the Sony page that moved. This is a very basic copy-paste implementation. It should really explain the what and why if it's going to leave out so much.

I'll take a look at how you did it shortly and let you know if there's anything I did that might help. I rewrote a lot of the native code to allow skipping over the input handling for everything but the Play.

Much appreciated - I was actually checking out reicast yesterday, but I don't have my MvC ROM with me (on holiday). Are you getting decent performance on the Xperia Play? That is one device that definitely needs a hardware upgrade soon.
 
Much appreciated - I was actually checking out reicast yesterday, but I don't have my MvC ROM with me (on holiday). Are you getting decent performance on the Xperia Play? That is one device that definitely needs a hardware upgrade soon.

I saw you're using the full native version, so not much I did will have any benefit. I use the native code just to pass input back to the Java side for ease of use now, since everything else is Java-based.

It is starting to. It doesn't work on Gingerbread due to memory allocation issues with the build but I get the cutscenes at about 60 fps and gameplay is anywhere from 20 to 40 depending on how intense the scene is. That's what I've been working on, though. Every little option to squeeze another 10 or so out of it.
 
