Hi folks!
This thread explains a feature I first introduced in the Siyah kernel (available in 4.1beta5) that detects finger movements and triggers actions when certain gestures are made.
There are also apps on the market that do this, but this approach works at the kernel level.
I welcome your feedback on any advantages and drawbacks you find.
Change log
01.12.2012
Added instructions on how to use camera from the lockscreen (see post #3).
Added link to Flint2's Kernel Gesture Builder (see post #2).
Added index.
27.10.2012
Added 3 additional actions (see items 9, 10 and 11 at the end of this post): v1.2 sample script.
Fixed mDNIe negative toggle for newer JB kernels.
23.08.2012
Added action commands and explanations to the 3rd post, with all that has been identified so far.
18.08.2012
Added sample CWM file for the S3 (different coordinates) - thanks to Gokhanmoral
16.08.2012
CWM-flashable zip with ready to use examples
8 gestures
Actions: invert mDNIe; launch camera (3 different apps detected, including JB); direct dial (must edit script); toggle bluetooth; toggle WiFi; play/pause; simulate power button (save the physical button); simulate home key
fixed JB / CM10 hanging on boot when script is present
13.08.2012
Initial post
There are 2 steps required to use this feature:
1. Defining the gestures - in other words, the path that the fingers are expected to make for the gesture to be detected
2. Reacting to detected gestures
Defining gestures
The sysfs entry /sys/devices/virtual/misc/touch_gestures/gesture_patterns provides access to the gesture definitions - the hot spots for the path that each finger must travel for a gesture to be triggered.
"cat /sys/devices/virtual/misc/touch_gestures/gesture_patterns" will show you the current definitions, and some comments on the expected structure:
Code:
# Touch gestures
#
# Syntax
# <gesture_no>:<finger_no>:(x_min|x_max,y_min|y_max)
# ...
# gesture_no: 1 to 10
# finger_no : 1 to 10
# max steps per gesture and finger: 10
# Gesture 1:
...
Choosing the coordinates
Your S2 screen has the following X,Y coordinates:
Code:
+---------------+
|0,0 479,0|
| |
| |
| |
| |
| |
| |
|0,799 479,799|
+---------------+
Each hotspot is a rectangle from X1 to X2 and Y1 to Y2. For example, a hotspot covering just the top half of the screen would have X between 0 and 479 and Y between 0 and 399 (~ half of 800).
A maximum of 10 gestures can be defined. Each gesture uses 1 or more fingers (up to 10, though in practice more than 4 is rarely feasible), and each finger follows a path of up to 10 consecutive hotspots.
All gestures must be defined in one go by writing multiple lines to /sys/devices/virtual/misc/touch_gestures/gesture_patterns, in the following form:
Code:
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
... additional hotspots for the same finger, or additional fingers, or additional gestures ...
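As a quick dry run of the syntax above, here is a minimal sketch that defines a single one-finger left-to-right swipe. The PATTERNS path is a stand-in so you can test the write off-device; on the phone it would be the real gesture_patterns sysfs entry.

```shell
#!/bin/sh
# Sketch only: PATTERNS points at a throw-away file here; on the device it
# would be /sys/devices/virtual/misc/touch_gestures/gesture_patterns
PATTERNS="${PATTERNS:-/tmp/gesture_patterns}"

# Gesture 1, finger 1: start anywhere in the left strip, end anywhere in
# the right strip (any Y, i.e. the full height of an S2 screen).
echo "1:1:(0|150,0|799)
1:1:(330|480,0|799)" > "$PATTERNS"

cat "$PATTERNS"
```

Keep in mind that on the device this single write would wipe any gestures defined before it, so all gestures belong in one write.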
Writing to "gesture_patterns" erases all previous definitions and replaces them with what you write.
Some examples that can be used in practice (or define your own gestures):
1. swipe one finger near the top and another near the bottom from left to right
Code:
+----+-----------+----+
| | | |
| +-|-----------|-> |
| | | |
+----+ +----+
| |
| |
| |
| |
| |
+----+ +----+
| | | |
| +-|-----------|-> |
| | | |
+----+-----------+----+
Definition (bound to gesture 1; uses fingers 1 and 2):
Code:
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
2. swipe 3 fingers from near the top to near the bottom
Code:
+---------------------+
| |
| + + + |
| | | | |
+---------------------+
| | | | |
| | | | |
| | | | |
| | | | |
| | | | |
+---------------------+
| | | | |
| v v v |
| |
+---------------------+
Definition (bound to gesture 2; uses fingers 1, 2 and 3):
Code:
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
3. draw a Z with one finger while another is pressed on the middle left of the screen
Code:
+----+-----------+----+
| | | |
| +--|-----------|-> |
+----+ +----+
| +--+ |
+----+ | |
| | +--+ |
| + | | |
| | +--+ |
+----+ | |
| +--+ |
+----+-+ +----+
| <-| | |
| +-|-----------|-> |
+----+-----------+----+
Definition (bound to gesture 3; uses fingers 1 and 2):
Code:
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
(Notice that I mixed the way the lines are written - gesture 2 packs two hotspots per line - to show how you can organize the entries.)
To wrap it all up, you can use the following in an init.d script - as the definitions aren't persisted across reboots - to define all these gestures whenever the device starts:
Code:
echo "
# Gesture 1 - swipe 1 finger near the top and one near the bottom from left to right
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
# Gesture 2 - swipe 3 fingers from near the top to near the bottom
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
# Gesture 3 - draw a Z with one finger while another is pressed on the middle left
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
There are 2 important things to keep in mind when defining gestures:
* The touches are still delivered to whatever applications are active. If a certain gesture proves to cause nuisance with the actual apps, change it to something different or use it only in certain situations;
* Whenever you're pressing or moving 2 fingers close together, at some point the screen will start detecting only one of them. For some of the gesture definitions this might cause the detection to fail or only work very rarely. Make sure to use the "Show pointer location" option in Settings / Developer in order to be able to track what the device detects, while you're setting things up the way you want.
Triggering actions
Defining gestures does nothing by itself. You now need to read the /sys/devices/virtual/misc/touch_gestures/wait_for_gesture entry to see which gesture was detected, and react however you want.
Here's an example, also to be run from an init.d script:
Code:
( while [ 1 ]
do
    GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
    if [ "$GESTURE" -eq "1" ]; then
        mdnie_status=`cat /sys/class/mdnie/mdnie/negative | head -n 1`
        if [ "$mdnie_status" -eq "0" ]; then
            echo 1 > /sys/class/mdnie/mdnie/negative
        else
            echo 0 > /sys/class/mdnie/mdnie/negative
        fi
    elif [ "$GESTURE" -eq "2" ]; then
        # Start the camera app
        am start --activity-exclude-from-recents com.sec.android.app.camera
    elif [ "$GESTURE" -eq "3" ]; then
        # Edit and uncomment the next line to automatically start a call to the target number
        ### EDIT ### service call phone 2 s16 "133"
        : # no-op so the branch is never empty
    fi
done ) > /dev/null 2>&1 &
What this will do is:
- for the 1st gesture, toggle mDNIe between inverted and normal
- for the 2nd gesture, launch the Camera app no matter what app is active (quick, that chick is almost out of view! )
- for the 3rd gesture - after you edit and uncomment the appropriate line - place a call to that number (the wife is impatient, I don't even have time to enter my PIN!!! )
It loops forever, waiting for the next detected gesture and triggering the appropriate action.
NOTE - this has been edited to no longer cause hangs on CM10 startup. The problem was with comments inside the script that contained chars like ' ( ) etc.; be careful when changing the script not to introduce these problems.
Reading from "wait_for_gesture" blocks until a gesture is detected, so the infinite loop consumes no CPU and doesn't prevent deep sleep.
On rare occasions (e.g. multiple scripts waiting for gestures - they can all be woken at the same time, but only one of them will get each gesture) the script can wake up with a value of 0, which should simply be ignored.
If no script is reading "wait_for_gesture", multiple gestures can be detected and buffered (at most one instance of each) and will be delivered as soon as something starts reading the entry.
Doing an "echo reset > ..../wait_for_gesture" will flush that buffer so no pending gestures are reported, only future ones.
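To experiment with the dispatch logic off-device, the same read-then-react pattern can be sketched for a single iteration against a stand-in file (GESTURE_SRC and LOG are hypothetical paths; on the phone the read would be the blocking cat on wait_for_gesture):

```shell
#!/bin/sh
# Stand-in for /sys/devices/virtual/misc/touch_gestures/wait_for_gesture;
# on the device, reading that entry blocks until a gesture fires.
GESTURE_SRC="${GESTURE_SRC:-/tmp/fake_gesture}"
LOG="${LOG:-/tmp/gesture_actions.log}"

printf '2\n' > "$GESTURE_SRC"   # pretend gesture 2 was just detected
: > "$LOG"                      # start with an empty action log

GESTURE=$(cat "$GESTURE_SRC")
case "$GESTURE" in
  0) ;;                                # spurious wake-up - just ignore it
  1) echo "toggle mDNIe" >> "$LOG" ;;
  2) echo "launch camera" >> "$LOG" ;;
  3) echo "dial number" >> "$LOG" ;;
esac

cat "$LOG"   # prints: launch camera
```

On the device you would wrap the read and the dispatch in an infinite loop, exactly as the init.d script earlier does with if/elif; a case statement is an equivalent way to organize the branches once the number of gestures grows.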
Sample script
The attached file is a CWM-installable package that contains a sample script with all this and more.
It contains both the definitions of the gestures below and the actions to be performed for each of them.
Remember to edit and uncomment the line with the intended phone number, otherwise nothing will happen when you draw the Z on the screen.
Just flash it on your primary or your secondary ROM and you're good to go, with the behavior described below.
Gestures:
1. one finger on the top left, another on the bottom left; swipe both horizontally to the right edge
triggered action - invert mDNIe
2. swipe 3 fingers from the top of the screen to the bottom
triggered action - launch the camera app
(currently recognizes the apps from stock Sammy 4.0.*, AOKP 4.0.4 and JellyBean / CM10)
3. press one finger on the middle left of the screen; with another finger draw a Z starting on the top left edge
triggered action - immediately dial a number predefined in the script (you must edit the script to put in the number you want, or it won't do anything as it is)
WARNING: This has a nice bonus but you need to be aware of it - it will work even on a locked screen. Anyone that knows the gesture will be able to dial that destination even without knowing your PIN or Unlock Pattern. They won't however be able to press any of the other phone buttons like Contacts, etc.
4. hold one finger on the bottom right while another goes from top-left to the middle of the screen and back
triggered action - toggle Bluetooth on/off (will also vibrate for 100ms to provide feedback)
5. hold one finger on the bottom left while another goes from top-right to the middle of the screen and back
triggered action - toggle WiFi on/off (will also vibrate for 100ms to provide feedback)
6. hold one finger on the top left and another on the bottom left, move both to the middle right
triggered action - Media play / pause
7. draw an X on the screen - top-left, bottom-right, top-right, bottom-left - while holding another finger on the middle left
triggered action - Power button (to spare the physical button)
8. swipe one finger from the bottom left to the bottom right, then again bottom left (5 times)
triggered action - Home button (to spare the physical button)
9. hold one finger on the bottom left and with another swipe from the top right to top left and back to top right
triggered action - Toggle between the last 2 activities, excluding the TW Launcher (edit the script if you use another launcher)
10. hold one finger on the middle left and with another swipe top-right, bottom-right, top-right (3 times)
triggered action - force closes the current activity
11. press 3 fingers in the positions: top-left, top-right, bottom-left
triggered action - temporarily disables finger detection by the apps (or re-enables) so you can then swipe other gestures without causing effects in the apps
All other gestures automatically re-enable detection after it has been disabled by this gesture.
These gestures and actions are already an evolution over the original sample I shared, as a result of people posting their suggestions and ideas on the thread.
It's your turn now - think of what is useful to you and make sure to share it with others.