It all began when I had the task of undoing a user action in the app when the device was shaken. The main problem was how to know that a shake had occurred. After a couple of minutes of searching, it became clear that one should subscribe to accelerometer events and then somehow try to detect shakes. Of course, there were some ready-made solutions for that. They were all quite similar, but none of them suited me, so I wrote my own implementation: a class that subscribed to sensor events and changed its state with every event. After that, my colleagues and I fine-tuned the solution to avoid false positives, but as a result it began to look like something from a “Mad Max” movie. I promised myself that I would rewrite this mess when I had free time.
Recently I was reading articles about RxJava and remembered that task. Hmm, I thought, RxJava looks like a perfect tool for such a problem. Without thinking twice, I wrote a solution using RxJava, and I was impressed by the result: the whole logic took only 8 (eight) lines of code! I decided to share my experience with other developers, and that’s how this article was born.
I hope this simple example will help you decide whether to use RxJava in your projects. I will first explain how to set up an Android project with RxJava and then go through the development of a sample application step by step, explaining all the operators used. I assume that readers have some experience with Android development itself, so the focus will be on using reactive programming.
The source code of the finished application is available on GitHub.
Adding RxJava dependency
To use RxJava, we should add these lines to the build.gradle:
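For the RxJava 1.x line this article is built on, the dependency block looks something like the following (the exact version numbers are illustrative; use the latest available ones):

```groovy
dependencies {
    compile 'io.reactivex:rxjava:1.1.6'
    compile 'io.reactivex:rxandroid:1.2.1'
}
```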
N.B: rxAndroid provides a Scheduler, which is bound to the UI thread.
Adding Lambdas support
RxJava works best when paired with lambdas; without them, there is a lot of boilerplate code. At the moment there are two ways of adding lambda support: the Jack compiler from the Android N Developer Preview or the Retrolambda library. In both cases, we should first check that JDK 8 is installed. I used Retrolambda in this example.
Android N Developer Preview
To use the Jack compiler from Android N Developer Preview, we can follow these instructions.
Add these lines to build.gradle:
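The Jack configuration from the Android N Developer Preview documentation looks like this (shown here in condensed form):

```groovy
android {
    defaultConfig {
        jackOptions {
            enabled true
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```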
To add the Retrolambda library to the project, follow the instructions by Evan Tatarka at https://github.com/evant/gradle-retrolambda.
N.B: Please note that the original instructions recommend the Maven Central repository. You probably already have the JCenter repo in your project, since Android Studio adds it by default when a project is created. JCenter already contains all the required dependencies, so there is no need to add Maven Central.
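Following those instructions, the setup boils down to something like this (the plugin version is illustrative):

```groovy
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'me.tatarka:gradle-retrolambda:3.3.0'
    }
}

apply plugin: 'me.tatarka.retrolambda'

android {
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```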
Now that we have all the tools, we can start development.
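The first building block is wrapping the SensorManager callback API into an Observable. A minimal sketch might look as follows (the class and method names here are my own; the key idea is to register a listener on subscription and unregister it when the subscription is released):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import rx.Observable;
import rx.subscriptions.Subscriptions;

public class SensorEventObservableFactory {

    public static Observable<SensorEvent> createSensorEventObservable(
            Sensor sensor, SensorManager sensorManager) {
        return Observable.create(subscriber -> {
            SensorEventListener listener = new SensorEventListener() {
                @Override
                public void onSensorChanged(SensorEvent event) {
                    // Forward every sensor event into the stream
                    if (!subscriber.isUnsubscribed()) {
                        subscriber.onNext(event);
                    }
                }

                @Override
                public void onAccuracyChanged(Sensor s, int accuracy) {
                    // Not needed for this task
                }
            };

            sensorManager.registerListener(listener, sensor,
                    SensorManager.SENSOR_DELAY_GAME);

            // Unregister from the sensor when the subscription is released
            subscriber.add(Subscriptions.create(
                    () -> sensorManager.unregisterListener(listener)));
        });
    }
}
```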
Now we have a tool to transform events emitted by any sensor into an Observable. But which sensor fits our task best? In the screenshot below, the first plot shows values from the gravity sensor (TYPE_GRAVITY), the second from the accelerometer (TYPE_ACCELEROMETER), and the third from the linear acceleration sensor (TYPE_LINEAR_ACCELERATION).
As you can see, the device was rotated smoothly and then shaken.
We are interested in events emitted by the sensor with type Sensor.TYPE_LINEAR_ACCELERATION. They contain acceleration values with Earth gravity already subtracted.
Now that we have an Observable with acceleration events, we can use all the power of RxJava operators.
Let’s check what “raw” values look like:
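A simple way to inspect the stream is to subscribe and dump every event to logcat. A sketch, assuming the sensor-wrapping helper described above and a TAG constant defined in the class:

```java
SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);

SensorEventObservableFactory
        .createSensorEventObservable(
                sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION),
                sensorManager)
        .subscribe(event -> Log.d(TAG, String.format(
                "%d: x=%+.2f y=%+.2f z=%+.2f",
                event.timestamp, event.values[0], event.values[1], event.values[2])));
```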
This will produce output:
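The exact values depend on the device and how it is moved; illustrative output looks something like this (note the roughly 20 ms spacing between the nanosecond timestamps):

```
27809982844040: x=+0.03 y=-0.11 z=+0.02
27810002844040: x=-0.01 y=-0.08 z=+0.05
27810022844040: x=+0.12 y=-0.02 z=-0.03
```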
As you can see, we have an event emitted by the sensor every 20ms. This frequency corresponds to the SensorManager.SENSOR_DELAY_GAME value passed as a samplingPeriodUs parameter when SensorEventListener was registered.
As a payload, we have acceleration values for all three axes but we’ll only use the X-axis projection values. They correspond to the gesture we want to detect. Some solutions use values from all three axes, so they trigger when the device is put on the table, for example (there is a significant acceleration for the Z axis when the device meets the table surface).
Let’s create a data class with only the necessary fields:
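A minimal immutable holder with just the timestamp and the X-axis value is enough (in the app it can be a private static nested class of the Activity):

```java
// Holds only what we need: when the event happened and the X-axis acceleration
class XEvent {
    public final long timestamp; // sensor timestamp, nanoseconds
    public final float x;        // acceleration along the X axis

    public XEvent(long timestamp, float x) {
        this.timestamp = timestamp;
        this.x = x;
    }
}
```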
Convert SensorEvent into XEvent and filter events with an acceleration absolute value exceeding some threshold:
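A sketch of the map + filter pair, assuming the sensor Observable from the previous step; the THRESHOLD value here is a tuning constant of my own choosing, not a magic number from any API:

```java
private static final float THRESHOLD = 13f; // m/s^2; a tuning value, adjust to taste

sensorEventObservable
        .map(sensorEvent -> new XEvent(sensorEvent.timestamp, sensorEvent.values[0]))
        .filter(xEvent -> Math.abs(xEvent.x) > THRESHOLD)
        .subscribe(xEvent -> Log.d(TAG, "x = " + xEvent.x));
```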
Now, to see any messages in the log, it’s time to shake the device for the first time.
It’s really funny to see someone debugging shake detection: they are constantly shaking their phone. You can only imagine what comes to my mind.
We only have events with significant acceleration values for the X axis in the log.
Now the most interesting part begins. We need to track the moments when acceleration changes to the opposite direction. Let’s try to understand when this happens. Imagine that a hand with a phone is being accelerated to the left; the acceleration projection on the X axis has a negative sign. Then the hand begins to slow its motion and stops; now the acceleration projection on the X axis has a positive sign. So one shake corresponds to one sign change of the acceleration projection. Let’s form a so-called “sliding window”: it is just a buffer that contains two values, the current one and the previous one:
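In RxJava, buffer(count, skip) with skip smaller than count produces exactly such overlapping windows. A sketch, continuing from the filtered XEvent stream of the previous step:

```java
xEventObservable
        .buffer(2, 1) // sliding window: [e1, e2], [e2, e3], [e3, e4], ...
        .subscribe(buffer -> Log.d(TAG,
                "x: " + buffer.get(0).x + " -> " + buffer.get(1).x));
```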
And here’s our log:
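Again, the numbers below are illustrative; the point is that each log line now shows a pair of consecutive values:

```
x: -14.23 -> -15.64
x: -15.64 -> 16.01
x: 16.01 -> 14.89
```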
Excellent! As we can see, each event is now grouped with the previous one, so we can easily keep only the pairs of events whose values have different signs.
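Since both values in a pair already exceed the threshold, neither can be zero, so a negative product means the sign has changed. A sketch:

```java
xEventObservable
        .buffer(2, 1)
        // a negative product means the acceleration changed direction
        .filter(buffer -> buffer.get(0).x * buffer.get(1).x < 0)
        .subscribe(buffer -> Log.d(TAG, "Shake!"));
```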
Now every event corresponds to one shake. Only four operators are used, and we can already detect rapid moves! But false triggering is still possible. Say the user was not shaking the device intentionally, but just took it in the other hand. There is a simple way to avoid that: require the user to shake the device several times within a short period. Let’s introduce two parameters: SHAKES_COUNT, the number of shakes, and SHAKES_PERIOD, the amount of time all the shakes must fit into. I found that the optimal values are 3 shakes within 1 second; with other values, either false triggering is possible or the user has to shake the device too hard.
So we want to detect the case when 3 shakes have been done within 1 second. Now we don’t need the values of acceleration, only the timestamp of each event is important. Let’s transform our buffered XEvents into timestamps of the last event in the buffer:
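Continuing the chain from the previous step (a fragment; buffer.get(1) is the more recent event of the pair):

```java
        .map(buffer -> buffer.get(1).timestamp / 1_000_000_000f) // ns -> s
```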
The timestamp values in SensorEvent are in nanoseconds (really, really precise!), so I divide the value by 10^9 to get seconds. Now let’s apply the familiar sliding-window trick again, but this time with different parameters:
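This time the window size is SHAKES_COUNT instead of 2, so each emitted list holds the timestamps of the last three shakes:

```java
private static final int SHAKES_COUNT = 3;

// ...continuing the chain:
        .buffer(SHAKES_COUNT, 1)
```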
In other words, for each event we’ll have an array containing that event along with two previous events. And, finally, we’ll filter only arrays that fit into 1 second:
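The timestamps in the buffer arrive in order, so it is enough to compare the newest one with the oldest one:

```java
private static final int SHAKES_PERIOD = 1; // seconds

// ...continuing the chain:
        .filter(timestamps ->
                timestamps.get(SHAKES_COUNT - 1) - timestamps.get(0) < SHAKES_PERIOD)
```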
If an event has passed the last filter, we know the user has shaken the device 3 times within 1 second. But let’s assume our dear user is overenthusiastic and diligently continues to shake it. We will then receive an event on every subsequent shake, but we want to trigger only once per 3 shakes. A simple solution is to ignore events for SHAKES_PERIOD after a gesture has been detected:
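RxJava has an operator for exactly this: throttleFirst emits the first item and then drops everything for the given duration.

```java
        .throttleFirst(SHAKES_PERIOD, TimeUnit.SECONDS)
```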
It’s done! This Observable can now be used in our app. Here is the final code snippet:
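Putting all the steps together (the helper and constant names are the illustrative ones used throughout this article):

```java
public static Observable<?> createShakeObservable(SensorManager sensorManager) {
    return SensorEventObservableFactory
            .createSensorEventObservable(
                    sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION),
                    sensorManager)
            .map(sensorEvent -> new XEvent(sensorEvent.timestamp, sensorEvent.values[0]))
            .filter(xEvent -> Math.abs(xEvent.x) > THRESHOLD)
            .buffer(2, 1)
            .filter(buffer -> buffer.get(0).x * buffer.get(1).x < 0)
            .map(buffer -> buffer.get(1).timestamp / 1_000_000_000f)
            .buffer(SHAKES_COUNT, 1)
            .filter(timestamps ->
                    timestamps.get(SHAKES_COUNT - 1) - timestamps.get(0) < SHAKES_PERIOD)
            .throttleFirst(SHAKES_PERIOD, TimeUnit.SECONDS);
}
```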
In my example I play a sound when a shake gesture is detected. Let’s add a field in the Activity class:
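The field names below are illustrative; using MediaPlayer for the sound is one option among several:

```java
private Observable<?> shakeObservable;
private Subscription shakeSubscription;
private MediaPlayer mediaPlayer;
```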
Initialise it in the onCreate method:
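A sketch, assuming the createShakeObservable helper described above; R.raw.shake_sound is a hypothetical sound resource:

```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    SensorManager sensorManager =
            (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    shakeObservable = createShakeObservable(sensorManager);
    mediaPlayer = MediaPlayer.create(this, R.raw.shake_sound); // hypothetical resource
}
```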
Subscribe in the onResume method:
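Subscribing is where the sensor listener actually gets registered; here every detected gesture plays the sound (field names as assumed above):

```java
@Override
protected void onResume() {
    super.onResume();
    shakeSubscription = shakeObservable.subscribe(event -> mediaPlayer.start());
}
```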
And don’t forget to unsubscribe in onPause:
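Unsubscribing releases the chain and, thanks to the cleanup action in the factory, unregisters the sensor listener:

```java
@Override
protected void onPause() {
    super.onPause();
    shakeSubscription.unsubscribe();
}
```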
As you can see, we were able to create a solution that detects a given shake gesture in just a few lines of code. It is compact and easy to read and understand. You can compare it with conventional solutions, e.g. seismic by Jake Wharton. RxJava is a great tool, and when applied properly it produces great results. I hope this article gives you the impulse to learn RxJava and use reactive principles in your projects.
May stackoverflow.com be with you!
Arkady Gamza, Android developer.