Every year or so, Freescale runs a "Technical Enrichment Matrix" (TEM) series of local events. It's essentially an adult science fair where the engineering community within our various facilities gets together to share ideas.  Participating engineers put together posters and demos designed to teach their colleagues a bit about what they've been working on for the past year.  Last week we held our 2013 TEM in Tempe, Arizona, and we were lucky to have our CEO Gregg Lowe as well as CTO Ken Hansen on hand.  It's a fun event, and I look forward to it every year.


This time around, my team and I got the opportunity to show off some of the sensor fusion work we've been doing.  We're using Android phones and tablets to communicate with a development board over a Bluetooth link. The fusion is done on the development board; we use Android as a visualization tool.  The embedded software and boards are not (yet) available outside Freescale, but you CAN download Version 1.0 of the Android app today.


"Why would I bother?" is an obvious question. If you are a sensor fusion expert, you might just want to see what Freescale is up to.  But if you are interested in the topic, and still learning, then this will be (I hope) a good educational resource for you.

device-2013-02-25-140930.png

We call this application the "Xtrinsic Sensor Fusion Toolbox".  It's a free download from Google Play.  Just type "sensor fusion" in the search field and it should pop right up.  Or if you are viewing this posting from your phone, simply click here.  Or if you like QR codes, use this one:

AndroidDemoQrCode.png


The tool was designed to allow us to benchmark our sensor fusion results against other solutions already on the market, so it can also exercise the fusion options already present on your Android phone or tablet.  The app uses standard Android interfaces to access sensor and orientation data from the Android framework.  Although optimized for tablets, it should run fine on any phone running Android 3.0 or above.  On smaller-screen devices, some GUI components will extend offscreen.  Extensive use of Android ScrollViews means you can simply slide the appropriate toolbar or display one way or the other with your finger to reach the "offscreen" components.
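
For the curious, here's a minimal sketch (my own illustration in Java, not code taken from the app) of what "using standard Android interfaces" looks like: you register a SensorEventListener with the SensorManager for whichever sensors your device maker included. The class and method names are hypothetical.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class SensorReaderActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Listen to whatever accelerometer and magnetometer the device provides.
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this); // stop sampling when not visible
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds x, y, z in m/s^2 (accel) or microtesla (mag)
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```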

At this point, I'm going to assume you're intrigued enough to have downloaded the app from Google Play.  Open the app and tap on the "Source/Algorithm" button, and you will be presented with a number of options.


source-device-2013-04-04-162647.png

Try "Local accel" first.  You will immediately notice that the image of the PCB on the screen begins to respond to your movements. But there are limitations. Put your device on a tabletop and spin it.  You will see that the image does not change at all.  This is because the app is computing orientation with just an accelerometer, which we are using to measure gravity.  Spinning the device as described just rotates it about the very vector it is trying to measure, which doesn't change the measured value.  So you cannot see changes in "yaw".  But pick up your device and tip it right to left (roll) or bottom to top (pitch) and you will see the image of the PCB adjust itself so that it tries to remain stationary in space, regardless of how you hold your device. This inability to measure yaw is not an error in the algorithm; it is a limitation imposed by the use of a single sensor to compute orientation.
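
If you want to see the arithmetic behind this, here's a small illustrative snippet (my own simplification, not the app's algorithm) that computes roll and pitch from a single accelerometer reading. Notice that yaw never appears: rotating about the gravity vector leaves ax, ay, and az untouched.

```java
// Hypothetical tilt computation from one accelerometer sample (gravity only).
public final class Tilt {
    /** Roll (rotation about the device X axis), in degrees. */
    public static double rollDegrees(float ay, float az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }

    /** Pitch (rotation about the device Y axis), in degrees. */
    public static double pitchDegrees(float ax, float ay, float az) {
        return Math.toDegrees(Math.atan2(-ax, Math.sqrt(ay * ay + az * az)));
    }
}
```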

device2.gif


Now try the "Local mag/accel" option.  With the addition of a magnetometer, you have everything needed to create an electronic compass.  Now the app knows which way magnetic north is, and it will attempt to fix the PCB in space.  It's tough to describe, but if you download and try the app, you'll see what I mean.
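
For reference, the stock Android framework exposes the same accelerometer-plus-magnetometer computation through two helper calls. The sketch below shows that standard path; the toolbox uses Freescale's own fusion math, so treat this only as an approximation of the idea.

```java
import android.hardware.SensorManager;

final class ECompass {
    /** Returns {yaw, pitch, roll} in degrees, or zeros if the readings are unusable.
     *  accel and mag are the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD values. */
    static float[] angles(float[] accel, float[] mag) {
        float[] rotation = new float[9];
        float[] result = new float[3];
        if (SensorManager.getRotationMatrix(rotation, null, accel, mag)) {
            SensorManager.getOrientation(rotation, result);
            for (int i = 0; i < 3; i++) {
                result[i] = (float) Math.toDegrees(result[i]); // yaw (from magnetic north), pitch, roll
            }
        }
        return result;
    }
}
```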

Magnetic sensors tend to be a bit noisy, and you may notice this option has a bit of jitter.  If necessary, scroll the Fusion Settings Bar (where you see the "Exit" and "Source" buttons) to the left, and you'll find a checkbox labeled "LPF Enable".  This lets you apply a low-pass filter to sensor readings before they are used to compute device orientation.  You can change the filter coefficient with the scroll bar that appears when you check the box.  But notice that adding filtering also makes the display respond more slowly to changes in orientation.
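
If you're curious what such a filter might look like, here is a first-order exponential low-pass filter sketched in Java. The filter actually used inside the app may differ; the coefficient alpha here simply plays the role of the value you set with the scroll bar (smaller means smoother but slower to respond).

```java
// Assumed filter form, not the app's actual implementation.
public final class LowPassFilter {
    private final float alpha; // 0 < alpha <= 1
    private float[] state;

    public LowPassFilter(float alpha) {
        this.alpha = alpha;
    }

    /** Smooths one multi-axis sensor sample (e.g. {x, y, z}). */
    public float[] filter(float[] input) {
        if (state == null) {
            state = input.clone(); // seed with the first sample
        }
        for (int i = 0; i < input.length; i++) {
            state[i] += alpha * (input[i] - state[i]);
        }
        return state.clone();
    }
}
```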


By now, you should start to see the point of this application: it lets you explore options for sensor fusion.  And bear in mind that so far you are playing with options already in your Android device.  We're using standard Android routines and whatever sensors your device manufacturer decided to include.

Up to now, we've been exploring the "Device View".  Other options you should try include the "Panorama View", which you get to using the navigation button (shown below).

navigation.png

The Panorama View (shown below) places you in the center of a virtual room.  As you rotate and move your device, your view into that room changes to match your movements.  If your device includes a gyroscope, I would set the "Source/Algorithm" control to "Local 9-axis".  This will give you the smoothest control.

room1.png

I personally find the "Device View" more useful for visualizing fusion results. We included the "Panorama View" as an example of just how easy it is to use orientation results in a Virtual Reality application (see Orientation Representations: Part 2).
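
To give you a feel for how little code such a view requires, here is a hedged sketch that takes Android's own 9-axis fusion output (TYPE_ROTATION_VECTOR, which blends gyro, accelerometer, and magnetometer when a gyroscope is present) and turns it into a view matrix for a virtual camera. It illustrates the idea only; it is not the toolbox's rendering code, and the class name is hypothetical.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.opengl.Matrix;

class PanoramaOrientation implements SensorEventListener {
    // Whatever your GL renderer uses as its view transform.
    final float[] viewMatrix = new float[16];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            float[] deviceRotation = new float[16];
            SensorManager.getRotationMatrixFromVector(deviceRotation, event.values);
            // For a pure rotation the transpose is the inverse: the virtual room
            // stays fixed in space while the device (the camera) moves inside it.
            Matrix.transposeM(viewMatrix, 0, deviceRotation, 0);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```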


In June 2011, I presented Online data sets for inertial and magnetic sensors (part 1).  Regular readers may have noticed there was never a second part. This Android app addresses that deficiency. You can use it to capture your own data sets. Go back to the Navigation control and select "Log Window".

log.png


Assuming you've set the "Source/Algorithm" control to one of the three "Local" options, you should see a continuous display of sensor values scrolling down the screen.  If you click the "File logging enable" checkbox on the Fusion Settings Bar, these will be captured to an output file that you can then email to yourself using the "SHARE" option on the Android Action Bar (you must have an email client installed on your device for this to work).
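
If you'd like to do something similar in your own app, the sketch below shows one plausible way to append sensor rows to a CSV file and hand it to an email client through Android's standard share intent. The file name and column layout are my own assumptions, not the toolbox's actual log format.

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class LogExample extends Activity {

    /** Appends one row of (hypothetical) sensor data to a CSV log. */
    void appendLogRow(long timestampMs, float ax, float ay, float az) {
        File logFile = new File(getExternalFilesDir(null), "fusion_log.csv");
        try (FileWriter writer = new FileWriter(logFile, true)) {
            writer.write(timestampMs + "," + ax + "," + ay + "," + az + "\n");
        } catch (IOException e) {
            // logging is best-effort; a real app would surface the error
        }
    }

    /** Hands the log file to an installed email (or other sharing) app. */
    void shareLog() {
        File logFile = new File(getExternalFilesDir(null), "fusion_log.csv");
        Intent share = new Intent(Intent.ACTION_SEND);
        share.setType("text/csv");
        share.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(logFile));
        startActivity(Intent.createChooser(share, "Send sensor log"));
    }
}
```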


There are a lot more options available in the tool.  All are documented in the "Documentation" view, again enabled via the NAV button.  In addition to tool features, the documentation also goes into some detail explaining the basics of sensor fusion and how some of the results are calculated, incorporating several of our previous blog postings on the subject.

documentationScreen.png


Please send me feedback and suggestions for improvement.  Although I can't promise that all suggestions will be incorporated in future versions, they will be considered.  I authored this application, and have a vested interest in making it useful.