Great post from the Blind Motion project that pushes all the right buttons for me: machine learning, autonomous driving, and real-time signal processing. Definitely worth a read.
As a bonus, the dashboard used in the article is available as open source for your own projects.
Perhaps some of you also compare video against sensor data in your projects, or simply inspect the sensor data on its own. If so, the dashboard could be useful: it is easy to use, lets you scale and transform data on the fly, and carries an open license, so you can freely use and modify it.
The author also does some interesting normalization of the gyroscope and accelerometer data: regardless of how the sensor (the phone) is oriented, the library normalizes the readings so that the x-axis always points along the car's direction of travel and the z-axis is perpendicular to the ground. This is something we initially considered for our fitness app Yes Drill Sergeant!, but we would need much faster updates and recalibration, since sensor data from a running user is far more erratic than data from a moving car.
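To make the idea concrete, here is a minimal sketch of one common way to do this kind of orientation normalization (this is my own illustration, not the Blind Motion implementation): estimate the z-axis from the gravity component of a calibration window of accelerometer samples, take the dominant horizontal acceleration direction (braking/accelerating) as the x-axis, and build a rotation matrix from device coordinates into the car's frame. The function name and calibration approach are assumptions for the example.

```python
import numpy as np

def orientation_normalizer(accel_samples):
    """Estimate a rotation mapping device axes to car axes.

    Hypothetical sketch, not the Blind Motion library's API:
    - z-axis: gravity direction, estimated as the mean acceleration
      over a calibration window.
    - x-axis: principal horizontal acceleration direction after the
      gravity component is removed (sign is ambiguous in this sketch).
    """
    samples = np.asarray(accel_samples, dtype=float)

    # Gravity dominates the mean over a calibration window.
    g = samples.mean(axis=0)
    z = g / np.linalg.norm(g)

    # Remove the gravity component, leaving horizontal accelerations.
    horizontal = samples - np.outer(samples @ z, z)

    # Principal horizontal direction ~ the car's forward/backward axis.
    _, _, vt = np.linalg.svd(horizontal, full_matrices=False)
    x = vt[0]
    x = x - (x @ z) * z          # re-orthogonalize against z
    x = x / np.linalg.norm(x)

    y = np.cross(z, x)           # complete a right-handed frame

    # Rows are the car axes in device coordinates, so R @ reading
    # rotates a device-frame vector into the car's frame.
    return np.vstack([x, y, z])

# Usage: calibrate on a window of samples, then rotate each reading.
rng = np.random.default_rng(0)
calib = rng.normal([0.5, 0.1, 9.8], 0.05, size=(200, 3))
R = orientation_normalizer(calib)
reading = np.array([0.5, 0.1, 9.8])
car_frame = R @ reading          # z component ~ gravity magnitude
```

For a running user, as noted above, this one-shot calibration would not hold: the gravity estimate and the dominant acceleration direction change constantly, so the window would have to be short and the rotation continuously re-estimated.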