Detecting Pitch, Roll and Yaw

Taking this as a description of the three rotation axes: https://en.wikipedia.org/wiki/Aircraft_principal_axes

Picture an iPhone lying flat on its back, screen facing the ceiling, and imagine the top of the phone (the end where the camera is) as the ‘nose’…

I would argue that it is currently not possible to detect Yaw - rotation of the phone while it lies on its back without being lifted. Rotating about the vertical axis doesn’t change the direction of gravity relative to the phone, so an accelerometer alone has essentially nothing to measure; you would need a gyroscope or compass to see that motion.
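
To put a bit of code behind that, here is a rough Swift sketch of the usual tilt-from-gravity calculation (done natively, outside Corona Remote entirely; the exact sign conventions are my own assumption):

```swift
import Foundation

// Tilt from the gravity vector alone (device-frame components, in units of g).
// Flat on its back, the phone reads roughly (gx, gy, gz) = (0, 0, -1).
// The sign conventions here are an assumption; the point is that yaw never
// appears - spinning the flat phone about the vertical axis leaves the inputs
// unchanged, so an accelerometer simply cannot see that rotation.
func tilt(gx: Double, gy: Double, gz: Double) -> (pitch: Double, roll: Double) {
    let pitch = atan2(-gy, sqrt(gx * gx + gz * gz))  // nose up/down
    let roll  = atan2(gx, -gz)                       // left/right tilt
    return (pitch, roll)
}

// However the phone is spun while lying flat, the inputs stay (0, 0, -1),
// so pitch and roll stay at zero - which is exactly why the dot never moves.
print(tilt(gx: 0, gy: 0, gz: -1))   // (pitch: 0.0, roll: 0.0)
```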

I am currently testing this with Matt Pringle’s Corona Remote and using the various values returned to move a dot around the screen.

No matter what I do, there is no motion I can put the phone through while it is lying on its back that will move the dot an appreciable distance - and that is with the returned values multiplied by 100.

My ultimate goal here is to use the phone like a laser pointer, but this is currently not looking like an option. Any ideas?
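
To make the goal concrete, this is roughly the behaviour I’m after, sketched natively in Swift against CoreMotion’s gyroscope-backed attitude. It assumes a device that actually has a gyroscope, doesn’t involve Corona Remote at all, and the scaling is an arbitrary choice:

```swift
import CoreMotion
import UIKit

// A minimal "laser pointer" sketch: drive a dot from gyroscope-backed attitude
// rather than accelerometer values. The 0.5 rad (~30°) sweep is arbitrary.
class PointerViewController: UIViewController {
    private let motion = CMMotionManager()
    private let dot = UIView(frame: CGRect(x: 0, y: 0, width: 12, height: 12))

    override func viewDidLoad() {
        super.viewDidLoad()
        dot.backgroundColor = .red
        dot.layer.cornerRadius = 6
        view.addSubview(dot)

        guard motion.isDeviceMotionAvailable else { return } // needs a gyroscope
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let attitude = data?.attitude else { return }
            // Yaw sweeps the dot left/right, pitch sweeps it up/down.
            let cx = self.view.bounds.midX
            let cy = self.view.bounds.midY
            let x = cx - CGFloat(attitude.yaw) / 0.5 * cx
            let y = cy - CGFloat(attitude.pitch) / 0.5 * cy
            self.dot.center = CGPoint(x: x, y: y)
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        motion.stopDeviceMotionUpdates()
    }
}
```

Yaw and pitch are exactly the two angles a laser pointer needs, which is why accelerometer-only data keeps coming up short here.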

I was looking for the same functionality and, in creating a feature request, was pointed to this, but have yet to try it:

http://simon.fearby.com/blog/?p=2134

Let us know how you get on!

Hey, thanks!

There’s only one problem… That post doesn’t say which accelerometer data should be used: the gravity, instant or raw values. I suspect it’s the raw x, y and z, but that presents a further problem in that Corona Remote doesn’t actually return those values.
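
For what it’s worth, this is how the native CoreMotion side separates the three flavours. Whether Corona Remote’s naming maps onto these one-for-one is an assumption on my part, but it may help pin down which value the blog post means:

```swift
import CoreMotion

// The three flavours the post leaves ambiguous, as CoreMotion exposes them.
let motion = CMMotionManager()

// Raw accelerometer: gravity plus any user-induced acceleration, unseparated.
motion.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    print("raw:", a.x, a.y, a.z)
}

// Device motion splits that raw signal into a gravity estimate and the
// "instant" user acceleration on top of it (the separation needs a gyroscope).
motion.startDeviceMotionUpdates(to: .main) { data, _ in
    guard let d = data else { return }
    print("gravity:", d.gravity.x, d.gravity.y, d.gravity.z)
    print("user:   ", d.userAcceleration.x, d.userAcceleration.y, d.userAcceleration.z)
}
```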
