Hi There,
I'm trying to find out whether it is possible to detect the full area being covered on the tablet surface, e.g. a hand print.
I imagine it would start with multitouch. Can anyone help me?
Thanks,
Alan
I think this may be pretty hard to do. You would probably need multi-touch, which leads to the first problem: some devices support up to 10 touch points, while others that do support multi-touch handle as few as two. Secondly, each touch point only reports the center of the contact. You could probably do some math and see whether your points look like a hand, but you don't get pressure sensitivity, which would aid in spotting a hand, until the iPhone 6s, iPhone 6s Plus, and iPad Pro. Some Android devices also have a pressure value available to them, but it's not consistent across all devices.
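Rob's "do some math" idea can be sketched independently of any framework. The heuristic below is plain Python purely to illustrate the geometry (`hand_like` and its tolerance values are made-up for this sketch, not Corona API): it treats each contact in turn as a candidate palm and checks whether the remaining contacts fan out from it at a roughly constant radius, the way splayed fingertips do.

```python
import math

def hand_like(points, min_fingers=4, radius_tol=0.25):
    """Rough heuristic: a hand pressed flat tends to give one palm
    contact plus fingertips at roughly equal distances from it.
    `points` is a list of (x, y) touch centers, e.g. collected from
    multitouch events.  Returns True if any contact works as a
    'palm' with the rest sitting at a consistent radius."""
    if len(points) < min_fingers + 1:
        return False
    for i, (px, py) in enumerate(points):
        dists = [math.hypot(x - px, y - py)
                 for j, (x, y) in enumerate(points) if j != i]
        mean = sum(dists) / len(dists)
        if mean == 0:
            continue  # all contacts stacked on one spot
        # Accept if every radius stays within radius_tol of the mean.
        if max(abs(d - mean) for d in dists) / mean <= radius_tol:
            return True
    return False
```

In Corona you would first call `system.activate("multitouch")` and collect each event's `event.x`/`event.y` per `event.id` into such a list. A collinear row of touches fails the radius test, while five fingertips fanned around a palm contact pass it.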
Thanks Rob, that's very useful! Do you think Corona is the best solution for this? It is a college project, so I have the option of choosing one device and could focus, for example, on the iPad Pro.
I don't know of any framework that's going to be good for this. You may need raw data from the sensor, which would mean native-level access, and I don't know whether Apple will give you that or not. Google may be more willing to allow access to raw data.
Corona SDK most certainly cannot get you an outline of the hand print; at most it can give you up to 10 points where the hand is touching. From those you might be able to determine whether a hand is pressing on the screen and generate a skeleton from the points.
Rob
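The skeleton idea above can be made concrete with a minimal sketch (again plain Python for the geometry, not Corona code; `hand_skeleton` is an invented name for this example): pick as the palm the contact from which the other contacts are most equally distant, then emit one "bone" segment from the palm to each remaining contact.

```python
import math
import statistics

def hand_skeleton(points):
    """Pick the contact whose distances to the other contacts have the
    lowest relative spread (fingertips sit at a fairly constant radius
    from the palm) and connect it to every other contact."""
    if len(points) < 2:
        return (points[0] if points else None), []

    def spread(i):
        px, py = points[i]
        d = [math.hypot(x - px, y - py)
             for j, (x, y) in enumerate(points) if j != i]
        mean = sum(d) / len(d)
        if mean == 0:
            return float("inf")  # degenerate: all contacts coincide
        return statistics.pstdev(d) / mean

    palm_i = min(range(len(points)), key=spread)
    palm = points[palm_i]
    # Each "bone" is a (palm, fingertip) segment; in Corona you could
    # draw them with display.newLine.
    return palm, [(palm, p) for j, p in enumerate(points) if j != palm_i]
```

This is only a skeleton in the loosest sense: with 10 touch points at most, you get a star of palm-to-fingertip segments, not finger joints.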