Multitouch - methods for detecting the number of touches

Multitouch question: Is there an undocumented method for retrieving the total number of touches on screen?

I thought I’d ask the basic question before tossing a whole bunch of code around.

All the methods I’ve developed for counting the number of touches work 99% of the time on device, but I’ve found that insanely rapid touches (like, say, a kid pretending to play piano on the screen) can throw the count off, and then the game effectively fails.
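
For reference, the counters I’ve been trying all boil down to something like this, a minimal sketch assuming Corona-style multitouch events (touchCount and countTouches are just illustrative names):

[lua]
-- bare-bones phase-based touch counter (multitouch must be activated first)
system.activate( "multitouch" )

local touchCount = 0

local function countTouches( event )
    if event.phase == "began" then
        touchCount = touchCount + 1
    elseif event.phase == "ended" or event.phase == "cancelled" then
        touchCount = touchCount - 1
    end
    return false   -- never claim the touch; just observe it
end

Runtime:addEventListener( "touch", countTouches )
[/lua]

The weak spot is that the count only stays correct if every “began” eventually gets a matching “ended” or “cancelled”.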

Nothing is 100% when it involves external sensors… depending on frame rate, it is possible to miss a new touch if, say, 2 additional fingers touch down between frames.

I think that describes the problem perfectly.  Unfortunately, unless I come up with a workaround to get to 100%, I’ll have to drop my multitouch-only dreams and reinsert “mode” buttons.

I’m going for:

1 finger  = select

2 fingers = zoom in / zoom out (pinch)

3 fingers = rotate

It’s an elegant interface, if I do say so myself.  The alternative is to use 3 mode buttons to select the interaction type, which is an acceptable, if not ideal, solution.

As you may know, I use a 2-finger system.

1 = select, and 2 = pinch/zoom and/or pan.

Rotate I handle with a button, as rotating a 130x130 3D array and then redrawing a 2.5D environment is very processor-intensive!

You could do a 2-finger rotate if you worked out your starting vertex and ending vertex, which would give you the rotation angle?
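
One way to read that: take the angle of the line between the two touch points when the gesture begins and again as it moves, and use the difference. A quick sketch (the touch bookkeeping, i.e. startA/startB and curA/curB, is assumed to exist elsewhere):

[lua]
-- angle (in degrees) of the line from point p1 to point p2
local function angleBetween( p1, p2 )
    return math.deg( math.atan2( p2.y - p1.y, p2.x - p1.x ) )
end

-- startA/startB: the two touches when the gesture began (hypothetical variables)
-- curA/curB: the same two touches right now
local rotationDelta = angleBetween( curA, curB ) - angleBetween( startA, startB )
[/lua]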

Yes, I like your 1 = select, 2 = pinch/zoom .

On your map, the 1 finger is always selecting a parcel of land, so it has a “purpose”.  On my layout, there is a lot of empty real estate, so multiple touches register to the same object (an invisible sensor overlaid on top of everything else to process touches).  If the touch count = 1, the touch passes through to the layer below and an object is selected.  If there is more than one touch, the touches are processed by count: 2 touches = pinch/zoom, 3 touches = rotate, 4 touches = select all.  It works great until the initial object is “over-tapped”.
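
The dispatch inside that sensor’s listener is basically a count check, something like this sketch (the handler names are placeholders for my actual routines):

[lua]
-- sketch of the count-based dispatch in the invisible sensor's touch listener
local function dispatchByCount( touchCount, event )
    if touchCount == 1 then
        return false                 -- fall through so the layer below can select
    elseif touchCount == 2 then
        handlePinchZoom( event )     -- placeholder handler
    elseif touchCount == 3 then
        handleRotate( event )        -- placeholder handler
    else
        handleSelectAll( event )     -- 4 or more touches; placeholder handler
    end
    return true                      -- swallow multi-finger touches here
end
[/lua]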

Super-fast tapping with multiple fingers will eventually cause the touch counter to fail.  After looking at your set-up, I’m considering using a grid of sensors to detect touches.  I have a hunch that could be an improvement.  I’m also considering restricting the total number of touches allowed to 2 (which prevents over-tapping) but I would love to use 3 or 4 touches if I can work it out.
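
The grid idea would look roughly like this, as a sketch only (cell size and grid dimensions are made up, and onSensorTouch is a placeholder):

[lua]
-- hypothetical sensor grid: invisible rects that all share one touch listener
local cols, rows, cell = 8, 12, 40

local function onSensorTouch( event )
    if event.phase == "began" then
        print( "touch began on cell", event.target.col, event.target.row )
    end
    return true
end

for col = 1, cols do
    for row = 1, rows do
        local r = display.newRect( (col - 0.5) * cell, (row - 0.5) * cell, cell, cell )
        r.isVisible = false
        r.isHitTestable = true      -- invisible but still receives touches
        r.col, r.row = col, row
        r:addEventListener( "touch", onSensorTouch )
    end
end
[/lua]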

Ideally, the system could provide me with the number of touches and then my touch counter would be irrelevant and I wouldn’t have to worry about it crashing.

I might make this a requested feature, even though I’m about out of votes  :slight_smile:

I have a Runtime touch listener for map manipulation (this allows the map to be moved without requiring the map itself to be touched), then an object touch listener for the background tiles (all tiles share a single touch handler; I work out the grid x,y from the screen x,y), and finally a building touch handler (again shared by all buildings) to move the buildings around.
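
The screen-to-grid step is just an offset-and-floor, roughly like this sketch (tileSize and the map offsets are assumed values):

[lua]
-- sketch: convert a screen touch position to a grid cell (values are assumptions)
local tileSize = 32
local mapOffsetX, mapOffsetY = 0, 0   -- current scroll offset of the map

local function screenToGrid( screenX, screenY )
    local gridX = math.floor( (screenX - mapOffsetX) / tileSize ) + 1
    local gridY = math.floor( (screenY - mapOffsetY) / tileSize ) + 1
    return gridX, gridY
end
[/lua]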

Managing focus and event propagation is a bit tricky.
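
The pattern that keeps each finger bound to the object it started on is per-touch focus, roughly like this (a sketch; event.target here would be a tile or building):

[lua]
-- sketch: per-touch focus so each finger stays with the object it began on
local function objectTouch( event )
    local target = event.target
    local stage = display.getCurrentStage()

    if event.phase == "began" then
        stage:setFocus( target, event.id )   -- bind this touch id to this object
        target.isFocus = true
    elseif target.isFocus then
        if event.phase == "moved" then
            -- drag the tile/building here
        elseif event.phase == "ended" or event.phase == "cancelled" then
            stage:setFocus( target, nil )    -- release focus for this touch id
            target.isFocus = false
        end
    end
    return true                              -- stop the touch from propagating further
end
[/lua]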

I found a solution!  Tag every object associated with a touch or tap with the event.id of that touch or tap.

For instance, if I’m using dots to visually track the touches during development, I tag them with the event.id before adding them to a table:

[lua]
dot.id = tostring( event.id )  -- tag the dot object with this touch's event.id
dotList[#dotList+1] = dot      -- add the dot object to the table of dots
[/lua]

That way I can be sure that the event.id matches the object.id before deleting the object from a table.
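
In practice the cleanup is a reverse scan that compares ids before touching the table, something like this sketch (removeDotForEvent is just an illustrative name):

[lua]
-- sketch: remove the dot whose id matches this touch's event.id
local function removeDotForEvent( event )
    local id = tostring( event.id )
    for i = #dotList, 1, -1 do               -- scan backwards so removal is safe
        if dotList[i].id == id then
            dotList[i]:removeSelf()          -- remove the display object
            table.remove( dotList, i )       -- drop it from the tracking table
            break
        end
    end
end
[/lua]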

My multi-touch system was failing under extreme tap-rate pressure because simultaneous or nearly simultaneous taps would mess up the table indices as events were added and removed from the various touch and object trackers.

I now have one, two and three touches playing nicely again. 
