Z-Index touch or tap

Hi

I think I am going about this the wrong way. I have a main actor on screen which reacts to a tap or touch on the screen to animate it. However, I also have a music on/off button on screen, as well as a pause button.

The trouble I am facing is that if I click the pause button, this is detected as a click on the screen and the actor is animated. This was mainly an issue when I had a Runtime:addEventListener. So I thought I would add a specific rect which acted as a tap absorber on the main stage, and then overlay my specific buttons on top so that the latter would absorb the tap.

This, however, doesn’t seem to work, as both events fire; I assume it’s because they occupy the same x and y position. Is there a z-order to these things, or am I over-complicating the touch process? Is there some trick with id or name that would suppress the click if it’s not on the main absorbing rect?

Your help would be very much appreciated, as this one is consuming a lot of time and wasted code.

J [import]uid: 103970 topic_id: 21086 reply_id: 321086[/import]

Insert a line at the end of your pause button handler:

[lua]return true[/lua]

This won’t let the click go through the pause button; instead the button should “absorb” the click.
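
For instance, a pause button handler might look like this (a minimal sketch; the object and handler names are illustrative):

[lua]
local pauseButton = display.newRect( 260, 20, 40, 40 )

local function onPauseTouch( event )
    if event.phase == "ended" then
        -- toggle the pause state here
    end
    return true  -- stop the touch from propagating past the button
end

pauseButton:addEventListener( "touch", onPauseTouch )
[/lua]

One thing to watch: "touch" and "tap" are separate event streams in Corona. Returning true from a touch handler does not block tap events, so if the main actor is driven by a "tap" listener, the button needs a "tap" listener that returns true as well.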

Cheers,
Rodrigo. [import]uid: 89165 topic_id: 21086 reply_id: 83373[/import]

Thanks @RSCdev

I was optimistic about this, however both (the pause button and the tap absorber in the background) are still firing, even with [lua]return true[/lua] at the end of each handler. [import]uid: 103970 topic_id: 21086 reply_id: 83391[/import]

I’m curious to see how you’re handling this, as it shouldn’t really be complicated :slight_smile: [import]uid: 84637 topic_id: 21086 reply_id: 83423[/import]

I agree @Danny, it shouldn’t be complicated, but I may be over-engineering it.

First off, I’m using the storyboard API for different scenes. On each scene I have custom-font text objects to which I am applying event listeners for click events, etc. This is particularly relevant on the main screen, where I have two text event listeners: one for sound on/off and one for the pause function.

However, the main actor is partially controlled by single- or double-tapping the playing area, over which the two buttons described above are overlaid. Whether I make the main actor’s taps controlled by a specific newRect area or apply a Runtime listener to the whole screen, the result is the same: a click on the sound or pause button also controls the actor, which obviously isn’t what I want.

Additionally, touch actions on other scenes, when navigating back to the main game screen, also register an additional click, and the actor is controlled unexpectedly.

It’s as though the touch event is bubbling up through the hierarchy of touch listeners until it gets to the main scene and its parent, which obviously isn’t ideal.

Adding [lua]return true[/lua] or [lua]return false[/lua] doesn’t suppress this bubbling-up effect between the different touch/tap events.
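
One common cause of the stray taps after a scene change, assuming the legacy storyboard API described above, is a Runtime listener that is never removed when the scene exits, so listeners stack up with each visit. A sketch of cleaning up (handler and scene names are illustrative):

[lua]
local storyboard = require( "storyboard" )
local scene = storyboard.newScene()

local function onScreenTouch( event )
    if event.phase == "began" then
        -- animate the main actor here
    end
    return true
end

-- Add the Runtime listener only while this scene is on screen,
-- and remove it when the scene exits so it doesn't keep firing.
function scene:enterScene( event )
    Runtime:addEventListener( "touch", onScreenTouch )
end

function scene:exitScene( event )
    Runtime:removeEventListener( "touch", onScreenTouch )
end

scene:addEventListener( "enterScene", scene )
scene:addEventListener( "exitScene", scene )

return scene
[/lua]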

J [import]uid: 103970 topic_id: 21086 reply_id: 83492[/import]

sorry posted in wrong thread. [import]uid: 13099 topic_id: 21086 reply_id: 105319[/import]

If you’re adding a Runtime listener to detect touches, simply filter out the objects you don’t want reacting to touches.

E.g.

[lua]
local pauseButton = display.newRect( 100, 50, 20, 20 )
pauseButton.detectInRuntime = false

local player = display.newRect( 100, 150, 20, 20 )
player.detectInRuntime = true
player.name = "player"

local function detectTouches( event )
    local target = event.target

    -- Touches that miss every display object have no target, so guard against nil
    if target and target.detectInRuntime == true then
        -- Do whatever with the player
        if target.name == "player" then
            -- Do something
        end
    else
        -- If the target isn't something we're supposed to handle here, just return
        return
    end

    return true
end

Runtime:addEventListener( "touch", detectTouches )
[/lua]

There are several other ways to go about this. [import]uid: 84637 topic_id: 21086 reply_id: 106090[/import]
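
One such alternative, sketched here with illustrative names: skip the Runtime listener entirely and give each object its own listener. Corona delivers a hit event to the front-most object under the touch first, so a button created after (and therefore drawn above) a full-screen background rect receives the tap, and returning true keeps it from reaching the rect underneath:

[lua]
-- Full-screen rect that drives the actor; created first, so it sits behind
local touchArea = display.newRect( 0, 0, display.contentWidth, display.contentHeight )
touchArea:addEventListener( "tap", function( event )
    -- single/double tap control of the actor (event.numTaps distinguishes them)
    return true
end )

-- Pause button created after the rect, so it renders on top of it
local pauseButton = display.newRect( 280, 20, 40, 40 )
pauseButton:addEventListener( "tap", function( event )
    -- toggle pause here
    return true  -- absorb the tap so touchArea never sees it
end )
[/lua]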