Understanding touch event in multiple objects on the screen

Hi,

I’m trying to flick/drag an object on my screen to throw it. If I add a touch listener to the object (a sprite), this is what happens:

  1. The listener won’t receive the moved or ended phase if I move my finger quickly until it reaches outside of the object.
  2. It seems both the ended and moved phases only register if they happen on top of the object. Is this correct?
  3. Without the ended phase I can’t tell whether the object has been released from the flick/drag state or not.

What is the best way to flick each object on my screen individually?

My second problem is this:

  1. I have a touch listener that I use to drag the screen for scrolling. I added it as a runtime event listener.

  2. If I touch an object on my screen and move it to drag the object, the event propagates so the screen scrolls as well. I’ve been experimenting with different return values (true/false) to stop the event from being dispatched to all the objects on screen, but have failed to do so.

  3. I thought this was something I couldn’t do, but I noticed that the button widget doesn’t have this problem. If I touch a button, or even move while touching it, it doesn’t dispatch a touch event to the runtime event listener or to any other display objects.

Is there a way I can use whatever the button uses as an event handler, so that events aren’t dispatched to any other listener?

Thanks
 

Check out setFocus for both of these scenarios: http://docs.coronalabs.com/api/type/StageObject/setFocus.html
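For reference, here’s a minimal sketch of the drag-with-focus pattern the setFocus docs describe, assuming a Corona/Solar2D project (the `ball` object and image file are placeholders, not from the original post):

```lua
-- A draggable object; "ball.png" is a placeholder asset.
local ball = display.newImageRect( "ball.png", 64, 64 )
ball.x, ball.y = display.contentCenterX, display.contentCenterY

local function onBallTouch( event )
    local target = event.target
    if event.phase == "began" then
        -- Direct all subsequent touch events to this object, even if the
        -- finger moves outside its bounds (fixes the missing moved/ended
        -- phases from problem 1).
        display.getCurrentStage():setFocus( target, event.id )
        target.isFocus = true
        target.markX, target.markY = target.x, target.y
    elseif target.isFocus then
        if event.phase == "moved" then
            target.x = target.markX + ( event.x - event.xStart )
            target.y = target.markY + ( event.y - event.yStart )
        elseif event.phase == "ended" or event.phase == "cancelled" then
            -- Release focus so other objects receive touches again.
            display.getCurrentStage():setFocus( target, nil )
            target.isFocus = false
            -- The throw velocity could be computed here from the
            -- distance/time since "began".
        end
    end
    -- Returning true marks the event as handled, so it doesn't reach the
    -- Runtime scroll listener (problem 2) -- this is also what the button
    -- widget does internally.
    return true
end

ball:addEventListener( "touch", onBallTouch )
```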

Thanks!
This solves my problem completely.
