How to use a finger as a physics body

I was hoping someone could point me in the right direction.  I’ve tried different methods and searched Google for something similar without finding anything yet.  What I’d like to do is use my finger as a physics body: pushing and moving things around at will based on how fast or slowly my finger moves.

I’ve tried several methods and they don’t work well, like making an invisible physics body that follows my finger and interacts with things.  One trouble here is that when my finger moves too fast, my object just passes right through the other physics bodies.  It didn’t seem to matter whether I made the object dynamic or kinematic, or even static, which seemed to stop all the other physics objects in their tracks.  I was using the touch event to move my objects this way.

If someone knows of a tutorial or some code that could point me in the right direction that would be wonderful.

Thanks so much.

Try using the .isBullet = true property with the invisible object:

http://docs.coronalabs.com/api/type/Body/isBullet.html
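As a minimal sketch (the `fingerBody` name is just for illustration), enabling continuous collision detection on the invisible body looks like this:

```lua
local physics = require("physics")
physics.start()

-- Invisible body that will follow the finger.
local fingerBody = display.newCircle(100, 100, 20)
physics.addBody(fingerBody, "kinematic", { radius = 20 })
fingerBody.isVisible = false

-- Treat it as a fast-moving "bullet" so Box2D checks the full path
-- it travels each frame instead of only its start and end positions.
fingerBody.isBullet = true
```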

If that doesn’t work, check:

http://docs.coronalabs.com/api/library/physics/setPositionIterations.html

http://docs.coronalabs.com/api/library/physics/setVelocityIterations.html

but remember they might increase processor use, which you don’t really want.

Play with the values.  If you’re still having the same problem, you might want to try another workaround, such as decreasing the number of physics objects on the screen (and removing the old ones correctly), or do something different.
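For reference, a sketch of raising the solver accuracy (the values here are illustrative starting points, not recommendations):

```lua
local physics = require("physics")
physics.start()

-- Box2D defaults are 8 velocity and 3 position iterations per frame.
-- Higher values make collision resolution more accurate for fast
-- bodies, at the cost of extra CPU work every frame.
physics.setVelocityIterations(16)
physics.setPositionIterations(6)
```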

I actually did try isBullet and setting the other values really high, and it didn’t help.  After playing with it more, I see why: I’m moving the invisible kinematic object with the touch event to the event.x and event.y location.  When you move your finger really fast, the x and y coordinates jump far past the object I wanted to hit, so it really just repositions the invisible object on the other side of it without making any contact.

 

I’m now using the touch event just to keep track of where my finger is positioned on the screen and saving its coordinates.  Then, in the enterFrame event, I’m imparting a linear velocity on my invisible object based on its distance from the finger coordinates and the direction it needs to go.  So far it works okay, but even with a high velocity it seems a tad slow; I was hoping it would keep up with the finger, yet it lags a bit behind.  Maybe it’s less noticeable on a device, so I should give it a try there.
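A rough sketch of that approach (the `fingerX`/`fingerY` names and the speed factor are mine, not from any API):

```lua
local physics = require("physics")
physics.start()

local invis = display.newCircle(160, 240, 25)
physics.addBody(invis, "dynamic", { radius = 25 })
invis.isVisible = false
invis.isBullet = true

local fingerX, fingerY = invis.x, invis.y

-- The touch handler only records where the finger is.
local function onTouch(event)
    fingerX, fingerY = event.x, event.y
    return true
end
Runtime:addEventListener("touch", onTouch)

-- Each frame, steer the body toward the last known finger position.
-- Velocity is proportional to the remaining distance, so the body
-- slows as it arrives; a higher factor is snappier but can overshoot.
local function onEnterFrame()
    local speed = 10
    invis:setLinearVelocity((fingerX - invis.x) * speed,
                            (fingerY - invis.y) * speed)
end
Runtime:addEventListener("enterFrame", onEnterFrame)
```

The lag you’re seeing is inherent to this scheme: the velocity shrinks as the body closes in on the finger, so it never quite catches up during a fast drag.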

 

Thanks so much for your response.  I’m going to keep working and see if I can get it a bit more fluid.

The setFocus API might help here as well.

http://docs.coronalabs.com/api/type/StageObject/setFocus.html
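For example, locking all subsequent touch events to the object once a touch begins (a sketch):

```lua
local function onTouch(event)
    local target = event.target
    if event.phase == "began" then
        -- Direct every further event in this touch sequence to this
        -- object, even if the finger slides off it.
        display.getCurrentStage():setFocus(target, event.id)
        target.isFocus = true
    elseif target.isFocus then
        if event.phase == "moved" then
            target.x, target.y = event.x, event.y
        elseif event.phase == "ended" or event.phase == "cancelled" then
            -- Release the focus so other objects can receive touches.
            display.getCurrentStage():setFocus(target, nil)
            target.isFocus = false
        end
    end
    return true
end
```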

Consider using a touch joint on your ‘phantom’ object.

http://docs.coronalabs.com/guide/physics/physicsJoints/index.html#touch
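A sketch of driving a dynamic body with a touch joint (the `phantom` variable is assumed to be your invisible dynamic body; the joint pulls the body toward a target point, so the physics solver handles collisions along the way instead of teleporting past them):

```lua
local touchJoint

local function onTouch(event)
    if event.phase == "began" then
        -- Create the joint at the touch point; the body gets pulled there.
        touchJoint = physics.newJoint("touch", phantom, event.x, event.y)
        touchJoint.maxForce = 10000  -- how hard the joint may pull (tune this)
    elseif touchJoint then
        if event.phase == "moved" then
            -- Move the target; the solver accelerates the body toward it.
            touchJoint:setTarget(event.x, event.y)
        elseif event.phase == "ended" or event.phase == "cancelled" then
            touchJoint:removeSelf()
            touchJoint = nil
        end
    end
    return true
end
Runtime:addEventListener("touch", onTouch)
```

Raising maxForce makes the body track the finger more tightly; the joint’s frequency and dampingRatio properties can also be tuned for a softer or stiffer pull.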

Back from trying various things, and I think I’ve settled on using a touch joint with a dynamic body.  It reacts about the same as my kinematic object with linear velocities, but the code is much cleaner.

Appreciate all the help…thanks again.
