On an iPad, try this classic example of how to drag an object across the screen:
https://coronalabs.com/blog/2011/09/24/tutorial-how-to-drag-objects/
combined with the “modern ultimate config file” and its 320/480 content settings:
https://coronalabs.com/blog/2013/09/10/modernizing-the-config-lua/
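For reference, a sketch of what that “modern ultimate” config.lua looks like with the 320/480 base content size (exact values follow the linked post; the imageSuffix thresholds are the commonly used ones and may differ from your project):

```lua
-- config.lua -- sketch of the "modernized" config with a 320x480 base
local aspectRatio = display.pixelHeight / display.pixelWidth

application = {
    content = {
        -- stretch the shorter/longer side to match the device aspect ratio
        width  = aspectRatio > 1.5 and 320 or math.ceil( 480 / aspectRatio ),
        height = aspectRatio < 1.5 and 480 or math.ceil( 320 * aspectRatio ),
        scale  = "letterBox",
        fps    = 30,
        imageSuffix = {
            ["@2x"] = 1.5,  -- use @2x images when the scale factor exceeds 1.5
            ["@4x"] = 3.0,  -- use @4x images when it exceeds 3.0 (e.g. retina iPad)
        },
    },
}
```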
Since these config settings make Corona scale all display objects up by a factor of 4 (selecting the @4x versions of images), “content pixels” and “actual pixels on the device” differ. For example, a newLine defined as 1 px wide (content pixels) is displayed 4 pixels wide (actual device pixels). It also means you have to drag your finger 4 actual pixels on the device before the position changes by 1 content pixel, which produces an “unsmooth” effect: the dragged image jumps 4 device pixels at a time.
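To make the setup concrete, here is a sketch of the drag pattern from the linked tutorial (object and variable names like `myObject` are illustrative, not from the tutorial verbatim):

```lua
-- Classic drag pattern: move the object by the touch delta,
-- keeping the grab point under the finger.
local myObject = display.newRect( 160, 240, 50, 50 )
myObject:setFillColor( 1, 0, 0 )

local function onTouch( event )
    local t = event.target
    if event.phase == "began" then
        display.getCurrentStage():setFocus( t )
        t.isFocus = true
        -- remember where inside the object the touch started
        t.x0 = event.x - t.x
        t.y0 = event.y - t.y
    elseif t.isFocus then
        if event.phase == "moved" then
            -- event.x / event.y are in content coordinates, so on a
            -- 4x-scaled device they advance in whole content pixels,
            -- i.e. jumps of 4 device pixels
            t.x = event.x - t.x0
            t.y = event.y - t.y0
        elseif event.phase == "ended" or event.phase == "cancelled" then
            display.getCurrentStage():setFocus( nil )
            t.isFocus = false
        end
    end
    return true
end

myObject:addEventListener( "touch", onTouch )
```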
By printing event.x to the console, I have seen that the “moved” phase of the listener still fires even though event.x stays the same (when dragging very slowly, just a tiny amount). So the listener detects the movement even though nothing has actually changed in content coordinates. I assume the problem would be solved if the x and y coordinates could be read with decimals (sub-pixel precision).
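The diagnostic I used looks roughly like this (a minimal sketch, assuming a Runtime touch listener just for logging):

```lua
-- Log event.x on every "moved" phase to show that the phase fires
-- even when the reported (integer content-pixel) coordinate is unchanged.
local lastX

local function logMoves( event )
    if event.phase == "moved" then
        if event.x == lastX then
            print( "moved fired, but event.x unchanged:", event.x )
        else
            print( "event.x:", event.x )
        end
        lastX = event.x
    end
    return false  -- don't swallow the touch; let other listeners run
end

Runtime:addEventListener( "touch", logMoves )
```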
How can I get the listener to see the fractional changes in x or y when an object is dragged?