Has anyone noticed that if an image bleeds off the edge of a scene you're transitioning to, it pops in for a split second on the current screen before the transition starts? I'm wondering if there's a way to fix this.
This seems to be a long-standing problem with Director. The new scene is created adjacent to the old one: its display group is given display.contentWidth as its x position before the transition occurs.
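Here is a minimal, self-contained illustration of why that causes the pop-in (my own sketch, not Director's actual source; "background.png" is a hypothetical asset):

-- My own sketch of the problem, not Director's actual source.
-- A stand-in for the incoming scene's view group:
local nextView = display.newGroup()

-- An image that bleeds off the left edge of its scene
-- ("background.png" is a hypothetical asset):
local img = display.newImage( nextView, "background.png" )
img.x = -50  -- partly left of the scene's own origin

-- Director parks the new scene directly adjacent to the current one:
nextView.x = display.contentWidth

-- In letterbox mode the visible screen is wider than contentWidth
-- (display.screenOriginX < 0), so the part of img that bleeds past
-- nextView's left edge lands inside the current screen's bleed area.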
When using the “letterbox” scaling mode and filling the “bleed” area around the content region (as explained here: http://blog.anscamobile.com/2010/11/content-scaling-made-easy), this becomes very noticeable.
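For reference, a config.lua along these lines produces that setup (the 320x480 content size is only an example, not something the approach requires):

-- config.lua (illustrative values)
application =
{
    content =
    {
        width = 320,
        height = 480,
        scale = "letterbox",  -- scales content uniformly to fit the screen;
                              -- the leftover screen area is the bleed region
                              -- you can draw into
    },
}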
If you only use “fade” and similar transitions in Director, you can try this:
In director.lua (version 1.4), change lines 66 and 67 from
local _W = display.contentWidth
local _H = display.contentHeight
to:
-- display.screenOriginX/Y are zero or negative in letterbox mode; adding
-- twice their magnitude extends the parking position past the bleed area.
local _W = display.contentWidth + 2 * math.abs(display.screenOriginX)
local _H = display.contentHeight + 2 * math.abs(display.screenOriginY)
(For example, 320x480 content letterboxed onto a 320x568-point screen gives display.screenOriginY = -44, so _H becomes 480 + 88 = 568, the full visible height.)
or, in your case:
local _W = 10000  -- brute force: park the incoming scene far off-screen
local _H = 10000
This is not a thorough solution, though; the "downFlip" transition, for example, no longer works without glitches.
Maybe a proper solution is possible with a mask (graphics.newMask() and so on) that clips each scene's view to the content region; see the sketch at the end of this post.
(Though currently a mask has to be created from an image file. It would be better if there were a way to create a rectangular mask by passing parameters; I think I read somewhere that that's planned.)
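A minimal sketch of that idea, assuming a hypothetical mask file "scenemask.png" (a white rectangle the size of the content area on a black background, with dimensions divisible by 4 as Corona's mask rules require):

-- Sketch only: clip a scene's view group to the content rectangle, so
-- nothing that bleeds outside it is drawn during a transition.
local mask = graphics.newMask( "scenemask.png" )  -- hypothetical file

local function maskSceneView( view )
    view:setMask( mask )
    -- Mask coordinates are relative to the object's own origin, so
    -- center the mask on the content area:
    view.maskX = display.contentWidth * 0.5
    view.maskY = display.contentHeight * 0.5
end

You would apply this to Director's incoming view before the transition starts, and could remove it afterwards with view:setMask( nil ) so the bleed content shows again once the scene is current.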