So do fragment shaders actually change the image they are applied to? I thought this was purely a rendering pass that didn’t touch the source, so no feedback is really possible. Then again, since fragment shaders work on a pixel-by-pixel basis and have no way of *writing* to anything other than the current pixel, I’m not sure feedback would be a problem anyway.
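For what it’s worth, the shape of a custom fragment kernel (via graphics.defineEffect) makes that point concrete: the kernel only ever *returns* the colour of the pixel currently being shaded. This is just a minimal passthrough sketch, and the “passthrough” name is a placeholder:

```lua
-- Minimal custom filter kernel (placeholder name "passthrough"): the fragment
-- function receives the current pixel's texture coordinate and returns that
-- one pixel's colour; there is no way to write to any other pixel.
local kernel = { language = "glsl", category = "filter", group = "custom", name = "passthrough" }

kernel.fragment = [[
P_COLOR vec4 FragmentKernel( P_UV vec2 texCoord )
{
    return CoronaColorScale( texture2D( CoronaSampler0, texCoord ) );
}
]]

graphics.defineEffect( kernel )
```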
But as for rendering passes, well, naturally if you do some heavy-duty stuff and it runs slowly, that’s your own fault.
Here’s an example of where two snapshots would be cool, with the second snapshot used as a ‘mask’ (not an alpha channel as such; to be honest, all my filters/composites require a solid snapshot anyway, or things can go odd).
I have my 2D platformer and I want water effects. The filter I currently have is just a horizontal water level: above a certain point it’s air, below it’s water. Instead, I want individual tiles to be able to be water or not, so you get disconnected pools of water dotted around the level.
An obvious way to do this is with two versions of the tileset and two renderings of the level. One is the normal visual representation of the level; the other is a ‘water mask’, where tiles are normally drawn black, but any area within a tile that you want to be (visually) water is drawn in white, so it effectively becomes a per-pixel value.
Then, every frame, you draw the level into the two snapshots, with the second matching the first except that it is pure black apart from where the water is. The composite then looks into the second snapshot, and wherever a pixel is white it manipulates the corresponding pixel of the visual snapshot, e.g. with a rippling distortion.
Essentially the second snapshot would be a glorified dynamic greenscreen (although naturally the colour itself is irrelevant, and you could even have several ‘greenscreens’ at once by using the R, G, B and possibly A channels individually).
’Twould be awesome!
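To make that concrete, here’s a rough sketch of what the kernel side could look like as a custom composite effect. CoronaSampler0, CoronaSampler1, CoronaTotalTime and CoronaColorScale are the documented built-ins; the effect name, the placeholder filenames and the distortion maths are made up, and being able to point that second texture at a *live* snapshot is exactly the missing piece:

```lua
-- Sketch of a custom composite effect: CoronaSampler0 is the visible level,
-- CoronaSampler1 is the black/white water mask described above.
local kernel = {
    language = "glsl",
    category = "composite",
    group    = "custom",
    name     = "waterMask",   -- placeholder name
}

kernel.fragment = [[
P_COLOR vec4 FragmentKernel( P_UV vec2 texCoord )
{
    // How watery this pixel is, read from the mask texture (white = water).
    P_COLOR float water = texture2D( CoronaSampler1, texCoord ).r;

    // A simple rippling distortion, scaled by the mask value.
    P_UV vec2 offset = vec2(
        sin( texCoord.y * 40.0 + CoronaTotalTime * 3.0 ),
        cos( texCoord.x * 40.0 + CoronaTotalTime * 3.0 )
    ) * 0.01 * water;

    P_COLOR vec4 scene = texture2D( CoronaSampler0, texCoord + offset );
    return CoronaColorScale( scene );
}
]]

graphics.defineEffect( kernel )

-- Applying it to a composite fill built from two static images; the whole
-- feature request is being able to point paint2 at a live snapshot instead.
local rect = display.newRect( display.contentCenterX, display.contentCenterY, 480, 320 )
rect.fill = {
    type   = "composite",
    paint1 = { type = "image", filename = "level.png" },      -- visual render (placeholder)
    paint2 = { type = "image", filename = "waterMask.png" },  -- mask render (placeholder)
}
rect.fill.effect = "composite.custom.waterMask"
```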
I still don’t see the problem with just passing the second snapshot as a parameter to a normal filter and exposing it as CoronaSampler1, but I’m not in the know, so I don’t want to second-guess too much.
As for snapshots being lagged or rendered out of order, I already suggested an easy fix for this a long time ago (and several times since): give each snapshot a .renderPriority property, so you can manually choose the order in which they are refreshed within a given draw cycle. This would also eliminate the problems with nested snapshots, where the further down the hierarchy you go, the more lagged each snapshot becomes.
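Purely as an illustration of that suggestion (to be clear, .renderPriority does not exist; it is the thing being requested), usage might look something like this:

```lua
-- Hypothetical only: .renderPriority is NOT an existing snapshot property;
-- this is just what the suggested API might look like in use.
local innerSnapshot = display.newSnapshot( 256, 256 )
local outerSnapshot = display.newSnapshot( 512, 512 )
outerSnapshot.group:insert( innerSnapshot )   -- nested snapshot

innerSnapshot.renderPriority = 1   -- proposed: refreshed first within the draw cycle
outerSnapshot.renderPriority = 2   -- proposed: refreshed after, so it sees the fresh inner render
```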