Full Screen Filters Needed

We’ve alluded to this in another post, but it seems appropriate to have a fresh one for clarity.

Lots of the filters seem perfect for games as full-screen real-time effects. It really feels like we are missing a trick not being able to do this. My understanding is we currently have two options:

  1. Use a snapshot with all game objects in its group. Filters can then be applied to this snapshot, but we lose all touch events. Fine for Ouya, GameStick, etc., but not great for phones and tablets.
  2. We use display.captureScreen() to capture the screen every frame, which just kills performance.
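For reference, option 1 looks something like this in G2.0, using the documented snapshot API. This is only a sketch: the effect name filter.bloom is just one example from the built-in filter list, and gameGroup stands in for whatever top-level group your game already uses.

```lua
-- Sketch of option 1: render the whole game into a snapshot
-- and apply a filter to its fill.
local snapshot = display.newSnapshot( display.contentWidth, display.contentHeight )
snapshot.x = display.contentCenterX
snapshot.y = display.contentCenterY

-- Move the game's display objects into the snapshot's group
-- (gameGroup here is illustrative - your existing top-level group)
snapshot.group:insert( gameGroup )

-- Apply a full-screen effect to the snapshot
snapshot.fill.effect = "filter.bloom"

-- Re-render the snapshot each frame so moving objects stay current
Runtime:addEventListener( "enterFrame", function()
    snapshot:invalidate()
end )
```

The catch, as described above, is that objects inside snapshot.group stop receiving their own touch events.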

Surely there must be a solution to this? Is there a screen buffer object that could be made available that we could apply filters to?

Obviously, users cannot expect their touch events to work accurately when applying distortion effects like swirl or bulge.

@walter Come on, you know this makes sense! Or at least let us know why this isn’t possible in Corona.

Thanks a lot,

Ian

  1. I don’t know if losing the touch events is true or not, but if it is… couldn’t you make a rect that is the same size as your snapshot, overlay it on top of the snapshot, set it to invisible, use .isHitTestable = true, and use that to capture your touch events?
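That overlay would only be a few lines. A rough sketch, assuming a snapshot variable as above (the handler body is purely illustrative):

```lua
-- Invisible full-size rect that still receives touches,
-- placed directly over the snapshot
local touchRect = display.newRect( snapshot.x, snapshot.y,
                                   snapshot.width, snapshot.height )
touchRect.isVisible = false
touchRect.isHitTestable = true  -- keep hit testing even while invisible

touchRect:addEventListener( "touch", function( event )
    -- event.x / event.y are content coordinates; route them
    -- to your own picking logic here
    return true
end )
```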

If you only needed a single ‘Runtime’ style touch event for the entire game then that sounds like it would work. Unfortunately, most games (including ours) have all kinds of objects you can interact with via their own touch listeners.

(It’s true btw as Walter says this is expected behaviour, see: http://forums.coronalabs.com/topic/42049-resolved-events-not-working-on-objects-inside-snapshot/)

Any other ideas Danny?

Do you mean you need to be able to touch individual objects inserted into the snapshot? Or do you just need to be able to touch the snapshot as a whole?

I need a little more info :smiley:

Sure. We have a game with lots going on and lots to touch and do. We want to be able to apply full screen effects to it. For example, add a bloom to the entire screen. We still need to be able to interact with the game.

Back to my original post (in accidental stealth mode)… there seems to be no way to do this. Using a snapshot is an example of a method that doesn’t work - because we lose the touch events. Got a better way?

Tbh, it makes sense that (when using a snapshot) rendering a group of objects into a single image and then making it visible (and not in every frame, of course!) would lose touch tracking - what are you expecting it to know about?

As with my recommendation in the pinch-zoom-rotate tutorial, using other objects at the top layer (your own “tracking” or “interface” layer) would be one solution and quite inexpensive; if inner-snapshot object touch tracking were implemented by CoronaLabs, I would expect it to be quite expensive. Just IMHO.

I think my idea would still work well. Just create the rects for each child object you insert into the snapshot and position the rects where you position the child in the snapshot.

Then move the rects with your snapshot (or however you are handling your movement) and all should be good right?
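A rough sketch of that idea, assuming the children are positioned relative to the snapshot’s centre (all the names here are illustrative):

```lua
-- One invisible, hit-testable proxy rect per touchable child.
-- Assumes snapshot.group children are positioned relative to the
-- snapshot's centre, which sits at (snapshot.x, snapshot.y).
local proxies = {}

for i = 1, snapshot.group.numChildren do
    local child = snapshot.group[i]
    local proxy = display.newRect(
        snapshot.x + child.x, snapshot.y + child.y,
        child.width, child.height )
    proxy.isVisible = false
    proxy.isHitTestable = true
    proxy.target = child  -- remember which object this stands in for
    proxy:addEventListener( "touch", onObjectTouch )  -- your own handler
    proxies[#proxies + 1] = proxy
end
```

If the snapshot moves or scales, the proxies would need to be updated in the same pass.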

This is just one idea off the top of my head; I’m sure there are other possibilities that others could suggest.

Thanks

I’d like to not use snapshots if possible as this is clearly not what they were intended for. I did not intend for this post to become about them. We just need to apply a filter to the screen. Should be easy, right?

Danny, we will have a think about that. I’m worried about the complexity though as we’re using MTE to render thousands of tiles and have 500 - 1000 sprites above them that you can touch individually. Feels a bit like using a hammer to crack an egg.

Surely there must be a better way to apply a filter to the full screen?

spideri, to be honest it sounds like you are making fairly unreasonable demands of the poor old corona engine :slight_smile:

I think Danny’s idea is the best, you just have simple touch rects in the location of the buttons. Of course, he’s thinking in terms of a UI with only a few, and you seem to be talking about something else.

Do you need to be able to tap on each tile on-screen or something? Because if so, touch events in each tile are not a good solution to this problem anyway; you’d be better off having one touch event for the whole screen and calculating the tile under a given point. And how are you managing thousands of tiles? My understanding of MTE is that it only draws the tiles onscreen, so unless you have really tiny tiles or many, many layers, there aren’t many ways to get *that* many tiles on screen. If I run 4 full-screen layers in my own engine, I’m still only pushing a little more than 1000 tiles per frame.
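The single-listener approach boils down to a little arithmetic: given the map’s scroll offset, zoom, and tile size, convert the touch point to a tile index. Plain Lua, with all the parameter names illustrative:

```lua
-- Convert a content-space touch to a tile column/row.
-- mapX, mapY: content position of the map's top-left corner
-- scale: current zoom factor; tileSize: unscaled tile size in pixels
local function tileUnderPoint( x, y, mapX, mapY, scale, tileSize )
    local col = math.floor( (x - mapX) / (tileSize * scale) ) + 1
    local row = math.floor( (y - mapY) / (tileSize * scale) ) + 1
    return col, row
end

-- e.g. a touch at (165, 90) on an unscrolled, unzoomed 32px map:
-- tileUnderPoint( 165, 90, 0, 0, 1, 32 )  -- returns 6, 3
```

One lookup per touch, regardless of how many tiles are on screen.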

Perhaps with a bit more detail we can come up with a more tailored solution?

Hi Rakoonic. Thanks for the suggestions. To explain a little further about our scenario:

  • We have a tile map of around 30x40 tiles = 1200, and whilst the user is zoomed in most of the time to say 10x12, they often need to zoom out to explore the map. At this point we only have 1 layer of tiles.
  • We have 500 touchable objects on this map, each consisting of 1-2 images. These objects don’t move but do have specific locations (i.e. not aligned to tiles).
  • As you can see, when zoomed out that’s a LOT to draw, but actually in Corona and with MTE we were getting decent speeds until G2.0 hit… though we’re waiting for the new MTE update before we panic about that.

Are you saying each touchable object having its own touch listener will be very slow? If so, that would be useful to know. As they don’t move, we could easily ‘bsp’ them.

Having said that we don’t want to make too many changes until we see what effect the new version of MTE has on everything.
