Full screen effects?

Is there an easy way of writing a full screen shader (fragment) effect?

I *imagine* you’d stick a screen-sized rectangle over the screen, set to be on top of everything else, but are there any tricks to make the shader not go via any texture or other settings, i.e. just work from whatever is already rendered into the screen buffer?

I’ll admit, this is all quite new to me, so I may ask some stupid questions :slight_smile:

I believe you’re talking about post-processing effects. The way to do that on a cross-platform basis is to render the whole scene to a snapshot object (render-to-texture) and then apply an effect on the snapshot.
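To make that concrete, here is a minimal sketch of the snapshot approach, assuming the snapshot API (`display.newSnapshot`, `snapshot.fill.effect`, `snapshot:invalidate()`); `sceneGroup` and the choice of `filter.blurGaussian` are just placeholders for your own scene and effect:

```lua
-- Hedged sketch: capture the whole scene into a snapshot (render-to-texture),
-- then apply a post-processing effect to the snapshot's fill.
local snapshot = display.newSnapshot( display.contentWidth, display.contentHeight )
snapshot.x, snapshot.y = display.contentCenterX, display.contentCenterY

-- Build (or move) your scene inside the snapshot's group...
snapshot.group:insert( sceneGroup )

-- ...then apply a filter effect to the entire rendered scene at once.
snapshot.fill.effect = "filter.blurGaussian"

-- Re-render the snapshot whenever its contents change.
snapshot:invalidate()
```

The effect then runs over the captured texture rather than the live framebuffer, which is what makes it cross-platform.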

Normally, OpenGL doesn’t let you read from the framebuffer directly in a shader. However, there is a way to do it specifically on iOS that avoids the need for a snapshot: the EXT_shader_framebuffer_fetch extension, which enables the shader to read back from the frame buffer.

You could try this tutorial (http://www.raywenderlich.com/70208/opengl-es-pixel-shaders-tutorial). Skip to the section called “Procedural Textures” — those would be equivalent to what we call “generator” effects.

This page also has some interesting effects: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson6

In those examples, whatever is on the right-hand side of the `gl_FragColor =` assignment is what your fragment kernel should return.
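As an illustration of that mapping, here is a hedged sketch assuming the `graphics.defineEffect` custom-shader API; the effect name (`redTint`) and the tint math are made up for the example:

```lua
-- Hypothetical sketch: wrapping a tutorial-style fragment shader as a
-- custom effect. The value FragmentKernel returns is what the tutorial
-- would have assigned to gl_FragColor.
local kernel = {
    category = "filter",
    name     = "redTint",
    fragment = [[
        P_COLOR vec4 FragmentKernel( P_UV vec2 texCoord )
        {
            P_COLOR vec4 color = texture2D( CoronaSampler0, texCoord );
            color.gb *= 0.5;  -- the tutorial's "gl_FragColor = color;" step
            return CoronaColorScale( color );
        }
    ]],
}
graphics.defineEffect( kernel )

-- Usage (assumed naming scheme for custom effects):
-- someObject.fill.effect = "filter.custom.redTint"
```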

OK thanks, but yeah things need to be cross platform.

I must admit though, I did think it was precisely these types of shaders that allowed for some of the awesome water / ripple effects we see in games. But maybe what you mention is right: they render the screen into a buffer (which is surely basically the same as a snapshot / render-to-texture, right?) and then process that.

It is fairly obvious I don’t know much about this, but I’m trying to gather information so that when I have time, I hit the ground running :slight_smile:

BTW, regarding snapshots: did you ever look into my ancient request to give snapshots a ‘priority’ or ‘order’ property, so they can be forced to render in a specific order, all within the same frame? At the moment snapshots always have a one-frame delay, which accumulates as they are nested. There’s a simple enough way around this, I feel, and it would make snapshots more useful (particularly in, say, something like water!).
