Deferred Shading, is it possible?

I’m planning on adding more detail to one of our projects, and normal mapping came to mind. I know I saw normal mapping somewhere on the blog, and I know there is a normal mapping shader, but it only supports one light.

BUT! Seeing as Corona has implemented the *graphics.newTexture* API, can these textures be used as multiple render targets so that we can actually do deferred shading/lighting and thus make a cool-looking game with lights?

Here is a small but pretty good article about this. I’m getting kind of excited, since I think this can be accomplished with the *newTexture* API. Can it?

I think it’s feasible. At any rate, it’s crossed my mind to investigate. It would probably be a terrible idea to add to a work-in-progress project, though.  :slight_smile:

The limits you’d be up against are that Corona currently only supports two textures and that you’ll presumably want batched geometry, which mostly limits you to four shader inputs (with some encoding you might squeeze out a little more), unless you can spare a texture and cram it full of data. By the sounds of it you could pack the three-component normal and luminance together into RGBA textures (if you adopt the model in the link)?
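As a minimal sketch of that packing idea (everything here is an assumption on my part: the kernel name, the group, and deriving luminance from the albedo are all made up, and the normal map is assumed to already store the normal as n * 0.5 + 0.5, so its RGB fits a texel as-is):

[lua]
-- Sketch only: pack a surface normal and a luminance value into one RGBA texel.
local kernel = { category = "composite", group = "custom", name = "packNormalLum" }

kernel.fragment = [[
P_COLOR vec4 FragmentKernel( P_UV vec2 texCoord )
{
    P_COLOR vec4 albedo = texture2D( CoronaSampler0, texCoord );
    P_COLOR vec4 nmap   = texture2D( CoronaSampler1, texCoord ); // normal map paint

    // The normal map's RGB already encodes the [-1, 1] normal remapped to [0, 1],
    // so pass it through and use the spare alpha channel for scene luminance.
    P_COLOR float lum = dot( albedo.rgb, vec3( 0.299, 0.587, 0.114 ) );
    return vec4( nmap.rgb, lum );
}
]]

graphics.defineEffect( kernel )
-- Later, on a composite paint: object.fill.effect = "composite.custom.packNormalLum"
[/lua]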

Another issue is that you’ll need to draw your scene twice, and how best to achieve that.

What are your implementation ideas at the moment?

For now I’m not really sure; I’ll experiment tomorrow. I’m thinking about using the graphics.newTexture API with the canvas type. I just need to see how :invalidate() works and whether I can get a canvas to draw inside another canvas, sort of like render passes. I’ll also see what I can do with snapshots, try different things, and find out how it really works, its limitations, etc., so it’s just a matter of experimenting for now. (Maybe I just said a bunch of nonsense.) Right now the game uses a fullscreen snapshot so I can apply shaders to the whole game; I’ll see if I can mix things up.

I also remember reading something about uniform userdata for shaders, but I still need to read more about all that. I just have a feeling that it could be done somehow. The devs here at Corona are constantly opening up APIs and new features for us.

By the way, @StarCrunch, I know you have tinkered with shaders more than anyone around here, so you saying it’s feasible means it’s a real possibility! hehe

@basiliogerman It’s great that you are experimenting to see what works. I’ve done a lot of that in my project, which makes heavy use of the new canvas texture. I had experimented at various times with snapshots and shaders, but in the end found that canvas textures, for my purposes, are very fast and reliable. You mention that you use fullscreen snapshots. It will depend on your specific use case, but that sounds expensive to me. Be sure to test it on slower devices. I use an iPhone 3GS and an old $50 Android phone as my baselines; if something works on both of those devices, then it’s good as far as I’m concerned. Older devices are especially important for testing shaders, not just raw speed.

@jerejigga Yes, snapshots are a bit expensive, but they work well on newer devices. What do you use the canvas textures for, and how?

Weirdly, I stumbled across this earlier today…
https://www.codeandweb.com/spriteilluminator

Don’t know if that’s exactly what you’re after or if it’s any help…

(I’ll be honest I really don’t understand any of this type of stuff, but it fascinates me)

@Appletreeman Yes, it’s something like that, but with more lights. This is the old blog post I was talking about, and this link here has a nice animation of it in action. Something like that but on a larger scale, like this video on YouTube. I’m pretty busy right now with other things, but I hope I get some time to try things out!

@basiliogerman I can’t go into too much depth about how I use the canvas textures yet since the project I am working on is still in development and therefore under wraps for now. I can tell you that I use canvas textures in multiple ways but here is the main one.

I create a new texture using graphics.newTexture( { type="canvas", width=width, height=height } ) and draw onto the canvas using various display objects. I then use the texture to create a new ImageRect of the same width and height as the canvas, and apply a predefined filter (e.g. blur, bulge, contrast, etc.) to that ImageRect. I then draw this ImageRect onto yet another canvas texture, and use that second canvas texture to create any number of new ImageRect objects. The code looks something like what is below.

Blurs and other filters can be expensive, especially if the object is moving. This approach lets me apply the filter once to the first canvas and then replicate the pre-filtered second texture onto as many display objects as I want, incurring the filter cost only once.

Note that Android flushes textures on application suspend. This forces me to flush the canvas texture caches upon application resume. And since I am using nested canvases, I have to flush the innermost canvas first and then work my way outward. I also allow a small delay between canvas cache flushes so the system can catch up between texture flushes.

All of the code below was extracted from my actual code and is not runnable as-is. It’s just meant as a starting point for those who are interested. For example, production code should check if the OS is Android before bothering to add the system event handler shown below.

[lua]
local tex1 = graphics.newTexture( { type="canvas", width=width, height=height } )
tex1:draw( content1 )
tex1:invalidate()

local rect = display.newImageRect( tex1.filename, tex1.baseDir, width, height )
rect.fill.effect = "filter.blurGaussian"

local tex2 = graphics.newTexture( { type="canvas", width=width, height=height } )
tex2:draw( rect )
tex2:invalidate()

for i = 1, 100 do
    local rect2 = display.newImageRect( tex2.filename, tex2.baseDir, width, height )
end
[/lua]

[lua]
local function onSystemEvent( event )
    if ( event.type == "applicationResume" ) then
        tex1:invalidate( "cache" )
        timer.performWithDelay( 100, function()
            tex2:invalidate( "cache" )
        end )
    end
end
Runtime:addEventListener( "system", onSystemEvent )
[/lua]

Awesome @jerejigga ! That’s something i wanted to try out!  :smiley:

EDIT:

Okay, so I went ahead and wrote a bit of code, borrowed some assets, read some good material, and this is what I came up with:

https://www.youtube.com/watch?v=BbfrL9da4i8

What I am essentially doing is using the graphics.newTexture API as a framebuffer object. I have two shaders: one to build a lightmap and the other to apply the lightmap and ambient color.

I’m not sure if the approach is the best one, or if it even counts as deferred shading: I loop over each light in an enterFrame listener, which I think is not truly deferred. As for the shaders, I essentially dissected this shader in two, one as a filter and the other as a composite. (I did have to compensate for Corona flipping the Y coordinate.)
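A rough outline of that two-pass structure might look like the following. To be clear, this is illustrative only: the effect names, the `lights` table, and the `sceneTex` texture are all hypothetical stand-ins, not taken from the actual project.

[lua]
-- Illustrative sketch of a two-pass lightmap setup; names are hypothetical.
local W, H = display.contentWidth, display.contentHeight

-- Pass 1: accumulate all lights into an off-screen canvas (the "framebuffer").
local lightmap = graphics.newTexture( { type = "canvas", width = W, height = H } )

for _, light in ipairs( lights ) do
    local rect = display.newRect( 0, 0, W, H )
    rect.fill.effect = "filter.custom.pointLight"       -- hypothetical lightmap shader
    rect.fill.effect.position = { light.x / W, light.y / H }
    rect.fill.effect.falloff = light.falloff
    rect.fill.effect.lightColor = light.color
    lightmap:draw( rect )
end
lightmap:invalidate()

-- Pass 2: composite the lightmap over the rendered scene with ambient color.
scene.fill = {
    type = "composite",
    paint1 = { type = "image", filename = sceneTex.filename, baseDir = sceneTex.baseDir },
    paint2 = { type = "image", filename = lightmap.filename, baseDir = lightmap.baseDir },
}
scene.fill.effect = "composite.custom.applyLightmap"    -- hypothetical apply shader
scene.fill.effect.ambient = { 0.1, 0.1, 0.15 }
[/lua]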

I’ve been using Sprite Illuminator https://www.codeandweb.com/spriteilluminator to generate normal maps and then using a composite paint fill to produce shading.

Check out the advanced graphics guide https://docs.coronalabs.com/guide/graphics/effects.html

[lua]
object.fill = compositePaint
object.fill.effect = "composite.normalMapWith1DirLight"
object.fill.effect.dirLightDirection = { 1, 0, 0 }
object.fill.effect.dirLightColor = { 0.3, 0.4, 1, 0.8 }
object.fill.effect.ambientLightIntensity = 1
[/lua]

You can use sin and cos to work out the x and y values and assign them to the object.fill.effect.dirLightDirection table. Experiment with the z value.

If you update the x and y values every frame, you will have a dynamic lighting system.
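For example, sweeping the light direction around the sprite each frame could look like this (the angle step and z value are arbitrary choices; `object` is assumed to already have the composite paint and effect from the snippet above):

[lua]
-- Rotate the directional light around the object over time.
local angle = 0

local function onEnterFrame()
    angle = angle + 0.02  -- radians per frame; tune to taste
    object.fill.effect.dirLightDirection = {
        math.cos( angle ),  -- x
        math.sin( angle ),  -- y
        0.3,                -- z: experiment with this value
    }
end

Runtime:addEventListener( "enterFrame", onEnterFrame )
[/lua]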

I’m experimenting with 1PointLight as well but find it a little trickier (attenuation values).

[lua]
object.fill = compositePaint
object.fill.effect = "composite.normalMapWith1PointLight"
[/lua]

The first image below has shading built into the graphic file; the second is a screenshot of an experiment in dynamically lighting the image in real time using 3 light sources (3 normal maps: white, yellow, red).

*Be mindful of how you set up your shading system, as it can quickly become taxing on the processor and drag down the frame rate.

(Images: pug_b_tm_200.jpg, pug3D_tm_200.jpg)

I’m taking a break from building my app to try to make a shading module for the Corona Marketplace contest. I’m not sure if I can make the deadline, but I’ll repost if I’m successful.

-Jonathan

A shading module would be super slick!

Also, TexturePacker has a normal map packing option.

@sharp100 Thanks for sharing your results. As you can read in the previous posts, I already accomplished this some months ago using custom shaders. You can start on your custom shading modules already.

Here is the video link of my results again.

The difference between using "composite.normalMapWith1PointLight" and the custom shader I made is the number of lights you can render in a scene. As you can see in the video, I have 3 simultaneous lights with no performance drop.

And thanks for sharing the tools. I am aware of those and already own both TexturePacker and Sprite Illuminator.

@basiliogerman Cool, I like the varying intensities of the different lights.  Were you able to use different colors for each light source?

@sharp100 Yes; the parameters on the light shader are position, falloff, and light color, and the apply shader has an ambient light parameter.

I submitted my shading module to the Corona Marketplace, just in time for the contest deadline (11:59pm)!

Check out the product site and the video on the video page

http://dynamicshader.com

I’m not sure when I’ll get approval but the plugin is tested and ready to go.

-Jonathan
