Camera Fill - multiple filter effects not working

I am trying to create a camera preview and adjustment screen, applying multiple filters to the preview object, which is filled with the camera feed. Everything works fine as long as I only use one filter in my custom effect's graph.

As soon as I add a second effect, the result is just black. Any ideas?

Thanks in advance.

Here is a code snippet:

local imgCamPreview = display.newRect( _W/2, _H/2 - 20, 480, 640 )

imgCamPreview.fill = { type="camera" }

graphics.defineEffect({
    language = "glsl",
    category = "filter",
    name = "camSetting",
    graph = {
        nodes = {
            brightnessFilter = { effect = "filter.brightness", input1 = "paint1" },
            -- contrastFilter = { effect = "filter.contrast", input1 = "brightnessFilter" }
        },
        output = "brightnessFilter"
        -- output = "contrastFilter"
    }
})

imgCamPreview.fill.effect = "filter.custom.camSetting"
imgCamPreview.fill.effect.brightnessFilter.intensity = mnBrightness
-- imgCamPreview.fill.effect.contrast = mnContrast

I believe you may want to try using display.newSnapshot. Or see if this helps:

https://forums.coronalabs.com/topic/40597-multiple-effects-on-single-object/

Rob

I tried working on a camera app using Corona SDK a while ago…

From my tests, you can apply single-filter effects to the camera feed if you use a snapshot as a container for the object with the camera-feed fill.
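To show what I mean, here's a rough sketch of that setup (the sizes and names are just placeholders, I'm using the built-in filter.brightness as an example, and the camera fill only renders on a real device, not in the simulator):

local snapW, snapH = 480, 640

-- snapshot that wraps the camera-fill object
local snapshot = display.newSnapshot( snapW, snapH )
snapshot.x, snapshot.y = display.contentCenterX, display.contentCenterY

-- the rect with the live camera feed goes into the snapshot's group
local camRect = display.newRect( 0, 0, snapW, snapH )
camRect.fill = { type="camera" }
snapshot.group:insert( camRect )

-- a single built-in filter applied to the snapshot itself
snapshot.fill.effect = "filter.brightness"
snapshot.fill.effect.intensity = 0.2

-- re-render the snapshot every frame so it keeps up with the live feed
Runtime:addEventListener( "enterFrame", function()
    snapshot:invalidate()
end )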

However, that doesn't work well with multi-pass shaders; it's buggy.

When using a multi-pass filter on a snapshot containing a camera-feed object, only one of the multi-pass nodes is applied (I couldn't figure out exactly why it's that particular filter, though…).

The only way I found to apply multi-pass filters to a camera feed is:

  1. Have an object with the camera-feed fill somewhere on the screen.

  2. Have a runtime function that, on every frame, captures a frame of the camera-feed object.

  3. Add that image to a snapshot object's canvas (using the "discard" canvas mode).

  4. Apply the multi-pass filter to the snapshot.
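The per-frame part looked roughly like this (only a sketch: I'm using display.capture for step 2, the camSetting effect from the first post for step 4, and placeholder sizes):

-- 1. camera-feed object somewhere on the screen
local camRect = display.newRect( display.contentCenterX, display.contentCenterY, 480, 640 )
camRect.fill = { type="camera" }

-- snapshot that receives the captured frames and the multi-pass filter
local snapshot = display.newSnapshot( 480, 640 )
snapshot.x, snapshot.y = display.contentCenterX, display.contentCenterY
snapshot.canvasMode = "discard"   -- canvas children are dropped after each render

-- 4. the multi-pass effect defined in the first post
snapshot.fill.effect = "filter.custom.camSetting"

local function onEnterFrame()
    -- 2. capture the current frame of the camera-feed object
    local frame = display.capture( camRect )
    if frame then
        frame.x, frame.y = 0, 0
        -- 3. add it to the snapshot's canvas and re-render,
        -- so the multi-pass filter is applied to the new frame
        snapshot.canvas:insert( frame )
        snapshot:invalidate( "canvas" )
    end
end

Runtime:addEventListener( "enterFrame", onEnterFrame )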

HOWEVER, that only works if the multi-pass filter is relatively simple.

As soon as you start adding chains of 4-5 filters (or using certain filters like bloom or frostedGlass), there are memory leaks somewhere, and the app slows down and crashes after a while.

OK, so I went back to my old code and tried some new features that weren't available the first time around (TextureResourceCanvas)… and now it works, actually pretty well, without crashing or slowing down too much…

It’s a bit convoluted (and of course needs to be optimised… it’s only a proof of concept right now)… but what I did:

  1. I created a textureBuffer using graphics.newTexture with type="canvas" (to use as a camera buffer).

  2. A runtime function, running every frame, updates the textureBuffer with these steps:

2.1 Create a new rect with display.newRect, set its fill to the camera, and add the rect to the buffer using textureBuffer:draw(rect).

2.2 Invalidate the buffer using textureBuffer:invalidate(), then clean up the buffer's cache group (textureBuffer.cache)… otherwise it leaks memory.

Now you have a real-time buffer of the camera in a texture resource. You can then just create a view using display.newRect and set its fill to use the texture resource with type="image", filename=textureBuffer.filename, baseDir=textureBuffer.baseDir…

The new view will show the real-time camera feed, and you can apply multi-pass filters to it.
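The core of it looks roughly like this (very much a proof-of-concept sketch: the cache cleanup loop is just my take on step 2.2, texBuffer/camView are placeholder names, and filter.custom.camSetting is the effect from the first post):

-- 1. canvas texture used as the camera buffer
local texBuffer = graphics.newTexture( { type="canvas", width=480, height=640 } )

-- view that shows the buffer; the multi-pass filter goes on this object
local camView = display.newRect( display.contentCenterX, display.contentCenterY, 480, 640 )
camView.fill = { type="image", filename=texBuffer.filename, baseDir=texBuffer.baseDir }
camView.fill.effect = "filter.custom.camSetting"

local function updateBuffer()
    -- 2.1 draw a fresh camera-fill rect into the buffer
    local camRect = display.newRect( 0, 0, 480, 640 )
    camRect.fill = { type="camera" }
    texBuffer:draw( camRect )

    -- 2.2 schedule the render, then empty the buffer's cache group;
    -- otherwise the drawn rects pile up and leak memory
    texBuffer:invalidate()
    for i = texBuffer.cache.numChildren, 1, -1 do
        texBuffer.cache[i]:removeSelf()
    end
end

Runtime:addEventListener( "enterFrame", updateBuffer )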

Check out the TextureResourceCanvas API:

https://docs.coronalabs.com/daily/api/type/TextureResourceCanvas/index.html

https://docs.coronalabs.com/daily/api/library/graphics/newTexture.html
