snapshot as paint input?

Hello, everyone!
So here is a new feature. Give it a spin; we would like to get your feedback on it.
Half a year ago we introduced the texture APIs. Building on that API, here is a modifiable texture. It resembles the Snapshot object, but it is quite different, even if the APIs may seem similar.
 
The new feature is the TextureResourceCanvas type. It is an in-memory texture: you can render objects into it and then assign it as a fill to other objects. Note that you can assign it to several different objects at once.
It is subject to manual texture management, so beware of memory leaks: if you no longer use it, you have to release it. After you release it, you lose the Lua handle to it, but display objects using it will not become invalid.
 
I will put some code here to explain how it works.
 
Creating with newTexture:
[lua]
local canvasTexture = graphics.newTexture( {
     type = "canvas"
     , width = 128
     , height = 128
     , pixelWidth = 256
     , pixelHeight = 256
} )
[/lua]
Here, width and height are the dimensions of the canvas you will draw your objects into; basically, the dimensions of the visible world inside the canvas.
pixelWidth/pixelHeight are the size of the underlying texture. If omitted, these values are selected to correspond to the pixel dimensions of a display object with that width/height. You can set them to something small like 32 for cool pixelated effects (with the proper scaling modes), or to some other number, for example to save memory.
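For instance, a pixel-art setup might be sketched like this (assumption: the nearest-neighbour filter keys shown for display.setDefault, which must be set before the texture is created):

```lua
-- Draw in 128x128 world units, but back the canvas with a tiny 32x32
-- texture; nearest-neighbour filtering keeps the big pixels crisp.
display.setDefault( "minTextureFilter", "nearest" )
display.setDefault( "magTextureFilter", "nearest" )

local pixelCanvas = graphics.newTexture( {
    type = "canvas",
    width = 128,       -- world-coordinate size you draw into
    height = 128,
    pixelWidth = 32,   -- actual backing-texture resolution
    pixelHeight = 32
} )
```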
 
After it is created, you can assign it as a fill to other objects. Use canvasTexture.filename and canvasTexture.baseDir pretty much anywhere those parameters are expected, for example in a composite paint or when creating an image rect:
[lua]
local w = 128 -- circle diameter

local circle = display.newCircle( display.contentCenterX, display.contentHeight*3/4, w/2 )
circle.fill = {
    type="composite",
    paint1={ type="image", filename="corona.png" },
    paint2={ type="image", filename=canvasTexture.filename, baseDir=canvasTexture.baseDir } -- magic here!
}
circle.fill.effect = "composite.phoenix"


local rect = display.newImageRect(
    canvasTexture.filename,  -- "filename" property required
    canvasTexture.baseDir,   -- "baseDir" property required
    display.contentWidth,
    display.contentHeight
)

[/lua]
 
To draw into the canvas, use the method canvasTexture:draw(). This takes your display objects off the screen and puts them into an internal queue. You then have to manually update the texture with canvasTexture:invalidate().

This call schedules the queued objects to be rendered before the next frame; after they are rendered, they are moved to the cache.
You can also set a custom background colour. This colour fills the texture whenever it is cleared.
[lua]
canvasTexture:draw( someCircle )
canvasTexture:draw( someRect )

canvasTexture:setBackground( 0,0,0,1 )

canvasTexture:invalidate()
[/lua]
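And when you no longer need the texture, remember to release it. A minimal cleanup sketch (releaseSelf() is the texture resource's release method; the nil-guard is only there so the fragment stands alone):

```lua
-- Free the texture once nothing new will be drawn with it.
if canvasTexture then
    canvasTexture:releaseSelf()  -- release the GPU-side resource
    canvasTexture = nil          -- drop the Lua handle as well
end
```

Display objects already using the texture as a fill keep working after the release.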

Note that the (0, 0) point is in the centre of the canvas.
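So, if you are used to top-left-origin coordinates, offset by half the canvas size. A tiny hypothetical helper (toCanvasCoords is not part of the API, just an illustration):

```lua
-- Convert top-left-origin coordinates into the canvas's centre-origin space.
function toCanvasCoords( x, y, canvasWidth, canvasHeight )
    return x - canvasWidth / 2, y - canvasHeight / 2
end

-- For a 128x128 canvas: the top-left corner (0, 0) maps to (-64, -64),
-- and the centre (64, 64) maps to (0, 0).
```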
 
One thing to keep in mind: when the app is put into the background, Android invalidates all GPU textures, so you will have to redraw your canvas resource. Read the documentation on how to do this.
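A minimal sketch of such a redraw hook, assuming the standard system-event names ("applicationResume" fires when the app returns to the foreground); the Runtime guard is only there so the fragment stands alone:

```lua
-- Re-render the canvas's cached objects after Android has discarded GPU textures.
function needsRedraw( event )
    return event.type == "applicationResume"
end

local function onSystemEvent( event )
    if needsRedraw( event ) then
        canvasTexture:invalidate( "cache" )  -- redraw previously drawn (cached) objects
    end
end

if Runtime then
    Runtime:addEventListener( "system", onSystemEvent )
end
```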
 
Reading the documentation is a good idea in any case; it describes more methods and edge cases:
 
newTexture documentation <-- read me before using
TextureResourceCanvas documentation <-- me too

Canvas texture resources have a lot of fun applications, but also some limitations: nested textures can behave oddly, for example, and native objects will not work at all.

Also, it is not possible to add setPixel/getPixel: textures are stored on the GPU and do not provide direct buffer access.

 
Also, I rewrote the snapshot paint example to use a canvas; it has a fire effect in the middle to demonstrate manual updates. It is attached to this post, along with another example featuring a star and a composite paint.

Hi Vlads

Thanks for this. Great stuff, I will be testing it to destruction shortly.

Matt

So far everything seems fine. The texture seems to be sized a little differently from a normal image, as I needed to alter some settings in the shaders to get everything to line up again.

I will dig deeper into the code and see if anything jumps out as being different, or if it's just my code.

Anyway, I'm processing 2 x 1024 canvases, then sending them to 2 different shaders, each a composite paint at the moment (though that might not be needed now).

After that I take a snapshot of the result for additional processing on screen.

So far, testing on device, I'm not seeing any difference compared to just using 1024-sized images.

Works as far back as an iPod Touch 5, which has the chip below the A7, and I can get 60fps no problem using optimised settings in the shaders.

So, all in all, this is perfect for what I need. It really opens up Corona to a lot of new techniques.

@vlads Seems to blow up with masks. :( Sprites seem okay, in my limited test. (Mixing it into the second example wasn't the best way to test. :stuck_out_tongue: )

Anyhow, I’ve got a ton of (image-based) use cases, so I’ll report if anything turns up.

I should have mentioned: yes, for technical reasons you cannot use a TextureResourceCanvas as a mask (at this point). But you can set a mask on objects that use it as a fill without a problem.

Picking some low-hanging fruit from among my ideas, here’s a thing to paint “time” into a texture:

[lua]
-- UV map thing
--
-- [MIT license: http://www.opensource.org/licenses/mit-license.php]
--

local CW, CH = display.contentWidth, display.contentHeight

local tex = graphics.newTexture{ type = "canvas", width = CW, height = CH }

tex:setBackground(0, 0, 0, 1)
tex:invalidate()

local img = display.newImageRect(tex.filename, tex.baseDir, CW, CH)

img:translate( display.contentCenterX, display.contentCenterY )

local previousX, previousY
local threshold = 2
local thresholdSq = threshold * threshold
local time = 0

local function draw (x, y)
    if time < 256 then
        local o = display.newImage("brush.png", x, y)

        o:setFillColor(time / 256, 0, 0)

        tex:draw(o)
        tex:invalidate("cache")

        previousX, previousY, time = x, y, time + 1
    end
end

local function listener (event)
    local x, y = event.x - img.x, event.y - img.y

    if event.phase == "began" then
        draw(x, y)
    elseif event.phase == "moved" then
        local dx = x - previousX
        local dy = y - previousY
        local deltaSq = dx^2 + dy^2

        if deltaSq > thresholdSq then
            draw(x, y)
        end
    end
end

Runtime:addEventListener("touch", listener)

--
do
    local kernel = { category = "composite", group = "uv_map", name = "basic" }

    kernel.vertexData = {
        { index = 0, name = "t", min = 0, max = 1, default = 0 }
    }

    kernel.fragment = [[
        P_COLOR vec4 FragmentKernel (P_UV vec2 uv)
        {
            P_COLOR float r = texture2D(CoronaSampler1, uv).r;

            if (CoronaVertexUserData.x >= r) return vec4(0.); // check time vs. uv

            return CoronaColorScale(texture2D(CoronaSampler0, uv));
        }
    ]]

    graphics.defineEffect(kernel)
end

local start = display.newCircle(CW * .6, CH * .7, 55)

start:setFillColor(1, 0, 0)

start:addEventListener("touch", function(event)
    if event.phase == "ended" then
        local r = display.newRect(display.contentCenterX, display.contentCenterY, CW, CH)

        r.fill = {
            type = "composite",
            paint1 = { type = "image", filename = "Image1.jpg" }, -- your image here
            paint2 = { type = "image", filename = tex.filename, baseDir = tex.baseDir }
        }
        r.fill.effect = "composite.uv_map.basic"

        img.isVisible, event.target.isVisible = false, false

        transition.to(r.fill.effect, { t = 1, time = 9000 }) -- can get reverse effect going 1 -> 0, of course
    end

    return true
end)
[/lua]

It takes an image of your choice and the brush from the CanvasPaint example (probably should have just used a circle, since the brush has its own alpha and you get a peculiar tapering fade).

When you run it, drag around on the screen to paint in time values, which gradually increase from 0 to 1, going into the red channel. Once it hits 1 it stops painting.

Click the red circle when you’re ready and the effect’s time threshold will transition from 0 to 1, unveiling pixels with the right time values. You can adjust the “// check time vs. uv” line to change the behavior.

This is a pretty boring proof of concept. The idea can get a lot fancier, though.

@vlads I’m attempting something quite a bit more ambitious this time, which involves storing vertices in a texture.

I’ve tried to use vertex texture fetch. GLSL sources suggest this should “Just Work” provided the textures were already bound.

So far I’ve done something like this, in the fragment kernel:

CoronaSampler0 += 8.; // Shader fails to compile, but sampler name shows up in log

to suss out the sampler name. Then, after fixing that, in the vertex kernel:

uniform sampler2D u_FillSampler0;

P_POSITION vec2 VertexKernel (P_POSITION vec2 pos)
{
    // ... stuff

    P_POSITION vec2 new_pos = texture2DLod(u_FillSampler0, uv, 0.0).rg;

    return new_pos; // what our work gave us
}

This kind of thing compiles and runs, but I seem to be consistently getting “black”, suggesting the sampler is unbound.

Are there complications in Corona’s surrounding boilerplate, or am I just missing something obvious here?

(This is on Windows, and I do have a way to detect beforehand that vertex texture units are available, although it's pretty clumsy: basically, a dummy pixel with an auxiliary shader that branches on gl_MaxVertexTextureImageUnits, followed by display.colorSample() and such.)

Also, might legitimate VTF support be a possibility? I think this would bring a lot of power in situations similar to my own.

I don’t see where you could bind this sampler. We bind textures to CoronaSampler0 (and also to CoronaSampler1 if you use a composite paint). Try using CoronaSampler0 instead of u_FillSampler0, and don’t declare it; it’s already declared and bound.

EDIT: I see it, it’s in our sample. Well… basically, CoronaSampler# is a synonym for u_FillSampler#. Use that instead and don’t declare it.

Argh, I must retract what I wrote.  :slight_smile:

This much smaller test

[lua]
do
    local kernel = { category = "generator", group = "custom", name = "vtf_test" }

    kernel.vertex = [[
        uniform sampler2D u_FillSampler0;

        P_UV varying vec3 v_uv;

        P_POSITION vec2 VertexKernel (P_POSITION vec2 p)
        {
            v_uv = texture2DLod(u_FillSampler0, CoronaTexCoord, 0.).rgb;

            return p;
        }
    ]]

    kernel.fragment = [[
        P_UV varying vec3 v_uv;

        P_COLOR vec4 FragmentKernel (P_UV vec2 _)
        {
            return vec4(v_uv, 1.);
        }
    ]]

    graphics.defineEffect(kernel)
end

local r = display.newImage("Image1.jpg")

r.x, r.y = display.contentCenterX, display.contentCenterY

r.fill.effect = "generator.custom.vtf_test"
[/lua]

is giving me exactly what I would expect (a blend of the four corner colors) for a variety of images, so apparently it DOES work. Time to restore some earlier code!

(Actually, it doesn’t even seem picky about the name, only that there’s a sampler there… and their order, I suppose, with two? I’ll dig into the spec, when I have a chance.)

Would another option for system.getInfo() that returns GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS be possible?

@vlads Thanks for the help. I managed to achieve what I wanted, and more. I’ve just put up a post about it here: https://forums.coronalabs.com/topic/61473-mode-7-3d-at-60fps/

@vlads Well, I’ve got a good proof of concept going for what I was after: mesh test

It seems to be working well on Windows desktop but on somebody else’s OSX machine it’s not.  :frowning: (On the latest daily build, too.) His screenshots from OSX were consistent with the texture not being read. I haven’t attempted on a device.

This basically tests two approaches. In the first, when vertex texture fetch is available, positions and uvs are stored in a texture. Otherwise, positions (which must change) are stored in vertex userdata and uvs plus offsets-to-uvs are stored in the texture. Both techniques are used to sculpt some boring stock triangles into arbitrary ones dynamically.

Actually, if anybody else reading this would care to give the linked code a spin on Windows and / or OSX desktop (or device, I guess, though I’m not sure it’s ready) and confirm (or not) what I said above, it’d be most appreciated!

Here you go, latest OSX.  Untitled_3.png

This is what I got

@vlads Yours is what I’m hoping to get. Probably a dumb question, but is that also from the simulator? (Just noting the differences at the top from Matthew’s shot.)

What Matthew has matches what I was shown earlier. The vertices in that case come from userdata, whereas the uvs should be in the texture but seem to resolve to 0. (I think.) If the same thing’s happening with the other triangles, the vertex information is all 0 too and so we have a zero area triangle.

I’m stumped.  :huh:

I got around to working on this on Android and was running into the same problems. On investigation, it looks like the shader compiler on Windows was overly lenient about the ordering of certain keywords, namely varying and the precision qualifier. (When it comes to out params, the others are the offenders. Go figure.) With that I have it running on an older tablet.

Could I get another couple tests on OSX, to see if this fixed the issues? I’d be interested in any iOS results, as well. (Same link as before.)

There are now three instances of the mesh showing at once, though the right-hand one should gracefully go transparent when vertex texture fetch is unavailable. This is probably unlikely on desktop, but a decent possibility on mobile.

Each of these three implements a different technique. The newest is a fairly brute force approach, but its vertices are able to go offscreen. The other two use some encodings that depend on coordinates being in the range [0, 1024)… to fix that I either need to write a ton of shaders for the corner cases or perform clipping, which would really complicate the internals. For now I’m just clamping in these cases.  :stuck_out_tongue:

I’ve added some touch controls to move the vertices around (the blue circles on one of the meshes) as well as the texture coordinates (the red ones, below).

Thanks for any help!

We are converting snapshot images into PNG files via Flash and using them as image files; we have had some success with scaling small, simple objects. We have not looked into any bump mapping yet. We are testing the idea of setting these up in sprite sheets for object textures, and are going to try building the sprite sheets using TexturePacker.

Hi Steve!

I can’t begin to explain how excited I am about this!

I tested the mesh_test and here is what I got running Version 2015.2802 (2015.12.31) on my brand new iMac:

Simulator:

Only two “meshes” and the one that I can edit is grey (like Matthew had). The other one is not grey.

iPhone 6 plus:

Three meshes and none of them are grey. It seems to work!!!

OSX:

Same as simulator.

-Tom

I downloaded latest daily build 2824 and tested again with the same results.

@sirmania

Well, it sounds like there’s at least enough to work with.  :)  In practice, there could be noticeable speed differences among the techniques; for now, all I have is this tiny program, so I can’t say.

I have some vague ideas where the problem might be happening, but will need to do some digging.

If you add

[lua]
mesh.DoTest(100, 100, 1, 0, 0, -- Look for a red pixel at (100, 100)
    function(vtf)
        print("Supports VTF?", vtf)
    end)
[/lua]

do you get true or false? If false, that will explain the missing parts. (I’m getting mixed results trying to learn which architectures support vertex texture fetch, sometimes blaming the driver as well.)

The new public release gave me the same issues on Windows, which seem to have been fixed by amending my half-texel offsets (these were my “some vague ideas”).

This might have fixed the issues on OSX. If I could get another set of eyes on it, that’d be great!

It works now both in the Mac simulator and as an OS X app :smiley:

I used the latest build AND a previous build (2808), and it worked on both… Great stuff!

-Tom