Corona is the only runtime not supporting meshes?

@Rob Miracle Fair enough. Any clues as to why this request has been upgraded to “planned”?

Because I set it to planned for tracking purposes. Normally each week in the team engineering meeting I present several items from the Feedback site to the team, asking: Yes, we will do this, Maybe, or No. Usually when I get a yes, a work ticket is created in the task manager we use and gets assigned to an engineer. When that happens, I set the feedback request to “Started”. But with this one, it’s “we want to do this”, but it’s not assigned to an engineer yet. It’s in a limbo state and I need a way to track it, so I marked it “planned”, which is exactly what “planned” should mean: we intend to do it, but we’ve not worked it into the schedule.

Because things come up and change our priorities, “planned” can’t be a promise. It’s simply something we want to do.

Thanks for the insight. Sounds like a good plan.

I’m super excited about this!

Thanks to Steven, Vlad and anyone else that has worked on this  :rolleyes:

Steven sent me sample code that blew me away :blink: Spine meshes working in Corona  B)

That guy is a star  :rolleyes:

@sirmania sounds promising :slight_smile:

As a bit of an update, the changes in the (non-Corona) spine-lua parts have made it into the runtimes. I haven’t yet submitted the Corona-specific part, but that’s all in the spine.lua file in the link I posted earlier.

It might still need a few minor tweaks but should be usable. As I mentioned in another post, I doubt the (planned) Corona API will differ too drastically; in any case, any such changes should be limited to the library. (As an aside, I’m already brainstorming how I could repurpose this stuff, once the mesh burden is lifted. :))

If you do animations with non-skinned meshes, there’s still a looping glitch. In hindsight I was probably looking in the wrong place for it; got a little tunnel vision after puzzling out skinned animations. If I feel motivated one day maybe I’ll try to track it down. Apart from that, I’m not aware of anything major (there might be a render order issue, but that ought to be an easy fix).

Exciting progress. Thanks for everyone’s efforts and voting. Rob and team, we look forward to your upcoming efforts. Thanks, Nick (kadlugan/online2)

FWIW, I posted some info on the feature request.

Thanks Nate.

Just curious to clarify what you said in the feature request comments:

  1. Are you offering some general advice that the implementation of meshes in Corona (when/if that starts happening) should favour a single scene graph node, as it’s more efficient?

  2. Or are you commenting on work that’s already been done (in Corona or by StarCrunch), i.e. that its individual scene node approach is not ideal?

Yes, to both. I haven’t had a chance to look into StarCrunch’s work, but using individual scene graph nodes is the only option because Corona doesn’t have an API to render meshes. An API that did would be relatively simple:

clear()
Clears all vertex data.

add(texture, vertices, uvs, triangles, r, g, b, a, additive)
This stores vertex data that will be rendered later. Parameters:

texture is the image to use to texture the mesh.
vertices is an array of x,y pairs for each vertex.
uvs is an array of u,v pairs for each vertex (matches the vertices array).
triangles is an array of vertex index triplets.
r,g,b,a is the color to tint all vertices.
additive if true, the mesh should be drawn with additive blending.

Let’s call the Corona object that has these two methods a “MeshBatch”. The Spine runtime would go through the attachments and call add() for each one. The add() method would store (copy) all the parameters in a list. Presumably the MeshBatch has a single, large VAO (or VBO). When the MeshBatch needs to draw, it goes through the list, copies the vertex data into the VAO, and renders it.

This API is simple to use and allows the MeshBatch to handle texture and additive blending changes. If the MeshBatch encounters a texture (or additive blending if PMA is not used) that is different from the last texture, it renders the VAO and starts batching for the new texture.
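To make the shape of this concrete, here is a minimal Lua sketch of such a MeshBatch. None of these names are real Corona API; `renderBatch` stands in for whatever low-level draw call the engine would perform internally.

```lua
-- Hypothetical MeshBatch sketch; all names are illustrative, not real Corona API.
local MeshBatch = {}
MeshBatch.__index = MeshBatch

function MeshBatch.new ()
  return setmetatable({ items = {} }, MeshBatch)
end

function MeshBatch:clear ()
  self.items = {} -- throw away all stored vertex data
end

-- Store (copy) the vertex data; nothing is drawn yet.
function MeshBatch:add (texture, vertices, uvs, triangles, r, g, b, a, additive)
  self.items[#self.items + 1] = {
    texture = texture, vertices = vertices, uvs = uvs,
    triangles = triangles, r = r, g = g, b = b, a = a,
    additive = additive,
  }
end

-- At draw time, walk the list and flush whenever the texture or blend mode
-- changes, so runs that share a texture become a single render call.
function MeshBatch:draw (renderBatch) -- renderBatch(texture, additive, items)
  local batch, texture, additive = {}
  for _, item in ipairs(self.items) do
    if item.texture ~= texture or item.additive ~= additive then
      if #batch > 0 then renderBatch(texture, additive, batch) end
      batch, texture, additive = {}, item.texture, item.additive
    end
    batch[#batch + 1] = item
  end
  if #batch > 0 then renderBatch(texture, additive, batch) end
end
```

With a skeleton whose attachments all sit on one atlas page, this degenerates to a single flush per frame, which is the point of the design.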

Many game toolkits use exactly what I’ve described. Often they are able to draw immediately when add() is called, if the texture or blending changes. Examples of that are:

libgdx, Java:
https://github.com/libgdx/libgdx/blob/master/gdx/src/com/badlogic/gdx/graphics/g2d/SpriteBatch.java
Starling, AS3:
https://github.com/EsotericSoftware/spine-runtimes/blob/master/spine-starling/spine-starling/src/spine/starling/PolygonBatch.as
cocos2d-x, C++:
https://github.com/EsotericSoftware/spine-runtimes/blob/master/spine-cocos2dx/3/src/spine/PolygonBatch.cpp

Corona’s MeshBatch would need to store the data passed to add() because it isn’t drawing until later. Other game toolkits also do this, e.g.:

XNA, C#:
https://github.com/EsotericSoftware/spine-runtimes/blob/master/spine-xna/src/MeshBatcher.cs

XNA’s API is slightly fancier than what I described because it pools “MeshItem” objects. Instead of add() you call NextItem() and configure the returned MeshItem with the vertex data. After it draws, it clears the MeshItems, but Corona’s API would keep the MeshItems until clear() is called.

Anyway, Corona could either provide an API like the above (which is generally useful for mesh rendering even without Spine), or a single scene graph node that renders an entire Spine skeleton. Since a single node would do the batching at draw time, it could draw immediately when add() is called (like libgdx, Starling, cocos2d-x) and wouldn’t need to cache as much or store it for as long (like XNA).

The other thing I’d like to see for Corona is a texture atlas with named regions. Using numbers for regions is extremely limiting.

Hi Nate.

The possibility of swapping attachments in and out was one of the reasons I’ve been hesitant to PR the parts specific to Corona (I’m the same  ggcrunchy from GitHub, by the way). I did try to follow the approach used by images, but could only really guess about how it would all unfold in practice.

The technique I used was to generate all the triangles I would need and stuff them into a group. Every triangle has the same geometry: (0, 0), (1, 0), (0, 1). The x- and y-coordinates are supplied to them as shader inputs, aggregated into arrays. Ditto the uvs.

In the vertex shader, the index is resolved as x * 2 + y (or maybe the other way around, I forget), so 0, 1, or 2. This is used to look up the x and y, which the shader emits, and the uv, which is passed along to the fragment shader. From there it’s business as usual.
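For illustration, the x * 2 + y resolution done in the vertex shader can be mimicked in plain Lua (the position values here are made up; in the shader they arrive as aggregated array inputs):

```lua
-- The three local corners of every triangle: (0, 0), (1, 0), (0, 1).
-- x * 2 + y maps them to the distinct indices 0, 2 and 1, which are then
-- used to pick that corner's real position (and uv) out of the arrays.
local function ResolveIndex (x, y)
  return x * 2 + y
end

local corners = { { 0, 0 }, { 1, 0 }, { 0, 1 } }

-- Pretend shader inputs, one entry per resolved index (GLSL is 0-based,
-- hence the +1 when indexing a Lua table):
local xy = { { 10, 20 }, { 30, 40 }, { 50, 60 } }

for _, c in ipairs(corners) do
  local index = ResolveIndex(c[1], c[2])
  local p = xy[index + 1]
  print(("local (%d, %d) -> index %d -> position (%d, %d)"):format(
    c[1], c[2], index, p[1], p[2]))
end
```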

On mesh creation, I build a vertex-to-triangle lookup data structure. When a position or uv is updated, the change is propagated to all affected triangles.
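A sketch of that bookkeeping, assuming triangles are stored as a flat list of vertex index triplets (all names here are illustrative, not the actual library code):

```lua
-- Build vertex -> { (triangle, corner), ... } so a single vertex update
-- can be pushed to every triangle that shares that vertex.
local function BuildVertexToTriangles (triangles) -- flat list of index triplets
  local lookup = {}
  for ti = 1, #triangles, 3 do
    for corner = 0, 2 do
      local vi = triangles[ti + corner]
      lookup[vi] = lookup[vi] or {}
      local entries = lookup[vi]
      entries[#entries + 1] = { tri = (ti - 1) / 3 + 1, corner = corner + 1 }
    end
  end
  return lookup
end

-- Propagate a position (or uv) change for vertex vi to all affected
-- triangles' per-corner data.
local function UpdateVertex (lookup, tri_positions, vi, x, y)
  for _, entry in ipairs(lookup[vi] or {}) do
    tri_positions[entry.tri][entry.corner] = { x, y }
  end
end
```

The redundancy mentioned above shows up here directly: a vertex shared by n triangles gets written n times on every update.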

This is a bit redundant and obviously not as performant as a dedicated solution would be, but does the job. In small meshes we should be below the break-even point anyhow. (I have two alternative techniques that forgo the lookup shenanigans, storing the uvs, and in one case the positions as well, in textures. This allows for batching, but imposes some annoying limits. Namely, they need to satisfy some GLSL limits, which constrain x and y to being integers in [0, 1024]. Some of this can be avoided by shifting the local coordinates, but that falls apart if the mesh itself has dimensions in excess of 1024.)

I actually assumed color worked at a finer granularity (e.g. per vertex), so was meaning to do some refactoring. If not, both that and the blend mode would simply entail iterating the triangles and setting the appropriate property.

The texture switch might be a bit more brutal. I think swapping the fill will wipe all the shader properties. (Maybe I could pester a little bit to get some support for reading out the properties. I’m doing this in a slightly not-yet-official way…)

A clear could be effected by just throwing away this whole data structure, really. At the moment this won’t be terribly fast, but some judicious recycling of triangles ought to mitigate that. For one, the last paragraph’s concern would be irrelevant.

I never looked into the atlas thing, simply because I didn’t have a test case (kind of a vicious cycle  :)) but I think this could be a simple wrapper over a few other things.

Agreed about the general usefulness of meshes. Sadly, I’ve been distracting myself with other little projects and haven’t gotten around to playing with this stuff more generally.

First off, a big thanks to StarCrunch for getting meshes working.

I have been working with the library from StarCrunch to use meshes on my project, but I am running into an issue. I need to be able to scale the animated character, but when I try scaling the character, everything scales except for the parts that are meshes. Maybe I am missing something, but is there a way to get the meshes to scale with the rest of the character? Or at least get a handle for the meshes to scale them manually? Hopefully this makes sense. 

Thanks!

@tymadsen

As I said in the previous post, the underlying geometry is a soup of triangles with local coordinates: (0, 0), (1, 0), (0, 1). I take advantage of the fact that these will also happen to be the texture coordinates. In the mesh code, in impl/unbatched.lua, you’ll see this line:

return corner + pos - CoronaTexCoord;

CoronaTexCoord will match one of the expected local coordinates. It’s subtracted from pos (Corona’s calculated position for the corner) since we don’t actually care about this little local triangle: we only needed something that wasn’t degenerate. So all three corners will map to the content coordinate of (0, 0), and then corner is added to that.

Scaling will slightly break this assumption. It should already be baked into pos (are the meshes scaling, but wrongly?), so you’d need to scale the correction as well as the corner’s contribution. Something like:

return pos + (corner - CoronaTexCoord) * scale;
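A quick numeric sanity check of that arithmetic, off the shader and in plain Lua (all numbers made up; `Transformed` is just an illustrative stand-in for what the vertex shader computes):

```lua
-- origin: content position of the triangle's local (0, 0)
-- tc:     one of the local corners (0, 0), (1, 0), (0, 1), i.e. CoronaTexCoord
-- corner: the mesh vertex position supplied as a shader input
local function Transformed (origin, scale, tc, corner)
  -- Corona's computed position for this corner, with scale already baked in:
  local px, py = origin.x + scale * tc[1], origin.y + scale * tc[2]
  -- pos + (corner - CoronaTexCoord) * scale:
  return px + (corner.x - tc[1]) * scale, py + (corner.y - tc[2]) * scale
end

local origin, scale, corner = { x = 100, y = 200 }, 2, { x = 30, y = 40 }

for _, tc in ipairs{ { 0, 0 }, { 1, 0 }, { 0, 1 } } do
  -- The scaled local offset cancels out, so all three corners resolve to
  -- origin + corner * scale = (160, 280).
  print(Transformed(origin, scale, tc, corner))
end
```

The cancellation is the whole trick: the degenerate local triangle drops out, and only the real mesh vertex (scaled) plus the group's origin remains.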

There’s still some unused space in the mesh data structure, so this might be possible to hack in. My thinking is, in either Populate() or ResolveIndices() in that same file, you could sneak in a little dummy rect:

mesh.dummy = display.newRect(mesh, 0, 0, 1, 1)
mesh.dummy.isVisible = false

then, in UpdateUV(), before the loop, check its size:

local w, h = mesh.dummy.contentWidth, mesh.dummy.contentHeight

and in the loop itself replace the lines

for i = 7, 9 do uvs[i] = 0 end

with

uvs[7], uvs[8], uvs[9] = w, h, 0

to spam its scaling to all triangles. I believe the uvs are constantly updated so this should stay in sync.

(I used a 3x3 matrix to be able to randomly access the corners’ three coordinates but had no use for the third column.)

Finally, amend that line earlier to:

return pos + (corner - CoronaTexCoord) * vec2(u_UserData3[2][0], u_UserData3[2][1]);

and cross your fingers.  :slight_smile:

This is all off the cuff (it’s a bit late here) but seems at least plausible to me. In theory a similar idea could be used to effect rotations, but that’s left as an exercise for the reader.

@StarCrunch

Thanks for the reply! I tried your suggestion, and the mesh is scaled initially, but it is not updated while animations are going if the scale is changed. I tried multiplying the height and width of the dummy rect by a scale I pass into the UpdateUV function, but to no avail. Maybe I am trying to scale the wrong thing here? 

I am not familiar enough with the code you have written to understand really what is going on with the UVs and triangles. I am fairly new to using spine and meshes, so most of the technical stuff goes over my head. Anyhow, I appreciate you taking the time to help. For now I will have to find a different solution.

Thanks again.

@tymadsen If I find a chance maybe I’ll putter around with it, though it might be a while. Seems like a reasonable thing to have available.

I’m curious now. Went to the Spine site and there is a Corona run-time. Has anyone ever used this successfully with Corona?

http://esotericsoftware.com/

Yes… Many of us have

Aaaah!

I need meshes for most of my animations in my current project. Otherwise they will look like *bliiip*. And I have coded too much to start over again in another SDK. What to dooo…?! Blah  :( When I started my project, I didn’t know Corona was the only runtime not supporting meshes.

I hope @roj will hire @StarCrunch too. I bet he would fix this quickly with his beautiful mind  :slight_smile:

Please vote for meshes here:

http://feedback.coronalabs.com/forums/188732-corona-sdk-feature-requests-feedback/suggestions/8614108-add-meshes-support-for-spine

Great to see this has been started on http://feedback.coronalabs.com/forums/188732-corona-sdk-feature-requests-feedback/suggestions/8614108-add-meshes-support-for-spine

Will keep an eye on developments. Any word as to how long this will take?

We can’t provide dates. Got a lot of testing to do.

Rob