Great work! Polygons are cool! Fill is cool as well.
We need antialiasing! It's a critical feature. Without it, all vector stuff just doesn't look right.
It's great that touch listeners on polygons are limited to their area, but it's not perfect when a stroke is involved.
Add support for rounding the sharp corners of polygons and polylines. Not critical, but would be really nice.
Direct manipulation of vertices, like poly.vertices = {{0, 2}, {3, 4}} or just {0, 2, 3, 4}. Maybe it would require writing one line above it first, something like poly.vertices = poly:getVerticesTableByReference().
Set display.newImage() or other display object as a fill for anything.
DIRECT PIXEL CONTROL. If we can render to texture now, we must be able to generate images on the fly in Lua. Create a new blank image (or use an already opened one, which would require image:copy() or something similar to preserve the old texture for all other image objects) and manipulate its pixel values with obj:getPixel() and obj:setPixel().
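To make the idea concrete, here is a sketch of what that could look like. Every name here (display.newBlankImage, setPixel, getPixel) is a proposal; none of it exists in Corona today:

```lua
-- Hypothetical API sketch: none of these functions exist in Corona yet.
-- Generate a horizontal grayscale gradient on the fly.
local img = display.newBlankImage( 256, 256 )      -- proposed constructor

for y = 0, 255 do
    for x = 0, 255 do
        local v = x / 255
        img:setPixel( x, y, v, v, v, 1 )           -- proposed setter: r, g, b, a
    end
end

local r, g, b, a = img:getPixel( 128, 128 )        -- proposed getter
```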
Custom filters in Lua. (solvable with number 6)
Ability to rasterize effects (make changes permanent at runtime).
It would be really cool if you could just implement something similar to the basic functionality of the Python Imaging Library: http://effbot.org/imagingbook/image.htm
Thanks for the feedback! Please give use cases, as this will help us prioritize, not to mention help us verify we're talking about the same thing!
In terms of raster/texture/pixel stuff, the challenge we have is to make everything work in real time on the GPU. This is very different from the old Flash days, where all graphics work was done on a general-purpose CPU. A GPU pipeline is designed to optimize rendering to the screen, so in GPU-land, direct pixel accesses kill performance. In fact, this is one of the reasons why Flash performance is so bad on mobile.
With that in mind:
#3: That’s on the list, but it falls under convenience functions. Today, you have the ability to create any polygon you want, including ones with smoothed corners.
#4: Still searching for the right API design here…
#5: Yes, this is in essence a generalization of snapshot objects, so we’re not there yet.
#6: Render to texture is very different from the direct pixel access you are talking about. When you create new textures with rtt, the goal is to avoid memory accesses between CPU and GPU-land — they are all in the GPU.
#7: Custom filters would be done via a filter kernel written in GLSL, not via direct pixel control.
#8: I’m not sure what you mean. All filter effects are rasterized in real time.
#9: If you mean build textures directly from display objects, then yes. And in some ways, this may let you achieve many of the things you might otherwise want with direct pixel control.
#6 - Is this on the roadmap? I think that I, like many others, assumed it would be, given Corona's original use as a mobile Photoshop app (back in the day). But I can see that for games it would be almost immeasurably hard to make performant.
Thanks for the answers! I see the problem with the GPU land.
Firstly, I am very glad that antialiasing actually made it! I suppose with time it’s gonna be supported in the simulator as well.
#3 Yeah, that's why it's not critical. It would just simplify some things on the developer's side.
#4 I think it's wise to use tables, because you can run transitions on them, and it's easy to store and restore the whole table. Since not a whole lot of people are going to use this feature, my proposal is a function that exposes the table to the Lua side, like obj:getVerticesTable(). It might use some metamethods to detect changes and update the object. And I am totally fine with :getVertex() and :setVertex() if nothing better can be made.
Performance-wise it is better to keep such a vertex table flat (without nesting); that's totally fine too.
Another option with the table would be to skip metamethods and directly tell the object to update itself, like obj:updateVertices().
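The metamethod approach can be sketched in plain Lua: the table returned by the hypothetical obj:getVerticesTable() would be an empty proxy whose metamethods forward reads to the real flat vertex array and flag the object as dirty on writes:

```lua
-- Sketch only: getVerticesTable() is a proposed API, and _vertices/_dirty
-- stand in for whatever internal state the engine actually keeps.
local function makeVertexProxy( obj )
    local proxy = {}
    setmetatable( proxy, {
        __index = function( _, i ) return obj._vertices[i] end,
        __newindex = function( _, i, v )
            obj._vertices[i] = v
            obj._dirty = true        -- re-tessellate on the next frame
        end,
    } )
    return proxy
end

local poly = { _vertices = { 0, 2, 3, 4 }, _dirty = false }
local verts = makeVertexProxy( poly )
verts[1] = 10                        -- write goes through; poly._dirty is now true
```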
#6 and #9 OK, direct pixel access on the GPU is hard. But I want to be able to prepare/generate images at runtime in Lua (on the CPU) and then load such images into the GPU. I don't believe this would be hard. Something like loadImageFromString() or FromTable(). There would be a new object type: an in-memory bitmap. It's better to implement it in C rather than using Lua tables for big images, although that would work too.
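As a sketch, the proposed string-based loader could take raw RGBA bytes built on the CPU. The constructor name and signature below are invented for illustration:

```lua
-- Hypothetical API sketch: graphics.newBitmapFromString() does not exist.
-- Build a 64x64 opaque-red image as a string of raw RGBA bytes.
local w, h = 64, 64
local pixels = {}
for i = 1, w * h do
    pixels[i] = string.char( 255, 0, 0, 255 )   -- r, g, b, a for one pixel
end

local bitmap = graphics.newBitmapFromString( table.concat( pixels ), w, h )  -- proposed
local img = display.newImage( bitmap )          -- then use it like any other texture
```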
With that in mind, I would love to perform various manipulations on such images: copy region, paste, overlay, resize, crop, compute histogram, apply a function to each pixel, transpose, fill, apply some built-in filters like edge detection, contrast increase, and so on. That would make for convenient digital image processing in Corona (like OpenCV; see https://github.com/marcoscoffier/lua—opencv). That would be really cool!
Such a big image-manipulation library should definitely become a plugin. Maybe even integrating OpenCV is not a bad idea, and the plugin could be released as open source.
#8 I mean that I can take a *snapshot* of a filtered image and use it as the base for the next filter. Maybe load it into an in-memory bitmap in CPU land. Chaining filters.
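Something like this, assuming snapshot objects and the built-in filter names from the docs (untested sketch, needs the Corona runtime):

```lua
-- Rough chaining sketch: filter the source image, capture the result in a
-- snapshot, then apply a second filter to the snapshot itself.
local snapshot = display.newSnapshot( 256, 256 )

local photo = display.newImage( "photo.png" )
photo.fill.effect = "filter.grayscale"           -- first filter on the source
snapshot.group:insert( photo )

snapshot.fill.effect = "filter.blurGaussian"     -- second filter on the captured result
```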
#6,#9, our focus right now is to give you image processing effects on the GPU via filters. We’ll explore other ways to load CPU-generated images later. My guess is the only way to do this in a high-performance way is to offer such access via a native C API.
#8, yes, you can use that to chain. But we are working on something better.