Round rect aliasing

I was wondering if there is a way to change the way the rounded rect renders its corners?

I’m testing on two devices: one with a 720p screen and one with a 1440p screen (a Samsung Galaxy S7).

On the high-res screen, round rect corners lose their smoothness when rendered with a large corner radius. In the image below you can see that each corner is made up of 4 lines, with the corner radius being half the rect height.

[image: 5c27b8d5c6.png (each corner drawn with 4 line segments)]

To be clear, I want to increase the number of lines/points used when rendering each corner.

I’d rather not do this manually with a rounded-side texture and rects or something.
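For reference, a minimal case along these lines (the sizes are just illustrative values I picked) shows the faceting on the high-res device:

-- Illustrative repro: a rounded rect whose corner radius is half the rect height.
local rect = display.newRoundedRect( display.contentCenterX, display.contentCenterY, 400, 120, 60 )
rect:setFillColor( 1, 1, 1 )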

I’m on build 2016.2828

Thanks for your time.

Disabling anti-aliasing is discussed in the thread below:

https://forums.coronalabs.com/topic/62430-pixelization-anti-aliasing-issue

I’m not sure if this really has to do with anti-aliasing.

I was talking about the subdivisions the graphics renderer makes to plot the curve for each corner.

The number of subdivisions is the value I want to be able to increase.

It seems like you are talking about the effect that anti-aliasing would have: a smoother curved line. If not, my apologies for the misunderstanding.

No apology needed; it’s probably my fault for having the word aliasing in the title.

I’m currently working on a function to generate a rounded rect with the option to specify subdivision count, using display.newPolygon.
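Roughly, the idea is to build the vertex list for display.newPolygon by sampling each quarter-circle corner. Something like this sketch (buildRoundedRect and segmentsPerCorner are my own names, not Corona APIs):

local function buildRoundedRect( x, y, width, height, radius, segmentsPerCorner )
    local vertices = {}
    local halfW, halfH = width * 0.5, height * 0.5

    -- Corner circle centres and the start angle (in degrees) of each
    -- 90-degree arc, walking clockwise from the top-right corner.
    -- Corona's y axis points down, so 90 degrees points toward the bottom.
    local corners = {
        { cx =  halfW - radius, cy = -halfH + radius, startAngle = -90 },  -- top-right
        { cx =  halfW - radius, cy =  halfH - radius, startAngle =   0 },  -- bottom-right
        { cx = -halfW + radius, cy =  halfH - radius, startAngle =  90 },  -- bottom-left
        { cx = -halfW + radius, cy = -halfH + radius, startAngle = 180 },  -- top-left
    }

    for i = 1, #corners do
        local c = corners[i]
        for s = 0, segmentsPerCorner do
            local angle = math.rad( c.startAngle + 90 * s / segmentsPerCorner )
            vertices[#vertices + 1] = c.cx + radius * math.cos( angle )
            vertices[#vertices + 1] = c.cy + radius * math.sin( angle )
        end
    end

    return display.newPolygon( x, y, vertices )
end

-- Example: radius is half the height, 12 segments per corner.
local roundedRect = buildRoundedRect( display.contentCenterX, display.contentCenterY, 400, 120, 60, 12 )

Since the shape is convex, the polygon triangulation should be well behaved.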

After I get that working, I’ll probably try to extend it to measure the pixel length of each corner’s curve between its start and end points and have it automatically determine how many subdivisions to make, so it scales proportionately between display sizes. This also makes me wonder: how many points in a polygon would it take to start causing lag on a particular device? I may have to do some benchmarking, but that’s getting off-topic…
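In case it’s useful, here’s a sketch of the heuristic I have in mind (the helper name and the 4 px-per-segment target are my own guesses, not Corona settings): convert the radius to device pixels, take the quarter-circle arc length, and divide by a target segment length.

local function segmentsForRadius( radius, targetPixelsPerSegment )
    targetPixelsPerSegment = targetPixelsPerSegment or 4
    -- Convert the content-unit radius to device pixels so higher-resolution
    -- screens get more segments for the same content size.
    local pixelsPerContentUnit = display.pixelWidth / display.actualContentWidth
    -- A quarter-circle arc has length (pi / 2) * r.
    local arcPixels = ( math.pi / 2 ) * radius * pixelsPerContentUnit
    return math.max( 4, math.ceil( arcPixels / targetPixelsPerSegment ) )
end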
