Why the fuss with config.lua and variable aspect ratios? (Alternative code inside)

This is the latest article on config.lua:

http://coronalabs.com/blog/2013/09/10/modernizing-the-config-lua/

I remember reading the precursor to this and being a bit ‘eh?’, but now they’ve released a new article and I’m struggling to understand what the issue is with config.luas regarding screen resolutions.

In particular what are they trying to achieve with the variable set up?

They talk of doing this to enable using fullscreen backgrounds (of which they have several), but then you need to make your code work with variable sized windows (the idea of changing the resolution of an app just to match a background image strikes me as absurd).

I have what I believe to be a much simpler method, that works on any aspect ratio.

The core of the config.lua would be:

[lua]application =
{
    content =
    {
        width  = 320,
        height = 480,
        scale  = "letterbox",
        xAlign = "left",
        yAlign = "top",
    },
}[/lua]

Naturally you’d add your image suffixes for content scaling as you want, and change the default resolution to whatever you want as the minimum area.

Now, how does this handle changing aspect ratios?

Well, it doesn’t directly, but therein lies the power. What it will do is stick a 320x480 area in the top left of your device display, stretched as big as possible without overlapping the borders (so you might get borders on the right side, or the bottom).

So now what? Well, it is easy. Just add the following code to your main.lua:

If running a portrait app:

[lua]_W = math.floor( display.actualContentWidth + 0.5 )
_H = math.floor( display.actualContentHeight + 0.5 )[/lua]

If running a landscape app:

[lua]_W = math.floor( display.actualContentHeight + 0.5 )
_H = math.floor( display.actualContentWidth + 0.5 )[/lua]

And now your globals _W and _H are the ACTUAL number of pixels you have for width and height, so instead of doing display.contentWidth you’d use _W etc. And… that’s it really, it just works. Far simpler, and you always have a guaranteed size on at least one axis.
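If you’d rather not maintain two versions by hand, here’s a rough sketch of one way to pick the pair at runtime (my own idea, untested, and assuming system.orientation reports a sensible value at startup):

[lua]-- Sketch: derive _W/_H from the reported orientation, so the portrait
-- and landscape variants above collapse into one snippet.
local w = math.floor( display.actualContentWidth + 0.5 )
local h = math.floor( display.actualContentHeight + 0.5 )
if string.find( system.orientation or "", "landscape" ) then
    _W, _H = h, w -- landscape: swap, as in the second snippet above
else
    _W, _H = w, h -- portrait: use them as-is
end[/lua]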

I’ve been using this approach for ages now and never had any problems at all regardless of device resolution or aspect ratio, and I don’t really get all the fuss about complicated config.lua.

Would welcome feedback, as I can’t help feeling I’m missing something… :slight_smile:

I tried substituting your code above, and it made my screen view about 3 times its normal size, causing all objects to be very small. The _W and _H didn’t really set anything in the correct places either.

If we originally patterned our config files after the “ultimate config.lua” post best practices, would we need to re-evaluate our code to fit your examples above?

Well, did you set the width and height in the config.lua to what you want as your ‘minimum’? I think most people nowadays assume a larger ‘default’ resolution but I still work at the old iPhone resolution and let content scaling take care of the rest.

Basically what my code will do is give you a resolution that is 0,0 in the top left, and extends to _W, _H, which will be the resolution closest to what you specify in the config.lua, while filling the entire screen and without distortion of aspect ratios.

What it does mean is you need to make your app position stuff relative to the _W, _H screen size, but my understanding of the ultimate config.lua file is that you end up with different resolutions depending on the device anyway - so using it your code must already handle variable resolutions.

If you set up the code as I have it above, then say you want to position something at the top right, it would be x, y = _W, 0. Bottom right would be x, y = _W, _H etc, which I figure most people are already doing (or some variant thereof), unless they are strictly using letterbox with borders.
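To make that concrete, a minimal sketch (the image name is just a placeholder, and the exact offsets depend on your anchor/reference-point settings):

[lua]-- Pin a text label to the top-right and an image to the bottom-right,
-- whatever the device resolution turns out to be.
local score = display.newText( "Score: 0", 0, 0, native.systemFont, 16 )
score.x, score.y = _W - score.width * 0.5, score.height * 0.5

local button = display.newImage( "button.png" ) -- placeholder asset name
button.x, button.y = _W - button.width * 0.5, _H - button.height * 0.5[/lua]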

I’ve never touched these config files, as they just seem too complicated for what they are trying to achieve, but I believe if your code already works on variable resolutions, then you’d just have to replace display.contentWidth with _W and display.contentHeight with _H throughout the project and it should just work.

If you had a simple example of a project using a complex config file you don’t mind me looking at, I’d happily see what was required to make the change, so we could determine if it was actually worth it or not in general.

Here are some updated screenshots showing the difference in config files:

normal:

application = { content = { width=320, height=480, fps = 60, scale = "letterbox", }, }

updated:

application = { content = { width = 320, height = 480, scale = "letterbox", xAlign = "left", yAlign = "top", }

Seems kinda strange that it stretches so much even though basically nothing has changed. Whadda ya think?

That is extremely odd - but then the first one isn’t using any complicated config file.

In-game how do you calculate the width and height of the display - display.contentWidth etc?

You used the other code I listed to create _W and _H right? Did you use the right one for landscape mode? It looks to me like it may have the _W and _H values mixed up - note for landscape you set _W to actualContentHeight, not width, and ditto for _H.

On second look, I had a curly bracket out of place :frowning: The “updated” config build looks the same as the “normal” build. 

I have another project I want to try this on, because if it works, I will have pulled all my hair out for nothing!

I agree Rakoonic, and this is exactly how I do things. I would much rather work from the top left like I’m used to. Especially with all the headers and things like that. As long as you know the actual screen dimensions then people should size and place objects according to percentages of the _W & _H. Just makes more sense to me.
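For instance, a quick sketch of a percentage-based header (assuming the _W/_H globals from earlier in the thread; positioning details depend on your reference-point settings):

[lua]-- Header bar: full width, 10% of the actual screen height, centred at the top.
local header = display.newRect( 0, 0, _W, _H * 0.1 )
header.x, header.y = _W * 0.5, _H * 0.05[/lua]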

The only slight ‘issue’ I have with any scaling at all is that it can make pixel perfect alignment a slight issue. Maybe it’s best not to scale at all and work completely with percentages of the actual resolution?

First of all, display.contentWidth and display.contentHeight are supposed to be read-only variables (see: http://docs.coronalabs.com/api/library/display/contentHeight.html).  You probably should not be trying to change them.  The display.actualContentHeight would seem to handle getting you the height of the content area including the letterboxed bars, and you could use that if needed, but other values like display.contentCenterX and display.contentCenterY are set based on display.contentWidth and display.contentHeight.  You’d have to change those values in code yourself if you want to use them.

While you are an experienced programmer and understand the pitfalls of using globals and know when you can safely use them, that isn’t the case for many of the developers out there.  We are trying to discourage usage of global variables and while _W and _H are handy short cuts, it’s a better practice to use the values provided with the display object.

The beauty of Corona SDK is that things like this aren’t a matter of right and wrong ways of doing things. You should use what works for you.  But given the wide range of skill levels out there, helping the community solve a problem in an easy manner is a good thing.

Rob - Yeah I would not recommend changing values like that either, and I was also hesitant about using the globals sample, but I’ve seen plenty of people using them.

Regarding the other values not being correct, you are right, but then again I never used them to be honest - I just needed to know the width and height and I do the rest myself. I just felt I was missing something because this config.lua thing is something I’ve seen raised many times in the past.

As for Corona (or rather Lua) having multiple ways of doing things I agree. I consider myself a reasonable programmer, but am new to Lua, and still learn new things all the time that make me love it more and more :slight_smile:

@craig - I have this issue in some games, where I want pixel perfect control. What I do in this case is have no scaling mode, and for front ends and menus etc I wrote my own alternative to content scaling. It works essentially the same, but the difference is you can toggle it on and off, so I get the best of both worlds.

@Rob, I don’t think anyone is changing those read-only variables. Just using them as a reference. Always referring to the _W & _H when sizing and placing elements on the screen (adding differences depending on which device) gives a developer more control imo, rather than sort of ‘hoping for the best’.

But agreed, whatever is easier for each developer.

Also what is wrong with using just a few globals? Surely they don’t take up too much memory, and as long as you don’t keep making them throughout the app would it really matter? I would only create them when the app initializes (main.lua) and never create them on my “objects” (external scripts that are called multiple times) - I can definitely see why that would be an issue

Craig I think the ‘no globals’ is best practice. I guess they try to hammer it home early on so people don’t get accustomed to using them. I must admit I did use a few and then went down to one (I had a global table which is where I stored my ‘globals’), before someone kindly posted code which removed the need for all globals but maintained the same functionality.

Here’s the code - put it into a new lua file, call it globals.lua and you are almost ready.

[lua]local _data = {}
return _data[/lua]

To use it in any file you need globals just do something like:

[lua]local _myGlobals = require( "globals" )[/lua]And that is it! (Note that require takes the module name without the ‘.lua’ extension.) You then just create and read globals by putting them into the _myGlobals variable, eg:

[lua]_myGlobals.playerLives = 10
print( _myGlobals.playerLives )[/lua]Simple ‘globals’ without any globals. I’d probably not use _myGlobals as a variable name, I soon tire of anything longer than a few characters!

Right, I get that. but it still makes accessing these same variables between different objects a little difficult. Every time you ‘require’ that table, you are essentially resetting all the variables. I guess you could pass it as an argument into an objects ‘new’ function in order to keep referencing it.

But yeah, that data table would be fine for static variables.

Craig, no, you don’t reset the table.

The first time you require a file, it loads it into memory. However, future requires don’t reload it, they merely pass a reference to the already-loaded module, which is why the values persist.
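You can see the caching for yourself with a tiny sketch (assuming globals.lua actually returns its table, i.e. local _data = {} followed by return _data):

[lua]local a = require( "globals" )
a.playerLives = 10
local b = require( "globals" ) -- does NOT reload; returns the same table
print( a == b )        --> true
print( b.playerLives ) --> 10[/lua]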

Ah cool, thanks for that. All my external scripts have tables created within function scopes, so I am used to doing things like: local newHeader = require( "Classes.Objects.Header" ).new( "titleHere" ), and it of course returns a new ‘instance’. I was not sure how the require operation itself actually worked.
