androidDisplayApproximateDpi

I have some androidDisplayApproximateDpi related questions:

A) Why isn’t there a displayApproximateDpi that covers both Android and iOS (and other) devices all in one? There are all kinds of hacks and workarounds for determining the DPI of iOS devices. Are there any good reasons not to just gather this in the SDK so that we users don’t have to mess with this all the time in all our apps?

B) The androidDisplayApproximateDpi returns nil when running in the simulator, even for (Android) devices that are listed in the View As list (Samsung Galaxy S3, etc.). Is this really necessary?

I’m starting to shy away from using the content width/height in config.lua and instead try to place the GUI elements with some kind of intelligence, and getting that right depends on having the correct DPI.
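
For illustration, the kind of placement logic I have in mind looks something like this sketch (the mmToPixels helper and the 320 dpi figure are just assumptions for the example):

   -- Sketch: convert a physical size in millimeters to screen pixels.
   -- 1 inch = 25.4 mm, so pixels = mm * dpi / 25.4.
   local function mmToPixels( mm, dpi )
       return mm * dpi / 25.4
   end

   -- e.g. a button meant to be roughly 10 mm tall on a ~320 dpi screen:
   local buttonHeightInPixels = mmToPixels( 10, 320 )  -- about 126 pixels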

First, the “androidDisplayApproximateDpi” property does not return the screen’s actual DPI.  If you look at our documentation (link below), it returns the hard-coded DPI constant values of Google’s mdpi, hdpi, xhdpi, xxhdpi, etc. screen designators.  These are set by the device manufacturer and are expected to be *close* to the screen’s DPI.

   https://docs.coronalabs.com/daily/api/library/system/getInfo.html#androiddisplayapproximatedpi

And you can use the above approximate DPI to determine what the Android device’s recommended native scale factor should be.  Or really, how to convert pixels to Android’s “dp” (density-independent pixel) coordinates, which is roughly the equivalent of Apple’s native point system.  On Android, you would calculate it via the following pseudo code…

   androidScale = androidApproximateDpi / mdpi

…which in Lua would look like this…

   local androidScale = system.getInfo("androidDisplayApproximateDpi") / 160
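
Be aware that the property can return nil (in the Corona simulator, for example, as mentioned above), so a defensive version of that calculation might fall back to the mdpi baseline.  A minimal sketch:

   -- Fall back to the mdpi baseline (160) when the property is unavailable,
   -- e.g. when running in the Corona simulator.
   local approximateDpi = system.getInfo("androidDisplayApproximateDpi") or 160
   local androidScale = approximateDpi / 160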

Note that Android documents what these mdpi/hdpi/etc. values are here…

   http://developer.android.com/guide/practices/screens_support.html#range

   http://developer.android.com/reference/android/util/DisplayMetrics.html

So, as you can see above, this property really is an Android-exclusive feature.

Now if what you are really after is the screen’s actual DPI (not the approximate DPI), then unfortunately there is no 100% reliable way of getting it.  While Android does have APIs for fetching this (Corona exposes these via “androidDisplayXDpi” and “androidDisplayYDpi”), some Android device manufacturers actually return the wrong value, making it unreliable.  And last I checked, iOS does not provide an API for fetching the screen’s DPI, forcing you to assume the screen’s DPI based on the running device model, which is not future proof.
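
If you still want those raw values so you can sanity-check them yourself, reading them is straightforward (a sketch; remember the caveat above that some devices report wrong numbers):

   -- Raw physical DPI as reported by the device; can be wrong on some hardware.
   local xDpi = system.getInfo("androidDisplayXDpi")
   local yDpi = system.getInfo("androidDisplayYDpi")
   if xDpi and yDpi then
       print( "Reported physical DPI: " .. xDpi .. " x " .. yDpi )
   end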

So, there you go.  That’s the reason why.  Nothing is ever easy, eh?

I don’t buy this explanation completely.

I don’t need a 100% correct DPI. If a button turns out to be 10 or 11 mm, I don’t really care. I guess there’s a good reason why the function is called androidDisplayApproximateDpi and not androidDisplayDpi.

I don’t care much about the “future proof” argument either, to be honest.

If you encounter an unknown device you could return nil as the DPI. But for the devices that you recognize, man, why not pass the DPI on to us so that WE don’t have to keep messing with this stuff?

As things work now, each of us is sitting with their own iOS DPI code/table that has to be maintained. It’s really something the SDK should spare us from having to do.

Isn’t it so that you can, if you want to, know exactly which kind of iOS device you’re running on? Then it’s just a matter of maintaining a device/DPI list. I can google the DPI per known iOS device for you if that’s the problem.  :smiley:


>> I can google the DPI per known iOS device for you if that’s the problem.

My point here is that it’s not future proof.  If someone downloads your app on a new Apple device model and you haven’t updated your app in time to compensate, then it’ll mishandle it.  In our opinion, that’s the wrong way to handle it.  It’s unreliable and error prone.

Now, if you want to handle it this way, then go for it.  Our system.getInfo() API can be used to query the device model.  You already have the tools to do it yourself, but again, I’m saying it’s not future proof, so it’s up to you to update your apps in time before a new device model gets released, or to provide a proper fallback mechanism.
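
For what it’s worth, a do-it-yourself lookup along those lines might look like the sketch below.  The table entries and the getInfo keys used here are illustrative assumptions; Corona does not ship such a table:

   -- Sketch of a do-it-yourself DPI lookup.  The table below is an
   -- illustrative assumption and would have to be maintained by hand
   -- as new device models ship.
   local iosDpiByModel = {
       ["iPhone5,1"] = 326,   -- iPhone 5
       ["iPad4,1"]   = 264,   -- iPad Air
       -- ...one entry per iOS model you choose to support...
   }

   local function getApproximateDpi()
       if system.getInfo( "platformName" ) == "Android" then
           return system.getInfo( "androidDisplayApproximateDpi" )
       end
       -- On iOS, "architectureInfo" reports the hardware model string.
       -- Unknown model -> nil, so the caller can apply its own fallback.
       return iosDpiByModel[ system.getInfo( "architectureInfo" ) ]
   end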
