It's just the device screen height in inches that's needed, not DPI.
The reason is that when we set a point size of, say, 20, we are asking for a font that is 7.52 mm tall on the device we are targeting (the device our contentScale is set for, and all our assets are sized for). If your content size is 320x480 or 640x960, you are likely targeting a standard iPhone, whose screen is 3.5 inches tall.
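The upshot, as a quick illustrative sketch (made-up names and numbers, not actual Corona code): if a point size maps to a fixed fraction of the content height, then a font chosen for a 3.5-inch-tall screen needs to be scaled by the ratio of physical screen heights to look the same physical size elsewhere. No DPI required.
[lua]
-- illustrative sketch: keep a font the same physical size across screens
-- of different physical heights (names are made up for this example)
local referenceHeightIn = 3.5   -- screen the design point size was chosen on
local deviceHeightIn    = 7.75  -- physical height of the current screen
local designPointSize   = 20

-- scale the point size by the ratio of physical screen heights
local scaledPointSize = designPointSize * ( deviceHeightIn / referenceHeightIn )
print( scaledPointSize )  -- ~44.3 on an iPad-sized screen
[/lua]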
I'm curious how bad the height reporting is on Android. Is it just lesser brands, or, as your post hints, major brands like the Samsung S3?
Is it more like 5% of Android phones reporting the wrong screen height, or more like 25%? Off-brand or mainstream phones? Any idea?
Also, I started putting together code to deal with this, starting by cleaning up my iOS side (the old methods use display.contentScaleX and have problems telling the 3GS from the 4, etc.)… At this point, my iOS side looks like this (I borrowed a little detection code from a BeyondtheTech forum post).
[lua]
print(" – -- iOS font size calc…")
local fontScale = 1
local heightIn = 3.5 – default to stnadard iPhone size
if( sysType == “iPhone2,1” ) then – 3GS
heightIn = 3.5
print(" – iPhone 3GS detected")
elseif( string.match(sysType, “iPhone3”) or string.match(sysType, “iPhone4”) ) then – iPhone 4, 4S
heightIn = 3.5
print(" – iPhone 4 detected")
elseif sysType == “iPad2,5” or sysType == “iPad2,6” or sysType == “iPad2,7” or sysType == “iPad2,8” then – 2,8 is unknown, just a guess at next mini #
heightIn = 6.25 – mini
print(" – iPad mini detected")
elseif( string.match(sysType, “iPad”) ) then – standard iPad
heightIn = 7.75
print(" – iPad detected")
else
print(" – unrecognized iOS device, using default 3.5 inches tall") – iPods will fall to here, they are 3.5 anyways…
end
fontScale = heightIn / 3.5 – 3.5 is actual iPhone height
inputFontSize = targetFontSize*fontScale – inputFontSize is point size to use in native call
[/lua]
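For completeness, the computed inputFontSize then feeds into the native call, something like this (a minimal usage sketch; the position and size values are placeholders):
[lua]
-- usage sketch: apply the scaled point size to a native text field
local field = native.newTextField( 160, 240, 280, 40 )  -- x, y, w, h placeholders
field.font = native.newFont( native.systemFont, inputFontSize )
[/lua]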
Only got to test it on my iPad (retina) so far, but the calculated font size appears to be dead on. Also, note that the heightIn used is the straight up-and-down (NOT diagonal) height of the screen, as that is what the point size relates to (direct vertical height, not diagonal).
Edit: oh snap, the actual height of the 3GS, 4, 4S, etc. is 3 inches… It seems to work with the diagonal height of 3.5, but the numbers aren't quite perfect… Hmmm.
OK, I think I understand the issue (3 inches vs. 3.5). I think 3 is the correct value for calculating the proper point size; however, I realized there is an additional source of error that needs to be considered.
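For anyone who wants to sanity-check the 3-vs-3.5 confusion: the vertical height falls out of the diagonal and the aspect ratio (quick sketch, names made up):
[lua]
-- vertical screen height from the diagonal and the portrait aspect ratio (w:h)
local function verticalFromDiagonal( diagIn, aspectW, aspectH )
	return diagIn * aspectH / math.sqrt( aspectW ^ 2 + aspectH ^ 2 )
end

-- 3.5" diagonal at the pre-iPhone-5 2:3 portrait aspect:
print( verticalFromDiagonal( 3.5, 2, 3 ) )  -- ~2.91, i.e. roughly 3 inches
[/lua]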
When you use content scaling, the SDK tries to fill the device screen as best it can with your app. This can leave empty space above or below (or to the sides, which isn't a concern for the font calc).
What this means is that the app might be mapped onto just 3 inches of the device. That's not a problem on the 3GS, 4, or 4S, but the iPhone 5 is taller.
To deal with it, Corona chops off a little on the top and bottom, making the effective area for your app 3 inches tall. Using the straight-up real height of the iPhone 5 would throw the calc off (as the SDK has already chopped the effective height being mapped onto).
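Assuming I'm reading Corona's display properties right, that correction could be derived from the ratio of display.contentHeight to display.actualContentHeight (a sketch; screenHeightIn would come from the device table above):
[lua]
-- sketch: correct the physical height for letterbox/crop from content scaling
-- display.contentHeight       = the content height you asked for in config.lua
-- display.actualContentHeight = the full screen expressed in content units
local screenHeightIn = 4.0  -- placeholder: true physical height of this device
local effectiveHeightIn = screenHeightIn * display.contentHeight / display.actualContentHeight
local fontScale = effectiveHeightIn / 3  -- 3 = vertical height of the reference iPhone
[/lua]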
This same issue, the space that content scaling throws away above/below the app, would need to be factored in on the droid side as well.
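On the droid side, something like this might work, assuming Corona's androidDisplayHeightInInches property is available (which, per this whole thread, is exactly the number some devices misreport, so treat it with suspicion):
[lua]
-- sketch for the droid side (assumes androidDisplayHeightInInches is sane)
local heightIn = system.getInfo( "androidDisplayHeightInInches" ) or 3  -- fall back to the reference height
-- apply the same letterbox/crop correction as on the iOS side
local effectiveHeightIn = heightIn * display.contentHeight / display.actualContentHeight
local fontScale = effectiveHeightIn / 3
inputFontSize = targetFontSize * fontScale
[/lua]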