In my game, Lua loses precision for numbers greater than 2^53.
It’s not Lua that’s at fault; it’s IEEE 754 doubles.
But what’s the real issue: just output formatting, or actual internal precision?
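Both, in a sense, and a quick check makes the distinction concrete (output shown is from Lua 5.4, whose default print format is "%.14g"):

```lua
-- Formatting vs. representation: two different kinds of "lost precision".
print(2^53)                        --> 9.007199254741e+15 (truncated by %.14g: formatting only)
print(string.format("%.0f", 2^53)) --> 9007199254740992   (every digit was there all along)
print(2^53 + 1 == 2^53)            --> true: above 2^53, odd integers are unrepresentable
```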
I’ll tell you that the vast majority of incremental games out there (including ones that I’ve worked on) just use IEEE 754 doubles and get away with it just fine, because they’re NOT trying to add 1 to 1e33, for example.
That is, it’s the nature of the game that in order to get a ridiculously large “bank”, you need correspondingly ridiculously large “producers”. So all the math is always happening “within range” of what double precision is capable of.
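A minimal sketch of that claim with illustrative magnitudes (the specific numbers are made up, but the ratios are typical of late-game incrementals):

```lua
local bank = 1e33
local income = 1e30                -- producers scale up alongside the bank

print(bank + income == bank)       --> false: only 3 orders of magnitude apart, no loss
print(bank + 1 == bank)            --> true: $1 is far below the ULP (~1.4e17) at 1e33
```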
Yes, you’d start losing “pennies” eventually, as you run out of bits to represent them, but who cares?! Most of these games, once beyond a certain point (usually around 1e9), switch to some sort of modified scientific notation and give you four or five significant digits at most (e.g. 1.2345De for decillions; rarely would you see all 33 digits printed).
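Here’s a sketch of that display style in Lua (the suffix table and the 1e9 cutoff are illustrative, not from any particular game; math.log with a base argument needs Lua 5.2+):

```lua
-- Abbreviated "letter suffix" formatting, a common alternative to raw
-- scientific notation (suffix names and the 1e9 cutoff are illustrative).
local suffixes = { "", "K", "M", "B", "T", "Qa", "Qi", "Sx", "Sp", "Oc", "No", "De" }

local function format_big(n)
  if n < 1e9 then return string.format("%.0f", n) end  -- plain digits below the cutoff
  local group = math.floor(math.log(n, 10) / 3)        -- power-of-1000 group
  if group + 1 > #suffixes then
    return string.format("%.4e", n)                    -- past the table: fall back to sci notation
  end
  return string.format("%.4f%s", n / 1000^group, suffixes[group + 1])
end

print(format_big(1.2345e33))  --> 1.2345De
```

One practical wrinkle: math.log can land just below a whole number at exact power-of-ten boundaries, so production formatters often derive the group from the digits of the formatted string instead.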
Consider: if your balance is in the 1e33 range, and you THINK you need to worry about some “producer” making $1/second, … it would take something like 1e12 years before it even affected the 15th significant digit! Few of your players will live that long.
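For the arithmetic behind that claim: the 15th significant digit of a 1e33 balance sits at the 1e19 place, so a $1/second producer needs roughly 1e19 seconds to reach it:

```lua
local seconds = 1e19                          -- $1/s accumulating up to the 1e19 place
local years = seconds / (365.25 * 24 * 3600)  -- seconds in a Julian year
print(string.format("%.1e years", years))     --> 3.2e+11 years, i.e. on the order of 1e12
```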