Problem with loadsave

I’m using loadsave (https://coronalabs.com/blog/2014/10/14/tutorial-saving-and-loading-lua-tables-with-json/) to store a Lua table and load it back, but I’ve noticed I can’t access some of the information in the loaded version of the table.

Example:

Before saving, I can read a nested length like #MyTable[1][1][2] and get a correct value (for example, 3 for three elements there). BUT after I save the table and reload it via loadsave.loadTable, I can no longer get that info the way I did before. Could it be that loadTable is not giving back the table in its exact original format?

Not knowing the data you’re working with, I can’t be sure… But a common problem I’ve seen with my own data is that when de/serialising tables to JSON you want to ensure that the values in the table are EITHER numerically indexed OR name indexed - NEVER both in the same table.

So, if you are keeping a list of values and looping through them as 1, 2, 3 etc make sure you don’t have any named entries, like “.dave” as keys. Similarly, if you have any values such as .gameScore or .playerCount in there, make sure you don’t access values with [1], [2] or [3], and so on.
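A minimal sketch of the two safe shapes versus the risky one (the table names here are made up for illustration):

```lua
-- Safe: a pure array -- keys are only 1..n.
local scores = { 10, 25, 40 }

-- Safe: a pure record -- keys are only names.
local player = { gameScore = 120, playerCount = 2 }

-- Risky: both in one table. JSON has no type that holds array
-- slots and named fields at once, so serialisers mangle this.
local mixed = { 10, 25, 40, gameScore = 120 }

print(#scores)  -- 3
```

Keep every table in one of the first two shapes and the JSON round trip is lossless.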

This could be it. But how can I get the length of a nested table after the table has been saved and loaded, if serialisation is “messing up” the values like this? How can I still access the entries with [1], [2], [3] and so on?

That’s my point - you can’t. That’s what “messed up” means, in this case.

Saving a table which has both numerically and name indexed entries will cause problems, because the json serialiser will write the numerical indexes as strings - they will no longer be numbers. Yes, I know that sounds weird and unintuitive, but that’s what happens.

You might still have access to them as [“1”], [“2”] and [“3”] which looks like it should automatically work as [1], [2] and [3], but it doesn’t and it won’t.
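You can see this in plain Lua by building what such a table looks like after the round trip - the array slots come back under *string* keys:

```lua
-- What a mixed table can look like after a JSON round trip:
-- the former array entries now sit under string keys.
local loaded = { ["1"] = "a", ["2"] = "b", ["3"] = "c" }

print(loaded[1])    -- nil: the number key 1 does not exist
print(loaded["1"])  -- "a": only the string key "1" is there
print(#loaded)      -- 0: the length operator only sees number keys
```

In Lua, `t[1]` and `t["1"]` are entirely different keys, which is why the length operator and numeric loops both come up empty.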

Look in your json files which have been saved to your documents folder and check for “1”: “someValue” in your json structures. Somewhere in that same structure you’ll also see other keys like “someKey”: “someValue” and the table which produced that file is the culprit.

Is there some kind of “workaround” to use the “new” table in exactly the same way as the original one? Is it possible to somehow convert the wrong values back to the original ones?

Yes, edit the broken json file.
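If editing the file by hand isn’t practical, a one-off repair in code is also possible. This is a hypothetical helper (not part of loadsave) that walks a loaded table and moves values stored under numeric-string keys back to number keys:

```lua
-- Hypothetical repair helper: re-key "1", "2", ... as 1, 2, ...
-- Keys to move are collected first, because adding new keys
-- while iterating with pairs() is undefined behaviour in Lua.
local function fixNumericKeys(t)
    local toMove = {}
    for k, v in pairs(t) do
        if type(v) == "table" then
            fixNumericKeys(v)  -- recurse into nested tables
        end
        if type(k) == "string" and tonumber(k) then
            toMove[#toMove + 1] = k
        end
    end
    for _, k in ipairs(toMove) do
        t[tonumber(k)] = t[k]  -- re-key under the number
        t[k] = nil             -- drop the string key
    end
    return t
end

local broken = { ["1"] = "a", ["2"] = "b", name = "dave" }
fixNumericKeys(broken)
print(broken[1], #broken)  -- a	2
```

Note this only patches the loaded copy - the next save of a mixed table will break the file again, so it’s a stopgap, not a fix.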

Can’t you just recreate the file?

You need to change your code logic so it doesn’t write the file that way to begin with.
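A sketch of that restructuring (the field names here are invented for the example): keep the list part and the named part in separate sub-tables, so every table the serialiser sees is uniform.

```lua
-- Each sub-table is either a pure array or a pure record,
-- so the JSON round trip preserves both shapes.
local saveData = {
    items = { "sword", "shield", "potion" },       -- pure array
    stats = { gameScore = 120, playerCount = 2 },  -- pure record
}

print(#saveData.items)          -- 3, and still 3 after save/load
print(saveData.stats.gameScore) -- 120
```

With this layout, #saveData.items keeps working after loadsave.saveTable and loadsave.loadTable, because no single table ever mixes key types.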
