Getting Table Length

Hi all,

I’m loading and saving a table as JSON, so the format needs to be as follows:

"myTable":{
  "1":{"min":0,"value":8,"hour":0,"day":7},
  "2":{"min":0,"value":10,"hour":2,"day":7},
  "3":{"min":0,"value":7,"hour":6,"day":7}
}

The problem is I now need to find the length / number of children for "myTable". I’ve tried myTable.getn and also #myTable, but they don’t work due to the format. Anyone know of a quick way to get the table length?
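
To illustrate (a rough sketch of what I think the decoded table looks like):

[lua]-- string keys make this a hash table rather than an array as far as Lua is concerned
local myTable = {
    ["1"] = { min = 0, value = 8,  hour = 0, day = 7 },
    ["2"] = { min = 0, value = 10, hour = 2, day = 7 },
    ["3"] = { min = 0, value = 7,  hour = 6, day = 7 },
}

print(#myTable)  --> 0, since # only counts consecutive integer keys starting at 1[/lua]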

Thanks!

Neil

The issue is that the 1, 2, 3 keys are strings (hence the quotes around them).
You can use a JSON array structure instead:

{
  "myTable": [
    {"min":0,"value":8,"hour":0,"day":7},
    {"min":0,"value":10,"hour":2,"day":7},
    {"min":0,"value":7,"hour":6,"day":7}
  ]
}

Then when you load your file:

local tbl = require("json").decode(myJsonAsString)  -- myJsonAsString holds the raw file contents
for idx, val in ipairs(tbl.myTable) do
    print(val.min, val.value, val.hour, val.day)
end

If you use [], the numerical indices are implied in JSON. This also makes it much easier to insert more elements later, since you won’t have to reorganize the index keys.
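
For example (a rough sketch, assuming myJsonAsString holds the array-style file above):

[lua]local json = require("json")
local tbl = json.decode(myJsonAsString)

print(#tbl.myTable)  --> 3; # works because myTable is now a real array

-- appending needs no key bookkeeping
table.insert(tbl.myTable, { min = 0, value = 12, hour = 9, day = 7 })
print(#tbl.myTable)  --> 4[/lua]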

Hi Ntero,

The problem with using [] comes when you try to encode it back into JSON. For some reason it doesn’t pick up anything inside [], so that is why I had to convert everything to { }.

Your second paragraph does work for calculating the length though (in a dirty kind of way). I converted it to this:

[lua]local dataLength = 0
for key, value in pairs(myTable) do
    if tonumber(key) > dataLength then
        dataLength = tonumber(key)
    end
end
print(dataLength)[/lua]

Seems to do the trick. Thanks!
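
Edit: one caveat for anyone reading later: if the numeric keys ever end up with gaps, the loop above returns the highest key rather than the number of children. A plain pairs() count (quick sketch, same myTable as above) gives the actual count:

[lua]local count = 0
for key, value in pairs(myTable) do
    count = count + 1
end
print(count)[/lua]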

I think you are doing something incorrect when encoding.

Using strings as numerical keys is messy and will only make more work for you as you continue to use them. Using integers or actual array objects will save you a lot of hassle later.

Playing with encoding and decoding, I see no issues with [].

Here is some quick test code that processes the JSON file, re-encodes it, and prints it to verify the file is OK, as well as showing how to go about inserting new elements.

local function test()
    local vars = {}

    local path = system.pathForFile("test.json", system.ResourceDirectory)
    local file = io.open(path, "r")
    if file then
        local fileStr = file:read("*a")
        file:close()

        print("Json", fileStr)
        if fileStr then
            vars = require("json").decode(fileStr) or {}
        end

        -- alternatively you could do 'for i = 1, #vars.myTable do local val = vars.myTable[i]' for iterating
        for idx, val in ipairs(vars.myTable) do
            print(val.min, val.value, val.hour, val.day)
        end

        table.insert(vars.myTable, {min = 0, value = "Some String", hour = 24, day = 7})

        print(require("json").encode(vars))
    end
end

And using the same JSON file (called test.json here) with [], there are no issues. It also adds the benefit that the elements are guaranteed to be ordered first to last (and are therefore reorderable on either the Lua or JSON side, with both maintaining that order), and you don’t lose the # operator functionality as you do with string-based numerical keys.

Hi Ntero,

Thanks for your help. I have the JSON saving and loading OK now using the emulator; however, for some reason it won’t even read/decode the JSON when it’s installed onto my iPhone 4.

I’m guessing it’s something to do with system.ResourceDirectory and that it can’t even find the .json file? Any ideas what is going wrong?

Thanks,

Neil

On iOS, system.ResourceDirectory is read-only. You should use the Caches or Documents directory if you want to do saving.

The other issue is that only the ResourceDirectory is uploaded with the build, so you may need to initially load from Resources, but once you save to Documents, you’ll need to load from there.
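
Something along these lines might work (a rough sketch only; loadTable and saveTable are made-up names, and it assumes the shipped file is test.json as in the earlier example):

[lua]local json = require("json")

-- read a saved copy from DocumentsDirectory if one exists,
-- otherwise fall back to the read-only copy bundled in ResourceDirectory
local function loadTable(filename)
    local path = system.pathForFile(filename, system.DocumentsDirectory)
    local file = io.open(path, "r")
    if not file then
        path = system.pathForFile(filename, system.ResourceDirectory)
        file = io.open(path, "r")
    end
    if not file then return nil end
    local contents = file:read("*a")
    file:close()
    return json.decode(contents)
end

-- saving always goes to DocumentsDirectory, which is writable on device
local function saveTable(t, filename)
    local path = system.pathForFile(filename, system.DocumentsDirectory)
    local file = io.open(path, "w")
    if not file then return false end
    file:write(json.encode(t))
    file:close()
    return true
end

local vars = loadTable("test.json") or {}
-- ... modify vars ...
saveTable(vars, "test.json")[/lua]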
