Performance Using Tables

Hi everyone, just a quick performance-related question. To speed up development when using Storyboard, I have started using a system for loading objects that makes it easier to release memory in the exitScene and destroyScene functions.

What I do is, at the top of my Lua file, I declare several tables, such as f = {} for functions, v for variables, and dos for display objects.

I insert ALL of my display objects in that scene into the dos table when I load them up.

Ex:

local dos = {}  
  
dos.object = display.newImage("example.png", 100, 100)  -- filename, x, y  
  

I use these objects in runtime listeners and am forced to refer to them with the "dos" prefix, such as

  
dos.object.x = 100  
  

I do not mind doing this at all, since it pays off when I can just iterate through the table in destroyScene and remove these objects VERY fast. I was wondering if this causes performance issues (I haven't personally noticed a difference in my games, even on a two-year-old Android device running a fast-moving game). Also, if this does cause performance issues, exactly how bad are they?
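The cleanup pattern described above can be sketched in plain Lua. Real Corona display objects have a removeSelf() method (and display.remove(obj) is the nil-safe wrapper); here the objects are stubbed so the iteration itself can run anywhere:

```lua
-- Stub standing in for a Corona display object, just to show the pattern.
local function newStubObject(name)
    return {
        name = name,
        removed = false,
        removeSelf = function(self) self.removed = true end,
    }
end

local dos = {}
dos.player = newStubObject("player")
dos.enemy  = newStubObject("enemy")

-- destroyScene-style cleanup: one pass over the table removes everything.
local function cleanup(t)
    for key, obj in pairs(t) do
        obj:removeSelf()   -- in Corona, display.remove(obj) is the safer call
        t[key] = nil       -- drop the reference so Lua can garbage-collect it
    end
end

local player = dos.player  -- keep one reference so we can inspect it afterward
cleanup(dos)
```

Nil-ing each key matters as much as calling removeSelf(): a lingering reference in the table would keep the Lua-side object alive even after the display side is gone.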

You might actually gain a bit of performance from this, albeit probably not noticeable. Generally, if you can deep-localize something, you’re better off. For example:

Case 1:

local a = 1  
local b = 2  
local function foo()  
 a = 2  
 b = 3  
end  

Case 2:

local t = { a = 1, b = 2 }  
local function foo()  
 local t = t  
 t["a"] = 2  
 t["b"] = 3  
end  

Case 2 is probably a shade faster, because you resolve the upvalue "t" only once and then set fields within it. In the real world, though, I doubt you'd see any difference in performance… maybe you would if it was a loop of 100,000 set-variable actions or something.
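A micro-benchmark along those lines is easy to sketch in plain Lua; the timings below are indicative only and will be lost in noise at small iteration counts:

```lua
-- Compare setting fields through the shared upvalue versus a
-- deep-localized copy of the same table reference.
local t = { a = 1, b = 2 }

local function viaUpvalue(n)
    for i = 1, n do
        t.a = i          -- every access goes through the upvalue "t"
        t.b = i + 1
    end
end

local function viaLocal(n)
    local lt = t         -- resolve the upvalue once, then use a local
    for i = 1, n do
        lt.a = i
        lt.b = i + 1
    end
end

local n = 100000
local start = os.clock()
viaUpvalue(n)
local upTime = os.clock() - start

start = os.clock()
viaLocal(n)
local localTime = os.clock() - start

print(("upvalue: %.4fs  localized: %.4fs"):format(upTime, localTime))
```

Both versions leave the table in the same state; only the cost of each access differs, and even at 100,000 iterations the gap is small.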

What could help is that by storing your items in holding tables, you'll stay clear of the "upvalue limit" (Lua 5.1 allows a function at most 60 upvalues), which you can hit if you leave too many locals just sitting up there by themselves. Not to mention the organizational/cleanup benefits, which seem to be a key reason you did all of this. :wink:
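The point about the limit can be made concrete with debug.getinfo, which reports how many upvalues a closure captures: each standalone local costs one upvalue, while a holding table costs one total, no matter how many fields it carries.

```lua
-- Three separate locals: the closure captures three upvalues.
local a, b, c = 1, 2, 3
local function manyUpvalues()
    return a + b + c
end

-- One holding table: the closure captures a single upvalue.
local vars = { a = 1, b = 2, c = 3 }
local function oneUpvalue()
    return vars.a + vars.b + vars.c
end

print(debug.getinfo(manyUpvalues, "u").nups)  -- 3
print(debug.getinfo(oneUpvalue, "u").nups)    -- 1
```

So a scene with hundreds of display objects stored in one "dos" table never gets near the limit, where hundreds of bare locals eventually would.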

Brent